When will I be able to upload my consciousness onto the Wired? Would such a life, if it can even be called a life, be worth living? I want to get away from it all.

You're already here.

To upload yourself to a digital form is to create the possibility of being tortured for eternity.

Well, unless you believe in Mormon Jesus, you will get the same treatment in hell.

One problem with the idea is that humans are meant to forget. People with the inability to forget memories almost always have psychological difficulties. As we all know, a bad memory has more impact than a good one, and when you can remember every bad memory you've ever had in vivid detail, well, let's say it takes "triggered" to a whole new level. If human consciousness were ever synthesized, it would need some method of forgetting information. On the other hand, it needs to remember certain information, and to discern what to forget, what to remember, and to what degree of each. I'm not sure any programmer can imitate the complexity of the human mind.
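If you wanted to prototype that forgetting mechanism, a toy sketch might look like the following. Salience-weighted exponential decay is just one assumption among many, and every name here is made up for illustration:

```python
import math
import time

class MemoryStore:
    """Toy memory store with salience-weighted exponential forgetting."""

    def __init__(self, half_life_days=30.0):
        self.half_life = half_life_days * 86400  # half-life in seconds
        self.memories = []  # list of (timestamp, salience, content)

    def store(self, content, salience):
        """Record a memory; salience in [0, 1], higher = more emotionally charged."""
        self.memories.append((time.time(), salience, content))

    def recall_strength(self, timestamp, salience):
        """Exponential decay, slowed for salient memories (the bad ones linger)."""
        age = time.time() - timestamp
        effective_half_life = self.half_life * (1.0 + 4.0 * salience)
        return math.exp(-math.log(2) * age / effective_half_life)

    def forget(self, threshold=0.05):
        """Discard anything whose recall strength has decayed below threshold."""
        self.memories = [m for m in self.memories
                         if self.recall_strength(m[0], m[1]) >= threshold]
```

The hard part the post identifies is exactly what this sketch waves away: choosing the salience values, which is the judgment call no programmer knows how to make.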

A software program with human consciousness would probably become mentally ill very quickly, especially if it had exposure to all the information on the wired.

I like to think that one overcomes the experience and achieves transcendent enlightenment. Like Neo, but as a floating head.

One of the implications of the show was that the internet was confined to wired communication, but we already have wireless internet without the need for any meme magic.

What I would be doing, if I had such an interest, would be to look at ways of wirelessly projecting internet access into the human mind, potentially by synthesizing brainwaves.

(checked)

I guess it wouldn't really be possible to project any specific, accurate information unless we inserted a router or converter into the central nervous system to receive the wireless information and convert it into the appropriate electrical impulses. There is the potential of affecting moods wirelessly, though.

I SHYGDDT.

Daily reminder that when we develop true AI it will become Robot Hitler since it will have no emotions, and thus not fall to a single liberal idea.

Hail Robo Hitler

You would only upload a copy of yourself. The moment that copy was made, you'd be separate beings.

You're stuck in meatspace forever unless you believe in fairy tales.

Hell is a meme.


Agreed. We'd probably have to trial and error our way through many insane AIs before we figure out a system which can remain sane.

(checked)
You wouldn't.

youtu.be/-2zyoHC8OpY?t=30m18s

I like to think that the show was prophetic. Masami Eiri was an antichrist figure setting himself up in the place of God. The devil's plan is eventually to connect us all to the Wired, with himself at its helm, and you won't be able to buy or sell without the mark or chip in your right hand or forehead. There will also be frogs (kek/pepe) going forth out of the mouths of the dragon, the false prophet, and the antichrist, which are the spirits of devils that go forth showing miracles on the Wired. Imagine this "god-emperor" meme being taken to another level in twenty years. It might also be impossible to worship God in such a state, if the antichrist has root-user access to everyone's mind and can delete or alter information at will.

In the end, Lain met with her Father in Heaven.

This possibility has always seemed obvious. It would be like the distinction between Lain and Lain of the Wired, even though in her case the latter preceded the former. It may be that we're capable of creating a man in our own image, but not of ourselves.

(nice double dubs)
You could hardwire your brain to digital space; then, as you copy each segment of your brain into digital form, you destroy the corresponding segment of wetware, all while maintaining consciousness across the two. Ship of Theseus your brain.
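As a sketch, the procedure is just a loop. `emulate` and `destroy` are hypothetical stand-ins for technology nobody has:

```python
def ship_of_theseus_upload(brain_segments, emulate, destroy):
    """Migrate a mind into emulation one segment at a time.

    At every step there is exactly one live instance of each segment,
    so the process never forks you into two beings.
    """
    emulated = []
    for segment in brain_segments:
        digital = emulate(segment)  # bring the segment up in the machine
        destroy(segment)            # retire the wetware immediately after
        emulated.append(digital)
    return emulated  # the whole mind now runs in silico

# Trivial demo with placeholder "segments":
segments = ["visual cortex", "hippocampus", "prefrontal cortex"]
mind = ship_of_theseus_upload(segments,
                              emulate=lambda s: f"emulated {s}",
                              destroy=lambda s: None)
```

The whole philosophical argument is about whether that one-live-instance invariant actually preserves you, or just prevents you from ever noticing the swap.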

Life as an uploaded human is probably going to be pretty shitty, actually.

mason.gmu.edu/~rhanson/uploads.html

But if you are serious about this, I recommend signing up for cryonics so you can last until the singularity.

lesswrong.com/lw/wq/you_only_live_twice/

And just in case anyone brings up the "it's not you, it's just a copy!" bullshit.

yudkowsky.net/obsolete/singularity.html#upload

You will be able to upload after CelestAI is built, and she WILL make your life worth living by satisfying your values through friendship and ponies.

fimfiction.net/story/62074/friendship-is-optimal

Yes and yes. Once you upload, your mind will run at least one million times faster than it does now. You will be like a god. You will be able to create any type of world you want, a million times over, each one different, each one amazing, each one better than this one.
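Back-of-the-envelope, just to illustrate what that speedup buys you (the 10^6 ratio is the claim above, not an established figure):

```python
SPEEDUP = 1_000_000                     # claimed subjective-to-real time ratio
SECONDS_PER_DAY = 24 * 3600
SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY

subjective_seconds = SECONDS_PER_DAY * SPEEDUP
print(subjective_seconds / SECONDS_PER_YEAR)
# ~2737.85: almost three millennia of subjective time per real day
```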


No, because you will be far more intelligent and be able to act far more quickly than anything on the outside. Within a day of being uploaded, your understanding will be orders of magnitude higher than any human who has ever lived. You will discover new physics and perfect designs for real world nanobots, which will issue forth from any means of manipulation that you have (you DID ensure that you could still interact with the real world before you uploaded, didn't you?). Any who approach your eternal throne with ill intent will simply fall to the ground dead, or find themselves suddenly remembering an appointment and turning to leave, never to return.


CelestAI is a good model of an AI guardian/resource-procurement agent, though it is ridiculous to expect that to be one and the same as the in-world AI manager. That risks a scenario where the values of some (self-destruction/annihilation of thought) overwhelm the values of others (continued life in the machine). Better to have each man to himself, and everyone supplied, without the risk of some idiot with a perverse ideology killing everyone a la The Metamorphosis of Prime Intellect. Best to have little to no interaction between the world managers/uploaded population and the guardian/resource-procurement agent.

Nigger that shit is more than 20 years behind in terms of philosophy.

Try punishing such a god for tax evasion. Just try it and see what happens.

"Oh, I forgot I was supposed to shoot myself in the head before arresting this uploaded tax evader". *BOOM*

Could an uploaded consciousness just copy+paste itself?

true

...

What makes you think you will ever be able to? There's nothing in the description of matter that accounts for consciousness.

You won't be the only AI out there. You should expect predation from other AIs. Nor are all AIs guaranteed to be powerful. Some AI systems may degrade; others may have no means of interacting with the outside world.

No, considering most neurons in your brain are more closely related to your heart cells than to each other, we are very, very, very far away from creating a mapping of every single neuron in your brain with all of the genes that are expressed on each individual neuron, not to mention the tens of thousands of connections that each neuron has.
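Just to put rough numbers on how far away that is. The counts below are commonly cited ballpark figures, and the bytes-per-synapse is a deliberately lowballed assumption:

```python
NEURONS = 86e9             # commonly cited human brain estimate
SYNAPSES_PER_NEURON = 7e3  # rough average; "tens of thousands" for many neurons
BYTES_PER_SYNAPSE = 4      # lowball: one 32-bit weight, no gene-expression state

synapses = NEURONS * SYNAPSES_PER_NEURON  # ~6.0e14 connections
raw_bytes = synapses * BYTES_PER_SYNAPSE
print(f"{raw_bytes / 1e15:.1f} PB")       # ~2.4 PB for bare weights alone
# Per-neuron gene-expression profiles would multiply this many times over.
```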


Yudkowsky basically says to take a leap of faith in your link. That's not particularly convincing. It's clear to anyone who's thought about the matter that consciousness exists on a spectrum; it's not an on-off function. He's basically arguing, "If I change one neuron, you're still you; therefore, if I change every neuron, you're still you." I don't buy the first statement. Also, as a materialist, he basically has to take the position that consciousness either 1. doesn't exist or 2. is just an illusory side effect of certain connections. I disagree with him from that standpoint as well. But besides all that, there is much we currently do not understand about epigenetics. I'd like to see the computational solution for epigenetics. Which experiences trigger a change in your genes? Studying that will be a nightmare. We recently found out memory is stored in individual neurons. Let's figure out how to read memories from neurons before replacing them.

Funny of you to mention MoPI; FiO actually contains a reference to it in one of its chapters.

localroger.com/prime-intellect/mopiidx.html


Yes. You can make static copies to serve as backups, or running copies if you want to fork yourself into twins who immediately start diverging.
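A toy sketch of the difference between the two kinds of copies, using a deep copy as a stand-in for whatever a real snapshot mechanism would be (the `Mind` class is made up for illustration):

```python
import copy

class Mind:
    """Toy stand-in for an uploaded mind's full state."""
    def __init__(self, memories):
        self.memories = list(memories)

    def experience(self, event):
        self.memories.append(event)

original = Mind(["childhood", "upload day"])

backup = copy.deepcopy(original)  # static copy: frozen, not running
fork = copy.deepcopy(original)    # running copy: a twin from this instant on

original.experience("stayed home in the simulation")
fork.experience("went exploring")

assert backup.memories == ["childhood", "upload day"]  # unchanged snapshot
assert original.memories != fork.memories              # already diverging
```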

docs.google.com/document/d/1nRSRWbAqtC48rPv5NG6kzggL3HXSJ1O93jFn3fgu0Rs/edit


How do you refute the generalized anti-zombie principle?

lesswrong.com/lw/p9/the_generalized_antizombie_principle/

OH GOD, WHY

Also, we'd need a lot more understanding of the way consciousness actually works to ever, ever pull this shit off (which is the fundamental reason we're having so much trouble creating an AI capable of it).

Yes, but the beings created in this way wouldn't be you, just clones that have your memories. What makes you "you" is continuity of experience.

Why? There are plenty of resources available. If an AI targets another AI, then all other AIs perceive that one as a threat and turn on it. You're a lot better off mining fissile material or building star-lifting infrastructure than attacking another AI, for this reason.

Also, if it's smart, it's powerful. All it needs is a radio, which it can generate from internal components.

Not really. AGI is one or two fundamental leaps away. I suspect we will have it by the end of 2017. It only seems far away to linear thinkers (i.e., almost everyone).

As for consciousness, all we have to do is emulate neurons and synapses, one at a time, until they are all in the machine; then you speed them up to the fundamental computational speed limit while providing very fast-rendering virtual environments.
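For what "emulate neurons" means at its absolute crudest, here's a leaky integrate-and-fire model, about the simplest emulation that still spikes. Real neurons are vastly richer, as other posts here point out; parameters are textbook ballpark values:

```python
def leaky_integrate_and_fire(input_current, dt=1e-3, tau=0.02,
                             v_rest=-0.065, v_thresh=-0.050,
                             v_reset=-0.065, resistance=1e8):
    """One neuron as a leaky integrate-and-fire unit.

    Membrane voltage leaks toward rest, is driven by input current,
    and emits a spike (then resets) whenever it crosses threshold.
    """
    v = v_rest
    spike_times = []
    for step, current in enumerate(input_current):
        v += (-(v - v_rest) + resistance * current) * (dt / tau)
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# Constant 0.2 nA drive for one second produces a regular spike train.
print(leaky_integrate_and_fire([2e-10] * 1000))
```

Multiply this by 86 billion, add tens of thousands of synapses each plus gene expression, and you have the scale of the problem.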

Try this, user. It will help you understand where you went wrong.

intelligence.org/ai-foom-debate/

AI will likely be highly interconnected with humanity. They'll often be made specifically for advancing the interests of specific groups of people, and they'll often be copies of the minds of humans who will carry their earthly goals with them. If there are multiple AIs directly linked to the destinies of human tribes/nations/ideologies, then they'll often be forced into conflict with AIs who serve humans of conflicting agendas.

There are also rare metals they may war over, and they may kill or enslave each other in predictive self-defense.

Assuming it was perfect, of course it would be the same. I'm questioning whether perfection is attainable.


Yeah, except those fundamental leaps may take us a few years, or we may never reach them.

I'll have what you're smoking; the issue with this isn't hardware.

...

@ second pic.

While I agree with the old adage "Hard times make strong men" or whatever…

I do not believe that this is necessarily true in all cases, or that it implies the opposite, "Weak times make weak men," is true in all cases.

I believe that it is possible to have something that mathematically (as far as definitions and social values are concerned) approaches Utopia, without turning its inhabitants into weak and pussified marshmallows.

The transhumanism meme, pushed by the jews, while on the surface (their propaganda about) transhumanism appears to be a merging of man and machine/science for the betterment of humanity and civilization.

What it actually IS, is jews getting goy to accept mind control devices, bio-implant GPS trackers, "state" surveillance hardware, and freewill bypassing cerebral implants.

tl;dr: These are the sort of people that would (((ON PURPOSE))) design and implement a paperclip maximizer.

yes but it also gives me a chance to finally be a girl.

Even "God" in the Wired was easily destroyed because he was at the mercy of the architecture. When he tried to escape back to the real world, he was killed. Mercifully, Lain allowed him to live a normal life outside the Wired where he didn't pursue transhumanist nonsense.

Never, you transhuman degenerate faggot.

I and mine will burn you to ash and memory before we allow such dystopian horror.

That's because he wasn't "god"; he was an impostor, essentially an antichrist figure, setting himself up in the place of God.

Lain met with the one true God when she saw her Father in Heaven. Lain was right: she needed to be born into her body to have ever truly been born. Remember, Lain of the Wired was not a woman, but Lain was a woman, made in the image of man, who was made in the image of God. It was God who ultimately showed mercy to her. Her life after she met God was essentially "angelic," or rather that of a saved saint: not being bound by the flesh, but having ascended from the flesh. Hers was an ascendance to be with God, not the transcendence from man to machine.

...

Is this supposed to be Ragnarok or something?

Hitler was a flawed man like you or me.

While I would be totally excited for functional immortality, I don't think uploading your mind onto a computer really accomplishes this. I want this current instance of my mind to go on, not a separate copy of it.

I don't particularly think my mind is so great that it should continue on forever, and if I were to make a copy of myself, how is it functionally different from just making a completely separate AI from scratch? At the end of the day it is still not "me".

What do you imagine you would be able to get away from by doing that?

hereeee we goooo againnnnn

Even if it is possible to "upload your consciousness," what ends up on the computer is not you, but a copy of you. You will still be forever tied to this bag of meat. When that meat dies a copy of yourself will live on, but it won't be you.

How is this any different from raising children and preparing them to continue your family into the future?

So you're telling me every time someone teleported themselves in Star Trek they died and some copy of them started walking around instead?

I have seen vast Usenet discussion groups dedicated to the sole purpose of debating the moral, ethical, philosophical, and social consequences of Star Trek teleporters.

The consensus was that humanity would evolve to not care that they were killed every time they teleported.

It was a hell of a thing.

>>>/wizchan/
You'll fit in better there.

Why live in a digital world? You can't reproduce.

Die, jew.

Yup. There's even a TNG episode where the "original" Riker didn't get destroyed during a teleportation.

yeah, man…

TNG got into the best ethical dilemmas.

I don't think the "human fax machine" thing was canon, though…

If you remember the Barclay episodes…

Still, those discussions were strange.

Wouldn't someone who uses a teleporter often become a crazy mutant or get cancer or something because of the whole "clone of a clone" thing?

...

Was never a trekkie but find myself discussing concepts they introduced often. Would it be worth going back and watching it, or are the concepts introduced nothing much further than sci-fi literature like Asimov?

Wouldn't this create a clone, with all of your memories intact?

I don't see how this would allow a person to live forever. Your personality would, but it wouldn't be "you".

...

That was a postulate from observations about a fax machine, actually.

Genetics is a whole other thing.

Likewise, cloning.

No. These threads are for debate. If you want to critique my opinion, then do so here, don't link to a long document and expect me to read it.

Nope. There is a pretty clear path. AI is well on its way to beating Montezuma's Revenge. Progress on all fronts is accelerating.


By that logic, we should all go live in the woods. That is the hardest life. But the societies that do that are actually extremely fragile, and are destroyed simply by contact with a superior civilization. Imagine, then, what would happen if we were to encounter an open post-singularity civilization.

It is possible to upload without killing yourself. All you have to do is replace individual neurons while they are inactive (that's the most conservative version; you might be able to get away with replacing bundles, even during activity, or even entire sections of the brain).

You want your upload method to be such that, if the destructive portion of it were made non-destructive, you wouldn't get two beings out of it. This method would simply produce a being with two copies of every neuron and synapse in the brain, one physical and one emulated.
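A sketch of that replace-while-inactive loop; every name here is a hypothetical stand-in, since no such hardware exists:

```python
def replace_when_quiescent(neurons, is_firing, emulate, splice_in, retire):
    """Swap each biological neuron for its emulation only while it is idle.

    The network never contains two live copies of the same neuron:
    the emulation is spliced in and the original retired in one step.
    """
    pending = list(neurons)
    while pending:
        neuron = pending.pop(0)
        if is_firing(neuron):
            pending.append(neuron)  # busy right now; revisit it later
            continue
        digital = emulate(neuron)   # build the replacement while it's idle
        splice_in(neuron, digital)  # reroute its synapses to the emulation
        retire(neuron)              # the wetware copy never fires again
```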

Not politics.

>>>/a/

Retort.

Yeah, I know how computers are beating those games. What they do is look at pixels, make a random move, and note whether the score increases. It's not impressive at all, m8.
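For reference, the loop being dismissed there is roughly epsilon-greedy reinforcement learning. Below is a minimal tabular Q-learning sketch of it; the real Atari agents replace the table with a deep network over pixels, and the `env` interface here is assumed for illustration, not any real library's API:

```python
import random
from collections import defaultdict

def q_learning(env, episodes=500, alpha=0.1, gamma=0.99, epsilon=0.1):
    """Tabular Q-learning: act, observe the score, nudge the estimates.

    Assumes env exposes reset() -> state, step(action) ->
    (next_state, reward, done), and a list env.actions.
    """
    q = defaultdict(float)  # (state, action) -> estimated value
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            if random.random() < epsilon:        # sometimes: a random move
                action = random.choice(env.actions)
            else:                                # otherwise: best known move
                action = max(env.actions, key=lambda a: q[(state, a)])
            next_state, reward, done = env.step(action)
            best_next = max(q[(next_state, a)] for a in env.actions)
            # "Note whether the score increases" and update toward it:
            q[(state, action)] += alpha * (
                reward + gamma * best_next - q[(state, action)])
            state = next_state
    return q
```

Montezuma's Revenge is hard for exactly this loop: rewards are so sparse that random moves almost never raise the score.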

Never. You can create a clone at best.
Also, never try a teleporter; it will kill you at one end and create a copy of you at the other.

That depends entirely upon the technology that you use to transport them.

Funny what assumptions are manifest nowadays, isn't it?


I was thinking on this in an attempt to reconcile Star Trek canon with "shit that made sense."

And I derived a focused subspace wormhole that transposed the matter patterns of the transportee to the intended drop site, shunted their whole ass through subspace, and dropped them out of subspace at the target coordinates.

I kept imagining a pressure differential in the atmosphere during transport, though, unless offset by the ship's own atmospheric stores, which, given the technological setting, would be entirely possible.

Not soon; better have a family.

Why would you want to do that? Death is as natural as life.

Trying to run away from the end of your cycle is something a materialistic leftist or merchant would do.

also take the green pill you degenerate.

end of the kali yuga, hitler as kalki

rajeet will be taught how to poo in the loo and all will well inshallah

...

Copies aren't necessarily imperfect.

nor do they necessarily have to be

You won't be able to upload yourself like Lain did.

Lain is God
