Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
Imagine that tomorrow we perfect mind-upload. Your entire brain, and with it your identity and memories, gets cloned into an AI. You get to meet the AI; it's really you. But the physical you, the meat you, still exists. The AI is a clone. I presume there would be no need to kill yourself, but would you really no longer fear death in your physical body? I doubt it. The thing about us living forever is that even if it happens in your lifetime, it probably won't be 'you' living forever.
That is a strict improvement over the status-quo.
I'm not a biological chauvinist, and I think that the upload has equal claim to my name and assets. I also expect that unless things go really awry, the human version would probably end up acquiring biological immortality eventually. Destructive scans seem much easier than ones that preserve the original, but the latter isn't a bad thing as far as I'm concerned. It always leaves the option of another upload, one when the original is closer to death.
Even if that wasn't the case, I'd rest easier. Growing old and dying of old age sucks, but it is a great comfort to have seen your grandkids be born, follow in your footsteps and flourish. You can die with fewer regrets. In the same manner, if I had a backup, even one that would outlive me, I'd wish it well, and know that it would share most of my goals and desires, and would mourn my passing.
Or feel relief about not having some progenitor who's seen as more-real-than-you hanging around anymore.
I suspect there will be all kinds of dysfunctions with the uploads themselves and revolving around them. The psychologists of the future will have their quackery cut out for them.
The best person to speculate about a copy of myself would be me. And I don't think that would happen.
The copying process may not be perfect. There could be bit flips. And that's assuming that your pattern will actually think like you do; that the difference in hardware will not matter and that the uploading process actually captures everything that makes you tick.
We're talking about a hypothetical technology, one that might pass through decades of improvement and iteration.
The minimum viable product is one that isn't a gibbering mess after destructive scanning. The ideal is one that is absolutely indistinguishable (outside of being in a computer) from the original human and leaves the OG meat intact.
There's plenty of room in between. I'm not the exact same person when I go to bed and wake up the next day, and I don't write a will before I do. Or when I consider what I'll be like in a year, or even decades.
I can't say with 100% certainty what I'll be like in a few decades as is, but I'm reasonably confident that short of a brain disease, I won't become a serial killer or try and suffocate my parents in their sleep.
Scanning the brain in any fashion is the hard bit. Making sure the hardware and simulated environment is congenial is trivial in comparison. If we diverge over time, that's no surprise, even twins do, or the me of today and next year.
If you don't consider yourself the same after you go to sleep then I don't see how you can justify having any sentimental attachment to the mind upload, who immediately deviates by virtue of existing in a different context.
I don't consider myself to be exactly the same person after any period of time that can be measured on a clock. I think it's quite clear that I'm not particularly bothered by this.
If I can identify with and care about the wellbeing of the person "I" will be in decades, then I can identify with an upload.
The difference is I will never be the upload.
This strikes me as one of the failures of rationalism: attempting to reconstruct the value system from abstract, non-evident first principles such as "pattern theory", then, when someone's common intuition fails to align with those, declaring common intuition wrong. Not referring to you, since you seem pretty open to other people having different opinions, but rather someone like EY calling it "muggle thinking".
I care about living because of my biological substrate's reward system, not because I intellectually (on what basis?) prefer an existence of a "pattern" of sun_the_second over its nonexistence.
False premise, this isn't perfect mind upload.
The state of the art in sci-fi, last time I checked, was that you stay conscious as they disconnect your brain cells one by one (or some small enough increment) and replace them with the artificial ones, slowly so that you can fill the gaps in your memories back in Ship of Theseus style and have no doubt you're staying yourself.
Imagine if it was perfect mind upload, and you find yourself back in your meat body after the mind upload is complete. You can kill yourself, but you have to do it yourself. Now answer the question.
Refer to the edit. In the process I described, the meat body is wiped by the process; if it failed, the only way I could end up is "dead".
If it was the mind upload you described, I would not undergo it as it's pointless. Or rather, I would see it as some self-fetishistic form of procreation and would do it only as soon as I wanted to bear a digital child who was a copy of me. Naturally, I wouldn't like to share my bank account with them.
You been playing SOMA recently?
SOMA is one of those Muggle Plots that immediately gets solved once you accept the pattern theory of identity.
Literally just put the original in dreamless sleep before making the copy (shouldn't be hard since the original is already an upload; just pause the hardware!), make the copy, and then destroy the original without waking it up.
Before the process, there was one of you in the old body. After the process, there is one of you in the new substrate, which is what we wanted. No one had to experience being left behind. No need for an existential crisis; it is now no different than the Star Trek transporter disassembling your atoms, beaming the information over, and re-assembling you out of new atoms at the target location.
EDIT: Original post defining the term.
This is raised in-game. That's why there's a suicide cult who kill themselves as soon as they're uploaded, and why you have to choose whether you're going to mercy-kill your unconscious original before you go down into the abyss. It's mostly played for tragedy because the original is a perfectly healthy human being who kills themselves or gets murdered, and is essentially thrown away like garbage. The existence of a happy, healthy copy doesn't magically turn scanning someone's brain into a transfer of consciousness rather than the creation of one life and the destruction of another.
In terms of the game, you're stuck playing through the memories of abyss-Simon. You get to play through the experiences of car-crash Simon and woke-up-under-the-sea Simon because those got copied over and now form part of abyss-Simon's memories. The appearance of transferring between them is pure illusion: you possess their memories, but you were never either of them. There's no process on earth that can transfer you into the mind of ark-Simon, because there's no process that can transfer you into anyone's mind.
I scrolled one screen down in your first link and the concept as proposed by EY already looks retarded. According to him, an unliving database entry is the same as a human (complete with deleting it being murder) because it's a "unique store of knowledge and experience".
This is exactly what the existential crisis is about. If Star Trek fans didn't mind it back in the day, I can only guess it was because they weren't very philosophical about the setting.