Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
That is a strict improvement over the status quo.
I'm not a biological chauvinist, and I think that the upload has equal claim to my name and assets. I also expect that unless things go really awry, the human version would probably end up acquiring biological immortality eventually. Destructive scans seem much easier than ones that preserve the original, but the latter isn't a bad thing as far as I'm concerned. It always leaves the option of another upload, one when the original is closer to death.
Even if that weren't the case, I'd rest easier. Growing old and dying of old age sucks, but it is a great comfort to have seen your grandkids be born, follow in your footsteps, and flourish. You can die with fewer regrets. In the same manner, if I had a backup, even one that would outlive me, I'd wish it well, and know that it would share most of my goals and desires, and would mourn my passing.
Or feel relief about not having some progenitor who's seen as more-real-than-you hanging around anymore.
I suspect there will be all kinds of dysfunctions with the uploads themselves and revolving around them. The psychologists of the future will have their quackery cut out for them.
The best person to speculate about a copy of myself would be me. And I don't think that would happen.
The copying process may not be perfect. There could be bit flips. And that's assuming that your pattern will actually think like you do; that the difference in hardware will not matter, and that the uploading process actually captures everything that makes you tick.
We're talking about a hypothetical technology, one that might pass through decades of improvement and iteration.
The minimum viable product is one that isn't a gibbering mess after destructive scanning. The ideal is one that is absolutely indistinguishable (outside of being in a computer) from the original human and leaves the OG meat intact.
There's plenty of room in between. I'm not the exact same person when I go to bed and wake up the next day, and I don't write a will before I do. Or when I consider what I'll be like in a year, or even decades.
I can't say with 100% certainty what I'll be like in a few decades as is, but I'm reasonably confident that short of a brain disease, I won't become a serial killer or try and suffocate my parents in their sleep.
Scanning the brain in any fashion is the hard bit. Making sure the hardware and simulated environment are congenial is trivial in comparison. If we diverge over time, that's no surprise; even twins do, as do the me of today and the me of next year.
If you don't consider yourself the same after you go to sleep then I don't see how you can justify having any sentimental attachment to the mind upload, who immediately deviates by virtue of existing in a different context.
I don't consider myself to be exactly the same person after any period of time that can be measured on a clock. I think it's quite clear that I'm not particularly bothered by this.
If I can identify with and care about the wellbeing of the person "I" will be in decades, then I can identify with an upload.
The difference is I will never be the upload.
This strikes me as one of the failures of rationalism: attempting to reconstruct the value system from abstract and non-evident first principles such as "pattern theory", and then, when someone's common intuition fails to align with those, declaring common intuition wrong. Not referring to you, since you seem pretty open to other people having different opinions, but rather someone like EY calling it "muggle thinking".
I care about living because of my biological substrate's reward system, not because I intellectually (on what basis?) prefer the existence of a "pattern" of sun_the_second over its nonexistence.
Human intuition is often wrong! Our intuition does not equip us in the least to naturally handle things like relativistic speeds or quantum phenomena. It evolved to handle the narrow range of scales that was relevant to macroscopic animals.
Even in that domain, our expectations fail us. Newton had to break from Aristotelian mechanical intuitions that had been Plain Common Sense for hundreds of thousands of years.
I believe you, and why wouldn't I? At the end of the day, some values are entirely arbitrary and can't be otherwise (without solving infinite regress). This is as true for me as it is for you.
It is entirely fair for someone to believe that they value living for its own sake, and that their value systems exclude something like an upload. Now, sometimes there might be underlying beliefs or justifications that build on axiomatic bedrock, and those can be wrong when judged against their building blocks.
For example, someone might think it not even theoretically possible to upload a human mind and run its algorithms on a computer. I would, of course, think that they're mistaken on factual grounds, even if it is an enormously difficult feat to pull off.
Intuitive understandings of personal identity are often not robust in the least, but I'm confident mine are far more internally consistent than most people's (especially since many have never even thought about the question).
That being said, if someone recognizes that an upload might have the same thoughts and values as them, including a belief that uploads aren't moral persons or identical to the original (which might be overruled by self-interest; it's hard to make a man accept something when his literal existence hinges on not accepting it), and they still disagree with me on the moral equivalence of their own uploads, then I can't argue further.
We would agree on the facts, and disagree on their implications. I know that my mutual respect for any forks of myself drastically increases the odds that they will respect me back, and that we would be able to achieve peaceful co-existence and mutual aid. That is a fortunate fact about me; if someone would hate a copy of themselves, then their copy is going to hate them, and they shouldn't make one.