Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
If you don't consider yourself the same after you go to sleep then I don't see how you can justify having any sentimental attachment to the mind upload, who immediately deviates by virtue of existing in a different context.
I don't consider myself to be exactly the same person after any period of time that can be measured on a clock. I think it's quite clear that I'm not particularly bothered by this.
If I can identify with and care about the wellbeing of the person "I" will be in decades, then I can identify with an upload.
The difference is I will never be the upload.
This strikes me as one of the failures of rationalism: attempting to reconstruct one's value system from abstract, non-self-evident first principles such as "pattern theory", and then, when someone's common intuition fails to align with those principles, declaring common intuition wrong. I'm not referring to you, since you seem pretty open to other people having different opinions, but rather to someone like EY calling it "muggle thinking".
I care about living because of my biological substrate's reward system, not because I intellectually (on what basis?) prefer the existence of a "pattern" of sun_the_second over its nonexistence.
Human intuition is often wrong! Our intuition does not equip us in the least to naturally handle things like relativistic speeds or quantum phenomena. It evolved to handle the narrow ranges or scales that were relevant to macroscopic animals.
Even in that domain, our expectations fail us. Newton had to break from Aristotelian mechanical intuitions that had been Plain Common Sense for hundreds of thousands of years.
I believe you, and why wouldn't I? At the end of the day, some values are entirely arbitrary and can't be otherwise (without solving infinite regress). This is as true for me as it is for you.
It is entirely fair for someone to believe that they value living for its own sake, and that their value systems exclude something like an upload. Now, sometimes there might be underlying beliefs or justifications that build on axiomatic bedrock, and those can be wrong when judged against their building blocks.
For example, someone might think it not even theoretically possible to upload a human mind and run its algorithms on a computer. I would, of course, think that they're mistaken on factual grounds, even if it is an enormously difficult feat to pull off.
Intuitive understandings of personal identity are often not robust in the least, but I'm confident mine are far more internally consistent than most people's (especially since many have never even thought about the question).
That being said, if someone recognizes that an upload might have the same thoughts and values as them, including a belief that uploads aren't moral persons or identical to the original (a belief which might be overruled by self-interest; it's hard to make a man accept something when his literal existence hinges on not accepting it), and they still disagree with me on the moral equivalence of their own uploads, then I can't argue further.
We would agree on the facts and disagree on their implications. I know that my mutual respect for any forks of myself drastically increases the odds that they will respect me back, and that we would be able to achieve peaceful co-existence and mutual aid. That is a fortunate fact about me; if someone would hate a copy of themselves, then their copy is going to hate them back, and they shouldn't make one.
Yes, but I don't see how something like "we perceive our own continuity" can be proven wrong.
There is a dystopian scenario in the making where people like me are conditioned to brainwash themselves into ego death, because the only way to stay competitive is to productively fork oneself and shuffle resources between forks without regard to any individual fork's need for them.
I don't think I'm disputing that, as far as I can tell.
Cheer up! It's very unlikely that human uploads will be enslaved for cognitive labor; we're probably not cost-competitive with dedicated AI. Something like the Age of Ems or Pantheon requires technology to advance in ways that it very much does not appear to be advancing.* And if you want to allow for enormous amounts of pruning and optimization to achieve parity, I strongly expect you won't have much recognizably human left by the time you're done. So you can't really compete solely by Molochian self-abnegation, and you probably won't be expected to. Of course, I can't rule out that things go south in other ways and we all starve regardless.
*We can't properly emulate even a nematode, yet we have synthetic AI that can do graduate-level mathematics.