
Small-Scale Question Sunday for March 2, 2025

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


The best person to speculate about a copy of myself would be me. And I don't think that would happen.

The copying process may not be perfect. There could be bit flips. And that's assuming that your pattern will actually think like you do: that the difference in hardware will not matter and that the uploading process actually captures everything that makes you tick.

We're talking about a hypothetical technology, one that might pass through decades of improvement and iteration.

The minimum viable product is one that isn't a gibbering mess after destructive scanning. The ideal is one that is absolutely indistinguishable (outside of being in a computer) from the original human and leaves the OG meat intact.

There's plenty of room in between. I'm not the exact same person when I go to bed and wake up the next day, and I don't write a will before I do. The same goes when I consider what I'll be like in a year, or even in decades.

I can't say with 100% certainty what I'll be like in a few decades as is, but I'm reasonably confident that short of a brain disease, I won't become a serial killer or try and suffocate my parents in their sleep.

Scanning the brain in any fashion is the hard bit. Making sure the hardware and simulated environment are congenial is trivial in comparison. If we diverge over time, that's no surprise; even twins do, as do the me of today and the me of next year.

If you don't consider yourself the same after you go to sleep then I don't see how you can justify having any sentimental attachment to the mind upload, who immediately deviates by virtue of existing in a different context.

I don't consider myself to be exactly the same person after any period of time that can be measured on a clock. I think it's quite clear that I'm not particularly bothered by this.

If I can identify with and care about the wellbeing of the person "I" will be in decades, then I can identify with an upload.

The difference is I will never be the upload.

This strikes me as one of the failures of rationalism: attempting to reconstruct the value system from abstract, non-evident first principles such as "pattern theory", and then, when someone's common intuition fails to align with those, declaring common intuition wrong. Not referring to you, since you seem pretty open to other people having different opinions, but rather someone like EY calling it "muggle thinking".

I care about living because of my biological substrate's reward system, not because I intellectually (on what basis?) prefer an existence of a "pattern" of sun_the_second over its nonexistence.

Human intuition is often wrong! Our intuition does not equip us in the least to naturally handle things like relativistic speeds or quantum phenomena. It evolved to handle the narrow range of scales relevant to macroscopic animals.

Even in that domain, our expectations fail us. Newton had to break from Aristotelian mechanical intuitions that had been Plain Common Sense for hundreds of thousands of years.

I care about living because of my biological substrate's reward system, not because I intellectually (on what basis?) prefer an existence of a "pattern" of sun_the_second over its nonexistence.

I believe you, and why wouldn't I? At the end of the day, some values are entirely arbitrary and can't be otherwise (without solving infinite regress). This is as true for me as it is for you.

It is entirely fair for someone to believe that they value living for its own sake, and that their value systems exclude something like an upload. Now, sometimes there might be underlying beliefs or justifications that build on axiomatic bedrock, and those can be wrong when judged against their building blocks.

For example, someone might think it not even theoretically possible to upload a human mind and run its algorithms on a computer. I would, of course, think that they're mistaken on factual grounds, even if it is an enormously difficult feat to pull off.

Intuitive understandings of personal identity are often not robust in the least, but I'm confident mine are far more internally consistent than most people's (especially since many have never even thought about the question).

That being said, if someone recognizes that an upload might have the same thoughts and values as them, including a belief that uploads aren't moral persons or identical to the original (which might be overruled by self-interest; it's hard to make a man accept something when his literal existence hinges on not accepting it), and they still disagree with me on the moral equivalence of their own uploads, then I can't argue further.

We would agree on the facts, and disagree on their implications. I know that my mutual respect for any forks of myself drastically increases the odds that they will respect me back, and that we would be able to achieve peaceful co-existence and mutual aid. That is a fortunate fact about me; if someone would hate a copy of themselves, then their copy is going to hate them, and they shouldn't make one.

Human intuition is often wrong!

Yes, but I don't see how something like "we perceive our own continuity" can be proven wrong.

I know that my mutual respect for any forks of myself drastically increases the odds that they will respect me back, and that we would be able to achieve peaceful co-existence and mutual aid. That is a fortunate fact about me; if someone would hate a copy of themselves, then their copy is going to hate them, and they shouldn't make one.

There is a dystopian scenario in the writing where people like me are conditioned to brainwash themselves into ego death, because being able to productively fork oneself and share resources between forks, without regard to each fork's individual need for them, is the only way to stay competitive.

Yes, but I don't see how something like "we perceive our own continuity" can be proven wrong.

I don't think I'm disputing that, as far as I can tell.

There is a dystopian scenario in the writing where people like me are conditioned to brainwash themselves into ego death, because being able to productively fork oneself and share resources between forks, without regard to each fork's individual need for them, is the only way to stay competitive.

Cheer up! It's very unlikely that human uploads will be enslaved for cognitive labor; we're likely not cost-competitive with dedicated AI. Something like the Age of Ems or Pantheon requires technology to advance in ways that it very much does not appear to be advancing.* If you want to allow for enormous levels of pruning and optimizing to achieve parity, I strongly expect you won't have much recognizably human left by the time you're done. So you can't really compete solely through Molochian self-abnegation, and you probably won't be expected to. Of course, I can't rule out that things go south in other ways and we all starve regardless.

*We can't properly emulate a nematode, but we have synthetic AI that can do graduate level mathematics.