Small-Scale Question Sunday for June 9, 2024

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

we would be able to identify numerous specific people who we would have actual scientific proof will only contribute to society in highly negative ways, and we'll have to decide what to do with those people.

Sorry to fight the hypothetical, but I really doubt many people like this exist. Let's say you possess a computer capable of simulating the entire universe. Figuring out the future of a specific bad person based on simulations is only step one. After that there are a practically infinite number of simulation variations. What happens if he gets a concussion? If he gets put on this medication (which we can also perfectly simulate)? If all of his family members come back to life and tell him in unison to shape up?

This is godlike power we're talking about. The ability to simulate a human means the ability to simulate that human's response to any possible molecule or combination of molecules introduced in any manner. If there is even a conceptually possible medication which may help this person then this computer--which we've established can simulate the universe--will be able to find it. Ditto for any combination of events, up to and including brainwashing and wholly replacing their brain with a new brain.

The interesting question to me is not whether these people can be "saved" and made into productive citizens. In my opinion that's a foregone conclusion. The question is at what point this turns from helping someone into forcing them against their will into an outcome their previous self would consider undesirable, and whether doing so is nevertheless moral. I think not--you may as well create new people rather than modifying existing ones drastically, and do with the existing ones as you will.

Would we eliminate them? We could lock them away for life, but that's expensive; should we bother if we know they will never reform? Also, our current criminal justice system in most of the first world locks people away for a pre-determined length of time once we prove they did a specific bad thing. It's rather a departure to be saying "our mind-scanning computer says you'll always be bad, so we're going to lock you away for life, or do actual brain editing on you to make you act right." Definitely can't see that one going wrong in any way.

To engage with the actual question you're asking--what do we do with people who are just innately bad?--I definitely think locking people up is fine morally. These simulations are supposed to be infallible, after all. If you feel like you need some justification to lock them up, just use the simulation to contrive a scenario where they do a non-negligible amount of harm but don't actually kill someone, and then lock them up after they've done it.