In many discussions I'm pulled back to the distinction between not-guilty and innocent as a way to demonstrate how the burden of proof works and what the true default position should be in any given argument. Many people have no trouble seeing the distinction, but some otherwise intelligent people, for some reason, don't see it.

In this article I explain why the distinction exists and why it matters, in particular in real-life scenarios where people try to shift the burden of proof.
Essentially, in my view the universe of possibilities is {uncertain, guilty, innocent}; therefore not-guilty is guilty′, the complement of guilty, which is {uncertain, innocent}. It follows that innocent ⇒ not-guilty, but not-guilty ⇏ innocent.
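As a minimal sketch, the whole argument can be checked mechanically with sets (the labels are illustrative, not part of the original argument):

```python
# A minimal sketch of the set model above; labels are illustrative.
universe = {"uncertain", "guilty", "innocent"}

guilty = {"guilty"}
not_guilty = universe - guilty          # guilty', the complement of guilty
assert not_guilty == {"uncertain", "innocent"}

innocent = {"innocent"}
assert innocent <= not_guilty           # innocent => not-guilty
assert not (not_guilty <= innocent)     # not-guilty does not imply innocent
```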
When O. J. Simpson was acquitted, that doesn't mean he was found innocent; it means the prosecution could not prove his guilt beyond a reasonable doubt. He was found not-guilty, which is not the same as innocent. It may well be that the jury found the truth of the matter uncertain.
This notion has implications in many real-life scenarios where people try to shift the burden of proof when you reject an unsubstantiated claim. They wrongly assume you are claiming their claim is false (the equivalent of innocent), when in truth all you are doing is remaining in the default position (uncertain).
Rejecting the claim that a god exists is not the same as claiming a god doesn't exist: it doesn't require a burden of proof because it's the default position. Agnosticism is the default position. The burden of proof is on the people making the claim.
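To make the parallel explicit, here is the same set model applied to an arbitrary claim; again just a sketch with illustrative labels:

```python
# The verdict model applied to any claim.
positions = {"uncertain", "claim-is-true", "claim-is-false"}

accepting = {"claim-is-true"}
rejecting = positions - accepting       # {"uncertain", "claim-is-false"}

# Rejecting a claim does not commit you to "claim-is-false";
# the default position, "uncertain", is still on the table.
assert "uncertain" in rejecting
assert rejecting != {"claim-is-false"}
```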
I wish this argument were true.

Unfortunately it is inept. Have you any idea of the energy and computing power needed to simulate a universe? Have you any idea of the energy and computing power needed to simulate a bottle of water at the atomic level? A single cell? We can barely exhaustively study quantum systems with more than 2 particles, IIRC.
The universe has finite resources, and the constraints of mathematics are universal or even meta-universal, and so is computational complexity (https://en.wikipedia.org/wiki/Computational_complexity). Your hypothetical aliens must bend to those extreme limits. Even mankind has already mostly reached them; we have extreme diminishing returns everywhere.
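To put rough numbers on that point, a back-of-envelope sketch (treating each particle as a single qubit, which is a generous simplification on my part):

```python
# The memory needed to store a full quantum state vector grows exponentially
# with particle count: 2**n complex amplitudes for n qubits.
for n in (10, 50, 300):
    amplitudes = 2 ** n                  # size of the state vector
    terabytes = amplitudes * 16 / 1e12   # 16 bytes per complex128 amplitude
    print(f"{n} particles -> 2^{n} amplitudes -> {terabytes:.3e} TB")
```

At 300 particles the state vector already needs more bytes than there are atoms in the observable universe.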
Bostrom did the maths: he found that a planetary-scale computer operating on principles we know has more than enough computing power, provided you take a few shortcuts. Only go to the full quantum simulation when it's actually needed, like if there's an electron microscope pointing at the target. The double-slit phenomenon is one example of the kind of method they could use.
Why do people assume that a simulation needs to be perfect down to the very last particle? Most of that stuff is just noise. You can have everything outside the solar system be an elaborate skybox, and abstract away most of the Earth's crust aside from tectonic drift and volcanoes/earthquakes. The Sun can be greatly simplified.
Furthermore, what stops them using methods unknown to us, or methods that they deliberately leave out of the simulation? One might just as well say it would be prohibitively expensive for Minecraft inhabitants to build a Minecraft computer capable of simulating the world they observe in Minecraft: they can't automate the construction of redstone circuitry like we can, and there is no notion of quantum computing at all in Minecraft. Say there's such a thing as 'Quantum Computing 2' using more advanced physics that we can't even access; that would make it very easy. But that's not even required, according to Bostrom's analysis.
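For what it's worth, the 'shortcuts' idea is essentially lazy evaluation: only pay for fidelity where an observer would notice. A hypothetical sketch (all names are made up for illustration):

```python
# Observation-driven level of detail: cheap by default, expensive on demand.
def simulate_quantum(region: str) -> str:
    return f"{region}: full quantum-level simulation (expensive)"

def simulate_classical(region: str) -> str:
    return f"{region}: coarse classical approximation (cheap)"

def simulate(region: str, under_observation: bool) -> str:
    # Escalate to the expensive model only when something, say an
    # electron microscope, would notice the difference.
    if under_observation:
        return simulate_quantum(region)
    return simulate_classical(region)

print(simulate("lab sample", under_observation=True))
print(simulate("Earth's mantle", under_observation=False))
```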