This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I'm the dancer, not the dance. I'm the waves, not the water.
To be a little more concrete:
Every part of your body is endlessly recycled while you're alive. This is true for every structure more complicated than fundamental particles. If two electrons have the same mass, spin, charge and other quantum numbers, then it's impossible to distinguish between them.
The majority of the atoms in my body have swapped around since I was born. I have a consistent self-identity that is conserved even if I have a banana for breakfast or take a big dump, or if I go to bed and wake up tomorrow. I'm never exactly the same, any more than a river is. Yet a river and a self_made_human are consistent entities about which it is possible to make broad statements.
So it can't be pure and perfect identity. It can't be truly continuous consciousness, unless one wishes to believe that sleep is lethal.
What remains are the patterns of information and the algorithms that act on them. If I copy a png of a flower and share it with you, there is nothing lost or gained in the process, assuming standard error correction.
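To make that concrete, here's a toy sketch (the filenames are hypothetical, and the particular hash doesn't matter): copying the file reproduces the pattern exactly, and a checksum can't tell the original from the copy.

```python
import hashlib
import shutil

def sha256_of(path: str) -> str:
    # Hash the raw bytes, i.e. the information content of the file.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# "flower.png" is a hypothetical stand-in for any file you like.
shutil.copyfile("flower.png", "flower_copy.png")

# The copy is bit-for-bit identical: nothing was lost or gained.
assert sha256_of("flower.png") == sha256_of("flower_copy.png")
```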
Right now, these algorithms and their data are instantiated in meat machinery: neurons.
Yet the neurons churn. And eventually, without advanced technological intervention, they will die and take me with them.
If you can perform addition using an abacus, a TI-84 and a supercomputer, then in a very real sense they're all doing the same thing. It doesn't make much sense to say that your CPU can't actually add numbers.
(Ignore details such as how the floating-point arithmetic would work; that's beside the point.)
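A toy way to see the "same thing" claim: the same abstract operation can run on wildly different substrates and still agree. Here's addition done with the CPU's native integers and with a crude digit-by-digit routine of my own (purely illustrative; nothing about this particular code is load-bearing):

```python
def abacus_add(a: str, b: str) -> str:
    """Add two non-negative integers digit by digit, schoolbook style,
    without leaning on the CPU's built-in adder."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)
    carry, digits = 0, []
    for da, db in zip(reversed(a), reversed(b)):
        carry, d = divmod(int(da) + int(db) + carry, 10)
        digits.append(str(d))
    if carry:
        digits.append(str(carry))
    return "".join(reversed(digits))

# Different substrate, same abstract operation, same answer.
assert abacus_add("1984", "4096") == str(1984 + 4096)  # both give 6080
```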
I think there's no fundamental barrier to extracting the algorithms and information in my neurons and creating a replica in silico. It is a ridiculously difficult engineering challenge, but not something forbidden by the laws of physics.
I can dig up a link, but we already know that artificial neural networks can near-perfectly replicate the behavior of their biological counterparts (usually at something like a 1000:1 ratio of artificial units to biological neurons). You can make them arbitrarily more precise, to the point where the original brain is the noisier of the two.
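The work I have in mind fits deep networks to detailed biophysical models of cortical neurons, which, as I recall, is where the roughly 1000:1 figure comes from. As a much cruder cartoon of the idea (the leaky integrate-and-fire neuron, the window size and the network shape below are my own arbitrary choices, not anything from those papers), here's a small network learning to predict when a simple model neuron will spike, given its recent input:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def lif_spikes(current, tau=20.0, threshold=1.0, dt=1.0):
    """Leaky integrate-and-fire: the membrane potential leaks, integrates
    the input current, and emits a binary spike (then resets) whenever it
    crosses threshold. A deliberately crude stand-in for a real neuron."""
    v, spikes = 0.0, []
    for i in current:
        v += dt * (-v / tau + i)
        if v >= threshold:
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return np.array(spikes)

# Simulate the "biological" neuron on noisy input current.
current = rng.normal(0.1, 0.1, size=20_000)
spikes = lif_spikes(current)

# Training data: the last 50 input samples -> does the neuron spike now?
window = 50
X = np.lib.stride_tricks.sliding_window_view(current, window)
y = spikes[window - 1:]

net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
net.fit(X[:15_000], y[:15_000])

print("spike rate:        ", y.mean())
print("held-out accuracy: ", net.score(X[15_000:], y[15_000:]))
```

Spikes are rare in this toy setup, so compare the printed accuracy against the trivial "never spikes" baseline of one minus the spike rate; the only point is that a statistical model can soak up the input-output behavior of a mechanistic one.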
Hence, I want my mind to be uploaded into a computer. It's more robust than flesh, and unlocks far more scope for improvement. Going from a dumb baby to a reasonably intelligent adult didn't kill me, so I don't think becoming more intelligent will.
I'm also fine with multiple copies of myself running around. All that matters is that they start out behaviorally indistinguishable from me to a blinded observer. If there's a "copy" of myself sitting in a black box, which says the exact same things, acts like me in simulation, and so on, then I accept that as me.
Consciousness isn't computation - it's fundamentally embedded in biological processes. It also doesn't emerge from neural networks, regardless of how well they mimic the behaviours of real humans. Neural networks are statistical models, while you are your un-statistical emotions, your hormonal systems, your microbiome, and the other physical systems within your body. If you just extract the consciousness + the memories, just the raw contents of your brain, and put them into the machine, you lose everything else, which is arguably the most important part. You get an alien consciousness. Your consciousness is your consciousness BECAUSE of all of those icky yucky things attached to your brain, not DESPITE them. If you replace them, why do you assume continuity?
All of this irreducible complexity can't be reimplemented by assuming that everything is an algorithm. Emotions aren't algorithmic abstract patterns; they are complex interactions between neurons and other biological systems, and they are a fundamental part of the biological reality that makes you "you". Omitting them makes you a spider, an alien.
Hang on. Please note that you're just saying these things.
Why isn't consciousness computational? I mean, I can't prove that it is either, but your claim just assumes the opposite. The correct stance is agnosticism, though I think it'll turn out to have a mechanistic explanation eventually.
How do you know, with any confidence at all, that something like an LLM isn't conscious? It might not be conscious in the same manner as humans, but the same might be true for birds or octopi. They demonstrate all the hallmarks of intelligence, even if it's not the human kind.
The human body operates on biology. Which is abstracted chemistry. Which is abstracted physics. Which can be mathematically modeled. There's nothing in the human body or brain that violates the laws of physics; you need supernovae or massive particle colliders to produce behavior the Standard Model can't explain (leaving aside dark matter and energy, which aren't relevant to human biology).
That physics, while intractable to compute at the quantum level or even the microscopic scale, still holds true. A lot of it can be usefully described and wrangled with statistics.
Well, emulate the body too! The neurons are electrochemical, and surprisingly binary, in the sense that they're either firing or they aren't. This behavior can be well approximated.
If a disconnected brain emulation would go nuts, then with that kind of tech you can trivially design a virtual body with the usual sensory modalities.
The brain is also very noisy. You can probably get away with saving a lot of computation by approximating events at the chemical scale. Not every random jiggle of proteins matters; it simply can't at scale.
I am not convinced that this complexity is irreducible at all. A single neuron, or even a thousand, misfiring? Happens all the time, and doesn't matter. An emulation can withstand a lot of noise, because the object it's representing is also noisy.
Emotions can be both algorithmic patterns and the product of a complex interplay between systems. All that really changes is that the algorithms in question become more complex.
This is already accounted for. Estimates for the amount of compute needed for a brain emulation vary by multiple orders of magnitude. I never said this would be easy. It just isn't impossible; look, something evolution cobbled together manages it. We even have our own alien artificial intelligences that can run on your phone.
If that was really the outcome, I'd take it over death when my fragile biological form fails me. I don't think this is likely at all, beyond the first imperfect uploads.
Thank you for the discussion! I think it heavily veers into sci-fi territory, but it's fun to think about.
https://youtube.com/watch?v=GKnAWcWnJJc
It turns out I must have already watched that video, as YouTube shows that I liked it, haha.
Of course, everything I've said is speculative, but it's modestly informed speculation. All future advances are sci-fi until they're not, we'll have to strap in for the ride and see where it takes us.