Culture War Roundup for the week of March 24, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Consciousness isn't computation - it's fundamentally embedded in biological processes. It also doesn't emerge from neural networks, regardless of how well they mimic the behaviours of real humans. Neural networks are statistical models, while you are your un-statistical emotions, you are your hormonal systems, microbiomes, other physical systems within your body. If you just extract the consciousness + the memories, just the raw contents of your brain, and put them into the machine, you lose everything else, which is arguably the most important part. You get an alien consciousness. Your consciousness is your consciousness BECAUSE of all of those icky yucky things attached to your brain, not DESPITE them. If you replace them, why do you assume continuity?

All of this irreducible complexity can't be reimplemented by assuming that everything is an algorithm. Emotions aren't algorithmic abstract patterns, they are complex interactions between neurons and other biological systems and they are a fundamental part of the biological reality that makes you "you". Omitting them makes you a spider, an alien.

Consciousness isn't computation - it's fundamentally embedded in biological processes. It also doesn't emerge from neural networks, regardless of how well they mimic the behaviours of real humans.

Hang on. Please note that you're just saying these things.

Why isn't consciousness computational? I mean, I can't prove that it is either, but denying it outright is just assuming the opposite. The correct stance is agnosticism, though I think it will turn out to have a mechanistic explanation eventually.

How do you know, with any confidence at all, that something like an LLM isn't conscious? It might not be conscious in the same manner as humans, but the same might be true of birds or octopuses. They demonstrate all the hallmarks of intelligence, even if it's not the human kind.

Neural networks are statistical models, while you are your un-statistical emotions, you are your hormonal systems, microbiomes, other physical systems within your body.

The human body operates on biology, which is abstracted chemistry, which is abstracted physics, which can be mathematically modeled. There's nothing in the human body or brain that violates the laws of physics; you need supernovae or massive particle colliders to produce behavior the Standard Model can't explain (leaving aside dark matter and dark energy, which aren't relevant to human biology).

That physics, while intractable to compute at the quantum level or even the microscopic scale, still holds true. A lot of it can be usefully described and wrangled with statistics.

If you just extract the consciousness + the memories, just the raw contents of your brain and put them into the machine, you lose everything else, which is arguably the most important part.

Well, emulate the body too! The neurons are electro-chemical, and surprisingly binary, in the sense that they're either firing or they aren't. This behavior can be well approximated.
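To make the "firing or they aren't" point concrete, here's a toy leaky integrate-and-fire simulation, the standard minimal abstraction of a spiking neuron: voltage leaks toward rest, integrates input, and emits an all-or-nothing spike at threshold. Every parameter here (`tau`, `r_m`, the voltages) is a made-up round number for illustration, not measured biology:

```python
def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=-0.065,
                 v_threshold=-0.050, v_reset=-0.065, r_m=1e7):
    """Leaky integrate-and-fire: membrane voltage decays toward rest,
    integrates the input current, and emits a binary spike at threshold.
    Toy parameters, chosen only to produce plausible-looking dynamics."""
    v = v_rest
    spikes = []
    for step, current in enumerate(input_current):
        # Euler step of dv/dt = (-(v - v_rest) + R*I) / tau
        v += dt * (-(v - v_rest) + r_m * current) / tau
        if v >= v_threshold:
            spikes.append(step)  # the all-or-nothing firing event
            v = v_reset          # membrane resets after a spike
    return spikes

# A constant suprathreshold input yields a regular spike train;
# a weaker input never crosses threshold and yields none.
regular = simulate_lif([2e-9] * 1000)
silent = simulate_lif([1e-9] * 1000)
```

The point of the sketch is that the analog chemistry collapses into a discrete event stream: downstream, only the spike times matter, which is exactly what makes the behavior approximable.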

If a disconnected brain emulation goes nuts, then if you have that kind of tech, you can trivially design a virtual body with the usual sensory modalities.

The brain is also very noisy. You can probably get away with saving a lot of computation by approximating events at the chemical scale. Not every random jiggle of proteins matters, it simply can't at scale.

All of this irreducible complexity can't be reimplemented by assuming that everything is an algorithm.

I am not convinced that this complexity is irreducible at all. A single neuron, or even a thousand, misfiring? Happens all the time. Doesn't matter. An emulation can withstand a lot of noise, because the object it's representing is also noisy.
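As a toy illustration of that noise tolerance (my own sketch, not a neuroscience model): pool the votes of a redundant population of units, let a fraction of them misfire at random, and the aggregate readout barely moves. All names and parameters are invented for the example:

```python
import random

def population_readout(signal, n_units=1000, misfire_rate=0.01, seed=42):
    """Redundancy sketch: each unit fires iff the signal is present,
    except that a small random fraction misfire and flip their vote.
    The pooled average still recovers the signal reliably."""
    rng = random.Random(seed)
    votes = 0
    for _ in range(n_units):
        fires = signal
        if rng.random() < misfire_rate:
            fires = not fires  # a random misfire flips this unit's vote
        votes += 1 if fires else 0
    return votes / n_units  # pooled firing rate across the population

# With 1,000 units and a 1% misfire rate, roughly ten units are wrong
# on any trial, yet the readout stays close to 1 for a present signal
# and close to 0 for an absent one.
present = population_readout(True)
absent = population_readout(False)
```

Thousands of individual errors wash out in the aggregate, which is the sense in which an emulation doesn't need to get every neuron right.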

Emotions aren't algorithmic abstract patterns, they are complex interactions between neurons and other biological systems and they are a fundamental part of the biological reality that makes you "you".

Emotions can be both algorithmic patterns and the product of a complex interplay between systems. All that really changes is that the algorithms in question become more complex.

This is already accounted for. Estimates of the compute needed for a brain emulation vary by multiple orders of magnitude. I never said this would be easy; it just isn't impossible - look, something evolution cobbled together manages it. We even have our own alien artificial intelligences that can run on your phone.

Omitting them makes you a spider, an alien.

If that were really the outcome, I'd take it over death when my fragile biological form fails me. I don't think it's likely at all, beyond the first imperfect uploads.

Thank you for the discussion! I think it heavily veers into sci-fi territory, but it's fun to think about.

https://youtube.com/watch?v=GKnAWcWnJJc

It turns out I must have already watched that video, as YouTube shows that I liked it, haha.

Of course, everything I've said is speculative, but it's modestly informed speculation. All future advances are sci-fi until they're not; we'll have to strap in for the ride and see where it takes us.