Culture War Roundup for the week of January 27, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Wait, is this actually what they believe, or are you exaggerating? This is Scientology levels of completely delusional.

From the dive I did, I'd say that sounds reasonably accurate. I linked the glossary below if you want to dive yourself. That, combined with the report on the attempted murder of their landlord and the personal accounts related to it, was more than enough to identify Ziz as ten pounds of crazy in a two-pound sack.

I'm not sure "timeless-decision-theoretic-blackmail-absolute-morality theory" is the term they actually used, but I'm not sure it's not the term either, and it seems like a reasonably accurate description from what I recall.

I linked the glossary below if you want to dive yourself.

Thanks, I appreciate it. I did look at the lesswrong post, and I was bewildered at how seriously the commenters took the ideas of alternate personalities and behavior modification; for people who declare themselves scientific and anti-superstition they seem pretty stitious to me. Family systems therapy pervades the commentariat, which I find rather disturbing; I was sold on it as a tool to help people deal with mental illness, not a means to manipulate or explain the world in real terms, but they're treating it like real-world magic. If this is what people mean by "therapy culture" then I agree wholeheartedly with the criticism of it. At this point, I'm ready to declare family systems therapy a cause of psychogenic illness; whatever good it might have done, it's now clearly driving people mad.

That being said, I try to avoid prying too deeply into either delusional thinking or true crime; the former I fear might infect me (though I don't have a genetic predisposition to schizophrenia), and the latter just makes me angry. I have finite grey matter, and I'd rather spend it on things that don't make me feel like the only sane man in an insane world. We need connection to reality, to other people, to ordinary people, and to people of different perspectives to remain sane, and this is a great example of why.

At this point, I'm ready to declare family systems therapy a cause of psychogenic illness; whatever good it might have done, it's now clearly driving people mad.

Scott had a whole book review where he strongly suggested this was true. One of the main practitioners of family systems therapy wrote a book claiming that demons are real and that he was literally exorcising spirits. Scott thought he was just creating psychogenic illness in people.

In and of itself, there is nothing particularly weird about having fictional characters in your head: many famous authors talk to their characters while they're out and about in order to round out said characters' personalities. Children have imaginary friends, artists have muses. And it seems entirely plausible that if you go on doing it for long enough, you will start habitually supporting this kind of 'virtual machine' of another person in your head, in the same way that Docker environments run on a virtual machine inside your PC. I tried it for a couple of weeks with my favourite character from the novel I was writing, until I realised that I actually didn't want to never be alone in my own head. It works, more or less.

Of course, the true believers run away with it. 'My tulpa is real in the sense that this thought pattern currently exists in my brain' becomes 'my tulpa is an entity deserving of respect and ethical treatment' becomes 'I am a system of 32 personalities, none of whom claims precedence'. Imaginary friends, being imaginary, become whatever you imagine them to be. And if you're asking your imaginary friends to help you perform self-therapy on the already warped and delusional brain that spawned them, that isn't going to end well for anybody.

in the same way that Docker environments run on a virtual machine inside your PC

[pushes up glasses]

Well actually, virtual machines and containers are different things. It is certainly possible to run containers inside a VM, but a VM is not strictly necessary.

(OK, in fairness, I think Docker in particular relies on features of the Linux kernel, namely cgroups and namespaces, so e.g. Docker Desktop on Mac or Windows will indeed spin up a Linux VM)
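
To make the distinction concrete, here's a minimal sketch (my own illustration, not anything from the thread; it assumes Linux, Python 3.12+, and CAP_SYS_ADMIN): namespaces are the kernel primitive Docker builds isolation on, and you can use one with no VM in sight.

    import os
    import socket

    # Give this process its own UTS namespace, then change "its" hostname.
    # No VM involved: this is plain kernel-level isolation, the primitive
    # Docker containers are built on (together with cgroups).
    os.unshare(os.CLONE_NEWUTS)    # requires Linux and Python 3.12+
    socket.sethostname("sandbox")  # visible only inside this namespace
    print(socket.gethostname())    # prints "sandbox"; the host is untouched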

/pedantry

I was bewildered at how seriously the commenters took the ideas of alternate personalities and behavior modification; for people who declare themselves scientific and anti-superstition they seem pretty stitious to me.

These don't sound particularly anti-scientific to me. At least, not magically so.

Technobabble is indistinguishable from religious invocations. Chanting to the Machine God is silly to us because the recognizable words we understand are mutated, but stringing technological-sounding terms together into a single compound word, like German gone wild, is exactly that. Dressing up a wrong scientific concept, like the claim that 90% of your brain is unused, or biohacking through blood transfusions, is just a misreading of reality, like sacrificing virgins on the solstice for a good harvest.

See, your examples sound silly to me because those specific ones are implausible/debunked. Behaviour modification, on the other hand, sounds like this quaint "building a habit" thing.

As someone who drank like a fish when I was younger, but then had an experience which reduced my alcohol consumption to a few glasses per year, I have to wholeheartedly agree with you that modifying behavior is in fact possible.

They called it timeless-decision-theoretic-blackmail-absolute-morality theory on lesswrong

Related discussion on LW, with linkbacks to the blog in question. That the actual article, titled "The Multiverse", is somehow missing from every archive snapshot (while definitely having existed at some point, judging by linkbacks from the post) is too ironic to be put into words; I'm actually curious now.

Thanks for killing a few hours of my wageslavery; fascinating rabbit hole.

Thanks for the link. Slimepriestess
(★ Postbrat ★ Ex-Rat ★ Anarchist ★ Antifascist ★ Vegan ★ Qualia Enjoyer ★ Queer Icon ★ Not A Person ★ it/its ★) is the main Ziz advocate on LW, and the one whose YouTube podcast I linked above, in which they supported the murder of the elderly man.

★ Postbrat

What does that mean? Is it using the term in a BDSM context, or referring to the Charli XCX album?

★ Not A Person ★ it/its

TV Tropes needs an update.

Slimepriestess

What I expected / What I got

In seriousness, I instantly knew from le quirky nickname before I even checked the vid, but it's not any less sad. Starting to think I really prefer the gamepad-eating """nerdy""" girls of yore over the nerdy """girls""" of today. Monkey's paw curls.

I'm pretty sure that's not how it works, since almost anything to do with timeless decision theory is basically incomprehensible and could never be dumbed down into something as concrete as stabbing your landlord with a sword. If you're killing someone in the name of Wittgenstein or Derrida, you're doing something wrong (on several levels). Maoism, on the other hand, smiles upon executing landlords.

could never be dumbed down into something as concrete as stabbing your landlord with a sword.

As the meme goes, you are like a little baby. Watch this.

The government is something that can be compromised by bad people. And so, giving it tools to “attack bad people” is dangerous, they might use them. Thus, pacts like “free speech” are good. But so is individuals who aren’t Nazis breaking those rules where they can get away with it and punching Nazis.

<...>

If you want to create something like a byzantine agreement algorithm for a collection of agents some of whom may be replaced with adversaries, you do not bother trying to write a code path, “what if I am an adversary”. The adversaries know who they are. You might as well know who you are too.

Alternatively, an extended Undertale reference that feels so on the nose it almost hurts (yes, fucking Chara is definitely the best person to mentally consult while trying to rationalize your actions).

Once you make "no-selling social reality" your professed superpower, I imagine the difference between performing Olympic-level mental gymnastics to justify eating cheese sandwiches and coming up with legitimate reasons to stab your landlord is negligible. (I know the actual killer is a different person, but I take patient zero as representative of the "movement".)
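
As an aside on the byzantine agreement analogy quoted above: the algorithmic point is real even if the application is unhinged. In a fault-tolerant protocol, every honest node runs the same code path, and adversarial behavior is modeled separately; there really is no "what if I am an adversary" branch. A toy sketch (my own illustration, with made-up parameters, not anything from the blog):

    import random

    # Toy one-round majority vote among n nodes, f of them Byzantine.
    def honest_vote(true_value: int) -> int:
        return true_value                 # honest nodes report truthfully

    def byzantine_vote(_: int) -> int:
        return random.choice([0, 1])      # adversaries say whatever they like

    def agree(n: int = 10, f: int = 3, true_value: int = 1) -> int:
        votes = [honest_vote(true_value) for _ in range(n - f)]
        votes += [byzantine_vote(true_value) for _ in range(f)]
        return int(sum(votes) > n / 2)    # honest majority carries the vote

    print(agree())  # 1 whenever n - f > n / 2: the adversaries can't flip it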

Alternatively, an extended Undertale reference that feels so on the nose it almost hurts (yes, fucking Chara is definitely the best person to mentally consult while trying to rationalize your actions).

I'm not very well versed in Undertale lore, so can you point out how this is an extended Undertale reference?

[cw: spoilers for a 10-year-old game]

In brief, Chara is the most straightforwardly evil entity in all of Undertale and the literal embodiment of soulless "number go up" utilitarian metagaming. One of the endings (in which your vile actions quite literally corporealize it) involves Chara directly taking over the player avatar, remarking that you-the-player have no say in the matter because "you made your choice long ago" - hypocrite that you are, wanting to save the world after having pretty much destroyed it in pursuit of numbers.

Hence the post's name and general thrust, with Ziz struggling over having to do evil acts (catching sentient crabs) to fund a noble goal (something about Bay Area housing?):

In deciding to do it, I was worried that my S1 did not resist this more than it did. I was hoping it would demand a thorough and desperate-for-accuracy calculation to see if it was really right. I didn’t want things to be possible like for me to be dropped into Hitler’s body with Hitler’s memories and not divert that body from its course immediately.

After making the best estimates I could, incorporating probability crabs were sentient, and probability the world was a simulation to be terminated before space colonization and there was no future to fight for, this failed to make me feel resolved. And possibly from hoping the thing would fail. So I imagined a conversation with a character called Chara, who I was using as a placeholder for override by true self. And got something like,

You made your choice long ago. You’re a consequentialist whether you like it or not. I can’t magically do Fermi calculations better and recompute every cached thought that builds up to this conclusion in a tree with a mindset fueled by proper desperation. There just isn’t time for that. You have also made your choice about how to act in such VOI / time tradeoffs long ago.

So having set out originally to save lives, I attempted to end them by the thousands for not actually much money.

I do not feel guilt over this.

It really can't be more explicit. I took it as an edgy metaphor (like most of his writing) at first reading, but it really is a pitch-perfect parallel: a guy has a seemingly genuine crisis of principles, consciously picks the most evil self-serving path imaginable out of it, fully conscious of each individual step, directly acknowledging the Chara influence (he fucking spells out "override by true self"!), and manages to reason his way out of what he just did anyway. Now this is Rationalism.

I just can’t imagine being so much of a loser that I’m going to base my moral convictions on characters in a video game. That’s the thing that really strikes me here: not the murder and the consequentialism, or even the rationalism, but that this is a person of obvious intelligence who has founded their entire worldview on video games and the Matrix movies.

I don't think they're founding their moral convictions on video games, only using video games and their connotations to smooth communication. It's no different than HPMOR, in my view.

I think you're underselling the phenomenon by just rounding all this off to crazy. I think it's entirely possible that Ziz and their acolytes have, among them, some significant neurological abnormalities. But it's hard to escape the impression that they're not losing their minds so much as intentionally throwing them away. They are actively taking concrete, premeditated action to undermine and compromise their own sanity, because they've bought into enough reasoning convolutions that they've committed to it being a good idea. I have some minor personal experience with cult shit, and this is definitely cult shit.

It's no different than HPMOR, in my view.

Yeah, and I also think HPMOR is very silly and shouldn't be treated as serious. Harry Potter fanfiction is not the means by which serious people discuss or disseminate philosophical treatises; it insults Harry Potter by trying to make it something it isn't, and insults philosophical treatises by trying to make them something they're not. That Yudkowsky used Harry Potter fanfiction to distribute his ideas indicates to me an unwillingness to choose the right register in which to communicate, a bit like TYPING IN ALL CAPS LIKE YOU'RE A BOOMER WITH A BROKEN CAPS LOCK or refusng 2 us propper gramar to rite yur txt bc its to hard 2 rite n propr inglish. It indicates a disrespect to your content and your audience, while also implying you don't believe your work is strong enough to stand on its own without adding a gimmick.

And that's exactly what I charge our cultists here are doing: they're disrespecting themselves by describing extremely significant and important themes in metaphysics and social reality through video game references, which aren't reality, indicating that either they can't justify their views in more complex terms or don't have the patience, lucidity, and self-control to choose to do so, both of which are damning.

I have some minor personal experience with cult shit, and this is definitely cult shit.

Sure, maybe. But I don't see "cult shit" as meaningfully distinguished from crazy. By crazy I don't mean schizophrenia or something along those lines, but simply that these are people whose reasoning and behavior are separated from reality, and whose ramblings are therefore fruitless and best ignored. I don't really care, Margaret, whether the delusions came from neurological abnormalities or from manipulation as part of a cult.


I knew about Undertale's general outline but couldn't piece it together, so thanks for doing that. So, in essence, Ziz identifies one-to-one with Chara, an avatar of utilitarianism. He excuses his actions by simply asserting that his "true self" is a soulless consequentialist; he bypasses moral deliberation or any crisis of principles by saying that whatever actions put him into conflict with himself are expressions of his true self. And because they are expressions of his true self, and therefore out of his control, he should not feel guilt over them. Determinism taken to its logical conclusion. Rationalism is just its beast.

Good points, and I appreciate you bringing up the lore; I now understand better why people are repulsed by rationalists, if this kind of thing is what they think of.

I still think this isn't real timeless decision theory, though; this looks like a severe case of antifa syndrome with a heavy dose of being defective as a person. Timeless decision theory is about basilisks and multiple universes and real, proper game theory, not 'kill Nazis'. The galaxy-brain version of antifa syndrome, with all these weird blog posts about being an obnoxious creep and a weirdo, posts that are hard to decrypt any more specifically than that, is still only antifa syndrome.

Gwen rediscovered debucketing. (A fact that had been erased from their mind long ago). Pasek was on the edge of discovering it independently, they both came to agreement shared terminology, etc.. I joined in. Intense internal conflict between Gwen’s and Pasek’s hemispheres broke out. I preserved the information before that conflict destroyed it (again.)

Pasek’s right hemisphere had been “mostly-dead”. Almost an undead-types ontology corpse. Was female. Gwen and Pasek were both lmrf log. I was df and dg. Pasek’s rh was suicidal over pains of being trans, amplified by pains of being single-female in a bigender head. Amplified by their left hemisphere’s unhealthy attitude which had been victorious in the culture we’d generated. They downplayed the suicidality a lot. I said the thing was a failed effort, we had our answer to the startup hypothesis, the project as planned didn’t work. Pasek disappeared, presumed to have committed suicide.

Like, what is going on here? I think this is schizobabble; it sounds like schizobabble. Timeless decision theory is incomprehensible but seems vaguely meaningful in certain rare circumstances, like advanced science. Maybe wrong science, who can say? But there's something more in it than this. If you put weird inputs into a bad piece of software and it glitches out, it's not the fault of the input but of the software (in this case, Ziz and gang).

I already dumped most of this schizo shit from my mental RAM, so I can't be certain, but s/he does explicitly touch on this in the extended Undertale reference above:

Any choice you can be presented with, is a choice between some amounts of some things you might value, and some other amounts of things you might value. Amounts as in expected utility.

When you abstract choices this way, it becomes a good approximation to think of all of a person’s choices as being made once timelessly forever. And as out there waiting to be found.

<...>

If your reaction to this is to believe it and suddenly be extra-determined to make all your choices perfectly because you’re irrevocably timelessly determining all actions you’ll ever take, well, timeless decision theory is just a way of being presented with a different choice, in this framework.

If you have done lamentable things for bad reasons (not earnestly misguided reasons), and are despairing of being able to change, then either embrace your true values, the ones that mean you’re choosing not to change them, or disbelieve.

Given this evidently failed to induce any disbelief, I parse e.g. the sandwich anecdote above as revealing that one's focus is not actually on the means (I am a vegan, so I must not eat a cheese sandwich) but on the ends (to achieve my goals and save the world I need energy - fuck it, let it even be a cheese sandwich). Timeless ends justify the immediate means; extrapolate to other acts as needed. It sounds boring, normal even, when I put it this way; this is plain bog-standard cope, and would also track with the general attitude of those afflicted with antifa syndrome. Maybe I'm overthinking or sanewashing it, idk.

On the other hand, quoth the glossary:

Timeless Gambit

What someone’s trying to accomplish and how in the way they shape common expectations-in-potential-outcomes, computations that exist in multiple people’s heads typically, and multiple places in time. Named from Timeless Decision Theory. For example, if you yell at someone (even for other things) when they withdraw sexual consent, it’s probably a timeless gambit to coerce them sexually: make possibility-space where they don’t want to have sex into probability space where they do have sex. In other words, your timeless gambit is how you optimize possibility logically preceding direct optimization of actuality.

...I admit I have no idea what the fuck that means but I do see related words...?

I think it’s describing a situation where you engineer a threatening environment so that you don’t need to use explicit force at the moment of decision. I think Ziz is trying to say that once you recognize the environment was designed to corner someone into compliance, you can view it as morally similar to actually using violence, because the threat itself is doing the work of forcing their hand.
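
Put as a toy decision problem (my sketch, with invented payoff numbers, not anything from the glossary): a standing threat changes the victim's best response without any force being used at the moment of decision, which is the sense in which the engineered environment "does the work" of violence.

    # The victim complies iff the pre-arranged threat is credible; no force
    # is applied at decision time - the environment decided in advance.
    def best_response(threat_credible: bool) -> str:
        comply = -1                              # cost of giving in
        refuse = -10 if threat_credible else 0   # punished only if threat is real
        return "comply" if comply > refuse else "refuse"

    print(best_response(threat_credible=False))  # refuse: an empty threat does nothing
    print(best_response(threat_credible=True))   # comply: the threat alone forces the hand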

Why Ziz didn't just say that, I may never know.

Man this is a very convoluted way of describing the concept of persuasion.

I'm pretty sure that's not how it works

I think you're seriously underestimating rationalists' capacity to rationalize.

Timeless decision theory is (and always has been) an excuse to do what you were going to do anyway.

It's the old leftist fallacy of "society is to blame" writ at a metaphysical level. You can't blame me for the consequences of my actions; I was merely a pawn of universal forces.

Rationalist here. Timeless decision theory was never explicitly designed for humans to use; it was always about "if we want to have AIs work properly, we'll need to somehow make them understand how to make decisions - which means we need to understand what's the mathematically correct way to make decisions. Hm, all the existing theories have rather glaring flaws and counterexamples that nobody seems to talk about."

That's why all the associated research stuff is about things like tiling, where AIs create successor AIs.

Of course, nowadays we teach AIs how to make decisions by plain reinforcement learning and prosaic reasoning, so this has all become rather pointless.
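
For the curious, the canonical counterexample being gestured at is Newcomb's problem (my worked example, not the poster's): a near-perfect predictor fills an opaque box with $1M iff it predicts you will take only that box. Causal decision theory reasons "the boxes are already filled, take both"; theories in the TDT family treat your choice and the prediction as outputs of the same computation, and one-box.

    # Expected payoffs in Newcomb's problem for a predictor of given accuracy.
    def expected_payoff(one_box: bool, accuracy: float = 0.99) -> float:
        if one_box:
            return accuracy * 1_000_000   # opaque box filled iff one-boxing was predicted
        # Two-boxing: $1,000 for sure, plus $1M only when the predictor erred.
        return accuracy * 1_000 + (1 - accuracy) * 1_001_000

    print(expected_payoff(True))   # 990000.0
    print(expected_payoff(False))  # 11000.0 - one-boxing wins once accuracy > ~50.05%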

My understanding of timeless decision theory is that you are deciding for every entity sufficiently similar to you. So you’re making decisions for yourself at different points in time, as well as for anyone else who might be sufficiently similar to you at the same time. While this would technically make backwards causality kind of a thing you could think about, that really doesn’t seem all that relevant to how you would use it to actually make decisions. Instead, it adds weight to the decisions you’re trying to make, by spreading the consequences farther than you would normally expect them to go.

But that was from over a decade ago. It’s entirely possible that it’s become a lot more insane since then.
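
The "deciding for every entity sufficiently similar to you" idea has a standard toy version in the twin prisoner's dilemma (again my illustration, with conventional payoff numbers): if the other player literally runs your decision procedure, whatever you choose is chosen for both of you, and cooperation comes out ahead.

    # Row player's payoffs in a standard prisoner's dilemma.
    PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

    def my_payoff(my_move: str, twin: bool) -> int:
        # A logical twin mirrors my move; an uncorrelated opponent just defects.
        their_move = my_move if twin else "D"
        return PAYOFF[(my_move, their_move)]

    print(my_payoff("C", twin=True), my_payoff("D", twin=True))    # 3 1 -> cooperate
    print(my_payoff("C", twin=False), my_payoff("D", twin=False))  # 0 1 -> defect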

Big Yud did chime in on one of the LW posts to say they got it wrong, so I wouldn't be surprised if they were playing fast and loose with the philosophical side.

Big Yud plays fast and loose with everything. If he says someone is wrong then I'm willing to strongly consider their position.

Lol someone reported your comment.

I looked it up.

The Zizians were a cult focused on relatively extreme animal welfare, even by EA standards, who used a Timeless/Updateless decision theory under which being aggressive and escalatory was helpful as long as it helped other world branches or acausally traded with other worlds to solve the animal welfare crisis.

They apparently made a new personality called Maia in Pasek, and this resulted in Pasek's suicide.

They also used violence or the threat of violence a lot to achieve their goal.

This caused many problems for Ziz, and she is now in police custody.

His reply seemed indistinguishable from sarcasm to me; I thought he was inventing a term to tar them with. But you brought the receipts, and it does seem they are as disconnected from reality as he suggested.

At the same time, as with all mass killers, the actual content of these people's delusions is irrelevant; the only appropriate response is to medicate until sane and confine until natural death.

Of course they reported it, lol. Thanks for the extended cite; I was still looking for it in this giant pile of tabs.

Islamists do the same thing, but I doubt they have such sophisticated-sounding justifications for it.

improve your ideology's bargaining position averaged across every other reality in the multiverse.

It'd be working out pretty well for the Zizians if there were 50 million of them.