Culture War Roundup for the week of February 17, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Surely rationalists, who enjoy exercises like calculating the effect of trillions of hypothetical future specks of dust in people's eyes and weighing them against immediate murder, would consider the advantages and disadvantages of respecting a trans person's chosen pronouns beyond the immediate effect of "I tell a lie" vs. "the person might go through with their threat of suicide". I think what is actually going on is a combination of (1) the real community of rationalists has a high fraction of people who are not quite the independent thinkers resilient to social pressure they make themselves out to be, and (2) the old guard at some point concluded that the danger of AI doom dominates their value function, and that building and maintaining a durable alliance with the US Left is their best shot at averting it.

This might in part be reasonable political calculation (unaligned movements with any amount of influence, in the US climate, tend to be crushed as crypto-outgroupers and pillaged for their remaining political capital by both sides; of the two, the Left is in principle more receptive to safetyism and EA/tikkun olam/global paternalism), and in part a certain measure of arrogance among the core personnel (Yudkowsky probably thinks of himself and the handful of people he respects as smart enough that a well-contained set of signalling beliefs won't compromise their standing as the Wisest and Most Rational Human Beings, and doesn't think they stand to gain much from the potential additional peers who get lost to brainrot in the pipeline).

“the real community of rationalists has a high fraction of people who are not quite the independent thinkers resilient to social pressure they make themselves out to be”

… just figuring this out? I love the rationalist movement and read a lot of its blogs/forums, but it totally falls apart once it touches anything politically charged or highly socially controversial.

Scott Alexander's post after the election really showed this to me: he basically said, "I adamantly refuse to believe Polymarket was correct in giving 60-40 odds; the true odds were 50-50." After the biggest right-wing blowout election in recent history, you can't accept that one party had the better odds going into it?
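For what it's worth, there's a standard way to put numbers on this: score each forecast against the realized outcome with a proper scoring rule such as the Brier score (squared error; lower is better). A minimal sketch, assuming the 60-40 and 50-50 figures quoted above; the variable names and numbers are my own illustration, not from either source:

    def brier(prob_of_event: float, event_happened: bool) -> float:
        """Brier score: squared error between the forecast probability and the 0/1 outcome."""
        outcome = 1.0 if event_happened else 0.0
        return (prob_of_event - outcome) ** 2

    # Probability each forecast assigned to the candidate who actually won:
    polymarket_forecast = 0.60  # the 60-40 line attributed to Polymarket above
    scott_forecast = 0.50       # the 50-50 odds attributed to Scott above

    print(brier(polymarket_forecast, True))  # ~0.16 -> lower, i.e. better
    print(brier(scott_forecast, True))       # 0.25

On this single event the 60-40 forecast scores better, though one resolved market is weak evidence either way; calibration only shows up across many forecasts.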

…anyways, many such examples, but it's important to see this movement for what it is. It just so happened that the Rationalists, from Berkeley, rationally thought themselves into taking left-wing stances on almost all the controversial issues of our time… right.

I mean, how many of those stances are about facts, as opposed to values? Separate magisteria.

Part of the problem with Californian Utilitarianism is that while, when pressed, it will recognize that the choice of a utility function is ultimately an arbitrary decision, in practice it always seems to round down to "increase happiness", and therefore to the care/harm moral foundation, and therefore to left wing politics.

Case in point: the Zizians are functionally communists, Hegelian dialectic and all. They believe that there is a good future out there, that they can be its instruments, and that the goodness of that future washes away all sins they may commit in its service, because the ends justify the means.

Scientific Utopianism is the disease of worshiping reason, and I'm afraid this isn't the last of its incarnations, because without any tempering force to the hubris of men who think they can predict the future, the temptation to make it one's own at all costs is always there. Babel's construction crew is never going to run out of volunteer laborers.