
Culture War Roundup for the week of November 4, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which in turn becomes a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


rejecting Rationalism, as it leads to cudgels like "falsely claimed without evidence"

Does it? I don't think I've ever seen the phrase "without evidence" used sloppily by anyone whose definition of evidence is "B s.t. P(B|A)/P(B) > 0".

definition of evidence is "B s.t. P(B|A)/P(B) > 0".

Wouldn’t the definition of “A is Bayesian evidence of B” be “P(B|A) > P(B)”?

I typed >0 when I meant to type >1, yes. That's very embarrassing.
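
For the record, here's a minimal numeric sketch of the corrected definition; every probability in it is invented purely for illustration:

```python
# "A is Bayesian evidence for B" means P(B|A) > P(B),
# i.e. P(B|A)/P(B) > 1 -- not > 0, as originally typed.
p_b = 0.3              # invented prior P(B)
p_a_given_b = 0.8      # invented likelihood P(A|B)
p_a_given_not_b = 0.4  # invented likelihood P(A|not B)

# Law of total probability for P(A), then Bayes' rule for the posterior.
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)
p_b_given_a = p_a_given_b * p_b / p_a

print(p_b_given_a / p_b)  # ~1.54 > 1, so A is evidence for B
# The ratio would be > 0 for any A with P(B|A) > 0, which is why the typo mattered.
```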

It's not about being sloppy with the phrase "without evidence"; as @RenOS pointed out, it's more about elevating your position to the null hypothesis.

The whole Bayesian reasoning thing always felt like a gimmick to me anyway. You can claim to be a good Bayesian no matter the outcome of any particular case.

it's more about elevating your position to the null hypothesis.

One of the key differences between Bayesian and frequentist statistics is that the latter has a "null hypothesis" and the former does not. Priors aren't the same thing; in Bayesian-speak an experiment leads to an update that's a real number, not a binary acceptance/rejection.
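
To make the contrast concrete, here's a toy coin-flip sketch; the setup and every number in it are my own invention, and it assumes scipy is available:

```python
# Toy contrast: frequentist accept/reject vs. Bayesian continuous update.
# The coin-flip setup and all numbers here are illustrative assumptions.
from scipy.stats import binomtest, beta

heads, flips = 62, 100

# Frequentist: a binary verdict against the null hypothesis p = 0.5.
result = binomtest(heads, flips, p=0.5)
print("reject null at 5%:", result.pvalue < 0.05)  # True or False, nothing between

# Bayesian: start from a uniform Beta(1, 1) prior over the coin's bias and
# update conjugately; the output is a whole posterior, not a verdict.
posterior = beta(1 + heads, 1 + flips - heads)
print("posterior mean:", posterior.mean())        # ~0.62
print("P(bias > 0.5):", 1 - posterior.cdf(0.5))   # a real number, ~0.99
```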

You can claim to be a good Bayesian no matter the outcome of any particular case.

Yeah, but you can also claim to be a good non-Bayesian pundit regardless. The biggest difference from my point of view is that I've seen the best rationalists publish graphs of how well their past predictions, as declared in advance, turned out to be calibrated. I've never seen anybody more mainstream than Nate Silver do the same, even though "my punditry is my profession and public service and livelihood" would seem to entail a much stronger case for doing so than "I like blogging", so I'm going to doubt that rationalism has led to much of anything in the mainstream media.
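
For anyone who hasn't seen one of those graphs, the underlying computation is simple. Here's a rough sketch; the (probability, outcome) pairs are fabricated for the example:

```python
# Calibration check: bucket stated probabilities, then compare each bucket's
# stated confidence to the fraction of those predictions that came true.
from collections import defaultdict

# (stated probability, did it happen?) -- fabricated data for illustration
predictions = [(0.9, True), (0.9, True), (0.9, False),
               (0.7, True), (0.7, False), (0.7, True),
               (0.6, True), (0.6, False)]

bins = defaultdict(list)
for prob, came_true in predictions:
    bins[prob].append(came_true)

for stated in sorted(bins):
    outcomes = bins[stated]
    print(f"stated {stated:.0%}: {sum(outcomes)/len(outcomes):.0%} came true "
          f"(n={len(outcomes)})")
# A calibrated forecaster's 70% predictions come true about 70% of the time;
# a real check plots these points against the diagonal y = x.
```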

... which is a shame, because an admission of "that's evidence but not enough to budge my priors" really is a big step up from a declaration of "without evidence". When not moving far from your priors is a good idea (which it often is - I've seen legitimate evidence for Flat Earth Theory!), you at least gain a little humility from having to openly admit what you're doing. And when your conclusions resembling your priors is a bad idea, you're more likely to notice that eventually if you have to acknowledge every time you're dismissing Not Enough Evidence rather than Not Real Evidence.
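
To put rough numbers on "that's evidence but not enough to budge my priors" (every figure here is invented):

```python
# A strong prior plus a modest likelihood ratio: genuine evidence,
# honestly acknowledged, and yet the conclusion barely moves.
prior = 0.001            # invented prior probability the fringe claim is true
likelihood_ratio = 3.0   # the observation is 3x likelier if the claim is true

prior_odds = prior / (1 - prior)
posterior_odds = prior_odds * likelihood_ratio   # Bayes' rule in odds form
posterior = posterior_odds / (1 + posterior_odds)

print(f"{prior:.4f} -> {posterior:.4f}")  # 0.0010 -> 0.0030
```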

People can claim to be good anything regardless of facts on the ground. Talk and unfounded claims are cheap and may even be free (ignoring opportunity costs).

I'd expect anyone who outright calls themselves a Bayesian to do better on that front.*

*Going off Bayesian priors about what the kind of people nerdy enough to have even heard of the idea are like, let alone self-professed ones.

I mean that the whole framework is designed so that you never end up having to eat crow. "My priors for this are very low. Oh, it happened anyway? Oh well, I promise to bump up my priors somewhat for the next time this non-repeatable event happens!".

Uh... That's the worst way of reasoning from evidence that's ever been tried, barring all the others.

Absent logical omniscience, you are occasionally going to be wrong, and then you try to be less wrong. Taken deeply enough, no macroscopic events in the history of the universe are likely to ever be truly alike or repeatable, so sorting out reference classes is unavoidably important.

"I was wrong about World War 3 not happening. Well, we can't have a World War 3 2.0 happen for me to be right about, but at least I can adjust my priors for massive wars happening in the future".

Besides, you can very much eat crow when you are confidently wrong. It just takes intellectual honesty, and Bayesians at least pay lip service to the notion that we learn from our mistakes. Keep being bad at updating, and people will stop considering what you say to be informative (and that's not unique to self-professed Bayesians, because in practice most humans apply the concepts implicitly; some are just more disciplined and explicit than others).