This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Is it really more likely that academics believe HBD but resolve to fight it anyway? I think genuine disbelief is more parsimonious. That would include feeling motivated not to look too closely.
Yeah, when surveyed anonymously, only about 15-20% believe that there's no genetic component in the Black-White achievement gap in the US.
http://lepo.it.da.ut.ee/~spihlap/snyderman@rothman.pdf - 1987 survey.
https://sci-hub.st/https://www.sciencedirect.com/science/article/abs/pii/S0160289619301886 - 2020 survey.
This contrasts with practically nobody saying so in public, and consequently the public (exemplified by you here) believing that the public silence reflects the actual privately held beliefs of scientists.
And, specifically, if you want to see what an example of speaking power to truth looks like in this context: https://www.chronicle.com/article/racial-pseudoscience-on-the-faculty (paywall bypass: https://archive.li/ZxVYk)
If you genuinely believe X -- as in, all the evidence you've seen points towards X, you have no inkling that X might be false, and you would be willing to bet at strong odds that X is true -- then there may not be any motivation to look any more closely, but there certainly isn't any motivation to avoid looking closely. After all, what's the worst that can happen? You find more evidence that you're right?
In order to know that you need to be motivated not to look too closely, you need to know that there's at least a significant chance of learning that a thing you want to be true is not true. And that means you already know there's at least a significant chance of this thing not being true. At this point, if you act as if you "believe" X with any confidence, then you are merely acting.
Genuine (dis)belief and motivated reasoning do not fit together.
Sure there is. You’ve got a limited amount of time on this earth, and aren’t obligated to spend it debating people you think are trolls or at least cranks. The worst that can happen is you waste your time, feel stupider for having engaged, or encourage your opponents. It’s also possible to tar yourself as an outsider, because tribalism isn’t always (ever?) open-minded.
There is some level of belief where it becomes rational to write off the rest as a rounding error, rather than spend time on it. There is a lesser level at which most people start to do this!
If I think something is 99% likely to be true, and I don’t want to spend time debating heretics, it’s still fair to say that I believe that thing.
This is justification for not having any motivation to talk to them. It is not motivation to avoid looking more closely at your beliefs.
I believe the earth is round. I could be wrong, but I find it sufficiently unlikely that I'm going to learn anything worthwhile from the average flat earther that I'm not really interested in debating them. However, if I ever find myself wanting to debate them, and also feeling like I need to avoid doing that, then that's a sign that something about what I claim to believe is wrong.
Having watched a couple of actual debates on this topic, I've found that the "round earther" often has no idea how to justify their (correct, IMO) beliefs, and instead of honestly admitting that they are essentially taking other people's word for it, they try to pretend they understand things better than they actually do. That cognitive dissonance doesn't necessarily mean you're wrong on the object of contention, but it's a pretty good bet that you're wrong somewhere (perhaps in how confident and justified you actually are), and this sign marks the trailhead.
These are both signs that your story isn't adding up. Why did you feel tempted to do something stupid? What roped you in?
Why would your opponent leave feeling "encouraged" rather than humiliated? If you actually know the topic so well, and their beliefs are so dumb, shouldn't you be able to address their points so well that they are the ones who leave feeling dumb?
That's 99% fair. And that 1% lie can be an acceptable rounding error.
But that 1% lie can also be part of a much bigger lie, one meant to avoid dealing with the fact that the belief is a hell of a lot more than 1% motivated and likely to be false.
Any time you find yourself actively wanting to avoid engagement (and not simply lacking motivation to engage), you're up against the part of your belief that isn't genuine. Even if it's only 1%, it has proven that it's not small enough to be irrelevant. And if it's making itself relevant, that's good evidence that it isn't really as small as you might like to believe.