This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
- Recruiting for a cause.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
It always gives me a surge of vindictive glee when someone says something to the effect of, "I hate when outsiders learn to co-opt our language to scam us."* Sucks to suck! Next time learn to receive and transmit factual observations instead of markers of ingroup status!
I'd wager this happens because it's a time-efficient mental heuristic-- you learn that people in your ingroup are unlikely to lie to you because they share the same goals, so when you receive ingroup signals you spend less effort discerning truthfulness. Intelligent people need this heuristic less, and therefore groups of high-average-intelligence people have a sort of herd immunity against this type of scammer. Scammers often try to signal that they're highly intelligent by talking like LLMs trained on smart-people-talk... but that only fools dumb people who've trained themselves to have the separate-but-related heuristic of trusting anything that includes enough technical language. (See: homeopathic remedies, the medbed people, anything "quantum.") To the high-intelligence group, they just look like nuts, cranks, and schizophrenics.
However, implicit understanding of that herd immunity becomes its own type of heuristic, which works fine under normal conditions because you need to be smart** to lie to a smart person, and if you're smart you have more alternatives to being a scammer. But there's a particular failure case that I think is especially interesting: when a formerly high-average-intelligence group relaxes its selection criteria and lets lower-intelligence people in. The high-intelligence members become vectors for bad information instead of firewalls against it, because their laziness in evaluating ingroup claims is no longer adaptive.
I don't have any real conclusion to draw from this... Actually, I suspect I shouldn't draw any conclusions from this, because "the ingroup gets shittier when we let new people in" is exactly the sort of heuristic I suspect I'm already predisposed to have by genetics and culture. It's almost certainly priced in, so to speak. So having the mechanistic explanation for the heuristic should actually push me toward being more open to expanding the ingroup-- at least, in cases where I suspect the new members are of intelligence equal to or greater than the existing ingroup's. (Should I be even more in favor of increasing green card caps for technically skilled workers? But then again, I'd guess that I'm predisposed to be biased in favor of that by political affiliation and cultural influences anyway, so this might be a wash.)
Though-- if human intelligence actually declined after the invention of agriculture (I'd put a sub-50% probability on this being true, but it would be really interesting if it were) it would imply that we were once in a sweet spot in terms of ingroup formation. If you live in an optimally sized band of primates, there's no need to send ingroup signals because you already have a deep, personal connection to every member of the ingroup. If you live in our current, massively populous, highly anonymous society, relying on signals of ingroup membership gets you scammed. But for a thousands-of-years-long golden age you could afford to be stupid. Ingroups were both large enough that you could rely on yours to avoid having to think for yourself, and small/impermeable/anti-anonymous enough that scammers weren't a risk.
* See: fake feminists seducing feminists, Trump supporters donating their kids' entire inheritance by accident because of predatory web design practices, LGBT people getting suckered into buying rainbow-capitalist merchandise, megachurch pastors fleecing their denominations into giving them private jets, etc. (I'm providing politics-related examples of this because they're the most visible, but I'd wager the most common version of this is, "this fast-talking fellow convinced me we'd both be rich, but he got away with the money and I was left holding the bag.")
** well, you need high fluid intelligence specifically
There is a great cost to scammers utilizing ingroup signals, though: it introduces friction akin to a transaction tax imposed by the government, except now it is imposed by the inability to trust. Sometimes transactions are so consequential that it always makes sense to have vetting with insurance. Something like title insurance on a real estate property. But what about the special Best Buy warranties? What if it made sense to buy that crap for a $400 television because you don't know if you are getting a SONY or a SƠNY? This pretty quickly destroys your economy.
I don't understand why you framed this as a rebuttal to what I'm saying. Was my thesis unclear? In case I need to restate it-- "resist attempts by your ingroup to use language as a status-signalling tool because it will make you all vulnerable to scammers and I will laugh when they take your shit."
You shouldn't laugh, you should be sad.
I won't claim that this vindictive glee is in line with my deepest ethical principles, but... c'mon. Bad things happening to good people is tragedy. Bad things happening to bad people is justice. Bad things happening to idiots is hilarious.
Building social cohesion is not stupid. When it works your community has virtually no transaction costs.
Well, if your only means to do that is by using language to communicate status rather than factual information, it clearly doesn't work.
That's why I enmesh myself in communities where status is a function of conveying useful, factual information. It's much more efficient than handicapping strategies like sticking to a counterproductive party line as a high-cost signal of commitment to ingroup values.
I think it depends on the group of smart people and what their domain of knowledge actually is. My observation is that smart people tend to vastly overestimate their ability to understand things not in their own domain. They think they can tell if someone is scamming them in another area, but often they're just as vulnerable as anyone else.
Part of this is the way society works. Everyone specializes in one or two areas, and outside of those areas you really don’t have much more base knowledge than the average person. This makes it much easier to sell a scientist on a financial scam. Not because the guy is stupid, but because he doesn’t know much about finance and doesn’t spend a great deal of time thinking about it. Or maybe it’s home repairs where a roofer can come into a neighborhood full of lawyers after a big storm and make bank by scamming the lawyers on repairs they don’t need and cheap materials that don’t last.
The other part is plain ego. Smart people have been praised for their intelligence for a long time. Everybody since their third-grade teacher has probably told them how smart they are. Add in the success they get in their domain, and they believe it: they're smart and can figure it out. And they're actually much more vulnerable simply because their ego won't let them notice that something is off. In fact, I would consider this an advantage for the less intelligent. They know how much they don't understand, because they struggled with plenty of it back in school.
This is a vibes-based claim rather than a data-based claim, but I genuinely don't think scam rates are comparable across intelligence brackets even after controlling for domain-specific intelligence. When you're smart you end up learning meta-strategies for evaluating fact-based claims, like checking your assumptions against known cognitive biases and using formal logic to determine whether claims are contradictory without having to access the actual underlying truth value of either claim. If a random joe and I were presented a list of scammy and non-scammy investment options for... I don't know, undersea mining concerns, or Instagram influencer management companies, or whatever, I think I would outperform the random joe at avoiding the scams. I wouldn't be immune, but I'd fall for fewer of them.
And in turn, even after controlling for intelligence-- I think a random joe trained to ignore signals of ingroup membership would do better at avoiding scams than an equally intelligent random joe left to his native heuristics. There's a valid question of global performance-- heuristics are mostly useful and adaptive. But in the current age of massive, permeable, low-trust groups, I think most people's heuristics are lagging behind what's actually efficient.
Re: ego, it's debatable whether the Dunning-Kruger effect is actually real. If it is real, then smart people must be less unjustifiably confident than non-smart people. If it's not... well, then we still haven't found any evidence in the reverse direction, so the null hypothesis should be that the level of unjustifiable confidence is the same, and non-smart people don't have any relative advantage. (While also suffering from the other phenomena I talked about in my original post.)
To be clear, I've also run into less intelligent people who commonly make bizarre mistakes, people so dumb they're actively destructive with their foolishness. And some of those people notice I'm kinda smart and try to play the know-it-all game with me, making up stuff that sounds vaguely plausible to sound like they know what they're talking about. My favorite was in a technology class where I wondered aloud why the Internet Protocol skipped from IPv4 to IPv6, and a guy I know is this type started telling me all about how big and important IPv5 was. I looked it up on my phone, and an IPv5 did exist, but it was a special-purpose experiment that was never in common use -- obviously not something this guy knew anything about. Instead of having curiosity with me, and going, "yeah, that sounds like a good question, we should find out!" he took it as an opportunity to fake insight.
When I first started meeting people like this, I found it very surprising, because it confused me why people would actively fake knowledge instead of being straightforward -- you know, "better to remain silent and be thought a fool than to speak and to remove all doubt." But once I met a few, the pattern became clear; once someone starts being like this, you'll notice them doing it all the time. I usually let people get away with a few of these moments simply because I'm interpersonally trusting, but once I realize someone's doing it I lose all respect for them as sources of advice and I tend to assume anything that comes out of their mouth is just noise.
Maybe this is just something people do to me, or maybe they think I'm bullshitting all the time and they're just mirroring it, but long story short, it's not always the case that people who did poorly in school understand, or act in accordance with, their limitations. Dumb people are just as capable of intellectual overconfidence as smart people, especially when they believe it will ingratiate them with someone smarter than them. But someone who says, "I don't know -- but I'd like to find out!" presents intellectual curiosity and actually increases my perception of their intelligence.
It fascinates me when academics, interviewed on some high-quality podcasts, reply with a point-blank "I don't know". Only after the host adds more explicit hedging and reframes the question as one aimed at a best guess (instead of what they've probably been taught to perceive as a "give me an up-to-date overview of the field on this question" query) do they respond with an account that naturally transcends a median listener's knowledge of the topic by a large margin. Such public talks seem like a promising venue to instill (or at least popularize) the courage to admit your ignorance.
I remember such a thing happening in a podcast (or maybe a radio program? it's been a while) where they had a group of physicists on in a "popular science" format to speak about some new thing that was breaking in their field. The only one I remember was Lawrence Krauss (because he was actually a professor of mine in college). There was the back and forth like you describe, and eventually the host was able to get a fairly detailed answer to his question, after which he asked the guest if they could possibly simplify the explanation for "the folks listening at home". After a brief pause Dr. Krauss simply replied: "no".
Nice example. I think it's a decent stance. Compressing models/theories is always lossy and it takes a special skill to map them onto simpler models/metaphors, while keeping predictive power intact. If you are unsure how to do this, don't do this.
Sounds cool. What was the experience like?
It was a 100-level physics class for non-physics majors, mostly comp sci people with a scattering of others who took it for fun or because it counted as a required elective. Apparently it was a special pet class of his that he'd been working on for a while. It was pretty low-stress, with the grade being mostly based on attendance and participation. He did miss a lot of the classes himself, though. In fact, by his own attendance policy he would have failed the class by missing about 30% of them. I do remember he had an '80s-era red Corvette that really stuck out in the staff lot and was a good indicator that he was actually there that day.
What podcasts like this do you recommend?
Disclaimer: although I consider them high-quality by various proxy measures, ultimately I don't have enough knowledge to assess most of their takes.
Off the top of my head:
There is more, of course. Speaking of literal examples (from CwT):
- Claudia Goldin
- Joseph Henrich
- Paul Graham
Isn't this mostly what Robin Hanson's The Elephant in the Brain is about? Haven't read it yet.
Even intelligent people are still driven by the urge to seek status, fit in, and receive social acceptance, and that can be hacked by a savvy operator, even if the smart person 'knows better.'