
Culture War Roundup for the week of January 29, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I certainly wouldn't turn down the advice of a benevolent weakly-godlike ASI, but I would much prefer to become one myself.

I wish to not need GPT-Ω at all, but to be able to understand the world better myself.

Now, I don't think reliance on such an entity would be anywhere near as bad as Scott's story about The Whispering Earring, especially since I would expect that anything truly meeting my criteria for benevolence wouldn't let me become little more than a puppet following strictly optimal decisions. I wish to make those myself.

Do you see what the common thread is, in all the problems you've mentioned?

It's a lack of intelligence. While not a panacea, intelligence is as close to an unalloyed good as it gets. Someone with an IQ >120 will do a much better job parsing the world on their own terms than a true midwit, who is probably better served by accepting the wisdom of authority figures as diffused through noisy channels. Thinking for yourself is powerful. It is also dangerous.

There is no human alive, nor has one ever existed, who possessed the level of intelligence needed to grok the entire world from first principles. Even geniuses need tutors, but their genius lets them learn the lessons well and, more importantly, judge how trustworthy the tutor is.

And so all of us--well, maybe except for a minority of brilliant minds active here and in other rationalist spaces--are fooled into confusion, frustration, and learned helplessness. Do you know a young woman who insists she would never cross the street any differently if the person approaching in the distance were of a particular sex, race, or age? Have you ever met someone who genuinely believes pit bulls are no more dangerous than any other breed if given enough love? What do you do if you happen to be the parent of a young child who learns from her teachers, her doctor, and the APA that her feelings of wanting to be a boy must be affirmed or she'll probably kill herself?

An underappreciated blessing, if one distasteful to my libertarian sensibilities, is how well the modern world has built guard-rails that keep people from doing grievous harm to themselves through their stupidity. For most of history, you could make the best decisions available to you, strive earnestly and intelligently, and yet starve to death during a famine, or have a barbarian deprive you of your head.

In contrast, the stupid/luxury beliefs here are, in absolute magnitude, practically harmless to those who hold them:

The /r/aww Redditor gushing over velvet hippos will almost never have a nanny-dog maul them and theirs. Even the criminality and destruction of the commons bred by bleeding-heart tolerance for a criminal underclass strewing the streets with needles only reduces QOL to a level still far above what most of the 97 or so billion anatomically modern humans who ever lived had to tolerate. Vegans may suffer nutritional deficiencies, but they are not likely to starve because the shaman demanded they ritually sacrifice their last goat to call back the rain.

People are insulated from the worst consequences of their stupidity. This is both a triumph and a tragedy of modernity, but the former outweighs the latter by orders of magnitude. The stability of modern society does more to enable the strong, intelligent, and self-sufficient to make the most of their gifts than we lose from the average person being deeply stupid.

Expend an enormous amount of time researching individual issues as they come up. Try to get slightly more proficient with the process over time, and gain leverage through trusted sources and tools (until they can no longer be trusted).

Most things don't matter. Your opinion on land value taxes or your choice of candidate in the next election has minimal effect on your well-being. This is why explicit Rationality is more of a hobby than a guaranteed means of success; Yudkowsky framed it as the systematized art of winning, but you don't need a system to win. Of course, if the rationalists' efforts to cry wolf back when the wolf of AI was a mere pup pay off now that it possesses teeth, it will all have been worth it nonetheless.

Accept that your agency is limited. That most of your opinions will not change the world. That is okay. That is true. Do not let it dissuade you from trying to be better.

Can we solve this with good old free market capitalism?

The market can remain stupid longer than you can remain solvent, but given enough time it approximates efficiency nonetheless. We can give it a helping hand; I endorse @faceh's view that Prediction Markets are some of the best social technology we could ever build, if only people would get the fuck out of the way when they're being implemented.

Surprisingly, Manifold outperforms real-money prediction markets, which I wager comes down to a combination of the Crowd being larger and thus more Wise, and Fake Internet Points and leaderboard reputation carrying enough intrinsic value to users to substitute for cold hard cash.
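
For intuition on the "larger crowd, thus more Wise" half of that wager: it's essentially the law of large numbers applied to forecasts. A minimal sketch in Python (all parameters hypothetical, not drawn from any real market data):

```python
import random
import statistics

# Toy wisdom-of-crowds model: each trader's forecast is the true probability
# plus independent noise; averaging more forecasts shrinks the crowd's error.
random.seed(0)
TRUE_PROB = 0.7  # assumed ground-truth probability of the event
NOISE_SD = 0.2   # assumed per-trader forecasting noise

def crowd_forecast(n_traders: int) -> float:
    """Average of n independent noisy forecasts, each clamped to [0, 1]."""
    forecasts = [
        min(1.0, max(0.0, random.gauss(TRUE_PROB, NOISE_SD)))
        for _ in range(n_traders)
    ]
    return statistics.mean(forecasts)

for n in (5, 50, 500):
    errors = [abs(crowd_forecast(n) - TRUE_PROB) for _ in range(1000)]
    print(f"{n:>4} traders: mean absolute error = {statistics.mean(errors):.3f}")
```

The caveat is that this assumes traders' errors are independent, and real markets also weight participants by stake and conviction, which simple averaging ignores; take it as an illustration of the crowd-size effect rather than a model of how Manifold actually aggregates.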