Culture War Roundup for the week of September 30, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Nowadays, any time there is a disaster in the United States, you should assume that there is a Russian social media effort trying to inflame and twist it. Sometimes a disaster doesn't even have to actually occur; they'll just fake-news one. This is just one of the things they do, independent of any truth to any criticism.

On the other hand, it’s a very useful tool for hiding incompetence and grift. Everything the government doesn’t want people talking about seems to be “Russian trolls,” and it’s become a go-to excuse for why people are saying things the government doesn’t want to hear on social media. Sure, sometimes it is trolls, but by this point enough ultimately true stories were officially dismissed as misinformation, only to be shown to have actually happened, that I no longer find the “Russian trolls” story a sensible hypothesis. In fact, I’m trying to think of a story from the past 2-3 years that was actually traced back to a real Russian, whether working for the government or not.

I’m mostly with the steelman here. People who don’t know what they’re doing wandering around a disaster area are more likely to create situations where they need rescue than to do substantial good. A bunch of rednecks coming in and sawing through things or chopping down trees might well injure people or need rescue themselves. Disaster areas tend to be dangerous, and the dangers aren’t always obvious: taking your jon boat over downed power lines is pretty dangerous. So the government is probably turning people away because it doesn’t want to have to rescue redneck brigades with no experience rescuing people.

It's always extremely easy to be intellectually lazy and unconsciously fall victim to propaganda. Britain and the US invented the modern public relations and propaganda industry, and they have been very successful at convincing the average person, who had never even heard of places like Donetsk and Luhansk until a few years ago, that they're on the right side of an issue they know nothing about.

I remember a while ago getting into a debate with someone in the /r/geopolitics subreddit who literally said to me that if Russia had only spent more money on its domestic social programs to take care of its people, NATO wouldn't have expanded into Ukraine. And that is not hyperbole. This is the quality and caliber of the average person who takes great pride in having very strong opinions about something they know absolutely nothing about. Americans in general are not very good at putting themselves in the shoes of other people, and when you combine that with someone who mistakes the philosophy subreddit for the geopolitics one, riding a bike on the highway isn't your only problem when you're also going the wrong direction.

If you don't understand what's really going on, then you can't even represent the other side accurately enough to have a sensible disagreement with it.

On the other hand, it’s a very useful tool for hiding incompetence and grift. Everything the government doesn’t want people talking about seems to be “Russian trolls,” and it’s become a go-to excuse for why people are saying things the government doesn’t want to hear on social media. Sure, sometimes it is trolls, but by this point enough ultimately true stories were officially dismissed as misinformation, only to be shown to have actually happened, that I no longer find the “Russian trolls” story a sensible hypothesis.

Are you even dismissing the right hypothesis?

No, seriously. I think you misread what was claimed and projected previous or other experiences onto it. The hypothesis is not that 'the coverage is the result of Russian trolls.' The hypothesis is 'no matter what happens, there will be Russian trolls trying to make it worse.' Whether the Russian trolls succeed in significantly shaping the conversation, or originated the talking points, or are fallaciously conflated with legitimate grievance is irrelevant to a characterization of their (a) existence and (b) attempts.

If you want to dismiss that, sure, but you haven't actually provided grounds for disputing either supporting point. Which do you find non-sensible: that Russian troll farms like the Internet Research Agency exist, or that they attempt to make things worse?

Very directly: what do you think the Russians use the Internet Research Agency for? Not how influential it is, not whether it's fair to tar Americans with guilt by association. What do you think the Russian IRA does, and why?

In fact, I’m trying to think of a story from the past 2-3 years that was actually traced back to a real Russian, whether working for the government or not.

What does 'traced back' even mean in this context? If you mean 'originated with,' one of the more famous examples was the Columbian Chemicals plant hoax in 2014, and more recently the 2021 pre-Ukraine War propaganda justification/narrative blitz, which included claims of genocide of Russian-speakers to justify Russian intervention.

But if 'traced back' means 'shaped / signal-boosted,' which is the claimed level of involvement here, then by definition any Russian social media coverage of any topic counts, especially since you said 'whether working for the government or not.' Unless you intend to argue that the Russians don't use social media...?

Just for your understanding, this is exactly the danger of the Russian style of disinformation. It is decentralized and not tied to any particular narrative or to truth in general. The agents will amplify both true and false stories with impunity. This is because the stated goal of the Russian propaganda machine in the West is not, for example, 'make Russia look good' or 'show hypocrisy in Western countries'. The essential goal is to create division in Western societies over the long term by degrading trust in institutions, information sources, and each other.

So yes, in this case Russian disinformation may be amplifying actual government failures. In other cases it may be making things up wholesale. The point is to be aware that there are malign agents (and not just Russians) whose purpose is to turn this into a political or cultural battle rather than giving a clear picture of reality, and then factor that in to our assessment of the situation.

Show me a person of influence who made this case when the George Floyd video dropped.

I do not believe anything the Russians could ever say or do could hold even a flickering candle to the gigaton flare generated by the actual words and deeds of genuine Americans.

a flickering candle to the gigaton flare generated by the actual words and deeds of genuine Americans.

Sure, I think this is a healthy perspective. But Russia, and China, trying to sow discord is an argument some make:

https://www.politico.com/news/2020/06/01/russia-and-china-target-us-protests-on-social-media-294315

While these official social media accounts have not posted doctored images or false information, they have sowed divisive content — a strategy that Russia previously used during the 2017 Catalan referendum in Spain and the 2019 European Parliament election, according to previous analyses of social media activity by POLITICO. The goal, according to disinformation experts, is to foment distrust on both sides of the political spectrum rather than publishing easily identifiable fake social media posts.

https://journals.sagepub.com/doi/abs/10.1177/19401612221082052

RT and Sputnik primarily produced negative coverage of the BLM movement, painting protestors as violent, or discussed the hypocrisy of racial justice in America. In contrast, newer media properties like In The NOW, Soapbox, and Redfish supported the BLM movement with clickbait-style videos highlighting racism in America.

The government, or at least substantial parts of it, wanted the BLM protests. They aren’t going to call that trolling.

But again, very little of the stuff labeled 'Russian trolls' can actually be traced to Russia in any way whatsoever. They can’t find Russians behind the laptop, election fraud, UAPs, or Q. They can’t because it’s not Russia.

The person you responded to is filtered.

On the other hand, it’s a very useful tool for hiding incompetence and grift. Everything the government doesn’t want people talking about seems to be “Russian trolls,” and it’s become a go-to excuse for why people are saying things the government doesn’t want to hear on social media.

I don't see any particular reason both can't be true.