Culture War Roundup for the week of September 30, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Nowadays, any time there is a disaster in the United States, you should assume that there is a Russian social media effort to inflame and twist it. Sometimes a disaster doesn't even have to actually occur, and they'll just fake-news one. This is just one of the things they do, independent of any truth to any criticism.

On the other hand, it’s a very, very useful tool to hide incompetence and grift. Everything the government doesn’t want people talking about seems to be “Russian Trolls,” and it’s become a sort of go-to excuse for why people are saying things the government doesn’t want to hear on social media. Sure, sometimes it’s trolls, but by this point, enough ultimately true stories were officially dismissed as misinformation until they were shown to actually have happened that I no longer find the “Russian Trolls” story to be a sensible hypothesis. In fact, I’m trying to think of a story from the past 2-3 years where it’s actually been traced back to a real Russian, whether working for the government or not.

I’m mostly with the steelman here. People who don’t know what they’re doing wandering around a disaster area are more likely to create situations where they need rescue than to do substantial good. A bunch of rednecks coming in and sawing through things or chopping down trees might well injure people or need rescue themselves. Disaster areas tend to be dangerous, and the dangers aren’t always obvious; taking your jon boat over downed power lines is pretty dangerous. So the government probably is turning people away because it doesn’t want to rescue redneck brigades who have no experience rescuing people.

It's always extremely easy to be intellectually lazy and unconsciously fall victim to propaganda. Britain and the US invented the modern public relations and propaganda industry and have been very successful at convincing the average person, who had never even heard of places like Donetsk and Luhansk until a few years ago, that they're on the right side of an issue they know nothing about.

I remember a while ago getting into a debate with someone in the /r/geopolitics subreddit who literally said to me that if Russia only spent more money on its domestic social programs to take care of its people, NATO wouldn't expand into Ukraine. And that is not hyperbole. This is the quality and caliber of the average person who takes great pride in having very strong opinions about something they know absolutely nothing about. Americans in general are not very good at putting themselves in the shoes of other people, and when you combine that with someone who mistakes the philosophy subreddit for the geopolitics one when it comes to understanding international affairs, riding a bike on the highway isn't your only problem when you're also going the wrong direction.

If you don't understand what's really going on, then you can't even represent the other accurately enough to have a sensible disagreement with it.

On the other hand, it’s a very, very useful tool to hide incompetence and grift. Everything the government doesn’t want people talking about seems to be “Russian Trolls,” and it’s become a sort of go-to excuse for why people are saying things the government doesn’t want to hear on social media. Sure, sometimes it’s trolls, but by this point, enough ultimately true stories were officially dismissed as misinformation until they were shown to actually have happened that I no longer find the “Russian Trolls” story to be a sensible hypothesis.

Are you even dismissing the right hypothesis?

No, seriously. I think you misread what was claimed, and projected previous / other experiences onto it. The hypothesis is not that 'the coverage is the result of Russian trolls.' The hypothesis is 'no matter what happens, there will be Russian trolls trying to make it worse.' Whether the Russian trolls succeed in significantly shaping the conversation, or originated the talking points, or are fallaciously conflated with legitimate grievance is irrelevant to a characterization of their (a) existence and (b) attempts.

If you want to dismiss that, sure, but you haven't actually provided grounds for disputing either supporting point. Which do you find non-sensible: that Russian troll farms like the Internet Research Agency exist?

Very directly- what do you think the Russians use the Internet Research Agency for? Not how influential it is, not whether it's fair to tar Americans with guilt by association. What do you think the Russian IRA does, and why?

In fact, I’m trying to think of a story from the past 2-3 years where it’s actually been traced back to a real Russian, whether working for the government or not.

What does 'traced back' even mean in this context? If you mean 'originated with,' one of the more famous was the Columbian Chemicals plant hoax in 2014, and more recently the 2021 pre-Ukraine War propaganda justification/narrative blitz, which included claims of genocide of Russian-speakers to justify Russian intervention.

But if 'traced back' means 'shaped / signal boosted,' which is the claimed level of involvement here, then by definition any Russian social media coverage of any topic counts, especially since you said 'for the government or not.' Unless you intend to argue that the Russians don't use social media...?

I personally find the "Russian trolls" narratives to be really frustrating because, whether the subject actually originated with them or was merely amplified by them, the discussion tends to devolve into Westerners (Americans) accusing each other of being Russian trolls. Which is itself a loss in social trust, "making it worse" in ways far beyond what the Russians would have been able to do themselves. Bickering about Russian trolls is, in itself, a victory for those trolls! The long-running inquisition into Russian activities in the 2016 election seems to me to have been far more damaging to American institutions than anything the Russians themselves directly did.

Which isn't to say that they don't exist -- they do -- but most coverage I see of the issue seems, at best, counterproductive.

I'd fully agree on the grounds of counter-productiveness and social trust loss, and I've had similar thoughts for some time. Even here, the point of originally raising it was as an example of an actor that would be present, rather than a claim that the actor was responsible, but not being clear enough about that clearly triggered the (justified!) argument-immune-system response in some.

Which I think has been more than interesting enough to leave the original lack of clarity in, but I truly do sympathize with those who thought I was implying something I didn't intend to.

In the spirit of an apology- and to maybe remind myself to write an effort post on it later- here's a pretty interesting article from Foreign Affairs last week on how Russian influencer-networks like the Social Design Agency are inflating their roles.

This has some interesting (and effort-post worthy) implications for western discourse on Russian troll farms, as it can mean that Western leaders are truthfully conveying key points from actual intelligence reporting that accurately characterizes the intent of genuine Russian influence efforts. It is both a potential example of the limits of deductive reasoning (where all premises must be true, but here the chain of links can be compromised by self-aggrandizement), and an insight into the head-space of leaders who see these reports of 'we're going to mess with the Americans with lies,' try to tell the public of these things, and are... discounted and dismissed by people who then also repeat the themes these actors say they're going to boost.

There are more steps than that- the conflation of false and true signal boosting, the role of lack of social credibility, the motivated reasoning to believe the negative effects are the result of a malefactor taking credit for achieving them- but just as intellectual empathy requires understanding why some people can doubt elites for reasonable reasons, the same standard can allow that elites can have their own reasonable reasons to believe things others may dismiss as mere partisan motivation.

Just for your understanding, this is exactly the danger of the Russian style of disinformation. It is decentralized and not tied to any particular narrative or to truth in general. The agents will amplify both true and false stories with impunity. This is because the stated goal of the Russian propaganda machine in the West is not, for example, 'make Russia look good' or 'show hypocrisy in Western countries'. The essential goal is to create division in Western societies over the long term by degrading trust in institutions, information sources, and each other.

So yes, in this case Russian disinformation may be amplifying actual government failures. In other cases it may be making things up wholesale. The point is to be aware that there are malign agents (and not just Russians) whose purpose is to turn this into a political or cultural battle rather than giving a clear picture of reality, and then factor that in to our assessment of the situation.

This is an unfalsifiable theory. If there is Russian interference, hey, wow, I was right. If there's not, well, whatever, I was just being careful, and it's always good to be careful.

Russian social media campaigns being in any way influential is extremely implausible. Whatever they might be spending would be a drop in the bucket relative to what Americans spend on social media all the time. That has been the case every time a number is attached to whatever Russia is supposedly spending.

Did he claim they were influential, or was he claiming a style?

If he's claiming a style, then that would actually be falsifiable, by establishing a different style is what is actually pursued.

How are you defining "disinformation" in this context? That Russia has a project to subvert the liberal international order that the US has run since the post-war period? They openly admit that all the time and have made formal declarations admitting as much. So presumably anybody who advances a different narrative through their own perception of events isn't pushing disinformation, unless you're setting the bar extremely low.

If Russia is this nebulous disinformation fountainhead that some people seem to think it is, then their actions prove that they're incredibly bad at it. What Russia 'has' been successful in doing is a form of national rebranding and international marketing to try and attract disaffected people in other nations to join them. And why would such a measure be aimed at such an end? Because most of the fractious disunity in western nations has come by their own hand. The progressive left in this country has done more harm and inflicted more damage upon itself than Vladimir Putin or Osama bin Laden ever have.

How are you defining "disinformation" in this context? That Russia has a project to subvert the liberal international order that the US has run since the post-war period? They openly admit that all the time and have made formal declarations admitting as much. So presumably anybody who advances a different narrative through their own perception of events isn't pushing disinformation, unless you're setting the bar extremely low.

Why shouldn't the bar be that low for the way flailingace is using it?

Even selectively signal-boosting true-but-non-representative things can have the effect of misleading an audience. This very thread is based on someone taking something that has happened (an accusation of pushback against people wanting to help) in a way that generates outrage (FEMA is deliberately withholding help, partisan motivation?) that plausibly wouldn't exist with other potentially relevant context (the government has an interest in managing airspace, which appears to be the form of pushback being alluded to).

Nothing in it is false, but it's not information structured for building objective understanding either. It is an oppositional / antagonistic information presentation, and one that- if done deliberately- can be information to promote discord rather than discourse.

flailingace's position, as I understand it, isn't that it's disinformation on the basis of truth / not truth, or 'their own' narrative, but the intended result of why the information is being presented.

If Russia is this nebulous disinformation fountainhead that some people seem to think it is, then their actions prove that they're incredibly bad at it. What Russia 'has' been successful in doing is a form of national rebranding and international marketing to try and attract disaffected people in other nations to join them. And why would such a measure be aimed at such an end? Because most of the fractious disunity in western nations has come by their own hand. The progressive left in this country has done more harm and inflicted more damage upon itself than Vladimir Putin or Osama bin Laden ever have.

Okay, I don't even disagree with you, but how does this relate to flailingace's position?

This is a counter-argument of relative effectiveness, of relative harm done, but flailingace wasn't making an argument of relative harm / culpability / etc. Flailingace is making the point that Russia will attempt to promote discord, to a person who has dismissed Russian trolls as a reasonable hypothesis, in reply to another post that also does not rest on relative effectiveness.

Remember that this branch of the conversation itself started over someone saying they felt there was a bit of an effort to manufacture an issue. Not that the issue was entirely manufactured, or that the dominant cause or concerns were manufactured.

Why shouldn't the bar be that low for the way flailingace is using it?

You can personally set the bar wherever you want. But in that case, I'm struggling to understand why people say this like it's some kind of surprise. What am I supposed to be made to think or feel upon hearing that?

Even selectively signal-boosting true-but-non-representative things can have an effect of misleading an audience. This very thread is based on someone taking something that has happened (an accusation of pushback against people wanting to help) in a way that generates outrage (FEMA is deliberately witholding help, partisan motivation?) that plausibly wouldn't exist with other potentially relevant context (the government has an interest in managing airspace, which appears to be the form of pushback being alluded to).

Well, put it this way then. Anyone who would want to hold Russia, or anyone else for that matter, guilty of disinformation, and not the media complex in the West, which IMO is far worse by comparison, has a very hard sell to convince me of some kind of moral indictment. Anyone who wouldn't also hang the whole of CNN, Fox, MSNBC, CBS, and everyone else from lampposts outside their headquarters for also being guilty of disinformation is just being a partisan hack.

Nothing in it is false, but it's not information structured for building objective understanding either. It is an oppositional / antagonistic information presentation, and one that- if done deliberately- can be information to promote discord rather than discourse.

And Russia Today can also make similar claims in some of their reports as far as exposing disinformation goes. So what? Are people calling for them to be restored to YouTube now on grounds of their occasional fairness?

flailingace's position, as I understand it, isn't that it's disinformation on the basis of truth / not truth, or 'their own' narrative, but the intended result of why the information is being presented.

Meaning what? If they're doing it for a good cause or something they agree with, then it's okay?

You can personally set the bar wherever you want. But in that case, I'm struggling to understand why people say this like it's some kind of surprise. What am I supposed to be made to think or feel upon hearing that?

That you and others should think about what you are feeling, and why, before you act on it, in case someone is trying to deceptively manipulate your feelings to make you act in their interests rather than yours.

That the lesson may be unnecessary to you personally does not mean the lesson is not needed for other people. Some people may not recognize that they are being targeted for manipulation. Others may dismiss the existence of relevant actors to focus on other grievances.

Well, put it this way then. Anyone who would want to hold Russia, or anyone else for that matter, guilty of disinformation, and not the media complex in the West, which IMO is far worse by comparison, has a very hard sell to convince me of some kind of moral indictment. Anyone who wouldn't also hang the whole of CNN, Fox, MSNBC, CBS, and everyone else from lampposts outside their headquarters for also being guilty of disinformation is just being a partisan hack.

Noted, but where do you get the belief that flailingace or I wouldn't agree that those are also disinformation actors?

Granted, I don't believe in hanging disinformation actors in general, so I suppose I fail that purity test if that's the standard you want to make.

And Russia Today can also make similar claims in some of their reports as far as exposing disinformation goes. So what?

So you should consider what, how, and why RT chooses to cover what it covers in the way it does before taking what it says as substantially true, the same bounded skepticism you should have of any source...

...but also that you should recognize that RT, and countless actors like it, will continue to try and execute their motives in any given case, regardless of how much traction they have in general...

...so that if you start getting a suspicion that your intake of social media on something feels like it's being manipulated to encourage an impression, you're not being crazy; you have reasonable grounds for wanting to think more critically before you decide how to feel.

And, by extension, so are other people.

Are people calling for them to be restored to YouTube now on grounds of their occasional fairness?

Yes, and why would you think there aren't any? The topic has faded from public awareness with time and distance, but there were and still are people who would agree that banning RT from YouTube was bad on various grounds.

One of the general reasons for maximal free speech stances is that even malefactors can bring up good points and challenge/compel better actors to clean themselves up in ways they wouldn't if the 'better' people could exclude them from the public stage, and that it's easier to hone the counter-arguments / strengthen your own when you can openly engage them.

Even completely unfair media actors have their defenders on why they should be allowed to have a public position. For example, North Korea is one of the extreme examples of a 'bad media actor,' but its YouTube presence was (and, to a lesser degree, still is) a resource for researchers trying to understand it.

And this doesn't even touch on grounds of national interest, ideology, or various forms of strategy. Russia took a decent black eye in the early Ukraine War when several hosts who had previously been taking the party line that the warnings of invasion were an American russophobic hoax publicly quit / were fired in objection. It was a self-inflicted 'even their own propagandists couldn't support it' blow that could not have discredited the pro-Russian factions in various western governments had RT been restricted from that sort of public awareness earlier.

Meaning what? If they're doing it for a good cause or something they agree with, then it's okay?

Less 'okay' and more of 'categorical difference in actor intent.'

Let's stick to 'just' true things, as in someone who never tells a direct falsehood.

If someone says true things because they value truth as an abstract concept in and of itself, we call them a truth-seeker and can recognize their errors may be out of ignorance but not deliberate distortion of context.

If someone says true things because they dislike deception even when it would benefit them, we call them honest, and can take them at their word. Their word may be limited, and unlike the truth-seeker they may not be interested in actively establishing context and understanding, but they can be trusted within those bounds.

If someone would say true things but only selectively and with the intent to ruin others' relationships, we would call them a manipulator, and recognize that they deserve extra scrutiny. Because their intent is what determines what they say and why, it behooves an audience to consider whether there is additional context, missing information, or other truths that simply aren't being provided before believing what the manipulator tries to lead us to feel.

And this is before outright lies and other forms of dishonesty are included. A truth-seeker may have a motivated interest in what they focus on and find, an honest person may selectively try to avoid being questioned in certain ways to let a misunderstanding continue, but a manipulator who doesn't limit themselves to just truths can do even more to meet their interest.

Intent matters, and as such recognizing whose intent is behind what is a relevant piece of meta-context. 'Disinformation' may be an abused term, but 'Russian disinformation' is as good a term as any for characterizing a systemic intent by a coherent actor: information that is ambivalent about truth/accuracy but which is systematically proffered to try and shape public discourse in ways hoped to be detrimental to the national target. This is a categorically different intent from, say, 'partisan disinformation,' which wants what is bad for the opposition but good for the party, or 'ideological disinformation,' which wants what is good for the cause and is willing to tear down the obstacles.

You may feel the impact is grossly overestimated. Not only would I agree, but there was a very recent article last week pointing out a Russian incentive to overestimate their own impact, which has interesting implications for whether western leaders are accurately relaying western intelligence that accurately reports Russian self-assessments which are themselves incorrect for reasons of self-interested motivated reasoning. But again, what you are responding to isn't about 'relative' impact.

Show me a person of influence who made this case when the George Floyd video dropped.

I do not believe anything the Russians could ever say or do could hold even a flickering candle to the gigaton flare generated by the actual words and deeds of genuine Americans.

I think I may have encountered a Russian troll. Specifically, this guy. He went into a bunch of WP articles about US surveillance, ruining them, and when I noticed the pattern and alerted WP he made a few ominous-but-vague threats and then vanished.

At the time I thought he was simply an NSA/CIA agent, but in retrospect I think that's unlikely. He was very sloppy, copypasting entire sections of NSA propaganda into Wikipedia without even changing the "we"s to "they"s, and my read on the Five Eyes is that they're usually slicker than that; a real NSA/CIA agent would also have no motivation to make vague public threats and then disappear, rather than simply ghosting straight away or picking up the phone to threaten someone for real. And if he wasn't a Five Eyes spook, he was somebody pretending to be one, presumably someone intending to get caught in order to frame them for vandalising Wikipedia. Could be a random lunatic, I suppose, but the people with a logical motive to do that are strategic adversaries of the USA, and my read based on PRC external propaganda and the Sam Dastyari fiasco is that 4D-chess shenanigans like this aren't their style. I suppose I'll never know, particularly since I've left Wikipedia.

a flickering candle to the gigaton flare generated by the actual words and deeds of genuine Americans.

Sure, I think this is a healthy perspective. But Russia, and China, trying to sow discord is an argument some make:

https://www.politico.com/news/2020/06/01/russia-and-china-target-us-protests-on-social-media-294315

While these official social media accounts have not posted doctored images or false information, they have sowed divisive content — a strategy that Russia previously used during the 2017 Catalan referendum in Spain and the 2019 European Parliament election, according to previous analyses of social media activity by POLITICO. The goal, according to disinformation experts, is to foment distrust on both sides of the political spectrum rather than publishing easily identifiable fake social media posts.

https://journals.sagepub.com/doi/abs/10.1177/19401612221082052

RT and Sputnik primarily produced negative coverage of the BLM movement, painting protestors as violent, or discussed the hypocrisy of racial justice in America. In contrast, newer media properties like In The NOW, Soapbox, and Redfish supported the BLM movement with clickbait-style videos highlighting racism in America.

Computer, enhance:

Over the last three days, Chinese ambassadors, Russian-backed news outlets and others with ties to Russia and China have tweeted more than 1,200 times about the United States,

Wow, foreign infiltrators tweeted a thousand times! That's a lot of tweets.

Come on, there is no evidence that these campaigns are even statistically significant. I know guys who put out that many tweets in a week.

The government, or at least substantial parts of it, wanted the BLM protests. They aren’t going to call it trolling.

But again, very little of the stuff labeled Russian Trolls can actually be traced to Russia in any way whatsoever. They can’t find Russians behind the laptop, election fraud, UAPs, or Q. They can’t because it’s not Russia.

The person you responded to is filtered.

On the other hand, it’s a very very useful tool to hide incompetence and grift. Everything the government doesn’t want people talking about seems to be “Russian Trolls” and it’s become a sort of go to excuse for why people are saying things the government doesn’t want to hear on social media.

I don't see any particular reason both can't be true.