
Culture War Roundup for the week of September 30, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Just for your understanding, this is exactly the danger of the Russian style of disinformation. It is decentralized and not tied to any particular narrative or to truth in general. The agents will amplify both true and false stories with impunity. This is because the stated goal of the Russian propaganda machine in the West is not, for example, 'make Russia look good' or 'show hypocrisy in Western countries'. The essential goal is to create division in Western societies over the long term by degrading trust in institutions, information sources, and each other.

So yes, in this case Russian disinformation may be amplifying actual government failures. In other cases it may be making things up wholesale. The point is to be aware that there are malign agents (and not just Russians) whose purpose is to turn this into a political or cultural battle rather than giving a clear picture of reality, and then to factor that into our assessment of the situation.

This is an unfalsifiable theory. If there is Russian interference, hey, wow, I was right. If there's not, well, whatever, I was just being careful, and it's always good to be careful.

Russian social media campaigns being in any way influential is extremely implausible. Whatever they might be spending would be a drop in the bucket relative to what Americans spend on social media all the time. That has been the case every time a number is attached to whatever Russia is supposedly spending.

Did he claim they were influential, or was he claiming a style?

If he's claiming a style, then that would actually be falsifiable, by establishing a different style is what is actually pursued.

When the style claimed is "increases discord", it's indistinguishable from internal partisans who are unhappy with the current state of affairs, and post their (discordant) opinions on social media.

I guess this is falsifiable if you found some russian operatives posting so as to... increase harmony, but this seems unlikely, and I can't really visualize what "increase discord" looks like on the other end. "Here's some rubles, go stir the shit on twitter"? Government propaganda campaigns always have some sort of goal in mind IME -- it used to be "promote global communism", but what is it now?

When the style claimed is "increases discord", it's indistinguishable from internal partisans who are unhappy with the current state of affairs, and post their (discordant) opinions on social media.

Absolutely. Or at least, almost indistinguishable. There are occasional tells- for example, an awkward fixation on things an internal partisan wouldn't care about but that happen to align with a foreign propaganda interest (plenty of Americans don't like the idea of fighting China over Taiwan, but only a minute number object on grounds of the Century of Humiliation narrative)- but often it is indistinguishable.

This is why I'm fully sympathetic to people whose ideological immune system is flaring in suspicion.

I guess this is falsifiable if you found some russian operatives posting so as to... increase harmony, but this seems unlikely, and I can't really visualize what "increase discord" looks like on the other end. "Here's some rubles, go stir the shit on twitter"?

Unironically pretty close to that.

One of the origins of the modern Russian troll factory is that one of the more notorious- the Internet Research Agency- was founded by Yevgeny Prigozhin. Yes, the Wagner mercenary guy. Prigozhin was basically somewhere between a front, a fence, and a semi-autonomous vassal of Putin's security establishment. The distinction is that not only did he do what he was told, but he had a degree of freedom to try initiatives on his own. This was/is part of Putin's power structure, where inner-circle elites compete for power and influence and attention... and one of the ways to do so is to do something impressive. Or, in Prigozhin's case, something that appeals to Putin's spy mentality, while also serving as an excuse to charge the Russian government for services rendered. Other elites began copycatting later, and the American reaction probably justified the investment in Russian eyes, but the IRA was the first (until its dismantling / repurposing after the Wagner Coup and Prigozhin's assassination).

The IRA began in 2013, and by 2015 it had a reported ~1000 people working in a single building. One of its earlier claims to notoriety, before the 2016 election and the compromise of American political discourse on that front, came in 2014, when Russia was trying to recalibrate international opinion on its post-Euromaidan invasion of Ukraine. Buzzfeed published some leaked/stolen IRA documents, including a description of daily duties.

To quote-

"Foreign media are currently actively forming a negative image of the Russian Federation in the eyes of the global community," one of the project's team members, Svetlana Boiko, wrote in a strategy document. "Additionally, the discussions formed by comments to those articles are also negative in tone.

"Like any brand formed by popular opinion, Russia has its supporters ('brand advocates') and its opponents. The main problem is that in the foreign internet community, the ratio of supporters and opponents of Russia is about 20/80 respectively."

So how does one counter that narrative mismatch?

The documents show instructions provided to the commenters that detail the workload expected of them. On an average working day, the Russians are to post on news articles 50 times. Each blogger is to maintain six Facebook accounts publishing at least three posts a day and discussing the news in groups at least twice a day. By the end of the first month, they are expected to have won 500 subscribers and get at least five posts on each item a day. On Twitter, the bloggers are expected to manage 10 accounts with up to 2,000 followers and tweet 50 times a day.

...

The trolls appear to have taken pains to learn the sites' different commenting systems. A report on initial efforts to post comments discusses the types of profanity and abuse that are allowed on some sites, but not others. "Direct offense of Americans as a race are not published ('Your nation is a nation of complete idiots')," the author wrote of fringe conspiracy site WorldNetDaily, "nor are vulgar reactions to the political work of Barack Obama ('Obama did shit his pants while talking about foreign affairs, how you can feel yourself psychologically comfortable with pants full of shit?')." Another suggested creating "up to 100" fake accounts on the Huffington Post to master the site's complicated commenting system.

And how does one fund that?

The trolling project's finances are appropriately lavish for its considerable scale. A budget for April 2014, its first month, lists costs for 25 employees and expenses that together total over $75,000. The Internet Research Agency itself, founded last summer, now employs over 600 people and, if spending levels from December 2013 to April continue, is set to budget for over $10 million in 2014, according to the documents. Half of its budget is earmarked to be paid in cash.

So, yes. "Here's some rubles, go stir the shit on twitter" is unironically close to what happened. Reportedly.

And this was back in 2014, when the IRA was still very new and immature as an institution. As internet social media technologies evolved, so did the Russian technical infrastructure and its incorporation into information warfare theory, which itself evolved. Note that the IRA in its early days functioned as a more message-focused concept (a Russian position). However, other parts of the Russian information-proxy sphere were decentralized and took other, even contradictory stances- most notable to western observers in the pro-Wagner vs pro-MOD narrative wars before the Wagner Coup.

If you'll forgive an unrepentantly NATO-based analysis, the Irregular Warfare Center has a pretty comprehensive analysis of how the Russian information efforts have evolved over time.

Government propaganda campaigns always have some sort of goal in mind IME -- it used to be "promote global communism", but what is it now?

Other models of propaganda include making you want to buy something (advertisement), go to a specific church (missionary work), think favorably of a specific cause or subject (advocacy), think worse of a specific cause (defamation), undercut a subject's moral authority (delegitimization), spread a cultural viewpoint (normalization), and so on.

For a more typical model, China's propaganda apparatus is much more focused on specific topics where it wants you to have a specific position- a good view of Xi, the CPC, multipolarism, etc.- while having no particular stance and spending no particular effort on others. Arguing both sides of an argument is rarely done, because the point of propaganda is seen as persuading / pushing toward a certain perspective, and playing both sides at the same time is generally seen as information fratricide, countering your own efforts. When confusion is the point, it can be pursued, but such efforts are shorter-term and generally the exception rather than the norm. To a degree this is itself a measure of centralization- the Chinese government has stronger message control over its directly employed propagandists than the Russians imposed on their associated blogosphere and elite-owned influencer networks.

A general 'increase discord by truth and fiction on any topic at any time' motive is relatively rare as a result. Not only does it lead to contradictory themes, but the discord itself counts as a success in its own right. Note how Russian sources fed both anti-Trump narratives (the Steele Dossier) and anti-anti-Trump narratives (social media boosting), or how in the Ukraine context Ukraine was simultaneously a NATO puppet controlled from abroad (attempting to generate nationalist resistance to foreign meddling against European liberalism) and a Nazi regime suppressing locals (a justification for foreign intervention to prevent an antithesis of European liberalism). If the goal of the propaganda were actually to install a favored Manchurian candidate or promote a foreign (Russian) intervention, this would be self-defeating: you would be running the classical model's state-propaganda persuasion while actively undercutting it with contradictory messaging.

An implication of this sort of model is that it is not only cause-agnostic, but can take both sides of the same argument at the same time- supporting Tribe A with social media via venue C, and Tribe B on the other stance with different media via venue D. (In a non-single-nation context, if you ever get the chance, look up the global conspiracy variations of 'who is to blame for COVID.' The US and China are not the only candidates claimed.) I've long since lost the articles, but a personal pet peeve back in the early Trump administration, when the disinformation craze was at its peak, was how much of the coverage of 'Russian interference' in US politics didn't actually identify the relative partisan themes being boosted... because both Republican and Democratic themes were being boosted.

Which, as you say, can be indistinguishable from partisan propaganda, even though it has a different intent.

and I can't really visualize what "increase discord" looks like on the other end. "Here's some rubles, go stir the shit on twitter"?

If you love what you do, you’ll never work a day in your life.

That would be even emptier. Be careful about what you see on social media, because it could have the same effect as Russian disinformation. That parses to something like: Look both ways before you cross the street, because a plane could fall on you.

Counter-point, "Remind yourself that overconfidence is a slow and insidious killer."

Which has the merit and utility of being actually useful advice. Overconfidence is a risk factor, and it can take a long time to take detrimental effect. You could dismiss the warning on the same grounds of falsifiability- if overconfidence does get you killed here then you were right and if it doesn't you're just being careful and careful is good- but this ignores that sustaining carefulness is an enduring good in and of itself.

This is a relatively common form of warning for harms that come with unclear immediate impacts. Don't eat just any mushrooms you find in a forest; they may be poisonous. Walk slower on just-mopped floors; they may be slippery. Don't trust strangers on the internet; they might be bad. The fact that these warnings don't have to come in a context where the danger is immediate or guaranteed doesn't make them non-falsifiable, and their value can come precisely because the warned-against danger is rare. When an element of danger is rare, it's easy to ignore the possibility of something that could be prevented with diligence.

By contrast, 'look both ways because a plane could fall on you' has no link between the cause of the warning and its effect. Looking both ways does nothing to warn you of a danger that comes from above, so there's no merit in the diligent reminder. It is also an argument from a specific instance (planes crashing into crosswalks is so singular that it can't really be claimed as a trend) as opposed to a trend-consequence of mounting risks (overconfidence may not get you killed this time, but its recurring and persistent nature can let the threat grow over time).

Which simile is better for "the danger of the Russian style of disinformation" is up for debate, but I'd wager (and win) on the comparison to overconfidence rather than to airplanes-on-crosswalks.

How are you defining "disinformation" in this context? That Russia has a project to subvert the liberal international order that the US has run since the post-war period? They openly admit that all the time and have made formal declarations admitting as much. So presumably anybody who advances a different narrative through their own perception of events isn't pushing disinformation, unless you're setting the bar extremely low.

If Russia is this nebulous disinformation fountainhead that some people seem to think it is, then their actions prove that they're incredibly bad at it. What Russia 'has' been successful in doing is a form of national rebranding and international marketing to try and attract disaffected people in their own nations to join them. And why would such a measure be aimed at such an end? Because most of the fractious disunity in western nations has come by their own hand. The progressive left in this country has done more harm and inflicted more damage upon itself than Vladimir Putin or Osama bin Laden ever have.

How are you defining "disinformation" in this context? That Russia has a project to subvert the liberal international order that the US has run since the post-war period? They openly admit that all the time and have made formal declarations admitting as much. So presumably anybody who advances a different narrative through their own perception of events isn't pushing disinformation, unless you're setting the bar extremely low.

Why shouldn't the bar be that low for the way flailingace is using it?

Even selectively signal-boosting true-but-non-representative things can have an effect of misleading an audience. This very thread is based on someone taking something that has happened (an accusation of pushback against people wanting to help) in a way that generates outrage (FEMA is deliberately withholding help, partisan motivation?) that plausibly wouldn't exist with other potentially relevant context (the government has an interest in managing airspace, which appears to be the form of pushback being alluded to).

Nothing in it is false, but it's not information structured for building objective understanding either. It is an oppositional / antagonistic information presentation, and one that- if done deliberately- can be information meant to promote discord rather than discourse.

flailingace's position, as I understand it, isn't that it's disinformation on the basis of truth / not truth, or 'their own' narrative, but the intended result of why the information is being presented.

If Russia is this nebulous disinformation fountainhead that some people seem to think it is, then their actions prove that they're incredibly bad at it. What Russia 'has' been successful in doing is a form of national rebranding and international marketing to try and attract disaffected people in their own nations to join them. And why would such a measure be aimed at such an end? Because most of the fractious disunity in western nations has come by their own hand. The progressive left in this country has done more harm and inflicted more damage upon itself than Vladimir Putin or Osama bin Laden ever have.

Okay, I don't even disagree with you, but how does this relate to flailingace's position?

This is a counter-argument of relative effectiveness, of relative harm done, but flailingace wasn't making an argument of relative harm / culpability / etc. Flailingace is making the point that Russia will attempt to promote discord, to a person who has dismissed Russian trolls as a reasonable hypothesis, in reply to a post that also does not rest on relative effectiveness.

Remember that this branch of the conversation itself started over someone saying they felt there was a bit of an effort to manufacture an issue. Not that the issue was entirely manufactured, or that the dominant cause or concerns were manufactured.

Why shouldn't the bar be that low for the way flailingace is using it?

You can personally set the bar wherever you want. But in that case, I'm struggling to understand why people say this like it's some kind of surprise. What am I supposed to be made to think or feel upon hearing that?

Even selectively signal-boosting true-but-non-representative things can have an effect of misleading an audience. This very thread is based on someone taking something that has happened (an accusation of pushback against people wanting to help) in a way that generates outrage (FEMA is deliberately withholding help, partisan motivation?) that plausibly wouldn't exist with other potentially relevant context (the government has an interest in managing airspace, which appears to be the form of pushback being alluded to).

Well, put it this way then: anyone who would hold Russia (or anyone else, for that matter) guilty of disinformation, but not the media complex in the west- which IMO is far worse by comparison- has a very hard sell to convince me of some kind of moral indictment. Anyone who wouldn't also hang the whole of CNN, Fox, MSNBC, CBS and everyone else from lampposts outside their headquarters for being guilty of disinformation is just being a partisan hack.

Nothing in it is false, but it's not information structured for building objective understanding either. It is an oppositional / antagonistic information presentation, and one that- if done deliberately- can be information meant to promote discord rather than discourse.

And RussiaToday can also make similar claims in some of their reports, as far as exposing disinformation goes. So what? Are people calling for them to be restored to YouTube now on grounds of their occasional fairness?

flailingace's position, as I understand it, isn't that it's disinformation on the basis of truth / not truth, or 'their own' narrative, but the intended result of why the information is being presented.

Meaning what? If they're doing it for a good cause or something they agree with, then it's okay?

You can personally set the bar wherever you want. But in that case, I'm struggling to understand why people say this like it's some kind of surprise. What am I supposed to be made to think or feel upon hearing that?

That yourself and others should think on what you are feeling, and why, before you act upon what you are feeling, in case someone is trying to deceptively manipulate your feelings to cause you to act in their interests rather than yours.

That the lesson may be unnecessary to you personally does not mean the lesson is not needed by other people. Some people may not recognize that they are being targeted for manipulation. Others may dismiss the existence of relevant actors to focus on other grievances.

Well, put it this way then: anyone who would hold Russia (or anyone else, for that matter) guilty of disinformation, but not the media complex in the west- which IMO is far worse by comparison- has a very hard sell to convince me of some kind of moral indictment. Anyone who wouldn't also hang the whole of CNN, Fox, MSNBC, CBS and everyone else from lampposts outside their headquarters for being guilty of disinformation is just being a partisan hack.

Noted, but where do you get the belief that flailingace or myself wouldn't agree that those are also disinformation actors?

Granted, I don't believe in hanging disinformation actors in general, so I suppose I fail that purity test if that's the standard you want to make.

And RussiaToday can also make similar claims in some of their reports as well as far as exposing disinformation. So what?

So you should consider what, how, and why RT chooses to cover what it is covering in the way it does before taking what it says as substantially true- the same bounded skepticism you should apply to any source...

...but also that you should recognize that RT, and countless actors like it, will continue to try and execute their motives in any given case, regardless of how much traction they have in general...

...so that if you start getting a suspicion that your intake of social media on something feels like it's being manipulated to encourage an impression, you're not being crazy; you have reasonable grounds for wanting to think more critically before you decide how to feel.

And, by extension, so are other people.

Are people calling for them to be restored to YouTube now on grounds of their occasional fairness?

Yes, and why would you think there aren't any? The topic has faded from public awareness with time and distance, but there were and still are people who would agree that banning RT from YouTube was bad on various grounds.

One of the general reasons for maximal free speech stances is that even malefactors can bring up good points and challenge/compel better actors to clean themselves up in ways they wouldn't if the 'better' people could exclude them from the public stage, and that it's easier to hone the counter-arguments / strengthen your own when you can openly engage them.

Even completely unfair media actors have their defenders on the question of why they should be allowed a public position. For example, North Korea is one of the extreme examples of a 'bad media actor,' but its YouTube presence was (and, to a lesser degree, still is) a resource for researchers trying to understand it.

And this doesn't even touch on grounds of national interest, ideology, or various forms of strategy. Russia took a decent black eye in the early Ukraine War when several hosts who had previously been taking the party line- that the warnings of invasion were an American russophobic hoax- publicly quit or were fired in objection. It was a self-inflicted 'even their own propagandists couldn't support it' moment that discredited the pro-Russian factions in various western governments- a discrediting that could not have happened had RT been restricted from that sort of public awareness earlier.

Meaning what? If they're doing it for a good cause or something they agree with then its okay then?

Less 'okay' and more of 'categorical difference in actor intent.'

Let's stick to 'just' true things, as in someone who never tells a direct falsehood.

If someone says true things because they value truth as an abstract concept in and of itself, we call them a truth-seeker and can recognize their errors may be out of ignorance but not deliberate distortion of context.

If someone says true things because they dislike deception even when it would benefit them, we call them honest, and can take them at their word. Their word may be limited, and unlike the truth-seeker they may not be interested in actively establishing context and understanding, but they can be trusted within those bounds.

If someone would say true things but only selectively and with the intent to ruin others' relationships, we would call them a manipulator, and recognize that they deserve extra scrutiny. Because their intent determines what they say and why, it behooves an audience to consider whether there is additional context, missing information, or other truths that simply aren't being provided before believing what the manipulator tries to lead us to feel.

And this is before outright lies and other forms of dishonesty are included. A truth-seeker may have a motivated interest in what they focus on and find, an honest person may selectively try to avoid being questioned in certain ways to let a misunderstanding continue, but a manipulator who doesn't limit themselves to just truths can do even more to meet their interest.

Intent matters, and as such recognizing whose intent is behind what is a relevant piece of meta-context. 'Disinformation' may be an abused term, but 'Russian disinformation' is as good a term as any for characterizing a systemic intent by a coherent actor: information that is ambivalent about truth/accuracy but which is systematically proffered to shape public discourse in ways hoped to be detrimental to the targeted nation. This is a categorically different intent from, say, 'partisan disinformation'- which wants what is bad for the opposition but good for the party- or 'ideological disinformation'- which wants what is good for the cause and is willing to tear down the obstacles.

You may feel the impact is grossly overestimated- and not only would I agree, but a very recent article last week pointed out a Russian incentive to overestimate their own impact, which has interesting implications for whether western leaders are accurately relaying western intelligence that accurately reports Russian self-assessments which are themselves incorrect for reasons of self-interested motivated reasoning- but again, what you are responding to isn't about 'relative' impact.

Show me a person of influence who made this case when the George Floyd video dropped.

I do not believe anything the Russians could ever say or do could hold even a flickering candle to the gigaton flare generated by the actual words and deeds of genuine Americans.

I think I may have encountered a Russian troll. Specifically, this guy. He went into a bunch of WP articles about US surveillance, ruining them, and when I noticed the pattern and alerted WP he made a few ominous-but-vague threats and then vanished.

At the time I thought he was simply an NSA/CIA agent, but in retrospect I think that's unlikely. He was very sloppy, copypasting entire sections of NSA propaganda into Wikipedia without even changing the "we"s to "they"s, and my read on the Five Eyes is that they're usually slicker than that; a real NSA/CIA agent would also have no motivation to make vague public threats and then disappear, rather than simply ghosting straight away or picking up the phone to threaten someone for real. And if he wasn't a Five Eyes spook, he was somebody pretending to be one, presumably someone intending to get caught in order to frame them for vandalising Wikipedia. Could be a random lunatic, I suppose, but the people with a logical motive to do that are strategic adversaries of the USA, and my read based on PRC external propaganda and the Sam Dastyari fiasco is that 4D-chess shenanigans like this aren't their style. I suppose I'll never know, particularly since I've left Wikipedia.

a flickering candle to the gigaton flare generated by the actual words and deeds of genuine Americans.

Sure, I think this is a healthy perspective. But Russia, and China, trying to sow discord is an argument some make:

https://www.politico.com/news/2020/06/01/russia-and-china-target-us-protests-on-social-media-294315

While these official social media accounts have not posted doctored images or false information, they have sowed divisive content — a strategy that Russia previously used during the 2017 Catalan referendum in Spain and the 2019 European Parliament election, according to previous analyses of social media activity by POLITICO. The goal, according to disinformation experts, is to foment distrust on both sides of the political spectrum rather than publishing easily identifiable fake social media posts.

https://journals.sagepub.com/doi/abs/10.1177/19401612221082052

RT and Sputnik primarily produced negative coverage of the BLM movement, painting protestors as violent, or discussed the hypocrisy of racial justice in America. In contrast, newer media properties like In The NOW, Soapbox, and Redfish supported the BLM movement with clickbait-style videos highlighting racism in America.

Computer, enhance:

Over the last three days, Chinese ambassadors, Russian-backed news outlets and others with ties to Russia and China have tweeted more than 1,200 times about the United States,

Wow, foreign infiltrators tweeted a thousand times! That's a lot of tweets.

Come on, there is no evidence that these campaigns are even statistically significant. I know guys who put out that many tweets in a week.

At least in 2016 they also had bots/provocateurs masquerading as legitimate users. And Russia just wanted to fan the flames, they played both sides from “gay rights to gun rights”.

WSJ 2017: Facebook Users Were Unwitting Targets of Russia-Backed Scheme
https://archive.ph/rZJBo

“Blacktivist,” an account that supported causes in the black community and used hashtags such as #BlackLivesMatter, frequently posted videos of police allegedly shooting unarmed black men.

The issues they targeted spanned the U.S. political and social spectrum, including religion, race, immigration, gun rights and gay rights. Facebook said the accounts were created by Russian entities to exploit tensions among Americans and interfere with U.S. elections.

NYT 2017: Purged Facebook Page Tied to the Kremlin Spread Anti-Immigrant Bile
https://archive.ph/kuS2E

Oh yeah? Did muh russia also coordinate the drag queen story hours in libraries? Did the russians decide to inject dildos and strapons into children's books? This muh russia shit is so tiresome, the progs are running amok and when caught with their pants down all they can do is deflect-deflect-deflect.

A WSJ story about two Facebook accounts, and a NYT story about one. If these campaigns were worth taking seriously there'd be more to show for it. Right now scrolling twitter in fear of Russian Propaganda is pretty out-there in risk-perception. I read an article today about a sick cow in Idaho, maybe I'll go vegan.

The government or at least substantial parts of it wanted the BLM protests. They aren’t going to call it trolling.

But again, very little of the stuff named Russian Trolls can actually be traced to Russia in any way whatsoever. They can’t find Russians behind the Laptop, election fraud, UAPs, or Q. They can’t because it’s not Russia.

The person you responded to is filtered.