This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
On the other hand, it’s a very, very useful tool to hide incompetence and grift. Everything the government doesn’t want people talking about seems to be “Russian Trolls,” and it’s become a sort of go-to excuse for why people are saying things the government doesn’t want to hear on social media. Sure, sometimes it’s trolls, but by this point, enough ultimately true stories were officially dismissed as misinformation until they were shown to actually have happened that I no longer find the “Russian Trolls” story to be a sensible hypothesis. In fact, I’m trying to think of a story from the past 2-3 years that was actually traced back to a real Russian, whether working for the government or not.
I’m mostly with the steelman here. People who don’t know what they’re doing wandering about a disaster area are more likely to create situations where they need rescue than to do substantial good. A bunch of rednecks coming in and sawing through things or chopping down trees or whatever might well injure people or need rescue themselves. Disaster areas tend to be dangerous, and the dangers aren’t always obvious. Taking your jon boat over downed power lines is pretty dangerous. So the government probably is turning people away because it doesn’t want to rescue the redneck brigades who have no experience rescuing people.
It's always extremely easy to be intellectually lazy and unconsciously fall victim to propaganda. Britain and the US invented the modern public relations and propaganda industry, and they have been very successful at convincing the average person, who had never even heard of places like Donetsk and Luhansk until a few years ago, that they're on the right side of an issue they know nothing about.
I remember a while ago getting into a debate with someone in the /r/geopolitics subreddit, who literally said to me that if Russia only spent more money on its domestic social programs to take care of its people, NATO wouldn't expand into Ukraine. And that is not hyperbole. This is the quality and caliber of the average person who takes great pride in having very strong opinions about something they know absolutely nothing about. Americans in general are not very good at putting themselves in the shoes of other people, and when you combine that with someone who mistakes the philosophy subreddit for the geopolitics one when it comes to understanding international affairs, riding a bike on the highway isn't your only problem when you're also going the wrong direction.
If you don't understand what's really going on, then you can't even represent the other accurately enough to have a sensible disagreement with it.
Are you even dismissing the right hypothesis?
No, seriously. I think you misread what was claimed, and projected previous / other experiences onto it. The hypothesis is not that 'the coverage is the result of Russian trolls.' The hypothesis is that 'no matter what happens, there will be Russian trolls trying to make it worse.' Whether the Russian trolls succeed in significantly shaping the conversation, or originated the talking points, or are fallaciously conflated with legitimate grievance is irrelevant to a characterization of their (a) existence and (b) attempts.
If you want to dismiss that, sure, but you haven't actually provided grounds for disputing either supporting point. Which do you find non-sensible: that Russian troll farms like the Internet Research Agency exist?
Very directly: what do you think the Russians use the Internet Research Agency for? Not how influential it is, not whether it's fair to tar Americans with guilt by association. What do you think the Russian IRA does, and why?
What does 'traced back' even mean in this context? If you mean 'originated with,' one of the more famous was the Columbian Chemicals plant hoax in 2014, and more recently the 2021 pre-Ukraine War propaganda justification/narrative blitz, which included claims of genocide of Russian-speakers to justify Russian intervention.
But if 'traced back' means 'shaped / signal-boosted,' which is the claimed level of involvement here, then by definition any Russian social media coverage of any topic counts, especially since you said 'for the government or not.' Unless you intend to argue that the Russians don't use social media...?
Except that every time I’ve seen the claim made, it’s not really backed up by any evidence of trolling. It’s just a go-to excuse for the reports in question circulating on social media. This isn’t remotely a good-faith attempt at explaining what’s going on, but an easy off-the-cuff statement of “yeah, don’t pay attention to this.” And I think at this point, the propaganda claiming that Russia is causing or amplifying these stories by far outstrips what Russia itself is actually doing.
Yes, troll farms exist; I’m not disputing that Russia, China, and pretty much every other country with internet access has some sort of troll farm. But if they aren’t capable of getting results, and good results at that, then it kinda doesn’t matter. And given that it’s possible for us to track them and we know where the trolling is coming from, stuff like this is probably fairly trivial to block.
And to be clear my grounds for dismissal are pretty simple. First, this is the go-to story every single time a social media story contradicts or embarrasses the cathedral. It never happens that Russian Trolls are pushing the narrative of Project 2025, or calling Trump a danger to democracy, or calling Republicans fascists. That is never considered trolling. But when the story is something embarrassing to the establishment, that’s the trolls. Kinda interesting how one set of stories is always pushed by, started by, faked by, or amplified by Russia, and the other side absolutely never is.
Secondly, we never seem to find out which Russian troll account starts or amplifies these stories. Can you name any troll accounts outed by the regime? Have they given any evidence beyond “trust us bro” for any such claims that a story has been deliberately seeded or amplified by a known Russian troll account? And this seems fairly telling. There’s almost never evidence presented to show these trolls did all the things they’re accused of. They are invisible and leave no evidence behind every time.
The Russian IRA does cyberwarfare, that much is obvious. To the degree it exists, it’s there to do various forms of cyber warfare in support of Russian military operations. And it’s not like I don’t think they’re occasionally effective. Honestly they might be as good as the ones in the CIA group we have. But again, if you’re going to issue a blanket statement that every anti-cathedral story on social media is based on something Russians are pushing, it’s simply not credible unless and until it’s shown to actually have been done by Russia.
To blame Russian trolls for every negative viral story is a conspiracy theory.
Cool, but who here that you're replying to is doing that?
You led off with this:
And if you're looking for examples of Russian efforts because you literally have never heard of any, sure, these can be provided. Here is a 140-page academic review of Russian propaganda in the context of the start of the Ukraine war. Here is a 2014 (and thus pre-2016-craziness) report on the Internet Research Agency, one of the original notable troll farms. Here is coverage of IRA-linked accounts conducting an Ebola and cop-shooting hoax in Atlanta, GA. Here is a study of when IRA accounts were engaging in pre-COVID vaccine debates. Here are IRA posters involved in inflammatory British rhetoric. Here are times they helped organize protests by Americans on differing parts of the spectrum, including BLM.
Heck- and you'd probably agree with the thesis of this one- here is a Foreign Affairs article including a recounting of the Doppelganger project, which cloned entire news sites to introduce fake news on what people believed were real websites.
One of the benefits of the IRA when it was around was that it didn't constantly rotate its accounts, allowing for pattern-matching. This has gotten rarer with evolutions in bot technologies and such, but you can still find examples if you look.
But then you go to this:
Which is assigning a motive to me I do not have, and a mischaracterization of many opinions I do have.
To which I and others would say... yes! If/when Russian troll accounts are linked to such things, they can absolutely be called supported by Russian trolls! It's Russian trolls if they're involved in trying to arrange Black Lives Matter protests. If Russian trolls are linked to supporting a cause, it is considered Russian trolling. There is no claim that the Russian troll style has any allegiance to a specific cause.
....but this is where I feel bad for you, because this is the opposite of positions already provided to you in this overall thread. The people claiming Russian trolls only support one side are not the people you are actually arguing against; shoving other people's arguments into theirs is dishonest.
As such, I'm going to skip to this-
And be frank: it doesn't matter to the argument you responded to whether Russian troll accounts start or amplify these stories.
There are cases of Russian trolls starting stories. There are cases of Russian trolls amplifying stories. Neither is meaningfully different when it comes to whether it is part of an effort to manufacture a narrative. Signal boosting and initiation are both ways to try to manipulate narratives.
Based on the "cathedral" and "establishment" phrasing, it looks like you believe the Democrats are basically in control of the country. If so, then why would Russia, should they aim to destabilize, push narratives against the underdog Republicans? The Republicans are already losing; if Russia starts helping Democrats they'll just lose harder, and then there'll be no destabilization, just a secured Democrat regime.
Would boosting PunchANazi, BLM, MeToo, Trans Women Are Women and whatnot count as helping or hurting Republicans?
I have two questions:
"$controversy is spreading, what's the point of bringing Russians into it, if you're not going to make a claim on the spread being a result of their interference?"

Yes. The point is raising an uncontroversial example demonstrating the claim that there are motivated actors who will try to shift a public discourse regardless of context, and regardless of whether that involves lying or truth.
Notably, the controversy here isn't whether the Russians do it, which was the claimed example, but how responsible they are for the effect of discord, which was not argued and is irrelevant to the position.
Russia was raised as a single example because a single example was all that was needed to demonstrate the premise; a single example from US politics could have been interpreted as an insinuation of relative responsibility on the part of the party invoked, and insulting to the tribes associated with it; and two or more examples would have been twice or more the work without changing the generally uncontested point that the example was raised to demonstrate.
Writing about a whole bunch of groups seemed unnecessary. Is it?
It's still not clear to me what the meaningful content here is. What information is it bringing that wasn't already being taken into account?
- Jews steal!
- Everybody steals!
- Well yeah, but we were talking about Jews.
What can I say, I disagree. If you wanted to make the point that we are all being psy-opped by cyber-warfare divisions of every major world power, the point would have been better made as a general statement. If you single out one particular power, it will look like you think there's something different about them in particular.
It's a bit like that thing about cat-eating Haitians. The claim is not particularly interesting if it was a freak occurrence, and raising it only makes sense if Haitians are disproportionately more likely to do it.
To the person who originally felt that there may be actors trying to manipulate public discourse, affirmation that there are actors trying to manipulate public discourse.
Someone is learning something for the first time every day. The information is always meaningful for those who weren't already taking it into account.
Except that not all psy-ops run in the Russian style, which was the specific style identified for the example, so claiming that every major world power is psy-opping in the same way would not only be wrong, but a deliberate falsehood.
And if I didn't single out an example, I could be accused of not supporting a claim and doing low-effort posting.
Shrugs
Is there a credible reason to believe a disproportionately refugee population from a state with endemic contemporary food insecurity is not disproportionately more likely to partake in non-traditional dining?
I'm not sure what to tell that person other than "welcome to the Internet". There have been actors trying to manipulate public discourse since forever. Maybe you mean "state actors"? That is an interesting development, as far as history goes, but it's not even a recent one.
Ok, so there is a reason to single out Russia. I'll even agree with it. Unlike when they're trying to affect countries in their orbit (say, Russia trying to push Ukrainians to vote for a pro-Russian party), America's rivals probably have greater incentive just to cause chaos to weaken America, rather than back any particular faction, so their cyber-warfare operations will look particularly twisted.
And while this might be an interesting conversation if we were discussing psy-ops in themselves, I still feel like my earlier "what's the content here" question has merit. Because Russians have an incentive to cause chaos and have Westerners at each other's throats, you can't even tell what narrative they're promoting. It could be "FEMA IS PREVENTING VOLUNTEERS FROM DELIVERING AID" or it could be "RUSSIAN BOTS AND CONSPIRACY THEORISTS ARE CLAIMING FEMA IS PREVENTING VOLUNTEERS FROM DELIVERING AID". It could even be both. It just doesn't seem to bring that much into a discussion of whether it's true that FEMA is blocking aid.
I'm actually on team "Haitians eat cats" for this very reason; it was just an example. Now that I think about it, I'm not sure the analogy even fits that well, but my point was just that if you single a group out you should show how that group is different from the other groups.
The argument is that Haitians never did it, not once. And TBF the evidence in favor of them doing it is pretty weak. I'm not aware of anyone who says there was a freak occurrence of Haitians eating cats.
I personally find the "Russian trolls" narratives really frustrating because, whether the subject actually originated with them or was merely amplified by them, the discussion tends to devolve into Westerners (Americans) accusing each other of being Russian trolls. That is itself a loss in social trust, "making it worse" in ways far beyond what the Russians could have achieved themselves. Bickering about Russian trolls is, in itself, a victory for those trolls! The long-running inquisition into Russian activities in the 2016 election seems to me to have been far more damaging to American institutions than anything the Russians themselves directly did.
Which isn't to say that they don't exist -- they do -- but most coverage I see of the issue seems, at best, counterproductive.
I'd fully agree on the grounds of counter-productivity and social trust loss, and I've had similar thoughts for some time. Even here, the point of originally raising it was as an example of an actor that would be present, rather than a claim that the actor was responsible, but not being clear enough about that clearly triggered the (justified!) argument-immune-system response for some.
Which I think has been more than interesting enough to leave the original lack of clarity in, but I truly do sympathize with those who thought I was implying something I didn't intend to.
In the spirit of an apology- and to maybe remind myself to write an effort post on it later- here's a pretty interesting article from Foreign Affairs last week on how Russian influencer networks like the Social Design Agency are inflating their roles.
This has some interesting (and effort-post-worthy) implications for Western discourse on Russian troll farms, since it means Western leaders may be truthfully conveying key points from actual intelligence reporting that accurately characterizes the intent of genuine Russian influence efforts. It is both a potential example of the limits of deductive reasoning (where all premises must be true, but here the chain of links can be compromised by self-aggrandizement), and a window into the head-space of leaders who see these reports of 'we're going to mess with the Americans with lies,' try to tell the public about them, and are... discounted and dismissed by people who then also repeat the themes these actors say they're going to boost.
There are more steps than that- the conflation of false and true signal boosting, the role of lack of social credibility, the motivated reasoning to believe the negative effects are the result of a malefactor taking credit for achieving them- but just as intellectual empathy requires understanding why some people can doubt elites for reasonable reasons, the same standard can allow that elites have their own reasonable reasons to believe things others may dismiss as mere partisan motivation.
I look forward to reading your effortpost! It sounds interesting.
EDIT: that Foreign Affairs article seems pretty reasonable. Thanks for the link!
Just for your understanding, this is exactly the danger of the Russian style of disinformation. It is decentralized and not tied to any particular narrative or to truth in general. The agents will amplify both true and false stories with impunity. This is because the stated goal of the Russian propaganda machine in the West is not, for example, 'make Russia look good' or 'show hypocrisy in Western countries'. The essential goal is to create division in Western societies over the long term by degrading trust in institutions, information sources, and each other.
So yes, in this case Russian disinformation may be amplifying actual government failures. In other cases it may be making things up wholesale. The point is to be aware that there are malign agents (and not just Russians) whose purpose is to turn this into a political or cultural battle rather than give a clear picture of reality, and then to factor that into our assessment of the situation.
This is an unfalsifiable theory. If there is Russian interference, hey, wow, I was right. If there's not, well, whatever, I was just being careful, and it's always good to be careful.
Russian social media campaigns being in any way influential is extremely implausible. Whatever they might be spending would be a drop in the bucket relative to what Americans spend on social media all the time. That has been the case every time a number is attached to whatever Russia is supposedly spending.
Did he claim they were influential, or was he claiming a style?
If he's claiming a style, then that would actually be falsifiable, by establishing a different style is what is actually pursued.
When the style claimed is "increases discord", it's indistinguishable from internal partisans who are unhappy with the current state of affairs, and post their (discordant) opinions on social media.
I guess this is falsifiable if you found some russian operatives posting so as to... increase harmony, but this seems unlikely, and I can't really visualize what "increase discord" looks like on the other end. "Here's some rubles, go stir the shit on twitter"? Government propaganda campaigns always have some sort of goal in mind IME -- it used to be "promote global communism", but what is it now?
Absolutely. Or at least, almost indistinguishable. There are occasional tells- for example, an awkward fixation on things an internal partisan wouldn't care about that happen to align with a foreign propaganda interest (plenty of Americans don't like the idea of fighting China over Taiwan, but only a minute number do so on grounds of appeals to the Century of Humiliation narrative)- but often it is indistinguishable.
This is why I'm fully sympathetic to people whose ideological immune system is flaring in suspicion.
Unironically pretty close to that.
One of the origins of the modern Russian troll factory is that one of the more notorious- the Internet Research Agency- was founded by Yevgeny Prigozhin. Yes, the Wagner mercenary guy. Prigozhin was basically somewhere between a front, a fence, and a semi-autonomous vassal of Putin's security establishment. The distinction is that not only did he do what he was told, but he had a degree of freedom to try initiatives on his own. This was/is part of Putin's power structure, where inner-circle elites compete for power and influence and attention... and one of the ways is to do something impressive. Or, in Prigozhin's case, something that appeals to Putin's spy mentality, while also serving as an excuse to charge the Russian government for services rendered. Other elites began copycatting later, and the American reaction probably justified the investment in Russian eyes, but the IRA was the first (until its dismantling / repurposing after the Wagner Coup and Prigozhin's assassination).
The IRA began in 2013, and by 2015 it had a reported ~1000 people working in a single building. One of its earlier claims to notice, before the 2016 election and compromise of American political discourse on that front, was back in 2014 when Russia was trying to recalibrate international opinion on its post-Euromaidan invasion of Ukraine. Buzzfeed published some leaked/stolen IRA documents, including a description of daily duties.
To quote-
So how does one counter that narrative mismatch?
...
And how does one fund that?
So, yes. "Here's some rubles, go stir the shit on twitter" is unironically close to what happened. Reportedly.
And this was back in 2014, when it was still very new and immature as an institution. As internet social media technologies evolved, so did the Russian technical infrastructure and its incorporation into information warfare theory, which itself evolved. Note that the IRA in its early days functioned as a more message-focused concept (a Russian position). However, other parts of the Russian information-proxy sphere were decentralized and took other, even contradictory stances- most notably to Western observers in the pro-Wagner vs pro-MOD narrative wars before the Wagner Coup.
If you'll forgive an unrepentantly NATO-based analysis, the Irregular Warfare Center has a pretty comprehensive analysis of how Russian information efforts have evolved over time.
Other models of propaganda include making you want to buy something (advertisement), go to a specific church (missionary work), think favorably of a specific cause or subject (advocacy), think worse of a specific cause (defamation), undercut a subject's moral authority (delegitimization), spread a cultural viewpoint (normalization), and so on.
For a more typical model, China's propaganda apparatus is much more focused on specific topics where it wants you to have a specific position, such as a good view of Xi, the CPC, multipolarism, etc., while having no particular stance and spending no particular effort on others. Arguing both sides of an argument is rarely done, because the point of propaganda is seen as persuading / pushing toward a certain perspective, and playing both sides at the same time is generally seen as information fratricide countering your own efforts. When confusion is the point, it can be pursued, but such efforts are shorter-term and generally the exception rather than the norm. To a degree this is itself a measure of centralization- the Chinese government has stronger message control over its directly employed propagandists than the Russians imposed on their associated blogosphere and elite-owned influencer networks.
A general 'increase discord by truth and fiction on any topic at any time' motive is relatively rare as a result. Not only does it lead to contradictory themes, but the contradiction is a success in its own right. Note how Russian sources fed both anti-Trump narratives (the Steele Dossier) and anti-anti-Trump narratives (social media boosting), or how in the Ukraine context Ukraine was simultaneously a NATO puppet controlled from abroad (attempting to generate nationalist resistance to foreign meddling against European liberalism) and a Nazi regime suppressing locals (a justification for foreign intervention to prevent an antithesis of European liberalism). If the goal of the propaganda were actually to enable a favored Manchurian candidate or promote a foreign (Russian) intervention, this would be self-defeating, since you'd still be running primary state-propaganda persuasion of the classical model while actively undercutting it with contradictory messaging.
An implication of this sort of model is that not only is it cause-agnostic, but it can take both sides of the same argument at the same time- support Tribe A with social media via venue C, and Tribe B on the other stance with different media via venue D. (In a non-single-nation context, if you ever get the chance, look up the global conspiracy variations of 'who is to blame for COVID.' The US and China are not the only candidates claimed.) I've long since lost the articles, but a personal pet peeve back in the early Trump administration, when the disinformation craze was at its peak, was how much of the coverage of 'Russian interference' in US politics didn't actually identify the relative partisan themes being boosted... because both Republican and Democratic themes were being boosted.
Which, as you say, can be indistinguishable from partisan propaganda, even though it has a different intent.
If you love what you do, you’ll never work a day in your life.
That would be even emptier. Be careful about what you see on social media, because it could have the same effect as Russian disinformation. That parses to something like: Look both ways before you cross the street, because a plane could fall on you.
Counter-point, "Remind yourself that overconfidence is a slow and insidious killer."
Which has the merit and utility of being actually useful advice. Overconfidence is a risk factor, and it can take a long time to take detrimental effect. You could dismiss the warning on the same grounds of falsifiability- if overconfidence does get you killed here then you were right and if it doesn't you're just being careful and careful is good- but this ignores that sustaining carefulness is an enduring good in and of itself.
This is a relatively common form of warning for harms that come with unclear immediate impacts. Don't just eat mushrooms you find in a forest, they may be poisonous. Walk slower on just-mopped floors, they may be slippery. Don't trust strangers on the internet, they might be bad. The fact that these warnings don't have to come in a context where the danger is immediate or guaranteed doesn't make them non-falsifiable, and their value can come precisely because the warned-against outcome is rare. When an element of danger is rare, it's easy to ignore the possibility of something that could be prevented with diligence.
By contrast, 'look both ways because a plane could fall on you' has no link between the cause of the warning and its effect. Looking both ways does nothing to warn you of the danger that comes from 'up,' so there's no merit in the diligent reminder. It is also an argument from a specific instance (planes crashing into crosswalks is so singular that it can't really be claimed as a trend) as opposed to a trend-consequence of mounting risks (overconfidence may not get you killed this time, but its recurring and persistent nature can lead the threat to grow over time).
Which simile is better for "the danger of the Russian style of disinformation" is up for debate, but I'd wager (and be right) on the comparison to overconfidence rather than to airplanes-on-crosswalks.
How are you defining "disinformation" in this context? That Russia has a project to subvert the liberal international order that the US has run since the post-war period? They openly admit that all the time and have made formal declarations to that effect. So presumably anybody who advances a different narrative through their own perception of events isn't pushing disinformation, unless you're setting the bar extremely low.
If Russia is the nebulous disinformation fountainhead that some people seem to think it is, then their actions prove that they're incredibly bad at it. What Russia 'has' been successful in doing is a form of national rebranding and international marketing, trying to attract disaffected people in their own nations to join them. And why would such a measure be aimed at such an end? Because most of the fractious disunity in western nations has come by their own hand. The progressive left in this country has done more harm and inflicted more damage upon itself than Vladimir Putin or Osama bin Laden ever did.
Why shouldn't the bar be that low for the way flailingace is using it?
Even selectively signal-boosting true-but-non-representative things can mislead an audience. This very thread is based on someone taking something that has happened (an accusation of pushback against people wanting to help) in a way that generates outrage (FEMA is deliberately withholding help, partisan motivation?) that plausibly wouldn't exist with other relevant context (the government has an interest in managing airspace, which appears to be the form of pushback being alluded to).
Nothing in it is false, but it's not information structured to build objective understanding either. It is an oppositional, antagonistic presentation of information, and one that- if done deliberately- can be information meant to promote discord rather than discourse.
flailingace's position, as I understand it, isn't that it's disinformation on the basis of truth / not truth, or 'their own' narrative, but the intended result of why the information is being presented.
Okay, I don't even disagree with you, but how does this relate to flailingace's position?
This is a counter-argument of relative effectiveness, of relative harm done, but flailingace wasn't making an argument of relative harm / culpability / etc. Flailingace was making the point that Russia will attempt to promote discord, to a person who had dismissed Russian trolls as a reasonable hypothesis, in reply to another post that also does not rest on relative effectiveness.
Remember that this branch of the conversation itself started over someone saying they felt there was a bit of an effort to manufacture an issue. Not that the issue was entirely manufactured, or that the dominant cause or concerns were manufactured.
You can personally set the bar wherever you want. But in that case, I'm struggling to understand why people say this like it's some kind of surprise. What am I supposed to be made to think or feel upon hearing that?
Well, put it this way then. Anyone who would hold Russia, or anyone else for that matter, guilty of disinformation but not the western media complex- which IMO is far worse by comparison- has a very hard sell to convince me of some kind of moral indictment. Anyone who wouldn't also hang the whole of CNN, Fox, MSNBC, CBS and everyone else from lampposts outside their headquarters for being guilty of disinformation is just being a partisan hack.
And RussiaToday can make similar claims in some of its reports as far as exposing disinformation goes. So what? Are people calling for them to be restored to YouTube now on grounds of their occasional fairness?
Meaning what? If they're doing it for a good cause or something they agree with, then it's okay?
That you and others should think on what you are feeling, and why, before you act on it, in case someone is trying to deceptively manipulate your feelings to make you act in their interests rather than yours.
That the lesson may be unnecessary for you personally does not mean the lesson is not needed by other people. Some people may not recognize that they are being targeted for manipulation. Others may dismiss the existence of relevant actors to focus on other grievances.
Noted, but where do you get the belief that flailingace or I would disagree that those are also disinformation actors?
Granted, I don't believe in hanging disinformation actors in general, so I suppose I fail that purity test if that's the standard you want to make.
So you should consider what, how, and why RT chooses to cover what it covers in the way it does before taking what it says as substantially true, the same bounded skepticism you should have of any source...
...but also that you should recognize that RT, and countless actors like it, will continue to try and execute their motives in any given case, regardless of how much traction they have in general...
...so that if you start getting a suspicion that your intake of social media on something feels like it's being manipulated to try and encourage an impression, you're not being crazy, you are having a reasonable grounds of wanting to think more critically before you decide how to feel.
And, by extension, so are other people.
Yes, and why would you think there aren't any? The topic has died away from public awareness with time and distance, but there were and still are people who would agree that banning RT from YouTube was bad on various grounds.
One of the general reasons for maximal free speech stances is that even malefactors can bring up good points and challenge/compel better actors to clean themselves up in ways they wouldn't if the 'better' people could exclude them from the public stage, and that it's easier to hone the counter-arguments / strengthen your own when you can openly engage them.
Even completely unfair media actors have defenders of their right to a public position. For example, North Korea is one of the extreme examples of a 'bad media actor,' but its YouTube presence was (and, to a lesser degree, still is) a resource for researchers trying to understand it.
And this doesn't even touch on grounds of national interest, ideology, or various forms of strategy. Russia took a decent black eye in the early Ukraine War when several hosts who had previously been taking the party line that the warnings of invasion were an American russophobic hoax publicly quit / were fired in objection. It was a self-inflicted 'even their own propagandists couldn't support it' moment that could not have discredited the pro-Russian factions in various western governments had RT been restricted from that sort of public awareness earlier.
Less 'okay' and more of 'categorical difference in actor intent.'
Let's stick to 'just' true things, as in someone who never tells a direct falsehood.
If someone says true things because they value truth as an abstract concept in and of itself, we call them a truth-seeker and can recognize their errors may be out of ignorance but not deliberate distortion of context.
If someone says true things because they dislike deception even when it would benefit them, we call them honest, and can take them at their word. Their word may be limited, and unlike the truth-seeker they may not be interested in actively establishing context and understanding, but they can be trusted within those bounds.
If someone says true things but only selectively and with the intent to ruin others' relationships, we call them a manipulator, and recognize that they deserve extra scrutiny. Because their intent determines what they say and why, it behooves an audience to consider whether there is additional context, missing information, or other truths that simply aren't being provided before believing what the manipulator tries to lead us to feel.
And this is before outright lies and other forms of dishonesty are included. A truth-seeker may have a motivated interest in what they focus on and find, an honest person may selectively try to avoid being questioned in certain ways to let a misunderstanding continue, but a manipulator who doesn't limit themselves to just truths can do even more to meet their interest.
Intent matters, and as such recognizing whose intent for what is a relevant piece of meta-context. 'Disinformation' may be an abused term, but 'Russian disinformation' is as good a term as any for characterizing a systemic intent by a coherent actor to proffer information- ambivalent about truth/accuracy- to shape public discourse in ways hoped to be systemically detrimental to the targeted nation. This is a categorically different intent from, say, 'partisan disinformation'- which wants what is bad for the opposition but good for the party- or 'ideological disinformation'- which wants what is good for the cause and is willing to tear down obstacles to it.
You may feel the impact is grossly overestimated- and not only would I agree, but a very recent article last week pointed out a Russian incentive to overestimate their own impact, which has interesting implications for whether western leaders are accurately relaying western intelligence that is itself accurately reporting Russian self-assessments that are incorrect for reasons of self-interested motivated reasoning- but again, what you are responding to isn't about 'relative' impact.
Show me a person of influence who made this case when the George Floyd video dropped.
I do not believe anything the Russians could ever say or do could hold even a flickering candle to the gigaton flare generated by the actual words and deeds of genuine Americans.
I think I may have encountered a Russian troll. Specifically, this guy. He went into a bunch of WP articles about US surveillance, ruining them, and when I noticed the pattern and alerted WP he made a few ominous-but-vague threats and then vanished.
At the time I thought he was simply an NSA/CIA agent, but in retrospect I think that's unlikely. He was very sloppy, copypasting entire sections of NSA propaganda into Wikipedia without even changing the "we"s to "they"s, and my read on the Five Eyes is that they're usually slicker than that; a real NSA/CIA agent would also have no motivation to make vague public threats and then disappear, rather than simply ghosting straight away or picking up the phone to threaten someone for real. And if he wasn't a Five Eyes spook, he was somebody pretending to be one, presumably someone intending to get caught in order to frame them for vandalising Wikipedia. Could be a random lunatic, I suppose, but the people with a logical motive to do that are strategic adversaries of the USA, and my read based on PRC external propaganda and the Sam Dastyari fiasco is that 4D-chess shenanigans like this aren't their style. I suppose I'll never know, particularly since I've left Wikipedia.
Sure, I think this is a healthy perspective. But Russia, and China, trying to sow discord is an argument some make:
https://www.politico.com/news/2020/06/01/russia-and-china-target-us-protests-on-social-media-294315
https://journals.sagepub.com/doi/abs/10.1177/19401612221082052
Computer, enhance:
Wow, foreign infiltrators tweeted a thousand times! That's a lot of tweets.
Come on, these campaigns are barely statistically significant. I know guys who put out that many tweets in a week.
At least in 2016 they also had bots/provocateurs masquerading as legitimate users. And Russia just wanted to fan the flames, they played both sides from “gay rights to gun rights”.
WSJ 2017: Facebook Users Were Unwitting Targets of Russia-Backed Scheme
https://archive.ph/rZJBo
NYT 2017: Purged Facebook Page Tied to the Kremlin Spread Anti-Immigrant Bile
https://archive.ph/kuS2E
Oh yeah? Did muh russia also coordinate the drag queen story hours in libraries? Did the russians decide to inject dildoes and strapons into children's books? This muh russia shit is so tiresome; the progs are running amok and when caught with their pants down all they can do is deflect-deflect-deflect.
A WSJ story about two Facebook accounts, and a NYT story about one. If these campaigns were worth taking seriously there'd be more to show for it. Right now scrolling twitter in fear of Russian Propaganda is pretty out-there in risk-perception. I read an article today about a sick cow in Idaho, maybe I'll go vegan.
The government or at least substantial parts of it wanted the BLM protests. They aren’t going to call it trolling.
But again, very little of the stuff labeled 'Russian trolls' can actually be traced to Russia in any way whatsoever. They can't find Russians behind the laptop, election fraud, UAPs, or Q. They can't because it's not Russia.
The person you responded to is filtered.
I don't see any particular reason both can't be true.