Culture War Roundup for the week of June 5, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Not sure if this has been mentioned before, but on the topic of The Little Mermaid, I am extremely confused by the Rotten Tomatoes score. The "audience score" has been fixed at 95% since launch, which is insanely high. The critics score is a more-believable 67%. Note that the original 1989 cartoon - one of my favorite movies growing up, a gorgeous movie that kickstarted an era of Disney masterpieces - only has an 88% audience score. Also, Peter Pan & Wendy, another woke remake coming out at almost the same time, has an audience score of 11%. And recall that the first time Rotten Tomatoes changed their aggregation algorithm was actually in response to Captain Marvel's "review bombing", another important and controversial Disney movie.

If you click through to the "all audiences" score, it's in the 50% range. And metacritic's audience score is 2.2 out of 10. The justification I've heard in leftist spaces is that the movie's getting review bombed by people who haven't seen it. And there certainly is a wave of hatred for this movie (including from me, because the woke plot changes sound dreadful). How plausible is this? I haven't seen the movie myself, so it's possible that it actually is decent enough for the not-terminally-online normies to enjoy. But even using that explanation, how is 95% possible?

Right now I only see two possibilities:

  • Rotten Tomatoes has stopped caring about their long-term credibility, and they're happy to put their finger on the scale in a RIDICULOUSLY obvious way for movies that are important to the Hollywood machine. I should stop trusting them completely and go to Metacritic.

  • People like me who have become super sensitive to wokeness already knew they'd hate the movie and didn't see it; for the "verified" audience, TLM is actually VERY enjoyable, and the 95% rating is real.

But, to be honest, I would have put a low prior on BOTH of these possibilities before TLM came out. Is there a third that I'm missing?

We've had this discussion before and my personal take on it is that RT has decided that providing the right audience rating is more important than providing the actual audience rating. Woman King gets a 99% audience score? Pull the other one, it's got bells on it. Same thing is happening here. It is more important that the plebs be informed about what a good (diverse, tells a cathedral approved story) movie is than it is that the plebs be able to decide what a good movie is.

Ah, perfect, thanks for the link. That looks like exactly the same thing; I completely missed it because I didn't care about Woman King in the slightest. 99% is as hilariously unbelievable as El Presidente winning an election with 105% of the votes. So, it really does look like RT is willing to just blatantly lie about certain - ahem - "culturally relevant" movies, and that casts doubt even on other scores that aren't obviously fake. Maybe I can keep clicking through to the "all audiences" score, but who knows how long they'll allow that? Looks like it's time to drop RT for good and go to Metacritic ... which also uses an opaque aggregation algorithm, but at least I haven't caught them in an obvious lie yet!

I think the steelman is this:

  1. We (the review site) know that audience trolls attempt to manipulate ratings in various ways. (Review bombing, stanning, reviewing without seeing the film).

  2. Either we let this stand, and users see audience-manipulated scores, or we attempt to correct for this and give a good idea of the real sentiment of good-faith reviewers.

  3. Time passes.

  4. We're already filtering out trolls who are review bombing because Luke Skywalker's lightsaber is the wrong shade of green, why not also filter out people who give bad reviews because the mermaid is the wrong colour?

  5. Some of the trolls have got wise to this, and give bad reviews for plausible reasons, even though actually they're racists who hate the film for having a black lead.

  6. People who criticise films with a black lead should be filtered out, even if (especially if!) they have legitimate points.

This is the same logic that has played out in every part of social media over the last 15 years.

This is coming close to being a quokka. You shouldn't be steelmanning to the point where attacks and bad faith actions can't be recognized as such. "Maybe my mother really is a whore, or at least, she could be dressing like one and speaking crudely, so it gives people the justified misimpression that my mother is a whore."

Oh, sure. But this is how Twitter / Facebook / Rotten Tomatoes and all the other ones talked themselves into their current policies, or got pushed into them, or justified them to skeptics. If I remember correctly, one of the big turning points was when social media companies realised that they were the linchpin of ISIS recruitment efforts, complete with execution videos.

If you’re serious about building something which supports free speech, you therefore have to recognise that a fraction of your users are going to be trolls and bad actors. Another fraction will use the existence of the first faction to push their agenda, and you have to have countermeasures for both.

As countermeasures go, this site’s rules do pretty well, I think, with the caveat that it relies on the good will and restraint of a handful of mods to function.

I thought the "steelman" turning to rust was @Corvos's point.

Indeed so :)

In general I'd expect any prominent review site (not just ones for movies) to be affected by Goodhart's Law. When millions of dollars (not to mention reputation) are potentially on the line, companies will try to game the scores, and whether or not they do it with the cooperation of the review site will depend on how principled the site's management is.

You're not the first to notice. It seems like IMDB already weights scores because of review-bombing; even weighted, The Little Mermaid sits at 7.2 there. And Metacritic's score of 2.2 seems more reflective of what review-bombing might look like, so I'd bet Rotten Tomatoes put in some extra protections against review-bombing, above and beyond just weighting the score like IMDB. Rotten Tomatoes user scores are like Wikipedia articles: if it's political, I wouldn't trust it implicitly.
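For what "weighting the score" plausibly means here: IMDB doesn't publish its current algorithm, but the formula it historically documented for the Top 250 is a Bayesian average that shrinks a title's raw mean toward the site-wide mean until it has accumulated enough votes. A minimal sketch of that idea, with all the numbers made up for illustration:

```python
def weighted_rating(r, v, m, c):
    """Bayesian-style weighted rating: a raw mean r over v votes is
    shrunk toward the site-wide mean c; m is the vote threshold that
    controls how strong the shrinkage is."""
    return (v / (v + m)) * r + (m / (v + m)) * c

# Hypothetical numbers: a 7.2 raw average over 50,000 votes, shrunk
# toward a 6.9 site mean with a 25,000-vote threshold.
print(round(weighted_rating(7.2, 50_000, 25_000, 6.9), 2))  # 7.1
```

The practical effect is that a sudden burst of extreme votes (in either direction) moves the displayed score less than it would move a naive average, which is exactly the kind of "extra protection" being speculated about.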

The problem then is, what is review-bombing versus this movie/show really does suck?

I feel like it's a lost cause at this point. Review-bombing is probably real, fake, and irrelevant all at the same time. I say irrelevant because once a review-bombing has been deemed to have happened, all reviews become tainted: even assuming it's all organic on both sides, people will still counter-review-bomb to say something is great for culture war reasons, or pretend to be the enemy and strawman their position. I'm beginning to believe the latter is very likely, if not predestined, to happen once a review-bomb starts.

This is just a problem for aggregation and numbers. There are still usually reviews by people who have valid criticisms and praise. The review bomb basically just renders the number meaningless and anything with too much negativity or praise becomes much harder to believe as real. So, maybe people just read reviewers whose opinions they already trust to not be contaminated by playing a culture war game with review scores. I'm sure some exist.

Yup, that's my dilemma. The whole point of these aggregation sites was to try to get a more objective measure of how good a movie actually is. But there's no paper trail for any of these sites' scores (it's not just RT), and it's become common practice to fudge the numbers with a special "algorithm". I guess I mostly just accepted this before, but TLM is such a ridiculous outlier that I'm starting to doubt whether there's any useful signal left.

I think it's quite common for audiences to rate movies higher than critics. It seems to happen a lot for sequels and remakes, where dedicated fans will go out and watch it even though critics pan the sequel for not being sufficiently innovative.

For example, The Black Stallion Returns, the very unnecessary 1983 sequel to the beloved 1979 original, holds a 20% critic score but a 73% audience score. Why the discrepancy? Critics correctly pointed out that this film followed basically the same storyline as the previous movie yet it didn't improve upon it in any way, so there was no reason for this movie to be made. Audiences seemed to like it for exactly the same reason: they loved the original and this is more of the same so why shouldn't they like it too?

You can also see this effect in the ratings for The Little Mermaid II: Return to the Sea, the sequel to the Disney classic, which practically copies the storyline and cast of characters from the original. It has 17% critic and 45% audience approval, and although both scores are low, again the audience seems to be way more forgiving than the critics.

The same applies to the live action remake of Beauty and the Beast which is more popular with audiences (80%) than critics (71%), despite starring notable feminist Emma Watson. (This movie was only mildly controversial because they'd made LeFou explicitly gay, which probably boosted critic reviews, and lowered audience scores.)

I can totally believe that for the live-action remake of The Little Mermaid, the verified audience (i.e., the people who paid money to go see the movie) are more positive about it than the critics. It seems to follow the same pattern as other Disney remakes: not a lot of innovation, but the fans seem to eat it up anyway.

So I think your second possibility is closer to the truth: the people most upset about the race-swapping probably didn't even watch the movie.

In addition, I suspect there is some selection effect going on: I suspect the woke are more likely to be verified Rotten Tomatoes users, since verification seems to involve sharing your personal data with Rotten Tomatoes or something (I honestly don't know how it works), which would probably exclude older (i.e., less woke) people and people critical of big tech (i.e., less woke). So the “verified” population probably skews heavily woke, and is not representative of the overall audience.

Also, Peter Pan & Wendy, another woke remake coming out at almost the same time, has an audience score of 11%.

Note that in this case, Rotten Tomatoes shows you the all-audience score, not a verified score. That movie was also the subject of woke controversy due to race- and gender-swapping a bunch of characters, so a lot of the negative scores probably come from people who were unhappy about those changes. This is an apples-to-oranges comparison.

Note that in this case, Rotten Tomatoes shows you the all-audience score, not a verified score. That movie was also the subject of woke controversy due to race- and gender-swapping a bunch of characters, so a lot of the negative scores probably come from people who were unhappy about those changes. This is an apples-to-oranges comparison.

Oh, good catch! I didn't even notice that (showing how insidious that "verified audience" marker is). For some reason I thought Peter Pan & Wendy was another theatrical release, but apparently it went straight to Disney+, so there are no verified reviews. So it's a lot less comparable to TLM than I thought.

I think it's quite common for audiences to rate movies higher than critics. It seems to happen a lot for sequels and remakes, where dedicated fans will go out and watch it even though critics pan the sequel for not being sufficiently innovative.

This could certainly explain some of the discrepancy, but 94% (that's the score I see right now on the site, with 100,000+ verified users) is a ridiculously high score that beggars belief. As the comment to which you're replying states, that's significantly higher than the score of the original which is almost universally considered a masterpiece and a classic.

In addition, I suspect there is some selection effect going on: I suspect the woke are more likely to be verified Rotten Tomatoes users, since verification seems to involve sharing your personal data with Rotten Tomatoes or something (I honestly don't know how it works), which would probably exclude older (i.e., less woke) people and people critical of big tech (i.e., less woke). So the “verified” population probably skews heavily woke, and is not representative of the overall audience.

I'm guessing this also plays a big factor, though I doubt it explains all of it. By all accounts, this film was bad even while ignoring all woke factors, and I doubt such an overwhelming majority of verified users are people that are sufficiently woke as to give a film good scores just to send a political message.

Speaking of apples to oranges comparisons.

Rotten Tomatoes critic scores before a certain era, and for certain products, are absolutely not useful, because they rest on so few reviews compared to anything recent, and until streaming there were very few outlets seriously reviewing direct-to-DVD movies. The Black Stallion Returns has 5 critic reviews. The Little Mermaid II has 6 critic reviews (and one of them is a duplicate). I don't see how you can take a comparison against thousands of user reviews seriously with that discrepancy.

Not to mention the fact that reviews for older movies are almost never going to draw review-bombing, and people almost always leave less critical reviews of something older: because it was older, because of nostalgia, and because if they thought it was middling they wouldn't bother to review it at all. Hype, marketing, and cultural issues (warring or not) probably skew reviews for modern releases in ways that I have a hard time believing reflect back accurately onto 40-year-old movies or direct-to-DVD sequels that came out in 2000.

A better comparison would be a movie without controversy (to my knowledge) that fits a similar mold. Look at The Lion King (2019) at 52% critic and 85% audience, and Aladdin (2019) at 48% critic and 95% audience, which along with Beauty and the Beast would seem to suggest that verified audience percentages make Disney remakes review-proof for audiences. Then again, there's Dumbo (2019) at 46% critic and 48% audience, Mulan (2020) at 78% critic and 46% audience, Lady and the Tramp (2019) at 66% critic and 50% audience, and finally Pinocchio (2022) at 29% critic and 27% audience. If IMDB has admitted they had to weight the score of The Little Mermaid to combat review-bombing, and Rotten Tomatoes is releasing a 95% with no comment, I find it hard to believe. Not impossible, taking into account something like Aladdin, but still hard to believe.

If IMDB has admitted they had to weight the score of The Little Mermaid to combat review bombing and rottentomatoes is releasing a 95% with no comment, I find it hard to believe.

Again, it's not “with no comment”, RT explicitly tells you they are only including verified viewers, so that cuts out the review bombers just like on IMDB, and probably limits votes to American audiences (which are probably more supportive of race-swapping and other woke nonsense).

What I typically do when looking up ratings on IMDB is check out the distribution of votes (which I believe is not censored), ignore the highest and lowest scores, and then look at where the bulk of the histogram is. This doesn't work for movies that are extremely good or extremely bad (e.g., The Godfather, or The Room), but those are exceptions. It works great for controversial films: Cuties has an average rating of 3.6/10, and 70% of voters gave it 1/10, but once you discard the extremes the bulk sits around 7/10, which I think is a fair grade.
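The trimmed-histogram heuristic above can be sketched in a few lines; the vote distribution here is invented purely for illustration, loosely shaped like the Cuties example (a pile of 1/10 protest votes, with the remaining voters clustered around 7):

```python
from collections import Counter

def bulk_rating(votes, trim=1):
    """Drop the top `trim` and bottom `trim` rating bins on a 1-10
    scale, then return the modal rating of what remains."""
    hist = Counter(votes)
    extremes = list(range(1, 1 + trim)) + list(range(11 - trim, 11))
    for rating in extremes:
        hist.pop(rating, None)  # discard protest/stan bins
    return max(hist, key=hist.get)

# Hypothetical distribution: 70 protest 1s, 5 stan 10s, the rest organic.
votes = [1] * 70 + [6] * 5 + [7] * 12 + [8] * 8 + [10] * 5
print(bulk_rating(votes))  # 7
```

The naive mean of that sample is dragged down near 3 by the 1/10 bin, while the trimmed mode lands at 7, matching the "fair grade" intuition described above.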

Using the same metric, take a look at The Little Mermaid and the other remakes you mentioned:

You can see that audiences legitimately rated this one higher than all those other remakes (the bulk of the histogram is at 7/10, with 8/10 really close at 6000 vs 5600 votes). Aladdin comes closest but doesn't quite match it. And yes, the score on IMDB is lower than on RT, but that's partially because IMDB tends to be more critical overall, and because the calculation is different. Again, Aladdin has a 94% audience score on Rotten Tomatoes and a 6.9/10 on IMDB. Unless you believe RT fixed Aladdin's score too, it's fair to say that IMDB voting patterns support the idea that audiences liked The Little Mermaid at least as much as Aladdin.

Of course, if you assume that Rotten Tomatoes is not manipulating any data, then the data comes back to show exactly what they're telling you. But I was suggesting that they were secretly weighting the score, which may seem conspiracy-nutty, but that's the entire point of looking at it and thinking "this seems strange, I don't buy it." I'm not going to say nobody believes that the score matches what the website shows, but I believe most people who suspect manipulation think that a portion of negative reviews are being excluded outside of the site's own verification system, because it's socially/politically in their interest to do so for any number of reasons. There have been so many instances of things being protected from false reviews in the past few years that I find it hard to believe, without any hint of doubt, that the 95% reflects reality.

Protecting TV shows/videogames/movies from review-bombing for political reasons is considered just what a good/respectable company does these days. In the same way that talking about certain risque things or holding certain opinions isn't allowed, saying "I didn't like this product because I don't like its political message" is only allowed in one direction; if it's the wrong direction (right-slanted), that is deemed bad and cracked down on in some way: changing how the reviews work (Netflix), limiting how much reviews affect scores when a lot of reviews happen at once (Steam), verifying reviews in some way (Rotten Tomatoes). All these things exist only because of review-bombing for political/culture-war reasons. It's clear that review-bombing does happen by people who haven't consumed the media, but even in cases where money is confirmed to have changed hands (Steam), there are still protections against review-bombing, because some reasons for reviews are deemed invalid. It seems easy for a website like Rotten Tomatoes to just turn off commentless zero-star reviews for something, even if it's been "verified" (in quotes because I don't know how their verification works). It's relatively conspiratorial and I don't necessarily believe it 100%, but it doesn't strike me as crazily outlandish.

I also would find it easy to believe that a PR company would manufacture bad user reviews on something like Metacritic, taking a 5.0 down to a 2.0 and flooding it with reviews specifically targeting the woke angle, to completely erase the perceived value of bad or middling user reviews. I said in another post that I just don't trust Rotten Tomatoes, in the same way I don't trust Wikipedia for anything political. Manipulation is just too easy, even discounting RT doing it themselves. There are plenty of people who would give a 5/10 a 6/10 purely for culture war reasons, and vice versa. But given the critic reviews, genre fatigue (I guess live-action remakes are maybe a genre), and the baked-in culture war angle from both sides (I've seen three articles on Deadline about how it sets a bad example for women, erases black slavery, and appropriates drag culture), I still find it hard to believe that it sits at 95%. I didn't say impossible, just hard to believe.

Of course, if you assume that Rottentomatoes is not manipulating any data than the data comes back to show exactly what they're telling you.

I don't assume that; I tried to investigate the possibility by corroborating the RT figures with more transparent sources like IMDB, and I think it's plausible that the RT verified audience score is real.

It sounds like you've predetermined that RT is explicitly manipulating the data (beyond the biased selection mechanisms which we've already discussed) and you are not willing to consider evidence to the contrary.

I get that if you're an old conservative curmudgeon on a forum of likeminded people it's hard to imagine that 95% of the audience could like this movie, but you should at least be able to realize you're not the target audience, and consider the possibility that the actual audience doesn't have the same preferences as you do.

What percentage of Mottizens do you think are fans of Cardi B's music? And what percentage of people who attended a Cardi B concert do you think would say they enjoyed the show?

It seems easy for a website like Rottentomatoes to just turn off commentless zero star reviews for something

Okay, but this is a testable hypothesis at least. I don't see any reviews with less than ½ star or with no text. Is it even possible to give a zero-star rating or leave a rating without any comment? Or maybe comment-less ratings don't show up on the site but are still included in the score?

Protecting TV shows/videogames/movies from review-bombing for political reasons is considered just what a good/respectable company does these days.

Again, you assume that measures against review-bombing are taken only for political reasons. Even without politics, you need to do something to prevent review-bombing; otherwise scores reflect nothing but which group was able to drum up a larger army of trolls. That's obviously not what movie ratings should be about, regardless of political views.

I don't blame review sites for trying to combat that; I would probably do the same thing if I ran such a site, and I'm not left-wing and definitely not woke.

I have no idea why you've gone into multi-quote argument failure mode. I mostly agree with you and just think it's still not unlikely that they manipulated data because I'm biased that way and I've explained why.

I think it's quite common for audiences to rate movies higher than critics.

Unless the critics are giving it extra points because it is woke. Then the people's score is usually a lot lower.

The go woke get broke thing.

I’ve noticed for probably 5 years now that I rarely enjoy something new. I don’t have much interest in going to the movies. I usually like watching things from HBO 20 years ago.

Maybe I’m just old, and old people always just watch things from their prime. Or perhaps, despite there being 10x as much media production, it all sucks and I am not watching because they went woke, got broke, and made unwatchable media.

It's been at 95% since before the film was released. I think that's a rating from a hand picked test audience.

Is there a third that I'm missing?

It's more of a variation of your first possibility, but RT could also be acting out of principal-agent problems, not at the behest of Hollywood executives. The explanations probably overlap. There's also the possibility that they care about their credibility every bit as much as they did in the past, but it's their credibility among tastemakers that's important, not the rabble.

I'm not sure why you'd put a low prior on the first, though. Particularly for high visibility productions, "everyone" knows to take politics into account when reading reviews. Positively weighting aligned reviews doesn't seem like an incredible step beyond that.

It's more of a variation of your first possibility, but RT could also be acting out of principal-agent problems, not at the behest of Hollywood executives. The explanations probably overlap. There's also the possibility that they care about their credibility every bit as much as they did in the past, but it's their credibility among tastemakers that's important, not the rabble.

Yeah, I'd be surprised if RT's review aggregation takes "marching orders" from any executives. In fact, I think RT is owned indirectly by Warner Bros., so if anything you'd expect they'd be "adjusting" Disney movies unfavorably. I like your explanation that RT's just sincerely trying to appease the Hollywood elite, rather than provide a useful signal to the masses. It fits.

I'm not sure why you'd put a low prior on the first, though. Particularly for high visibility productions, "everyone" knows to take politics into account when reading reviews. Positively weighting aligned reviews doesn't seem like an incredible step beyond that.

I knew to take that into account with the critics score, which I would usually ignore for the "woke" crap. But in the past I've generally found the audience score trustworthy. Maybe I was just naive, and it took a ridiculous outlier for me to finally notice that they have their fingers on every scale.

Rotten Tomatoes turned off reviewing unreleased movies just before Captain Marvel came out, but claim that they "definitely" didn't change the site to protect Captain Marvel. Given how much fudging of everything has happened in the world since then, I wouldn't be surprised if they are now willing to make up review scores to protect favored films.