This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
The Writers Guild of America (WGA) is on strike as of May 2nd, after negotiations with the Alliance of Motion Picture and Television Producers (AMPTP) broke down. While most of their demands deal with how pay and compensation in the streaming era are structured, on the second page towards the bottom is:
I think this is an interesting first salvo in the fight over AI in creative professions. While this is just where both parties are starting for strike negotiations, and either could shift towards a compromise, I still can't help but see a hint that the AMPTP isn't very interested in forgoing the use of AI in the future.
In 2007, when the WGA went on strike for 3 months, it had a huge effect on television at the time. There was a shift to unscripted programming, like reality television, and some shows with completed scripts that had been on the back burner got fast-tracked to production. Part of me doubts that generative AI is really at the point where this could happen again, but it would be fascinating if the AMPTP companies didn't just use traditional scabs during this strike, but supplemented them with generative AI in some way. Maybe instead of a shift to reality television, we'll look back on this as the first time AI became a significant factor in the production of scripted television and movies. Imagine seeing a "prompt engineer" credit at the end of every show you watch in the future.
It'll be interesting to see how this all plays out.
What would be so hard about using non-union writers plus AI? Unions have been in decline in other respects; why should screenwriting be different? I think screenwriting is at risk of being replaced by AI, but this has not been true for punditry or other forms of writing, such as blogs. The Substack blogs I follow, for example, have seen record traffic and engagement over the past two years despite GPT.
Nothing, except you can't do that in places where unions pretty much run the industry...like Hollywood.
Without looking at the other replies, I'll jump in immediately and point out that we already have a whole bunch of "endless" shows on Twitch using basically AI writing/scripting. It started with Nothing Forever, but now it's expanded to Always Break Time (an anime), Endless Steam (literally just the Steamed Hams scene from The Simpsons, varied endlessly), and Dragon's Lair (basically Shark Tank in a medieval fantasy setting). I think, even with however this strike plays out, both parties are at risk of being completely side-stepped by small, independent creators who just want to make a thing.
I think that it is inevitable that AI will get used for TV and film screenwriting in the future. Not to completely replace writers, but to make it so you can get by with half or a quarter of the writers you used to have, with each of those writers using AI as tools to produce a lot more than they could before.
I'm basing that off of GPT-4, the tool we currently have: even if text-generation AI doesn't get significantly better than GPT-4, it's still going to increase writer productivity.
My main thought here is that, compared to 2007, the competition for eyeballs is much more fierce. Even ignoring social media and video games, unscripted streams and independently scripted free YouTube videos offer video entertainment accessible to the masses that Hollywood has to prove themselves better than. Hollywood's production values are still much better than even the best independently produced professional non-Hollywood stuff, but that advantage keeps getting chipped away. The one advantage they won't lose anytime soon is ownership of lots of intellectual property, which gives them a legal monopoly on producing certain stories. I think they need to hold onto this with a death grip if they want to keep being prosperous in the future.
And to do that, they need to make sure the IPs they own are associated with high-quality content. Regardless of AI, this has been in some trouble lately; e.g. for all the money MCU and Star Wars films make, the trend has clearly been negative in recent iterations, both in box office numbers and in sentiment from longtime fans. A large part of that has to do with the quality of writing, which I imagine AI could help with. Meanwhile, we're likely to see independently produced scripts become higher quality thanks to AI aid, and with production values also getting better for similar reasons, Hollywood probably needs to make sure they're at the cutting edge in using these sorts of tech; however well an independent creator can use this tech, Hollywood has the resources to do it 1000x as much and better.
Given that, I'm not sure that this kind of limiting of AI in script writing is viable. Looking at the quality of writing in modern Hollywood scripts, it's quite clear that an amateur writer with ChatGPT and a rudimentary understanding of storytelling principles could write better-than-median scripts right now. And it will only get better. Perhaps the writers' guild can band together enough to prevent scabs, but then the entire industry has to compete against independent producers whose quality would go up. Suppose the quality of writing in mainstream professional films and shows goes down during this strike, or fewer of them get released, eroding the prestige of the IPs these companies hold; and suppose the guild then gets what it wants on AI, only to have a harder time competing against AI-assisted independent creators. The result could be less overall money flowing to Hollywood: a Pyrrhic victory for the writers.
I see your point, but professional writing is a skill that is probably difficult to replace, not so much because of writing quality as because of the need to have writers follow orders while maintaining that quality.
I don't think they can even do this. Whenever a piece of media stops being itself so that the characters can turn to face the screen and tell me about how Donald Trump is bad and that I should hire some more black people it immediately throws me out of the story and ruins my enjoyment. It isn't like the writing that we're getting now is of particularly high quality either ("They fly now? THEY FLY NOW!") - I can absolutely understand being scared of AI writers if your writing is already as devoid of humanity as most modern media products are.
That's a fair point; perhaps Hollywood is doomed anyway, unless they can get AI into the executive suite as well as in the writer's room.
But I also have to wonder if the terrible writing in those Disney films is because they didn't have access to writers good enough to write good scripts while meeting the constraints placed on them by executives. Perhaps the constraints that executives place on the writers make it literally impossible to write a good script, in which case the prior paragraph holds, but what if it's just that the writers weren't skilled enough to figure that out? I don't know what sorts of constraints the execs at Disney and Lucasfilm placed on the writers for Rise of Skywalker, but perhaps a more skilled writer could have written a script that was at least half-decent? And what if AI could/would have elevated those writers to such a level?
It was always doomed, for a few reasons.
- Disney rushed Star Wars: The Force Awakens into production, which meant no holistic plan for the trilogy. They also went all-in on pandering to existing fans with a derivative plot. It was already broken at this point.
- The Last Jedi came out, its director rightly loathing TFA's derivative plot but erasing what it set up (there being no plan), which just enraged fans of the first film invested in that setup.
- Disney insisted on a pivot but didn't change the release date, so someone had to redo the entire (non-existent) plan from scratch to mollify both TFA and TLJ fans under a time constraint.
Abrams is very often called a hack, but it was always a cursed endeavor, and he's a veteran who's shown he can do competent work.
A lot of the time what seems to happen is that the writer/director is chosen for being pliable and helpful rather than experienced (see Abrams' proteges who ended up on Rings of Power) and then has to craft a good story between studio mandates. They're neither competent enough to do it nor do they have enough cachet to impose their will even if they had a better plan.
Is this even legal? AFAICT there’s no abstract ownership of concepts or ideas that copyright holders can claim, only claims against produced works. So a copyright holder can sue someone who uses AI to generate similar content to what is copyrighted, but not for using a work as training data per se. Sounds like the writers should be picketing Congress too.
I'd argue that a neural net is a derivative work of its training data, so its mere creation is a copyright violation.
But you could make a similar argument that a human brain is a derivative work of its training data. Obviously there are huge differences, but are those differences relevant to the core argument? A neural net takes a bunch of stuff it's seen before and then combines ideas and concepts from them in a new form. A human takes a bunch of stuff they've seen before and then combines ideas and concepts from them in a new form. Copyright laws typically allow for borrowing concepts and ideas from other things as long as the new work is transformative and different enough that it isn't just a blatant ripoff. Otherwise you couldn't even have such a thing as a "genre", which all share a bunch of features that they copy from each other.
So it seems to me that, if a neural net creates content which is substantially different from any of its inputs, then it isn't copying them in a legal sense or moral sense, beyond that which a normal human creator who had seen the same training data and been inspired by them would be copying them.
The dystopian take is obviously that the copyright lawyers will come for the brain next: experiencing copyrighted media without paying for it will be criminalized.
That's an entirely different question. Obviously the LLM is not itself a human, but neither is a typewriter or computer which a human uses as a tool to write something. So probably the copyright holder would be the person who prompts the LLM and then takes its output and tries to publish it, especially if they are responsible for editing its text and don't just copy-paste it unchanged. You could make an argument that the LLM creator is the copyright holder, or that the LLM is responsible for its own output, which is then uncopyrightable since it wasn't produced by a human.
But regardless of how you address the above question, it doesn't change my main point: the AI does not violate the copyrights of the humans whose work it takes as input any differently than a human doing the same things would. Copyright law is complicated, but there's a long history, a lot of precedent, and individual issues tend to get worked out. For this purpose, the LLM, or a human using an LLM as an assistant, should be subject to the same constraints that human creators already are. They're not "stealing" any more or less than humans already do by consuming each other's work. You don't need special laws or rules or restrictions that don't already exist.
You can't reason by analogy with what humans do because LLMs are not human. They are devices, which contain data stored on media. If that data encodes copyrighted works, they are quite possibly copyright violations. If I memorize the "I have a dream" speech, the King estate can do nothing to me. They can bust me for reciting it in public, but I can recite it in private all I want (though I could get in trouble for writing it down). If I can ask an LLM for the "I have a dream" speech and it produces it, I have proven that the LLM contains a copy of the "I have a dream" speech and is therefore a copyright violation. And that's just the reproduction right; the derivative work right is even wider.
Except that LLMs don't explicitly memorize text; they generate it. It's the difference between storing an explicit list of all numbers from 1 to 100, {1, 2, 3, ..., 100}, and storing a set of instructions, {f(n) = n : n in [1, 100]}, that can be used to generate the list. The model has a complicated set of relationships between words, refined such that if it sees "Recite the 'I have a dream' speech verbatim", it has a very good probability of saying each of the words correctly. At least I think the better versions do; many of them would not actually get it word for word, because none of them have it actually memorized. They're generating it anew.
Now granted, you can strongly argue, and I would tend to agree, that a word-for-word recitation by an LLM of a copyrighted work is a copyright violation, but this is analogous to being busted for reciting it in public. The LLM learning from copyrighted works is not a violation, because during training it doesn't copy them; it learns from them and changes its own internal structure in ways that improve its generating function, making it more capable of producing works similar to them, without actually copying or remembering them directly. And it doesn't create a verbatim copy unless specifically asked to (and even then is likely to fail, because it doesn't have a copy stored and has to generate it from its function).
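The list-versus-rule distinction can be sketched in a few lines of Python. This is purely illustrative: real LLM weights encode a far messier statistical function, not a clean formula, but the storage question is the same.

```python
# Storing an explicit copy: the data itself sits in memory.
explicit = list(range(1, 101))

# Storing a rule: only the generating function f(n) = n is kept,
# and the data is reproduced on demand.
def f(n):
    return n

regenerated = [f(n) for n in range(1, 101)]

# Both routes yield identical output, but only the first stores a copy.
assert explicit == regenerated
```

The legal question in this subthread is exactly whether being able to regenerate a work on demand is distinct from storing it.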
Imagine I create some wacky compression format that will encode multiple copyrighted images into a single file, returning one of them when you hit it with an encryption key corresponding to the name of the image -- the file "contains" nothing but line noise, but if you run it through the decompressor with the key "Mickey Mouse" it will return a copyright violation.
Is passing this file around on Napster (or whatever the kids are doing these days) a copyright violation?
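A minimal sketch of that thought experiment, using a one-time-pad XOR in place of the hypothetical compression format (the byte string standing in for a copyrighted work is made up):

```python
import os

secret = b"Steamboat Willie, frame by frame"  # stands in for a copyrighted work
key = os.urandom(len(secret))                 # the per-work "decompression key"

# The stored file is statistically indistinguishable from line noise...
stored = bytes(s ^ k for s, k in zip(secret, key))

# ...yet file + key together reproduce the work exactly.
recovered = bytes(c ^ k for c, k in zip(stored, key))
assert recovered == secret
```

The file alone reveals nothing, which is the analogy's point: "containing" a work may be a property of the system as a whole rather than of any one artifact in it.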
From 17 USC 101
Thus an LLM which can reproduce a copyrighted work is a copy of it, regardless of whether it is "generating" it or not.
Once again it is unclear to me how this is very different from a human who reads a bunch of scripts/novels/poems and then produces something similar to what he studied.
There’s a lot of different ways you could look at it, but I think I might just say that the principle of “if you use someone else’s work to build a machine that replaces their job, then you have a responsibility to compensate that person” just seems axiomatic to me. To say that the original writers/artists/etc are owed nothing, even though the AI literally could not exist without them, is just blatantly unfair.
Is it not different from the early factory laborers building the machines that would replace them? Or maybe more aptly the carriage companies that supplied the Ford factories. They were paid fairly enough for the production. That was the narrow agreement: not that no one else could be inspired by it, or build an inspiration machine. To be replaced by people who were inspired by your works is the fate of every artist in history, at least those that didn't wallow in obscurity.
They consented and were paid. It's not analogous at all.
They produced the media, which is being consumed and paid for under the current payment model. They are being compensated for it regularly and under all the fair and proper agreements. The AI is trained off of the general inspiration in the air, which is also where the artists pulled their own works from. It's a common resource emitted by everyone in the culture to some degree. The Disney corporation did not invent their stories from whole cloth; they took latent myths and old traditional tales from us, packaged them for us, and the ideas return to us. Now we're going to have a tool that can also tap this common vein, and more equitably? This is democratization. This is equality. This is progress.
Last week I not so popularly defended copyright, and I still believe it's the best compromise available to us. But it doesn't exist because of a fundamental right, it exists because it solves a problem we have with incentivizing the upfront cost of creating media. If these costs can be removed from the equation then the balance shifts.
How do you feel about software license agreements? Plenty of software/code is publicly visible on the internet and can be downloaded for free, but it's accompanied by complex licensing terms that state what you can and can't do with it, and you agree to those terms just by downloading the software. Do you think that license agreements are just nonsense? Once something is out on the internet then no one can tell you what you can and can't do with it?
If you think that once a sequence of bits is out there, it's out there, and anyone can do anything with it, then it would follow that you wouldn't see anything wrong with AI training as well.
Because the law doesn't consider your memory or the arrangement or functioning of your neurons to be a tangible medium of expression, but a computer memory (including RAM, flash, spinning rust, paper tape, whatever) is. If you see a work and produce something similar, what you produce might indeed be a copyright violation (and there are a lot of lawsuits about that), but your mind can't itself be a copyright violation. A neural network in a computer can be.
This doesn't seem to follow either. Maybe what the computer produces could be a violation, but the actual information contained within it doesn't resemble the copyrighted material in any way we can determine. At least that's based on my understanding of how the trained AI works.
The fact that the information contained within the LLM doesn't, in some sense, resemble the copyrighted material isn't relevant, nor should it be; the fact that we can get it out demonstrates that it is in there.
Which is why I'm confused as to why this does not apply to human memory, other than the courts simply drawing a distinction between human hardware and electronic systems.
And then, even accepting your point:
Then presumably putting sufficient controls on the system so that it WILL NOT produce copyrighted works on demand solves the objection.
We also run into the "Library of Babel" issue. If you have a sufficiently large pile of randomized information, then it probably contains 'copies' of various copyrighted works that can be extracted.
So an AI that is trained on and 'contains' the entire corpus of all human-created text might be said to contain copies of various works, but the incredible, vast majority of what it contains is completely 'novel,' unrelated information which users can generate at will, too.
There may be no other distinction.
Only if the controls are inseparable from the system. If I can take your weights and use them in an uncensored system to produce the copyrighted works, they were still in there. Just as if you rigged a DVD player not to play a DVD of "Return of the Jedi" wouldn't mean "Return of the Jedi" wasn't on the DVD.
If the randomized information was generated without reference to the copyrighted works, this doesn't matter. That's not the case with the AIs; they had the copyrighted works as training data.
I believe that's unclear, actually; it's the crux of many an ongoing lawsuit, including the one against GitHub.
I see convincing cases both that this is similar to human learning and that it's plagiarism.
Ultimately it is the legislature that's going to have to intervene to settle this; otherwise the courts will follow the law in whatever silly direction it goes when applied to an object it was never intended to deal with.
Personally I'd be glad to see copyright effectively abolished for essentially anarchist reasons. But I would rather we do this on purpose than by accident.
I think this is something that has to get figured out fast. I agree that right now, they can't really regulate that. At most the AMPTP could themselves agree not to train their own AI instances on it, but what's stopping anyone from using AIs already trained on that material?
I have stated elsewhere that I'm personally disposed toward supporting some kind of expansion of copyright-adjacent rights that includes training rights. But that would have to be a somewhat globally coordinated legal effort.
Yes, and those training rights should be called universal basic income. Every voice in culture contributes to training these AIs, just like the voices given the largest audience. They funnel the culture we all produce into media and are supported by us. If the ultimate storytelling machine is going to be produced from our culture, it belongs to us all; the specific creatives who used to do this labor were already fairly compensated.
Of course it is legal. Parties can always contract to provide greater protections than the minimum provided by copyright law. Of course, those protections only apply to the parties to the agreement.
Guess that would mean no WGA content on broadcast TV or YouTube clips?
Some data about how oppressed the struggling worker masses are (https://variety.com/2023/biz/news/wga-contract-inflation-minimums-1235564920/):
In other words, their minimum wages are about 1.5x to 3x the country's median wage for essentially half-time work.
I don't think it is bad for anyone to earn a lot of money, but given that the number of decent-quality shows has been extremely low for many years now, and most of those that had decent quality have been based on existing literary work, what we seem to have is an extremely overpaid bunch of people producing a very low-quality product. Still, I am sure they'll eventually get what they want, this time, because AIs aren't ready to produce scripts yet. But in 5-10 years, given the immense savings it promises? I can totally see it.
But these aren't steady jobs. A writer might have that job for 6 weeks and is then on the hunt for another gig. It pays well, when it happens, but it's not dependable like most lower paying jobs are.
There are a lot of low-paying jobs which aren't steady. Plumbers, electricians, gardeners, locksmiths, etc. all depend on work flow day to day and might be subject to dry spells. So it's not unique in any way; a lot of non-salaried jobs are like that.
Or, because the production and distribution pipelines are much more global now, they adapt by investing in content from other production centers, plus some scab work to pick up the slack.
Just the other day news came that Netflix further increased its content budget for new Korean dramas to $2.5b over the next 4 years. Maybe more news in that vein will come in the coming weeks/months.
I'd love to see more international content on platforms like Netflix. There are many decent works done outside the US, and I am too lazy to find ways to discover them independently...
I get the opposite impression on quality. We have been in a true golden age of high quality shows for the last decade. HBO alone has put out hit after hit.
I also take issue with framing the workers' wages as "overpaid". In union negotiations it's an argument about how to split the pie, and while 200k might sound like a lot to some, the fact is Hollywood is extremely profitable. Plus, if we want to talk "overpaid", Iger alone is making $27 million a year and seems like a juicier target.
Well, obviously we have very different tastes. I am looking at the list of HBO shows (https://en.wikipedia.org/wiki/List_of_HBO_original_programming) and what I'd be at least somewhat interested in, of recent work, let's say past 2010?
House of the Dragon - yeah, kinda; somewhat weak sauce, but there are some strong points. Wouldn't put it as a hit, but as a decent offering.
Westworld - started awesome, went downhill fast, dropped it.
His Dark Materials - based on existing work, kinda decent though didn't see the last season yet
Game of Thrones - based on existing work, we all know how it ended in an utter disaster once Martin lost interest in finishing it
Silicon Valley - ok, this one is a hit. Full points for this one, no questions asked.
Perry Mason - didn't see it, may be interested, depending on how woke it'd be (I have no hope for non-woke, but whether it would be tolerable?) - conditional, on the strength of the franchise
This is basically it; I feel zero interest in the rest. Maybe it's a golden age, but not for me.
Oh, I absolutely don't think them grabbing a share of the corporate profits is something wrong. I think their product is crap (ok, 90% crap, with a rare gem buried in it), but if it finds the market, then they deserve a share in it. But I personally wouldn't care either way - because for me, their product is not valuable.
Barry is pretty decent. Curb Your Enthusiasm is just high quality. Boardwalk Empire was also pretty decent.
It's pretty woke. I watched it all, but would treat the woker scenes as commercial breaks. I wasn't really invested. I generally like Matthew Rhys's work; as a kid I loved Perry Mason.
I watched the first season and will embark on S2 soonish. What was woke about S1?
S1 was less woke than S2.
S2 continues with the diversity and adds more girl-bossing and sexual degeneracy.
BTW I notice I can easily name a bunch of pre-2010 hits I liked: The Wire, The Sopranos, Rome, Carnivale, Oz, Angels in America, John Adams. Oh, forgot one: add Chernobyl to the hits; it's surprisingly decently done for a US series about the USSR.
Nothing stops us from concluding that BOTH the execs AND the writers are overpaid considering their actual role and the value they bring to the table.
I'll go a step further and say that all the people involved in creating most TV shows are overpaid, except probably the day-to-day hands-on workers handling all the technical aspects of production (lighting, camera work, set-building, audio mixing, etc. etc.) that actually allows the production to function.
Being maximally uncharitable to the writers and execs, I'm saying that most short-form media these days seems extremely 'paint-by-numbers' where the actors show up for a couple hours and do some easy line reads, any mistakes can be fixed in post, cinematography and audio choices are generally rote and uninspired, and as mentioned elsewhere, the writing is either bland or actively bad.
Big names get paid well because they draw eyeballs, that's fair enough, but so many productions end up giving a 'going through the motions' feeling, as though everything is being produced on an assembly line, with the only major difference being what coat of paint gets slapped on the end product.
Maybe the pay for the professionals involved should reflect that of factory workers.
I dunno.
I'd question if the number of decent shows is actually low these days.
We may be past Peak TV but there's still plenty of stuff to watch on cable.
There have always been hacks and pandering but TV is actually very competitive and, unlike the films, it isn't all cookie cutter PG stuff like Marvel or cookie cutter PG stuff aspiring to be Marvel (why not make a Universal Horror franchise instead of an action-horror abomination? The money was right there...)
Euphoria probably doesn't deserve its budget (Sam Levinson should be kissing Zendaya's feet), but that show as a film would be relegated to the $10 million indie ghetto.
It's fine if you just treat it as a melodrama.
But its reported budget is totally out of whack with its quality imo. As I said, Levinson is benefiting from a Zendaya premium (as well as the view that it's an insightful or trend-setting show, which I think has kind of died down after all of the criticism of S2)
That's Game of Thrones money. Half of that sounds unbelievable.
Where do they live, though? Wages without considering cost of living are likely missing something.
That said, there's probably a networking bonus for people who live there (meet other writers, producers, directors, etc.). So doing remote work isn't totally feasible for that (there are small day-to-day interactions that even a quarterly convention/meeting cannot capture the value of).
COL is not some natural phenomenon that strikes at random. COL is high because rich people live there and bid up prices for resources and services.
Right, but that's not the point. My point is that simply seeing a person making much above the median doesn't tell us very much. The implication is that they are rich, but if there are reasons for not leaving, then their QoL can be on par with that of those who make less.
Well, they could also spend 90% of it on cocaine and live in absolute squalor; you can't ever know. I still think you can see that this is a pretty well-paid bunch.
Man, I just have very little sympathy for these writers. Any job that a substantial number of people would do for free in their spare time is obviously going to face immense downward pressure on pay. These people consciously took a risk with their career, and now they're getting burned for it. The issue they're having is a mismatch between the supply of aspiring writers and the demand for them.
Here's a Reddit take with 1k upvotes from this post:
There's a sickening reverence the blue tribe has for those working in creative fields. These writers AFAIK usually earn more (albeit less consistently) than the grips and grunts responsible for the rest of the production. But through their outsized influence on culture and the zeitgeist (the real compensation of their work, and a reason it has the status people chase), you get a situation like in the quote above taking place. This strike affects the financial security of countless others in the industry, who (at least overwhelmingly so on Reddit) have somehow convinced themselves this is OK.
I don't actually know too much about the specifics of the Writers Guild, but they leave me with the same instinctual disgust I have for unions in general. I hope someone can link me to some well-written and well-sourced pro-union articles to educate me, but comments like these in the same Reddit post leave a bitter taste in my mouth.
I get the theory behind the usefulness of collective bargaining and how it should result/has resulted in better worker treatment, but god, the aesthetics are just awful. Just seems like collective bullying and the imposition of economic dead weight to me.
As a Europoor who never had the privileged opportunity to work on global cultural defining work, let them stifle the American media machine into the dirt. Maybe Netflix can fund some more cool German or British works.
source: https://old.reddit.com/r/television/comments/135adyi/the_writers_guild_of_america_is_officially_on/
As for the AI question: high-budget productions will be the last affected by generative art or text models. Right now, AI can't replicate top talent, but it can replicate a lot of the low-quality writing and CGI that's rushed out in low-budget media. We will see Paw Patrol automated years before a season of GOT has AI paintings in the background. That still means most shows will be heavily affected, since most shows don't have anything like the budget of a GOT. The first big change this would bring is more, nicher, and cheaper shows, but also a further squeeze in the winner-take-all reality of the creative professions, deciding who gets those scant jobs in prestige TV.
I predict very little chance that this triggers any AI-assisted scriptwriting.
Reality TV got a boost because it was a tried alternative. We'd need to see successful existing projects before Hollywood sees AI scriptwriting as an option. At most, one or two such projects will use this to sweeten their pitch.
I wonder who will be more hated in the future: the "prompt engineers" or the scabs.