This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Yes, the blowback against AI art seems to me a little insincere.
Ostensibly, it's about the AI 'stealing' public art to train itself. (I agree with you that this argument is nonsense)
More realistically, it's people disliking the idea of robots putting artists out of work.
Cynically, it's artists being sore that their highly developed skills can suddenly be near-replicated by a computer in 15 seconds.
Many times over the past few centuries, skilled workers have found themselves driven into obsolescence by technology. Very few of them succeeded in holding back the tide for long. If I were a digital artist, I would urgently be either swapping to a physical medium, or figuring out how I could integrate AI into my workflow.
FWIW, I think the argument that this argument is nonsense is itself nonsense. That's not to say that I think the argument is necessarily correct, but the immediate dismissal, usually via some analogical assertion, is too pat.
AI training is a pretty novel category, and while it's 'like' other things, I disagree that it's enough the same that it can be dismissed as an extension of what's come before.
I think the argument that 'copyright laws and IP and automation somewhat break down in new territory and are at least worthy of renewed consideration' is valid and not immediately dismissible as nonsense.
If your view is that we need to redefine what 'stealing' is in order to specifically encompass what AI does then yes, you can make the argument that AI art is stealing, but if you do that you can make the argument that literally anything is stealing, including things that blatantly aren't stealing.
AI training is novel, but I don't at all agree that it is so novel that it cannot possibly be placed into the existing IP framework. In fact I think it fits reasonably comfortably. I do not believe there is anything that AI training and AI generation does that could be reasonably interpreted to violate any part of IP law, nor the principles upon which IP law is based. You cannot IP protect a style, genre, composition, or concept. You cannot prevent people using a protected work as an inspiration or framework for another work. You cannot prevent people from using techniques, knowledge, or information gleaned from copyrighted work to create another original work. You cannot prevent an individual or company from examining your protected work. You cannot induce a model to reproduce any copyrighted work, nor reverse engineer any from the model itself. Indeed, carveouts in IP law like 'fair use' - which most people who decry AI art would defend passionately - give far more leeway to individuals than would be required to justify anything generated by an AI.
The issue here is that when we're talking about "stealing" in the copyright/IP law sense, the only way something is "stealing" is by legally defining what "stealing" is. Because from a non-legal perspective, there's just no justification for someone having the right to prevent every other human from rearranging pixels or text or sound waves in a certain order just because they're the ones who arranged pixels or text or sound waves in that order first.
So if the law says that it is, then it is, and if it says that it isn't, then it isn't, period.
So the question is what does the law say, and what should the law say, based on the principles behind the law? My non-expert interpretation of it is that the law is justified purely on consequentialist grounds, that IP law exists to make sure society has more access to better artworks and other inventions/creations/etc. So if AI art improves such access, then the law ought to not consider it "stealing." If AI art reduces it, then the law ought to consider it "stealing."
My own personal conclusions land on one side, but it's clearly based on motivated reasoning, and I think reasonable people can reasonably land on the other side.
I think that human, natural-language definitions of 'stealing', 'plagiarism', 'copying' etc. are not totally fluid. These are words with specific meanings. If someone wants to argue that AI-art is bad on consequentialist grounds then sure, crack on. But 'stealing' is not a catch-all term for 'bad'.
Whether or not AI-art is bad, I maintain it is not theft.
This is the way I see it as well. When people say "stealing," they actually mean "infringing on IP rights," and that raises the issue of what are IP rights and what justifies them. As best as I can tell, the only justification for IP rights is that they allow for us as a society to enjoy better and more artworks and inventions by giving artists and creators more incentive to create such things (having exclusive rights to copy or republish their artworks allows greater monetization opportunities for their artworks, which obviously means greater incentive). The US Constitution uses this as the justification for enabling Congress to create IP laws, for instance.
Which is why, for instance, one of the tests for Fair Use in the US is whether or not the derivative work competes against the original work. In the case of AI art and other generative AI tools, there's a good argument to be made that the tools do compete with the original works. As such, regardless of the technical issues involved, this does reduce the incentives of illustrators by reducing their ability to monetize their illustrations.
The counterargument that I see to this, which I buy, is that generative AI tools also enable the creation of better and more artworks. By reducing the skill requirements for the creation of high-fidelity illustrations, they have opened up this particular avenue of creative self-expression to far more people than before, and as a result, we as a society benefit from the results. And thus the entire justification for there being IP laws in the first place - to give us as a society more access to more and better artworks and inventions - becomes better fulfilled. I recall someone saying the phrase "beauty too cheap to meter," as a play on the "electricity too cheap to meter" quote about nuclear power plants, and this clearly seems to be a large step in that direction.
Yes, but AI art does not rely on fair use. The argument that the copyright issue is nonsense is that in almost no other circumstances, except where a EULA is enforced, does copyright limit the way someone can use a work. It only means they can't copy it. But the case against AI art would have to extend the concept of copying a work beyond any reasonable point in order for those restrictions to apply. You can't copyright concepts or styles for this reason, only specific works. Obtaining legitimate copies of works and assimilating them for novel synthesis has never implicated copyright before.
But this is the core of my objection to the objection. LLMs are a novel paradigm, and the expectation that legal frameworks designed for previous paradigms should work just as well here, without reflection, is what I object to. It is question-begging to answer the question of how copyright ought to work around AI by pointing to how it worked without AI.
That is not to say that it necessarily should end up somewhere different. What I am rejecting is the simplistic, predetermined conclusion that because AI resembles what came before, it isn't different. IP protections are not some immutable natural force, and society should have the right to consider refining them in the face of massively disruptive technological innovations. That said...
Realistically, nothing can be done anyway. Any restriction would be impossible to enforce, so I'm not going to lose sleep over something that can't be policed in the first place.
I approach art under the assumption that the Artist has deliberately and intentionally packed layers of meaning into it that take time and mental effort to dig through. For good art (as I see it) this is true, for bad art it usually isn't and the time and effort are wasted, and for AI art it's categorically never the case. The technical quality of art, which skilled artists achieve through practice, bad ones usually don't, and AI art can achieve situationally, used to serve as a heuristic for which art is worth engaging with in the first place. Technically competent AI art is still devoid of meaning and intention, so the heuristic becomes worse than useless.
It's probably a matter of taste. Someone who's just out to consume technically competent art regardless of the artist's intention or any potential meaning packed into the artwork can subsist perfectly fine on a diet of AI-generated junk art. A pretentious pseud like me can not, and having my heuristic ruined by AI art is outrageous.
This is a really interesting perspective, but I admit I have a hard time vibing with it. I tried to get into art appreciation when I was younger. Went to the national galleries and the Tate Modern, hemmed and hawed at paintings and modern art pieces. This was the top 1% of the top 1% of art, and yet I was disappointed that there were usually very few explanatory notes to go along with the pieces. Often when I did find some guide to the 'canon' meaning of the art, it was usually perfunctory and not terribly interesting. Usually I preferred my own interpretation to the one I was apparently supposed to draw from the piece. I fully admit this was probably a 'me' problem. Perhaps art appreciation is a deliberately cultivated skill and I simply wasn't able to develop it.
All this to say that I'm a 'meaning is in the eye of the beholder' kinda guy when it comes to art. If I draw something meaningful from a piece, I'm not sure it matters if it wasn't the meaning the creator intended, or even if the creator intended no meaning at all.
Besides, what proportion of art that a person consumes on a daily basis actually has layers of meaning deliberately packed into it, let alone deep or philosophical meaning? 1%? Less?
Fair points. We may just be wired differently. For what it's worth, I absolutely despise modern visual art because how the fuck is anyone supposed to get meaning out of three layers of literal shit on canvas? Art to me is mostly literature, with a little music and film on the side, and I am by no means a connoisseur.
Well most "art" that people consume on a daily basis is hardly created by one artist or a few working in unison, but industrially produced slop meant to be consoomed and forgotten. If there's any deep meaning in superhero movies, pop music or corporate imagery, it's "you are a well-trained consumer".
Does this sound like an anti-capitalist screed? That's not what I mean. What I mean is that most people just have a media consumption habit in place of taste. Yes I am an unjustified snob - not like I know what I'm talking about.
My brother once put it to me this way: Imagine you have a favorite band with several albums of theirs on your top-faves list. You've followed them for years, or maybe even decades. It's not even necessary for this thought experiment, but for a little extra you've even watched or read interviews with them, so you have a sense of their character, history, etc. And then one day it is revealed to you that all of it was generated by an AI instead of human beings. How would you feel?
I think I would feel a profound sense of loneliness. I would never revisit those albums again. And I don't think this basic feeling can be hacked through with some extra applications of rationalism or what have you. This feeling precedes thinking on a very deep level for me.
I don't have much sympathy for the various creative professions getting their oxen gored. Partly because social media has made me lose respect for many of them, their output quality is not commensurate with their whining, and I won't be sad to see them needing employment elsewhere. But also because I can't even see my own regular 'office job' being spared once the tech is good enough. I'm rather clear-eyed about the inevitabilities of this stuff. But I also foresee further alienation that humans may learn to live with but won't necessarily solve.
I think differing intuitions on this is exactly what makes this such a heated and fascinating culture war topic. My response to this thought experiment is that I'd be mostly neutral, with a bit of positivity merely for it being just incredibly cool that all this meaning that I took out of this music, as well as the backstories of the musicians who created it, was able to be created with AI sans any actual conscious or subconscious human intent.
In fact, this thought experiment seems similar to one that I had made up in a comment on Reddit a while back about one of my favorite films, The Shawshank Redemption, which I think isn't just fun or entertaining, but deeply meaningful in some way in how it relates to the human condition. If it had turned out that, through some weird time travel shenanigans, this film was actually not the work of Stephen King and Frank Darabont and Morgan Freeman and Tim Robbins and countless other hardworking talented artists, but rather the result of an advanced scifi-level generative AI tool, I would consider it no less meaningful or powerful a film, because the meaning of a film is encoded within the video and audio, and the way that video and audio is produced affects that only inasmuch as it affects those pixels (or film grains) and sound waves. And my view on the film wouldn't change either if it had been the case that the film had been created by some random clerk accidentally tripping while carrying some film reels and somehow damaging them in a way as to make the film.
Anecdotally, in the circles I move in, while concerns about stolen training data and artist livelihoods are real, I think the biggest factor is a combination of the aesthetic (i.e. AI art just looks bad) as well as what I think of as purity concerns. The way people treat AI art reminds me a great deal of Jonathan Haidt's purity foundation - people react to it the way they used to react to GM foods, or the way they reacted to junk, heavily processed foods in general. It's gross. It's icky. There's a kind of taint or poison in it. Real art is made by an artist, and involves creative decisions. Algorithms can't do that. People hate the sense that the image is inauthentic or 'not real', and if the AI art is curated well enough that they don't notice it's AI, then they were fooled, and people hate being fooled. If I say I hate AI art, you show me a picture, I like it, and you reveal afterwards that it was made by an AI, I don't conclude that maybe I'm wrong and AI art is fine. I conclude that you tricked me. You're a liar, and I condemn you.
That may sound uncharitable, though for what it's worth I'm anti-AI-art myself. Part of my concern is indeed aesthetic (the majority of AI art is recognisable as such; maybe high-quality human-curated AI art can escape this, but most of it is samey trash), and part of it is ethical (I admit my skin crawls a bit even to think that my writing might have been included in AI training data), but honestly, a lot of it is instinctual. AI art, like AI writing, is... well, impure. It feels dirty.
I'm upvoting not because I agree with you but because I appreciate you articulating your position so clearly.
The best kind of upvote!
It's not.
The best way to understand people on the other side of a culture war issue is to start from the assumption that they really do genuinely believe what they say they believe.
Sure, that would upset anyone. But there are also many non-artists who don't like AI art. Also, people who have objections to AI painting also tend to have objections to AI music and AI voice acting, even if those areas don't overlap with their personal skill set. Which is evidence that the objections are principled rather than merely opportunistic.
I believe in the virtue of charity, but - come on. Do you really think that a person's private motivations for being in favour of/opposed to X are identical to their publicly stated motivations more often than not? At every point on the political spectrum?
Alternatively, they believe (correctly, in my view) that generative AI is a war with multiple fronts, and if you want to win a war you have to win it on all of these fronts lest you fall victim to a rearguard action down the line. If AI visual art was banned but AI voice acting was seen as fair game, it's only a matter of time before lots of people start noticing that this seems kind of arbitrary and unfair.
Well, it gets very complicated. People can be unaware of their own motivations, they can believe one thing for multiple different reasons, they can tell half-truths, they can believe something one day and not believe it the next.
I would just say that, as a general methodological principle, one should start by trying to find where the authentic principled disagreements are, rather than immediately jumping to cynical conclusions.
Sure. But this isn't a psychologically realistic model of AI detractors. I assure you that the people who feel passionately about AI visual art feel equally passionately about voice acting.
We need a flashing banner along the lines of "Yes, your opponents actually think that. No, they aren't pretending to just to make you mad."
I've never for a moment thought that people opposed to AI art were actually fine with it but were just pretending to hate it to own the ~~libs~~ tech bros.
I don't think this follows. The only way some behavior is evidence that some belief in a principle is sincere is if that behavior is costly to the person, e.g. giving up food for some religious holiday or even the Joker setting money he stole on fire in The Dark Knight. I don't think making this kind of objection is costly to these people; if anything, it seems gainful in terms of status within their social groups. At best, it's evidence that they understand the logical implications of the principle they're espousing.
So what do you think would be an appropriately costly test for the anti-AI-art position?
That would have to depend on the specific principle at hand. If it's, say, that training an AI model on public data is stealing, then perhaps whether they approve of AI art tools confirmed to have been trained only on authorized images, even if it causes them to face the ire of peers who still disapprove, or even if it causes them to lose out on commissions.
I wouldn’t want to accuse everyone who is down on AI art of being insincere or a dirty rotten motivated-reasoner (many people freely admit their concern is mainly for the livelihood of artists), but I have seen these discussions play out many times on many different forums. I have rarely seen the ‘AI-art is stealing’ argument withstand even the barest scrutiny. It is often pushed by people who clearly do not understand how these models work, while aggressively accusing their opponents of not understanding how they work. As @Amadan pointed out in his far-better-than-mine post, when faced with the hypothetical of an ethically trained AI, people do not declare their issues resolved, which indicates that the core of the disagreement lies elsewhere. It smacks of post-hoc reasoning.
I think the actual root of the objections is sympathetic. Artists are high status in online communities. People see a threat to them, empathise, and develop the core feeling of ‘AI-art bad’. From there we are into arguments-as-soldiers territory. Everyone knows that stealing is bad, so if you can associate AI art with stealing, even if the association makes little sense, then that’s a win.