This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
FWIW, I think the argument that this argument is nonsense is itself nonsense. That's not to say that I think the argument is necessarily correct, but the immediate dismissal, usually via some analogical assertion, is too pat.
AI training is a pretty novel category, and while it's 'like' other things, I disagree that it's similar enough to be dismissed as a mere extension of what's come before.
I think the argument that 'copyright law, IP, and automation somewhat break down in new territory and are at least worthy of renewed consideration' is valid and not immediately dismissible as nonsense.
If your view is that we need to redefine what 'stealing' is in order to specifically encompass what AI does, then yes, you can make the argument that AI art is stealing. But if you do that, you can make the argument that literally anything is stealing, including things that blatantly aren't.
AI training is novel, but I don't at all agree that it is so novel that it cannot possibly be placed into the existing IP framework. In fact I think it fits reasonably comfortably. I do not believe there is anything that AI training and AI generation does that could be reasonably interpreted to violate any part of IP law, nor the principles upon which IP law is based. You cannot IP-protect a style, genre, composition, or concept. You cannot prevent people using a protected work as an inspiration or framework for another work. You cannot prevent people from using techniques, knowledge, or information gleaned from copyrighted work to create another original work. You cannot prevent an individual or company from examining your protected work. You cannot induce a model to reproduce any copyrighted work, nor reverse engineer any from the model itself. Indeed, carveouts in IP law like 'fair use' - which most people who decry AI art would defend passionately - give far more leeway to individuals than would be required to justify anything generated by an AI.
The issue here is that when we're talking about "stealing" in the copyright/IP law sense, whether something counts as "stealing" is purely a matter of legal definition, because from a non-legal perspective there's just no justification for someone having the right to prevent every other human from arranging pixels or text or sound waves in a certain order merely because they were the ones who arranged them that way first.
So if the law says that it is, then it is, and if it says that it isn't, then it isn't, period.
So the question is what does the law say, and what should the law say, based on the principles behind the law? My non-expert interpretation of it is that the law is justified purely on consequentialist grounds, that IP law exists to make sure society has more access to better artworks and other inventions/creations/etc. So if AI art improves such access, then the law ought to not consider it "stealing." If AI art reduces it, then the law ought to consider it "stealing."
My own personal conclusions land on one side, but it's clearly based on motivated reasoning, and I think reasonable people can reasonably land on the other side.
I think that human, natural-language definitions of 'stealing', 'plagiarism', 'copying', etc. are not totally fluid. These are words with specific meanings. If someone wants to argue that AI art is bad on consequentialist grounds then sure, crack on. But 'stealing' is not a catch-all term for 'bad'.
Whether or not AI art is bad, I maintain it is not theft.
This is the way I see it as well. When people say "stealing," they actually mean "infringing on IP rights," and that raises the issue of what IP rights are and what justifies them. As best as I can tell, the only justification for IP rights is that they allow us as a society to enjoy better and more artworks and inventions by giving artists and creators more incentive to create such things (having exclusive rights to copy or republish their artworks allows greater monetization opportunities, which obviously means greater incentive). The US Constitution uses this as the justification for enabling Congress to create IP laws, for instance.
Which is why, for instance, one of the tests for Fair Use in the US is whether or not the derivative work competes against the original work. In the case of AI art and other generative AI tools, there's a good argument to be made that the tools do compete with the original works. As such, regardless of the technical issues involved, this does reduce the incentives of illustrators by reducing their ability to monetize their illustrations.
The counterargument that I see to this, which I buy, is that generative AI tools also enable the creation of better and more artworks. By reducing the skill requirements for the creation of high-fidelity illustrations, they have opened up this particular avenue of creative self-expression to far more people than before, and as a result, we as a society benefit from the results. And thus the entire justification for there being IP laws in the first place - to give us as a society more access to more and better artworks and inventions - becomes better fulfilled. I recall someone saying the phrase "beauty too cheap to meter," as a play on the whole "electricity too cheap to meter" quote about nuclear power plants, and this clearly seems to be a large step in that direction.
Yes, but AI art does not rely on fair use. The argument that the copyright issue is nonsense is that in almost no other circumstances, except where a EULA is enforced, does copyright limit the way someone can use a work. It only means they can't copy it. But the case against AI art would have to extend the concept of copying a work beyond any reasonable point in order for those restrictions to apply. You can't copyright concepts or styles for this reason, only specific works. Obtaining legitimate copies of works and assimilating them for novel synthesis has never implicated copyright before.
But this is the core of my objection to the objection. LLMs are a novel paradigm, and the expectation that legal frameworks designed for previous paradigms should work just as well here, without any reflection, is exactly what I object to. It is question-begging to answer the question of how copyright ought to work around AI by pointing to how it worked before AI.
That is not to say that it should necessarily end up somewhere different. What I am rejecting is the simplistic, predetermined conclusion that because it is 'like' what came before, it isn't meaningfully different. IP protections are not some immutable natural force, and society should have the right to consider refining them in the face of massively disruptive technological innovation. That said...
Realistically, nothing can be done anyway. Any restriction would be impossible to enforce, so I'm not going to lose sleep over something that can't be changed anyway.