This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Has this group had any discussion regarding AI use in pornography, specifically 'deepfakes'? It's come out recently that a major up-and-coming Twitch streamer, 'Atrioc' (who is recently married and ostensibly very pro-feminist; funny details, but technically irrelevant to the matter at hand), had been viewing porn 'deepfakes' of multiple major female Twitch streamers, including possibly his best friend's girlfriend (that last part unconfirmed but highly possible). He's come out with an apology and it's a whole thing, but I'm sure this community is more interested in the moral/ethical questions therein than in internet-celebrity drama, so I won't bore you with it.
The following are my perspectives on a few of the potential questions regarding 'deepfake' porn, and AI porn in general. I'd love to hear what other people think about these perspectives, because my thoughts are currently very incomplete on the issue.
First and foremost, I have a strong intuitive feeling that it is deeply wrong, perhaps tantamount to some form of sexual harassment/assault (but of course the non-violent sort), to make 'deepfake' pornography of other, non-consenting people. For example, taking a picture from a celebrity's Instagram and using AI to transform it into a high-fidelity (but technically fake) nude picture of them seems functionally the same as, e.g., spying on them in the shower or while they're changing, actions I think we can all agree would be some form of wrong or illegal sexual violation (or perhaps we can't? you tell me). The way I think about this is by considering that a strong enough AI program would theoretically be capable of using a clothed picture of someone to reconstruct the exact way they look naked, which would be quite literally equivalent to the aforementioned situation, or to looking at them with x-ray glasses, etc., which again (I think) we and most people agree would be wrong. And so less powerful AI capable of doing something similar seems to be at least on that gradient of wrong, if not exactly as bad.
Furthermore, AI that actually transplants people's faces onto video depictions of sexual intercourse (which is ostensibly what 'Atrioc' was caught doing) seems worse, or maybe just bad in a different way. I don't have a similar thought experiment to justify why I feel that way, but the wrongness of it is my strong intuition nonetheless.
However, I can also sort of see the argument, at least abstractly, that it's a victimless crime. On the other extreme of the spectrum, fantasizing in one's own imagination about the way people look when they're naked, or how it might feel to have sex with them, is not only generally recognized as a very benign behavior, but is also known to be something almost everyone does, men and women both. Sometimes people do this even completely unconsciously, e.g. in their dreams. And what's the difference between looking at a very (or fully) realistic recreation of the way someone might look with their clothes off, and using one's own imagination to do so? What if your imagination were very vivid, and you had seen many naked people before, thus making your 'training data' very good, so that you too could reasonably expect to make a relatively accurate recreation of the way someone looked while naked, only in your own mind's eye?
The thing is, acknowledging these potential similarities between an action I find morally acceptable and one I find morally wrong still doesn't make my intuition about the wrongness of 'deepfakes' any weaker. I feel like there must be something about it that I haven't considered yet, which is where I'm hoping you guys might have insight. The only distinction I've found somewhat convincing so far is that maybe the mass distribution via the internet is what makes it wrong. In other words, I find it less wrong (but still somewhat wrong) to make a highly or fully realistic nude of someone and keep it entirely on one's own computer than to make such an image and then distribute it online. This is especially weird because the former is an even more apt comparison to, e.g., peeping on someone in the locker room, which is obviously (?) wrong. So why does it seem more okay to me? Help!
I have a few potential explanations that I'm considering as candidates for the source of my cognitive dissonance here:
1. Perhaps in reality none of the aforementioned actions are wrong. It's not wrong to spy on someone in the locker room, and so it's not wrong to use 'x-ray glasses' to see through their clothes, or to use an AI to edit a picture to do functionally the same thing.
2. Perhaps instead it actually is wrong to imagine or fantasize about what other people look like while naked, and the reason this is so commonly accepted as benign is that it's impossible to enforce against. But if sexual mores are so arbitrary/constructed that something otherwise wrong can just be agreed upon as acceptable because it's unenforceable, how wrong can any ('victimless') violation of sexual mores really be said to be? And thus how wrong is the other situation, where one uses AI?
3. This kind of segues from 2: perhaps the ultimate cause of this dissonance is that modern-day sexual mores are completely stupid, so deeply incoherent that acceptance of any one of them will necessarily lead to cognitive dissonance when contrasted against some other. Is the solution to the 'deepfake' issue then to try to change our society's sexual morals/ethics into something more internally coherent?
None of these really addresses why I feel differently about 'turning a clothed photo into a nude' and 'transplanting, in a realistic way, a non-consenting individual's face onto an actor in a depiction of sexual intercourse.' I have no concrete ideas as to why the latter feels worse overall, yet in some other (minor) ways not as bad. And the latter situation is what the whole controversy with the streamer is about, AFAIK. Very confused about all this.
What's right here, and why? What should even be done? Should 'deepfakes' be illegal because of these potential moral/ethical concerns? Should the act of making a deepfake be illegal, or just distributing it? (I think if we wanted to, we could make both of these things illegal. We might not be able to prevent anyone from making them, considering the AI cat is out of the bag, but it still might be worthwhile to have its illegality on the books if it really is wrong. In other circles I'm seeing claims that a ban would be unenforceable (motivated thinking?), but it seems trivially easy to functionally ban at least the distribution of 'deepfake' porn in a way that would almost certainly reduce the dissemination of such porn, if not completely eliminate it, just as with, e.g., child sexual abuse imagery or zoophilia porn.)
I also see a lot of people in other circles being prompted by this discussion to argue about the ethics of AI image generation in general. I generally think this is basically stupid. The arguments claiming that AI image generation is tantamount to plagiarism (of the dataset images, I suppose) are all basically worthless as far as I can tell. But people who have bought into this line of thinking are now going as far as to say that, e.g., photorealistic AI-generated porn (even that depicting completely synthetic likenesses) is a sexual violation (of all the nude or semi-nude women in pictures in the dataset, I guess?). Either way, I am wholly unconvinced by these arguments and think they basically all stem from a bad understanding of how the AI works, so I don't think I'm super interested in discussing this axis of the debate. But I mention it because this community sometimes surprises me, so if anyone here has a really strong argument as to why this might make sense that they think I haven't seen before, feel free to mention it.
Among the responses to this post, one thing I saw several times was the claim that deepfakes do not affect the person they are made of, and so are ethical, or at the very least that there's no case for regulating them. But I think, as was mentioned at least once, that there is a case to be made that they are comparable to libel. That is, they can distort people's reputations in a negative way. This is bad, and I think it is something that can be pointed to as a harm to the person in question.
Furthermore, I think that the graphic nature of a deepfake would probably give it a more substantial and lasting effect on viewers' perception of someone than merely verbal allegations would, even after fabrications of both varieties were found out to be false.
I don't think this is a complete answer to what's going on with my moral intuitions here, because I have a similar gut feeling about someone dreaming up, rather than fabricating, illicit scenarios with someone. That means my intuitions are probably not quite the same as what I have written above, and what I've written is probably to some extent a post-hoc justification of those intuitions. But I still think this is at least a facet of what is going on that is worth considering.
It's also probably worth keeping in mind that a lot of people care a lot about what people think of them as an end-in-itself sort of thing, even aside from tangible effects on their lives. People want to be liked, respected, etc.
Have a disclaimer saying "everything you are about to see is fake". Libel problem solved.
But I'm not sure that it is, given what I said in the second paragraph there: this is the sort of thing that might lastingly affect how you view someone even if you know it is fake. I might be wrong there, but that seems plausible to me, and it would mean that while a disclaimer might get rid of whatever legal claims you could make, just saying that it's fake might not entirely prevent it from producing the harmful effects.
Humans aren't perfect Bayesian intelligences, and this might be one place where the differences show up, maybe.
That makes sense, but then you can't use existing notions of libel to justify your intuitions.
I think libel is still useful for thinking about it. Not to say that it violates the laws and so should be illegal already (I assume there's no case for it being libel under current law; a lawyer would know better), just that the same reasons we might think libel laws are good laws might apply here. I was arguing that it's fundamentally the same sort of thing as libel, not that it's actually, legally, libel.
But again, I'm not sure how much of that is me rationalizing.
I don't agree with limiting the transhumanity of our imaginations. Period. The individual should have access to the quality of life improvements technology makes available. Including vividness of imagination.
As soon as your data is inside another individual's desktop computer, not only is it their data, it's literally part of their corporeal being. It's inside that which makes them who they are and determines how they function and what they can do. I don't take this as a metaphor at all. This laptop is the hand with which I reach out and touch the world. This word processor is the mind with which I structure my thought. This is ME. And I will not be chained nor see my brethren chained without righteous fury.
Whether other people should be allowed to publish deepfakes of you is a different discussion.
At the very least, humans should have the same rights to protect their image that Mickey Mouse has.
Ultimately, I don't believe in Mickey's rights, though... my ideals are freedom-maximizing, but this produces contradictions in a world where people require scarce resources to flourish.
When humans see two things coupled, they correlate them. In a world where any two concepts/styles/IPs/morphs can be coupled at a touch of a button, most such correlations are in some sense spurious. Humans need time to adapt to this.
There's a difference in that spying on people involves an intrusion into their private space and sometimes even the threat of more violent acts. It means I have some way of getting into your house, or taking control of your personal devices, or that I have snuck into changing rooms that I shouldn't be in. It's wrong because the steps to doing it in the first place are wrong, and because it signals that you don't respect boundaries; your sense of security in your own home is damaged if someone does this to you.
The only instances of deepfakes being used to cause harm that I can think of involve some second step which is much less philosophically complicated to sanction. If I want to use one to damage your reputation, well, that's straightforwardly wrong, but the wrong there is in how the image was used, not how it was produced. It's wrong in a simple way, the way in which painting a nude and telling your partner that you posed for it in my living room (assuming that's out of character for you) is wrong.
When someone doesn't want you to see deepfakes of them, doing so violates their boundaries.
So, when is it wrong to violate boundaries?
We don't consider all boundaries sacred, and in fact consider some of them dumb or even harmful.
But even when someone violates a non-sacred boundary, we still usually think of them as a jerk.
Where does this one stand?
My gut says: you can appreciate all the deepfake porn you like as a consumer, but MAKING deepfake porn should be treated exactly like libel.
The problem isn't coomers doing their thing, it's someone searching your name and getting convincing video of you getting blasted that is three reposts deep in a chain and no longer carries any indication of fakeness.
This is not likely to happen to a pleb, and won't be a huge problem for, e.g., someone already famous, because our theoretical background-checker would see that @questionasker is big on the internet, so it's probably fake; but even so.
Interesting questions. My instinct tells me that it's not THAT wrong. My gut feeling is that the shame of having nudes of yourself circulated is that it shows that you did something "naughty", or you're "that kind of girl", or you're not the sort of person who would avoid taking such pictures in the first place. Basically, I feel like the shame comes from having taken the picture, not from the fact that other people are seeing it. Or it comes from simply being dumb enough not to be careful about who you give the pictures to, letting the video/images end up circulated. Or some combination of the two. Either way, if we live in a world where anyone can be deepfaked onto a lewd picture or video without having taken any action at all, then there's no shame in it. Absolutely everyone has a naked body, after all, and almost everyone has sex and a sexual side. There is no shame in having either of those, just maybe in using them in specific ways.
But they do feel shame! If you read the tweets about the Atrioc incident you can clearly see the women are very upset; if not shame specifically, there is certainly a feeling of violation. To be honest, reading those tweets, they strike me as post-hoc justifications. I think most women cannot clearly express why it disturbs them so much, but it certainly appears to. My explanation is that it comes down to losing control of the exclusivity of their sexuality, a primal reaction that they didn't reason themselves into. This would also explain, I suspect, a disparity between men's and women's opinions on this issue. I think most posters on the Motte think deepfakes are not a big deal because they are mostly men. Women and men both tend to care a great deal about the exclusivity of women's sexuality, and both tend not to care much about the exclusivity of a man's sexuality, even if they can't articulate why women feel this loss in a visceral way.
I do not think you'll get very far by believing that any of the emotional states presented for mass public attention by YouTubers on social media are authentic. Least of all on topics directly relevant to their personal branding.
That's actually a really interesting take, and I think it sounds like it could be true. If we wanted to appropriate woke terminology, we really could frame women being upset about deepfake porn, as women simply being upset at the loss of a privilege they have and care about, that men do not have: the ability to utilize the exclusivity of their sexuality to get what they want. To be honest, then, I think the best thing for the world would be to have deepfake porn technology be proliferated. That could help even out the sexual power dynamic between the sexes.
I disagree and I think I'm not the only one here (source: widely accepted social norms around clothing, nudity and sex).
Whenever this topic comes up, I'm usually surprised by the number of people who try to make the case that it's not actually a big deal, or who are confused that people would get offended by deepfakes of themselves being created. Personally, if I found out that someone had made fake porn of me or anyone I care about (or distributed actual pictures of them naked, etc.), I would immediately go kick their teeth in. I would do this because I would be incandescently furious that someone would do something so flagrantly insulting and disrespectful and then be dumb enough to let me find out about it.
And I know that everyone imagines these kind of things already, but there is a world of difference between imagining and actually producing/sharing a video/picture. In the same way that pretty much everyone is digesting food but I don't want to see it and I definitely don't want it on or near me.
OK, but this reaction is isomorphic to Wahhabis going "If another man talked to my wife unchaperoned..."
Just because you were brought up fundamentalist doesn't mean it's objectively acceptable to do violence to people for victimless crimes.
Then it's probably a good thing for me and the Wahhabis that we are indifferent to the question of whether something is "objectively acceptable" and are instead concerned with what we personally find acceptable and the means by which we may align reality to our respective visions.
What about just getting them fired and banned from using banks?
I don't think that those norms fundamentally disagree with me, either. If you're someone who has a naked body or has sex, that's all well and good. But to defy the norms is what will get you shamed, because it says something about you. For example, everyone has the capability of being naked in public, but only people who are slutty, stupid, or crazy would actually do it.
It might just be that the norm exists (don't be overtly sexual in public), and different people, like you or I, will interpret the underlying reasons for it in different ways (you think it's that people should feel some amount of shame for having a sexual side, and I feel that it's just that you should only feel shame if you're not careful enough to keep it concealed except from the right people).
Once again, that's not so different from what I said. I may disagree about "furious that someone would do something so flagrantly insulting and disrespectful". But I also said that part of the shame that comes from having nudes is simply being dumb enough to let other people find out that they exist. That sounds similar to "and then be dumb enough to let me find out about it".
Yeah, this is my intuition too. People aren't harmed by being seen naked and/or mid-coitus: people are harmed by having their friends, family, and coworkers know that they're stupid and/or slutty enough to be taped, then leaked. In the deepfake case, you weren't stupid and/or slutty enough to be taped, then leaked, so the sting is all but gone.
Would I want friends or family to watch a deepfake sex tape of me? No, but only to the same extent that I wouldn't want them to imagine a sex tape of me, either. And the problem I have with the imagining is that they're incestuous perverts, not that I've been exposed as a slut who doesn't do their due diligence checking the hedgerow for peeping toms with cameras.
The object-level issue for me is uncontroversial: there is zero damage from deep fakes, bans on technology to produce arbitrary content (without intent to do direct harm) constitute an unconscionable intrusion and the beginning of a civilization-ending slippery slope, people who support this censorious approach (or even have moral intuitions in favor of it) are barbarians, and I can only wish they be politically disenfranchised somehow.
Naturally, they think the same, or less, of me.
Maybe that's too harsh of me. I've skimmed replies to this tweet today and they left me a bit shell-shocked. Ion had the misfortune of his tweet showing up outside his bubble, somewhere in Normie Twitter. So... Well. Most of the time there is no argument being made, they just gloat about him being «ratioed», or ask sarcastically «would it be okay with you if someone made gay porn with your likeness» or say «imagine this was done to your mother/daughter/sister», assuming the shared intuition of obvious harm – and, by implication, hypocrisy and sociopathy in Ion's support for AI fakes of other people but not of himself. (There's also the rhetoric of consent/boundaries/violation, peddled by women and feminists, which IMO speaks to profound egocentric entitlement and belief that women are owed desirable perceptions, and this deserves more scrutiny, but I won't go into it).
Some responses provide a semi-rational steelman, though:
And here:
And my response is: that would be totally okay, I do not need low-value people in my life. If someone (not dearly beloved but clinically demented) cannot or isn't willing to distinguish real and fake images based on context, or just has strong emotional reactions to highly-likely-fake images and can change their attitude towards me on that basis, that person is a long-term liability and should be discarded. This, of course, is the generic principle behind provocative behavior and, in the extreme case, the saying «if you can't take me at my worst, you don't deserve me at my best», popular with crazy bitches. It's strategically sensible for them, because their life strategies are incompatible with being sanctioned for their worst. And disengaging with people who can't take the fake of me at my worst makes perfect strategic sense for me.
And then I realized how deep into a high-IQ, high-decoupler, high-trust bubble I am. For these normal (if very online) people, relations with others are quantitative in nature. They do not filter aggressively. Even if they themselves are more or less undamaged by fake stimuli, they can be materially harmed by losing «followers». They love their friends and family, among whom many are unable to mentally separate contexts. So these people's violent emotional reactions are attuned to their social reality, to the semi-magical realm of voodooist village or bullying in public school, where a literal straw man can go a long way in burying your reputation and, in the limit, you physically.
This just speaks to the need to separate society into more impregnable bubbles, I guess.
P.S. There's an additional, somewhat orthogonal aspect to this, explored e.g. by @SubstantialFrivolity downthread:
I've said it before and I'll say it again: Hajnal-type people, guilt-culture people, are fucking weird and freaky, and The Motte would do well to extend some charity to non-First Worlders who show little sympathy to them and readily join the anti-white brigade. They boast of individualism and freedom, but are chained at the very roots of their souls, and proud of those chains. They exoticize and mock Asian cultures for putting «face» and «shame» first, but it's much more healthy and, indeed, liberal and respectful to maturity of others (considering the inevitable consequence of moral blackmail) to care about your public reputation, than to agonize over concealed inner judgements – particularly by people who are not invested in being judicious, as is inevitable in modern societies that are just so much bigger than your dumb medieval Anglo village with a single parish. Their «Christian» morality is Harm OCD reified into a religion, a permanent backdoor inviting any psychopath to tinker and steer them (which inevitably happens in their collectives of substantial scale). Far from surpassing childish innocence, their moral code is more appropriate for ants than for adult humans.
But then again, they are the perfect substrate for building comfortable societies others will enjoy.
I admit that getting the population into the Goldilocks zone that combines dignity culture (which would allow people to fantasize, and to make and publish images, without absurd social opprobrium), reasonable face culture (which would allow them to do so without OCD-like pangs of ridiculous inner guilt, so long as they don't intentionally harm anyone), and avoids utter atomization, degeneration and extinction, is probably a task for 200-IQ social engineers. The ones we've got (I hypothesize) are pursuing other projects. If it were achieved, it'd probably look something like Eastern Europe, only more functional.
P.P.S. One consideration those outraged people are missing, or perhaps deliberately staying mum on, is that more deepfakes = less deception. With the proliferation of highly realistic fake porn, the prior for a given image being real decreases. Right now a microceleb e-girl's never-before-seen nudes may as well be authentic leaks, and in fact most of the time they are (or self-promotional «leaks», as it happens). When manufacturing their near-equivalent is a matter of two clicks, the prior is «duh, nice GPU bro», and the reputational and emotional damage, even in their ostensible paradigm, trends toward zero. But for those who profit off the exclusivity of their images of an indecent nature... well, their woe is a separate matter, and undeserving of being laundered through this moral rhetoric.
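To make the direction of that effect concrete, here is a toy Bayes calculation; every number in it is a made-up assumption, purely for illustration:

```python
def p_real_given_image(base_rate_real: float,
                       p_img_given_real: float = 1.0,
                       p_img_given_fake: float = 1.0) -> float:
    """Bayes' rule: P(image is real | it looks like a convincing leak).

    If fakes are visually indistinguishable from real leaks, the two
    likelihoods are equal and the posterior collapses to the base rate
    of real images among everything in circulation.
    """
    numerator = p_img_given_real * base_rate_real
    denominator = numerator + p_img_given_fake * (1.0 - base_rate_real)
    return numerator / denominator

# Hypothetical pre-deepfake world: 90% of circulating 'leaks' are authentic.
print(p_real_given_image(0.90))  # 0.90 -> a given leak is probably real
# Hypothetical post-proliferation world: only 2% are authentic.
print(p_real_given_image(0.02))  # 0.02 -> «duh, nice GPU bro»
```

The only work being done is by the falling base rate: flood the pool with cheap fakes and the rational credence that any particular image is real sinks with it.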
I don't think the distinction here is 'Hajnal-type people'. There are a lot of normal British white women saying the same stuff as Indian women in the comments. If the distinction has cultural roots, it can spread rapidly, and it doesn't seem to have race-related genetic roots, considering there was a lot more 'honor culture' in Britain or Germany a few hundred years ago.
They may be saying the same things, but this is an example of convergent evolution in actions rather than the two groups having the same beliefs. No different to how I don't kick random people in the street just like how Westerners don't kick people in the street. But in my case it isn't because of some misguided notion of "equality" as Westerners like to believe, but simply for the same reason that I don't kick a cat I see on the street: namely because it is animal abuse.
I think the matter is more subtle than this.
I obviously think most people can, on an abstract level, distinguish between real and fake images. However, I'm not willing to use this fact to jump to the conclusion that most people, including many people valuable enough to keep in one's life, wouldn't have strong emotional reactions to some types of even fake images depicting a person, especially images of a sexual nature. And however much they know on a conscious level that the images are fake, I feel like the reactions many people have to these images could at least somewhat change their attitude towards the person ostensibly depicted, in a real and meaningful way.
I think most people think differently about a person after fantasizing about having sex with them than they did before such a fantasy crossed their minds. I certainly think that most people would think differently about them, and would almost certainly in some unconscious way treat them differently, after fantasizing about such a thing 100 times. And I think as much is even more true if they've had access to photorealistic depictions of this fantasy that are fake but produced by something other than their imagination; in other words, images that are much easier for their sexual lizard-brain to believe are real, even if on a higher, more abstract level they know the images are fake.
Other than that, you're right that legally these things shouldn't literally be banned. It was a mistake to include that set of questions in the body of my post. And you're right that I think most of the online discourse surrounding the subject misses the mark one way or another, which is nothing new when it comes to subtle moral/ethical issues.
Aside from all this, though, the disconnect I felt existed in my intuition was resolved by another commenter, who described the delineation as such: anything in my head or on my hard drive, and exclusively in my head or on my hard drive, is entirely my business. But as soon as I, e.g., start to publish online what is on my hard drive, the probability that the persons depicted in even fake pornography will find out that someone has done as much starts to approach 1, and that's where I've started to cross a line. This is more or less what the quoted 'steelman' arguments you found on Twitter are getting at, even if still for somewhat wrong reasons: publishing that material, making it highly likely if not certain that the depicted persons will be made to know of it, is what is wrong, at least morally/ethically. By doing so I've made it their business, where previously it was only my own. Regardless of the particular way in which they're affected, which you might personally think shouldn't matter to them (e.g. loss of followers, family/friends potentially seeing it and not knowing it's fake, or at minimum just having to know about it at all), it wasn't necessarily my right to make them deal with any of these things, even if I think they shouldn't care about them. The commenter who described the analogy I found so apt likened it to fantasizing about a person sexually, and then directly telling them that you have, in fact, fantasized about them sexually. Maybe you think they still shouldn't care. But as far as I'm concerned, by doing this you've made something that was formerly solely your business into their business, in a way I don't think you should.
I suspect this technology (deepfakes) will not amount to anything, by virtue of creating a negative attention feedback loop. As others have pointed out, you could photoshop people's faces onto naked bodies for as long as photo editing has been around. I'm sure a few hundred high-schoolers do it every year; they have a laugh with their friends and then forget about it. Deepfakes being new and higher-fidelity lends them a temporary fascination, but I don't see how the underlying activity is different. You deliberately create a sort of optical illusion to half-convince yourself that you can see a person naked when in reality you cannot. Yes, the potential for using a deepfake for libel is much greater compared to just photoshop, but then again lying isn't a new invention either. To the degree that this technology becomes ubiquitous, its allure will fade.
Interesting perspective.
Why does the pornographic aspect matter so much?
There are plenty of ways AIs can be used to misrepresent someone, many more offensive, objectionable, and damaging than pornography. You might have Trump reveal what he did on Epstein's island, or Biden discuss what he really thinks of trans people. There's not some magical quality that makes sexual misrepresentations inherently more worthy of social consideration than those things; personally I'd strongly prefer someone make a deepfake of me getting fucked by a horse than put a racist rant in my mouth.
So maybe as a principle we might agree that anyone should be granted the right to call for government intervention when something fake is posted about them. But that opens an entirely different can of worms. Anyone can plausibly claim anything is fake; some of those things will be real. And so you need gatekeepers to choose which fake things are really real and which real things are really fake. Those gatekeepers will be subject to capture.
That seems like a strictly worse world than one where people learn "never trust or believe in anything on the Internet."
deleted
It's one of the more popular copypastas. 1 2
The Trump one has him speaking a bit too quickly and cursing too much, which broke the realism a bit. Easily fixed with better training and a better script.
And yeah, it's all horrifying and horrifyingly inevitable. Imagine if someone took the Biden copypasta one and put it in the voice of a parent or partner and sent it to a transwoman. Or even using her own voice. (All you need is 10-30s of audio to generate the voice.)
One interesting, tangentially related issue: deepfakes could destroy the CSAM market, because it will be far easier and safer to generate fakes than to document a real crime. If someone wants internet points among the abusive pedophile community, they'll get far more of them from generating fakes (and claiming they're real) than from the real thing itself.
On the other hand, there's a plausible argument that ubiquitous CSAM could induce more people to commit child sexual abuse. I'm not sure which effect would dominate.
Didn't Aella have a post a while back suggesting the government itself create a giant, free database of AI CP as a way to undermine the market?
I don't know if it would work, but it does seem like an interesting idea that only a rat-adjacent person would come up with, for good or for ill. It's never going to happen regardless, so there's no need to worry about what the real world effects of such an experiment would be.
The government will never have a CSAM database (generated or not) that it uses to divert pedophiles, regardless of the effectiveness or not of that particular, almost parodically highly-decoupling idea.
In what I'm suggesting, though, it's not the government or external entities that would be doing anything. It's pedophiles themselves, who'll organically disrupt and flood their own market with fakes, with no way for anyone to prove that something is expensive, authentic abuse material.
The one thing I might worry about with that is that pedophiles are only using porn as a substitute for the harder to get thing that they really want, and they likely take pleasure in sharing evidence of their crime regardless of who pays for it. Flooding the market might make it harder to detect the real thing without actually dissuading pedophiles from abusing children.
Rationalists aren't the only people willing to break taboos, and I've seen the idea of a 'public CP database' in many other places, e.g. edgy political debates. And this was before AI: just "a database of old, already publicly shared CP made available and legal while current stuff is kept illegal, to reduce the incentive to make or look for new stuff." Rationalists just combine it with 'very smart' and 'committed to kind and reasonable discourse'.
My general read on this stuff is that our moral framework, including freedom of expression and thought, implies that these things cannot be punishable moral transgressions, and the ick factor comes more from the way it changes our expectations of the pornography viewer.
They aren't causing real damages by doing this (except in the case of distribution and claims of authenticity, which is covered by the moral frameworks around libel). Using your likeness moderately infringes your intellectual property, but in my experience most people don't have that strong a moral reaction to IP violations. I think there is some sense of sexual property that is infringed, in that you should be able to extract your deserts from your sex appeal and reproductive potential, but there has been a lot of pushback against this moral precept as part of the sexual revolution. The pornography viewer hasn't done anything wrong yet, but they have revealed that they want something from you (implicitly, something you don't want to give them).
Imagine you own a boat and your neighbor fantasizes about having your boat. I think it is clear that his fantasies don't constitute immoral action, but they bring into question every interaction you've had. When he gave you his old garden tools to help you get started, was that genuine generosity or a lie to get in your good favor? If you leave on vacation, can you trust him not to steal it? Some of these are resolved by disclosure, e.g. if your neighbor gives you the garden tools in exchange for lending him the boat for a fishing trip, but that doesn't resolve the unmatched value functions.
I think a reasonably sensible analogue is “cutting someone’s face out and putting it on somebody else’s body”.
The fact that it’s AI doing this, so the lighting and skin tone matches doesn’t seem that meanfully different to me.
What is different is (the fear of) the breakdown of norms. Some highschooler photoshopping a naked picture of their crush is probably bad, but doesn’t seem to merit an Official Response (e.g. a school assembly telling students not to do that, a law about it being illegal, scanning students school computers for offending files, etc)
On the other hand, high school boys passing around deep fakes of naked celebrities, and then that morphing into passing around deepfakes of their classmates make a personal vice into a social problem.
Again, I don’t think AI is directly related to this. Using photoshop to make these pictures isn’t particularly different, it just raises the effort required (and decreases the creep facto -- spending 2 hours to photoshop one girl makes you seem like a loser)
The only violation here is one of licensing; the deepfakes use a person's likeness without their permission. Other than that, I don't see any problem.
Atrioc is only in trouble because, as he said through his tears in his apology, he's done all he can to make his chat and audience inclusive, especially to women. He cultivated an audience of offense-seeking hall monitors and now it's biting him in the ass when he's made a mistake, as everyone does. If he was a fucking degenerate none of his audience would have cared. I can't imagine someone like Asmongold's audience particularly giving a shit, beyond mocking him viciously for paying for porn. He made this rod for his own back, to some degree.
As other people have said, this is no different in concept from the NSFW fanfics that litter sites like AO3, or NSFW "fan art", or celebrity photoshops, or anything else. It's just better, that's all. And yeah, I can understand people feeling uncomfortable seeing works like that about themselves. But they can always just not look.
If it's there, though, they'll still know it's there, even when they're not looking at it. Thus they will suffer some psychological harm that they wouldn't have suffered if it just hadn't been published in the first place.
Is it moral for me to publish something, if the very fact that it has been published will cause someone to suffer psychologically? I think unless the value gained by publishing that thing is high (high in a relative sense, as in, greater than zero) it is immoral to do this. And I think the value gained by something like porn is basically zero.
The same idea applies to using people's likenesses in memes. Take the woman in the "first world problems" meme: I am sure that being the literal poster child of getting-upset-over-silly-things isn't what she wanted out of life.
If those memes are distributed for free--and they are--does the woman have the right to ask websites to take them down?
So? We can't go our whole lives avoiding doing anything that causes people trivial amounts of "psychological harm". That kind of mental safetyism is abhorrent and would have us all walking on eggshells for our entire lives, and having our behaviour dictated by utility monsters. Or empathy monsters, I suppose.
The value gained from porn obviously isn't zero, as people pay for it, and people enjoy it.
This kind of distress is felt by a vast majority of the human race, at least that portion who even understands what the Internet and photos are. Unless you're claiming that most of humanity are utility monsters, I'd suggest that if you don't feel such distress, you are extremely weird and you should avoid typical-minding on this subject.
This kind of distress is contingent on societal values though. It'd be incredibly weird in $X_BC to not care about the virginity of your daughter or future wife, less so now. "There are deepfakes of me" seems like something people would adapt to and stop being distressed about if it became commonplace (and there's a good chance it will, the interest is there and a stable-diffusion sized deepfake model you can torrent and run on your pc is technically doable). Is 'psychological distress' even bad, absent something worth being distressed over? Compare to ... confusion and doubt in a tough intellectual problem, it isn't comfortable or pleasant, but it isn't bad.
Distress is over-egging it a lot. Mild discomfort seems like it should be more appropriate, at best. Literally what is there to be upset over?
Woe is me, someone finds me attractive enough to go to the effort of faking a video of me having sex so they can watch it and pretend it's real. And people are paying for it! Oh no how awful I must feel, being in such demand.
If there's one thing we know about humans, it's that they absolutely hate feeling sexually desirable.
You are extremely weird and you should avoid typical-minding here.
Unnecessary. You already made this point, less antagonistically.
IMO this is the correct answer. It is in fact wrong to fantasize about what someone looks like naked, or having sex with them, or what have you. It's very common, yes. But it's still wrong.
Let's say for example that you regularly fantasized about some female friend being naked. Furthermore, let's say you never told a soul but did write it in a diary which you kept safe and never let anyone see. Some might say you did nothing wrong. But even so, if your friend decided to snoop in your diary and found that out she would be profoundly creeped out, and the friendship would be seriously damaged. I think the same would happen for a male friend too, of course, this isn't a gender thing.
I don't think this is a good argument. First of all, we don't agree that it's acceptable. We simply realize that it's impossible to tell, so we can't do anything about it. Those aren't the same.
Second, I don't think that whether a norm is enforceable has any bearing on whether the activity is actually wrong. Even if we can't catch a murderer and bring them to justice, we don't say "well I guess it wasn't that wrong to murder that person". The immorality of an act, and our ability to punish that immorality, are unrelated to each other.
Coming back today, you seem to have disengaged, so if you're not interested in replying that's fine, but
Could you elaborate on the reasons you find this morally wrong? A lot of words were written, but I don't think anyone has a good idea of why you find it so.
Sorry, I'm not trying to disengage so much as I just plain didn't have time to reply to you and it fell off my radar. Your last reply was challenging enough that I knew I would have to sit down and think about it, but never came back around to it. Thank you for the reminder, I'll try to remember to actually give you a real reply sometime today.
Jesus agrees, but I couldn't think of another moral code that would prohibit that.
Yeah, but most people ignore the next line, about cutting out your own eye if it causes you to sin.
I think possibly only the early Christian writer Origen and the Russian Skoptsy ever took this advice 100% literally. Every other Christian prefers to believe that Jesus' advice is metaphorical in some way.
I could see that line being an example of Jesus no-selling a claim of displaced responsibility, though. "It's not my fault, it was my eye that was responsible for the sin!" "If you're serious, get rid of that sinful eye, and be free of (this) sin...wait, no takers? Who knew."
This is appealing to consequences that only result if you leave records of your fantasies. I don't think it can extend to thoughts without completely changing the underlying situation.
Imagine applying the same standards to almost any other fantasy or imagined scenario you can have with another person.
Is it wrong to imagine yourself in an action movie with another person? How about if you do it frequently, and write it down?
Is it wrong to imagine the reaction another person will have to a gift you plan to give them?
Is it wrong to imagine conversations with other people?
Is it wrong to imagine punching another person?
I just don't see what line sexually fantasizing about another person is supposed to be crossing that these other things don't. I think policing thoughts is harmful and unproductive, and it is better to just accept that people all around you are imagining and doing things with their remembered images of you in their brains all the time. If people remember me at all when I'm not around, I'm flattered more than anything, even if they are remembering me in a negative light, or projecting me into a scenario that is harmful or embarrassing to my imagined doppelganger.
I think at best you could get a norm that amounts to, "If you fantasize about someone you know in real life, don't leave a paper trail."
I would add to this the very common self-help advice to visualize the success you want to have. As in imagining yourself winning the race, award, promotion, etc. And one of those et ceteras is "get the girl." Is it morally wrong to imagine oneself asking out a potential partner? Getting a yes? Having a great conversation over dinner? The first kiss? These don't strike me as remotely creepy. Why is "we have a great time together" creepy when you add "getting it on?"
The problem with your argument is that you assume those other things don't cross lines. But fantasizing about hitting someone does cross a line, for example. It's bad to do that too. If I had to try to generalize a principle out of this (which I'm not sure I have the chops to do), it would be something like "don't fantasize about doing something with/to someone that they wouldn't want you to actually do with/to them". Fantasizing isn't bad in and of itself, it's the fact that you're fantasizing about something they would not be ok with that upsets people. Thus, fantasizing about having a conversation is fine because having a conversation is fine. Fantasizing about punching someone in the face is bad because punching them in the face is bad.
I also think you're really missing the mark if your takeaway is "just don't get caught and it's ok". I mentioned the diary because it's the only real way for someone to find out, but it isn't the record that would bother someone. It's the fact that you are doing it at all. "It's ok as long as I don't get caught" is literally the moral code of a child, but as an adult one should realize "no it's wrong even if nobody will ever know".
I don't think it's worth spending a lot of time on, but this sounds bat-shit crazy, neurotic, unhealthy, and self-flagellating.
Or do you just have something against imagination and fiction entirely?
As a different perspective, avoiding fantasizing about things that would be bad to do in real life sounds like an aspect of virtue ethics. It is neurotic and unhealthy to focus on something that will never happen. Epicureans would focus on obtainable pleasures. Buddhists would say that these desires cause suffering. And so forth.
I think @SubstantialFrivolity is arguing that there is a very real moral and psychological injury being done to the people engaged in making and consuming these AI Generated images. I don't know if they would extrapolate to porn in general, but I would.
I mean, I mostly agree that it's not productive, and often not healthy, to spend a lot of time thinking about things that won't happen.
I think bringing in a moral judgement onto it makes no sense though.
For me, morality and health are intertwined. Any time someone says "should" they are making a moral judgement. Any time someone says, "I shouldn't do this, it's not healthy" they are making a moral judgement. "I shouldn't eat dessert, it's not healthy," is a moral decision that increases the virtue/habit of prudence and fortitude.
Fantasizing about sex with (an uninterested female friend) isn't just about 'having sex with them immorally'; it could also be part of the motivation to see if they are interested or to pursue them, or, even in a conservative moral framework, to attempt to court them for marriage and then have sex. Whether something is obtainable is not something one can know in many cases.
Do you think there is actual benefit to fantasizing about having sex with someone, in the eventuality that you actually get to have sex with that person at some point? I am not very certain that imagining having sex with a woman, picturing her liking this, enjoying that, actually helps when you encounter the flesh and blood woman, who likely acts and enjoys completely different things. In fact, I think it probably hinders a fruitful, mutually pleasing sexual encounter.
My argument is fantasizing about sex is ... part of or deeply related to desiring sex, in (same analogy as before) the same sense that 'imagining tasty food' is part of wanting that tasty food. This may be described as 'wanting it so badly you imagine it', but I don't actually think they're separate, or that 'imagination' is a discrete thing separate from normal thought. If you, just as a casual action, plan to reach for a cup, do you "imagine" reaching before you do? Not really, but ... sort of, partially, vacuously?
So 'imagining sex with someone' is just a normal thing. It's possible to spend too much time imagining it and not enough time in pursuit, and that could 'make the sex worse', but I don't think it's made worse in the normal case of imagining it.
Would you extend this standard to all forms of fiction, be they novels, movies, or video games, in which the protagonist harms or kills others? Or only if such harm is justified in context or the morals of the story are considered appropriate or applicable to the real world?
I don't extend it to novels where a protagonist harms another, has sex, or does any specific immoral action. I would extend it to a form of fiction whose sole point was to dwell on or glorify violence, sex, or a specific immoral action. Most forms of fiction provide some sort of philosophical evaluation of right and wrong, and utilize immoral actions to demonstrate it. Or they provide a psychological snapshot of someone else's viewpoint, which broadens the mind of the reader. Or they provide a glimpse into another way of life.
Something like Agony in Pink, on the other hand, takes a little something away from everyone who reads it, be it time or a tiny amount of psychological well-being.
Right now I am fantasising about having sex with either you or @vorpa-glavo. I am respectfully appreciating the other of you. Who have I injured? No one, because things that happen exclusively inside my head have no effect on either of you. What injured your friend with the diary was learning about your fantasy, not the fantasy itself - before they invaded your privacy they were unaware and as a result unbothered.
What if it was a dream? What if I dreamed I was sexing either you or vorpa while a duck with the face of my father sang Uptown Funk backwards into a cucumber? Knowledge of this might affect our relationship negatively, and if you read about it in my dream journal one of you might be upset, but am I really to blame for the random firing of synapses in my head? No, and if it had remained a dream no injury could be considered. It's not 'it's ok as long as I don't get caught', it's 'it's ok as long as it doesn't affect reality'.
You might then argue that fantasies often provoke real world actions and I would agree, and say that uncontrolled impulses are much more morally fraught, but that doesn't implicate the fantasies themselves.
I don't think "nobody was injured because nobody knows" is a reasonable defense. I don't think that there needs to be an injured party for something to be wrong.
Also, dreams are an entirely different thing than actively fantasizing. The latter is a choice you make; the former is the random firing of synapses in your brain. Intrusive thoughts that you don't dwell on are similarly not wrong. I've had dreams where I cheat on my wife, and I feel scuzzy in the morning. But once I get out of the post-dream haze, I realize I didn't actually do anything wrong. However, if I were dwelling on a fantasy about cheating on my wife I would be doing something wrong, because that is a choice and is under my control.
Okay. I accept this as a description of your values. I entirely disagree and do not empathize.
No harm no wrong is my view. And idle horny thoughts are the very peak of inconsequential harmlessness.
I think there's a legitimate concern that this kind of rule or principle would be too broad in its application, and lead people to unhelpful policing of their own mind.
People who tend towards internalizing disorders like anxiety and depression are already predisposed to get very inside of their heads, and second guess themselves. Telling people like that that they need to worry about whether they're thinking about other people in a respectful way seems like it's just giving these people more tools to torture themselves over their inappropriate thoughts.
I'm fairly emotionally stable, and don't tend towards internalizing disorders, but I have a few friends who do, and the insides of their minds sound like horrible places. They're constantly being unkind to themselves, and even when their lives are going well they feel guilty and can't allow themselves a moment of happiness. Telling people like that that they have to feel bad about sexual thoughts or fantasies about other people is just not going to be good for them.
It's going to, at minimum, push people toward OCD-like scrupulosity: constantly having intrusive thoughts about fantasizing about the people around them, then beating themselves up for their failure to live up to the highest human ideals.
I'd rather have rules that don't stigmatize normal parts of human cognition, and don't have the risk of being taken way too far for a portion of the population.
I think the norm of, "It's perfectly normal to sexually fantasize about people you know, but don't let your fantasies affect how you treat them", is a much more actionable norm with fewer downsides, compared to, "Sexually fantasizing about people you know is morally wrong, and you should probably feel bad for doing it."
I think that's a fair point, but that to combat that we should emphasize the difference between thoughts that just pop into your head and thoughts you actively entertain. It's only the latter which poses a moral problem, not the former.
I do sympathize with the plight of people who struggle with mental difficulties. I am one of those people. But I also don't think that the solution is to say "well it's ok" out of concern for their well being. There has to be a middle ground, it seems to me.
I think at least 80% of men regularly fantasize in some way about the naked appearance of / sex with some female acquaintances. Offense at that idea is some combination of: openly stating 'I fantasize about sex with you' being a strong signal of intent, which is then responded to as a signal of intent, in a way that 'a guy staring at a girl, and what that implies' isn't; and plainly incorrect ideas about 'sexualization' being bad. "Fantasizing about seeing a girl naked" is just part of considering/desiring/intending having sex, in a similar way that imagining good-tasting food is related to wanting good-tasting food, and given said desires are good and useful, said fantasy doesn't seem bad.
Single men today would not be creeped out by a female friend fantasizing about sex with them.
This is true, but practicality is part of morality because morality is about actions and their consequences, and practicality influences the consequences. Weird analogies: If one of my roommates collected his dead skin, powderized it, and sprinkled it all over the apartment when I wasn't looking, I might be mad. But everyone does that naturally, so ... whatever, it's fine. If herpesviruses were rare and preventable, transmitting them would be considered really bad, but they're pretty universal so whatever.
Yes, and? It's still wrong, even if 100% of men did it.
I disagree with your assertion of what morality is about. Consequences don't even come into it for most systems of morality (and indeed I personally think consequentialism is generally mistaken). I certainly don't agree that practicality is part of morality. It can be part of the enforcement mechanisms of morality, but it isn't part of morality itself.
I tried to answer the 'yes, and' in the rest of the paragraph - men imagine sex because they want to have sex, and having sex is good because (if you are a progressive) sex is fun or (if you are more right-wing) sex leads to children. Either way, 'men desiring sex' is good; even in a reactionary society, men desiring sex motivates them to get married.
Can you elaborate on why such fantasies are immoral? Your post read like "If a female friend knew about it, she would be creeped out and it'd damage the friendship". But this is just argument by "other people believe it" - if my great-great-grandparents were alive, and they knew I was atheist, that'd damage our friendship - charitably this is because they'd hold incorrect beliefs about the relationship between "being atheist" and morals or character. You might claim that 'a person's feelings matter even if they're wrong' ... but them knowing and being upset is a hypothetical. This points at some wrongness that these women understand ... but presumably you understand it too, so why not just say it?
[tangential] This isn't true, because the list-of-virtues or list-of-good-things was either (usually both) intentionally designed to lead to good outcomes (like a religious or legal code), or culturally evolved for said good outcomes. Like, if 'thou shalt not kill' is on your list-of-virtues, it's there because killing leads to death, which is a ... consequence. Of course many disagree with that. My point wasn't exactly 'consequentialism always', though - even if 'don't kill people' is moral not because of its consequences but because it's a moral fact, like a law of physics, 'kill' still means 'cause somebody to die', and death is a consequence of actions. There's clearly a relationship between "it's immoral to kill people" and "don't drive while drunk", but that's because the consequence of 'driving while drunk' is 'you're worse at driving', which leads to 'sometimes you crash into someone'. And that's all you need for my 'morality is about actions and their consequences' bit. And then the way in which 'imagining a naked woman' is actually immoral becomes very relevant - is it immoral because it constitutes improper desires that might be followed through on? Is it as immoral to imagine someone in very revealing clothing, but not exactly naked, as to imagine them naked? What if you've already seen them in that, say at the beach? Is imagining them naked as immoral as imagining sex with them? These all matter if one wants to reduce you or others doing these potentially immoral acts. But it also makes the proposition seem less plausible - why do the laws of moral physics reach into imagination?
I think it's important to flesh out why you think it's wrong. The assumed fact that 80% of men do this seems like strong evidence that it is normal behavior, and normal behavior is not ordinarily considered morally wrong. It is my understanding that the Christian perspective on this is that imagining anyone naked is cultivating lustful thoughts, which will naturally lead to sin. In your system, is it wrong to imagine your wife and mother of your kids naked? Is it wrong to fantasize about eating at a buffet until you have to unbuckle your pants? Is it wrong to fantasize about winning the lottery? About someday getting a sweet ox like your neighbor has?
I'm of two minds. First, I'm not even the kind of person included in the snide "men are imagining all the hot women around them naked anyway" camp. I don't know if I'm just not as much of a horndog as other men, or if it's a weird preference thing, but it just never occurs to me to imagine people I see in the real world naked. So if I typical-mind other men, I have some sympathy for the kind of gross feeling people have about this, and am not very sympathetic to the idea that men are entitled to this specifically.
On the other hand, this feels like the type of objection that pattern-matches to "think of the children" for me. I would quite like the future where I can conjure up AI companions, maybe modeled after real people, and I'd like those companions not to have little experience blocks like nudity interrupting the immersion of them being real. And the no-nudity bit is just the foot in the door. First it's heavily clamping down on companions modeled after real people, then it's any companion with a likeness to real people, and eventually it's "nude AI companions are rape of real women, actually". I'd like to cut this off at the source. We're going to have to get over the fact that images, videos, audio, and every other type of media are going to get very easy to fake very soon; the cost to stop this future is very steep and probably can't even be sustained long term in a global marketplace. So let's think a little bit more about where to cleave the reality we are soon going to find ourselves in at the joints, instead of just having our future depend on what order the objectionable bits come into play.
And I think the reasonable border is something like: attempting to actually pass a piece of fabricated media off as real is illegal and wrong, while creating one clearly marked as fabricated is ok, even if the motivations and methods are open to critique.
As for why I think the AI angle is important: in actual use, these things are going to be creating companions based on reference to other people. If I want a companion to narrate some stories, I may ask for one with a voice like Morgan Freeman or some other reference point that the AI is going to attempt to imitate. You already see it in all the different generation models we've seen: "Write an essay like Scott Alexander, paint this scene like Picasso, read this text like Norm." It's inevitable that soon deepfakes will only be made by AI, and making AI unable to do this would cripple AI.
Barring some development in physics that enables something like time travel, I don't see how an AI program like that could even theoretically be possible. The information just isn't there, and it's trivial to come up with extremely plausible examples where no AI could reconstruct the exact way someone looks naked - e.g. if someone has a tattoo on their stomach.
As such, I don't see this as a gradient. When we're talking about AI or deepfakes, we're necessarily talking about using computers to create brand new illustrations that didn't exist before. In the future, AI could (and likely will) get so much better that we could use clothed pictures of someone to easily and quickly create illustrations of that person that look, to a very high degree of accuracy, as if it were just a nude photo of that person. However, given that that accuracy can't get to 100% or even all that close to it, it's categorically different from, say, snapping a nude photograph of that person.
To me, the strongest argument against deepfakes is the potential distress caused to someone who might accidentally (or intentionally, for that matter) run into such things of themselves. I place this in the same category of suffering as the suffering of some Fundamentalist Christian or Muslim at observing gay people kissing or a cartoon of Mohammed; given how much damage some gay couple or a bunch of pixels can do to these individuals (i.e. none) it's suffering caused entirely by the individual's decision to deem those things as causing themselves suffering. A bunch of pixels that are arranged to look like an accurate photograph of someone nude simply lacks the ability to do anything to affect that person beyond what that person chooses. At best, perhaps a bunch of people jacking it to pictures of someone would lead to those people treating that person differently, but a. I don't particularly see why this would be true and b. even if it were, the issue would be the people choosing to treat that person differently, not in them having access to such pictures.
I've been thinking a lot about this question given certain internet drama I'm plugged into, and this analogy was really helpful for firming up my own internal position, so thanks.
This matches my intuition. For someone to just generate deepfakes they just keep to themselves? I've got no problem with that. For someone to distribute those deepfakes around, possibly (but not necessarily) passing them off as real has the potential for harm.
In the case of spying, I think punishment is valid even if the act isn't technically wrong, because spying will usually lead to information being used in harmful ways, like revealing facts or blackmail. Even if spying is done without those intentions, sometimes secrets are just too good to keep; it's playing with fire. Deepfakes don't have that same problem.
I'm starting to think along similar lines. It seems like it's the actual distribution of the deepfakes that sets them apart in my intuition - not necessarily because of the images being distributed in and of themselves, but because distributing such images necessarily means they will be available publicly, and if they are available publicly, the depicted persons might learn that people are doing as much (creating realistic porn about them, 'viewing' them naked in a realistic way), which is what typically seems to cause the depicted persons psychological harm. Given that it's wrong to cause people psychological harm, this is what makes it immoral. I'm starting to think a similar distinction would lie between, e.g., masturbating while fantasizing about someone sexually (and keeping that you did so entirely to yourself) and masturbating while fantasizing about someone and then telling that person that you did so.
Fake celeb nudes created with programs like Photoshop have been popular for decades. I remember there being sites dedicated to them, porn forums having subforums for them, and people getting annoyed when they inevitably got mixed into collections of actual celeb nudes. (During the moral panic about deepfakes a few years ago, the subreddit for them was banned.) So discussion of deepfakes should account for the fact that they aren't particularly novel. The main difference is that they can be used for video.
Incidentally, while I don't think I ever heard of a celebrity commenting on nude photoshops, I remember at least one who stopped doing nude/sex scenes because, with the internet, people could share videos of those scenes without the rest of the movie.
Maybe. Couldn't it be true that these types of images were unethical to create/consume for as long as they've existed, and there just wasn't proper recognition of the problem?
Another commenter brought up that a significant reason why at least distributing deepfake porn is immoral could be that, by making it openly accessible on the internet, one makes it likely that whomever the deepfakes depict will find out that such porn is being made of them - and it is specifically the psychological harm inflicted by knowing that porn like this exists of oneself that accounts for (or accounts for the majority of) why such porn is unethical for people to make or consume. This would also explain why previous iterations of 'fake nudes' weren't as hotly debated: they weren't as commonly distributed until now (perhaps because they weren't as realistic, and thus not as popular).
Manually created photoshops are generally higher quality than deepfakes; doing it with AI is just more automated, and thus more useful for applications like video, more obscure celebrities, or larger quantities. I'd say that as a proportion of the internet and of internet pornography, celeb nudes (both real and fake) have noticeably gone down. In 2006, one of the earlier Simpsons episodes referencing the internet had Comic Book Guy downloading nude Captain Janeway. (I'm not sure if Kate Mulgrew had any real nudes, but I guess that might be different in the world of The Simpsons anyway.) Janet Jackson's 2004 Super Bowl nipslip was a major inspiration for creating YouTube. It's just that the internet as a whole has grown, so celeb nudes are now smaller compared to behemoths like Pornhub even if larger in absolute terms. And of course now celebrities themselves have an internet presence, and there are all sorts of micro-celebrities, while the culture might be less focused on the biggest celebrities than it was.
That observation is a very useful starting place. When I find myself in a similar confusion, I try to switch my perspective to a more traditional view by imagining it involving my kin. Like: "What would I want to do to the guy who did this to my 18-year-old daughter?"
If a guy uploaded to pornhub a realistic sleazy deep-fake porn with my daughter's image and distributed the link within my community, I'd be contemplating the percussion sound of a baseball bat striking his kneecap.
Now that I have an anchor to my reaction, I can explore its possible reasons.
The modern US culture is (broadly) a culture of dignity, where "sticks and stones may break my bones but words will never hurt me" is an aspirational ideal. If I aspire to this ideal for myself and my hypothetical 18-year-old daughter, then the sleazy deep-fake porn is "words" that I and my daughter ought not allow to hurt us. We would then treat the incident as we would one where someone created a fake LinkedIn account for my daughter, or a controversial blog post written in my daughter's name, or one where someone hacked my daughter's Twitter account and posted some unsavory tweets in her name.
In a culture of dignity, I would assume that my daughter's dignity cannot truly be compromised by something she didn't do (in this case: make a sleazy porn video). I would understand the need to correct the record -- have pornhub take down the video, issue a clarification within our community -- and I would regard that task as an annoyance.
However, underneath that culture-of-dignity veneer lurk centuries of cultures of honor. It doesn't take much for me to get into that mindset. By creating the deepfake porn and distributing it among my community, the guy compromised my daughter's honor--altered for the worse her reputation among my community--and by extension he compromised my honor. Swift baseball-to-the-kneecap plus spreading the word about the retribution is pure restorative justice.
(But what if the guy didn't distribute the deepfake? Like, what if I found it by browsing his laptop? The threat of distribution is there. So my gut response is to get immediately angry and see that he erases the video and promises never to do that again. Presumably, if I am browsing the guy's laptop, the guy is part of my community and I will have social levers to ensure compliance.)
The question is then: what culture does my community have?
If it's Blue Tribe PMC: my daughter's reputation will rise by spreading word about (a) her stoic response to someone's attempt at reducing her dignity, (b) our levelheaded pursuit of legal means of redress, and even (c) our high-brow discussions on why our culture regards sex as shameful in the first place.
If it's Red Tribe Appalachia: out comes the baseball bat.
I'm not sure why you're bothering to make yourself one degree removed by making this about your 18-year-old daughter.
Imagine: I get access to pictures of your face, and I'm annoyed about your opinions about deepfake porn, so I decide to get some completely legal revenge. I create a deepfake porn video of you being tied up crying and screaming, before someone shoves a ring gag in your mouth to make it easier to throatfuck you, which several faceless men proceed to do, followed by someone sticking a barbed dildo into your ass and leaving you there, fade to black.
I smack a giant "this is a deepfake porn from Deepfake Porn Productions" watermark across the bottom of the screen, making sure not to obscure the semen dripping artistically from your slightly bleeding mouth. I send this video to your coworkers, friends, and family (all above age of consent, of course, I wouldn't send porn to minors). I even carefully label the video so they know exactly what's in it before clicking, I don't want to trigger them. I also upload it to pornhub, why not.
Why involve your barely-legal 18-year-old daughter in this scenario?
Let's say you feel the specific problem is that it's sexual harassment for me to upload it anywhere. It's fine as long as I don't send it to your acquaintances, or it's fine as long as I don't upload it to reddit, or whatever. Is it okay for me to let only you know that I have an entire library of such videos? I promise no one but me will get to see them, I just like the deepfake of your voice begging me to stop.
Is the idea that it's fine as long as I keep it secret and never take my laptop full of videos of you to a repair shop where a technician gets to see you taking dick like a seasoned pro (who knew your asshole could gape that wide, wow -- just to be clear, this is a deepfake, this is not real, it's just your face attached to that body type I carefully matched to be as close to yours as possible)? We're past the revenge porn scenario here, I'm keeping this all to myself, it's just that I find it really satisfying to watch someone use a barbed sound on your urethra while a face that looks like yours is crying about it.
Anyway, like I said. Your daughter isn't necessary in this scenario. We can keep the discussion entirely to the ethics of me doing this to you. Concerns of anonymity on the motte aside, how do you feel about sharing a photo of yourself with me after this comment?
Two things about this scenario -
It seems like this video will definitely not remain secret. In your scenario, there is no way for it to remain secret - even if you never ever show it to anyone, a computer technician will see it, and even if that didn't happen, you still need to tell me you have the videos and that you need to see me begging you to stop. Because the videos by themselves are meaningless. The act of sharing them is a necessary component.
I am not friends with people who want to violently sexually violate me. I get the impression that we are actually enemies - maybe work acquaintances, or rivals for a lady's affections? Either way, I am truly, deeply sorry for dominating you so totally and completely that you have built an entire library dedicated to fantasising about my submission. Also, though, thanks for telling me about it, because now I truly own you. Could you strip the deepfake warnings off it, however, and pretend it's real? There is zero challenge in destroying someone showing others their violent yet impotent seething.
I think it says a lot about The Motte that this comment--which is obviously leaning super hard into deliberately poking emotional buttons--was met with multiple dispassionate responses that take the position offered seriously. And I think they were correct to do so!
This comment showed up on my "volunteer mod" list, and I seriously considered both the "warning" and "AAQC" options. Went with "neutral."
This kind of comment with deliberate shock value in an obvious attempt to cause an extreme emotional response to create a sort of "ad absurdum" proof (i.e. implicit message being "according to your stated values, this EXTREME and OBSCENE thing happening to YOU would be allowed; your stated values don't look so good now, does it?") happens every once in a while in this forum, and though I've often found it amusing, I've also started to find it frustrating. Because when they inevitably get the types of responses that this one got, (i.e. "Yes, that'd be fully allowed. And?"), there never seems to be any follow-up to continue the conversation. And that's a shame, because I feel like there's potentially an interesting conversation here. It's legit fascinating to me that some pixels arranged to look like a photograph of oneself doing XYZ could be offensive to one based on how offensive XYZ are, and not only that, that it's so obviously offensive that it's used as an "ad absurdum" endpoint to use as a "gotcha" against someone's values.
I am old, married, and no longer give a fuck. But I would care if it were my daughter.
I appreciate you taking the time to vividly describe the hypothetical experience. I know that your intent was to make me feel disturbed or disgusted, but that's rather the point of this discussion: it's about exploring our intuitions on the subject.
Well, if you made deepfake gay rape porn featuring me (and distributed it), I would consider that a legitimate grievance, but not one justifying extralegal violence. If we were friends and you made deepfake porn of me for personal use only, I would probably stop being your friend, but not otherwise hold much of a grudge. If you made deepfake porn of any description about a female relative, I would consider it to justify extralegal violence regardless of intent to distribute.
I wager myself much closer to the median American than you are on this issue.
Can't speak for them directly, but personally the daughter would be relevant because I would care significantly more if it were my daughter than me.
To answer further questions, if you sent it directly to my friends + family I would be very unhappy (though that's rather the whole point of the anonymity concerns).
If posted online with my name (so it would show up on Google etc, though once again rather the point of anonymity concerns) I'd be moderately unhappy since that means there's a decent chance friends, family or potential employers would stumble upon it.
Posted without my identifying info, I'd be a bit wigged out if people I knew personally happened to stumble upon it but its existence on the net to be used by strangers would not bother me much.
If kept on your hard drive for you and maybe a horny PC repair guy to find, I wouldn't mind at all, assuming no personally identifying info is attached so the horny PC repair guy can't do scenarios 1 or 2.
If it were my child (thinking on it I would mind quite a bit if it were my son too), I would be distressed to a greater degree about all the above scenarios.
Hope that helps clear it up: the degree of separation is being used because that version is perceived as worse.
I find it interesting that Americans in general tend to often fall back to interrogating themselves with "what would I wish for if it happened to me?" when resolving questions of crime and punishment and ethical dilemmas. In terms of my own cultural programming, this seems wrong and immoral, and somewhere in a class with determining ethical conduct in retail by asking "what would I do in this store if I were absolutely sure that nobody could punish me for it?", which I guess you could simply call sociopathy. (In fact, to me, to proactively give up some of what you would and could claim for yourself seems like the essence of prosocial behaviour.) I can't pinpoint at what point and how it was conveyed, but if this is a European-American difference, it may explain why American prison terms and conditions are so notoriously draconian in comparison to ours.
I imagine you'd protest the comparison between shoplifting/abusing the staff and visiting punishment upon those who wronged you, but then I'd wonder what is the salient difference. If it's that your victimhood in the latter case gives you moral license to take more of the pie, well, you've now justified victimhood olympics (another very American phenomenon); if it's the detail that the case you are imagining involves your daughter and rules against selfishness do not apply if you are acting to defend someone else, you've justified a whole array of /r/talesfromretail stories involving motherly Karens.
Thanks for the interesting response.
The fact that this is even a debate kind of sickens me. Only in this fallen modern era would it even be a question whether the discomfort of primarily women, and their investment in the "sanctity" of their image (obviously highly debatable given how many of them choose to present themselves), takes precedence over the basic rights to freedom of expression, freedom to fantasize, freedom to create, etc. That discomfort has been elevated in the current era not according to any reason but only according to an emotional hysteria, one that has amplified itself almost into a religious fervor, driven primarily by the same women who benefit from it (or at least feel compelled to use it to pathologically seek power they can't even healthily enjoy, as an expression of sadistic bitterness over the modern widespread weakness and hence unattractiveness of the opposite sex) and by their male followers ("simps"), who have often become so emasculated that they no longer conceive of any path to earning the favor of such women other than slavishly obeying them for temporary headpats.
The answer is obviously no. There is nothing about deepfaking that is categorically different from writing stories about someone, drawing/painting them (even before the recent AI explosion, some people could make remarkably photorealistic artwork), or Photoshopping their head onto someone else's body (which has been possible for decades; it's worth noting that modern deepfakes are essentially just as primitive, only in motion, as the most-used method at the moment involves finding a video of someone with a body that is hopefully a somewhat plausible match for the desired face and then inserting that face onto it). And hopefully there is still general agreement (not saying this to build consensus, just expressing the only general consensus of the modern era I have ever been aware of) that anyone who wants to use the state monopoly on force against people who do these things, because the subjects might be made *uncomfortable*, is a totalitarian lunatic. (I remember that before JK Rowling was a villain for opposing alternative sex lifestyle roleplaying (not a sneer, just my attempt to more accurately describe the phenomenon of "transsexualism"), she was a villain to some for vociferously opposing Harry Potter fanfiction - back when the argument that Harry Potter sex stories would violate the rights of the movies' young actors by imposing their images on scenarios they didn't consent to was widely mocked.)
The whole "deepfaking" controversy is just using slightly new technology to launder into the public discourse the same old big brother bullshit that's been rightfully rejected many times before, except they may yet succeed this time (with "yet" being relative, as it's actually already illegal in a few states) because the rational faculties of their targets have been so broadly degraded and their discourse so thoroughly poisoned with mindless, kneejerk reactionary (which I've, ironically enough, almost always found those who are the most anti-reactionary in the political sense to be the most in the general sense) feminine emotionalism, safetyism, and exaggerated negative utilitarianism (so long as it's in favor of protecting the right demographics, the most sacred demographics, of course, as obviously this issue would not be one at all were men the primary subjects of discomfort here).
It is also quite ironic that it is mostly the side pretending to be highly opposed to the carceral state, or seeking a severe contraction of it, that is pushing this. This is just more evidence to me of what has seemed obvious from the beginning: that these people are not against "crime and punishment" and "law and order" crackdowns harder than any they've ever bemoaned, just against the punishment of particular crimes they associate with their favored client demographics (particularly/only when committed by members of those demographics who are also in good ideological standing, but they can't quite say that so explicitly yet), and the opposite for their disfavored ones. In their ideal world, Kevin gets 20 years of hard time for putting Pokimane's face on Viper Vixxen or whoever (especially if he seems like a "chud"; maybe less if he has a history of serving the regime loyally, in which case he may get to lessen his penalty via a routine of humiliation and self-criticism), but Tyrone gets therapy and cookies for stabbing him to death. (And does anyone want to bet how much they'd push for women to get punished for deepfaking Bieber or the BTS boys? Of course most men are still not invested enough in their egos to be incapable of separating fiction from reality, so they're unlikely to care anyway.)
If they were really against the worst excesses of modern surveillance authoritarianism as they claim, the last thing they would do is try to invent a fifth horseman of the infocalypse to give glowies and spooks yet another reason to treat any bit flowing through any digital system as a possible if not likely piece of illegal contraband on the run, "justifying" even more invasions of technological/digital freedom and privacy. But this is again because they're not actually against hammers as they claim, only against them being used on certain nails. This is after all the side that invented the "No bad tactics, only bad targets" mantra.
I think part of this is because so many have forgotten what rights are, or at least what they should be in practice, that is, how they should function. They have fallen into the trap of thinking that, because "rights = good", rights can only protect behavior that is itself 100% good, squeaky clean, Reddit-certified Wholesome™ Mr. Rogers behavior (or at least what they see as that through their ideological lens), or at least not what they see as its opposite - which is how nonsense like "Hate speech isn't free speech." spreads, even though such a statement is blatantly contradictory on its most basic semantic level. In actuality, as a loose heuristic, rights are more appropriately understood as restrictions on power (as they are formulated in the US Bill of Rights, for example).
Rights are rules where giving authorities the power to violate them would likely make those authorities shittier and more prone to causing problems/hurting more people than whatever problems they could solve by violating them. Rights are when giving authorities the right to search anyone's asshole at any time is worse than whatever people are smuggling in their assholes, thus we say "no searching assholes except in very strict, limited, and justified circumstances", thus "people have a general right not to have their assholes probed for contraband." This isn't based on any determination that most of what anyone is smuggling inside their asshole is any good; indeed most people who have to smuggle things in their asshole probably aren't smuggling much nice (depending on your stance on drug use anyway, though I'd say at least fentanyl which is probably a very common asshole passenger nowadays is close to objectively evil).
So to tie it back to deepfaking, the choice comes down to preventing women from occasionally feeling uncomfortable about fiction about themselves vs. trying to protect what's left of the chastity of all of our digital, informational, expressional, and private assholes. Again, I think only in modern femworld would this even be considered a choice worth pondering for more than a second.
Women's feelings are not god. They don't even warrant being taken that seriously in many cases (to be fair, the same is also true of men, though not as often, I don't think). That's really all that needs to be said about it. Sorry you're uncomfortable, ladies, but that doesn't mean that the boot increasingly stamping on the human face, for what seems like it might be forever, needs to be at your beck and call. (Of course me or anyone else saying this will accomplish nothing, at least in the short term, but the decay of society cannot be reversed until these ideas are fully absorbed by modern men.)
People find you attractive, including those whose attraction you might not reciprocate, which you know because even if your content is "SFW" you've built your whole career on it (and you've never objected to it when they were giving you money, which is also part of the issue here, as this style of deepfakes has been around for years but now many of these creators have Fanslys etc. and are making money off of them), among other things (like at least 70% of the reasonably attractive ones not infrequently walking around in public half naked nowadays). Get over it.
You remember incorrectly. Rowling never opposed Harry Potter fan fiction, and in fact is one of the more pro-fan fiction authors out there. She did object to porn fics being available on sites predominantly visited by young fans, but afaik never took any kind of legal action.
As for your rant about deepfakes, I don't think they should be illegal, per se (I think specifically using them to perpetrate fraud or damage someone's reputation is, at the very least, cause for a civil action), but I also think people are entitled to demand sites remove deepfakes of themselves. Like, if you want to create your own personal wank material with Emma Watson, or your neighbor's daughter, keep it to yourself. What's in your head (or on your hard drive) is nobody else's business. Putting it in public is like telling your neighbor's daughter that you jack off thinking about her. If you make it public, you make it her business (and her father's, to put it in terms that you consider relevant).
So you think it should be illegal if those sites don't?
Nah. The right to share the products of one's fantasies, expressions, creativity, etc. is inherent in all of the associated rights.
Sure but it's my choice if I choose to make it their business in a particular context.
Maybe it becomes their business but that doesn't imply any obligation for the state to do anything on their behalf.
Anyway, I swear it was Rowling, but maybe it was Meyer or some other author in a similar position, or maybe it was just erotic fanfiction they were opposed to. (Actually, I think maybe the controversy was that Rowling disapproved and tried to take down fics with even small amounts of sexually suggestive content. I don't know. All I know is at least some fics were targeted by someone. In any case the analogy stands even if the details aren't correct.)
Edit: I think I'm right about Rowling. Maybe she changed her mind over time but there's definitely a history of her targeting fan content:
https://old.reddit.com/r/harrypotter/comments/8nphgj/jk_rowling_vs_the_internet_a_history_of_harry/
This is not much different from the people we're discussing, who are fully supportive of AI-based image techniques, but only on their terms. So I think she's a good analogy here, especially since I again do recall some of the discourse being about how it violates the actors' image rights, since everyone inevitably associates their appearances with the characters now.
Yes, it does, because the state has made it their business to prevent them from doing anything to protect themselves.
The morally correct response to someone telling your sixteen-year-old daughter that he enjoys thinking about her while jerking off is 'if you ever speak to her again I will kill you'. The state has decided to ban this option, and so it is incumbent on the state to imprison (or otherwise deal with) people whose behavior justifies that recourse. The debate is about where to draw the line, not about whether the state should be involved.
Maybe if you're a violent psycho who is a ticking timebomb waiting to go off, though in that case I'd rather the state move on you.
(By the way, if you feel this strongly about people not jacking off to your (hypothetical?) daughter, then I sure hope you're equally as committed to keeping her completely modest in garb and demeanor. The moment you so much as let her walk around in front of other males in tight leggings (assuming she's attractive), all bets are off, whether they communicate that to you or not, if you want to try to appeal to some more traditional code of behavior. Many such daughters being jacked off to with their fathers unable to do anything about it other than seethe.)
Yeah, no. By this logic, it is incumbent on the state to imprison or otherwise deal with people chewing loudly because it has prevented me from simply murdering them. (You might say that chewing loudly could never possibly justify murder, but perhaps if you had dinner with some of my family members you might disagree.) That is nothing more than naked totalitarianism. (I don't actually support murdering or imprisoning people for chewing loudly of course. I am just pointing out that your argument is contingent on the notion that a particular behavior deserves a particular degree of punishment in the first place, which is obviously highly debatable. You're trying to launder in this premise as automatic.)
Yes, this applies so long as anything at all is illegal (like murder, which I'm pretty sure has been prohibited in some form in every society). It's also a meaningless statement.
Yes, I am aware that men think about women while masturbating, and that teenaged girls are attractive to the opposite sex.
Informing a woman or girl you've masturbated while thinking about her is creepy behavior* that will foreseeably be received as a threat, and there's no possible reason to engage in it. Behaving in a sexually threatening manner towards women and girls justifies lethal violence from the men responsible for them. It's been that way since time immemorial, and the only exception has been if they're just whores who forfeited their right to male protection (which was not the topic up for discussion). Things which are threatening are not the same as things which are merely annoying. Women have a right not to hear implied rape threats, and their husbands and fathers have a right to police the things said to them.
*unless you’re in a relationship where she’s into that, I suppose, but I’m not talking about Reddit sex positive weirdos here.
That depends a decent amount on the context.
Maybe, if being "responsible" for them also means they have complete and absolute just and proper property rights and masculine dominion over them (which is also how it's been "since time immemorial"). Otherwise they are merely simping to some degree. The natural price of masculinity taking responsibility for the feminine is the feminine's complete and absolute obedience in return. So if you are not advocating for this then you are simply advancing cuckoldry under the guise of chivalry (which I suspect because you're framing the issue here as an injury to the female as opposed to her owner).
That's like at least 97% of modern women/girls over the age of 13 or so though, so I kind of think it's implicitly up for discussion. The actual society we live in is not the one you're describing.
If we're talking ideal ideal world (obviously my opinion is influenced by my ideological presumptions here, though I think it's a lot more traditional), men have a right to not hear implied threats against their exclusive use of their property, and women have very few to no rights. Again, the injury is to the man (hence why "rape" evolved as a synonym for "steal", because it's stealing another man's property). But even then I think in most cases going to the absolute extreme over someone saying they find your property attractive is a little much. If somebody said they liked my car, I wouldn't automatically, in all circumstances, threaten them like they were threatening to steal it.
The fact that you think relationships where the girl finds her partner sexually attractive enough to enjoy the idea of him wanting to masturbate to her are the domain of "Reddit sex positive weirdos" says a lot here.
My entire point is that we are not living in the kind of society you’re imagining, we’re living in a society where the state takes on the function of protecting women from sexual violence and predation. And the state, if it’s going to take on that function, has the responsibility to actually do that. Which in turn means that it needs to protect the privacy of the nude bodies of non-sex workers(and no, wearing a bikini does not make you a sex worker, and I say that as someone who does not approve of bikinis).
Yes. Just like if you publish a libelous article about me I can demand the site take it down. They can refuse, of course, and then I can sue, but the end result is, theoretically, men with guns enforcing the law, yes.
Okay. If you tell someone that you enjoy jerking off while thinking about his underage daughter, no, he can't have you arrested for it.
People who expect no consequences for anything they say or do as long as it isn't actually illegal tend not to like the consequences and then suddenly become very interested in legal protections.
You're not. You said she was opposed to fan fiction. She's not. She was opposed to minors being exposed to porn. You might disagree with her wanting to impose conditions on writing fan fiction, but that doesn't make her opposed to fan fiction (and in practice, she's never done much to enforce her terms except on large commercial sites).
Technically, fan fiction is still at best quasi-legal, and authors who are actually anti-fan-fiction can and do force sites to remove fan fiction of their works entirely. Rowling could, if she wanted to, go after the many sites that do host sexually explicit HP fan fiction, but she hasn't.
It's quite different.
It's not.
It's not libelous if it's not presented as true. If I write a fictional story about you raping an eight year old that is explicitly presented as fictional, then as much as that may disturb you, you can't do shit.
This is random pseudo-macho posturing that's irrelevant to the argument. But yeah in any case I will definitely take my chances with your average weak modern Reddit heckin' dad versus the state. (I mean even if I did deepfake someone's daughter, which I wouldn't at all especially now since I think current deepfakes are primitive and cringe, I'm not exactly going to go telling them about it, since, yes, legal or not that's pretty shitty or at least dumb etiquette, and if I shared it online I'd do so anonymously, but still. I've masturbated to a lot of people's daughters and I'm pretty sure none of them know anything about it except maybe the dads of the girls I've openly dated, though even that's not many because I have a weakness for fatherless girls.)
...Is it supposed to be some sort of flaw in my argument or "gotcha" that I am very interested in legal protections... for that which I think should be legal? That's kind of the point, yes.
I mean, as the link shows you, many actual HP fanfiction authors disagree. Many people also disagree that Dianne Feinstein is opposed to guns (after all, you can disagree with her wanting to impose conditions on them, but that doesn't make her opposed to them, right?). Unless you can explain exactly what's "quite different" here, and thus wrong about my analogy, I think the whole debate is a pointless back-and-forth of semantic vagueness.
Again, I'm talking explicitly about the "You can't write a sex story about Hermione, because when people think of Hermione they think of Emma Watson's image, and Emma Watson didn't consent to have her image in your sex fantasies." argument I've seen about erotic HP fanfiction. (I'm not saying Rowling made this exact argument directly herself. She was just a convenient segue.)
This does not appear to be correct:
"For example, in 2009, in the “Red Hat Club” case, the plaintiff was awarded $100,000 in damages by a Georgia court for a fictional portrayal modeled on her. The “original” claimed that her fictional counterpart, falsely depicted in the bestselling novel as a sexually promiscuous alcoholic who drank on the job, defamed her. From a libel defense perspective, this drawn-from-life portrayal failed, in part, because the author included personal characteristics that made the plaintiff recognizable, and mixed them with other traits that were false and defamatory, but, still believable."
If you can successfully sue because you were portrayed as a slutty drunk in a work of fiction based on you, I suspect you may be able to sue for being portrayed as a child rapist.
It's not a slam dunk, and often fails, depending on how closely the fictional version is recognizable to the original, but it does appear that you can indeed "do shit."
Fair. I'd be interested in seeing what some sort of disclaimer specifically targeting this achieves though, something like "X is quite obviously not a child rapist. There is no evidence that X is actually a child rapist nor is it believable based on all known information about them that they could be one." etc.
I mean, that's one case. There's also this meme about Glenn Beck, and the guy who registered a domain for it won his case.
Plus, I don't think this can apply to deepfakes. If I write a fictional story about you doing X, then perhaps that can come with some implication that it's some veiled satire suggesting you might actually do it. But if I make a deepfake, I mean it's in the name. It's fake. It is very clearly not you doing it.
Sure, it is entirely unclear how this would work with deepfakes, if at all. But the deepfake is presumably recognizable as the subject (as that is the entire point), so you could perhaps get away with deepfaking them having sex, as this is something they likely do. If you deepfaked them onto child pornography, that might trigger something similar. That is highly speculative, though. I suspect different courts and jurisdictions will go different ways.
Link to quoted article
A different article says of the same lawsuit:
Indeed, hence why it is not a slam dunk, but it is something.
I could probably sue on the basis that it causes me reputational harm, though my understanding of the law is that I'd have a hard time establishing actual damages.
People are coming up with all kinds of other scenarios, about Photoshopping a dick into someone's mouth or creating a deepfake of someone being raped and tortured, and not all of those things are illegal. I don't necessarily agree that none of them should be.
I'm in favor of enabling subjects of deepfakes to issue takedown demands, though enforcement will be very difficult in practice.
It's not macho posturing - I said you should keep your sexual fantasies about your neighbor's underage daughter to yourself and not share them with her or her father, and your response is "Nuh uh, it's not illegal!" I mean, sure, not everything that is wrong and unethical is illegal, and you certainly can go around telling everyone about your sexual fantasies. Reverting to "It's not illegal" when discussing ethics is a dodge.
No, they think she was trying to impose conditions they didn't like. Also, that link is four years old and references events going back much further, in the early days of online fan fiction.
The vast amount of HP fan fiction that Rowling has tacitly (and in some cases, explicitly) approved disproves your entire argument.
I actually doubt that many people disagree with that. But if you said "Dianne Feinstein wants all guns to be illegal for everyone, period," that's a claim that may or may not be true (who knows what she really believes?), but it's not supported by any actual words or policies from her. If you said "She wants heavy restrictions on guns," that's obviously true. But "JK Rowling opposes certain kinds of fan fiction in certain contexts" is not the same as "JK Rowling is anti-fan fiction." It's not even a good parallel with your Feinstein analogy.
That's more an ethical argument than a legal one. Emma Watson would not have legal standing to demand that erotic fiction about Hermione (or about Emma Watson - RPF exists) be removed. But Rowling could demand that the former (though not the latter) be taken down, on the basis that fan fiction is, as I said, at least currently considered an IP violation, though this hasn't really been tested in court.
How would it cause you reputational harm? If anything you'd be the victim of malicious fiction, and victimhood is a reputational benefit nowadays.
Well I disagree. Fiction is fiction and thus automatically possesses a rightful presumption of being implicitly harmless (as it is, quite literally, unlike most important harms, not tangibly real) absent a more pressing justification than someone's discomfort over their depiction, whether it's because it's extra realistic looking (but again, still not actually tangibly real) or not.
I mean this is of course speaking in terms of abstract ideal legal policy. Strategically speaking, if you want to make deepfakes illegal in any sense and force those who want to see fake Pokimane or fake Emma Watson getting railed into the depths of the darknet where stuff like child porn also circulates, thus strengthening the entire enterprise of private and anonymous content contribution opposing the unjust power of the modern digital hegemony, then that's probably a win for people like me.
Uhh no. My response actually was:
"Maybe it becomes their business" doesn't in any sense imply some overall objection to the principle of generally keeping such things to yourself on etiquette/behavioral grounds, and simply advancing the viewpoint that a certain behavior is not a concern of any formal power is not some blanket approval of it as ideal behavior in all contexts (many such cases of people unfortunately believing the opposite nowadays though). "That doesn't imply any obligation for the state to do anything on their behalf" is in fact some of the weakest commentary on a behavior you can give, other than again through the flawed modern lens that so frequently crops up where if you're not advocating calling the SWAT team in on something then you must be its biggest cheerleader or at least trying to excuse it.
But we're not just discussing only ethics, unless the only compulsion you're advocating for being behind those takedown requests is that it'd be the right thing to do.
And the Reddit admins were only trying to impose conditions on your subreddit that you didn't like.
And? My whole original point is that JKR was criticized for opposing fanfiction primarily in the past.
(Anyway I'm just going to ignore the rest of the stuff about whether it's reasonable to say that JKR opposes or ever opposed fanfiction or not since it's completely tangential and I'm not in an autistic enough mood today (which is not to say I never am) to dive into this conversation spiraling into dozens of tendrils of barely relevant side disputes (and I'm not saying you were the only one engaging in this up until this point by any means).)
Yes. And it's a bad one in my view. And it's similarly a bad one for Emma Watson/Hermione deepfakes too.
And I also oppose this, though debating the validity of IP law is again mostly a whole other subject.
Again, to me, the central dispute is whether openly, explicitly fictional content (again, if it's lying about being fictional, then that gets into the realm of fraud which is a whole other matter) should be prohibited because it makes its subjects uncomfortable or feel "violated" or however it's formulated (as I'm not seeing any other real justifications being advanced). I say no.
I'm amazed at how succinctly this delineates ethical boundaries that appear basically airtight to my intuition (my intuition being where the problem lay in the first place). I'd go as far as to say that this essentially resolves the topic for me.
Yes, I think when it's distributed/public there are interesting ethical questions, but inside people's heads it seems entirely their business, with no moral issues whatsoever. (The only hint of a moral issue is that it increases the chances of it making it outside of their heads.)
This has a wider application than just porn. If someone makes a deepfake of you doing something illegal, are you going to be so gung-ho about the freedom to fantasise and create?
Most legal systems, including the US's, draw a distinction between libel/slander and protected free speech. If someone makes libelous videos of me and shares them around, then obviously I would have objections and would have legal recourse against that person.
I think at least some instances of deepfake porn would fall under libel laws, but not all of them.
Sure, as long as they're not trying to represent it as real (and if they do, the crime there is various forms of fraud, misrepresentation, false reporting, etc., not anything to do with fantasizing or creating, as it'd be essentially the same crime as if they had just claimed I did those things without the added fake evidence, which again supports my claim that little if anything you can do with deepfakes is categorically different from anything you could do before them). They can fantasize all they want about me being a bank robber or serial killer. I'll be flattered that they think me specifically being the criminal in their fantasies makes the scenario cooler.
I can maybe accept most of what you're saying specifically as it answers the question: 'should these images be illegal?' Perhaps it is a violation of the principle of free speech to make things like this illegal, or a slippery privacy slope, or a needless concession to emotional people, etc. That said, whether or not it should be made illegal, I expect that doing this kind of thing will become legally fraught if not fully illegal in the near future. But I digress.
Many others in the thread are also focusing specifically on the legality dimension, which I regret and for which I bear responsibility. I was generally more interested in talking about what is moral/ethical, and less so in talking about what is and should be legal, even though I indeed asked as much in the body of my post. Even if these things are not illegal, the morality of them is still hugely important, as it determines who gets 'cancelled' etc.
And to that end, in figuring out what is ethical or moral, I think feelings do matter. For example, I think it would be immoral in many situations to do something I knew would make a person "feel bad" just because doing it gave me sexual gratification, legality of the action notwithstanding. If I were trying to design the fairest US government, I might not make deepfake porn illegal. But if I were trying to be as morally/ethically upstanding a person as I could be, there are plenty of things I shouldn't do that are still legal.
I'm of the relatively firm belief that it isn't immoral to fantasize about having sex with someone, even if they haven't consented to your having such a fantasy. I'm not sure what I think when it comes to making highly realistic porn of them. If you were Superman and had X-ray vision, would it be unethical or immoral to look into the women's locker room? If not, why does everyone seem to think it would be? If so, what's the difference between that and having a vivid, realistic imagination and using it for sexual purposes in the same way?
Another commenter prompted me to believe that a significant amount of how unethical it is lies in whether or not the deepfaked person knows about the porn being made of them, because knowing that it exists is what inflicts psychological harm on them. I think I agree with this. However, the women in the shower into which you're peeping might never know that you've peeped at them, so is it not wrong to be a peeping tom (as long as you never get caught)? Teen coming-of-age movies from the 80s didn't seem to think so (the shift in attitudes between then and now might be pertinent to the discussion). Regardless, I do currently think that, e.g., spying on the women's locker room would be unethical, and I think most people today would agree that it's probably wrong to do such a thing. This is the angle I'm really trying to disentangle here, the moral and ethical angle, and less so the legal one.
OP didn't say "feelings don't matter". They said "women's feelings aren't God" i.e. are not the sole, overriding consideration in ethical disputes.
Case in point: some women apparently dislike being "objectified". I don't really care tbh. What goes on in my skull is my business.
Because it is a violation of actual privacy: the actual woman is in that room, with a reasonable expectation of privacy, and you are peeking in. Even if it weren't sexual, there are all sorts of other concerns with such snooping (e.g., can they steal your stuff now that they've seen your locker code?).
With deepfakes, I guess it depends on how much verisimilitude something can have before you think it violates your "actual" privacy. If I have a deepfake of Angelina Jolie that, for whatever reason, has serious flaws and inaccuracies, have I violated her privacy in the same way? That isn't the real Jolie; it's a virtual image that isn't even perfectly accurate.
What if it was trained on topless images of Angelina and perfectly matched her in her physical prime? I think an argument could be made that she removed her own privacy here, in a way she can't expect to get back (we can't unsee her body either way).
I don't think we have an easy rule. I also don't know that this can/should be grounded in privacy. Maybe defamation concerns would be more viable?
Besides the reason already given above? It's more reasonable to believe you will never be caught having private files on your computer than peeking into someone's bedroom. Simply not being physically there reduces the risk of detection and thus of harm to the person.
This is the main thing I am trying to get at with the locker room/fantasizing examples. Current AI can inpaint nudity onto clothed pictures of people without necessarily producing serious flaws or inaccuracies. (Not to say it always succeeds at this, just that it can, reasonably often.) And training the AI on the actual person's breasts isn't required for the result to be highly similar to what they actually look like topless, at least for some women, since some people's breasts are visually similar to other people's breasts. Thus a person who has never consented to having topless photos of themselves appear anywhere on the internet can have topless images of them created to what is indeed a very high degree of verisimilitude to their actual naked form, using e.g. pornstars' breasts as training data.
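(For anyone unfamiliar with the mechanics being described, here is a minimal sketch of how diffusion-based inpainting works in general, using the open-source diffusers library. The model ID, file names, and prompt are illustrative placeholders, not anything connected to the incident under discussion; the point is only that a masked region of a photo can be regenerated to match an arbitrary text description.)

```python
# A minimal sketch of diffusion-based inpainting with Hugging Face's
# diffusers library. Model ID, file paths, and prompt are illustrative
# placeholders; this demonstrates the general mechanism only.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",  # a publicly released inpainting model
    torch_dtype=torch.float16,
).to("cuda")

source = Image.open("photo.png").convert("RGB").resize((512, 512))
# White pixels in the mask mark the region the model will repaint;
# black pixels are preserved from the source image.
mask = Image.open("mask.png").convert("RGB").resize((512, 512))

result = pipe(
    prompt="a red wool sweater",  # the model fills the masked region to match this text
    image=source,
    mask_image=mask,
).images[0]
result.save("inpainted.png")
```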
Technically, I suppose, the person operating the AI can't know whether the subject has, e.g., a mole on the chest. So maybe, because some technical uncertainty remains, i.e. without being able to look at a real topless image of the subject and thereby verify that the AI's inpainted nudity is highly similar, there is still some sense in which privacy is maintained? Because even if the inpainted nudity actually is extremely similar to their topless form, this isn't known to the person creating or viewing the deepfake?
Regardless, overall, the pertinent fact is that the current level of technology makes it possible to get outputs, at least somewhat often, that the depicted person themselves could or would mistake for real nude photos of themselves. This seems to me functionally very similar if not identical to looking at someone changing/naked without their consent or knowledge. You're right that it doesn't imply other security concerns the way an intruder present in a changing room would, but I'm not sure that's what's actually wrong with/disliked about peeping toms; I feel like a significant amount of the dislike of the idea of someone seeing you changing is the plain fact that they know what you look like naked (and maybe also the knowledge or likelihood that they are fantasizing about you sexually). That is, most people would be about as opposed to a person using X-ray glasses, or more realistically a hole in the wall, to look inside their locker room while they changed as they would be to someone hanging from the rafters. I can't know for certain, though, at least personally, because to my knowledge I've never been the victim of any such situation.
Well, as far as legality goes, it seems copyright is the main way people take down unwanted deepfake porn of themselves. Regardless, though, I'm less interested in the legality and more in what should or shouldn't be generally considered acceptable ethically or morally speaking, for which privacy, or violations thereof, and perhaps other things do seem like a relevant concern.
Porn stars not only self-select based on their agility in smoothly changing positions in front of cameras--incidentally, a skill shared with politicians--but also for how good they look naked. If an AI image generator is trained on the naked bodies of porn stars, its AI-completed naked version of me will look amazingly better than I actually do.
Women's breasts, in particular, come in a variety of shapes, and they are frequently not symmetric. Older women's breasts tend to be flat--think more like those pictures in the old National Geographic depicting women in some far-away hunter-gatherer tribe. The nipples and areolae come in various shapes and sizes, and change with temperature. Some have inverted nipples. Practically all of this variability is hidden by the kinds of clothes women wear, especially if they are into padded bras.
The distribution of body fat also varies significantly for overweight women, and this is also mostly hidden or distorted by clothes.
I'm aware of this. The point is that not everyone with good-looking (pornstar-like, if you will) breasts decides to become a pornstar. Thus, these people are vulnerable to having very realistic versions of their breasts recreated from pornstar data, despite never putting images of their actual breasts onto the internet. Additionally, there's plenty of data on non-pornstar-like breasts out there to train on. The point is not that AI will always generate topless versions of people that closely match what their breasts actually look like; it's that it can with at least some meaningful frequency.
Making deepfake porn of someone for noncommercial purposes should be fair use. It's clearly transformative, and it doesn't have any effect on the potential market for the work, unless you think the copyright owner will sell their own picture for use in porn and this makes it harder to do so.
Maybe true, but I guarantee you that the vast majority of people paying money to host websites that distribute deepfakes are doing so for commercial purposes. E.g., the streamer in question had accessed a website which required him to pay 15 dollars to use.
One extracts real, factual information. One does not. Your actual naked body is your information. How I imagine your naked body to look, or how I conceive of it by using glorified Photoshop+ to put your head on another naked person, is my information.
What if the AI is so good at being Photoshop+ that, using a picture of what you look like clothed, it is able to create a nude composed of the exact same pixels/information that would be present in an actual photograph you took of yourself in the same pose, except naked? In that case I actually am accessing the information that you call 'your' information, which is to say, the information you agree it is wrong for me to access.
To pre-empt any disputes about how possible this is, although I'm sure it actually is possible, let's scale the AI's capability back to a level of detail just good enough to trick even the depicted subject into thinking it was a real nude of themselves. (This is where the technology actually is, right now. Boobs aren't exactly the most difficult thing to draw, especially at 512x512 resolution.) In this case, even if it's not the exact same information, it seems to me to be functionally the same information, for all intents and purposes. So is it okay for me to use an AI to access what is, for all intents and purposes, the same as information which is otherwise immoral for me to access?
Actually, I'm pretty sure it's completely impossible, given how many bits of entropy there likely are in the variations of how naked bodies can look: things like moles, random scars, etc. (and it's even possible to get new marks on your body over time, like, again, scars, which means that even if there really were some perfect correlation between one's facial appearance and the exact configuration of moles, birthmarks, etc. on one's body, that still wouldn't account for one's lived experiences (to think I've found an appropriate occasion for that phrase)), plus variation in genitalia appearance. There's also weight gain and loss, which happens constantly and is a huge factor too. (Of course this would also depend on how much of a person's body they've shown off in information accessible to the AI, and thus how much is left for it to guess.)
Even this seems unlikely, again given the amount of (changing) variation involved. The moment one mark isn't there, or the genitals are off in any way (and there's a lot of detail in both penises and vulvas), or the pubic hair isn't how the person keeps it (or how they were keeping it in a plausible timeframe for the photo), etc., the whole thing is done. Just because it's AI doesn't mean it can do magic. It can't brute-force cryptographic keys any better than any dumb algorithm, and again, I believe the informational entropy involved in both formulations of your scenario is likely in that range.
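(To put rough numbers on that intuition, here's a back-of-the-envelope sketch; the feature count is an invented assumption for illustration, not a measured figure.)

```python
# Back-of-the-envelope entropy estimate. Every number here is an
# assumption chosen for illustration, not a measured figure.
# Suppose the unseen details of a body (moles, scars, asymmetries,
# grooming, current weight, etc.) amount to ~40 independent yes/no
# features a generator would have to guess correctly all at once.
hidden_bits = 40
configurations = 2 ** hidden_bits  # ~1.1e12 equally likely possibilities
print(f"{configurations:.2e} configurations")

# At one candidate image per second, the expected time to hit the
# exact configuration by blind guessing (about half the space):
seconds = configurations / 2
years = seconds / (60 * 60 * 24 * 365)
print(f"~{years:,.0f} years of guessing at 1 image/second")  # ~17,000 years
```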
In any case, I would still say that even if you did hit on a perfect prediction/match (which as I will explain, is still not actually perfect in practice), it is still not a matter of you accessing my information. Like let's say we both have the password "hunter2" on this site. Are we then accessing each other's private information (since, after all, what's more private than a password) every time we log in? No, because the context is different. In one context, "hunter2" is my information. In another context, "hunter2" is yours. The only way you could be accessing my information is if you tried to use the "hunter2" information in my context, that is, tried to use it to log into my account.
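(The context point maps neatly onto how password storage actually works, incidentally. A small sketch, with made-up account labels, showing that the identical string "hunter2" produces entirely different stored records for two accounts because each record is derived with its own random salt:)

```python
# Two users share the literal password "hunter2", but each account's
# stored record is derived with its own random salt (its "context"),
# so the records are completely different. Knowing one record tells
# you nothing usable about the other account.
import hashlib
import os

def make_record(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # per-account context
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

salt_mine, hash_mine = make_record("hunter2")    # my account
salt_yours, hash_yours = make_record("hunter2")  # your account

print(hash_mine == hash_yours)  # False: same string, different contexts
```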
Along those lines, I'd say the only context in which the information you've generated (hence your information) of a perfect prediction of the appearance of my naked body can become equivalent to the actual information of the actual appearance of my naked body is if you can see me naked and confirm that it's a perfect prediction. (After all, information and uncertainty are inherently interlinked. It's not actually a perfect prediction, at least from the perspective of any observer or process we care about (that is, I'm presuming omniscient oracles are not in play in our social relations here), until you can confirm that it is. Going back to the password analogy: if you're brute-forcing a password, it's only meaningfully "right" at the moment you confirm it and successfully log in with it, not at the moment the equivalent string is first generated and enters your computer's memory.)
Except, in that case... you've already necessarily seen the actual 100% real thing, so why do I care about the perfect prediction of it anymore? (I mean if you've only ever actually seen it in person but have a record of the prediction then that may change things, and I could address that, but this is all getting into kind of silly technicalities based on an implausible scenario anyway so I'll end it at this extension of it.) If I type my password in front of you and don't do a good enough job of hiding it, then I obviously don't have to worry about you bruteforcing it anymore (nor do I care extra above how much I care about you seeing the password that you may have bruteforced it prior, from the perspective of it being private) because the worst end result of that has already necessarily happened in the process of confirmation.
I suppose you're right about this.
What if you're a very accomplished painter and you're able to tell from someone's clothed body exactly how to paint them such that it matches their body when nude?
Maybe that should have the same moral or ethical implications.
No, I think OP (and I) are treating tangible possession as a key factor. The information may be functionally identical, but it is not actually the same.
In the version of the hypothetical where the AI actually can exactly recreate the way a person would look naked in a certain pose, using only a clothed photo of them in that pose as reference, we can agree that the information is 'actually' the same, though, right? One pixel at location x,y, with color #f0d190 is 'actually the same' as another pixel at the same location x,y, with color #f0d190, regardless of whether or not that pixel exists there because it was reverse-engineered by AI, or normally-engineered to be there as a result of being captured via digital photo.
Even granting that, they are two separate copies of that data. The ability to possess one does not make it ethically or legally justifiable to steal the other.
No. In that hypothetical we lack the information that the picture corresponds to something that exists in reality. I can ask a random number generator for 512*512 pixels, and the result can be equivalent to a picture of you, but unless I somehow find out that it is equivalent, I am lacking information that I would have if I took a photo of you.
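(For a sense of scale, the space that random generator would be drawing from is absurdly large; a quick sketch of the arithmetic:)

```python
import math

# A 512x512 RGB image carries 24 bits of color per pixel.
bits = 512 * 512 * 24
print(bits)  # 6,291,456 bits

# The number of distinct such images is 2**bits, a number with
# roughly 1.9 million decimal digits. A random draw matching any
# particular real photo is not a practical possibility, which is
# the point: the match carries no usable information until it is
# somehow confirmed against reality.
print(round(bits * math.log10(2)))  # ~1,893,917 decimal digits
```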
I suppose you're right.
For the celebrity women depicted in the AI porn, the fact that they are celebrities amplifies their distress. They are used to seeing images and videos of themselves publicized and remembering the context in which they were taken, and then they see a highly accurate AI version of themselves in a sex scene. This can cause psychological harm; humans are not designed to see something like that, and I don't think it computes properly in the brain. An AI scene of being sexually victimized (in essence, arguably) is fundamentally different from a photoshop of a person's face, due to the sheer realism.
It's also probable that there is harm when someone faps to the AI video versus an Instagram image, because it's as if a part of the mind thinks the AI video really happened. That's how realistic AI porn may be, or can become in the future. So knowing that someone you know has, in part of his mind, a memory of that scene when he thinks about you is truly disgusting.
With that said, I'm not even sure Atrioc (the streamer who was caught watching the video) actually did anything immoral here, as opposed to massively unfortunate. We know he paid for an AI porn membership, but not that he specifically paid for AI porn of his colleagues to be created. You can imagine that a man on a porn site will see "woman I have a crush on, in AI porn" and essentially be compelled to click the link. The ease with which men can watch porn and click links reduces his moral culpability, because men simply have not evolved to exercise such willpower. If you put most men in his situation, would they click? When they are in a hyper-aroused, hypnotic state, and all day every day they click links on a whim, and they know no one would find out they looked at the streamer's AI video? I pretty much think every young man who is aroused would do the same, so this negates the immorality of his action.
Morality is who you are in the dark, but I'm gonna pretty strongly push back on the framework of "compelled", here. Having your dick in your hand doesn't turn your brain off.
This piece, and the relevant closeness and connection of the person he's viewing, changes my moral intuition significantly. My reaction to someone viewing a deepfake of a person they don't know and are unlikely to ever meet amounts to, "how could anyone possibly care about this?". In contrast, deliberately viewing content of someone you're close to (deepfake or real) is different because of the impact of lust on interpersonal relations. The framing here was even stronger, with it involving his best friend's girlfriend, someone it's strongly taboo to lust after. Considering your note regarding willpower, I have to confess it's unlikely that I could skip a link to a close friend's wife or girlfriend that I'm attracted to, but I would have a strong intuition that what I was doing was morally wrong and could plausibly affect my interactions with her going forward. On the flip side, if it were my wife, I would simultaneously not be all that mad about it (I do think it's adjacent to fantasizing), but also fairly suspicious of someone who clicked; it's at least information I wouldn't want to know about my friend's attraction to my wife.
The main body of your post seems basically right. But in regards to this bit in particular, I have to say from a purely anthropological standpoint I'm fascinated with how much unanimity of agreement there is in this thread that 'whether or not one knows that the act has taken place' is a very important element of the quandary.
Are there many other acts for which whether or not they have taken place isn't nearly as important as whether or not the relevant parties know that those acts have taken place?
That argument proves too much; it could be extended to groping or assault. "Your honour, this was a hot bitch walking down the street and my client is a red-blooded young man. Of course he got turned on! What else would you expect? So the arousal negates the immorality of his action in assaulting this woman."
For that to be the case, groping a woman would have to be morally identical to watching AI porn, which it certainly is not. A man can be compelled by an insult to throw water at someone, and a sufficiently strong insult would compel most men to do this, but that doesn't mean most men would stab the insult-giver. As for AI porn, viewing it once comprises an iota of the full moral harm, which is still less than groping a woman. Our willpower makes some calculation of moral harm, especially when the victimhood is salient, but you're comparing taking five extra minutes on a lunch break to committing embezzlement here. They may belong to the same category of action, but not the same category of moral harm.
Atrioc's sin would otherwise be negligible (admitting to watching AI porn around a table of male peers would get laughs, not derision), but it was amplified because he accidentally made it a frontpage topic and he was friends with the victims. Atrioc's real error was the accidental publicity. Had he not accidentally shown the tab, the moral harm would be mild, because the victims wouldn't have found out. He would just be one of the 10,000 people who share the blame for watching the content, and this is mitigated by things like plausible moral deniability, etc.
As for the willpower-times-sexual-desire discussion, I actually do think the young male sex drive will lead to groping in cases of inebriated collegiate hookups, but that's a discussion involving complicated elements. Putting 17-22 year olds together with alcohol and potent music is a horrible idea that will lead men to grope non-consensually, provided the man has a strong enough sex drive and the woman's interest is ambiguous. That's why all of history had proscriptions against this.
Yeah I'm pretty willing to forgive the streamer guy specifically, especially considering your points as well as that I have little-to-no horse in the race. As to your other points:
By this do you mean to say that the main reason that these videos might be unethical is because knowledge of the existence of the videos causes psychological distress in the people whom they depict, not necessarily because i.e. the depicted people's consents have been violated?
This example prompted me to think, though on a tangent only somewhat related to what you're getting at. I'm not sure the 'part of the mind thinking the AI video really happened' thing is what sets it apart. But I think that the knowledge of whether or not someone thought about you in that way is definitely part of what matters. Whether or not someone made an AI porn of you isn't relevant to you unless you know about it -- this fits with my intuition, because re: the completely imagined sexual fantasies point, even though I and most people consider those benign, the calculation changes if person A who masturbated to an imaginary fantasy of having sex with person B then went and told person B that they had done as much. Suddenly that becomes immoral/unethical to me in a way almost similar to the AI nude situation. So I think this might be getting at the distinction for me: what really matters most is if people know that this stuff is being made about them. And in the case of these popular female streamers, the fact that the pics/vids are being distributed basically means they are being forced to know that such content is being made of them. It would be like if 10,000 weirdos were constantly whispering in their ear that they jerked off to the thought of them naked, which is different than those 10,000 weirdos jerking it but not telling anyone.
It's a minor but non-trivial point that many of the female streamers flirt with openly encouraging guys to watch them based on their sexualized appearance. This dynamic is going to happen anyway--attractive news anchors have been a thing since TV started--but the streamers very often take it to another level. Pink cutesy hair, accentuated cleavage, tight pants and "accidental" butt shots, etc. To put it crudely, if their target audience is thirsty simps willing to pay for their streams, I think that should factor into whether they subsequently have a right to be creeped out when those simps imagine the streamers naked, beat off, whatever. 10,000 weirdos telling Martha Stewart that they jerk off to her is very different from 10,000 weirdos telling Pokimane the same thing. Pokimane is actively, if stealthily, cultivating that response in her viewers; Martha Stewart is not.
You're right that many female streamers cultivate an audience in this way, but some female streamers do not and yet still have deepfake porn of them made. So to avoid getting caught up in this we can just restrict the discussion to solely what is right or wrong regarding the porn made of the latter group.
I agree with what you're saying here, in general. And I think that even if the thirst streamers didn't exist, the ordinary streamers who are just streaming-while-female would still end up with subscribers just there to fantasize about dating them. Anytime a female does something on the internet, some guy will try to "send bobs and vagene" her. There's a hilarious example out there of a guy posting Botticelli's Birth of Venus on twitter and getting marriage proposals. With that in mind, deepfakes are inevitable. There are even deepfakes of Martha Stewart, after all.
At the same time, the rise of monetized streams and sites like onlyfans (spit) have really weaponized this tendency. That's bad for the guys whose wallets are getting drained, obviously, but it's also bad for the normie women who just want to share their hobbies. The thirst streamers are definitely part of the problem and they're making everything worse for everyone. Because of that, I have no sympathy for deepfakes of thirst streamers.
Is this actually true? I'm curious to learn of some examples for, uh, research purposes.
I’m reminded of the infames of Roman law, who, alongside certain other restrictions, were not granted reputational protections by the state. The category included nearly anyone involved in prostitution as well as most other entertainers(actors, gladiators, etc). It’s still in use in a few canon law contexts.
How does this get out to the public? Shouldn't even extremely lax opsec be enough to keep information like that private?
Yeah he apparently was looking at the porn websites on the same computer he uses to stream, despite being more than rich enough to afford a second computer/tablet etc. He left the tab open after viewing, went to stream the next morning, and then when he alt-tabbed between his game and some other application, the thumbnail of the website was visible for like 1/4 of a second in the alt-tab menu. It didn't even come out until 4 days after the stream took place because it took that long for someone to notice it by scouring the stream VOD, which they must have only happened to do by chance. Either way yeah, idiotic on his end to be looking at any kind of porn on his streaming computer, let alone unethical porn.
Opsec in this case was infinitely lax, since he was streaming his desktop with the tabs open.
SMH, so isn't that the equivalent of watching porn on your work computer while at work? I guess I have to update my priors; I thought that was pretty much reserved for effectively unfirable boomer government bureaucrats.
Oh yeah, lots of streamers are incredibly lax about that kind of stuff. Plenty of cases of questionable stuff popping up on their search history or open tabs when streaming, to the point where some intentionally leave things up as a joke (like an Amazon tab for buying a shovel along with a Google tab of local nature reserves and a Quora about how deep to dig a grave, etc etc)
I guess this is what happens when your work life and your "internet life" don't need to - or, indeed, can't - be segregated.
Basic work-play segregation practices like "have a work laptop just for work" don't really seem relevant...until they are.
I imagine that even just using a different internet browser for streaming might help with that.
Like that would work.
Sure. Someone made random, period-perfect fakes of you, a completely unknown person with interesting CV gaps, and uploaded them mislabelled[1] onto spankbang 5 years before AI porn became a thing.
[1] Leaked porn on tube sites is mislabelled or otherwise hidden to make takedowns harder. In time there may be facial recognition to help with this, but the incentives of people running porn tube sites are against timely takedowns.
Common sense says it's highly unlikely anyone would bother to deepfake porn of someone who doesn't look exceptionally interesting.
So, e.g., the type of girl who can make money selling softcore pics can get away with that excuse, but some random girl who is unexceptional really shouldn't, unless she has a stalker.
Any random girl can believably claim to have a stalker, though. The bar for being good looking enough to have a stalker is... so low as to be literally impossible to trip over, IME.
Eh, if no one's ever seen him, etc., and the claim is made by someone known to be unreliable...
Neither of these seem like particularly difficult barriers to being believed in a claim like this.
The law, what is legal and what isn't, isn't really what I'm interested in so much as what is moral/ethical. Plenty of countries have already at least de jure banned deepfake porn, and the US probably will too eventually. So my bad for including that question in the body of my post.
I think that it's not so much a victimless crime as a non-coercive one. Suppose that a comedian is very good at doing impressions of Joe Biden. Imagine that the comedian's impressions are so good that Biden's public image suffers enough to flip a few states and enable Trump to win the 2020 election. Was Biden a victim, in a non-trivial sense? It seems natural to say yes, but even Biden wouldn't want the comedian banned on that basis. People do not have a right to a good public image.
Alternatively, suppose that a journalist sneaks into Biden's private residence, takes a picture of him on the toilet, and then puts that picture online. The journalist has violated Biden's privacy and should be punished. There's a crime, but not just because there was a victim of the journalist's actions.
Is AI nudes generation more like the first example or the second example? I think it's like the first example. There doesn't seem to be a principled distinction between good impressions/drawings/etc. and deepfakes.
However, if a deepfake is being presented as the real person doing the real thing, then I suppose that could be a good target for the law. Similarly, if somebody who looks a lot like Biden does an impression of him going into somewhere his voters would regard as sordid - like a conference on conversion therapy techniques - and presents a video of this act as the real thing, I can see grounds for punishment.
So is imagining what someone looks like naked/fantasizing about having sex with them a similarly non-coercive crime, then? Either way probably 'victimless' is the wrong word to use, but I'm not sure how much effect that has on my problem.
Maybe really good drawings of a non-consenting person's likeness having sex/naked are wrong to make as well, and should be illegal.
Sorry, I should have clarified my position: it is non-coercive and should not be a crime. If it were a crime, there would be a "victim" in the sense of someone harmed, but the Biden impression example illustrates how harm is insufficient grounds for criminalising an action.
A really good drawing of someone could make them a "victim" in some sense and some circumstances, but it should not be a crime. Deviantartists can rest in peace.