This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
OP didn't say "feelings don't matter". They said "women's feelings aren't God", i.e. they are not the sole, overriding consideration in ethical disputes.
Case in point: some women apparently dislike being "objectified". I don't really care, tbh. What goes on in my skull is my business.
Because it is a violation of actual privacy: the actual woman is in that room, with a reasonable expectation of privacy, and you are peeking in. Even if it weren't sexual, there are all sorts of other concerns with such snooping (e.g. can they steal your stuff now that they've seen your locker code?).
With deepfakes I guess it depends on how much verisimilitude something can have before you think it violates your "actual" privacy. If I have a deepfake of Angelina Jolie that, for whatever reason, has serious flaws and inaccuracies, have I violated her privacy in the same way? That isn't the real Jolie; it's a virtual image that isn't even perfectly accurate.
What if it was trained on topless images of Angelina and perfectly matched her in her physical prime? I think an argument could be made that she removed that privacy herself, in a way she can't expect to get back (we can't unsee her body either way).
I don't think we have an easy rule. I also don't know that this can/should be grounded in privacy. Maybe defamation concerns would be more viable?
Besides the reason already given above? It's more reasonable to imagine you will never be caught for private files on your computer vs. peeking into someone's bedroom. Simply not being physically there reduces the risk of detection and thus harm to the person.
This is the main thing I am trying to get at with the locker room/fantasizing examples. Current AI can inpaint nudity onto clothed pictures of people without necessarily producing serious flaws or inaccuracies. (Not to say it always succeeds, just that it can reasonably often.) And training the AI on the actual person's breasts isn't required for the result to be highly similar to what they actually look like topless, at least for some women, since some people's breasts are visually similar to other people's. Thus a person who never consented to having topless photos of themselves anywhere on the internet can have topless images of them created with a very high degree of verisimilitude to their actual naked form, using e.g. porn stars' breasts as training data.
Technically, I suppose, the person operating the AI can't know whether the subject has, say, a mole on the chest. So maybe, because some uncertainty remains without a real topless image of the subject to verify that the AI inpainting is highly similar, there is still some sense in which privacy is maintained? Even if the inpainted nudity actually is extremely similar to their topless form, this isn't known to the person creating or viewing the deepfake.
Regardless, the pertinent fact is that the current level of technology can, at least somewhat often, produce outputs that the depicted person themselves could mistake for real nude photos. This seems to me functionally very similar, if not identical, to looking at someone changing or naked without their consent or knowledge. You're right in the sense that it doesn't imply the other security concerns an intruder in a changing room would, but I'm not sure that's what's actually wrong with or disliked about peeping toms; I suspect a significant part of the objection to someone seeing you changing is the simple fact that they now know what you look like naked (and maybe also the knowledge or likelihood that they are fantasizing about you sexually). That is, most people would be roughly as opposed to someone using X-ray glasses, or more realistically a hole in the wall, to look inside their locker room while they changed as they would be to someone hanging from the rafters. I can't know for certain, though, because to my knowledge I've never been the victim of any such situation.
Well, as far as legality goes, copyright seems to be the main way people take down unwanted deepfake porn of themselves. That said, I'm less interested in the legality and more in what should or shouldn't be generally considered acceptable ethically or morally, for which privacy, or violations thereof, and perhaps other things do seem like relevant concerns.
Porn stars self-select not only based on their agility in smoothly changing positions in front of cameras--incidentally, a skill shared with politicians--but also on how good they look naked. If an AI image generator is trained on the naked bodies of porn stars, its AI-completed naked version of me will look amazingly better than I actually do.
Women's breasts, in particular, come in a variety of shapes, and they are frequently not symmetric. Older women's breasts tend to be flat--think more like those pictures in the old National Geographic depicting women in some far-away hunter-gatherer tribe. The nipples and areolae come in various shapes and sizes, and change with temperature. Some have inverted nipples. Practically all of this variability is hidden by the kinds of clothes women wear, especially if they are into padded bras.
The distribution of body fat also varies significantly for overweight women, and this is also mostly hidden or distorted by clothes.
I'm aware of this. The point is that not everyone with good-looking (pornstar-like, if you will) breasts decides to become a porn star. Thus, these people are vulnerable to having very realistic versions of their breasts recreated from porn-star data, despite never having put images of their actual breasts on the internet. Additionally, there's plenty of data on non-pornstar-like breasts out there to train on. The point is not that AI will always generate topless versions of people that closely match what their breasts actually look like; it's that it can with at least some reasonable frequency.
Making deepfake porn of someone for noncommercial purposes should be fair use. It's clearly transformative, and it doesn't have any effect on the potential market for the work unless you think the copyright owner will sell their own picture for use in porn and this makes it harder to do so.
Maybe true, but I guarantee you that the vast majority of people paying money to host websites that distribute deepfakes are doing so for commercial purposes. E.g., the streamer in question had accessed a website that required him to pay 15 dollars to use it.