> their deeper purpose is to humiliate, shame, and objectify women
While I understand the desire to stamp out the use of deepfakes to insert unsuspecting women into pornographic images and video, I feel at least as much energy should be spent in eliminating (or at least reducing) the harm that comes to women's reputations as a result.
The fact that a deepfake can humiliate and shame a woman and even cause her to lose her job and friends exposes a problem in our society that I feel is at least as problematic as that which would cause someone to create the deepfake in the first place.
> The fact that a deepfake can humiliate and shame a woman and even cause her to lose her job and friends exposes a problem in our society that I feel is at least as problematic as that which would cause someone to create the deepfake in the first place.
I agree wholeheartedly.
There is no reason for the naked human body to be considered shameful, regardless of gender. Reasons for such beliefs are usually based either on keeping up the status quo or on some archaic or religious belief in the "dirtiness" of sexuality (except in the restricted cases approved by said tradition/religion).
They might not be just naked. It's porn on the internet; they could be depicted doing literally anything. I agree with the sentiment, but the reality is much, much harsher than just society's prudishness about nudity.
Sexual violence is weaponised to destroy a woman's life.
Society allows it.
The men that create the (frankly disgusting) deepfakes described exist within the same society.
The men can't see why a woman would feel humiliated by fake videos.
Society allows it.
If someone draws or paints art of you in a pose or situation you don't like, using traditional means such as ink or pencil, are you as violated? Do you have any right to redress from the artist?
Why should it be different if they use a different tool to create the picture?
I don’t know why people ask these kinds of questions. It’s like saying bomb ownership is ok because firecrackers can be bought. Clearly there’s a difference in scale.
In any case, there shouldn't be any difference, I would imagine. There's already precedent in the United States, but circumstances matter.
> I don’t know why people ask these kinds of questions. It’s like saying bomb ownership is ok because firecrackers can be bought. Clearly there’s a difference in scale.
Yes, exactly, and by asking these sorts of questions we can clarify what that scale is, exactly.
When exactly does a "firecracker" turn into a "bomb"? What are the objective differences between the two? Etc.
Here is some deepfake porn I just created of Donald Trump and Hillary Clinton:
     \ Õ /
      \ | /
        |
        ----D <-----O
       / \         /| \
      /   \       / | \
You could say "well, that's not very good deepfake porn as it doesn't look anything like them, and it's just a bunch of stick figures", but what exactly does it mean for something to be a "deepfake"? Is a painting at some point realistic enough to be a "deepfake"? Is using a look-alike okay? Where are the limits, exactly? What exactly makes a "real" deepfake worse than my stick figures?
If you're asking seriously: no, a painting shouldn't ever be called a deepfake. A photorealistic painting is called a photorealistic painting. The term deepfake refers to a type of thing; it's not a metric of some quality of an image, nor is it a reference point within such a metric.
A deepfake is a deepfake because it is a synthetic ("fake") image that depicts a real person/thing and is the output of a program that utilises deep learning.
But what amount of realism is accepted? If the result is still in the uncanny valley, is it OK? How far can we dig out of the valley before it becomes not OK?
Obviously your example is at an extremely low resolution, but there's a good argument that making erotic art of someone without their permission should be illegal. It's obviously not as bad as a deepfake, but it's still a pretty bad thing to do, and arguably harmful enough that it should be legally stopped. In countries without strong freedom-of-speech protections I can 100% see this being the case. In the US, the question is where the line falls between a person's right to make creepy, harassing sexual art of someone else without their permission (which is technically free speech) and the point where it causes enough harm that freedom of speech protections no longer apply.
I have trouble imagining a reasonable person being interested in porn containing a minor government official, let alone being confused as to its authenticity. I can't even see reasonable people being interested in genuine celebrity sex tapes.
The audience for that seems limited to malicious weirdos who already have zero respect for the victim. Any influence it might have on their opinions seems irrelevant.
Of course, this has no bearing on the personal violation one feels when exposed to this sort of abuse. But for deepfake porn to somehow influence public perception is hard for me to grasp. Are there any examples of real reputational damage stemming from this?
To say nothing of the personal feeling of violation, the individual in question seems to be relatively young and ambitious, likely harboring aspirations for a more prominent political career.
Such a revelation that “they might’ve been a pornographic actor” could give rise to a future scandal during an election. Even the slightest insinuation of involvement in the adult entertainment industry can inflict significant damage, particularly for women.
We’re actually seeing AI-generated imagery being used more and more in political propaganda (eg https://www.nytimes.com/2023/06/25/technology/ai-elections-d...). It’s not unreasonable to think this will become a major point in a future election, even if it hasn’t yet.
> such a revelation that “they might’ve been a pornographic actor”
I ask again, is there any evidence that such allegations have ever been taken seriously?
I can't see fake porn having any reach, except amongst low-IQ consumers of internet trash, who are not only susceptible to obvious propaganda, but are already politically aligned against the victim.
Other AI generated content may have more impact, but that's another conversation.
If you're wondering whether people spreading juicy and false rumors about candidates' sexual matters have had an impact on election outcomes, the answer is definitely yes. Maybe this specific type of sexual rumor hasn't had a major effect, but considering that this technology is still relatively new, it's not something to be too complacent about.
On the topic of people believing outlandish things on flimsy evidence: Plenty of people were convinced Hillary Clinton was running an underground child pornography ring in a pizza basement based on paintings hung in someone’s house.
Maybe they were all politically aligned against her from the start, or maybe they weren’t and it affected the 2016 election. How would we know?
One can run polls to establish, e.g.:
"do you believe this story", "did you support Clinton before this", "did this change your voting preference"...
While no poll is 100% accurate, as long as you make it statistically significant and introduce good controls (say, multiple fake and real stories, with some of them positive too), you'll get a good signal and the ability to say whether it affected the election.
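Purely as an illustrative sketch of that kind of analysis (all counts below are invented, and it assumes scipy is available): you could compare how often respondents shown the rumor story say it changed their preference versus respondents shown a control story, using a simple chi-square test.

    # Illustrative only: invented poll counts, just to show the shape of the comparison.
    from scipy.stats import chi2_contingency

    # Rows: respondents who say the story changed their preference vs. those who say it didn't.
    # Columns: group shown the rumor/deepfake story vs. group shown a control story.
    observed = [
        [48, 12],    # "changed my preference"
        [452, 488],  # "did not change my preference"
    ]

    chi2, p_value, dof, expected = chi2_contingency(observed)
    print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
    # A small p-value suggests the rumor group shifted more than the control group,
    # though this only measures stated preferences, not actual votes.

A real survey would of course also need proper sampling and the multiple positive/negative control stories mentioned above.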
> I can't see fake porn having any reach, except amongst low-IQ consumers of internet trash, who are not only susceptible to obvious propaganda, but are already politically aligned against the victim.
You seem to have made up your mind, and no further discussion is possible.
If someone were to draw a watercolor painting of a likeness of a President one doesn't like having sex with an equally despised congresswoman, most would agree that this is a classic "free speech issue" and is the very sort of protected speech that the First Amendment clearly protects.
Ignoring tangential issues, like the copyright of the images used in the "deep fake", how is this situation any different?
I don't know - I think somebody needs to name the bubble of conceit that came into the world when photography was invented, since that created a philosophical issue, rather than a resolution issue. We went from an implicit understanding that newspaper engravings were 'representational' to a blind belief that photographs were records of the real world, notwithstanding that they were being manipulated and selectively used (first for amusement, later for propaganda, etc.) right from the start. Until we name it, the distinction between porn pencil drawings and photorealistic depictions doesn't have any supporting and exclusive definitions and terminology.
> If someone were to draw a watercolor painting of a likeness of a President one doesn't like having sex with an equally despised congresswoman, most would agree that this is a classic "free speech issue" and is the very sort of protected speech that the First Amendment clearly protects.
You would be wrong; there are many areas of law that can protect identifiable people from having their likeness used, especially if it is (1) without consent and (2) in a "private place" (or depicted in an intimate/private situation).
The person whose likeness is used would very much have recourse to sue, whether it's the painter of a watercolor or someone who generates a deepfake. The same principles apply.
By virtue of appearing to be a photograph. The medium itself is a message (cf. Marshall McLuhan). An image which appears to be a photograph conveys a meaning to the viewer that "this depicts a real event", whereas if I draw a picture, even if it is of something plausible but which has never happened, there isn't a perception that the image is being presented as evidence of an event which has taken place. If we know that enough people will draw such an inference from a statement, it can and should be treated as potentially libelous.
Your line of questioning makes no sense. You are introducing the medium as an issue -- why? Whether you would have a legal leg to stand on is a question for a lawyer. Do you have a moral right? Yes, of course; you've been violated.
Also, the "traditional wholesome art mediums" thing is a bit weird to use for an argument, to me anyway. The act is made much more disturbing and violent owing to the image being created by hand, with care and attention.
You seem to be arguing in bad faith, perhaps from an emotional standpoint. But you seem to be genuinely stuck on the terms 'violation' and 'violence', so:
The post I replied to introduced the term 'violated'. It asks "are you as violated?" -- you'll note that 'as' being present indicates that something can be more or less violated -- it was also their assumption that in either case there would be some amount of violation felt. My reply to their post is based on the framework that they established. Take it up with them.
To take action with intent to harm another is violence. To (seek to) inflict damage, destruction, pain or suffering is violent.
Shouldn't have to in what sense? Someone claiming/implying it's you? That's already considered libel. Someone's machine assisted fantasy that looks kind of like you? Tough luck, people are allowed to imagine things you would rather they not imagine. If you have not been posting your real naked photos online, fake ones will not depict your private parts accurately anyway.
Freedom of thought and freedom of speech.
Why shouldn't I be able to produce virtual deepfake porn of you if it's clearly or implicitly labeled as fake?
"My rights end at the end of my fist." is a common phrase. And it's demonstrable, due to "christian" society, that nude women are punished socially for it. We even have capture at the credit card companies for buying porn, and prostitution is illegal mostly everywhere. And even events where clothing is optional, is set to 21, and not 18.
I await the day that simple nudity is just that: no clothes. But we're still basically in the 1600s, with puritanical backwards thinking that women can't even be shirtless like men, for "sexual reasons". I just think of "Nipplegate" at the 2004 Super Bowl, where Jackson's nipple was, what, 4 pixels? And people collectively freaked out.
Until our society can fucking grow up from 5th-grade antics, yeah, it's harm. And you shouldn't need to be told not to harm others, but again, our collective age is 11 / 5th grade.
Do you also support blackface? If not, would it be Ok to have these rules?
- If you actually do blackface, you are accountable by current social standards on that matter
- If someone releases a deepfake of you in blackface and doesn't make clear it's machine generated, you can sue them for libel
- If someone releases an ML-generated photo of you in blackface and clearly states it's machine generated, you are accountable for nothing, and they are accountable by current social standards regarding blackface.
Nowhere did I state anything about the law, or otherwise rules enforced on others. Nor did I say anything about suing people.
I was talking about personal responsibility in not harming someone else. If your ethical guidance is so broken that you need a law to tell you what you should and shouldn't do, I feel tremendously sorry for you.
Does portrayal of blackface cause harm? Yes, it harms the BIPOC community, with terrible racial stereotypes. So don't fucking do it. Again, this isn't a hard concept to figure out.
If we all even minimally considered "is this going to harm others?" before acting, and refrained when the answer is yes, we would have a better and more respectful society. And no more laws would be needed.
If you want to criminalize some act or behavior, the burden of proof is on you.
What makes you feel entitled to use the State to penalize me for exercising my speech?
She does not like virtual deepfake pornography. Tough luck. Donald Trump does not like the media calling him a loser. Tough luck.
Not at all, but I am realistic enough to know that we cannot have over 8 billion separate definitions of the word “harm” that are actionable. There needs to be a standard that is agreed upon and the law is the source of that.
Has there not already been some kind of court cases for photo-realistic paintings or drawings that incorporate people's likenesses? This seems like something that should have been figured out by now, even before the AI-apocalypse!
I'm no fan of the way that technology is going, but the author seems to be confused.
The fact that people share these videos is rapidly becoming immaterial. Face swapping software doesn't require some sort of underground network, one person can do this for themselves without the videos ever being shared.
So whilst she may be able to avoid the harassment problem of having specific videos thrust into her face on Twitter or whatever - they'll still be out there, just behind closed doors.
Or even barely behind closed doors at all - given suitable software, it's not hard to come up with the prompt "Fred Bloggs sitting on a cucumber in stockings".
Honestly, there's so much porn being dumped on the internet every day that if you avoided drawing attention to it and quietly torpedoed it via DMCA takedowns, yours would quietly disappear. But you could also write an article about it in a major paper and ensure it never goes away, and that people get pissed at you and make more.
Give it another year or two for TTS to become a commodity and she'll have way worse problems to worry about (like identity theft, and bomb threats made in her voice, spoofed from her number).
That they focus so heavily on distribution of the images, e.g. "premade" deepfakes.
The issue is that it is becoming so trivial to generate imagery that no distribution is necessary, unless we consider powerful image generation tools to be akin to munitions of some sort and ban those too.
The best realistic outcome is that the technology becomes convenient and ubiquitous enough that people's default reaction is to distrust it. Suppressing the technology only makes it more likely that people will trust it, which increases the total harm of it.
That outcome seems rather unfavorable. It would undoubtedly raise doubts regarding the credibility of literally any video or audio content, invoking the "liar's dividend" and benefiting the most untrustworthy more than anyone else.
> Google Alert informed me that I am the subject of deepfake pornography
I have so many questions on this line alone, it's a world I didn't know existed. What is a Google Alert? How does Google know this deepfake exists? How does Google go about determining a real vs deepfaked pornographic video?
Google Alerts is a service that allows you to receive alerts on a search query. It's been available since forever and for some reason Google hasn't killed it (yet).
Rats, that's much less interesting than I'd hoped. So it's not out hunting for porn of you, it's just saying some search result like 'Jane Doe Porn' had a new hit?
Typically, I just use it to see if my name appeared in news somewhere. I don't get quoted that frequently these days but it's something I like to keep my eye on.
Pretty much. Although in my case it just reminds once every few years that I set up some queries about an apparently uniquely named project that doesn't really exist any more, and the infinite monkey theorem.
I'll just add that it's highly likely this person just has queries set up to find mentions of their name. It's pretty common for people who (are likely to, or hope to) appear in news or other publications to set up Google Alerts for their name/company/etc..
It's not just porn that's the problem. Fake news and misinformation are a problem too. And this reminds me of the "Joan Is Awful" episode of Black Mirror as another kind of disturbing use of deepfakes, or making fun of a public personality, as in Sassy Justice ( https://www.youtube.com/watch?v=9WfZuNceFDM ).
But it's complex to draw hard lines here, because of asymmetries between public people and not-so-public ones, the good uses of the technology, freedom concerns, and the potential of AI generation. It's not so much the tool as certain people who misuse it. And stopping the bad uses in a generic way may cause more harm or abuse than not doing it.
> You can be against this lady, but also be against deepfake porn targeting women only at the same time.
To be clear, I am against deepfake porn. But I'm even more against these kinds of people who make a career out of abusing real victims to create positions of power which they then abuse for unrelated actions that were the real reason in the first place.
Just like how you can be against child porn and yet raise some eyebrows at some of the suggested policies, like Apple scanning all on-device photos.
If you're willing to let morally questionable people gather more power because of bad things, you'll be surprised to find out more bad things happening compared to if you just didn't care.
She strikes me as exactly that kind of psychopathic person, and whatever action she's suggesting, I'll be sure to avoid doing it. And I don't believe her for a moment.
Women's and men's porn work differently; there has long been an entire genre of "real-person fics" where actual celebrities, or simply notable people, are depicted textually, engaging in sexual acts and emotional intimacy that fits the same bill.
What I'm saying is: Tumblr beat you to it. It's true that AI-generated pictures of celebrities seem to skew toward women, but you'd find a proportional amount if you looked at where gay men post their porn.
"I'm okay with bad things happening to someone because I don't like them" is the petty mean-spirited nasty nonsense that plagues these kind of discussions. I think it's absolutely horrible and vile.
Besides, it's about the larger issue, not this one particular person.
Was, not is. The board was dissolved almost a year ago, and she resigned from it over a year ago. Also "who", not "which": unless your intent is to demean and dehumanize the person, you don't usually refer to other people with "which" in a context like this; "who was the head" would be the correct phrasing.
Because it otherwise seems reasonable that a person formerly in charge of preventing false and misleading claims is also against people producing fake porn of them.
In general, fake porn is a scourge anyway, as it is very clearly used almost entirely against women, and it is extremely clear that women are always the target when it is used to intentionally abuse and harass folk.