This article seems biased, misses the rest of the bigger-picture problems with deepfakes, and fails to produce any proof for its claims. The picture comparison is laughable at best (especially if you have tried to make a perfect deepfake: it is actually not THAT easy).
There is absolutely nothing new added to this discussion by this article. It also seems gender-provocative and charged, since the article chooses to exclude male deepfakes as an equally probable problem without even commenting on it. I suspect deepfakes are just as usable for swaying voters, blackmail, false imprisonment, and the like, and that would include both genders.
This feels like bait/trash journalism to me, on a topic that deserves a real discussion about real-life problems.
>What we can do?
>
>In this situation, the responsibility to combat deepfakes largely lies with the government and big tech companies.
What you should also do, dear, is assume your government is corrupt and that big tech companies are soulless Leviathans motivated only by extracting some easily quantifiable monetary profit from you, abusing you as much as they can within their accounting. At worst, you turn out to be wrong: your local government is actually steered by competent, benevolent people who understand that trust doesn't come without checks, and the big tech organizations are all non-profits in disguise, striving to make the world a better place for every single living being.
Also, the key issue here, as it has always been, is how people behave toward each other, not which tools they use to implement those interactions.
The real challenge is not so much "how do you regulate tools" but "how do you get people to act with mutual respect at a societal level".
Don't you think it feels incredibly degrading and violating to have your face stitched into porn in a way that's supposed to be as realistic as possible? Without your consent?
It doesn't really matter whether the video is real; the people this is done to have not signed up for it. And you know people are jerking off to it. I don't doubt for a second that it's absolutely repulsive to see yourself in deepfake porn.
A couple of popular streamers who have had this happen to them have reported feelings of shame and violation similar to what they felt when they experienced actual sexual assault. That's pretty alarming and should count for something, no?
Also, this gives a completely new dimension to parasocial relationships. There are lots of people struggling with real relationships who find solace in live streams. As they continue to engage with the community, they can begin to feel like they have an actual relationship with those streamers, which is obviously dangerous since it's such a one-sided thing. Deepfake porn elevates this even further and makes it that much more likely for people to develop an unhealthy obsession with people they will likely never meet, instead of working on their real interpersonal relationships.
On the other hand, if said deepfake is never publicly revealed and is only used by the person who made it (not distributed), then is there harm done? The subject is completely unaware of it in this scenario.
In other words, is the harm not in the production, but the wide distribution of the material, and the harm is in the reputation of the subject (even if explicitly labeled fake), and not in the actual contents?
And so this raises the question: if an AI could produce a deepfake on demand, without ever distributing it except to the requester, then is there any harm done?
Well, consider revenge porn. A couple shares images or makes videos of themselves, never publicly revealed, only utilized by the person / people who made it, no harm done right?
But then the relationship breaks up and one of the partners publishes the footage. Or their account or device gets compromised and the material goes public.
There's a series of new laws regarding revenge porn, and deepfake porn will likely fall under that category. It's not about whether it stays private; it's the risk of it going public.
I mean this issue isn't anything new either, it's pretty straightforward and common to edit a celebrity's face onto a nude model.
So you say deepfakes are a powerful tool and the responsibility lies with whoever is using the tool.
Guns are powerful tools too, but some guns are military-grade and illegal for the public to possess because of their destructive power. It's not about the tool per se but the possible damage that classifies the gun; even though I'm a responsible being and will never misuse such tools, out of safety I'm not allowed to own several classes of guns.
>Don't you think it feels incredibly degrading and violating to have your face stitched into porn in a way that's supposed to be as realistic as possible? Without your consent?
Not really.
> And you know people are jerking off to it.
Why would I care what you're jerking off to? Maybe it's thoughts of me. Maybe it's photoshopped pictures of me. Maybe it's a deepfake video of me. But unless you phone me up in the middle of the night to tell me about it (i.e. harassment), I couldn't care less.
"Ah but what if someone emailed your grandma with a video of you having sex with ten guys" - well yeah, that would be bad but it would be bad if they emailed my grandma with a poem about the same. The harm there would be the intent to harass not some psychic damage caused by you creating images.
>A couple of popular streamers who have had that happened to them have reported feeling similar feelings of shame and violation to the way they felt when they experienced actual sexual assault. That's pretty alarming and should count for something, no?
It's bad that they had those feelings, yes. But feeling shame for what someone else does with your image is clearly irrational. Understandable maybe, but I'm not sure we should be making public policy based on whether someone feels icky about something.
> That's pretty alarming and should count for something, no?
No. What's truly alarming to me is this notion that people have that they own and control "their" image and likeness. They don't.
Rich celebrities with lawyers get to extract rent from corporations for use of "their" likeness but there's pretty much nothing normal people can do against other people with a computer. At the end of the day it's just data that can be freely manipulated and like all privacy issues the only possible defense is to prevent that data from coming into existence in the first place. It's not even a new problem, even before stable diffusion you could easily find on the internet people photoshopping other people naked for kicks.
Which is why I don't have any sympathy at all for public figures, especially the ones who deliberately publish that data and profit off of the attention they receive whether emotionally or financially through advertising. They asked for the attention and they got it. Now they want government tyranny to control what kinds of attention they get? I don't accept that.
> Don't you think it feels incredibly degrading and violating to have your face stitched into porn in a way that's supposed to be as realistic as possible? Without your consent?
Not really?
> I don't doubt for a second that it's absolutely repulsive to see yourself in a deepfake porn.
Then don't watch it?
People jerk off to others all the time, it's harmless.
One pupil got suspended for drawing horns or something on a photo of a school principal. It's the same category of offence. People should grow thicker, not thinner, skin. But being a victim in the modern world pays, because outrage sells.
Whoa - I think we need to be cautious about dismissing genuine concerns around being deepfaked as just victimhood. The issues surrounding deepfake porn are more complex than that - they can have a significant psychological impact on individuals.
Also, we really can't ignore the lack of consent and the abhorrent violation of someone's image. This can't be boiled down to growing "thicker skin" - there are pretty major ethical considerations, as well as the potential harm it can cause to a person's reputation and mental well-being.
The knowledge that the neighbours next door are having gay sex might cause a significant psychological impact on a hyper-religious couple, but nonetheless we do and should ignore that impact, as it cannot be mitigated without infringing on important freedoms.
Absolutely not. Victim blaming is about making the victim responsible for the victimization happening. What I'm saying is that in the modern attention economy, becoming a victim sometimes has a positive economic value for people, especially already popular people like streamers. So they have a monetary incentive to publicly feel more harmed.
I think the opposite - once deep fake porn becomes good enough, blackmailing/revenge porn will no longer be a thing. Anyone can just say that those images are fake, and porn of celebrities, exes, politicians will all be worthless.
Looking forward to that, when digital images/video no longer mean anything to people. Hopefully then people will learn to ONLY trust and value real in-person interactions... until of course we invent the holodeck ;-)
The problem with bullying is the bullying itself, not the specific way in which it is performed. It does not matter if you are being bullied old-fashioned analog or hypermodern digital. The issue is that we have basically never come up with a reliable way to protect children from other children, in the age category where it is necessary. It is one of those semi-taboo topics where we actually prefer to look away and hope a solution will solidify on its own. Because we don't want to accept that children can be cruel assholes. At least not our children! Maybe the spoiled brat from the neighborhood, yeah, that child is... But not mine! My child is sweet honey, the best human ever conceived...
I also see the opposite happening: real footage of someone doing something nasty, and claiming it's a deepfake. If we get to the point where deepfakes are indistinguishable from the real thing, does it mean footage can no longer be used in court?
That'll be an interesting one to see play out. Thing is though, forgeries / fake evidence in the form of image and video editing isn't anything new - how is that challenged in court these days?
I can imagine they have an expert witness testify that it's fake or not, but they can and have been wrong as well.
> does it mean footage can no longer be used in court?
Perhaps we'll see the rise of cryptographically verified videos: the camera signs the output of their sensors, proving it was captured from the real world.
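A minimal sketch of what that could look like, assuming each camera holds a private key in tamper-resistant hardware and the vendor publishes the matching public key (the function names and the choice of Ed25519 here are purely illustrative, not any real camera vendor's scheme):

```python
# Illustrative sketch only (not a real camera API): the device signs the raw
# sensor readout with a key kept in a secure element, and anyone with the
# vendor's public key can verify the bytes were not altered after capture.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# In a real device this key would never leave tamper-resistant hardware.
device_key = Ed25519PrivateKey.generate()
device_public_key = device_key.public_key()

def sign_frame(raw_sensor_bytes: bytes) -> bytes:
    """Camera firmware signs the raw readout before any processing or export."""
    return device_key.sign(raw_sensor_bytes)

def verify_frame(raw_sensor_bytes: bytes, signature: bytes,
                 public_key: Ed25519PublicKey) -> bool:
    """Returns True only if the footage matches what the sensor originally produced."""
    try:
        public_key.verify(signature, raw_sensor_bytes)
        return True
    except InvalidSignature:
        return False

frame = b"...raw sensor data..."                              # stand-in for a readout
sig = sign_frame(frame)
print(verify_frame(frame, sig, device_public_key))            # True: untouched
print(verify_frame(frame + b"edit", sig, device_public_key))  # False: tampered
```

Of course, this only proves the bytes came out of that particular sensor, not that the scene in front of the lens was genuine.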
That's been suggested periodically for many years, but I'm not sure I'd ever trust it. I think people would always find a way to spoof the inputs, even if it's just the analogue hole of focusing the camera on a screen. This would be non-trivial, but both private and state espionage groups would be willing to pour a lot of money into the prospect of having an "undeniably real" doctored image.
Quite apart from any moral considerations that other commenters have addressed, in cold business terms such deep fakes directly compete with the works of these models and therefore it’s not in their interest to permit their circulation.
Who knows? Maybe society will change for the better once this technology becomes widespread. Maybe people will finally stop publishing their lives for the world to see.
It has the power to deceive and leave a lasting impression on those who may see it in passing and not realize it's fake. Also, it can be demeaning, considering the faked subjects would have no say, consent-wise, in the distribution of the faked content.
Think of what happened with Hulk Hogan, who had a sex tape released without his permission. Though it was real, it left a lasting impression on a given audience, and it tarnished his reputation (further), and he successfully sued because, iirc, the video was “leaked” without his consent.
It's pretty abhorrent to practically digitally unclothe someone (unless they fully consent to it), and more often than not the content produced by this will be shared online with countless other people.
The thing is, you cannot guarantee that everyone will know it's fake, and when you weaponise it against people who can't distinguish it there'll be consequences.
Currently we see people found to be doing Adult Content online fired from their jobs, but what if a technically-blind boss gets sent a ton of deepfakes of an employee and is falsely told they do content like that on the side? What if a doubting partner suddenly receives deepfakes of their partner with someone else? What if your political rival starts spreading deepfakes with wildly rabid claims about yourself which the media picks up and reports on – bullshit has spread around the world by the time the truth has tied its shoelaces.
Think about the absolute inane content people believe on Facebook. Think about the misinformation that gets spread over Twitter just because the fake information would be funny. Think about the stuff you've seen online which turned out not to be so real.
Even if the images generated aren't real, the subject of them can't convince the world they're not, and there's no way for them to stop the violation of their self-image.
Defamation, identity theft, abuse of individuals directly by circulating porn of them in their communities. Fake or not, it's very impactful. It's not an ideological issue but a very practical one, it's already causing harm.
You might be thinking of celebrities, where the idea that they have done porn is so far-fetched that it's obvious it would be fake. But what about minor personalities like youtubers? It could tank a career if it's unclear whether something is real or not. Just like a false assault allegation, it will still cause a lot of harm even if the truth prevails.
Lastly, on an individual level, people are going to get hurt by sociopaths circulating imagery of coworkers or whatever. It's messy human social business, and the pragmatic person might imagine the truth is all that matters, but it definitely isn't. You can ruin someone's social life and mental health with mere rumours, let alone with what looks like proof of disreputable activity.
I don't think that's what concord meant. They meant that as convincing fake imagery becomes easier to produce and more common, people will learn that any image could easily be faked, regardless of how photorealistic it appears. i.e. people will lose all trust in "photographs", so fake images will lose their efficacy as blackmail.
Exactly. The "victim" of a deepfake simply needs to say "it's fake". The burden of integrity is then on whoever is indulging the fake images, spreading them or raising questions about them.
Added benefit, as you say, is that any image can now be faked. So even a real photo of someone caught in the act, could be dismissed by the victim as fake if they wanted to make the problem go away.
Currently, with skill and effort, a photo can be faked. People know that a given photo may be fake, but generally expect photos to be real unless they are motivated to think otherwise. The idea is that once the ease of producing a fake reaches a certain threshold (a layman can go to a website and type "$PERSON engaging in $QUESTIONABLE_ACTIVITY") then fake photos will greatly outnumber real ones and trust in photographs will fully erode.
People won't realize it's fake... and people won't care either. You struggle to see the harm because you haven't been a victim, and I presume you're not a woman either. So yeah, sorry to get personal but you're missing something.
Why does it matter if people don't immediately realize it's fake? Why are those people looking at pornographic images?
For any harm to happen, the status of the image as real needs to be confirmed and sustained over time.
If the victim says "it's fake", why would people doubt that or not care? That would just mean those who give oxygen to the fake images are themselves in danger of spreading false information and losing all credibility. Only trolls and trashy people bypass concern for their own credibility.
As for "consent", who cares? It's a dirty fake image made by losers, and is the modern equivalent of words written on a public bathroom wall: "For a good time, call Jo". Jo never consented to that, but who cares, it's already content from the gutter.
It seems like a really, really difficult task to enforce any laws that might develop around this. This isn't like paparazzi nudes that get out every few years - it's just math, and thousands could be generated overnight with little compute.
I don't know what the laws should be, but I think there are some serious limitations on what's possible/practical at this point.
There ARE laws against this kind of thing, defamation and whatnot. It's the same as editing a face onto someone else's (naked) body. No new laws are needed; the method has changed, not the intent nor the crime / violation.
There are a lot of technically possible things you can do with a computer that are currently illegal. Just because you are able to do something doesn't mean there won't eventually be legal consequences, even if enforcement is impractical.
Pornographic image editing is "THE danger of deepfakes"?
And somehow it's a problem for females (sic)?
Why this focus on pornography?
Pornographic image editing has been around forever. Deepfakes are just another tool in the toolbox. This article fails to see the wider consequences of deepfakes outside of pornography, as described by the Wiki page cited in the article.
I'd say the real danger is the potential to destabilize society by faking authority and spreading misinformation to manipulate people for financial or political goals. Pornography is a tiny part of that.
Go on Civit.ai (the hub of custom diffusion models besides huggingface), make an account and log in. You may notice that well over 60% of all models are porn, especially waifu porn.
Even without logging in, the majority of models market themselves with cute/scantily clad women. That's why they focus on porn: because that's what everyone is doing with AI image generators.
There is nothing inclusive about perpetuating gender stereotypes.
Men are allowed to feel the same sense of privacy and intimacy over one's own body that we afford women, and pretending only women get harmed by the rampant abuse of photoshop on steroids doesn't do anybody any good.
It's not like this is some new occurrence; it's just made easier by what's out there. Anybody who would take the time to do these things would have put the time in through another medium. Just more AI fearmongering.
And of course... only mentioning women as victims. Right.