Downranking won't stop Google's deepfake porn problem, victims say (arstechnica.com)
17 points by rntn 5 months ago | 53 comments



I kinda don't know how to feel about this one.

This kinda seems like a continuation of the problem with revenge porn, child porn, etc., just at a much larger scale, able to impact a lot more people.

And that is a real problem and we should do what we can to protect the victims.

But I struggle with this quote:

> “Google has ruined lives and is shamefully inflexible at responding to developing problems,” Goldberg said, joining others decrying Google's move as "too little, too late."

Google isn't the service generating these images, and they aren't hosting them; they're just the search engine used to find where they are.

Unless Google blocks all porn-related searches, I just don't know what realistically can be done about this from Google's perspective at any reasonable scale. The idea of a human looking at everything that Google crawls just isn't realistic.

And I have a lot of problems with Google; I'm just struggling with putting this on them specifically. I'd rather see legislation going after those that are hosting it, making it, etc., which is mentioned in the article.


They are facilitators. If Google weren't so money-hungry, these kinds of results wouldn't rank high enough to be discovered. They should probably be fined continuously until they alter their behavior.


I don't follow what this has to do with money hunger. Is your contention that Google profits somehow from having these images be higher rather than lower in search results? Why would that be more profitable?


What should they be getting fined for? Ranking individual links according to some generic algorithm? Is there a law specifying what algorithms may or may not be used for individual links or types of links?


Fwiw, I agree with you. I think business incentives are sometimes misaligned with what is holistically good for a society. Just because what is good for society cannot be clearly defined doesn't mean all effort to define it must be given up. For instance, laws punishing Google for ranking CP high will definitely bring a change in Google's attitude towards dealing with this stuff.




> The search engine provides a path for victims to report that content whenever it appears in search results. And when processing these requests, Google also removes duplicates of any flagged deepfakes.

> Google's spokesperson told Ars that in addition to lowering the links in the search results, Google is "continuing to decrease the visibility of involuntary synthetic pornography in Search" and plans to "develop more safeguards as this space evolves.”

> ...

> “Google has ruined lives and is shamefully inflexible at responding to developing problems,” Goldberg said, joining others decrying Google's move as "too little, too late."

> Google's solution still largely puts the onus on victims to surface deepfakes and request removals, seemingly mostly because Google only proactively removes web results when content directly conflicts with the law or its content policies.

While I'm usually highly critical of Google, it seems like they're already doing what they reasonably can? Google isn't hosting this content.

Passing legislation is a necessary first step. Do we really want to set a precedent where Google is expected to proactively censor content that is, for all intents and purposes, legal?


> Passing legislation is a necessary first step. Do we really want to set a precedent where Google is expected to proactively censor content that is, for all intents and purposes, legal?

I would expect that in a sane society, everyone would do their part to stop something from happening when that thing is clearly bad, even if it is technically legal!


Within reason, sure. I don't think the demands being made are reasonable. Google is already doing their part to stop it from spreading, but they aren't the source.

Compare that to something like malicious ads, which Google has full control over and directly benefits from.


"clearly bad" is doing a fuck ton of work here.


> "I'm here standing up and shouting for change, fighting for laws, so no one else has to feel as lost and powerless as I did," 14-year-old Mani said

Don't existing child pornography laws and school codes of conduct cover the 14-year-old's case in NJ?


I moved to Germany from the US, and broadly speaking (I have my own issues with German policies and politics) I like the system here better. In Germany, outside of being a “public” figure, you own the rights to your own image. Without explicit consent and compensation, you are not allowed to distribute photos of anyone. There are specific carve-outs for notable events (if you are incidentally in a shot for something else, it doesn’t count), but otherwise it seems reasonable to me. It definitely seemed odd when I first moved here, but seeing it in practice it seems fine.


The implicit premise of this article is that digitally edited photos and videos that look like particular people actually are photos or videos of those people. This premise is false, and so is all the reasoning that flows downhill from it. There are no 'victims' because there is no crime, because there are no photos or videos of the victims involved. They're digitally created synthetics. It's not like photoshopping someone's face onto someone else's body is new.

This is just hysteria exploited for attention, which is money. Calls to criminalize digital image manipulation will result in situations where making fun of people in positions of power (political or celebrity, etc.) results in prison, like in Saudi Arabia or Thailand.

The implicitly proposed "solutions" do far worse damage than the "problem".


I would equate this with a form of extreme slander, so your inference is wrong. They aren't a problem because they are real; they are a problem because they can be passed off as real, and the issue is the reputational damage. I'm not saying there's a good solution, but I don't think the problem should be dismissed so easily.


There is no slander here because there is no expectation that those are the actual people doing it.


IANAL, but I'm curious: could watermarking (or prefacing) such videos with a "deepfake" indicator defuse the slander liability?


But if they were not yet passed off as real, is it pre-crime? If I create a satirical fictional work about someone, is it "extreme slander" because a portion can be taken out of context and presented as real? Should my work be preemptively censored?


I think there's a spectrum, from actually filming someone having sex without their consent and publishing it, all the way to drawing a picture of someone and never showing it to anyone. The ends of the spectrum are clearly immoral, and perfectly OK, respectively. We need to draw the line somewhere between the two.

Paparazzi take photos of people without their consent, and sell them for money. We tend to think this is immoral but legal, and making it illegal would cause more problems than it solves. There is no restriction about such photos being naked or clothed.

Making a deepfake uses someone's likeness. Is that the same as drawing a recognisable picture of them? If not, where do we draw the line between the two? If it is, should we make drawing a picture of someone without their consent illegal? That seems too far to me.

The law is going to take a while to catch up to this. Meanwhile, making this Google's problem seems perfectly OK to me. You create a ludicrously profitable monopoly on discovery, then you get to deal with the problems that people have with discovery <shrug>.


> The ends of the spectrum are clearly immoral, and perfectly OK, respectively.

I'd agree that it's probably a spectrum of legal culpability (although IANAL, so...)

But I disagree that the extrema are clearly moral or immoral.

For example, I expect that the private, hand-drawn scenario violates Christian morality.

And I'm guessing there are other moral frameworks that are sincerely held by some persons and which are okay with the other extreme.


I'm kinda fascinated by the Christian morality thing - I was raised Anglican and while I no longer believe I still have a fair understanding of the dogma. How does drawing a picture of someone violate that morality?

There are realistic pictures (and sculptures) of real people all over our churches. Are these immoral?

I think possibly the confusion lies in my framing of the question; I meant drawing a picture of someone at all, not necessarily someone naked or having sex.


Sadly, CP laws do not recognize that one end of that spectrum is OK. If I make a photorealistic drawing (or even a non-photorealistic one, i.e., anime) of CP, I'm a felon in the eyes of our pathetic laws.


This take is naive and bizarre. This is not about ‘making fun of celebrities,’ this is about whether we, as humans, have a right to our likeness and to be protected from it being used in ways we disapprove of.

This is an issue of privacy, not speech.

You do not have a right to put me in porn of your choosing. And I, as the victim, should be given maximal ability to annihilate those images, get cooperation from those who host them or make them findable, and to seek damages from anyone who makes them of me.


> You do not have a right to put me in porn of your choosing.

He does not have the right to use your flesh to produce a porn of his choosing, but does he not have the right to employ your willing doppelgänger? There is no reason to think that likeness is exclusive.


No, you do not have the rights to my “doppelganger” because you were never granted them.

This is a bizarre argument as well.

When a news reporter is doing man-on-the-street interviews, you have to sign a release that allows them to broadcast footage of you.

It is not your physical person inside the TV, it’s a video representation of you. Just as much a doppelgänger as any deepfake. And they have to have you sign something in order to use it.

They have the exact same kind of paperwork for filming porn. Because the same legalities and rights are in play.

Just because the image of you is generated does not mean: a) it’s not an image of you b) it’s not subject to these same laws.


This is a bizarre take. If your doppelgänger stars in a porn video, or whatever else, that is for them to choose. You don't have rights to them just because you happen to share the same likeness.

> When a news reporter is doing man-on-the-street interviews, you have to sign a release that allows them to broadcast footage of you.

If used commercially, yes, without such release you could claim that you were party to the commercial business venture and thus are owed due compensation. Indeed, the broadcaster will seek protections against such a claim. But a hobbyist filming you, the "man on the street", for sheer enjoyment would not require a release.


Your usage of the term "doppelgänger" and your claim that they have agency and rights... is extremely bizarre. Clearly it is the person producing the deepfake who has agency, and their right does not extend to the right to harm others' image.


A doppelgänger refers to a real person that happens to share your likeness.


Or to put it another way, imagine two identical twins, where one joins an extremely strict religious sect and the other becomes a porn actor.

Under what circumstance can the first twin sue the second to stop them from acting?

Is "people might think it's me" enough? Does it cross the line if the second twin uses their siblings' legal name as their stage name? What if it's just a nickname? etc.


To call a person with agency and rights a "deepfake" is the most bizarre thing we have seen yet. Get back to us when you get around to reading the discussion.


I thought that by Doppelganger you meant a deepfake.


Read about the right to publicity. It has already been invoked for involuntary pornography and these cases will keep being won.

Your autonomy to make images can’t infringe on my autonomy to choose not to be in them. It’s not a matter of commerce, it’s privacy.


You need to re-read about the right to publicity. You are using it outside of its legal context of for-profit commercial trademark. The commercial benefit bit is the core and single purpose of the idea. Though you may remember those cases, they almost certainly involved business, and that case law does not apply to this discussion.

>The right of publicity is an intellectual property right that protects against the misappropriation of a person’s name, likeness, or other indicia of personal identity—such as nickname, pseudonym, voice, signature, likeness, or photograph—for commercial benefit.

A lot of people seem to get confused about that because they cannot imagine people operating as human persons and not as incorporated persons trying to make money; a particularly bad problem on HN.


Typically they don't make you sign a release. That's normally only done if they're interviewing you. I've been recorded at protests, had my picture taken with my name in the caption, and been in newspapers and on the (local) news on numerous occasions where they didn't ask my permission, nor did they ask me to sign anything.

If you're in a public place they have the right to record you. Period. A lot of publications/news sites will blur out people's faces but that's not legally necessary.

I was interviewed by a local news station once and they asked me to sign a generic waiver that let them display my likeness for the purpose of reporting the news (i.e. the interview). I read the whole thing and it was very short (as far as legal agreements go... it was only one page). That's the only time I've ever been asked to sign such a thing despite having my likeness recorded so many times (usually in the background during various events).


Ok, what if I were a very good hyper-realistic artist, and used a real person's hyper-realistic and accurate face in my pornographic 'art'?


So someone who looks indistinguishably like you (such cases do exist) would not be allowed to film porn without your consent? What if someone generates a likeness of this person (which is indistinguishable from you) with their consent but not yours?


> we, as humans, have a right to our likeness and to be protected from it being used in ways we disapprove of.

I don't agree with that. For example, it should be perfectly legal to make caricatures of someone in order to make fun of them (i.e. political cartoons or memes), even, or especially when the person being made fun of doesn't approve. I do agree that this probably does not apply to porn though.


> This is an issue of privacy, not speech.

I'm curious about this part of your argument.

It seems to imply that the only relevant framework for analyzing this issue is the privacy angle.

If that's actually the point you're making, how would you justify it?

Especially when AFAICT it's pitting two apparent goods, privacy and free speech, against each other?


Do you think cutting a face out of a picture and putting it on some other nude human for a wank should be a crime?


I think distributing that image without the consent of the person whose face you used should be a crime. Purely private use is a harder question, but I think also a much less important one.


Do you think giving the image to a friend on an IRC channel is distribution? What if I give a physical copy to a friend at a table in a bar? Or is it only distribution if a corporation is involved, like having Facebook host the photo? What is "distribution"? What is a public space and what is a private space?

Your simple solution brings in unsolvable problems and incredible vagueness that will result, again, in worse problems than it "solves".


We routinely define “distribution” in copyright cases and I don’t see why the same standards would be unworkable here. Privately sharing a physical copy with a close friend is not distribution, publicly posting for all to see is distribution, and other things occupy a fact and intent specific middle ground.

I’m not sure what you mean when you say it will result in problems; I would not identify it as a problem if people avoid creating deepfake nudes of real people altogether to ensure they’re not liable for distributing them.


So, if I had, let's say, 10 close friends in a chat, producing deepfake porn and sharing it with them is OK?

I'm not sure about this copyright argument. Let's say there is some meme that is just making fun of something dumb, a "Karen" doing something stupid in public. Would you argue distributing that meme is illegal?

So, let's say someone was naked in public and was fine with being photographed, then someone else cut out a publicly available picture (whatever, maybe it was a free Shutterstock face) and pasted it on the body of that person. Is that bad?

/shrug


If I were writing the law, I guess my instinct would be to make the first illegal and the last two legal.

I don't mean to be obtuse, I know there's a deeper point you're trying to make, but I don't get what it is. Is there an answer to these questions that would prove it should or shouldn't be illegal to distribute deepfake porn of real people?


> worse problems than it "solves".

I'm going to go out on a limb and hypothesize that you're not in a demographic that is likely to be victimized by deepfake porn of yourself.


You understand that "feeling victimized" is not a good argument right?

Ex. I'm morbidly obese. Someone shares pictures of me (taken in a public area) and laughs at me for that. Is that a crime? I might feel like shit and victimized, but that's not actually an argument that it's a crime.

This sort of argument really seems like a way to try to use ick and feels versus actual logic.


Lacking any arguments regarding the subject of discussion, one often finds the internet outrage farmer resorting to comments about the persons involved instead.


>put me in porn of your choosing.

Oh, I see the confusion. We're talking about synthetic images in this discussion. We're not talking about actual explicit photos or video of someone. So in this scenario "you" wouldn't be involved at all.

Although it has been long established that using actual photos and swapping faces, etc., is perfectly legal. Just because the tools are slightly faster than Photoshop does not make this a new thing. If anything, the fact that no actual photos are involved makes deepfakes even less objectionable and clearly not victimizing anyone.

Technical ignorance, willful or accidental, is no excuse for advocating these extreme viewpoints in which you demand use of force against actual human people in situations where no one is harmed. Calls for organized violence like this are not acceptable.


maybe a caveat here. Also: citation needed!

Using photos is only perfectly legal if the use is within the licence agreement of the photos. As I understand it, using a picture of somebody you took on the street is allowed in the sense that you can print the picture unmodified; that's public domain. However, taking that picture you took on the street and making deepfake porn out of the picture of that person is definitely not within the licence that public pictures grant you. If you take a picture of my daughter riding her bike and make fake porn out of it with her face, don't worry, I'm on your case... Same for any picture taken by a paparazzi of some prominent figure, an actress for example; these are all licensed, good luck with the lawyers... https://finance.yahoo.com/news/scarlett-johansson-sues-ai-ge...


Your implicit assumption of commercial use here makes everything you've said invalid in most use cases discussed in this thread.


I wondered if anybody would have the courage to point this out, even on here.


This is nonconsensual so we should stop calling it porn. Sexualizing someone without consent is a crime. It is important to differentiate legal and illegal sexual content.


> Sexualizing someone without consent is a crime

It is not; "sexualizing" isn't even well defined, and it isn't enforceable without mind-reading technology.


As usual with tech giants, the burden of playing whack-a-mole with the perpetrators is left to the victims, while the company profits.



