[flagged] Google has destroyed the lives of revenge porn victims (nypost.com)
48 points by zxcvbn4038 59 days ago | 32 comments



This article focuses on Google not being liable, but ignores the fact that the specific creators and posters of the content appearing in Google's search results certainly can be sued.

There have been a lot of anti-CDA 230 sob stories lately, which have indeed highlighted real victims of online content and abuse, but imposing liability on platforms simply destroys the free and open internet. Search engines would cease to exist if they became liable for simply indexing content created by others, and sites like Hacker News wouldn't be able to allow user submissions and comments.

We need to be careful to avoid victim-blaming, but it's equally important to recognize that very real harms to some people don't justify upending policies that have created enormous value for society as a whole.

Not to mention that, even without CDA 230, the majority of the problematic speech online would remain legal in the US because of the First Amendment.


One of the issues is that Google is not neutral with respect to these results: its bot went out of its way to index these pages and give them exposure, and if the individual people posting these videos can't be found and sued to force complete removal, the content will stay online forever.

In other words, the asymmetry between posting something online and having it removed is exacerbated by Google surfacing copies of the information that would otherwise be lost on some random site.

Wouldn't it be fair for Google to also provide decent support for removing content from its index?

From the article, Google is making some effort, just clearly not enough.

> Google’s policies dictate only two instances when they will remove content — child pornography and copyright-infringement requests.

> The current policy says Google may remove nude or sexually explicit images that were shared without consent, but the company maintains sole discretion about when to remove nonconsensual pornography.


But the "free and open internet" is a vague term. We would not say, for instance, that an internet without HTTPS or end-to-end encryption is desirable, despite it being less "open" in a surface-level sense, because we know that our underlying goals of a "free and open internet" are better served by reliable security and privacy.

I would also argue that an internet where you are afraid to send nudes to your partner is less "free and open" than one where you can. It's both technically and ethically challenging to build one where you can (one promising approach, for instance, is DRM, which raises the ethical questions of whether to build DRM at all and whether to portray it as reliable; see in particular Snapchat, whose DRM is used for this exact purpose), but it still seems like a worthwhile long-term goal.


I hope nobody really believes Snapchat's DRM is going to prevent their nudes from appearing on a porn site. Just take a photo of the screen with another phone.

Anybody suggesting that it makes sending nudes safe should be held liable for extreme stupidity.


One valid threat model for revenge porn is: I trust this person now, but if we break up I might not trust them anymore. So Snapchat's disappearing messages do actually solve that problem. (And more generally, "I trust this entity now, but they might become compromised in the future" is a model with some precedent in encryption; cf. forward secrecy and deniable messaging, which address related problems.)

It's also not unreasonable to hold that the additional effort of taking a photo of the screen falls into the locks-keep-honest-people-honest category. Of course nobody believes that a lock on their front door will prevent a sufficiently determined attacker from breaking in, yet we all lock our doors. (And we wouldn't say that "change the locks" is useless advice against an ex with a key.)


No they don't. Taking a picture or screenshot of such an image immortalizes it. Stopping people from taking screenshots is essentially rootkit or virus functionality. (Plus it doesn't stop anyone from taking a picture of the screen.)

You're trusting the person to not make this image permanent. Could be valid, could be a serious problem.

And that's assuming they don't just fabricate the revenge porn from slightly enhanced deepfakes.


I believe search engines and social media should be held liable for the content they distribute. They could choose to moderate the content more closely, or, if that is too expensive for their business model, they could enter agreements with content creators who want their content appearing in searches or online. They can countersue the content creators.

Our society and our democracies are suffering under the vast amount of disinformation. Nobody knows what to believe or who to trust. What's real and what's bullshit. Something needs to change.


> they could enter agreements with content creators who want their content appearing in searches or online

So if you create a new business you must pay a Google tax to be indexed. What about Bing?

If you have an obscure blog about gardening or ponies or microprocessors, you must pay some fee to Google to get indexed. Is it a one-time fee, or is it yearly?

I guess you can have a page on Facebook and they will make some arrangement with Google. The big walled-garden hosts can make similar deals (probably in exchange for placing ads on your page).

What about the personal pages of university professors? There is a lot of good material there; some courses have almost all of their material online.


Yeah, um, that's basically how Google works now. If you want to get into search results, you have to buy ads.

I'm sure the Universities will cover the liability costs of their professors.

I'm not sure what point you are trying to make.


A few months ago, I wanted a recipe with a nice photo of "salty croissants with tallow" (medialuna de grasa; they are popular here in Argentina, sorry for the horrible translation).

https://www.google.com/search?q=medialuna+de+grasa&tbm=isch

This looks nice: https://www.johaprato.com/receta/medialunas-de-grasa

Autotranslation: https://translate.google.com.ar/translate?oe&um=1&ie=UTF-8&h...

I don't get ads in either the image search or the normal search. The results look quite organic. Most of them are from professional or semiprofessional cooking blogs.


When Google indexes reputation-impacting personal information, it is not verifying the authenticity of that information. This creates a potential market for reputation-harming falsehoods to be SEO-ed to the top of search results.

Sure, Section 230 indemnifies Google et al., but it creates an incentive for bad-faith actors to exploit the system.

What is your solution for this problem?


The bad actors are still liable. You go after them to force them to remove the original content that is being indexed.

Yes, it's whack-a-mole, but that will forever be the nature of the internet or any other tool that facilitates democratized content creation at effectively zero cost.


No. Google should be liable if it enables repeat offenders to purchase ads and SEO their way up to the top. At that point, Google isn't an innocent bystander but an enabler.

The article mentions a clear pattern, where the base URL stays the same, and small variations are employed to circumvent any legal actions, because Google refuses to acknowledge the "repeat offender" status of the base URL.


Search engines came into existence back in the late 90s as a way to replace human-curated indexes of content so that the CPC (cost per click) for display advertisements could be reduced.

Twenty years later, we are still realizing the price that we pay for universal indexing of all human-created content: we have enabled hate groups to congregate safely under anonymous identities, while simultaneously eliminating the possibility of a "private life" away from the prying eyes of others.

Yahoo's human-curated index, and the many "books about the Internet" that were published back in the day, certainly did not have the breadth of coverage that search engines did. But in return for that, every site you visited could be assumed to have at least the value the index stated it would have. Topical, relevant, curated. And safe.

This article isn't the best at expressing this point, but they're realizing the same thing that few of my tech friends wish to consider: Stripping away the human review and approval process of maintaining a "card catalog" index of websites may have been profitable, but it comes at the cost of our humanity.


A human-curated list would be significantly shorter today, with the way the Web has centralized.


That seems false. There are more domestic airlines, more web hosting companies, more blogging platforms, and in general more ways we use the Internet.

I agree it might not be much longer in any given leaf category node.


The part that has always intrigued me is how data aggregators seem to really enjoy the fight when someone wants their data removed. If it were really just about marketing, why waste time and money peddling to people who don't want to be peddled to? I think at some level it's more about being able to cause people discomfort and annoyance without repercussions, and I think that appeals to a significant population of disempowered people. Like some variant of the Stanford prison experiment, perhaps?


Google needs to inject a bit more humanity into its business. Yes, this is a special case, but they certainly already have a mechanism for skipping some listings. This is an issue of goodwill, which Google perhaps still doesn't feel pressure to build (likely due to its monopoly position).

If businesses, from the very largest to the smallest, do not behave as collections of real humans, then we might as well be living under Skynet. Honestly, if what makes us empathetic and caring is ignored, then at some point software should replace us.

But regarding revenge porn, porn, sexuality, nudity, etc., I think it's way past time that people realize that we all have naked bits. It's really no more astonishing than a tree having leaves and bark, and the occasional fruit! I believe religion has something to do with the largely artificially ignorant view of nakedness and sexuality, but most religion is just a disguise for population control. So whatever the name of the leadership, the fact remains that people without clothes have nipples and other bits that US television can fine you for showing (honestly, it had never occurred to me that Janet Jackson had nipples!).

For the women who have been duped and taken advantage of, given current societal norms, it's terrible. Google should have some human sympathy in its organization, and it should have some executive willing to demonstrate the importance of Google being part of humanity. But at the same time, I hope the younger generations can finally return humans to their more natural state.


So who is more liable: the card catalog with a reference to something awful, or the actual thing that did something awful?


The "awful" part of revenge porn is not (in most cases) the image itself, but its publicity.


I really feel for these people, but if Google gets into the content moderation business it will never end. They could spend all their resources trying to manage the floodgates that would open, but it wouldn't be enough to appease all the different jurisdictions, advocacy groups, and lawsuits once they become anything besides a neutral entity.

Actually I really don’t like Google and would love to see them drown under the onslaught. Plus a lot of people would get some temporary relief until search moves to someone still neutral.


Any Google workers here?


You know there is; HN is full of Googlers. But it's very, very tricky for them to make public statements about hot topics like this.

Executives and people with the authority to actually handle situations like these are probably not here, or even if they are, they're afraid to speak openly. It takes a lot of commitment and guts to risk a career and a really good income just to make a statement on a forum.


[flagged]


Immediately before that in the article:

>My clients, all aged 18-22, had answered deceptive bikini-model ads and had become embroiled in a conspiracy to perform porn that resulted in some of them being raped before and during the shoots.

Should victims have to take responsibility for the impact of the crimes committed against them? And in any scenario, I think the article brings up a good point:

>But even if we do something “negative,” to use Lieu’s term, should that mean we deserve to be connected to it for the rest of our lives?

The Internet makes it possible for descriptions of your past, truthful or not, to follow you forever. The idea of a "right to be forgotten" tries to balance against this.


At some point, hopefully, people accept that people (themselves and others) are human. Being human involves trying things, trying to make a living, exploring things, and... even sexual things. There are some fairly generally accepted limits regarding sexuality (and age of "participants", as well as consent). But humanly speaking, there is no doubt that we all have bodies with various parts, and most people get some pleasure from contact on those parts. Shaming people for doing so is like shaming someone for eating food.


This doesn't look like revenge porn to me. Revenge porn is when an ex-boyfriend shares a video of the couple having sex. This is real porn, and maybe the victim wasn't paid (in which case she should be).


lol "destroyed"? "merciliessly"? I love it in a world where a lot of bad shit goes down only the stuff that involves women and sex is automatically hyped up to the most important, even more than genocide. So every woman automatically gets a sort of 'supercopyright' where she can decide 20 years down the line what happens to every single copy of certain 'embarrassing' images of her across the entire world. Why not other types of images? I mean Bush puking on the Japanese PM was pretty embarrassing. Oh so because its a woman, even though she openly and knowingly released the pics in public, its still special. Man we really are still Puritans, still pedestalizing sex and the honor of our ladyfolk.


There is clearly more at stake here than you acknowledge. The creepy stalkers made the lives of these women hell, well beyond personal embarrassment. And these were cases of nonconsensual sex cited in the article. For God's sake, man, grow a little empathy and honor before you keyboard-blast.


Harassment and rape are separate crimes. There is no need to institute a special global supercopyright for poor oppressed womyn and trample the free flow of information in order to prosecute those crimes.


Search engines should treat searches on individuals as a separate, special category. Laws need to protect personal information from Google's excuses. When federal laws on personally identifiable information were drafted, they weren't intended for a world where a megacorp indexed it all and made it widely accessible, with no recourse for challenge.

U.S. laws need to catch up with reality.


Information wants to be free regardless of the virtue of your claim, and the First Amendment will be an impossible hurdle for would-be censors.


You beg the question that free information is always a good thing, and that anything which gets in the way is censorship -- in other words, evil.

There are ways that information can be used to cause harm. If it were not the case we wouldn't need encryption. As a society, we must be capable of enacting laws that define when information is harmful and thus not allowed. Is this censorship? Is it evil?



