There have been a lot of anti-CDA 230 sob stories lately, and they have indeed highlighted real victims of online content and abuse, but imposing liability on platforms would destroy the free and open internet. Search engines would cease to exist if they became liable for simply indexing content created by others, and sites like Hacker News wouldn't be able to allow user submissions and comments.
We need to be careful to avoid victim-blaming, but it's equally important to recognize that very real harms to some people don't justify upending policies that have created enormous value for society as a whole.
Not to mention that, even without CDA 230, the majority of the problematic speech online would remain legal in the US because of the First Amendment.
In other words, the asymmetry between posting something online and having it removed is exacerbated by Google surfacing copies of the information that would otherwise be lost on some random site.
Wouldn't it be fair for Google to also provide decent support for removing content from its index?
From the article, Google is making some effort, just clearly not enough.
> Google’s policies dictate only two instances when they will remove content — child pornography and copyright-infringement requests.
> The current policy says Google may remove nude or sexually explicit images that were shared without consent, but the company maintains sole discretion about when to remove nonconsensual pornography.
I would also argue that an internet where you are afraid to send nudes to your partner is less "free and open" than one where you can do so safely. It's both technically and ethically challenging to build such an internet (one very promising approach, for instance, is DRM, which raises the ethical questions of whether to build DRM at all and whether to portray it as reliable; see in particular Snapchat, whose DRM is used for this exact purpose), but it still seems like a worthwhile long-term goal.
Anybody suggesting that it makes sending nudes safe should also be held liable for extreme stupidity.
It's also not unreasonable to hold that the additional effort of taking a photo of the screen falls into the locks-keep-honest-people-honest category. Of course nobody believes that a lock on their front door will prevent a sufficiently determined attacker from breaking in, yet we all lock our doors. (And we wouldn't say that "change the locks" is useless advice against an ex with a key.)
You're trusting the person to not make this image permanent. Could be valid, could be a serious problem.
That's assuming they don't just fabricate the revenge porn from whole cloth with a slightly enhanced deepfake.
Our society and our democracies are suffering under the vast amount of disinformation. Nobody knows what to believe or who to trust. What's real and what's bullshit. Something needs to change.
So if you create a new business you must pay a Google tax to be indexed. What about Bing?
If you have an obscure blog about gardening or ponies or microprocessors, you must pay some fee to Google to get indexed. Is it a one-time fee, or is it yearly?
I guess you can have a page on Facebook and they will make some arrangement with Google. The big walled-garden hosts can make similar deals (probably in exchange for placing ads on your page).
What about the personal pages of university professors? There is a lot of good material there; some courses have almost all of their material online.
I'm sure the universities will cover the liability costs of their professors.
I'm not sure what point you are trying to make.
This looks nice: https://www.johaprato.com/receta/medialunas-de-grasa
I don't get ads in either the image search or the normal search. The results look quite organic. Most of them are from professional or semiprofessional cooking blogs.
Sure, Section 230 immunizes Google et al., but it creates an incentive for bad-faith actors to exploit the system.
What is your solution for this problem?
Yes, it's whack-a-mole, but that will forever be the nature of the internet or any other tool that facilitates democratized content creation at effectively zero cost.
The article mentions a clear pattern: the base URL stays the same, and small variations are employed to circumvent any legal action, because Google refuses to acknowledge the "repeat offender" status of the base URL.
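To make the "repeat offender" point concrete: the variations described only work because each tweaked URL gets treated as brand new. Here is a minimal Python sketch of the kind of canonicalization that would tie them back together; the domain, helper names, and takedown list are all hypothetical, and this is not a claim about Google's actual pipeline (real registrable-domain handling would also need the Public Suffix List):

    # Hypothetical sketch: collapse URL variants onto one base domain so
    # repeated takedown requests against the same host can be counted
    # together. Nothing here reflects Google's actual system.
    from urllib.parse import urlparse

    def base_domain(url):
        # Lowercase the host, drop any port, strip a leading "www.".
        # A real system would use the Public Suffix List (e.g. the
        # 'publicsuffix2' package) to get the registrable domain.
        host = urlparse(url).netloc.lower().split(":")[0]
        return host[4:] if host.startswith("www.") else host

    def repeat_offenders(takedown_urls, threshold=3):
        # Count takedown requests per base domain and flag any domain
        # that meets the threshold, however its URLs were varied.
        counts = {}
        for url in takedown_urls:
            d = base_domain(url)
            counts[d] = counts.get(d, 0) + 1
        return {d for d, n in counts.items() if n >= threshold}

    # Made-up example: path, port, and "www." tweaks all map back to
    # a single offending domain.
    requests = [
        "https://example-leaks.com/v/abc123",
        "https://www.example-leaks.com/v/abc124",
        "http://example-leaks.com:8080/mirror/abc123",
    ]
    print(repeat_offenders(requests))  # -> {'example-leaks.com'}

Even this toy version suggests the obstacle is policy rather than feasibility: the normalization itself is trivial.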
Twenty years later, we are still realizing the price that we pay for universal indexing of all human-created content: we have enabled hate groups to congregate safely under anonymous identities, while simultaneously enabling the complete loss of having a "private life" away from the prying eyes of others.
Yahoo's human-curated index, and the many "books about the Internet" that were published back in the day, certainly did not have the breadth of coverage that search engines did. But in return for that, every site you visited could be assumed to have at least the value the index stated it would have. Topical, relevant, curated. And safe.
This article isn't the best at expressing this point, but they're realizing the same thing that few of my tech friends wish to consider: Stripping away the human review and approval process of maintaining a "card catalog" index of websites may have been profitable, but it comes at the cost of our humanity.
I agree it might not be much longer in any given leaf category node.
If businesses, from the very largest to the smallest, do not behave as collections of real humans, then we might as well be living under Skynet. Honestly, if what makes us empathetic and caring is ignored, then at some point software should replace us.
But regarding revenge porn, porn, sexuality, nudity, etc., I think it's way past time that people realize that we all have naked bits. It's really no more astonishing than a tree having leaves and bark, and the occasional fruit! I believe religion has something to do with the largely artificially ignorant view of nakedness and sexuality, but most religion is just a disguise for population control. So whatever the name of the leadership, the fact remains that people without clothes have nipples and other bits that US broadcasters can be fined for showing (honestly, it had never occurred to me that Janet Jackson had nipples!).
For these women who have been duped and taken advantage of, based on current societal norms, it's terrible. Google should have some human sympathy in its organization, and it should have some executive willing to demonstrate the importance of Google being part of humanity. But at the same time, I hope the younger generations can finally return humans to their more natural state.
Actually I really don’t like Google and would love to see them drown under the onslaught. Plus a lot of people would get some temporary relief until search moves to someone still neutral.
Executives and people with the authority to actually handle situations like these are probably not here, or even if they are, they're afraid to speak openly. It takes a lot of commitment and guts to risk a career and a really good income just to make a statement on a forum.
>My clients, all aged 18-22, had answered deceptive bikini-model ads and had become embroiled in a conspiracy to perform porn that resulted in some of them being raped before and during the shoots.
Should victims have to take responsibility for the impact of the crimes committed against them? And in any scenario, I think the article brings up a good point:
>But even if we do something “negative,” to use Lieu’s term, should that mean we deserve to be connected to it for the rest of our lives?
The Internet makes it possible for descriptions of your past, truthful or not, to follow you forever. The idea of a "right to be forgotten" tries to balance against this.
U.S. laws need to catch up with reality.
There are ways that information can be used to cause harm; if that were not the case, we wouldn't need encryption. As a society, we must be capable of enacting laws that define when information is harmful and thus not allowed. Is this censorship? Is it evil?