
Regardless of your opinion on piracy, I think Google (and ISPs) should not moderate search results.

Basically, search engines are used to find things that are publicly available on the internet; policing should be done at the actual hosts of the content.

Same goes for ISPs: they are there to provide access to publicly available computers without interfering with the content or its accessibility.




But Google moderates search results by definition. As in, it's not like there's an objective external measure of what the results for a given search term should look like - it's all Google that determines what to show you in what order.


First, there’s a difference between moderation by prioritization and moderation by blacklisting.

Second, if I search for “the Pirate Bay website” I’m pretty sure there’s an objective result that should be there at the top.


I don't see a difference between not listing something and listing it below an infinite number of null results.

1 / ∞ = 0


Is this not a false dichotomy? Unless you are purposely using deprioritization as a substitute for blacklisting, it would stand to reason that results 'moderated' to the bottom of a priority list ended up there because they actually aren't considered useful/relevant to the query. So unless you are moderating in bad faith, blacklisting is clearly different, because it removes the user's ability to find the content at all and wholly ignores the intent of their search.


How is it a false dichotomy? Delisting is the same as prioritizing noise over good hits.


There's barely a difference between delisting and listing as the eleventh result...


if "the pirate bay website" is written in lots of documents discussing the legality of the website, but the actual website is called "https://www.piratebay.org" and doesn't contain the words "the pirate bay website" at all, then by any objective measure that site should be way down the list of any search results for "the pirate bay website".


You honestly think that it's the wrong answer to show that website when somebody searches for that term?

The title tag of the site is as follows: "Download music, movies, games, software! The Pirate Bay - The galaxy's most resilient BitTorrent site"

And it's the number one search result for that phrase on DDG. Because search engines are actually programmed well enough to figure out that's the right answer. I'm not sure why you think it's the wrong answer to show that site as the first result, but I'm pretty sure everybody else in the world disagrees with you, and would prefer that they see the website they are searching for, instead of news articles about the site they are searching for.


We should apply this logic to something else to have some fun. You go to the shop and look for spaghetti and instead of finding it, you find all the other things you can use together with spaghetti to make a nice meal.

Now, if only you could find some spaghetti to go with all the other stuff...


This is 2021. Search engines are aiming for semantic matches.


There is a difference between having a set of rules and doing ad hoc moderation.


So writing a rule "don't show results from domains on list X" makes it ad hoc and not a rule?

I don't think there's somebody manually removing each result from search queries by hand; that wouldn't meet latency constraints.
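
If anything, a domain blocklist is the textbook example of a rule. A minimal sketch of such a hypothetical post-filter (names made up):

  # Hypothetical rule-based post-filter: drop results whose domain is
  # on a blocklist ("list X"). Purely illustrative.
  from urllib.parse import urlparse

  BLOCKLIST = {"thepiratebay.org"}  # "list X"

  def apply_rule(results):
      return [r for r in results if urlparse(r).hostname not in BLOCKLIST]

  print(apply_rule(["https://thepiratebay.org/", "https://example.com/"]))
  # -> ['https://example.com/']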


The set of rules includes "results that are illegal to divulge in a jurisdiction will not be shown."


Since Google is doing this voluntarily, many of the blocked results were legal to divulge but Google is choosing not to, which does make it ad hoc.


This just becomes a question of pragmatism at that point - Google lacks the capacity to determine which of the blocked results are legal and which are not without incurring cost, so the most realistic approach is to recognize that a majority of results from that domain are illegal and block the domain. This is just the simplest way to enforce a particular rule in a particular case that can't otherwise be cheaply codified programmatically.


Yet Google doesn't generally block most content that is illegal in one jurisdiction in all the others where it is not illegal. If Google is deciding to do that just with TPB, then that is indeed an ad hoc decision.


(full disclosure, I work for Google)

I think it's easy to take an absolutist approach when it's somebody else who faces the consequences. In some ways it's a rehash of the arguments when the Navalny app was delisted.

In a vacuum, "should someone follow politically motivated requests to take down content?" is an easy question to answer. But when you have to worry about consequences like giving up your freedom, I can't say that I'd have the courage to follow through. The Dutch prisons might be nicer than Russia's, but asking someone else to give up their freedom for ideals is still a tall order.


It seems to me that the right (if unrealistic) way to address this is to make searching P2P so there is no small number of people to arrest for this. Governments can't arrest double digit percentages of their citizens... that looks really bad.
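
A toy sketch of the idea, assuming a DHT-style split of a keyword index across peers (everything here is hypothetical):

  # Toy DHT-style keyword index: each peer owns the terms that hash into
  # its bucket, so no single operator holds (or can take down) the index.
  import hashlib

  NUM_PEERS = 4
  peers = [dict() for _ in range(NUM_PEERS)]  # term -> list of URLs

  def owner(term):
      return hashlib.sha1(term.encode()).digest()[0] % NUM_PEERS

  def publish(term, url):
      peers[owner(term)].setdefault(term, []).append(url)

  def search(term):
      return peers[owner(term)].get(term, [])

  publish("spaghetti", "https://example.com/recipe")
  print(search("spaghetti"))  # served by whichever peer owns the term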


Why wouldn’t this result in absolutely horrible search results? Any sort of algorithmic tweak to prevent confusing, misleading or low quality results could be counted as moderation.


So maybe hide them behind "show more results" or something. That's still better than removing them outright. But of course, copyright holders wouldn't be happy with that.


I would like to see Google highlight legally removed search results.

For example, leave a gap in the search results where the missing one would have been, perhaps with some details of the result which do not fall under the removal request (for example, the title of the page, even if the URL must be removed).
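
Something like this hypothetical rendering, for illustration (data made up):

  # Hypothetical rendering: a removed result leaves a visible gap with
  # whatever metadata survives the removal request.
  results = [
      {"title": "TPB mirror list", "url": None, "removed": True},
      {"title": "News article about TPB", "url": "https://example.com/news"},
  ]

  for r in results:
      if r.get("removed"):
          print(f"[removed by legal request] {r['title']} (URL withheld)")
      else:
          print(f"{r['title']} - {r['url']}")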


Add this to the long list of things that Google won't do for the exact reason that you want them to do it. They're not on your side any more and haven't been for a decade now.


> search engines are used to find things publicly available on the internet

This used to be the case. As the internet grew in size, search providers felt it was in the best interest of the general user to cull results and present their definition of "best" at the top. They wanted to solve the needle-in-a-haystack problem by taking all the noise out and funneling the best results to the top.

"Best" is a subjective term, and I don't think I agree with it. However, your definition is not the same as Google's. Search providers can define what their search engine does the same way a restaurant gets to decide what's on its menu, even though all restaurants serve food.


Alternative take: SEO became a thing, and the initial search algorithms were not designed for an adversarial game.


I don't think anyone actually thinks in these absolutist terms. I mean, it's an oft-cited whataboutism, but what about CP? Terrorist propaganda? Insert your own list of abhorrent content here.

I mean, zero moderation is some kind of internet libertarian ideal, but there's plenty of examples out there what that looks like - 4chan, 'old' reddit, parler & co (which ironically censor a lot of stuff), all of the dark web, etc.

You state "policing should be done on the actual hosts of the content", but what if the host says "lol no"? That's what's happening here: since the hosts are untouchable and not legally required to comply (these things get complicated once you go abroad), BREIN went after the ISPs and search engines instead - and it has been making such demands for at least the past 10 years, probably longer.

If something is deemed illegal and the source is untouchable, you have to go after whatever passes it on. I mean, hard drugs are illegal, yet they are shipped across the world, and only when they arrive at a port is there something that can be done to stop them.


I absolutely agree with you - we don't live in a black and white world, and black and white measures rarely work in practice.

Should Google de-list Pirate Bay? In my opinion no.

Should Google de-list some things? No doubt yes. That is how society works - your rights end where somebody else's start.

As the down-voting of parent shows, HN gets defensive when this is pointed out.


>we don't live in a black and white world

you say that and then I see this:

>Should Google de-list Pirate Bay? In my opinion no.

Looks like a pretty black-and-white stance on Pirate Bay here.

Also, it doesn't look like you are much against censorship. You don't even discuss whether it should exist or not. I sense it is pretty white there for you too.

It reminds me of how Russia's controlled media spread the idea that not everything is black and white, but when they decided to attack Ukraine with their military and took Crimea and parts of two other regions by force, it suddenly became black and white for the moment of the attack - and then went right back to 'oh, it's not black and white, we didn't take anything'.

I personally love how the 'not everything is black and white' concept is used to justify censorship and promote everything that serves totalitarian dreams.


I think a possible solution is to at least list The Pirate Bay as a site, but not the content within it. So if you want to search for something within The Pirate Bay, you can do it through their own on-site search.


The "content" of TPB is just hashes. Are they illegal already?


Not really; the suggestion is just meant to offer a balance between both sides.


> Insert your own list of abhorrent content here.

There are clear laws and court decisions which Google must obey. The title says "voluntarily", which is the actual problem in this case.


I guess the question is, is it actually voluntary? The ESRB rating system was established voluntarily, but that was to avoid legislation and legal action that would have happened if they didn't. Could be a similar situation here.


The basic problem with the argument against moderation is that the Internet started without moderation.

You could post whatever you wanted to Usenet. You could email whatever you wanted to anyone, and they would see it in their inbox. Heck you could “finger” to see who was online in remote networks, and “talk” to open a live chat with anyone, totally unmediated by any commercial product. You could log in to open FTPs and trade files.

We’ve been to that particular heaven, and most people didn’t like it.

Why? While trafficking in CSAM was an important and awful consequence, the negative that dominated most people’s experience was spam. That’s why web forums beat Usenet; that’s why centralized webmail beat a forest of naked email servers. Etc.

People don’t actually want unmoderated search; it would be choked with spam.

What people actually want is whatever they want, as easily and cheaply as they can get it. The Pirate Bay and other file sharing services are popular not because they represent some sort of libertarian ideal, but simply because they shovel a lot of great content to people for free.

Of course they’re popular! A person handing out $10 bills on the street corner will be popular too. But it kind of matters where he or she got those $10 bills. There are societal side effects we might want to manage.


>We’ve been to that particular heaven, and most people didn’t like it.

Many of us prefer that heaven to the hell of centralized control, censorship and commercialism that we have now!


>>I mean, zero moderation is some kind of internet libertarian ideal, but there's plenty of examples out there what that looks like - 4chan, 'old' reddit, parler & co (which ironically censor a lot of stuff), all of the dark web, etc.

There are a couple of problems with this. First, I see no problems at all with 4chan, old reddit, or Parler, so using them as examples of something bad that should be banned is ridiculous. Parler in particular was gaslit into a false narrative around the 1/6 protests, when in reality most of the communication, as stated by the FBI, was done via Facebook and other larger platforms, not Parler. Parler was the scapegoat. Parler should fail for any number of reasons, but not because of censorship: it is a terrible platform technologically, it has a terrible security posture, and it has various other usability issues.

Then there is the unintended blowback this type of censorship leads to, such as increased echo chambers and extremism. Take, for example, the fallout from the Backpage removal. Did it end trafficking and prostitution? No, not by a long shot; it just made the criminals harder to catch, the victims harder to find, and things more dangerous for legal-age sex workers. Good job, government.


> I mean, it's an oft-cited whataboutism, but what about CP? Terrorist propaganda?

I like the Freenet author's take on this. He says that porn, terrorists, drug dealers, and all kinds of "abhorrent" content are the price you pay for the lives of whistleblowers and of people fleeing from dictatorships, sects, and terrorists, and for the preservation of valuable but controversial content.

Well, it makes sense in the case of Freenet, which offers full anonymity and resilient storage (you can upload content, and as long as there are people viewing it, it will propagate itself along the node connection paths, making it impossible to take the content down). Torrents, Reddit, and 4chan are different, so maybe the trade-off will be different there too.


You were on the right track with the ironic Parler comment; however, 4chan, old reddit, and old-school forums always had, and currently have, moderation. The moderation rules are just different than on other sites, but they had hundreds of humans moderating. Even the newer, arguably worse web forums out there are still policed internally, as you note.

It's a myth that there was zero moderation. The libertarian ideal was privacy and anonymity: a user could share, say, a link to The Pirate Bay and not worry about it too much if it got moderated. Websites would respect a user's privacy. Moderation was about cleaning up the site, not about handing a user's personal data to the authorities. It wasn't about free speech so much as about having a place on the internet with no real-world consequences for sharing, for example, a link to The Pirate Bay.

They all moderated.


China gives the proper answer - a national firewall.

If a host doesn't comply, it gets blocked by the firewall.


> I mean, zero moderation is some kind of internet libertarian ideal, there's plenty of examples out there what that looks like - 4chan

4chan has rules that are enforced, especially since the split between 4chan and 4channel. What might give the impression that it's "some kind of internet libertarian ideal" is that people often don't report posts. But when they do, the janitors usually do their jobs. These days even troll posts are deleted!


In principle I agree, but in this case I think Google is trying to protect itself from lawsuits, which seems like a reasonable thing to do from their perspective. Any search engine that gets big enough will eventually have a target on its back for DMCA trolls.


> I think Google is trying to protect itself from lawsuits

That is unlikely. Google breaks many privacy-related laws across the globe and regularly deals with billions of euros in fines. They have an effectively unlimited wallet and couldn't care less about spending a few billion to extend their dominance over the Internet.


That's a logical fallacy - just because Google lost money in the past doesn't mean they want to lose money in the future. We don't know what Google's utility function was in the past, and certainly can't extrapolate to this case.


Google search prints a ridiculous amount of money. I think the lawsuit costs do pale in comparison to keeping their dominance. If Google is doing this voluntarily, they must see an upside that helps maintain that dominance.


I'll concede that we can't know for sure. But given their vast legal spending and huge profit margins, I'd argue that whatever their strategy is, legal threats from BREIN are a drop in the ocean for them.


Is not indexing certain things the same as moderating search results?

Is default but optional moderation a bad thing?


Who should, in your opinion, be responsible for applying the moderation?


the user


If you take ranking/filtering criteria away from Google and the social networks, they lose a lot of power, but we gain a lot of freedom. They should be forced to open up the filtering part of the stack and allow more competition and user input in that space.



