Hacker News
Section 230: A Key Legal Shield for Facebook, Google Is About to Change (npr.org)
291 points by Mononokay on March 22, 2018 | 112 comments


This bill is a complete disaster. Sometimes the EFF can be a bit dramatic; this is not one of those times. It's every bit as bad as they say it is.

Here is the text: https://www.congress.gov/bill/115th-congress/house-bill/1865...

The key is 18 U.S.C. 1591. That statute creates criminal liability for those who use force or coercion to cause someone to engage in a commercial sex act, or anyone who "benefits, financially or by receiving anything of value, from participation in a venture" to do so. 18 U.S.C. 1591(a)(2). So far so good. You can get the guy who runs the sex trafficking business, not just the grunt who uses the force.

The bill amends 18 USC 1591(e) to state: "The term ‘participation in a venture’ means knowingly assisting, supporting, or facilitating a violation" of the foregoing provision.

One, this is grammatical lunacy. "Participation in a venture" means what you think it means: people who participate with intent to further the criminal enterprise. The bill amends section 1591 to define "participation in a venture" quite differently from its plain meaning. It's like a law that says: "as herein used, 'green' means green as well as blue and purple."

Two, what the heck does "knowingly facilitate" mean? Is it enough to know that, in the abstract, your service is used to facilitate sex trafficking? Because it's guaranteed that services like Gmail are used to facilitate sex trafficking. You couldn't say otherwise with a straight face. Or do you have to actually intend your service to facilitate sex trafficking (as Backpage arguably did)? If that was the intention, the drafters really screwed up, because "knowing" is a lower standard than "intent." You learn that in the first year of law school.


>Two, what the heck does "knowingly facilitate" mean? Is it enough to know that, in the abstract, your service is used to facilitate sex trafficking?

One would hope that the courts interpret this to mean that the defendant knew that a specific act (e.g., accepting a specific ad) would facilitate a violation but still performed this act. (Or maybe the courts could simply declare the amendment void for vagueness.) Unfortunately it will probably take expensive lawsuits to resolve this.

>Because it's guaranteed that services like Gmail are used to facilitate sex trafficking.

Since 18 USC 1591(e) isn't limited to online services, one can even go further and show that a "knowledge in the abstract" standard would catch even electricity providers.


> Two, what the heck does "knowingly facilitate" mean? Is it enough to know that, in the abstract, your service is used to facilitate sex trafficking?

That's exactly what it means: the explicit intent of the change (as many of the people pushing it have stated) is to compel online providers to police content. (And there has been noise made that if they don't adequately step up active policing in other areas, this approach will be extended to other kinds of content.)

It is very clearly a move from a safe harbor for user-submitted, not-actively-moderated content to a safe harbor only for user-submitted content where the host is actually completely ignorant.


Yeah, this seems like it might actually reverse platforms' incentives around stewardship of their content. Now that liability turns on whether they "knowingly facilitate," they'll probably stay on the safe side by not "knowing" in the first place.

Or is that explicitly disallowed by some other text in this statute or by some other law? For instance, if Google made it absolutely impossible (e.g., with strong crypto) for them to "know" what's inside their users' emails, is there some law that could force Google to undo that (ostensibly in the name of national security or something like that)?
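As a toy illustration of the parent's hypothetical (this is not real cryptography; a one-time pad stands in for a proper end-to-end scheme like the Signal protocol, and all names here are invented): if encryption happens on the client and the key never reaches the provider, the provider only ever stores ciphertext it cannot "know" anything about.

```python
import secrets

def client_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt on the user's device; the key never leaves it.
    A one-time pad (XOR with a random key) stands in for a real E2E scheme."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def client_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Invert the XOR, again on the user's device."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# The provider stores only the ciphertext. Without the key, every
# plaintext of the same length is equally likely, so server-side
# "content scanning" has nothing to work with.
message = b"meet at the usual place"
ciphertext, key = client_encrypt(message)
assert client_decrypt(ciphertext, key) == message
```

Whether a provider could be legally compelled to abandon such a design is exactly the open question above; the sketch only shows why, technically, the provider can't "know" the contents.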


This is an interesting point. The issue is as much with them knowing about it as with them not removing it. Implementing real privacy does provide an out here; business models that revolve around data are really the only thing that gets hit hard.

I feel like data platforms got themselves involved, and the platform-immunity excuse stopped holding, when they started doing their own content moderation to begin with. Once they decided to remove content they didn't like, were they not implicitly endorsing the content they left alone?


> Two, what the heck does "knowingly facilitate" mean?

Historically this has been rather clear.

Take the Silk Road. Arguably just an eBay clone on the dark web, right? Except the proprietor specifically marketed it for the drug trade. There was no question about its intended purpose: he created it to sell drugs, internal documents show he knew the drug trade was going on, and by all accounts he knowingly participated in the venture.

There was a small ISP somewhere in NY. Authorities notified them that they were hosting child porn provided by a customer. No biggie, right? Section 230 and all. Except they didn't take action to remove it, which put them in the camp of...knowingly participating in its possession/distribution. They got fined.

Backpage is in the same camp. They know what's going on, even contributing to it. They knowingly participate in the venture.


“Knowingly participate” is the existing language. I agree it’s clear. “Knowingly facilitate” is the new language. That has a much broader meaning, especially when you’re talking about Internet services, which are designed to “facilitate” whatever the user wants to do.


"Especially" Internet services, but the same argument would apply to offline services, like vehicle rentals or building roads. Presumably the law isn't intended to go that far.


I see your point, and I don't necessarily disagree, but for whatever facilitate does actually mean, you'd still have to be a knowing participant in the matter.

In this context, I can only speculate the language was added wrt the editing of specific ads. They weren't assisting in their publication, they weren't supporting their publication, but they did know what they were and made changes to...facilitate...those ads staying published and under the radar of authorities.

But I do see your point, and agree that it will likely be used for some unforeseen interpretation.


You keep saying “participant.” The law basically removes the word “participation” by defining it to mean things other than what everyone would consider “participation.”

You just have to be a knowing "facilitator." Toyota can't deny that it "knows" that its cars are used to "facilitate" drug deals. It's not a "participant" in those drug deals, but the whole point of this law is to get rid of the "participation" requirement.


Surely there is still a distinction between knowing that people use your product to do bad things & knowing that specific people are doing specific bad things and continuing to allow it.


The problem is that the cost to argue this legal distinction in a civil court could be in the $500,000 - $1,000,000 range. Fine for Facebook, deadly for your gaming or crypto forum provider.


Yeesh, I don't think you want to cite Silk Road as precedent. That investigation and prosecution is a black mark on the Law. The fact that multiple federal investigators also went to prison ought to be a clue for the ignorant, but there is far more wrong with Ulbricht's life sentence than just that...


The fact that multiple federal investigators also went to prison...

What? When did that happen? Please, could you share some link?


Some of the federal investigators stole some Bitcoins from the Silk Road over the course of their investigations and tried to cover it up and keep the Bitcoins for themselves.

https://arstechnica.com/tech-policy/2015/10/corrupt-silk-roa...

https://arstechnica.com/tech-policy/2015/06/secret-service-a...


Thanks, but the link on The Great Shaun Bridges is somewhat out of date:

https://motherboard.vice.com/en_us/article/vv7dgj/great-mome...

https://www.washingtontimes.com/news/2017/nov/8/shaun-bridge...

[EDIT:] Scroll to the bottom of the Motherboard article, click on this exemplar of LEOs' name, and read more. Apparently there may have been another LEO ridin' dirty on this investigation, only we don't know who. All we know is that she or he modified all the digital evidence that was later used to convict Ulbricht. This is law enforcement, people.


Well, your example clearly involves theft of private property, so why shouldn't they go to jail? It's bad enough that the police can seize cash arbitrarily. If it were legal to do so for the officers' own use, how would policing differ from racketeering?


How, indeed...


Thank you! I missed that news.


I don't know if pointing out a couple of true positives is a useful counterargument to the claim that a rule will have many false positives.

I could posit a law that says all brown-haired people should be summarily executed. You could then correctly note that the law would have caught the Green River Killer, John Wayne Gacy, and Ted Bundy. That doesn't make it a good law.


Silk Road would still be operating if they'd had adequate OPSEC. Same for other dark markets that police have busted. Operators made stupid mistakes.


What's a tangible example of someone "knowingly..facilitating a violation" of laws punishing "those who use force or coercion to cause someone to engage in a commercial sex act" that you would consider excessive?


> Is it enough to know that, in the abstract, your service is used to facilitate sex trafficking? Because its guaranteed that services like gmail are used to facilitate sex trafficking.

I think that quote from the GP directly addresses your question.


If I were a state AG, I’d bring a lawsuit based on the parens patriae provision. In the parens patriae context, where you’re suing on behalf of the public generally, the fact that the provider may not know about any specific instance of sex trafficking doesn’t help much. The argument is that the service is facilitating sex trafficking that affects residents of the state, and the service knows either that’s happening (in general) or is being willfully blind by not policing content.

I think that’s excessive; I don’t think services should have to moderate or police user-generated content.


> Two, what the heck does "knowingly facilitate" mean?

In law it is not about what you know, but about what you can prove. A person's belief system or fantastic truths are not evidence of a crime.

In your GMail example you have a reasonable faith that people may use the service to engage in communications in facilitation of a criminal enterprise. Your faith is irrelevant. You must have the email communications in hand along with a valid search warrant.

That isn't what most people are losing their minds over. People are losing their damn minds, according to wonderfully mature comments on the earlier news article at ArsTechnica, on two points:

1. This law is hostile to any legally valid sex business.

2. This law holds web site owners responsible for the content that appears on their site.

My opinion on those two:

1. Sex businesses, whether legal or not, are generally considered a social vice, much like alcohol and tobacco. There are all manner of legal restrictions upon commercial communications generally, and particularly for services and products considered vices. They are heavily restricted in where they can advertise, and any media outlet that violates such laws can be held responsible. This has been the case for decades, so it is only fitting that it should apply to sex businesses, which are slowly creeping toward wider legalization.

2. Web publishers get all kinds of legal passes that publishers in other media don't get, under the guise of third-party attribution. Web site owners should be liable for the content that appears on their site. This has been somewhat true from a civil perspective ever since Viacom slapped the shit out of YouTube in a multi-billion-dollar lawsuit[1], but it hasn't been criminally true. This law cracks open the door to potential criminal violations, which is cause for panic among businesses that publish user contributions in order to monetize third-party advertising.

Speaking of web site owners being held liable for the content on their sites, this reminds me of SOPA[2]. I know this made a lot of people cry (with really big tears), but it is far more upsetting to social media companies. It isn't because of potential free speech restrictions or unnecessary regulation. It upset social media businesses because it would have crushed their business model.

[1] https://en.wikipedia.org/wiki/Viacom_International_Inc._v._Y....

[2] https://en.wikipedia.org/wiki/Stop_Online_Piracy_Act


This law is not good for the internet. This isn't about Google and Facebook, it's about the open internet as we know it.

And, it's high time we recognize these services to be "infrastructure". You don't prosecute the street owner for the vehicles that drive on it; if you could, the state would be liable for every crime involving a vehicle.


I disagree here. Child porn and trafficking are always the arguments used to go after free speech on the internet in the court of public opinion.

If we narrowly edit the law to specifically go after those cases, then it creates fewer opportunities for scandals later that end in much broader restrictions.

Also -- although it should go without saying, those things are actually bad! So if an unintended interpretation is making them easier, clarifying (with as little collateral damage as possible) that is congress' job.


Except that's not what the bill does. The bill starts from an existing law, 18 USC 1591, which provides a narrow scope of liability for "participation in a venture" to perform sex trafficking, i.e. running a sex trafficking business. It redefines "participation in a venture" to include "knowingly ... facilitating" sex trafficking, and then carves 18 USC 1591 liability out of the section 230 safe harbor.

This goes far beyond the backpage situation, which could've been reached with a higher intent standard like "intent to facilitate" sex trafficking.

If Google knows that sex traffickers are using gmail to coordinate their sex-trafficking activities, under the plain text of the statute, they'd be liable for "knowingly facilitating" sex trafficking. This will be abused by prosecutors and plaintiffs' attorneys.


Wouldn't it have to be a specific instance?

Also, what's the definition of "know"?

Someone somewhere in the organization knew?

You have reasonable processes in place to catch activity?

How does slang, metaphor, and euphemism play into it?


Yes and no. "Knowing" encompasses "willful blindness," where you have warning signs but deliberately avoid investigating.

If I'm suing Google under the new 18 USC 2421A or the new 18 USC 1595(d), created by this bill, I'm arguing that "Google has tons of warning signs that its services are used by sex traffickers; even if it didn't know about the specific instance of sex trafficking I'm suing over, it was being willfully blind by not having a system in place to identify sex trafficking."
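To make "a system in place to identify" concrete, here is a toy sketch of the kind of first-pass screening a plaintiff might argue was required. It is purely hypothetical: real trust-and-safety pipelines combine ML classifiers with human review, and the term list and function names here are invented.

```python
# Hypothetical first-pass content screen: flag posts containing terms
# associated with trafficking ads and queue them for human review.
# The term list is illustrative only.
SUSPICIOUS_TERMS = {"new in town", "young", "discreet full service"}

def flag_for_review(posts: list[str]) -> list[str]:
    """Return posts containing any suspicious term (case-insensitive)."""
    flagged = []
    for post in posts:
        lowered = post.lower()
        if any(term in lowered for term in SUSPICIOUS_TERMS):
            flagged.append(post)
    return flagged

posts = ["selling a used couch, $50", "New in town, discreet full service"]
assert flag_for_review(posts) == [posts[1]]
```

The legal question in this subthread is whether failing to run even this kind of screen amounts to willful blindness; nothing in the bill's text specifies what such a system must look like.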


I'm really not trying to be facetious,

> If Google knows that sex traffickers are using gmail to coordinate their sex-trafficking activities, they'd be liable for "knowingly facilitating" sex trafficking

Is that not...good? If Google is reading my emails to sell me stuff (and I've given them permission to do so by using their service), and sees (reasonable) evidence I'm engaged in illegal activities, don't they have a responsibility to report that? Is that not the same thing as a GeekSquad technician stumbling across child porn on a laptop they're fixing?

Further, if Google doesn't want to knowingly facilitate, then they shouldn't know -- give me the ability to e2e encrypt my messages and stop reading my emails. Haven't they opened Pandora's box here by going that extra step further and data mining my email?


> Is that not...good? If Google is reading my emails to sell me stuff (and I've given them permission to do so by using their service), and sees (reasonable) evidence I'm engaged in illegal activities, don't they have a responsibility to report that? Is that not the same thing as a GeekSquad technician stumbling across child porn on a laptop they're fixing?

No, it's not good.

Any internet service provider that reaches mass adoption faces a dilemma:

- Face legal problems

- Become another arm for law enforcement (and maybe even bypass the 4th Amendment)

The prison industrial complex grows fatter.


The government & corporate prison complex has been declining for about a decade now. It has been rolled back about 20 years. It's not growing fatter:

https://i.imgur.com/ppyHKC4.jpg

By the time pot legalization gets to over half the states (in the next 10-15 years), we should have the incarceration machine rolled back to the mid 1980s, just six or seven years after the mass incarceration boom took off.


Do you really want every brother and their mother with your data to have the ability to ruin your life like that?


Maybe I'm not being clear...

In the absence of clear, well-considered privacy laws (which I'm not the least bit hopeful for, particularly internationally consistent..), Google will have access to this data. As a randomly chosen user, there's probably nothing I can do about this (assumption being the average random user can't figure out how to enable e2e encryption). So Google will be data mining my data, if they choose to. And note I'm only choosing Google here because the user I replied to used Gmail as an example, I don't think Google is particularly unique or special in regards to using their user's data for profit.

Under those assumptions, if Google finds something illegal, they should say something, right?

There are a number of "outs" here. Privacy laws, Google giving me the ability to opt out/encrypt my data, ...

Do I want this? Hard no, but that's really not my point.


GeekSquad actively searches for it, so they would probably be in the clear under this law.

Facebook knows many drug purchases are coordinated openly in college groups, even though its intent is not to facilitate them. By not searching them out and reporting them, it would be liable once this law is naturally expanded upon.


> Facebook knows many drug purchases are coordinated openly on college groups

How does Facebook know this, if they still have to search them out?

> ...once this law is naturally expanded upon.

Yeah, I can see that. Slippery slope and all that. On the other hand, I feel like e.g. Facebook needs to have an incentive to police their community. I, at least, don't see a clear path to prevent Cambridge Analytica-style abuses, not through legislative changes, not in this political environment...


Reading the bill, it seems two- (or maybe 2.5-) pronged: it seeks to restrict prostitution, and it seeks to restrict "sex trafficking," a term which is undefined and appears three times in the text, one of those times with the modifier "child" preceding it. This seems deliberately sloppy, designed to give prosecutors total discretion to do anything they want. We have far too many laws like that already. Most of the time, new laws targeted at prostitution end up hurting individual prostitutes more than prostitution itself. The innovation here is to hurt, e.g., hotels that have websites more than prostitution itself.

IANAL. However, I've seen "unintended" consequences often enough not to believe Hanlon. Also, it's far from clear that the best way of dealing with prostitution in this age is by prohibiting it.


As a reduction to absurdity, but also a very real threat given prosecutorial overreach:

Does your comment rationally questioning anti-prostitution laws “promote prostitution” under the words of the new bill? Does mine? Is any YC staff that sees this also complicit if they don’t remove our comments immediately?

Is a law unconstitutional if the threat of enforcement suppresses protected speech, namely policy debate?


"Promotion" would likely be interpreted in the sense of commercial advertising. A law that reached advocacy of prostitution in the abstract would be clearly unconstitutional under the standard set by Brandenburg v. Ohio, 395 U.S. 444 (1969). Only speech that proposes a specific illegal act is excluded from First Amendment protection. See Pittsburgh Press Co. v. Pittsburgh Commission on Human Relations, 413 U.S. 376 (1973), and United States v. Williams, 553 U.S. 285 (2008).


Good grief, Mr. Williams was so dumb. Still, any ruling that leans so heavily on Free Speech Coalition probably isn't terrible.


Do you often hear of cases of counterfeit money? Almost never, because the punishment is very high and it is treated as a very severe crime. IMHO, child porn is only a scapegoat for freedom-destroying laws. If the state wanted to fight child porn, it could eradicate it without changing any law. Even if child porn is a serious issue, I am convinced it is completely negligible compared to the number of children raped by their parents. Even stars like Woody Allen have been implicated.

When I read that we should fight child porn, it means that we should give up our freedom and our democracy. Choose your fight.


> Do you often hear of cases of counterfeit money? Almost never, because the punishment is very high and it is treated as a very severe crime.

You have drawn the wrong conclusion.

There is less counterfeit crime because it's harder to pull off, not because the laws are tougher.

Counterfeiting money is a far more complex crime, and it only benefits one side, the suppliers. No business, legitimate or otherwise, wants to deal in fake currency.

On the other hand, children are highly vulnerable and easy to victimize. The suppliers and the buyers are both incentivized to perpetuate this crime.


> Do you often hear of cases of counterfeit money?

Unrelated tangent, but not only do I often hear of cases of counterfeit money, I had friends who would counterfeit it in high school with just a little image editing and a printer. This small-scale type of counterfeiting is actually by far the most common type. It won't work against the newer plastic-y bills in my country, though.


> Even if child porn is a serious issue, I am convinced it is completely negligible compared to the number of children raped by their parents.

This problem is worse than that though. Children abused by parents/caretakers are victims of circumstance. Problems with foster care aside, moving the child out of that environment should stop the abuse.

The problem here is that of commerce. There is an endless stream of Uncle Lesters in the queue. The child is a victim for as long as they have value on the market, and then they're found dead in a ditch somewhere.

> When I read that we should fight child porn, it means that we should give up our freedom and our democracy. Choose your fight.

You're going to have a hard time finding popular support for the notion that freedom protects your right to sexually exploit underage girls.


>freedom protects your right to sexually exploit underage girls

That's not what he said or implied at all


Commercial child porn is extremely rare. In 2010, for example, not a single federal distribution case in the United States involved commercial activity. The majority of prosecutions are against people downloading it off peer to peer networks for free; the rest are against people trading it for more child porn.

https://injusticetoday.com/new-doj-report-demonstrates-stunn...


This misframes the argument GP was making IMO. That those are reasons to go after free speech on the internet has nothing to do with who they are going after. Everyone agrees that they should go after the perpetrators, just not everyone agrees who the perpetrators are. The street analogy GP uses is apt if you assume these internet services are just infrastructure. If you do not and assume they are like newspapers or storefronts, you might think differently.

> If we narrowly edit the law to specifically go after those cases, then it creates less opportunities for scandal later that end in much broader restriction.

We already have law to go after these cases, just not the conduits. We have to keep our eye on who the real criminals are, go after them, and we already have the laws to do it. Unfortunately the emotional appeal will be strong enough to go after others (backpage being a bad example because they may have been actually complicit). And we know these grasps for more power/authority rarely are kept at their minimum state or reduced. Want to actually avoid "much broader restriction"? Stop it where it starts.


A government will always expand its power. That is why you give it as little as possible and let the people tackle the social issues.


What are the legitimate functions of government?

Possibly relatedly (depending on whether you think it's a legitimate function of government to be the answer to this question), what sort of social issues may the people tackle, and with what sort of power, and how is abusive vigilante justice prevented? Should Reddit refuse to allow people to arrange for sales of alcohol using their site as a communication platform, because underage people might buy beer? (This is a new policy from approximately yesterday: https://redd.it/863xcj ) Should the CEO of Cloudflare, bending to public pressure, refuse to serve Nazis on ideological grounds? Should a religious person with a genuine belief that abortion is murder be allowed to use deadly force to stop an abortion clinic? Should the Black Panthers be allowed to bear arms in public whenever Oakland police are in black neighborhoods, in order to dissuade the government from abusing its power? (Ronald "nine most terrifying words" Reagan signed the act that made this illegal.)


> Should Reddit refuse to allow people to arrange for sales of alcohol using their site as a communication platform, because underage people might buy beer?

To be fair, they're doing that in reaction to Sec 230.


Sure. But that's ultimately because the government has defined that facilitating beer sales to people under 21 is illegal.

If the government decided to stay out of drug regulation entirely, would it be reasonable for companies like Reddit to decide on their own that people under some age should not be able to purchase beer online, set that age, etc.? (Presumably, because there is not overwhelming public support to repeal those laws, society at large believes that young people should be prevented, passive voice, from buying beer - if we move enforcement of that social problem to the people, how should that get implemented?)


The people for the most part either don't want to or can't deal with social issues themselves. They want someone else to do it for them. So then the people put pressure on social networks instead, and those networks do whatever they're going to do to police their network.

I guess that's better? In theory at least, people can switch social networks. To some extent you'll get the level of regulation you prefer.

Even that's a bit optimistic, since a lot of people aren't technically competent and just do whatever their friends recommend.


That's the trap -- the problem is that to comply with the law with respect to these things, you need to provide a facility for interception and inspection that can trivially impact all things.

I'll give you an example where I've seen this impact. Due to an unrelated event, a past employer retained backup tapes which included certain information going back many years, information which normally would have been lost in the normal course of business. When a litigation event took place, they were forced to restore that inadvertently preserved data.


The difference is that in this case, the street owner was explicitly notified of (made aware of) the fact that Immortan Joe was using his road to rape and plunder, and the street owner responded by providing cover for these activities instead of stopping them.

Backpage (allegedly...) played an active role in conducting sex trafficking by posting its own ads and modifying third-party ads for child prostitutes to make them less obvious to authorities. Like Silk Road, there is no deniability: they know damn well what business they're in, so they don't get to play the "b-b-but we were just running a website!" defense.


Sure, and now Backpage is being prosecuted under current law, which is how the justice system should work. Do we really need another law to tighten the stranglehold on the entire internet, when current laws were apparently already enough to prosecute Backpage?


I totally agree; I don't understand why they've had so much trouble applying existing law to this problem that we need new ones now.


Because governments will always do whatever they can to expand their powers at the expense of the citizens they are supposed to serve.


Snapchat should be on notice. They're in the same business.

https://www.huffingtonpost.com/eric-yaverbaum/is-snapchat-pe...


And what they were doing was illegal under existing laws- no need for new ones- unless Congress is just using Backpage as a smokescreen to hide their real motives.


>And, it's high time we recognize these services to be "infrastructure". You don't prosecute the street owner for vehicles that drive on it - if you could, the state would be liable for all crimes involving a vehicle

Sure, but a public street is state-owned. A public street agency doesn't profit from trying to convince the right kind of people to drive on it with the right kind of cars so it can market things to them or charge tolls.

I oppose this law for the same reason as you, but it would be better to just admit that vital network infrastructure of the common-carrier kind shouldn't be reliant on feeding eyeballs to advertisers and manipulating users.


Services that decide to implement their own, private, content based restrictions on speech have already ceded their primary and strongest argument to be considered "infrastructure".


Agreed. Hacker News is definitely not a form of infrastructure, while GMail clearly is (even though it technically does have some content filtering, it's obviously necessary to make the service usable).

But there are gray-areas between websites with a full-blown CoC and apps that are deliberately compared to the Postal Service. Where do you draw the line?



A webpage or app is not infrastructure, even if the underlying network is.


> This law is not good for the internet.

The law is good for people.

“The internet” is not a member of society. It is “infrastructure”. If the law allows roads to be built in a way that systematically hurt people, you change the law.

Well done US Congress!


The Internet can be colloquially interpreted to mean the community of Internet users. This isn't necessarily the same as society in general, and for the purpose of argument it is useful to distinguish the community that is specifically enabled by the Internet.

To extend your road analogy, lawmakers can regulate roads in ways that uniquely harm the motoring community, in the name of protecting society at large.


It's already been made clear that this law will hurt people. Also, the current law already allowed people to fight the abuses this law claims to address. This will do far more damage.


>This law is not good for the internet.

It's not bad or good. It's new rules, and people will figure out how to play by them. If hosts become more liable for what people say, maybe we all go back to self hosting. In my opinion, that would be a net positive, if you can excuse the pun. Right now, the rules allow corporations to control speech. I think less of that is a good idea.


It isn't a popular statement, but I'm super excited about this, and if anything, it didn't go far enough. Section 230 needs to be removed in its entirety. As Matt Stoller from the Open Markets Institute put it last night, this is the first time that big tech has ever really lost in Congress.

One example that stuck out to me recently was Google's participation in the drug rehab scam market. The Verge called them out on it repeatedly, and Google finally stopped doing it. https://www.theverge.com/2017/9/7/16257412/rehabs-near-me-go...

As noted in the article, Google was making as much as $230 per click. While they knew a lot of the ads they were selling were scams, the market was extremely lucrative for them, they were bringing in hundreds of millions of dollars with it. They even had marketers specifically catering to building the rehab ad market.

Thanks to Section 230, Google can't be held responsible for scams that they knowingly continued to support and fostered the growth of... and made hundreds of millions of dollars from. Even though Google's pulled the drug rehab ads now, all that profit is still theirs to keep, even though it should really go to some sort of victim's compensation fund.

Tech companies have shown repeatedly they can't be trusted with blanket immunity from prosecution, and it's time to take it away.


Should gun manufacturers be liable for shootings? Car manufacturers for hit and runs? What about the USPS for when people ship drugs or other illegal contraband?

This seems like an awfully slippery slope where companies are not only responsible for their own actions, but also for the actions of their customers.


If gun owners are actively advertising to people they perceive to be likely mass shooters: Yes.

There's just so much here. They actively encouraged the practice, they made hundreds of millions of dollars on it, and the perverse incentive of a bidding war caused by all these scammers means that, despite knowing about the problem, Google had no desire to fix it.


> If gun owners are actively advertising to people they perceive to be likely mass shooters

That's an extraordinarily absurd setup legally. How do you perceive someone to be a likely mass shooter exactly? What the hell is likely? Perceive what? Your entire concept is legally broken top to bottom, it would never stand up to a challenge.

And how do you define that they're specifically advertising to that person?

So: first, you have to magically perceive that someone is "likely" to be a mass shooter. Then the company has to be caught having identified someone as a likely mass shooter. Then the company has to intentionally advertise to them. Then it has to be shown that they intentionally advertised to them. Beyond being a non-functioning legal premise, in the best case scenario you just narrowed the risk for the gun maker down to zero.


That is my point: the parent of my comment was not making an example really comparable to the one I gave of Google's actions, and of how Section 230 absolves them of responsibility they truly should share in.


Your analogy is not quite right. Google in this example isn't the gun manufacturer, they are simply the building owner that somebody else is selling guns out of.

The GP wants that building owner to be liable for the actions of the person that their tenant sells to.


Here's the bill: https://www.congress.gov/bill/115th-congress/house-bill/1865

The current version seems to be narrowly written to only apply to sex trafficking.


It's narrow in the sense that it only applies to sex trafficking. It's not narrow at all in terms of the relationship between your online service and sex trafficking. From your link:

"(Sec. 5) The bill amends the federal criminal code to define a phrase related to the prohibition on sex trafficking. Currently, it is a crime to knowingly benefit from participation in a venture that engages in sex trafficking. This bill defines "participation in a venture" to mean knowingly assisting, supporting, or facilitating a sex trafficking violation."

Being able to use Gmail and Google Maps to conduct sex trafficking activities arguably "facilitates" sex trafficking.


You're leaving out "knowingly," which is an operative word there.

If Google puts out a product offering called "Gmail for Pimps" or a maps overlay called "Where My Bitches At?" then they might be complicit in something. Until then, they are afforded deniability.


I'm not leaving out "knowingly." Google "knows" that its services are used to do all sorts of unsavory things. You have to assume that Gmail and Gchat have been used to arrange murders.

Your examples cross the line from "knowing" to "intent." They show not only knowledge that products are used for illegal activities, but intention to encourage and profit from that illegal activity. Releasing "Gmail for Pimps" would suggest specific intent to profit from prostitution, rather than just "knowing" that Gmail is being used for prostitution.

"Knowing" and "intent" are terms of art in criminal law, and "knowing" is well understood as being a lower standard than "intent." The law uses "knowing" on purpose.


That is a much broader reading than is used elsewhere. In general, specific knowledge is required.

> The word “knowingly” in law means consciously or with knowledge or complete understanding of the facts or circumstances.

https://definitions.uslegal.com/k/knowingly/


No, if someone else uses Google's tools to create a product offering called "Gmail for Pimps", Google is liable.


and sex work generally. “Sex trafficking” is used as an excuse for a broader crackdown, forcing sex workers further underground and into more dangerous situations.


The bill seems at once too specific and too broad. Too specific in that it targets sex trafficking and prostitution narrowly, but too broad in that if you run the site and someone uses it for that, it doesn't seem to matter whether you had any idea it was happening or not.

In other words, it looks like some politicians saw a topic they could exploit for political points and we're yet again stuck with crappy legislation because they couldn't be bothered to do the real part of their job, which is examine the topic in depth and make considered, useful legislation.


As far as the Feds (and some states) are concerned, all prostitution is trafficking. Did you pay for a prostitute's Uber home after your hook-up? Congratulations, you've engaged in "human trafficking". Enjoy your long prison sentence.


If it passes, it'll be inevitable for its scope to expand beyond sex trafficking - little by little.


Do you have evidence that something like that has happened before with such a specifically worded bill?


I still remember the UK using the anti-terrorism act to seize funds from Icelandic banks in 2008. Iceland doesn't strike me as a terrorist nation. If laws can be abused they will be abused.


I fully expect various hacker groups to have a field day with this after it goes into effect. It will be a whole new world of swatting. Find someone you don't like online, crack their website, place offensive material on it, call the internet police.

Worse yet, government agencies just gained a new tool to take down and ruin any individual technologist or technology company they don't happen to like, even more easily than before.

The competence of politicians has come up as a sustained talking point for the first time I can remember. And I think there's plenty of room to talk about someone besides Trump in that light. This bill is written so unbelievably poorly that no one knows what it means, at least not the people who wrote it and support it. Everyone pushing for it has a different take on what the standard of knowledge means and what the intent of the bill is.

This bill is either intended to be misused in the worst possible ways (in which case, the people pushing it are bad actors and lying about what they want it to accomplish) or they are utterly incapable of writing effective legislation--they are incompetent.

I'm going to laugh my ass off if these corrupt politicians start using the law against each other and hiring people to fuck up rival's websites and tie them up in criminal proceedings.

I certainly wouldn't shed any tears if Paul Ryan suddenly found himself in front of a jury arguing that he didn't knowingly facilitate child porn on his congressional website just because it was sitting there unnoticed for 6 months.


My understanding is that documented evidence exists that Backpage.com directly intervened and edited “user postings” which were flagged as potentially underage, to make them less obviously illegal. And that their direct knowledge and moderation of the postings makes the Section 230 liability moot.

So why is someone like NPR which certainly should know better raising the specter of Section 230 protecting abetters of child sex trafficking when in fact it does not?

I expect a much more reasoned argument from “unbiased media sources” for a proposal to change a law which is universally regarded as having ushered in the modern internet.

In the case of Backpage.com, if anything it was allowed to exist all the better to be able to track and analyze the data in the open rather than have it move to a modern dark-web platform which would be much more difficult to track.

Since all technology can be used for good or evil in equal proportion, it’s clear the best technology can also be used for the worst evil. The most private, distributed, and secure social network would be at once a boon for law abiding and law breaking citizens alike. You can’t make the tool work for one and not the other, without compromise. The legal standard for years has been if you can demonstrate sufficient legal use of a tool and that you haven’t intentionally solicited illegal uses then it passes muster. I am greatly concerned by any legislation which shifts the balance because great tech will always be used by bad actors.

I.e. Since the Bitcoin blockchain can be used to pass messages between parties anonymously then anyone running a node on the network could be supporting anonymous message passing for any purpose, obviously including underage prostitution. That doesn’t mean you shut down Bitcoin.
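To make the "passing messages via the blockchain" point concrete, here's a minimal Python sketch of how arbitrary data can ride along in a Bitcoin transaction via an OP_RETURN output. The function names (`embed_message`, `extract_message`) are my own illustration, not a real library's API; only the opcode value (0x6a) and the single-byte pushdata encoding are from the actual Bitcoin script format.

```python
# OP_RETURN (opcode 0x6a) marks a transaction output as provably
# unspendable; any bytes pushed after it are pure payload that every
# full node downloads and stores along with the rest of the chain.
OP_RETURN = 0x6a

def embed_message(msg: bytes) -> bytes:
    """Build a minimal OP_RETURN scriptPubKey carrying `msg`.
    Restricted to 75 bytes so a single-byte push length suffices."""
    if len(msg) > 75:
        raise ValueError("direct pushdata is limited to 75 bytes in this sketch")
    return bytes([OP_RETURN, len(msg)]) + msg

def extract_message(script: bytes) -> bytes:
    """Recover the payload from a script built by embed_message."""
    if script[0] != OP_RETURN:
        raise ValueError("not an OP_RETURN script")
    length = script[1]
    return script[2:2 + length]

script = embed_message(b"hello, anonymous world")
assert extract_message(script) == b"hello, anonymous world"
```

The point is that once such an output is confirmed, the payload is replicated to every node on the network, which is exactly why the researchers quoted downthread argue that objectionable content, once embedded, is possessed by all participants.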

The government would love to make it illegal for any technology to exist which doesn’t provide them the means to inspect it and control it. But providing those same means would deeply compromise any attempt to implement a truly private and secure distributed network. I worry that the 1st amendment could be compromised to allow for such a law.


It's not just a shield for Facebook and Google; it's a shield for every website and app out there, and it's the greatest facilitator of free speech online.

Not only did the headline conveniently omit that fact, but the whole piece was edited to make it seem like only zealots and anarchists have any interest in preserving Section 230.

Senator Wyden, who voted against SESTA, was trying to convey that this change is unwise and short-sighted and will actually hurt efforts to combat human trafficking, yet only his criticism of tech firms survived the edit.

It's a great irony that the press keeps calling for limits on speech online perhaps hoping that when everyone else is silenced theirs will be the only voice heard.


>Congress should revisit the law, he says, and "make the statute longer and make it crystal clear."

>Cox draws this distinction of websites like Backpage — involved or connected with their content — and sites that are "pure intermediaries." He wouldn't say whether that term applied to Facebook or Google

How can you write both sentences with a straight face? "I think the law should be more transparent but I won't tell you what it will mean"


This bill worries me greatly. IANAL, but could one interpretation allow the US government to go after anyone running technology supporting the Bitcoin blockchain, if BTC is used by hackers, drug dealers, etc., along with the many legitimate uses?


Yes.


How will Section 230 changes affect Bitcoin’s blockchain?

https://gizmodo.com/child-pornography-that-researchers-found...

"The researchers write, “our analysis shows that certain content, e.g., illegal pornography, can render the mere possession of a blockchain illegal.” ... As the researchers note, “Since all blockchain data is downloaded and persistently stored by users, they are liable for any objectionable content added to the blockchain by others,” which is likely true under many countries’ laws. The researchers continue, “Consequently, it would be illegal to participate in a blockchain-based system as soon as it contains illegal content.”


It's weird how some bits of data can be made illegal. I understand that having child pornography on one's computer should be forbidden because it encourages its distribution, but it's clear that the concept of data is difficult to define.

Intellectual property laws are already difficult to enforce, but the whole concept of forbidden data because it's illegitimate shows that technology is really creating problems for the law.


The true intent is to make it usable against political enemies.


Yeah the only real way to avoid this charge is to never possess a computing device.


Rana Foroohar has been writing good commentary on this topic and big tech's broader problems in the Financial Times.

From December last year, 'Why Big Tech wants to keep the net neutral' is good at explaining the tensions between net neutrality and Section 230:

https://www.ft.com/content/a06bedd2-e1ae-11e7-8f9f-de1c2175f...


Makes me think of my days as a landlord, when the government started making laws to prosecute landlords for drug dealing going on in their properties. Like those failed laws, this one just doesn't pass the smell test. Even Google, with their awesome AI, can't police their platform. And why should they? We pay the police to be the police.


When I was a sysadmin, I discovered several incidents of employees saving and sharing child pornography. Luckily I caught them thanks to a virus embedded in the content. Note: CAN'T UNSEE!! I can't imagine the guilt if I had not reported them to the authorities and supplied evidence to the police. Honestly, I'd have felt like I "knowingly facilitated" their actions, since I managed the corporate network. Websites like Backpage know what they're doing and should fucking burn.


The companies the least at risk from the Section 230 changes are the ones that need the most government oversight. I applaud the excuse for amending 230, but both the real reasons and the actual amendment themselves are no good at all.


Don't kid yourself: it's the small guy just starting out who's screwed, not the global giants with armies of lawyers.

In fact, I wonder if HN is "facilitating" with its current moderation policies.


It looks like this is why Craigslist deleted their personals section. I wonder if some dating sites are going to go down completely?


Why is the state outsourcing the job it is paid taxes to do, and expecting the private sector to do it for free?


I wonder if security vulnerabilities will be found in the resultant eventual legally-required filter software…


This is a pretty awkward headline. For a moment, I thought Google was "A Key Legal Shield for Facebook".


It's a newspaper-style headline. I'm not really sure why NPR decided to title it that way on the web, though.


a,b in news headlines often means "a and b"... this seems to be a vestige of print news, where headline space is limited.


I know this would never happen, but if google really wanted to fight this they could shut down gmail for everyone indefinitely while they scanned all the email for evidence of human trafficking. Maybe someone can think of a solution along these lines that would similarly grab public attention but be more realistically something google bosses might sign off on.


Why do they mention Google in the title? Google doesn't rely on user content. It's sites like Reddit that will take a big hit. As for Facebook, you can always report a user.


YouTube, a Google subsidiary, very much relies on content and has been saved many times by both Section 230 (against defamation claims) and the DMCA's safe harbor provisions (against copyright infringement contributory liability claims).

Note: while I worked for Google in the past, I had no involvement with anything I mention in this comment and am not speaking for them here.


Google's core business is advertising, and advertising is likely also considered 'content' in this context.



