> This has been my understanding for many years. And so it's ironic how increased moderation to increase marketability brings immunity of service providers under 230 into question.

Especially since allowing providers to increase (automated and best-effort, but not comprehensive human) moderation without incurring general liability for content was the explicit and overt justification for Section 230: the kind of complete human editorial control expected of publishers in traditional media was viewed as incompatible with scalable systems on the internet (whereas traditional media had other scaling limits that made content liability far from the limiting factor).

> But instead, we could have had decentralized systems that made liability impossible.

Decentralized systems probably wouldn't have made liability impossible, just ineffective at its objectives: no matter how many operators were ruined by liability, the content would still thrive, and you'd never reach enough operators.

Yes, decentralization wouldn't be enough. I'd want anonymous decentralization.


Do you think many users would be willing to adjust to the complexity and unpredictability of that environment, compared to what they experience now, just because it's arguably far better in terms of freedom and power?


Sure, I think. You just default client apps to all mainstream filters. So new users see nothing alarming. At least, until they start tweaking.
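The "default filters on, tweak later" idea above can be sketched in a few lines. This is a minimal illustration, not any real protocol or client: the filter names, the `Client` class, and the keyword-based rules are all hypothetical stand-ins for whatever mainstream filter lists a real app would ship with.

```python
# Hypothetical sketch: client apps ship with every mainstream
# filter enabled by default; users can opt out per filter.

# Stand-in "mainstream" filters: each maps a name to a predicate
# that returns True when a post should be hidden.
MAINSTREAM_FILTERS = {
    "spam": lambda post: "buy now" in post.lower(),
    "nsfw": lambda post: post.lower().startswith("[nsfw]"),
}

class Client:
    def __init__(self):
        # New users start with all mainstream filters enabled,
        # so the default feed looks unalarming.
        self.enabled = set(MAINSTREAM_FILTERS)

    def tweak(self, name, on):
        # Power users can toggle any individual filter.
        (self.enabled.add if on else self.enabled.discard)(name)

    def visible(self, posts):
        # A post is shown only if no enabled filter flags it.
        return [
            p for p in posts
            if not any(MAINSTREAM_FILTERS[f](p) for f in self.enabled)
        ]

posts = ["hello world", "BUY NOW cheap pills", "[nsfw] art dump"]
c = Client()
default_view = c.visible(posts)   # everything filtered by default
c.tweak("nsfw", on=False)         # user opts out of one filter
tweaked_view = c.visible(posts)   # that content reappears
```

The point of the design is that the burden of seeing unfiltered content is opt-in: the defaults do the moderation work, and no central operator has to.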
