Hacker News

> how do we avoid breaking the web while keeping privacy needs in balance?

One thing that I feel would help: massive decentralization. Self-hosting of content, with regular syncing of the hosted copies on the server side; or tunneling, think DuckDuckGo.

Self-hosting would make knowledge storage more redundant, which protects against network blackouts, even partial ones.

On the other hand, this knowledge is then harder to control. Removing or redacting content would have to rely on the individual sites "pulling the updates" from upstream. Copyright will also be problematic: each site would have to make copies of content with the (probably commercial) intent of serving it to consumers.
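The "pull the updates" model is essentially what mirror networks already do. A minimal sketch, assuming plain rsync and using local directories as stand-ins for a remote upstream host (all paths here are hypothetical):

```shell
# Local stand-ins for a remote upstream and a self-hosted mirror.
UPSTREAM=/tmp/upstream-demo
MIRROR=/tmp/mirror-demo

mkdir -p "$UPSTREAM" "$MIRROR"
echo "v2: corrected article" > "$UPSTREAM/article.html"

# -a preserves metadata; --delete propagates removals, so a redaction
# or takedown upstream reaches every mirror that keeps syncing.
rsync -a --delete "$UPSTREAM/" "$MIRROR/"
```

Run on a schedule (e.g. cron), each mirror converges to upstream; note it is exactly the `--delete` flag that lets removals propagate, and a mirror that stops syncing keeps the old copy.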

Still, I can't help thinking we need more decentralization and self-hosting.



Sounds good, but a solution like PeerTube would leak the user's IP to even more (and unknown) third parties.


People don't use third parties just because it's fun. They do it because it reduces cost. If we have to build every service from the ground up as first party, is that good economically? Does every website need to build its own CDN now?


So your solution is to introduce a massive amount of redundancy so Google doesn't know you downloaded a font.
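For the font case specifically, the redundancy is small: vendor the file once at build time and serve it first-party, so no per-visitor request ever goes to Google. A hedged sketch (the fetch URL is a placeholder, not a real endpoint, and the paths are assumptions):

```shell
# Vendor the font at build time instead of hot-linking a third party.
mkdir -p static/fonts
# Hypothetical one-time fetch during the build, commented out here:
# curl -sSfL "https://fonts.example/inter.woff2" -o static/fonts/inter.woff2

# First-party CSS referencing the vendored file.
cat > static/fonts.css <<'EOF'
@font-face {
  font-family: "Inter";
  src: url("/fonts/inter.woff2") format("woff2");
  font-display: swap;
}
EOF
```

The cost is a one-time download and a few kilobytes of storage, not a CDN build-out.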



