
One could even argue that this could trigger a new wave of self-hosting/publishing.

Probably not, as self-hosters are disproportionately affected by spurious or malicious copyright claims (they don't have the pooled resources to fight).

On the other hand, copyright owners will have a harder time finding the allegedly infringing material. A bot that can find and scrape media on any website needs to be pretty advanced, especially if the site is using Encrypted Media Extensions. And after that, getting in contact with the owner of the site is not trivial. It won't be worth the effort in the majority of cases. The centralisation of media on YouTube, SoundCloud, etc. has been very practical for litigious copyright owners.

Those services do not necessarily need to be publicly accessible. Think of a federated network of private servers with closed user groups, which lets you share stuff easily with selected users on other servers. The directive would probably not even apply to this, but even if it does, it would be pretty hard to go after you, as long as you don't share critical material with people you don't know.

That's precisely why there are societies of authors, editors, publishers, etc. — they exist in part to protect their members from malicious claims.

Sounds exclusionary.

Depends on how the link tax plays out. If companies are forced to pay a website's owner in order to link to it, I doubt even Google or Facebook would pay for that: when German websites tried to get Google to pay under their own link tax law, Google simply dropped them. Then you'd have your website, but nothing would link to it, and you would never get any traffic besides the people you specifically tell about it.

Depends yes.

But even if it plays a restrictive tune, what if we use /robots.txt to explicitly tell if the website or specific contents can be freely indexed & linked to?
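For illustration, a robots.txt along these lines could express a per-crawler indexing policy. The crawler name below is hypothetical, and note that robots.txt is a voluntary convention (the Robots Exclusion Protocol, RFC 9309), not a licensing mechanism — crawlers choose whether to honour it, and it carries no legal weight under the directive:

```
# Sketch of a robots.txt expressing "index everything, except keep
# news aggregators out of the snippet-taxed section".
# "NewsSnippetBot" is a made-up crawler name for illustration.

User-agent: *
Allow: /

User-agent: NewsSnippetBot
Disallow: /news/
```

Something like an explicit `Allow` rule could, at best, serve as evidence of the site's intent to be indexed and linked freely; whether that has any bearing on a link-tax claim is a legal question, not a technical one.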

It smells like an opportunity to reboot the Web in a less centralized fashion.

Isn't the link tax only for news sites? Because in that case, it won't affect non-news sites at all, it just means legitimate news sites will become unfindable. The rest of the Web will survive.

That's something I've been wishing for, yes. It would certainly be the best possible outcome, although I am not sure how the entire thing will actually unfold (I don't think anyone can be sure about it yet).
