
I get the larger point you’re making, but I’d guess that copyright-protected works make up less than 0.05% of all content posted to this site, and that includes repo links. /new is mostly just TechCrunch articles and things like that. I’m aware that a bad-faith interpretation of imprecise wording can be used to weaponize laws, and that laws are sometimes passed for this very reason, but it would be an incredible stretch (both legally and colloquially) to claim that the phrasing applies here.





Every comment and article written in the last century is copyrighted.

>/new is mostly just TechCrunch articles and things like that

Those are all copyright-protected works. Literally everything of any substance is automatically copyrighted.


Those articles also aren't uploaded but merely linked, so the rule doesn't apply.

Another article of the same law (Article 11) would extend copyright to even the shortest excerpts of news articles, anything longer than "individual words". (Colloquially known as the "link tax" provision.)

So even reproducing the full title of a news article would likely be an infringement, and that then becomes another thing platforms are liable for and need to filter under Article 13. Whether there's a link or not would be irrelevant.

(It's meant to allow EU news publishers to bill Google and the social networks for distributing snippets/link previews of their content.)


> It's meant to allow EU news publishers to bill Google and the social networks for distributing snippets/link previews of their content.

I see this as an unreasonable reduction of previously established fair use. The fact that most of the companies linking to news articles with snippets are American, like Google and Facebook, while many of the publishers involved are EU-based, hints at a geopolitical motivation rather than any fundamental change in the fairness of this sort of use.


You are absolutely right: the mere fact that these publishers don't block Google via robots.txt, and in fact spend a lot of effort optimizing their metadata so that link previews show up exactly the way they want, shows that the arrangement is at least mutually beneficial, and not an abuse of their intellectual property.
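
For what it's worth, checking whether a given publisher actually disallows Googlebot is trivial. Here's a minimal sketch using Python's standard urllib.robotparser; the publisher domain and article path are placeholders, not sites I've actually checked:

    import urllib.robotparser

    # Fetch and parse the publisher's robots.txt (placeholder domain).
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example-publisher.eu/robots.txt")
    rp.read()

    # True means Googlebot is allowed to crawl the article, i.e. the
    # publisher has not opted out of being indexed and previewed.
    print(rp.can_fetch("Googlebot", "https://example-publisher.eu/news/some-article"))

In practice publishers do the opposite of blocking: they add Open Graph tags (og:title, og:description, og:image) precisely so that aggregators and social networks render the previews they want.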

Because Google owns the index, publishers who don't allow Google to crawl them get no traffic. Google is a monopoly; there is no choice here and no market driving competition.

The sources I could find with a quick Google (yes, possibly ironic) search suggest news sites get most of their traffic from direct visits, but that Google contributes a substantial amount [0]. It seems to me that the news sites want Google to provide them traffic and simultaneously pay for the privilege of doing so.

The behavior of publishers in countries where they won this battle is telling: when laws were passed in Belgium and Spain requiring aggregators to license even small excerpts, Google simply stopped listing their content, and the publishers didn't take very long to offer free licenses.

[0] Here's one slightly dated example: https://www.sistrix.com/blog/new-data-is-google-or-facebook-...


They don’t want Google, or any other monopoly that is in competition with them, to ‘give’ them traffic. They do want their services available and indexed on the internet. If the internet search market were evenly distributed among five search engines, then I suspect this conversation wouldn’t exist.

Sort of; there is a big push in that direction, mostly from the EU.

https://en.wikipedia.org/wiki/Copyright_aspects_of_hyperlink...



