Wpost Opinion on Bailing Out Newspapers (washingtonpost.com)
10 points by tdonia on May 18, 2009 | hide | past | favorite | 9 comments



"Publishers should not have to choose between protecting their copyrights and shunning the search-engine databases that map the Internet. Journalism therefore needs a bright line imposed by statute: that the taking of entire Web pages by search engines, which is what powers their search functions, is not fair use but infringement. "

Reading this, the term "spoilsport" comes to mind...


"Publishers should not have to choose between protecting their copyrights and shunning the search-engine databases that map the Internet."

Then put it behind a paywall. Or at least put a small, indexable summary page in front of an unindexed article (using robots.txt). Or invent your own search engine/news aggregator.
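The summary-page approach sketched above can be expressed in a few lines of robots.txt. A minimal sketch, assuming hypothetical paths: indexable teasers live under /summaries/ while full articles live under /articles/ and are kept off-limits to crawlers.

```
User-agent: *
Allow: /summaries/
Disallow: /articles/
```

Strictly speaking, robots.txt controls crawling rather than indexing; to keep an already-known URL out of search results, a `noindex` robots meta tag or `X-Robots-Tag` response header is the usual complement.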


I think that the lawyers who wrote the article would have benefited from consulting with a couple of qualified internet engineers.

Having said that, the piece was written by lawyers, so it isn't surprising they see primarily a legal solution to the problems newspapers are facing. When all you have is a hammer...


Cloak it. Lobby for it to be illegal for a user (or, in the case of a corporation, the corporation) to misrepresent its identity, and cloak the site. Let the Googlebots scrape teasers. It's not like WP needs to be in the Google index.
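The cloaking idea amounts to branching on who is asking. A minimal sketch (hypothetical function and names, not anyone's production code) that serves a teaser to declared crawlers and the full article to everyone else:

```python
# Hypothetical sketch of user-agent "cloaking": crawlers that announce
# themselves get a teaser; ordinary readers get the full article.
# Note the User-Agent header is trivially forged, which is exactly why
# the commenter wants misrepresenting identity to carry legal weight;
# in practice Google recommends verifying Googlebot via reverse DNS.

CRAWLER_TOKENS = ("Googlebot", "Bingbot", "Slurp")

def render_article(user_agent: str, article: str, teaser_len: int = 100) -> str:
    """Return a short teaser for crawlers, the full text for everyone else."""
    if any(token in user_agent for token in CRAWLER_TOKENS):
        return article[:teaser_len] + "..."
    return article
```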


See, the strategy here is to conglomerate everything into one giant too-big-to-fail media monopoly, then get a never-ending stream of taxpayer handouts to keep the tipsy inverted pyramid of bad journalism and political spin from meeting its final and spectacular demise by ten million otherwise-unemployed FOIA-armed citizen journalist bloggers.



It seems funny, to me, that they would talk about making Google do the legwork. If this took effect, Google could simply do nothing -- and watch as the people who forced them to 'come negotiate' crumble. Google is the gateway to the internet for a lot of people. If you aren't in Google, you don't exist. This seems like the exact opposite of what these companies want. Also, the claim that "[newspapers] don't have the tools [to build a new model]" is utter crap. The newspapers have neglected the internet, and it's coming back now to bite them in the ass.


Search engines aren't a public service. The article halfway acknowledges the robots.txt solution, then quickly dismisses it as unworkable. It's clear, however, that publishers have recourse if they feel the search engines are doing them more harm than good. If they want finer granularity than is currently available, they need to talk to Google, not the government.


I don't understand — are they upset that you can view cached versions of their pages?

Or are they upset that other people are selling the same information more cheaply?




