
What about Google and other companies' auto-takedown bots? Should they be penalized for getting it wrong?


Why not? If they create poorly-designed bots that damage others, why shouldn't they be held responsible?

If I create a lawn-mowing bot that goes rogue and runs over the neighbor's cat (or in a less-bloodthirsty scenario, wrecks his expensive landscaping) I'll be held responsible, yes?

I don't see why automation should be an excuse. If anything, systems that are intended to operate without human supervision should be held to a greater standard of safety, not a lower one.


Because the bot is cutting Google's own yard (e.g. Youtube), not the user's.


If my landlord's unattended lawnbot runs over my kid, he's probably going to be held responsible, right?


Yes, but copies of videos aren't kids. They're not even property.


Sigh...

If my landlord's unattended lawnbot runs over the mailbox and destroys a video that my friend sent me, he's going to be held responsible, right?

Property rights aren't absolute. They just aren't, sorry.

In general you can't damage someone else just because they happen to be on your property. Not tenants, not guests, and in some cases, not even trespassers.


Your landlord can't, because the video is your property. A copy on Google's servers isn't, even if you retain the copyright.

In any case, none of these are proper analogies. Google is providing you with a service, not holding stuff for you, and they're simply cutting off people who have no contract with them beyond their own ToS.


Isn't it a bit of a stretch to suggest that Youtube taking down videos is damaging to the owner?

I see the analogy more like this -- if you place your mailbox on my property (your video on Youtube), the unattended lawnbot moves the mailbox off my property.


Youtube isn't really Google's own yard, though. Or isn't wholly their own yard, at least.

Google has chosen to make money from other people's content, and as such they have a responsibility to treat that content with respect.


On what grounds would a company be penalized for taking down parts of their own websites? Should YCombinator be penalized if they delete a post?


I think he's pointing out the problem that if they start penalizing false DMCA claims, people will find non-DMCA ways to take things down.

Given that publishers have managed to get Google's ContentID system to misidentify public domain songs, bird songs, and other such things as their exclusive property, bad faith or otherwise negligent copyright claims are a real problem.

Ref: http://www.geekosystem.com/rumblefish-birdsong-takedown/
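
To make the failure mode concrete, here's a toy sketch in Python. It is emphatically not how ContentID actually works (the names and numbers are all made up); it just shows that any registry-driven matcher that auto-flags above a similarity threshold enforces whatever is registered, not whatever is actually owned:

    # Toy model: a claims registry plus threshold matching.
    # Nothing here checks whether the registered work is even
    # copyrightable, so public domain audio gets flagged too.

    def similarity(a, b):
        # Jaccard similarity between two fingerprint sets.
        return len(a & b) / len(a | b)

    # A publisher registers a recording of birdsong as "theirs".
    claims_db = {
        "publisher_birdsong_track": {"f1", "f2", "f3", "f4"},
    }

    def auto_flag(upload_fp, threshold=0.75):
        for claim, fp in claims_db.items():
            if similarity(upload_fp, fp) >= threshold:
                return claim  # flagged: monetized or taken down
        return None

    # An independent field recording of the same birds
    # fingerprints almost identically, so the claim fires:
    print(auto_flag({"f1", "f2", "f3", "f4", "f5"}))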


I agree with you, but the problem the parent poster is pointing out is that YouTube is a private site. Google should be able to do what they want. If they do something stupid (and I agree the copyright bot is stupid), then it's their own problem. In theory, a competitor could do a better job of not repeating Google's mistakes and beat them in the market.

It's worth mentioning that copyright bots are not even required by law. Google could be complying with DMCA requests without automatic takedowns. They're going beyond what the law requires because they want to be on Hollywood's good side. So a competitor could still be legal without the bots.


BTW, there probably are some viable claims someone could make against a person convincing third parties to remove their work from the internet, particularly if they could show financial harm. However, recordings of natural bird song and the like are unlikely to get protected via expensive lawsuits because the upside of that is incredibly limited.

I suppose we might see litigation whenever DMCA notices start getting used for election-related hijinks, though.


"Google is a private company" is not a set of magic words.

See my other comment for a simple scenario that shows why: http://news.ycombinator.com/item?id=4497393


They shouldn't, but if you can't have a perfect system, it's best to have one as close to perfect as possible, and that is something we do not have. So ideally an entity wouldn't be punished for requesting a takedown of its own content, but because of the insane volume of bogus takedown requests, I think it's better to punish those who send them. If anyone accidentally sends a takedown request against themselves, then them's the breaks until the system can differentiate between legit, bogus, and self-takedown requests.

Furthermore, I'm not sure how YC deleting a post and issuing a DMCA takedown request against content on their own site are the same thing. This isn't about deleting content at all; whether a site deletes content, copyright-violating or not, isn't at issue. The issue is sending bogus takedown requests to others.


I read the parent's post as talking about self-censoring bots like Youtube's ContentID, not DMCA-sending bots. As far as I know, Google doesn't use the latter.


So if gmail's spam filter isn't perfect, it's better to have no spam filter?


No, exactly the opposite. I'm saying if you can't have perfect then take the next best thing. In your example the next best thing would be taking an imperfect spam filter over none at all.


I was asking, not implying any position on my part.

I don't think website owners should be prohibited from deleting user content as they see fit. But I do think the takedown bots are different. They are mainly used to avoid lawsuits and to appease the RIAA/MPAA; websites wouldn't choose to use them if left to themselves. I think they cause as many problems as bogus DMCA notices and should be discouraged.

I don't necessarily believe the discouragement should be legal, but maybe. For example, if human oversight was required before the content was removed, that wouldn't necessarily restrict companies' ability to remove content, but it would minimize bogus takedowns by bots.
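
Roughly, I'm imagining something like this (a toy Python sketch of the idea, not anyone's actual system): the bot can only enqueue a flag, and a person has to sign off before anything comes down.

    from queue import Queue

    review_queue = Queue()

    def bot_flag(video_id, claimed_owner, match_score):
        # The matching bot may only flag; it cannot remove anything.
        review_queue.put((video_id, claimed_owner, match_score))

    def process_reviews(human_approves):
        # A person confirms each flag before the takedown happens.
        while not review_queue.empty():
            video_id, owner, score = review_queue.get()
            if human_approves(video_id, owner, score):
                take_down(video_id)

    def take_down(video_id):
        print("removed", video_id)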

Again, I'm not really staking a position here, I'm just trying to start the conversation while I figure out my own views on the subject.


For example, if human oversight was required before the content was removed, that wouldn't necessarily restrict companies' ability to remove content, but it would minimize bogus takedowns by bots.

Sure it would; it'd make spam removal impossible, for example.


Private companies have no obligation to honor freedom of speech. Google has every right to say that your content must get a passing grade from [insert black box]. If the black box happens to be designed to filter out copyright violations, it gets no special treatment from the law.

Having said that, the main reason for many of the oversensitive automated bots is that big companies like Google (who host user-generated content) still need to be incredibly careful in the area of copyright to ensure that they are protected. I suspect that strengthening safe harbor would largely solve this problem. I would also like a system where sites were highly incentivised to promote free speech on their platforms, but other than public pressure, I do not see how to do this.


Actually, this has nothing to do with freedom of speech, and even Google's terms of service are irrelevant. Here's a very simple scenario that results in hauling Google and/or Google's partners into court:

1. Alice produces a video consisting of 100% original creative content, and registers copyright to it.

2. Alice licenses the video to Bob for distribution, under terms which allow uploading to YouTube.

3. Bob uploads a copy of the video to YouTube.

4. YouTube flags the video and declares the copyright to be held by someone other than Alice.

5. Alice, who is not bound in any way by Google's terms, heads down to the courthouse with a copy of her registration papers and files suit over the misrepresentation of her copyright.



