
"Notice-And-Stay-Down" Is Really "Filter-Everything" - jakeogh
https://www.eff.org/deeplinks/2016/01/notice-and-stay-down-really-filter-everything
======
Silhouette
_That said, safe harbors are essential to the way the Internet works._

Reasonable people could debate what form a safe harbor should take, though.

Big content sharing sites are only possible, in their current form, because
they use technology to enable scaling in terms of both content distribution
and advertising revenue generation.

Big content sharing sites often do facilitate illegal activity on a huge
scale. Plenty of them got their start doing so more or less overtly.

Big content sharing sites often do make a huge amount of money.

It is not obvious that big content sharing sites should be magically exempted
from the rules about redistributing copyrighted content that everyone else has
to follow, just because their business models don't work otherwise. Yes, it's
difficult to determine reliably whether some untrusted third party's content
might be infringing or not. Yes, it's expensive to screen content you're going
to share before you share it. These things are difficult and expensive for
everyone else, too, but we don't all get to excuse ourselves from following
the law.

Even if we accept that having big content sharing sites is generally a
positive thing, and we therefore decide to provide a safe harbor mechanism to
limit their exposure if users abuse their system, that doesn't mean we should
absolve them of all responsibility for illegal acts committed using their
systems and from which they may profit handsomely through indirect means like
advertising. Nor does it mean we should absolve users of responsibility for
sharing copyrighted work using sites that can attract audiences in the
millions within a matter of days or even hours, just because it's done at
arm's length and "everyone is doing it".

So personally, I'm not sure I have a problem with the general principle of a
notice-and-stay-down rule here. Certainly takedown procedures on big content
sharing sites have been abused. But just as certainly, there are _vastly_ more
infringing copies being shared that way without penalty to the lawbreaker than
there are non-infringing copies being taken down even though they were fair
use or the equivalent. And the ability for anyone to sign up for another
account and re-upload the same infringing content moments after it is taken
down is a gaping hole in the credibility of the existing takedown framework.

If someone wants to share content and exercise their freedom of expression,
they can still publish that material directly on their own system, without
relying on the support of any third party content sharing service. But they'll
also then have to take direct responsibility for the consequences if what
they do is illegal.

~~~
mchahn
> If someone wants to share content and exercise their freedom of expression,
> they can still publish that material directly on their own system,

That is not a practical solution and an unfair burden. Most people do not have
that option at all. They don't know how to do it themselves and they can't
afford to pay others. Aggregating many users makes publishing affordable to
many and the filter-always rule goes a long way towards making that
impossible.

~~~
Silhouette
_That is not a practical solution and an unfair burden._

As opposed to the law turning a blind eye to tools that facilitate illegal
activity on a massive scale, while requiring everyone actually holding the
relevant legal rights to take individual action against every infringement
themselves, even in the face of blatant and repeated infringement by the same
people using the same mass distribution service?

If you want to talk about impractical solutions and unfair burdens, talk to
any small-scale, independent content producer who's seen something they worked
on for months shared on-line and watched their profits evaporate. Not everyone
in the content ecosystem is making big hits like _Game of Thrones_ or the
latest Taylor Swift single and at least benefitting from some extra exposure
leading to more people buying the content with real money later as well as
losing out to pirates who won't. Not everyone in the content ecosystem has
full-time in-house legal staff to chase down infringers, nor the resources to
actively monitor every major content sharing service 24/7 to intercept large-
scale illegal redistribution in near real time.

 _Most people do not have that option at all._

Anyone who is on the Internet in the first place has that option. You can have
control of your own web site for a fraction of the cost of a monthly ISP
subscription, and you can learn to do it with a few minutes on Google, the
help provided by your chosen hosting service, or by asking your neighbour's
15-year-old kid.

 _Aggregating many users makes publishing affordable to many and the filter-
always rule goes a long way towards making that impossible._

Aggregating many users makes publishing affordable to those who want to
benefit from publishing but spend literally nothing on it. Even if you have
freedom of expression protected by law to some extent, that doesn't mean
you're entitled to have someone provide you with a mechanism for expressing
yourself to a huge audience, free of charge.

For example, YouTube is free to take down any video they like for any reason
they like on their own service, regardless of whether or not any law actually
requires them to do so. It's their service. If you really believe in promoting
free expression, surely you should be in favour of more people publishing
independently and under their own control?

In any case, you're dramatically overstating the effect of the filter-always
rule as I understand it. From the description given in the article, it appears
that the rule just says that if someone _has_ formally complained about a
copyright infringement under the safe harbor system, and that complaint has
not been disputed (and therefore presumably the original copy has been taken
down), then the same rightsholder does not have to _keep_ formally complaining
about the exact same material if it's uploaded again after being taken down
the first time.

I don't see how this stops anyone from publishing anything, unless it's
already been published, and that has been challenged, and the person who
published it has not exercised their right to dispute the allegation of
infringement. And the extra burden here is primarily on the hosting service,
which is usually the huge beneficiary of safe harbor agreements already and
still bearing far less burden than they otherwise would under the law.

How is any of this unreasonable, assuming a fundamental desire to reconcile
freedom of expression with the legitimate interests of copyright holders that
are recognised and protected by law in as fair a way as possible?

