
You Can't Destroy the Village to Save It: W3C vs. DRM, Round Two - cpeterso
https://www.eff.org/deeplinks/2016/01/you-cant-destroy-village-save-it-w3c-vs-drm-round-two
======
epistasis
I know my opinion is not a popular one, but I find these arguments somewhat
less than convincing.

The goal seems to be to reduce the spread of DRM. I'm cool with that. However,
I'm not sure that these actions will do anything at all to reduce the usage of
DRM. My reasoning is that those who want to use DRM are not going to accept
any alternative that isn't DRM'd. So in order to stop the spread of DRM,
keeping DRM non-standardized would have to prevent others from adopting DRM.

So the ultimate question for me is: who's going to start using DRM that isn't
already? I think that set is empty. Standardizing DRM won't close off any
content that wasn't already closed, IMHO.

However, by not standardizing, we lock out all sorts of non-mainstream clients
from accessing content. Now that Flash is going to disappear entirely, that
means no access to all sorts of content on Linux, _unless_ DRM is standardized.

So I see something to gain and nothing to lose by standardizing DRM. I'm
making assumptions to arrive at that conclusion, but I believe that they're no
worse than the ones that Cory Doctorow is making here. It's just weird to see
myself diverging from the EFF on this, and on T-Mobile, and other things.

~~~
JoshTriplett
Standardization provides endorsement. If DRM requires a non-standard plugin,
and plugins themselves become increasingly verboten, at least some folks who
might have used DRM may decide that they want to reach a wider audience
instead.

Consider what happened when Apple refused to allow Flash on their platform;
that certainly accelerated the decline of Flash, or at least forced the
development of Apple-specific approaches for that platform.

So what would happen if Chrome and Firefox had stood together in solidarity
and said "no"? I don't think the answer would have been a mass exodus to IE;
for that matter, what if IE had gone along with it? (Edge already bans
plugins.)

~~~
Silhouette
_So what would happen if Chrome and Firefox had stood together in solidarity
and said "no"?_

Given that Google has been one of the biggest voices in favour of EME, that
seems rather unlikely.

~~~
icebraining
Yeah, the thing is that Chrome's interests are inseparable from YouTube's, and
unless they can force the media companies to let go of DRM, like Apple did to
the music labels, they need some kind of solution. And frankly, what's in it
for them?

~~~
magicalist
Does YouTube use DRM? All the YouTube download extensions suggest not.

~~~
JoshTriplett
YouTube rentals and purchases use DRM (and the downloaders don't handle
those); normal videos just use obfuscation.

~~~
icebraining
DRM is obfuscation.

------
pornel
This spec (W3C EME[1]) has been introduced and heavily lobbied by Google and
Netflix.

Regardless of what W3C decides, Chrome won't drop Netflix support, and Netflix
for now seems to be hell-bent on having total legal control over which devices
are allowed to play their content.

[1] [https://w3c.github.io/encrypted-media/](https://w3c.github.io/encrypted-media/)
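
For reference, the page-side API the EME spec defines looks roughly like the
sketch below; the key system string, codec, and license server URL here are
all illustrative, not taken from the spec or the article:

  // Ask the browser for a CDM that can handle the content.
  const access = await navigator.requestMediaKeySystemAccess(
    "com.widevine.alpha", // example key system; others exist
    [{
      initDataTypes: ["cenc"],
      videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
    }],
  );
  const video = document.querySelector("video")!;
  await video.setMediaKeys(await access.createMediaKeys());

  // When the stream turns out to be encrypted, negotiate a license.
  video.addEventListener("encrypted", async (event) => {
    const session = video.mediaKeys!.createSession();
    session.addEventListener("message", async (msg) => {
      // Relay the CDM's opaque request to the vendor's license server.
      const license = await fetch("https://license.example.com/", {
        method: "POST",
        body: msg.message,
      }).then((r) => r.arrayBuffer());
      await session.update(license);
    });
    await session.generateRequest(event.initDataType, event.initData!);
  });

Note how the license exchange is opaque to both the page and the browser: the
CDM, not the user, decides whether a given device may play the content.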

------
Silhouette
I'm not much of a DRM fan. That said, the W3C has already become borderline
irrelevant to the future of the web, as it has been far too slow to
standardise everyday technologies that have been widely deployed in browsers
for a considerable time. Realistically, actively refusing to cooperate seems
unlikely to stop browser developers from implementing new features. It's just
going to mean that those features work differently from one browser to
another, and possibly that some browsers or platforms won't offer the features
at all. Personally, I'd rather the W3C work within the bounds of reality as a
moderating influence than have it become a mere talking shop with no real
influence at all.

~~~
magicmu
"...the W3C has already become borderline irrelevant to the future of the web,
as it has been far too slow to standardise everyday technologies that have
been widely deployed in browsers for a considerable time"

This is a bold statement, but one that I'm surprised to find myself agreeing
with. I suppose now it's a matter of figuring out what should replace the W3C;
things are only going to move faster.

~~~
Silhouette
It is a bold statement, true, but I stand by it.

A decade ago, if I wanted to look up how something in, say, HTML or CSS
worked, I'd probably go straight to the W3C site and just read the spec.
Today, my first ports of call are MDN, caniuse.com, or sometimes the Babel
documentation or kangax's ES6 compatibility table.

I can't remember the last time I read anything on the W3C site. I do find
myself there now and then, but sadly, it is simply irrelevant to most of my
daily web development work. What matters in a world with ever-moving goalposts
is what browsers can actually do right now, which today basically means which
of the evergreen browsers currently have support, which of their LTS releases
also have support, and what the situation is with IE and possibly older mobile
browsers, as validated by actual testing in each case.

I really wish this weren't the case, because the lack of standardisation and
the vast number of browser idiosyncrasies make development much more onerous
and less reliable than it should be. But the W3C just couldn't keep up, and
for now the browser developers seem to be mostly ignoring it as a result.

~~~
gsnedders
I don't think that's true at all: there's far more relevant work going on at
the W3C than there was a decade ago.

Pretty much the _only_ thing the W3C was working on that affected web
developers a decade ago was CSS 2.1 and a couple of CSS 3 modules (Backgrounds
& Borders and Selectors come to mind, but not much else).

Yes, admittedly, it's still more-or-less just CSS work that actually happens
at the W3C (with increasingly large amounts of work happening at the WHATWG
covering the majority of the rest of the platform, HTML and DOM especially),
but the number of CSS modules being worked on is vast, and the quality of the
specifications is far greater than it was a decade ago.

As for:

> I really wish this weren't the case, because the lack of standardisation
> and the vast number of browser idiosyncrasies make development much more
> onerous and less reliable than it should be. But the W3C just couldn't keep
> up, and for now the browser developers seem to be mostly ignoring it as a
> result.

Per the above, the quality of specifications is far greater than it was a
decade ago, and it is (despite the perception of many) still the case that
specifications are, most of the time, nearly complete before anyone ships
anything (typically the incomplete parts are the bizarre edge cases that no
web developer is likely to ever run into, but there is a real attempt to
avoid undefined behaviour), and there's definitely more involvement across
the various standards orgs from _all_ browser developers than there was a
decade ago, W3C included.

The increasing number of browser "idiosyncrasies" is mostly a result of ever-
increasing complexity in implementations (partly down to the increasing
complexity and size of the web platform, partly down to new features being
retrofitted into implementations that weren't originally designed to do
anything like the new feature, and partly down to the lack of a good test
suite for CSS shared between all vendors).

I remain hopeful that we're moving towards somewhere better when it comes to
interoperability across browsers. web-platform-tests
([https://github.com/w3c/web-platform-tests](https://github.com/w3c/web-platform-tests))
has got us a good, high-quality test suite for the platform minus CSS, and is
run by most but not yet all browsers on a daily basis (and I think it's
realistic to hope that by the end of 2016 it'll be run by all browsers on a
daily basis, and that for many it will be _the_ place they write tests, hence
test suites will basically become shared rather than written separately with
different coverage holes for bugs to slip through). csswg-test
([https://github.com/w3c/csswg-test](https://github.com/w3c/csswg-test)) is
slowly starting to move towards a point where it'll be as widely run as
web-platform-tests (please don't make me think about this too much; I'm
finally making progress there, making the same arguments as five years ago
but now with the evidence from web-platform-tests that it actually works).

And finally:

> I can't remember the last time I read anything on the W3C site. I do find
> myself there now and then, but sadly, it is simply irrelevant to most of my
> daily web development work. What matters in a world with ever-moving
> goalposts is what browsers can actually do right now, which today basically
> means which of the evergreen browsers currently have support, which of their
> LTS releases also have support, and what the situation is with IE and
> possibly older mobile browsers, as validated by actual testing in each case.

Really that makes it sound like half the problem is that the spec documents
give no informative data about which browsers support which features. There's
definitely been talk about trying to do something better here, but it's a
hard problem: you need some way to gauge support _and_ quality of
implementation, across all browsers and across all features. I think there's
mostly agreement that caniuse is actually too coarse to really be put anywhere
near a spec (because specs are rarely implemented as atomic units; there are
normally different statuses for different sections), and that just grabbing
test suite results isn't that useful either (because you can fail large
numbers of tests due to obscure bugs, or because you've only implemented the
awkward 20% and not the easy 80%, in which case you're actually closer to
shipping complete support than someone who supports only the easy 80%).

(As a disclaimer: I've been around the W3C and WHATWG for quite a while
(coming up to a decade), have been employed/contracted for several browser
vendors, and was one of the early pushers to get high-quality test suites
shared across all browsers.)

~~~
Silhouette
For what it's worth, I don't dispute at all that the quality of CSS specs from
the W3C has improved over time. However, specs are means to an end, and the
usefulness of any spec is inevitably dictated by the implementations that
exist.

As a professional, I truly wish this weren't the case, but as a professional,
my job is to make a working site, not necessarily a standards-compliant one.
All too frequently, it is still not possible to do both at the same time:
either because browsers don't implement W3C recommendations consistently, or
because of the poor quality of implementation you alluded to yourself, or
because even when a browser did provide a useful implementation yesterday,
someone broke it in an update last night, and today I'm rewriting the feature
a different way because platinum support customers started calling first
thing this morning and they get a same-day response, not a six-week one.

"Of course I understand that it's the update we just rolled out to our
corporate standard browser that broke your standards-compliant and otherwise
normally functioning site," said no customer ever. :-)

------
guelo
Am I understanding this right that the idea is that if a member of W3C sues
anybody for reverse engineering their DRM then they get kicked out of W3C?

I wonder how much of a deterrent that is. W3C needs Google/Microsoft/Apple
more than they need W3C. I don't think the content producers are even members
of W3C. I guess it would be companies that create the encryption plugins,
like Adobe, that could theoretically sue people under the DMCA. I just don't
see how the W3C could even function without the biggest players at the table.

~~~
ascorbic
No, the idea is that they have to sign a legally binding covenant beforehand,
agreeing not to sue. If not, their DRM doesn't get into the standard in the
first place.

~~~
takeda
Yes, but what is the repercussion when they break that contract?

~~~
ascorbic
They lose their court case. I'd imagine it's a pretty simple defence against a
patent suit if you can present a covenant by the patent holder saying they
won't sue.

------
dendory
I think the security aspect is the best angle. If a CEO can't access some
proprietary web app and is told "You need to use browser X that supports DRM",
then they will ask "Why isn't that already installed on my system?". But if
they are told "The web app uses hidden code that may be insecure and cannot
be audited", then you'll have executives, people in actual positions of
power, say "Get that stuff off our network".

------
bitwize
Either the Web will adopt DRM or the Web will be considered extremely suspect
as a distribution medium.

Similarly, either you will accept backdoored encryption or you will be
automatically considered a terrorist and singled out for LE scrutiny.

------
chris_wot
I don't expect Tim Berners-Lee to agree to this, given he was the one who
championed DRM at W3C in the first place.

Also, I don't expect Mozilla to do anything useful, given they went along with
it so easily. But then, I haven't expected anything much of Mozilla in a long
while.

~~~
gsnedders
> I don't expect Tim Berners-Lee to agree to this, given he was the one who
> championed DRM at W3C in the first place.

The W3C is in a somewhat awkward place, really. To some degree, the W3C is
beholden to its member organisations, and if they want to do something it
frequently happens. That said, it's easier to refuse to work on something than
to work on something the majority of members don't want (HTML comes to mind
there!).

> Also, I don't expect Mozilla to do anything useful, given they went along
> with it so easily. But then, I haven't expected anything much of Mozilla in
> a long while.

I think people vastly overestimate how much influence Mozilla actually has.
Look at how long Mozilla held out against H.264: all that did was cause them
to lose market share (because plenty of HTML video simply wouldn't play in
Firefox), with no effect on the web as a platform.

~~~
chris_wot
Sort of makes me wonder why people give money to Mozilla if they have so
little influence.

~~~
acdha
Mozilla has influence, not control. That doesn't mean they're not valuable
for keeping things from being even more skewed, e.g. by being the only
non-DRM vendor in the room.

The HTML5 video situation is a bit complicated, too, as H.264 was simply
better across the board: standardization, quality, speed, tooling, hardware
support, etc. There was only one reason to choose WebM, and that was the
belief that it might not be covered by patents, which was more optimism than
something you could rely on.

That point really needs to be fought in the next codec generation or two,
where you aren't waiting until the contest is over before getting started.

------
stcredzero
People need to realize that DRM is simply a tool. Absent pernicious laws like
the DMCA, forms of DRM are like fences, locks, and cameras. Such tools can be
used to oppress people. The same tools could also be used to protect
individuals' privacy and act as a check on what organizations can do with
people's data.

Those who control products and infrastructure shouldn't be allowed to set up
such tools to grant themselves disproportionate power. However, individuals
need to realize that such tools could help protect them as well. (Analogy:
Camera phones and police misconduct.)

~~~
mikeash
What would favorable DRM look like? How could I use it to protect my privacy
or limit what organizations can do with my data?

~~~
stcredzero
What if your email correspondence from companies was entirely handled through
DRM that you controlled? What if you could revoke the right for a company to
use your email? Basically, if such infrastructure were widespread, no company
would actually have your email, unless they were to circumvent the DRM.
Furthermore, such circumvention would then be covered under the DMCA, and mass
circumvention would open up companies to class action lawsuits. (If properly
implemented, either lawbreakers could be tracked, or the company that let the
protected address fall into the wrong hands would be identified.)
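
A rough sketch of the kind of scheme being described, in the same hypothetical
spirit; every name and detail below is invented for illustration, since no
such infrastructure exists:

  // Hypothetical user-controlled license service: a company receives a
  // revocable grant instead of the address itself, and must redeem that
  // grant through the user's service for every use.
  class UserLicenseService {
    private grants = new Map<string, { company: string; revoked: boolean }>();

    grant(company: string): string {
      const id = crypto.randomUUID();
      this.grants.set(id, { company, revoked: false });
      return id;
    }

    revoke(id: string): void {
      const g = this.grants.get(id);
      if (g) g.revoked = true;
    }

    // After revocation this refuses, so the company never holds a standing
    // copy of the contact channel, only a grant the user can cancel.
    redeem(id: string): string {
      const g = this.grants.get(id);
      if (!g || g.revoked) throw new Error("grant revoked or unknown");
      return "opaque-delivery-token-for:" + g.company;
    }
  }

Under the DMCA framing above, a company that bypasses redeem() to keep a raw
copy of the address would be the party doing the circumventing.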

You could not say that such a circumstance would be unfavorable. Typically,
people counter with the argument that DRM doesn't work, and that such tools
wouldn't be available in a true and trustworthy form to individuals anyhow.

Also, email isn't such a great example anymore, but having such control over
one's medical records would be desirable.

~~~
bad_user
First of all, you're talking about a double-edged sword. You wouldn't
"_actually have_" the received email either. Now imagine you're sexually
harassed or blackmailed but can't prove it, because the email vanishes just
after you've read it.

But more problematic is that you're forgetting about the analogue loophole. As
long as I can take a photo of your crappy email, you have only the illusion of
safety. Add to that the fact that there is currently no such thing as
unbreakable DRM, mostly because it's a fundamentally flawed concept that can't
work. DRM is security theater at best.

> _having such control over one's medical records would be desirable_

One thing that has always bothered me is that programmers have the arrogance
to think technology can fix a broken society. Given that DRM is a broken
concept, it wouldn't stop global adversaries like the NSA from mining your
medical records. And when your insurance company demands whatever records it
wants, to do with however it pleases, including selling that data to the
highest bidder while threatening to reject your contract or claim, good luck
explaining DRM and control to them.

~~~
stcredzero
_You wouldn't "actually have" the received email either._

That's a straw man. The contents of communications would be proof against
repudiation, but permission to send would be subject to revocation.

_Given that DRM is a broken concept, it wouldn't stop global adversaries like
the NSA from mining your medical records._

Begging the question. It's our experience of DRM so far that is broken, not
the concept itself. And there are plenty of adversaries below the NSA's
capabilities that people need protection from.

~~~
icebraining
_Begging the question. It's our experience of DRM so far that is broken._

No, DRM as a concept is broken. The whole idea rests on giving someone both
the encrypted content and the key to decrypt it, while obfuscating the way
the two are combined on a machine the user controls.

That's why most DRM has been broken, often mere days after release, despite
the many millions buried in its development.
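
A toy illustration of that structural flaw, with XOR standing in for a real
cipher (actual DRM uses real cryptography, but the argument is identical):

  // The player must be handed both the ciphertext and the key, so however
  // the decrypt step is obfuscated, the plaintext has to materialise in
  // memory the user controls, where it can be captured.
  function decrypt(ciphertext: Uint8Array, key: Uint8Array): Uint8Array {
    return ciphertext.map((b, i) => b ^ key[i % key.length]);
  }

  const key = new Uint8Array([0x2a, 0x17]);        // shipped to the client
  const ciphertext = new Uint8Array([0x48, 0x7e]); // shipped to the client

  // "Breaking" the DRM is just invoking the same path the player invokes:
  const plaintext = decrypt(ciphertext, key);

Obfuscation only changes how much effort that capture takes, not whether it
is possible.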

