
How Google’s Open Sourcing of VP8 Harms the Open Web - chanux
http://www.robglidden.com/2010/05/how-googles-open-sourcing-of-vp8-harms-the-open-web/
======
ZeroGravitas
A little bit cheeky, but I suppose his heart's in the right place.

I had assumed that Google would standardize WebM eventually, but the last
thing they want to do right now is get pulled into the political quagmire of
MPEG. That would be a kiss of death.

This guy's been fighting for years to get MPEG to deliver on their promise of
a royalty-free profile of H.264, and rumours suggest that Apple and Google
applied pressure in a previous attempt to break the HTML5 codec deadlock. Even
if they had got baseline (or less) for free it would basically have been a
bait and switch for the better profiles, and they still got knocked back. A
couple of years later, even with China applying pressure on this issue, he's
got nothing to show but a half-hearted agreement that MPEG would think about
looking into royalty-free codecs, provided more than five national standards
bodies signed up; otherwise they'd drop the idea.

Not particularly classy the way he held up his own failed OMS project as
something to emulate without mentioning that it was his project (and a
complete failure).

It's amazing how much static has been thrown up in the IETF in their attempt
to standardize a royalty-free Internet wideband audio codec. In fact I don't
think they even got the charter to explicitly mention royalty-free as a goal,
despite strong industry support from folk like Broadcom and Skype (both of
whom signed on for WebM too). A lot of the politics seems to involve some
people trying to link the effort up with MPEG, knowing the whole project will
die if that happens, and other folks trying to fight off these procedural
antics.

------
prodigal_erik
I can't believe I'm saying this, but I don't see much value in a standard and
multiple competing implementations of the same video format. Unlike HTML,
there's only one correct rendering of a video, and the problem is pretty much
solved by the first solid portable free decoder. Optimal encoding is an active
research topic, but that can rely on whatever the de facto decoder supports.

~~~
ZeroGravitas
In an ideal world all "standards", _de facto_ or _de jure_, would have two
independently developed, open source, BSD-licensed implementations as part of
the process of standardization.

This stops people writing specs that are impossible or ambiguous. It saves
people having to go off and create their own half-baked implementations from
scratch. And it serves as a basic test suite for those occasions when you do
actually need to write your own for whatever esoteric reason (to do it in
hardware, or in a managed language, or whatever), with the two implementations
also testing each other to catch stupid bugs in the implementation and/or
spec.

The crazy future date that Flash evangelists have been throwing about for when
HTML5 will be "finished" is actually the date by which the working group
expects to have two interoperable implementations of the entire spec. This is
a lofty and noble goal; sadly they just get dinged for it by people who are
happy to call Flash "open" while there is still only one complete
implementation.

