

Are video codecs written in JavaScript really the future? - velodrome
http://arstechnica.com/information-technology/2013/05/are-video-codecs-written-in-javascript-really-the-future/

======
KaiserPro
I can understand the appeal, however it's just not feasible on low-powered
devices (yet). Most mobiles (most things that play video, in fact) have some
silicon dedicated to H.26 _x_ decoding. Using that silicon is orders of
magnitude more efficient (in both power and CPU/GPU terms). It's also more
reliable.

The other thing to note is that if Ars is correct and it uses I-frames only
in some cases, it's not going to be bandwidth-efficient. That's one compromise
you can't really make in mobile land. It's also going to require two separate
streams (one for each class).

One last thing: I've not actually seen any output "in the flesh", as it were.
VP8 was supposed to be wonderful, but in fact it's _almost_ a clone of H.264,
just with marginally worse performance.

(Another thing: it's actually really hard to make an HD player from scratch.
You wouldn't think it, but putting frames on screen in a timely and smooth
manner is really not trivial.)

~~~
smith7018
Would you mind going a little more in depth about VP8? Last I heard it was
supposedly amazing, but that was a while ago.

~~~
w0utert
What more is there to say about it? It's fundamentally almost exactly the same
as H264, minus a few features that would make it too vulnerable to patent
litigation, plus a few minor additions that are meant to patch up the
efficiency lost by excluding some of H264's features. For web video VP8
performs about the same as H264, for high-bitrate/high quality (think Blu-
Ray/Full HD) it performs decidedly worse because it doesn't have most (any?)
of the advanced features in H264 main/extended/high profiles. The one big
downside is lack of widespread hardware codec support.

There really isn't anything 'amazing' about VP8, except that some people think
it fixes something about video coding because it is 'more free' than H264,
which IMO is vastly overstated. In real life, for 99.9% of people (including
those offering commercial video products), the difference is practically
philosophical.

~~~
KaiserPro
+1 this ^

~~~
eridius
If you want to +1 a comment, just click the little ▲ instead of commenting.

------
StringyBob
<http://en.wikipedia.org/wiki/Betteridges_law_of_headlines>

It's quite hard to argue against commoditised hardware codecs, particularly
when the decoder is a low power mobile device. It's one of the main reasons
h264 is going to be around for a long time...

~~~
KaiserPro
+1

There is a massive amount of high-quality content, encoders, and transport
tools. Even when H.265 is properly ratified, it's going to be at least two
years before we see mature tools/content.

~~~
jychang
Your green username indicates you're new here, so a tip: Don't say "+1", but
just upvote.

------
B-Con
In all seriousness: Why? What do we gain from doing the decoding in
JavaScript, besides the "cool" factor?

I can see how this would allow more flexibility, since a codec could be
downloaded at run-time, which would remove the need for media to be packaged
only in codecs with native, cross-browser support.

But is there _any_ chance this could be even close to as efficient, CPU or
power-wise, as the current standard native browser codec support?

~~~
gluxon
The power in this is that JavaScript is extensively portable and very
consistent across platforms. There are none of the cross-platform issues that
plague C++ and, to an extent, Java's JVM. There's no possibility of platform
lock-in, which happens with .NET applications.

I don't like how this article is written at all. It leaves the biased
impression that Mozilla's goal here is to create a toy. JavaScript is the
fastest interpreted language, and plans are out to make it even speedier.
There's a reason why "the future" is emphasized.

Perhaps I am biased as a Node.js programmer, but I do believe this is a good
thing. JavaScript is seriously pretty sweet. I do like the design of the
language. There are features that outshine many other languages.

~~~
simfoo
> JavaScript is the fastest interpreted language and plans are out to make it
> even speedier.

Citation needed

~~~
gluxon
With JIT compilation, asm.js, and tracing, it is hard to argue against. A
long-fought browser war really brought out the maximum capabilities of
JavaScript engines.

[http://en.wikipedia.org/wiki/JavaScript_engine#Performance_e...](http://en.wikipedia.org/wiki/JavaScript_engine#Performance_evolution)

asm.js promises speeds only about 2x slower than native code. For interpreted
code, that is amazing.

~~~
gcr
None of those are interpreters. You're trying to compare Javascript against
other JITted / compiled languages. Mozilla themselves claim that asm.js is
only about 2x slower than C.

------
zmanian
Part of the question about the ORBX technology is where the computational
cost of video transmission should be borne: client-side or server-side.

Decoding in general purpose compute is expensive for client devices in
comparison to codec specific hardware.

Otoy's goal is to make encoding computationally inexpensive on the server
side. This makes applications like per client frame by frame watermarking and
low latency encoding far more scalable than other solutions.

Otoy is trying to explore the market potential of inverting the typical
encode/decode cost model: rather than encode once -> decode many, it becomes
affordable to encode on a per-client basis.

~~~
btbuildem
As client devices grow both in computational power and sheer numbers, I think
scalability will be facilitated by offloading at least some of the heavy
lifting to the client.

~~~
georgemcbay
That makes sense if you just consider computational power, but what about
bandwidth? The reason we make these fancy video encoders in the first place is
to make the data to send much smaller, and while we all have plenty of power
on desktops, laptops and mobile is catching up quick, bandwidth still mostly
sucks, at least here in the US.

------
leeoniya
"We spoke to OTOY, and the company told us it has no immediate, specific plans
to publish it openly. That's not to say that the company is necessarily
opposed to doing so; rather, it's not sure how to do without undermining its
commercial products."

ah, that's why I couldn't find it on GitHub. and oh :(

~~~
randyrand
If it is decoded in Javascript on the browser (so... source code), can't we
figure out how to decode and encode the codec fairly easily?

Or would using the codec still be a copyright violation even if it is reverse
engineered?

~~~
battwell
Minified js isn't much easier to reverse engineer than compiled native code.

~~~
daeken
I've spent most of my life reversing native code, and I've also reversed a
good deal of minified JS. The latter is many, many orders of magnitude easier
and less time-intensive in _every_ case. Now, with things like Emscripten and
asm.js that will change (since the JS is effectively compiled native code),
but it will still be easier because, for example, you don't have to worry
about the code/data separation issues you face in most native code.

------
Strilanc
It seems like easy access to the same video with many different watermarks
would make it easy to find the watermark, or at least easy to destroy it?

Can you still identify who shared a video, if the shared video was created by
averaging ten copies with different watermarks?
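
A back-of-the-envelope sketch of that averaging attack (hypothetical code,
assuming a naive scheme where each recipient's watermark is a small random
perturbation added to the pixel values):

```javascript
// Hypothetical averaging attack on naive per-recipient watermarks:
// if each copy is the base frame plus small random pixel offsets,
// averaging many copies converges back toward the unmarked frame.
function averageFrames(frames) {
  const out = new Array(frames[0].length).fill(0);
  for (const frame of frames) {
    for (let i = 0; i < frame.length; i++) {
      out[i] += frame[i] / frames.length;
    }
  }
  return out.map(Math.round);
}

// Simulate ten watermarked copies of a flat gray frame (value 128).
const base = new Array(64).fill(128);
const copies = Array.from({ length: 10 }, () =>
  base.map((p) => p + (Math.random() < 0.5 ? -1 : 1))
);
const recovered = averageFrames(copies);
// Every averaged pixel lands within 1 of the original, so a +/-1 mark
// readable from any single copy is largely washed out.
```

Real watermarking schemes (spread-spectrum, collusion-resistant codes) are
designed to survive exactly this kind of collusion, but it does raise the bar.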

------
bmuon
> H.264 video is abundant, and is unambiguously "the winner" of all the
> current major video codecs

> The viability of watermarking as an alternative to DRM is also speculative
> conjecture (...) In spite of the changes in the audio market, the video
> market has remained firmly in favor of DRM

A week-old demo and the author is already saying it doesn't have enough market
share to be viable.

------
shacharz
Is the major advantage of ORBX.js that it can be decoded by leveraging WebGL?
Isn't that also possible with VP8/H.264?

~~~
azakai
Experiments with decoding of H.264 in JS+WebGL were done in the Broadway
project, and while (as mentioned in the article) reasonable frame rates in JS
alone were achieved, WebGL was not a huge benefit. The problem is that H.264
was not designed to run in a massively parallel manner like GPUs expect. There
is too much stuff that is serial, so you must run that on the CPU.

But a new codec like ORBX.js can be designed to run efficiently in WebGL;
that is what is interesting here.
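
To make the contrast concrete: a purely per-pixel stage like color-space
conversion is exactly what shaders are good at, since each output pixel
depends only on its own inputs, whereas H.264's entropy decoding is inherently
serial. A sketch of such a stage (plain JS here for readability; it translates
almost line-for-line into a fragment shader):

```javascript
// BT.601 full-range YCbCr -> RGB conversion: a purely per-pixel
// transform, so it parallelizes trivially (every pixel is independent).
// This is the kind of stage a WebGL-friendly codec can push to the GPU.
function yuvToRgb(y, u, v) {
  const clamp = (x) => Math.max(0, Math.min(255, Math.round(x)));
  const r = y + 1.402 * (v - 128);
  const g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128);
  const b = y + 1.772 * (u - 128);
  return [clamp(r), clamp(g), clamp(b)];
}

// Mid-gray stays mid-gray: yuvToRgb(128, 128, 128) -> [128, 128, 128]
```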

------
pjmlp
I wonder how much CPU it really uses. The Firefox PDF viewer always makes my
CPU usage jump almost to the top.

------
epeus
Yes they are, because existing video codecs are based on 1980s signal-
processing ideas, and increasingly video starts out as 3D models, which are
naturally better suited to a trimesh-and-textures approach. I wrote about it
here: [http://epeus.blogspot.com/2013/05/finally-some-progress-
in-v...](http://epeus.blogspot.com/2013/05/finally-some-progress-in-video-
codecs.html)

~~~
hnha
What are you talking about? Film is 2D. How do you imagine it to suddenly be
"encoded in 3D"?

~~~
epeus
The world is 3D. I bet the last few films you saw had extensive 3D modelled
scenes in them. I bet all the video games you play have 3D cutscenes.

~~~
KaiserPro
This is true, but they are pre-rendered for a reason. For anything with
water, you're looking at 24+ hours of sim time, plus rendering, plus over
1 TB of data. Subsurface scattering (translucency) takes a huge amount of
CPU time.

------
Shorel
As long as this means no more flash, it's a welcome improvement.

~~~
wmf
Not to your battery life.

~~~
Shorel
Neither is flash.

~~~
asdfs
Flash video typically uses hardware decoding.

~~~
Shorel
This article backs my claim:

<http://iss.oy.ne.ro/HTML5-Video-Battery>

Hardware decoding or not, Flash is always less efficient than the
alternatives.

------
ndesaulniers
> The use of WebGL and shader programs may improve the power efficiency
> somewhat, but they are still inferior to the dedicated motion video hardware
> found in all modern GPUs.

Um... what? Last time I checked, WebGL and "shader programs" ran on the GPU.
What am I missing?

~~~
p_l
The actual decoding passes don't translate well to shader architectures, so
GPUs include custom circuitry to handle various transformations involved in
decoding.

------
ndesaulniers
The author works for Microsoft. The proposal to use watermarking is in direct
contrast to Microsoft's PlayReady DRM implementation. Passing off a conflict
of interest as news.

------
wmf
Previous discussion: <https://news.ycombinator.com/item?id=5653531>

~~~
Shorel
Sadly, only focused on the 'games streaming' aspect.

------
ianstallings
This is fine, but the real reason h.264 is so well entrenched is that the
whole industry supports it at the hardware level with onboard chips and IP
cores. This is particularly important as mobile devices become prevalent.
Hardware decoding is essential to reduce energy usage and improve battery
life.

~~~
shmerl
VP8 is catching up in modern SoCs, so H.264 won't have that advantage in the
near future (it still has it now, given the existing hardware base).

Some things need to be clarified first, though, like Nokia's recent patent
sabotage against VP8. Not sure what is going on with that.

------
smiddereens
Sure, why not.

