
Thor – A Project to Hammer Out a Royalty Free Video Codec - TD-Linux
http://blogs.cisco.com/collaboration/world-meet-thor-a-project-to-hammer-out-a-royalty-free-video-codec
======
jngreenlee
"We also hired patent lawyers and consultants familiar with this technology
area. We created a new codec development process which would allow us to work
through the long list of patents in this space, and continually evolve our
codec to work around or avoid those patents. Our efforts are far from
complete, but we felt it was time to open this up to the world."

This burden has become far too great, if this is the cost necessary to
achieve innovation.

~~~
smithkl42
Amen. As Thomas Jefferson (correctly) described patents, they're a part of
positive law, not natural law. In other words, their only justification is
pragmatic, not moral. You can't "own" an idea the way that you can own a couch
or a car. We allow for (temporary) patent protection because it's supposed to
encourage innovation and help our economy. If it doesn't - and it's clearly
reached the point where it hinders rather than helps innovation - then we need
to change things.

~~~
manachar
To be clear, because some forget this part of patent law, the way patents were
meant to encourage innovation was by getting people to share and build on a
communal set of ideas. The temporary monopoly was the carrot to get people to
register their ideas in a central location (the patent office) rather than
lock them behind closed doors and secrecy.

Too many people think the carrot of the temporary monopoly was the point of
the patent office.

In other words, we know that the community of ideas gets better with more
sharing and building off of other people's ideas. As a society we decided to
make laws to encourage this sharing. As a result, technological progress was
immense.

Sadly, the current situation seems to strongly suggest that we may need to
find better ways to encourage sharing (Open Source has been great for much of
software).

~~~
joepie91_
Copyright was built with a similar purpose - enriching the public domain - and
has failed in a similar manner.

Both patents and copyright are failed experiments. They weren't meant to
'benefit creators' or 'guarantee an income', and they cannot take that role in
a healthy society.

~~~
forrestthewoods
Copyright has succeeded a thousand times more than it's failed. It's pretty
great for the most part. It enables a vast, vast number of jobs and new
creations. It could certainly be better, but I'd much rather have copyright as
it exists today than not have it at all. And don't forget that the GPL is only
enforceable due to copyright law.

~~~
Asbostos
Good point about GPL. Some people advocate for much shorter copyright terms,
just enough to make it worthwhile creating the work, like say 20 years. But
that would push some GPL software into the public domain, thus damaging the
GPL ecosystem by allowing commercial competitors to use the same code.

~~~
usrusr
Pardon me if I have trouble seeing a terrible loss for the GNU ecosystem if
releases from 1995 were transferred to the public domain.

~~~
Asbostos
There are some very old but hard-to-do-yourself things like compilers, early
versions of Linux, etc. Sure it won't be massive, but in the long term it'd
slow the ever-growing collection of GPL work, as the old stuff leaks out the
back end.

~~~
belorn
We are talking about works that would by then have had no patches for 20
years. Even compilers for dead languages get minor patches over time, if
nothing else then to fix memory leaks, crashes, infinite loops, and
compatibility with new OS kernels. You would, for example, have to allow
low-address memory mappings in the kernel to run many 20-year-old programs,
since many programs back then assumed they would start at address zero, which
modern kernels forbid for security reasons.
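
A minimal sketch of why (illustrative, Linux-only; the relevant knob is the
vm.mmap_min_addr sysctl, and the constants below are the x86-64 Linux values):

```python
import ctypes, ctypes.util

# Try to map the zero page the way a very old binary would expect to.
libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
libc.mmap.restype = ctypes.c_void_p
libc.mmap.argtypes = [ctypes.c_void_p, ctypes.c_size_t, ctypes.c_int,
                      ctypes.c_int, ctypes.c_int, ctypes.c_long]

PROT_READ, PROT_WRITE = 0x1, 0x2
MAP_PRIVATE, MAP_FIXED, MAP_ANONYMOUS = 0x02, 0x10, 0x20
MAP_FAILED = ctypes.c_void_p(-1).value

print("vm.mmap_min_addr =", open("/proc/sys/vm/mmap_min_addr").read().strip())

addr = libc.mmap(ctypes.c_void_p(0), 4096, PROT_READ | PROT_WRITE,
                 MAP_PRIVATE | MAP_ANONYMOUS | MAP_FIXED, -1, 0)
if addr == MAP_FAILED:
    # Fails with EPERM unless vm.mmap_min_addr has been lowered to 0.
    print("mapping page zero refused, errno =", ctypes.get_errno())
```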

------
halosghost
Actually, I'm still rooting for Daala (from Xiph.org, the same folks that did
so well with Opus). It's still a long way from being finished, but their work
is awesome and I've been following it for a while now!

Either way, having another effort competing to make a great format is not a
problem. Here's hoping it goes well!

~~~
derf_
Hello, I'm the Daala tech lead.

One of the things that made Opus a success was the contributions of others. We
certainly don't have a monopoly on good ideas. We'll take pieces of Daala and
stick them in Thor and pieces of Thor and stick them in Daala, and figure out
what works best. Some of that experimentation has already begun:

[https://github.com/cisco/thor/pull/8](https://github.com/cisco/thor/pull/8)

[https://review.xiph.org/874/](https://review.xiph.org/874/)

[https://www.ietf.org/proceedings/93/slides/slides-93-netvc-5...](https://www.ietf.org/proceedings/93/slides/slides-93-netvc-5.pdf)

Because none of us have a financial incentive to get our patents into the
standard, we're happy to pick whatever technology works best, as long as we
end up with a great codec at the end. Hopefully NETVC can replicate the
success of Opus this way.

~~~
xiphmont
Yeah, we're going to have to start arguing on what to name the hybrid soon,
since 'Vopus' is already taken.

~~~
cpeterso
Brainstorming fun names combining "Thor" and "Daala" reminded me of the Robert
E. Howard villains "Thulsa Doom" and "Thoth-Amon". :)

------
Animats
The MP4 patent situation needs another close look. MP4, which was first
standardized in 1998, ought to come out of patent soon, if it hasn't already.
There are a few remaining patents in the MPEG-LA package, but they're mostly
for stuff you don't need on the Internet, such as interlaced video, font
loading, error tolerance for broadcast, and VRML. This hasn't been looked at
hard since 2011[1] and it's time for a new look. Some of the key patents
related to motion compensation expired last April.[2]

It looks like the last patent on MP3 audio decoding expires next month.

[1]
[http://www.osnews.com/story/24954/US_Patent_Expiration_for_M...](http://www.osnews.com/story/24954/US_Patent_Expiration_for_MP3_MPEG-2_H_264/)
[2]
[http://scratchpad.wikia.com/wiki/MPEG_patent_lists#MPEG-1_Au...](http://scratchpad.wikia.com/wiki/MPEG_patent_lists#MPEG-1_Audio_Layer_3_patents)

~~~
TD-Linux
If by MP4 you are referring to H.264, there are still many years remaining on
most of the patents. MPEG-LA publishes patent lists, if you're interested to
look.

You are right in that there are many other encumbered technologies that have
patents expiring soon. MPEG-1 and MPEG-2 video, MP3 and AC3 audio, and several
container formats are included. Notably, this is almost all of the
technologies required to make a DVD.

~~~
Animats
Yes, MPEG-LA publishes lists, and they need to be looked at closely. Most of
the patents have expired. When you go down the list, you see things such as US
#6,181,712, which has to do with multiplexing two unrelated video streams into
one. Broadcasters and cable systems do this, but Internet video does not. US
#6,160,849 only applies to compression of interlaced video, which nobody uses
on line. US #7,627,041 is about dealing with missing header data due to
transmission noise. US #5,878,080 is about backwards-compatible multichannel
(>2 channels) audio, also seldom used on-line. US #6,185,539 is for video with
overcompressed extra-low-quality audio. So is US #6,009,399.

MPEG-LA has been padding their patent portfolio by dumping in all these
patents on little-used features. Until this year, they still had some good
patents, such as the ones on motion estimation, which is a hard problem and is
needed to make compression work. But those have now expired. What's left looks
like it can be avoided as unnecessary for Internet use.

~~~
nickpsecurity
I think you're right. It's worth the hard look, as it would let us take
advantage of the existing format instead of pushing a new one. One is always
easier than the other. Looks like we'll be doing the same for MP3 soon, too.
Glad you mentioned that one, as I was going to dodge it for a future project
for licensing reasons. Might not have to. :)

~~~
Animats
If you're looking at new compression ideas, take a look at FrameFree.[1] This
is a technology developed around 2005 at Kerner Optical, which was an effects
unit of Lucasfilm. It's a completely different approach to compression, not
based on frames. It's based on delaminating the image into layers which seem
to be uniform in motion, then interpolating by morphing each layer, then
reassembling the layers.

Because the interpolation operation is a full morph of a mesh (which GPUs can
do easily), you can interpolate as much as you want. Ultra slow motion is
possible. You can also up the output frame rate and eliminate strobing during
pans.
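
Purely as an illustrative sketch (the names and mesh layout here are made up,
not FrameFree's actual format), the per-layer interpolation boils down to
moving mesh vertices, which is why any in-between time is as cheap as any
other:

```python
import numpy as np

def morph_layer(verts_a: np.ndarray, verts_b: np.ndarray, t: float) -> np.ndarray:
    """Linearly interpolate an (N, 2) array of layer-mesh vertices."""
    return (1.0 - t) * verts_a + t * verts_b

# Ultra slow motion: 100 in-betweens from a single keyframe pair.
key_a = np.random.rand(64, 2)   # stand-in for a decoded layer mesh
key_b = key_a + 0.01            # the same mesh at the next keyframe
frames = [morph_layer(key_a, key_b, t) for t in np.linspace(0.0, 1.0, 100)]
```

(A GPU would then rasterize each warped mesh with its layer texture and
composite the layers back together; the sketch only moves the vertices.)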

Kerner Optical was spun off as a separate company, then went bust. The
technology was sold off, but nobody could figure out how to market it. The
delamination phase turned out to be useful for 2D to 3D conversion, which was
popular for a while. But Framefree as a compression system never went anywhere
after Kerner tanked. Nobody is doing much with it at the moment, and it could
probably be picked up cheaply. At one point, there was a browser plug-in for
playback and an authoring program, but they're gone. I'm not sure who to
contact; the "framefree.us" and "framefree.com" domains are both dead. Here's
the site's last readable version: [2] The remnants of the technology seem to
be controlled by "Neotopy LLC" in San Francisco, which is Tom Randoph's
operation.[3]

[1]
[https://hopa.memberclicks.net/assets/documents/2007_FFV_Comp...](https://hopa.memberclicks.net/assets/documents/2007_FFV_Compression_CPFvsBMA_Feb2_8_45am.pps)
(Open with OpenOffice Impress; it's a slide show.) [2]
[https://web.archive.org/web/20120905065521/http://www.framef...](https://web.archive.org/web/20120905065521/http://www.framefree.com/)
[3]
[https://www.linkedin.com/in/neospace](https://www.linkedin.com/in/neospace)

~~~
nickpsecurity
Very different type of technology. I could see this being much easier to
cheaply accelerate in hardware, too, due to its simplicity. Thanks for the
link! Will keep copies of this and see who's interested.

------
ChuckMcM
<sarcasm mode> No wonder Big Media hates tech, they are trying to take all
their money away. </sarcasm mode>

I think this is a great effort, and if you'll recall, Google attempted to do
the same thing with VP8, but found that people could file patents faster than
they could release code[1]. I would certainly support a 'restraint of trade'
argument, and a novelty argument which implies (although I know it's
impossible to currently litigate this way) that if someone else skilled in the
art could come up with the same answer (invention) given the requirements,
then the idea isn't really novel; it is simply "how someone skilled in the art
would do it." I've watched the courts stay away from that theory, probably
because it could easily be abused.

[1] Conspiracy theory or not, the MPEG-LA guys kept popping up additional
patent threats once the VP8 code was released.

~~~
jsprogrammer
Why attack novelty instead of non-obviousness? An expert can attest to
obviousness, but not necessarily novelty (you shouldn't need an expert for
that: simply produce the prior version).

~~~
ChuckMcM
In my experience (and that experience is limited; I've only participated as an
expert witness in two cases that have gone to trial), looking backward through
time on "obviousness" is really hard. Once you know how a magician does his
trick, it's obvious to you, but before you knew, it wasn't. By comparison,
multiple people from different places trying to solve the same problem and
arriving at exactly the same (or an infringing) solution speaks more strongly
to the notion of non-novelty. And it's something that has already happened
prior to disclosure, so you cannot argue that the other people were "taught"
by the patent disclosure. (One of the tenets of a patent is that it should
teach others skilled in the art how to do what it is you're patenting.)

Anyway, I'm not a lawyer, and none of this is legal advice or patent advice.
Just my thoughts (or perhaps frustrations) on how hard it is to generate
deliberately patent free technology. That difficulty suggests to me a way in
which patent law could be improved.

~~~
jsprogrammer
Sure, determining past obviousness can be hard. That's why you bring an expert
or many experts to attest to how obvious the technique is.

But you don't need an expert for novelty. Either you can show prior art or you
can't. I'll grant that there may be some edge cases where the prior art needs
some nuanced interpretation from an expert witness.

~~~
ChuckMcM
I think we agree :-). I was thinking of the more subtle version of novelty
which is perhaps best expressed as, "as requested". Here is a fictional
example of what I'm thinking about.

Let's say someone asks you to make a mud pie[1] and put bits of lava stone in
it. You make your mud pie, and then you patent a "system and method for
creating a mud pie with lava stones."

Perhaps there is no prior art because nobody asked for a mud pie with lava
stones, perhaps there is no prior art because others who made mud pies with
lava stones didn't see anything useful about it. But someone, somewhere, filed
a patent. And the patent office grants it.

The question I pose is: how do you come up with a defense that anyone skilled
in the art of making a mud pie would make one with lava stones in just that
way? And yes, I know all the legal arguments for why it doesn't work like
that, so my point is: how do we fix the patent system so that it stops
granting utility patents on methods, or combinations of methods, that would
likely be independently arrived at by anyone skilled in the art?

How do we fix it so that Cisco, writing their patent-free video codec in the
open, doesn't get "scooped" by someone taking their project, projecting out a
month or a year in advance what it is going to need to work, and then throwing
together a provisional filing that pre-dates the open source project getting
there, thus depriving the people working on Cisco's effort of their ability to
ship without hindrance?

[1] Really, just dirt and water.

~~~
throwawaykf05
_> Lets say someone asks you to make a mud pie[1] and put bits of lavastone in
it._

This is begging the question (in the original sense of the phrase). The
process is usually not somebody saying "make me a mudpie with lava stones". It
usually starts with "how do I make a more attractive mudpie?" There are
countless ways of doing so. You could use marbles, leaves, different mud,
different levels of consistency... But maybe using lava stones gives you the
most bang for the buck. So then you are really filing a patent on "method and
system for increasing mudpie attractiveness".

The infamous Amazon one-click patent can similarly be viewed that way. The
patent is not really solving the problem of "how do we enable purchases with
one-click?" (the solution to which is blindingly obvious) but of "how do we
get people to buy more things on our online store?" Now, the path from there
to "one-click buying" may also be obvious, no doubt, but it's not as obvious
as the path from "how do you build one-click?" to "here's how" simply because
the solution space is so much bigger.

------
fndrplayer13
Why not throw the weight behind VP9? edit: I actually am curious, this isn't a
question pointed at the validity of Thor. I just really want to see a great,
open-source standard emerge and see people get behind it.

~~~
thefreeman
I'm no expert, but the article lists VP9 as "proprietary", which I take to
mean not open source, and potentially not free. Though the "proprietary" label
could be a response to the issues they had with VP8 and the suspected
reactionary patenting.

~~~
oconnor663
VP9 is open source and royalty-free though. I'm confused.

~~~
slacka
And it's not just open source, it's BSD, which is about as non-restrictive as
you can get. So it's now 1) open source, 2) royalty-free, 3) free as in beer,
and 4) free as in freedom (open source, OSI-certified BSD).

Software like that can now be called proprietary?

pro·pri·e·tar·y : of or relating to an owner or ownership.

So what is this guy saying? That anything with a company behind it is
proprietary? Linux has got LMI, so I guess that's proprietary, and Firefox has
got Mozilla. LibreOffice... by this guy's twisted version of reality, what is
not proprietary?

~~~
0x09
VPx reference software is open source. The article is not referring to the
software but the specification which is developed and published by Google (a
private company), as opposed to an open standards consortium like ISO, ITU or
the IETF.

VP8 was published as an informational RFC under the IETF, but not as part of a
standards track, see "Not All RFCs are Standards":
[https://tools.ietf.org/html/rfc1796](https://tools.ietf.org/html/rfc1796)

~~~
indolering
Parent has the right idea. One of the few valid criticisms of VP8 was that the
code /was/ the standard, so you had to reverse engineer the encoder/decoder.
This is not only a PITA but also prompts questions such as whether an obvious
bug is part of the spec.

It also took a full year after Google bought the company behind VP8 to
actually release the code. Someone from Firefox actually wrote an open letter
to Google basically asking WTF was going on.

I don't have any first or even second hand knowledge of the current situation,
but I suspect that Google has continued to ... not collaborate as much as
everyone else would like.

(Caveat emptor: the above is based off of memory of events that took place a
few years ago.)

------
russtrotter
Wasn't Ogg Theora created under just the same principles? I'm not smart enough
in all things codec to know how it stacks up technically, but best I can tell,
it's unencumbered.

[https://en.wikipedia.org/wiki/Theora](https://en.wikipedia.org/wiki/Theora)

~~~
theandrewbailey
Theora simply can't do what H264 can.

example: [http://www.streaminglearningcenter.com/articles/ogg-vs-h264-...](http://www.streaminglearningcenter.com/articles/ogg-vs-h264---round-one.html)

~~~
nickpsecurity
Deleted a prior comment on "why not theora?" because this link is what I was
asking for. Thanks.

------
JoshTriplett
> Google’s proprietary VP9 codec

That's an _odd_ choice of phrase; it's unfortunate that a press release
chooses to disparage alternatives without explanation.

~~~
anon1385
Why is that odd? It is pretty much a textbook example of a proprietary
standard.

Here are some definitions of "proprietary" as used by members of the FOSS
community when talking about standards:

[https://news.ycombinator.com/item?id=4634957](https://news.ycombinator.com/item?id=4634957)
(BrendanEich)

>"Proprietary" as in "sole proprietor" is appropriate for a project with zero
governance, launched by Google after some incubation closed-source, dominated
by Googlers.

[https://news.ycombinator.com/item?id=9395992](https://news.ycombinator.com/item?id=9395992)
(pcwalton, Mozilla employee and Rust core developer)

>In a competitive multi-vendor ecosystem like the Web, public-facing protocols
that are introduced and controlled by a single vendor are proprietary,
regardless of whether you can look at the source code. NaCl and Pepper, for
example, are proprietary, even though they have open-source implementations.

~~~
FooBarWidget
That is a really odd thing to claim, given that there are so many proponents
of the MIT license. People claim that MIT is "more free than GPL" because MIT
has fewer restrictions. GPL has more restrictions, and although those
restrictions are there to ensure that the same rights are passed on to other
people, MIT proponents don't buy that argument: they argue that even if
someone forks an open source project, slaps a GUI on it, and sells it as a
proprietary product, no freedom is lost because the original is still
available. It does not matter that you cannot contribute to the fork. The
ability to make proprietary forks is seen as good.

Yet when applied to Google's products, this is suddenly viewed in a different
manner? Even if the maintainer does not accept patches, you can still fork it,
so no freedom is lost. And it's ok for _other_ people to make a proprietary
fork, but not ok for the author to make a proprietary fork? That sounds like
hypocrisy to me.

~~~
anon1385
Did you reply to the correct comment? Your post doesn't make much sense. We
are talking about standards, not code. Code is an implementation of the
standard, and the license of the code is irrelevant to the status of the
standard.

Forking standards is completely different to forking a codebase. It should be
obvious why.

------
JustSomeNobody
I am sure that some entity holds a broad enough patent that all your bases
will belong to a Texas court.

~~~
saidajigumi
And that's the real problem. Heck, it doesn't even need to be an NPE, it just
needs to be one of the patent holders they're "avoiding" who wants to fire up
some litigation.

They don't even need to be able to win. An existing "legit" patent holder
might choose to simply throw lawyers at it as a tactic to delay or defeat a
potential competitor. In that case, it comes down to a cost/benefit analysis
for the would-be litigator.

~~~
TD-Linux
Certainly the risk is better with a royalty-free video codec, though. In the
case of Daala, the goal is to be sufficiently different to avoid these broad
patents, too.

Also, any companies contributing to the NETVC standard are required to declare
IPR, which is not the case for MPEG standards.

~~~
saidajigumi
> Certainly the risk is better with a royalty-free video codec, though.

I'm not sure what this means. A royalty-free video codec basically has a
bullseye painted on it from the perspective of existing rightsholders. The
only reason such entities might exercise restraint is because the cost/benefit
analysis doesn't support litigation. Even if they don't think a competitor
codec is a threat at the outset, there's nothing stopping an attacker from
just sitting on the sidelines and waiting until the threat profile (and depth
of infringers' pockets) becomes clearer. I.e. the "submarine patent" model,
except it could even be a known patent in this case.

edit: clarity.

~~~
TD-Linux
Er, sorry, you're right, that was a bad assertion. The submarine patent risk
is pretty much the same in both cases - RF does not make it better. Early
declarations to discourage use are more likely to be aimed at RF codecs, but
are also easier to deal with.

I think Daala's development process (and IPR disclosure policies at the IETF)
reduce the risk substantially. However, this is not automatically true of any
RF codec.

------
bobajeff
So... this is a separate project from Daala which Cisco also works on. Is
there a story here?

~~~
CUViper
*Daala - but yes, I was wondering the same thing...

~~~
bobajeff
Oops, I somehow missed that. Good catch.

------
Ono-Sendai
What I would like to see is a video codec that has a library implementation
for reading and writing video in that format, that is cross-platform and
relatively easy to build, like libjpeg or libpng do for images. I have tried
to build VP9 on Windows and it was a tedious and ultimately unfruitful
process.

I don't really care about the compression ratios achieved, or speed of
compression/decompression.

Something like motion JPEG would be good, if it was actually a proper standard
(AFAICT it isn't).

~~~
fndrplayer13
Motion JPEG isn't resilient. H264/H265/VP9, etc. all build on some of the
ideas that JPEG introduced, but add features that allow the stream to be
resilient to dropped packets or frames.

It's a cool idea, it just doesn't work in practice, especially since a lot of
these video standards are transmitted over UDP.

~~~
Ono-Sendai
Interesting. I would have thought it would be reasonably resilient though, due
to its intra-frame nature. If you get lost in the stream you could scan
forward to the next JPEG header.
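
For instance, a minimal resync sketch (hypothetical; it assumes each MJPEG
frame is a bare JPEG, so you hunt for the next start-of-image marker, FF D8
followed by FF for the first segment):

```python
SOI = b"\xff\xd8\xff"  # JPEG start-of-image plus first segment marker byte

def next_jpeg_offset(buf: bytes, start: int = 0) -> int:
    """Offset of the next plausible JPEG frame start, or -1."""
    return buf.find(SOI, start)

# Skip damaged bytes and find the following frame boundary.
stream = b"\x00garbage\xff\xd8\xff\xe0rest-of-frame..."
print(next_jpeg_offset(stream))  # -> 8
```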

~~~
fndrplayer13
That's true in a sense, actually. The thing is, with H264/H265/VP9, etc.,
dropped packets or missing data are somewhat acceptable as long as you have a
key frame. You just end up with interpolated or 'guess' data (aka the droopy
or ice-cream frame effects). With Motion JPEG the frame typically just freezes
until another good frame is retrieved and decoded. Motion JPEG isn't really a
standard, though, so perhaps what I've seen doesn't match others' experiences.
MJPEG is cool, though, for stuff on your local network or other places where
you know you'll have a reliable dedicated network. If you have a nice wifi or
wired network you can crank up the quality on an MJPEG stream and get some
really gorgeous streaming video, as long as the receiving devices can
effectively buffer all that network data :)

------
donpdonp
Didn't we already go through this with VP8/VP9/WebM?

~~~
TD-Linux
VP9 is a good choice if you want a royalty free video codec right now. NETVC
is shooting for the next generation. In addition, the goal is to get the video
codec standardized at the IETF.

NETVC participation is open to anyone though, so it is possible more players
will show up.

~~~
jerf
"NETVC is shooting for the next generation."

What makes a codec "next generation"? I assume, broadly, it involves trading
off yet more computation for a tighter encode? (As a nearly
embarrassingly-parallel problem, video coding continues to get faster with
more silicon even if serial performance is stagnant.) What kind of gains can
we expect from the "next generation"?

All honest questions, BTW. Links welcome, though something focused on this
question and not just a laundry list of features with no reference to the past
would be preferred.

~~~
astrange
Video encoding is not embarrassingly parallel; no kind of compression ever can
be, because if any bit doesn't depend on the previous bit you've wasted it. It
is pretty suited to ASICs.

Codecs are only efficient up to a certain image size, and then stop working
because all the details are too large-scale for them. HEVC works much better
than H.264 on 4K. Besides that, there's higher bit depth pixels, 3D, that kind
of stuff.

Also there's usually so many mistakes and compromises in any standard that you
can always find something to fix in the next one.

~~~
jerf
"Video encoding is not embarrassingly parallel; no kind of compression ever
can be, because if any bit doesn't depend on the previous bit you've wasted
it."

That objection makes no sense. It just implies that, at worst, parallelization
may cost some encoding efficiency. In general, we are quite often willing to
pay that price, with gusto, given the speedup we can obtain. For instance,
[http://compression.ca/pbzip2/](http://compression.ca/pbzip2/).

~~~
astrange
* no compression aiming for efficiency can be

If you have that much need for a speedup, you probably have multiple video
streams going (like you're Youtube or a livestream broadcaster). In that case,
it's better to do one video per CPU, and now you really are parallel.
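
Something like this hypothetical sketch, where encode_stream stands in for a
real encoder invocation and the parallelism lives across independent videos
rather than inside any one encode:

```python
from concurrent.futures import ProcessPoolExecutor

def encode_stream(path: str) -> str:
    # Placeholder: a real implementation would run an encoder on `path`.
    return path + ".webm"

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:  # one worker per CPU by default
        outputs = list(pool.map(encode_stream, ["a.y4m", "b.y4m", "c.y4m"]))
```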

Also, you can get up to 4x parallel through slice-threads safely on one video,
or 16x through x264's frame-threads if you don't care about your target
bitrate. I wouldn't consider that embarrassingly parallel until it's up to
1024x or so, but maybe you do.

~~~
nitrogen
Are there not stages of compression that are highly parallelizable, though?
Like basic transformations that operate locally on the image (maybe DCT, per-
block motion vector calculation)?

~~~
astrange
Sure, if they worked on the original image.

But that doesn't happen - when you're encoding, the DCT isn't actually run on
the image but on the output of previous compression steps (prediction) which
are based on the last encoded block. So there's a dependency on every pixel of
the image to the upper left of you.
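
A toy sketch of that dependency (illustrative only; real codecs differ in the
details), using an 8x8 block where the prediction comes from
already-reconstructed neighbors, so the next block can't start until this one
has been encoded and reconstructed:

```python
import numpy as np
from scipy.fft import dctn, idctn

def encode_block(block, left_recon, top_recon, q=16.0):
    # Predict from *reconstructed* neighbors (here: simple DC prediction).
    pred = np.full_like(block, (left_recon.mean() + top_recon.mean()) / 2.0)
    # The DCT runs on the residual, not on the original image.
    coeffs = np.round(dctn(block - pred, norm="ortho") / q)
    # Rebuild exactly what the decoder sees, for the next block's prediction.
    recon = pred + idctn(coeffs * q, norm="ortho")
    return coeffs, recon

blk = np.random.rand(8, 8)
coeffs, recon = encode_block(blk, np.random.rand(8), np.random.rand(8))
```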

And when you're decoding, it just never ends up being worth it to read ahead
in the bitstream so that you have a whole frame of motion vectors to process
at once. The whole data locality thing.

------
datashovel
There should be efforts outside of large corporations dedicated to building
these standards, because in general, even when large corporations promise free
/ open-source licensing, they really only mean non-commercial licensing or
"open with caveats". So they pretty much own the commercial rights.

I want the open-source community to subsidize a small team of engineers to
create a completely open standard that no single entity owns and that everyone
is free to branch / fork.

------
yabun
There really needs to be a change to patent law around independent derivation
of a concept. At the very least we need to look into generalised
thicket-busting laws. The current situation is fundamentally unscalable.

------
dharma1
It seems to me that success will depend on quality and on whether chip
manufacturers embrace this for hardware encoding/decoding. Right now it looks
to me like H265 is the winning horse.

~~~
masklinn
Thor is part of the NetVC effort, tentatively a royalty-free successor to
HEVC.

------
yjm
I wonder how many orders of magnitude slower this one will be compared to
x264. VP8/9 was like 9x slower last time I checked.

~~~
TD-Linux
It is currently quite a bit slower, but the goal is to make a codec fast
enough for real time communication use.

VP9 is still about 9x slower than x264, but yields the same quality at half
the bitrate. You can set VP9 to run a lot faster, but you'll lose some of the
bitrate advantages. Still, VP9 is practical for a lot of applications, such as
Youtube.

~~~
bitmapbrother
VP9 being 9x slower than x264 is hard to believe. Do you have a citation?

~~~
mappu
"VP9 encoding (using libvpx) is horrendously slow – like, 50x slower than
VP8/x264 encoding. This means that encoding a 3-minute 1080p clip takes
several days on a high-end machine. ... libvpx multithreading [encoding]
performance is deplorable. It gains virtually nothing."[1]

1\. [https://blogs.gnome.org/rbultje/2014/02/22/the-worlds-fastes...](https://blogs.gnome.org/rbultje/2014/02/22/the-worlds-fastest-vp9-decoder-ffvp9/)
n.b. x264 comparisons were taken with `--preset veryslow` which understates
x264's potential performance by an order of magnitude. From the same link: "it
can be fast, and it can beat x264, but it can’t do both at the same time."

~~~
TD-Linux
This is old. libvpx 1.4.0 is a lot faster now and has multithreading. On my
i7-4900MQ laptop, I get about 3fps encoding 1080p content. Still very slow,
but that's roughly 24 minutes for a 3-minute clip (about 4,300 frames at
24fps, divided by 3fps, is about 1,440 seconds), not days.

------
s9w
This seems like fantastic news after the HEVC patent disaster.

Has anyone tested this or has more information on the performance/quality vs
other codecs?

~~~
TD-Linux
I have, at my website (for objective metrics)
[http://arewecompressedyet.com/](http://arewecompressedyet.com/)

Summary is that Thor is performing at a slightly better level than Daala and
worse than VP9 or H265. But it's also missing a lot of features right now, and
the encoder is only tuned for PSNR.

------
electriclove
Why not simply work with the VP9 project rather than starting a new effort?
Per Wikipedia: "VP9 is an open and royalty free[3] video coding format being
developed by Google."

------
shmerl
So Daala will be fused with Thor, like what happened with CELT and SILK to
create Opus? Does that make sense technically, or are they radically
different?

------
jsprogrammer
Is it common to characterize BSD licensed software as proprietary? As in,
'Google’s proprietary VP9 codec'?

~~~
TD-Linux
This refers to the fact that the bitstream format of VP9 has not been
standardized anywhere, whereas the NETVC group intends to produce a standard.

------
Navarr
Yay another one!

------
josu
So [https://xkcd.com/927/](https://xkcd.com/927/)?

~~~
yellowapple
The comparison would be more accurate if most of those 14 competing standards
were proprietary and burdened with patents galore; the stated goal of Thor
isn't to try to unify a bunch of standards, but instead to actually create a
proper non-proprietary standard.

------
codebeaker
See also the massively popular, and critical-to-Rails, gem
[https://github.com/erikhuda/thor](https://github.com/erikhuda/thor) - naming
is hard.

~~~
dvt
Isn't Thor a Norse god, too?

~~~
dalke
And a washing machine, and a family of satellites, and a ramjet engine.
[https://en.wikipedia.org/wiki/Thor_%28disambiguation%29](https://en.wikipedia.org/wiki/Thor_%28disambiguation%29)
. Not listed there, also a special purpose DBMS for small molecule chemistry
data.

