
Free Lossless Image Format - davidbarker
http://flif.info/
======
iraphael
This is really interesting. One of the most interesting parts is the
progressive decoding/responsive images.
[http://flif.info/example.php](http://flif.info/example.php) (go to the bottom
of the page).

Basically, the last example shows that, if you want a scaled version of the
image, you can simply stop decompressing. No need for multiple image files.
Just create one at very high quality and decompress until you get the quality
you want, then scale it down with HTML.

edit: more info on the responsive side of FLIF:
[http://flif.info/responsive.php](http://flif.info/responsive.php)
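
To make the "just stop downloading" idea concrete, here is a rough sketch in
Python. It is purely illustrative: the byte budget and the idea of fixed
truncation points are assumptions, not anything specified by FLIF itself.

```python
import io

def read_prefix(stream, byte_budget, chunk_size=4096):
    """Read from `stream` until we have `byte_budget` bytes, then stop.

    Mimics a client that aborts the transfer once it has enough of a
    progressive image for the target display size.
    """
    out = bytearray()
    while len(out) < byte_budget:
        chunk = stream.read(min(chunk_size, byte_budget - len(out)))
        if not chunk:
            break  # end of file reached before the budget was spent
        out.extend(chunk)
    return bytes(out)

# Pretend this is a progressive image: a decoder can render any prefix of it.
full_file = bytes(range(256)) * 100
thumbnail_bytes = read_prefix(io.BytesIO(full_file), 1024)
assert len(thumbnail_bytes) == 1024
assert full_file.startswith(thumbnail_bytes)  # a prefix, not a re-encode
```

The key property is the last assertion: the thumbnail's bytes are literally a
prefix of the full file, so no second encode or embedded thumbnail is needed.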

~~~
rplnt
This would be great. A lot of "responsive" sites nowadays use only one image
as well, the highest resolution available. So you are downloading that 4K jpeg
whether you are on mobile or on a desktop.

~~~
colanderman
Progressive JPEG has supported this for decades. Anyone over 25 should
remember them from the dial-up years.

~~~
hansjorg
I think the suggestion is that the client could just stop downloading at a set
threshold, maybe taking current bandwidth, cost, etc. into consideration.

Judging from the video, something like that could work very well (and much
better than with progressive jpegs).

~~~
guelo
It's an interesting concept. But I'm not sure how it could work. Resizing
introduces artifacts so best quality is always going to be achieved by
reencoding for different resolutions. Also, how would the client know when to
stop downloading?

~~~
iopq
The client would download the first 1KB for an icon, 4K for a thumbnail and
the whole image for the full size. The best part is that all of those are
actually the same file, not separate embedded thumbnails inside the file.

The client could decide BY ITSELF to download only 2K for a thumbnail because
it's currently on 3g.

~~~
guelo
Where are you getting 1KB and 4KB from? The compression is highly dependent on
the image so there's no way to know the quality of the thumbnail image.

~~~
iopq
Maybe a future version of the format could specify useful cut-off points
inside the header for browsers to base the decision on. Then the browser could
decide to download the 4K version on wifi and the 1K version on 3g.
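
A hypothetical header along those lines might look like the following sketch.
Everything here (the layout, the `FLIF` magic, the offsets) is made up for
illustration and is not part of any actual FLIF specification.

```python
import struct

# Hypothetical layout: a small header listing byte offsets at which a
# truncated file still decodes to a usable image.
#   magic  : 4 bytes
#   count  : uint8, number of cut-off points
#   points : count big-endian uint32 byte offsets, ascending
def pack_cutoffs(points):
    return (b"FLIF"
            + struct.pack(">B", len(points))
            + struct.pack(f">{len(points)}I", *points))

def choose_download_size(header, on_fast_network):
    assert header[:4] == b"FLIF"
    (count,) = struct.unpack_from(">B", header, 4)
    points = struct.unpack_from(f">{count}I", header, 5)
    # Fast network: take the largest cut-off; slow network: the smallest.
    return points[-1] if on_fast_network else points[0]

header = pack_cutoffs([1024, 4096, 262144])
assert choose_download_size(header, on_fast_network=False) == 1024
assert choose_download_size(header, on_fast_network=True) == 262144
```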

------
jonsneyers
To clarify: at the moment FLIF is licensed under the GPL v3+. Once the format
is finalized, the next logical step would be to make a library version of it,
which will most probably be licensed under the LGPL v3+, or maybe something
even more permissive. There is not much point in doing that while the format
is not yet stable. The fact that FLIF is GPL v3+ now doesn't mean we can't add
more permissive licenses later.

And of course I'm planning to describe the algorithms and the exact file
format in a detailed and public specification, which should be accurate enough
to allow anyone to write their own FLIF implementation.

~~~
davidw
BSD/MIT, or ASL2 are pretty much the standard if you want to see something as
widely used as possible, which this sounds like might be the case.

Cool work in any case!

One thing the page could use is "how much processing power does this use,
compared to other things?"

~~~
SomeCallMeTim
Yeah, GPLv3 AND LGPLv3 are the kiss of death for any corporate use. The patent
clause in GPLv3 makes it _legally impossible_ [1] to be used by any
corporation that licenses patents; it's a completely broken clause, and GPLv3
should be banished.

Without a BSD/MIT/ASL2 license as an option, you'll never see this in Internet
Explorer or Chrome. Probably not even Firefox.

[1] Companies license patents in bulk from other companies _for their own use_
all the time. They don't have the right to sublicense those patents to others;
they're just protected against any lawsuits relevant to the use of the ideas
in those patents. Yet GPLv3 _requires_ that they provide a free license to
_any_ patent that they have a license to that might be required to use GPLv3
source code. So it's requiring them to do something they legally can't do.
Selecting GPLv3 means that no large company will ever touch it as a result.

~~~
paxcoder
You are wrong, and a drama queen.

The only thing the (L)GPLv3 patent clause obliges _propagators_ to do (i.e.
not mere users; not even mere modifiers) is to grant their _users_ licenses to
_applicable_ patents which _they own_. The permissive Apache license v2
demands from contributors the very same. It is the software patents that
should be "banished", not freedom preserving licenses.

TL;DR: (L)GPLv3 prevents patent trolling through free software.

Disclaimer: This is not to be treated as legal advice.

~~~
SomeCallMeTim
You're not a lawyer. Lawyers at big companies accept Apache's demands and
categorically reject GPLv3's.

Doesn't matter what you or I think of software patents (yes, they should be
banned). Doesn't matter what you or I _think_ GPLv3 says.

The lawyers at big companies see it as a problem, so it's a problem. End of
discussion. No drama required; it's just the fact that big companies avoid
using anything cursed with GPLv3.

~~~
paxcoder
Unlike you, I have at least taken the time to read the relevant license parts
before discussing them.

Note that what you've argued before was very different from what you're saying
now. You've narrowed the scope of the discussion (leaving out LGPL), but also
its very nature (" _legally impossible_ [1]", eh?).

Anyway, companies do use software licensed under both licenses, and even
incorporate them into their services - hence the need for AGPL. Maybe others
wouldn't see GPLv3 as that much of a problem if people didn't spread FUD about
its supposed "curse" ("no drama required" but you couldn't help it, huh?). And
if they didn't defend harmful practices, like Linus does tivoisation.

But mostly what companies avoid is copyleft, because it mandates reciprocity
and prevents leeching the community. For projects such as these, LGPL is an
acceptable compromise. The only valid argument against it is that apps under
incompatible licenses will not be able to use it where dynamic linking is
barred, and such is the requirement for apps in the Apple's store. However, in
this particular case, that wouldn't be a problem either if the platform itself
provided a decoder, like iOS does for PNGs.

~~~
SomeCallMeTim
>Doesn't matter what you or I think GPLv3 says.

I _have_ read the license. I'm like that.

And it _still doesn't matter._

It's what the lawyers think. And they say it's verboten.

~~~
paxcoder
There are only so many options here. Either:

* You are lying that you read the license.

* You were lying about what the license says.

* You really don't want to admit that you have misunderstood the license.

Either way, you were wrong then and you are wrong now about what the lawyers
think. Speaking of which, there are only a few possibilities here as well,
only these are not mutually exclusive:

* You are intentionally dishonest because you have an agenda

* You are dishonest just to cover your behind

* You are genuinely careless about what you say

Even if we change the word 'think' for 'say', it's still a gross
overgeneralization.

So all things considered, in the best case scenario, you refuse to admit when
you are wrong, and will continue overgeneralizing. Forgive me, but it's really
not worth the effort arguing under these circumstances. If you wanted to
continue, you would have to make some concessions, but I doubt you will, so in
all probability: Goodbye.

~~~
SomeCallMeTim
>You were lying about what the license says.

I was reporting what I was told by corporate lawyers. My own reading of the
patent section _does_ happen to side with the lawyers' reading: that if you
distribute an app that's covered by a patent you hold a license to, you need
to arrange a sublicense for all users of that software. Maybe not
technically "impossible," but I didn't count "spending millions of dollars to
fix the problem" among the likely corporate responses when I said
"impossible." Especially when most GPLv3 code can be written from scratch for
less than the cost to license patents.

Someone alleged Blizzard uses it; fine, their lawyers either disagree, weren't
consulted, or are being ignored, but Blizzard doesn't make Chrome, Firefox, or
Internet Explorer, so the point is moot if you care about web adoption, which
is what would make the format relevant to anyone but a game developer.

What matters is that the lawyers for the big companies that control Chrome
and IE won't let GPLv3 code into the code base. Many other big-company lawyers
take the same position (probably _all_ companies above some size threshold),
and that's all I've been alleging from the start. Criticize my delivery all
you want, but that's what I was trying to say.

My agenda is to get the developers to change to a license that _could_
actually be adopted into a web standard. Since you're refusing to actually
read what I'm saying, I agree: Goodbye.

~~~
paxcoder
What would cost millions of dollars? Please cut the drama out already and
limit yourself to arguments. You're finally starting to display an
understanding of the patent clause.

Companies wouldn't adopt things under "GPLv3", but they wouldn't permit
GPLv2 either. Or LGPL. Or Apache 2. Or MIT, or BSD, or any license. They
permit nothing short of contributors assigning them copyright and the patents,
just them (see e.g. Webkit's and Chromium's copyright notices and CLAs). And
yet, libpng is under a permissive license. So yeah, I agree they'd write their
own library - out of their selfishness. Let them.

With the "adopting a web standard" thing you're attempting to move the
goalposts further. But you fail, and not because your implication that
standards bodies would accept permissive licenses is wrong - which it is,
because they're exclusively public domain + patent clause (oh, and the people
building browsers still contribute somehow). You fail because you're mixing
apples and oranges again; programs are not parts of standards. Standards
describe file formats, and prescribe the behavior of programs that process
them. They are not concerned with implementations' licenses.

The spec can become a public domain standard, and all would still be well with
the library under (L)GPLv3+. Free software should have the edge.

------
tfinniga
This is an exciting development.

I think the concerns about the licensing are a bit premature - honestly, this
is currently a research project, not a practical replacement for existing
image formats. The licensing is only one of several impediments to adoption.

\- The format has no spec

\- The format may change, rendering all previous images unreadable

\- The format has no JavaScript implementation, and no way of running on old
browsers.

\- As far as I can tell, there hasn't been a patent search done to see if it
violates other patents from other organizations. For example, this one:
[http://www.google.com/patents/US7982641](http://www.google.com/patents/US7982641)

\- No peer-reviewed paper published yet?

However that doesn't mean that we can't recognize the accomplishment - this
looks like a very promising research result, and I hope the project continues!
Good work Jon!

~~~
mmastrac
> \- No peer-reviewed paper published yet?

While there's no paper, there's a full Github repo, which I'd actually say is
quite impressive.

~~~
joshvm
As someone who regularly has to read supposedly landmark papers - with no
source code provided - this is far preferable in my opinion. Computer vision
is terrible for this!

------
lt
This is really interesting! There seems to be some additional technical
information here:

[https://boards.openpandora.org/topic/18485-free-lossless-
ima...](https://boards.openpandora.org/topic/18485-free-lossless-image-format-
flif/)

    
    
      - for interlacing it uses a generalization of PNG's Adam7; unlike PNG, the geometry of the 2D interlacing is exploited heavily to get better pixel estimation, which means the overhead of interlacing is small (vs simple scanline encoding, which has the benefit of locality so usually compresses better)
    
      - the colorspace is a lossless simplified variant of YIQ, alpha and Y channel are encoded first, chroma channels later
    
      - the real innovation is in the way the contexts are defined for the arithmetic coding: during encoding, a decision tree is constructed (a description of which is encoded in the compressed stream) which is a way to dynamically adapt the CABAC contexts to the specific encoded image. We have called this method "MANIAC", which is a backronym for "Meta-Adaptive Near-zero Integer Arithmetic Coding".
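
The "lossless simplified variant of YIQ" isn't spelled out in the quote. As an
illustration of how a reversible integer color transform of that kind works,
here is the well-known YCoCg-R transform in Python; this is an assumption for
demonstration purposes, not necessarily FLIF's exact transform.

```python
def rgb_to_ycocg_r(r, g, b):
    """Reversible YCoCg-R forward transform (integer arithmetic, lossless)."""
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    return y, co, cg

def ycocg_r_to_rgb(y, co, cg):
    """Exact inverse: recovers the original integers bit-for-bit."""
    t = y - (cg >> 1)
    g = cg + t
    b = t - (co >> 1)
    r = b + co
    return r, g, b

# Round-trips exactly, which is what makes such a transform usable
# in a lossless codec.
for rgb in [(0, 0, 0), (255, 255, 255), (12, 200, 77)]:
    assert ycocg_r_to_rgb(*rgb_to_ycocg_r(*rgb)) == rgb
```

Because the chroma channels (Co, Cg here) concentrate less perceptually
important information, encoding luma first is what lets a truncated stream
still look reasonable.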

~~~
versteegen
Great, thanks. It seems there is no technical description available anywhere
beyond what you quoted. They haven't written it up, and I couldn't even find
comments in the source providing details. That's too bad, but he (I assume
that's Jon Sneyers) does say he hopes to write it up later.

Also, comments in that thread on speed [1]:

    
    
         In terms of encode/decode speed: both are slow and not very optimized
         at the moment (no assembler code etc, just C++ code). A median file
         took 3 seconds to encode (1 second for a p25 file, 6 seconds for a p75
         file), which is slower than most other algorithms: WebP took slightly
         less than a second for a median file (0.5s for p25, 2s for p75), PNG
         and JPEG2000 took about half a second. It's not that bad though: BPG
         took 9 seconds on a median file (2.5s for p25, 25s for p75), and
         brute-force pngcrushing took something like 15 seconds on a median
         file (6s for p25, over 30s for p75), so at least it's already better
         than that.
    
         Decode speed to restore the full lossless image and write it as a png
         is not so good: about 0.75s for a median file, 0.25s for a p25 file,
         1.5s for a p75 file. That's roughly 3 to 5 times slower than the other
         algorithms. However, decoding a partial (lossy) file is much faster
         than decoding everything, so in a progressive decoding scenario, the
         difference would not be huge.
    

[1]: [https://boards.openpandora.org/topic/18485-free-lossless-
ima...](https://boards.openpandora.org/topic/18485-free-lossless-image-format-
flif/?page=2#comment-399001)

~~~
GhotiFish
Awesome, thanks for the info! If FLIF does ultimately consume more CPU
resources, then that is a trade-off I'm perfectly happy to make; I'd rather
burn CPU than my heinously bandwidth-capped internet.

~~~
omribahumi
Here in Israel, you get a big bandwidth cap for a cheap price - I'm paying
around 13 USD/month for a package including unlimited voice calls, unlimited
SMS and a 3GB data plan.

What interests me most is battery life - is more CPU and less radio power
better?

------
rwinn
Wow! Higher compression than any other format, full transparency support,
progressive/partial loading AND animations?! I can't wait for this to get
widespread adoption!

Github repo here btw:
[https://github.com/jonsneyers/FLIF](https://github.com/jonsneyers/FLIF)

~~~
digi_owl
GPLv3 will be a hard sell...

~~~
lucian1900
In fact, it makes it impossible to use in practice. None of the major browsers
are GPLv3.

~~~
zanny
The GPL is compatible with BSD, MPL, and MIT licenses that Firefox / Chromium
use.

The problem is more that Firefox / Chrome ship proprietary bits, like the DRM
modules, that would violate the GPL's coverage of linking against flif. And
that Chrome itself is proprietary.

It's going to need to be relicensed LGPL to be included.

~~~
snuxoll
> The GPL is compatible with BSD, MPL, and MIT licenses that Firefox /
> Chromium use.

Yes, it is compatible, but only in one direction: you can take BSD, MPL, or
MIT code and adopt it into a GPL project, not the other way around.

EDIT: Even LGPL won't work here as it will prevent use on Windows Phone, iOS
and Android.

~~~
awalton
_bzzt_.

LGPL code is everywhere in Android and iOS. There are numerous apps built on
GStreamer for both platforms, which is LGPL.

I wouldn't be surprised if Microsoft pulled the pig-headed move though.

~~~
snuxoll
> LGPL code is everywhere in Android and iOS.

You're not going to find LGPLv2.1 or LGPLv3 anywhere in them, however.
Starting with LGPLv2.1 you are required to allow the end user to replace a
compiled binary you provided of the LGPL'ed component with their own,
something that obviously cannot be guaranteed on any of the mobile platforms I
listed.

I suppose I should have clarified the version, but it's important to note that
the FSF has tried to pull the tivoization card with more than just v3 of their
licenses.

------
spullara
One of the least interesting parts of this is that it is GPL. Someone should
reimplement it with a BSD license so it can be used more widely. AFAIK &
IANAL, but I don't think you could integrate this with FF, Chrome, Safari or
IE.

~~~
jblow
No kidding. If you want an image format to become widely adopted and
standardized, GPLing the code is a pretty bad idea.

~~~
Aldo_MX
Not just an image format: anything you want to become widely adopted and
standardized should refrain from using the GPL. Even the FSF[1] recommends
using the Apache License in this specific case.

    
    
      Some libraries implement free standards that are competing against restricted
      standards, such as Ogg Vorbis (which competes against MP3 audio) and WebM
      (which competes against MPEG-4 video). For these projects, widespread use of
      the code is vital for advancing the cause of free software, and does more
      good than a copyleft on the project's code would do.
    
      In these special situations, we recommend the Apache License 2.0.
    

[1] [https://www.gnu.org/licenses/license-
recommendations.html](https://www.gnu.org/licenses/license-
recommendations.html)

~~~
bloody_pretzels
Note that the Apache License isn't compatible with GPLv2. If you don't make
use of any patents use BSD/MIT/ISC instead.

Sadly there isn't a real alternative permissive license with a patent clause.
There is a license called COIL[1], but it hasn't seen much adoption yet.

[1] [http://coil.apotheon.org/](http://coil.apotheon.org/)

~~~
Aldo_MX
Nothing that a dual license GPL/APL wouldn't solve.

------
onion2k
The 'only download as much of the file as you need' way of managing detail
levels is brilliant.

~~~
kcorbitt
Totally. I really recommend watching the video under the "Progressive and
lossless" heading. With just 20% of the file downloaded the image quality was
already subjectively quite good. I didn't even know progressive loading within
a single file was a thing, but it's a neat trick.

~~~
talmand
Isn't a progressive jpeg loaded from a single file? Or are you referring to a
different context of progressive with this file type?

------
espadrine
The most solid feature is lossless responsiveness. Stopping the download gives
you a lower-resolution image. When scaled, the image then becomes lossy.

Here is a thorough analysis of how good that lossy image is:
[http://flif.info/example.php](http://flif.info/example.php)

It shows how very good _lossy BPG_ is, but also how good FLIF is against
anything but BPG (and against lossless BPG).

------
emergentcypher
Why all the hate for GPLv3?

~~~
josteink
Sharing code is no longer cool if people have to share back. Or at least so it
seems.

The GPL-hate in here really is quite immense, even though time and time again,
RMS has been shown to be right about his stance on freedom.

Should we attribute it to people's desire for a quick ("free") fix over long
term considerations?

It's hard to tell, but I suspect the silicon valley influence here doesn't
help. There everyone wants to take everyone else's hard work and code, build a
weekend SaaS and get rich. The GPL, while not preventing that, is clearly in
opposition to that goal.

~~~
versteegen
As someone who made a negative comment about the GPL elsewhere in this thread,
I think I'll comment. I have absolutely nothing against the GPL, and use it
myself. However, using the GPL for a library seems like a bad idea, because
there are a lot of people, especially at companies, who won't touch the GPL.
So if you license a library GPL and it's useful someone is just going to come
along and reimplement it with a more liberal license. And such a _waste of
resources_ is sad to contemplate.

The ideal of the GPL was that it would force other projects to switch to GPL,
but it doesn't seem to happen too much in practice because the modern world
has an abundance of alternative software for any task. Think long term:
chances are someone in the next 50 years will be annoyed enough to reimplement
whatever you're doing.

~~~
robert_foss
In the context of this project, which is yet to have a finalized
specification, I think the GPL is an ideal license. As a project it is
important to have access to all of the pieces and to avoid incompatible forks
of different kinds in the early stages. So for the purpose of developing a
'golden standard' prototype for FLIF, I don't think a non-GPL license would
have served them better. The author seems to have the same idea, and is
considering re-licensing to MIT later on.

I don't agree with the corporate apologetics which seem to be the norm here.
If a company wants to get something for free, expecting it to publish source
code changes is not actually a high threshold. It might be in terms of
corporate politics, but in actuality it is not hard.

~~~
jobigoud
> If a company wants get something for free, expecting it to publish source
> code changes is not an actually high threshold.

Obviously the issue is not about publishing changes to the library, it's about
publishing the rest of the source which just uses the library as a building
block.

~~~
robert_foss
You're free to link to compiled GPL libraries. There is nothing forcing you to
change the license of your product unless you intend to integrate the
source code.

~~~
kazinator
I also believe this, but it is not an established fact. The GPL does prohibit
it, and so whether that is actually enforceable has to be tested in a court of
law. So you are in fact not free to do this, if you have a boss above you who
cares dearly about the company steering clear of hot water.

I'd be willing to testify as a technical expert in a court of law that the GPL
cannot reasonably rule out dynamic linking; that dynamic linking to a program
is a form of _use_ (like invoking a command line and passing it arguments) and
not _integration_. The proof is that dynamically linked components can be
replaced by clean-room substitutes which work exactly alike, or at least well
enough so that the main software can function.

For example, a program can be shipped with a stub library which behaves like
GNU Readline (but perhaps doesn't have all its features). The users themselves
can replace that with a GNU Readline library. Thus, the program's vendor isn't
even redistributing the GPL'ed component. They can provide it as a separate
download or whatever. However, if they were to include a GNU Readline binary
for the convenience of the users, then the program supposedly infringes. This
is clearly nonsense.

~~~
belorn
The problem with that line of thought is that the courts don't care if it's
_use_ or _integration_, but whether the resulting work _depends_ on someone
else's work. The words used in copyright law are transform, adapt, recast, and
such changes require additional copyright permission. For example, if I buy a
painting and cut it into pieces and rearrange them, I actually need an
additional license beyond what I got from purchasing the copy. Components
cannot be cleanly viewed as separate when dealing with copyright.

You can also turn this around and ask if it's legal to use a process call into
another program without invoking the need for additional permissions. The
FSF's view is that it should be legal, but it has never been tested. By now,
however, an _industry standard_ has formed around the FSF guidelines and most
courts would just look at it when deciding. This is what commonly happens when
no one goes to court to find out what the rules actually should be.

~~~
kazinator
> _but if the resulting work depends on someone else work._

Yes, so obviously your argument cannot be that "in theory, we could replace
this with a workalike".

You better have the workalike, and that's what you should be shipping.

A powerful argument that you aren't infringing is that your shipping media are
completely devoid of the work.

> _don 't care if its use or integration_

For the sake of the GPL, they _must_ care in this case, because the GPL
specifically abstains from dictating _use_ ; it governs redistribution!

The only parts of the license relevant to use are the disclaimers; the only
reason a pure user of a GPL-ed program might want to read the license at all
is to be informed that if the program causes loss of data (or whatever), the
authors are not liable.

GPLed programs get _used_ all the time. A proprietary app on a GNU/Linux
system can use the C library function system() which might invoke /bin/sh that
is GNU Bash, and even depend on that functionality.

> _For example, if I buy a painting and cut it into pieces and rearrange them,
> I actually need an additional license beyond what I got from purchasing the
> copy._

But what if that cut-up never leaves my house?

Or what if I only distribute instructions which describe the geometry of some
cuts which can be made to a painting, and the relocation of the pieces?

~~~
belorn
> A proprietary app on a GNU/Linux system can use the C library function
> system() which might invoke /bin/sh that is GNU Bash, and even depend on
> that functionality.

And that, according to the FSF, is legal because it does not create a
derivative work. You said above that "GPL cannot reasonably rule out dynamic
linking", but now you are picking and choosing which parts of the FSF's
interpretation of derivative works are correct and which are wrong. I just
wanted to point out that the law could easily have been interpreted in a
different way if someone had challenged the FSF's interpretation 25 years ago.

> But what if that cut-up never leaves my house? Or what if I only distribute
> instructions

Again, the law is both clear and quite fuzzy at the same time. The author has
the exclusive right to transform their work, and as such, you could get
charged even if it never leaves your house. In the EU it's a bit different,
since the law talks about moral rights which protect the integrity of the
author's work, though the end result is likely to be the same in many cases.

As for just giving out instructions, the legal nature of those is extremely
fuzzy. If I provide instructions that reproduce a copyrighted video (by
compressing/encrypting the data that represents it), I will still run afoul of
copyright infringement. From what I have seen, courts tend to take a "common
sense" approach to this problem, and if the end result is an infringement,
then the indirect steps that cause the infringement become infringement too.
Judges collectively seem to rule against people whom they perceive as trying
to bypass laws on technicalities.

~~~
kazinator
> _You said above that "GPL cannot reasonably rule out dynamic linking", but
> now you are picking and choosing which part of FSF interpretation of
> derivative is correct and which is wrong._

I don't see how you perceive a position change here. The FSF considers dynamic
linking to be derivative; I do not agree.

We both consider the invocation via command line not to be derivative.

> _now you are picking and choosing which part of FSF interpretation of
> derivative is correct and which is wrong._

Have been all along. "Dynamic linking is derivative" is almost complete
bullshit in my eyes.

It's pretty much pure use. We map this object into memory and then call it.

------
cbr
They claim this is good for responsive images:

    
    
        The download or file read operations can be stopped
        as soon as sufficient detail is available, and if
        needed, it can be resumed when for whatever reason
        more detail is needed.
         -- http://flif.info/responsive.php
    

Unfortunately that's not how HTTP works. Your browser opens several
connections to a website, and makes requests for resources. Each connection is
in use until that resource is fully loaded, at which point you can send a
request for another resource up that connection. Yes, the browser could close
the connection after it has received enough of the image for the desired
quality, but there's a lot of overhead in setting up a new connection,
including some round trips, and so that is probably a net loss.

The JPEG image format also supports progressive images, in browsers today, and
no one has figured out how to use that for single-file responsive images yet.

(You might be able to do something in HTTP/2 where you set the priority of the
stream to 0 after you have enough of it for your needs, but I don't think
anyone has gotten this working yet.)

~~~
XCSme
Why do you say it is a net loss? Maybe the server could signal the end of file
based on some request parameters (the resolution of the image).

~~~
AshleysBrain
Yes, handling it on the server side sounds like a reasonable workaround. For
example photo.flif?res=1x returns Content-Length: 100000, and
photo.flif?res=2x returns Content-Length: 200000, with both returning data
from the same file.
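
A minimal sketch of that server-side workaround, with a hypothetical `res`
query parameter, made-up truncation points, and a made-up `image/flif` MIME
type (FLIF has no registered MIME type):

```python
def serve_flif_prefix(file_bytes, res_param, cutoffs):
    """Map a hypothetical `?res=` parameter to a prefix of the same file.

    `cutoffs` maps resolution names to byte counts; unknown values fall
    back to the full file. The values used below are invented.
    """
    length = cutoffs.get(res_param, len(file_bytes))
    body = file_bytes[:length]
    headers = {
        "Content-Length": str(len(body)),
        "Content-Type": "image/flif",  # hypothetical MIME type
    }
    return headers, body

file_bytes = b"\x00" * 300_000            # stand-in for a progressive image
cutoffs = {"1x": 100_000, "2x": 200_000}  # hypothetical truncation points
headers, body = serve_flif_prefix(file_bytes, "1x", cutoffs)
assert headers["Content-Length"] == "100000"
assert len(body) == 100_000
```

This keeps the connection semantics ordinary HTTP: the client makes one
request and the server simply stops sending at the chosen prefix length.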

------
chmike
I couldn't see any information on speed performance. There is certainly a
price to pay for these impressive results. It could be the compression time; I
hope it's not the decompression time.

~~~
foldor
I'm guessing that they're not releasing numbers on that until the format is
finalized. It's likely not gone through any serious optimizations yet.

~~~
versteegen
See the quote I pulled out above:
[https://news.ycombinator.com/item?id=10318161](https://news.ycombinator.com/item?id=10318161)

------
MasterScrat
Sounds almost too good to be true... any major downside, other than current
lack of support?

~~~
raving-richard
Because it is GPLv3, it won't be supported by Google on Android, by Apple
probably anywhere, or by Microsoft anywhere.

In other words, it's a lovely idea, but due to a poor choice of license, it
won't get any adoption. Cf. Ogg Vorbis, where the license for the
specification is public domain, and the license for the libraries is BSD-
like.

To quote FSF:

    
    
        Some libraries implement free standards that are competing against restricted standards, such as Ogg Vorbis (which competes against MP3 audio) and WebM (which competes against MPEG-4 video). For these projects, widespread use of the code is vital for advancing the cause of free software, and does more good than a copyleft on the project's code would do.[1]
    

I would suggest that this is a case where widespread use is more important
than copyleft.

[1] [https://www.gnu.org/licenses/license-
recommendations.html](https://www.gnu.org/licenses/license-
recommendations.html)

~~~
pmelendez
> Because it is GPL3, it won't be supported by Google on Android, by Apple
> probably anywhere, or by Microsoft anywhere.

Why not? As far as I know, GPLv3 lets you dynamically link without having to
open the source code of the resulting software.

~~~
skrause
You can't link proprietary software with GPL libraries, only with LGPL
libraries.

But for Apple it's not even a linking concern. Apple doesn't have a problem
with the GPLv2, but doesn't touch the GPLv3 because of the patent clauses that
it contains. The bash in OS X is a version from 2007 because that's the last
GPLv2 version of bash. But Apple has no problem including git because that's
also GPLv2 and not v3.

~~~
anon1385
>Apple doesn't have a problem with the GPLv2, but doesn't touch the GPLv3
because of the patent clauses that it contains

It's not the patent clauses that are the issue:
[https://news.ycombinator.com/item?id=8868994](https://news.ycombinator.com/item?id=8868994)

------
StavrosK
How does this unicorn achieve all this, and unencumbered by patents on top of
everything else?

~~~
Daemon404
I cannot imagine it truly is - it uses a variation of CABAC, which surely has
patents related to it.

I wonder whether they had an actual lawyer OK that claim.

------
donlzx
I hope the idea of progressively downloading partial files can be applied to
other file types as well.

For example, if commonly used libraries such as minified JavaScript source
files were encoded and distributed incrementally, we wouldn't have to
repeatedly download various versions of the same library from different CDNs
for different websites, which would save a huge amount of Web traffic.

------
fnordfnordfnord
Where are the images to compare? Bellard has a very neat demonstration on his
page: [http://xooyoozoo.github.io/yolo-octo-bugfixes/#nymph&jpg=s&bpg=s](http://xooyoozoo.github.io/yolo-octo-bugfixes/#nymph&jpg=s&bpg=s)

BTW, credit to the FLIF people for linking to the competitor formats BPG and WebP

~~~
kcorbitt
Well given that they're all lossless formats, the image should appear
identical with any of the codecs, with only the file size (and encoding style)
being different.

More practically, since browsers won't have an flif decoder there isn't an
easy way to embed the images online without reencoding them in a different
format, which would rather defeat the purpose of putting up sample images.

~~~
espadrine
They could demonstrate interlacing comparisons by slowing down (through
setTimeout()) the image processing in the browser.

Having a Youtube video already shows that it works really amazingly well
(especially coming from Adam7), but seeing it interactively on a selection of
images would be nice.

After all, the single biggest feature is that you can actually stop
downloading the image whenever you feel like you don't want to spend more
bandwidth!

------
shmerl
_> FLIF is completely royalty-free and it is not encumbered by software
patents._

That's good but does it actually avoid patent minefields that others scattered
around? That's equally critical for healthy adoption.

------
russnewcomer
I don't see any comparison here of the cost of decompression. Is this going
to hit the processor harder than JPG/PNG/BPG et al., and thus hurt people on
mobile devices?

~~~
infogulch
Decompression speed is important (and more important than compression speed),
but the bottleneck on mobile devices is increasingly the network. The energy
cost of CPU cycles is falling dramatically, while the energy cost of mobile
bandwidth stays relatively static. If this continues, we'll eventually reach
the point where the power usage (e.g. for loading and displaying a web page)
is completely dominated by data transfer, and ever more computationally
expensive compression becomes the best way to save overall power, by trading
cheap CPU cycles for expensive bandwidth.
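The tradeoff can be sketched with a toy energy model (all the numbers below are invented, purely to illustrate the argument, not measurements of any real device):

```python
# Toy model: total energy to display an image is CPU decode energy plus
# radio transfer energy. If radio energy per byte dwarfs CPU energy per
# cycle, a heavier codec that shaves bytes wins overall.

def total_energy_joules(size_bytes, cycles_per_byte,
                        j_per_cycle=1e-9, j_per_byte=1e-5):
    cpu = size_bytes * cycles_per_byte * j_per_cycle
    radio = size_bytes * j_per_byte
    return cpu + radio

# A "cheap" codec: bigger file, few decode cycles per byte.
cheap = total_energy_joules(1_000_000, cycles_per_byte=10)
# A "heavy" codec: 30% smaller file, 10x the decode work per byte.
heavy = total_energy_joules(700_000, cycles_per_byte=100)

print(cheap, heavy)  # the heavy codec still uses less total energy here
```

With these (made-up) constants the radio term dominates, so the 30% smaller file wins despite costing 10x the CPU work.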

------
natch
One more nice feature they didn't mention: unambiguous pronunciation.

~~~
mdup
So, is it eff-leef or fleef ?

~~~
tricolon
/'flɪf/

------
robert_foss
An interesting aspect that has not been mentioned on flif.info or here is its
implementability in silicon.

How many transistors would be needed for an encoder? How about a decoder, or a
full codec (encoder/decoder)? And what about clock speed? How long is the
longest pipeline in the codec?

For any modern codec the hardware aspect is a very important one. Use cases
like smartphone browsers or smartwatches are very common, and on such
platforms performance == battery life == usefulness.

------
kenOfYugen
"A FLIF image can be loaded in different ‘variations’ from the same source
file, by loading the file only partially. This makes it a very appropriate
file format for responsive web design. "

Awesome! I have toyed with custom PNG 'container' ideas in the past, offering
similar 'responsive' features (i.e. various resolutions, lossy and lossless
versions).

I would like to see the asm.js version of the decoder and how it performs.
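Note that "loading the file only partially" maps naturally onto plain HTTP range requests; a minimal sketch (the URL and the 4 KB byte budget are made up, and a real client would pick the budget from viewport size and current bandwidth):

```python
# Sketch: ask a server for only the first N bytes of a FLIF file via an
# HTTP Range request, then feed those bytes to a progressive decoder.
import urllib.request

def partial_image_request(url, byte_budget):
    # Request bytes 0 .. byte_budget-1 only; the server answers 206 Partial
    # Content if it supports ranges.
    req = urllib.request.Request(url)
    req.add_header("Range", "bytes=0-%d" % (byte_budget - 1))
    return req

req = partial_image_request("https://example.com/photo.flif", 4096)
print(req.get_header("Range"))  # bytes=0-4095
```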

------
hokkos
It would be nice to have a comparison with
[https://en.wikipedia.org/wiki/JPEG_XR](https://en.wikipedia.org/wiki/JPEG_XR)

------
antirez
One would expect, in theory, Google to push this like mad, based on what they
did with HTTP 2.0 in another part of the stack, for possibly less gain at the
cost of much more complexity and standards shaking. Let's hope it really
happens.

------
URSpider94
I don't see how the author can claim that FLIF is "unencumbered by software
patents," only that he/she hasn't obtained patents on the work. This has
traditionally been one of the problems with so-called "free" image formats
-- it's not that the standards body or creator asserts patent rights, it's
that other inventors claim that their pre-existing patents cover the new
format.

------
happywolf
Would like to point out a caveat listed on that page. Don't convert all the
photos on your drive (yet):

"WARNING: FLIF is a work in progress. The format is not finalized yet. Any
small or large change in the algorithm will most likely mean that FLIF files
encoded with an older version will no longer be correctly decoded by a newer
version. Keep this in mind. "

------
whatever_dude
Sounds amazing. But there's no information on encoding time; I'm assuming it's
going to be a beast.

Which, IMO, is totally fine.

------
jmspring
Ignoring patents is a concern.

The patent issue is a major one and not lightly ignored. I haven't checked
patent status, but one of the things that was pretty contentious with JPEG
2000 during the ISO standardization process was patents around arithmetic
coding -- not just the method for doing the encoding, but also things like
context based modeling.

In reality, the majority of image compression comes about in the "modeling"
stage -- be it predictive coding, context based encoding coefficients, etc.

I'm happy to see new advances in image coding, but having spent many years
working with Glen Langdon and others, the depth of IP concerns is still fairly
fresh in my memory.

------
ruffyen
I think the important thing that needs to be addressed here before this file
format takes hold is how the hell you are going to pronounce this format. I
don't want another gif/gif debate.

------
MrBra
> FLIF is based on MANIAC compression. MANIAC (Meta-Adaptive Near-zero Integer
> Arithmetic Coding) is an algorithm for entropy coding developed by Jon
> Sneyers and Pieter Wuille. It is a variant of CABAC (context-adaptive binary
> arithmetic coding), where instead of using a multi-dimensional array of
> quantized local image information, the contexts are nodes of decision trees
> which are dynamically learned at encode time. This means a much more image-
> specific context model can be used, resulting in better compression.
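A toy sketch of the idea in that quote (not the real algorithm: actual MANIAC grows its decision trees adaptively during encoding, and the property names, thresholds, and probability model below are all made up for illustration):

```python
# Contexts are leaves of a decision tree over "properties" such as
# neighboring-pixel differences; each leaf keeps an adaptive probability
# that a binary arithmetic coder would use to code the next bit.

class Node:
    def __init__(self, prop=None, threshold=0, left=None, right=None):
        self.prop, self.threshold = prop, threshold
        self.left, self.right = left, right
        self.zeros, self.ones = 1, 1   # adaptive counts (Laplace prior)

    def leaf_for(self, props):
        # Walk the tree using the local properties until we hit a leaf.
        if self.prop is None:
            return self
        child = self.left if props[self.prop] <= self.threshold else self.right
        return child.leaf_for(props)

    def p_one(self):
        return self.ones / (self.zeros + self.ones)

    def update(self, bit):
        if bit: self.ones += 1
        else:   self.zeros += 1

# A fixed two-leaf tree splitting on a hypothetical "left-neighbor
# difference" property dL.
tree = Node(prop="dL", threshold=0, left=Node(), right=Node())

# Feed some bits: flat regions (dL <= 0) mostly emit 0s here.
for bit, props in [(0, {"dL": -3}), (0, {"dL": -1}), (1, {"dL": 5})]:
    leaf = tree.leaf_for(props)
    # an arithmetic coder would code `bit` with probability leaf.p_one()
    leaf.update(bit)

print(tree.left.p_one())  # 0.25: skewed toward zero after two 0-bits
```

An arithmetic coder spends roughly -log2(p) bits per symbol, so the more skewed each leaf's probability gets for its local context, the better the compression.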

------
alphabetam
GPL 3. Meaning it won't ever be adopted by anyone.

~~~
yiyus
IANAL, but I guess the file format itself is not covered by the GPL 3, only
the reference implementation.

They specifically say that "FLIF is completely royalty-free and it is not
encumbered by software patents". So, I suppose people will be allowed to
develop their own libraries with another license.

~~~
dagw
True, but the only complete documentation of the file format I found was the
source code of the reference implementation. So the only way to write a new
implementation is to study the reference implementation, which will make your
new implementation "tainted".

~~~
antientropic
Studying GPL'ed source code does not "taint" anything. You're absolutely
allowed to do that. If you're _really_ concerned about subconsciously copying
elements of the original into your implementation, you could let somebody else
study the original and write a spec.

~~~
dagw
_Studying GPL 'ed source code does not "taint" anything._

This is far from my area of expertise, but lawyers smarter than me about these
things disagree.

------
Mathnerd314
I found what appears to be an early draft of the paper:
[http://webcache.googleusercontent.com/search?q=cache:z2rYA4Z...](http://webcache.googleusercontent.com/search?q=cache:z2rYA4ZilDoJ:https://svn.ulyssis.org/repos/sipa/jif/paper/jpif.tex)

------
yekim
Pretty sweet. Will be interesting to see where this lib goes. If someone has
some time, maybe they could give it a mention on the Wikipedia page:
[https://en.wikipedia.org/wiki/Image_file_formats](https://en.wikipedia.org/wiki/Image_file_formats)

------
gok
Comparing compression ratios without comparing compute/memory requirements
is...not terribly meaningful.

------
hngiszmo
For all the GPL-hate in this thread: GPL protects this specific implementation
and not the algorithm. Even if the authors released this under GPL, this would
only be a minor nuisance to getting this unicorn into all kinds of products
via re-implementations of the same format.

------
r-w
I guess this would be considered a Las Vegas algorithm:
[https://en.wikipedia.org/wiki/Las_Vegas_algorithm](https://en.wikipedia.org/wiki/Las_Vegas_algorithm)
§ “Relation to Monte Carlo algorithms”

------
baldfat
I don't see any foundations or corporations backing this project. Without
support (though I hope something like this does gain wide-scale support; SVG
in browsers already, please!) I don't see this being accepted.

~~~
pmelendez
What you say is true for wide-scale usage. However, imagine a small company,
startup, or sole developer using this for their projects or as an internal
format. I find it very interesting even without widespread use.

~~~
baldfat
It is interesting! I agree, but I guess I am getting too old. I keep hearing
about awesome new formats and then they just waste away. The success stories
are few and far between. FLAC and OGG are good examples: OGG is a decent
format and I prefer it to MP3, but it really just withered. FLAC is awesome
but far from mainstream. Heck, even SVG is far from where it SHOULD be.

~~~
anc84
And then there is Opus, which took the world by storm.

~~~
baldfat
It has the devices covered

> Devices based on Google's Android platform, as of version 5.0 "Lollipop",
> support the Opus codecs. Chromecast supports Opus decoding. Grandstream
> GXV3240 and GXV3275 video IP phones support Opus audio both for encoding and
> decoding.

Developers were Mozilla and Skype (MS after the purchase)

>Its main developers are Jean-Marc Valin (Xiph.Org, Octasic, Mozilla
Corporation), Koen Vos (Skype), and Timothy B. Terriberry (Xiph.Org, Mozilla
Corporation). Among others, Juin-Hwey (Raymond) Chen (Broadcom), Gregory
Maxwell (Xiph.Org, Wikimedia), and Christopher Montgomery (Xiph.Org) were also
involved.

Standardized

> [https://tools.ietf.org/html/rfc6716](https://tools.ietf.org/html/rfc6716)

Still, I think most people have no idea about Opus as a codec and only think
of OGG (the container Opus uses) and the Vorbis codec (which Opus was meant
to replace).

------
joelthelion
How cpu-intensive is it? Decompression speed can be an issue, especially on
mobile.

~~~
matheweis
Here is what he says about that:
[https://boards.openpandora.org/topic/18485-free-lossless-image-format-flif/?do=findComment&comment=399001](https://boards.openpandora.org/topic/18485-free-lossless-image-format-flif/?do=findComment&comment=399001)

Decode speed to restore the full lossless image and write it as a png is not
so good: about 0.75s for a median file, 0.25s for a p25 file, 1.5s for a p75
file. That's roughly 3 to 5 times slower than the other algorithms. However,
decoding a partial (lossy) file is much faster than decoding everything, so in
a progressive decoding scenario, the difference would not be huge.

There's still room for optimizing the encode/decode speed, but it's not very
useful to do that before the bitstream is somewhat finalized. The prototype
implementation I have now is not extremely fast, but at least it's in the
right ballpark. Most likely even an optimized decoder will still be slower
than an optimized PNG decoder, but the difference will be small enough to not
matter much (certainly compared to the time won by downloading fewer bytes).

------
stolsvik
For me, the license choice makes this project "interesting, but _no way_ this
is gonna fly!", and I immediately suspect a "try-before-you-buy" use of the
GPL, of which there are many.

------
hellcatv
How does this compare to packPNM?
[http://packjpg.encode.ru/?page_id=73](http://packjpg.encode.ru/?page_id=73)

------
tracker1
GPLv3 will make adoption problematic... if it were at least LGPL so that it
could be used as a library that would make adoption far more likely.

------
tempodox
Those diagrams are incomprehensible. The sorted and unordered WTF-per-diagram
rate is too frigging high.

------
waynecochran
BE SKEPTICAL!

Not a single mention of "signal to noise ratio" (SNR). If you are going to
list compression rates you have to give the associated quality -- SNR is used
in the literature.

I was suckered into "fractal image compression" by those selling the snake oil
back in the 90's -- I am far more skeptical about these things now.

~~~
GhotiFish
could you go into detail about what you mean with "associated quality"?

These are lossless compression algorithms that are being compared here. The
only other pertinent details regarding a lossless compression algorithm are
the Compression Ratio, CPU usage, and Memory usage.

~~~
waynecochran
Because I am an idiot. Disregard. Move along.

------
williamsharkey
Can the progressive data obscure earlier portions of the image?

If so, it could also serve as a hacky video format.

E.g.: a FLIF would specify how many megabytes it uses per second; there would
be no discrete "frames".

If a video FLIF said it used 1 MB per second and you wanted to see what the
video looked like at 10.8 seconds, you would download 10.8 MB.

------
anentropic
am I the only one who doesn't understand the horizontal axis on their graphs:
"Images sorted on compression ratio"?

what does it mean that the FLIF lines suddenly get worse than PNG over on the
right-hand side?

~~~
jonsneyers
For each algorithm, the images are sorted from best (on the left) to worst (on
the right). So if you look at the middle region of the graph, you see how they
behave on "median" images.

Another way to view the graph is: the area under the curve corresponds to the
total disk space needed to store a large corpus of images in the given format.
So obviously lower is better.
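Such a curve is trivial to build; a sketch with made-up sizes (the numbers below are purely illustrative, not real benchmark results):

```python
# Per codec: compute each image's compressed/original size ratio, sort
# ascending, and note that the sum (the area under the plotted curve) is
# proportional to the total storage needed for the whole corpus.

original = [100, 200, 150]          # uncompressed sizes (arbitrary units)
codecs = {
    "flif": [40, 90, 50],           # hypothetical compressed sizes
    "png":  [55, 100, 80],
}

curves = {}
for name, sizes in codecs.items():
    ratios = sorted(c / o for c, o in zip(sizes, original))
    curves[name] = ratios
    print(name, [round(r, 2) for r in ratios], "area:", round(sum(ratios), 2))
```

Sorting per codec is what makes the left edge show each codec's best-case images and the right edge its worst-case ones, which is why the lines can cross.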

------
andrepd
I wonder how it compares to lossy (but high-quality) compression codecs.

------
MrBra
Then it was written that we would be doomed by animations everywhere!

~~~
dcposch
Anywhere on the internet where you can post an image today, you can already
choose an animated GIF.

If the site doesn't want animations, it will only show the first frame. For
example, try uploading an animated GIF to Facebook.

------
proyb2
Between Brotli and FLIF, which is good enough?

~~~
gillianseed
Brotli is excellent at text and does a great job at compression overall, but
FLIF is written specifically to excel at lossless image compression, so FLIF
will be much better unless it's a really poor solution (which the results
show it certainly is not).

Likewise Brotli will not beat FLAC at compressing audio, since FLAC is
specifically written to do that.

~~~
robryk
Actually, FLAC doesn't have a very good compression ratio (e.g. see [1]). It
does have features that are very important for audio streams: it can be
quickly resynced and AFAIK decompression proceeds at nearly-constant speed.

[1]
[https://news.ycombinator.com/item?id=7893171](https://news.ycombinator.com/item?id=7893171)

~~~
gillianseed
Even so, I would be very surprised if FLAC did not outperform general-purpose
compressors on a wider range of music than a single piano piece (which should
be very compressible as audio goes).

edit: went and did a test on a couple of tracks, using brotli: bro --quality
11 --window 24 against flac -8 (best settings in both cases)

    
    
      satie - gymnopedie no 1:
      wav: 41.6mb
      brotli: 28.1mb
      flac: 14.7mb
    
      ram jam - black betty:
      wav: 41.8
      brotli: 35.8
      flac: 26.5
    
      vangelis - tao of love:
      wav: 29.4
      brotli: 23.0
      flac: 13.4
    
      meat loaf - paradise by the dashboard light:
      wav: 89.3
      brotli: 78.3
      flac: 58.5
    

Certainly not a massive test, and there are likely general-purpose compressors
that do a better job than brotli; still, I feel rather convinced they will
not do better than FLAC except in extreme cases like the piano piece you
linked to.

------
arasmussen
How does it compare to slightly lossy jpegs?

------
tzakrajs
And the Weissman score is?

------
jttam
Does it use a middle out algorithm? :)

~~~
anandvc
That's exactly what I was wondering! :)

------
jaigouk
wow. this is amazing!

------
paxcoder
I want to believe.

------
T3RMINATED
Request denied by WatchGuard HTTP Proxy.

Reason: Category 'Newly Registered Websites'

~~~
fnordfnordfnord
WatchGuard is a disease. They also listed Hackaday as banned for "Hacking".
Ever try to get a site unlisted from them? It's nearly impossible, and may
only last a month or so before it goes back on the blacklist.

------
boksiora
Looks great!!!

I want to see it in my browser and on websites; I hope some browser support
comes from Chrome and Firefox.

------
rancur
BMP is free isn't it????

~~~
rancur
I guess trollers/jokesters aren't appreciated here...

------
dnesteruk
Doesn't the GPL license generally preclude using this in a commercial product,
since you'd have to provide source code, and it affects derivative works too?

------
Navarr
Obligatory XKCD: [https://xkcd.com/927/](https://xkcd.com/927/)

No but I'm excited, this is progress!

