

Using evolving images for image compression: 256x256 image in 1k - jkkramer
http://www.screamingduck.com/Article.php?ArticleID=46&Show=ABCE

======
patio11
Keep in mind that when you're fitting a compression algorithm so close to the
test cases, what you're really doing is pushing data from the test cases into
the compression algorithm. For instance, prior knowledge that blurs are
important gets pushed into the algorithm.

This is why, when testing compression algorithms against each other, you
usually weigh the total size of the decompressor code plus the compressed
versions of some common corpus that is large relative to what you expect code
sizes to be.

If you don't do this, you can do lossless compression of the Mona Lisa into
one bit while remaining able to display any other JPG: check the first bit of
the input. If it is 1, decompress into the Mona Lisa. If it is 0, the rest of
the stream is a JPG; interpret it as normal.
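
To make that concrete, a toy decoder for the joke scheme might look like this
(Python sketch; for simplicity the flag is a whole leading byte rather than a
single bit):

    # Toy decoder for the joke scheme above. The Mona Lisa bytes are baked
    # into the decoder itself, which is exactly the "data pushed into the
    # algorithm" problem. Illustrative only; the flag here is a byte, not a bit.
    MONA_LISA_JPEG = b"..."  # hypothetical: the full JPEG bytes would live here

    def decompress(stream: bytes) -> bytes:
        if stream[0] == 1:        # flag set: emit the built-in Mona Lisa
            return MONA_LISA_JPEG
        return stream[1:]         # flag clear: the rest is an ordinary JPEG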

Still, I'm of the opinion that it remains an outrageously good tech demo for
GAs. (Visual, easily described task, results anyone can appreciate, etc.)

~~~
reitzensteinm
I have been watching the development of this for a while, and I don't think
that the algorithm has been tailored towards the test cases at all. The file
format is extremely general (vertices forming polygons, color, alpha and blur
per vertex). It's been fed anything and everything from around the net, but it
takes so many days of computation to make the images look good that only a few
examples were used in the article itself.

If anything, Mona Lisa and Lena were specifically chosen to be hard for a
system like this to compress (fine brush detail does not mix well with n-sided
polygons). Blur, for example, compensates for the shortcomings of polygon
rendering to make the whole system more general purpose.
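
For anyone curious what "extremely general" means here, a rough sketch of such
a genome (field names and types are my guesses, not the article's actual
layout):

    # Rough sketch of a polygon genome of the kind described above.
    # Field names and types are guesses, not the article's actual format.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Vertex:
        x: float      # position, normalised to [0, 1]
        y: float
        r: int        # colour at this vertex, 0-255
        g: int
        b: int
        alpha: int    # per-vertex transparency
        blur: float   # per-vertex blur radius

    @dataclass
    class Genome:
        width: int
        height: int
        polygons: List[List[Vertex]]  # each polygon is just a list of vertices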

------
huhtenberg
I am getting a whiff of fractal image compression.

It was developed by none other than Michael Barnsley (of "Barnsley Fern"
fractal fame) and it was the compression technique du jour in the early 90s.
It looked _very_ interesting.

And then it got patented all over, which effectively inhibited any further
academic research. The patent holding company (Iterated Systems) struggled
with commercialization of the technology, so the practical applications never
really took off either.

Too bad really. The idea of compressing an image down to a _formula_ was both
elegant and damn smart. And so are these genetic algorithms. I just really
hope they don't follow the path of the fractal ones.
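
For anyone who hasn't seen the "image as formula" idea in action, Barnsley's
own fern makes it concrete: four affine maps and their probabilities are the
entire "file". A quick sketch (the coefficients are the standard published
ones; this is an IFS demo, not the patented block-based fractal compression):

    # The classic Barnsley fern: a recognisable image encoded as four affine maps.
    # Chaos-game rendering; run it and plot the points to see the fern.
    import random

    # (a, b, c, d, e, f, probability) for x' = a*x + b*y + e, y' = c*x + d*y + f
    MAPS = [
        ( 0.00,  0.00,  0.00, 0.16, 0.0, 0.00, 0.01),
        ( 0.85,  0.04, -0.04, 0.85, 0.0, 1.60, 0.85),
        ( 0.20, -0.26,  0.23, 0.22, 0.0, 1.60, 0.07),
        (-0.15,  0.28,  0.26, 0.24, 0.0, 0.44, 0.07),
    ]

    def fern_points(n=100_000):
        x, y, points = 0.0, 0.0, []
        weights = [m[6] for m in MAPS]
        for _ in range(n):
            a, b, c, d, e, f, _p = random.choices(MAPS, weights=weights)[0]
            x, y = a * x + b * y + e, c * x + d * y + f
            points.append((x, y))
        return points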

~~~
spot
It's actually turned out much worse than that. 1) Fractal compression never
worked very well, and not for lack of trying; it's just a bad idea if you
actually think about it. 2) The company is now involved in penny stock scams
and is using Wikipedia to prop up their claims. See
<http://en.wikipedia.org/wiki/Talk:Fractal_compression>, where the conclusions
from the definitive research and the FAQ are being kept out of the article. I
tried to keep the page sane, and in return I got a campaign of character
assassination run against me and my edits all over Wikipedia... which I won,
but I was unable to fight two editors back on this page. And then I ran out of
time and gave up. Anyway, if you can help, please do!

~~~
spot
these are good:
[http://math.lanl.gov/~brendt/Publications/wohlberg-1999-revi...](http://math.lanl.gov/~brendt/Publications/wohlberg-1999-review.shtml)
[http://www.faqs.org/faqs/compression-faq/part2/section-8.htm...](http://www.faqs.org/faqs/compression-faq/part2/section-8.html)

------
CoryOndrejka
For those who are curious, I ran some comparisons versus JPEG here
(<http://www.flickr.com/photos/corylinden/3330168391/>). The short version is
that, as expected, JPEG block artifacts completely overwhelm the image at very
low quality levels. The minimal JPEG file size ends up at ~3k.

~~~
ComputerGuru
What I want to know is how their algorithm looks with _out_ the space
constraints.

For any compression scheme to take hold, it'll need to handle both large and
small files well.

Before looking at your comparison, I thought their pic looked like crap. Now
it looks absolutely stunning given the comparison shots. But how would it look
at 200kb? Does it compare (dis)favorably against JPEG then?

------
westside1506
Very cool algorithm demo. How well would this compare to a very extreme JPEG
compression?

Also, I'm not an image compression expert, but I wonder if you could use this
technique as a pre-pass before a more standard image compression algorithm to
improve the overall results. In other words, maybe subtract the image created
using the polygons from the original image and then compress the result using
JPEG (or something else more suitable?).
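
A minimal sketch of that two-stage idea, assuming numpy/Pillow for the JPEG
step; `render_polygons` stands in for the evolved polygon decoder and is
hypothetical:

    # Pre-pass idea sketched above: store the tiny polygon genome, then
    # JPEG-compress only the residual (original minus polygon render).
    import io
    import numpy as np
    from PIL import Image

    def encode_residual(original: Image.Image, genome) -> bytes:
        approx = np.asarray(render_polygons(genome, original.size), dtype=np.int16)  # hypothetical renderer
        target = np.asarray(original.convert("RGB"), dtype=np.int16)
        residual = target - approx                          # signed, -255..255
        shifted = ((residual + 255) // 2).astype(np.uint8)  # squeeze into 0..255 so JPEG can take it
        buf = io.BytesIO()
        Image.fromarray(shifted).save(buf, format="JPEG", quality=30)
        return buf.getvalue()                               # ship this alongside the ~1k genome

Decoding would render the polygons again and add the residual back; whether
this actually beats plain JPEG depends on how much high-frequency energy is
left in the residual.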

~~~
newt0311
Very extreme pre-2000 JPEG compression would probably do considerably worse, as
it would degenerate into drawing boxes of averaged color. JPEG 2000 would
absolutely destroy this compression technique, as it specifically uses a form
of Fourier analysis (wavelet analysis) which is _provably_ optimal for this
kind of compression. In fact, AFAIK JPEG 2000 is designed so that as more bits
are read from a file, it is possible to get a better and more accurate picture
-- one of the benefits of using a Fourier-analysis-like technique. Sort of
like compression on steroids. Getting a lower-resolution picture is only
slightly more complicated than truncating the file and requires minimal
processing.
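
The "truncate the file, get a coarser picture" property is easiest to see with
the simplest possible wavelet; a toy one-level Haar split (JPEG 2000 uses
fancier wavelets plus entropy coding, so this is only the flavour of it):

    # One level of a 2D Haar transform on an even-sized grayscale image.
    # Keeping only LL is a quarter of the data and is already a recognisable
    # low-res picture; each extra detail band read back in sharpens it.
    import numpy as np

    def haar_split(img: np.ndarray):
        a = img.astype(float)
        lo = (a[0::2, :] + a[1::2, :]) / 2    # row averages
        hi = (a[0::2, :] - a[1::2, :]) / 2    # row differences
        LL = (lo[:, 0::2] + lo[:, 1::2]) / 2  # coarse approximation
        LH = (lo[:, 0::2] - lo[:, 1::2]) / 2  # horizontal detail
        HL = (hi[:, 0::2] + hi[:, 1::2]) / 2  # vertical detail
        HH = (hi[:, 0::2] - hi[:, 1::2]) / 2  # diagonal detail
        return LL, LH, HL, HH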

~~~
tlb
"Provably optimal" always involves unwarranted assumptions. For example,
least-squares fitting is provably optimal, assuming all your distributions are
independent and Gaussian. Image compression quality is all about the sensitivity
of the human visual system to various kinds of errors, so nothing can be
proven optimal, ever.

~~~
eru
Proofs still retain value, because they reduce the burden of choosing a good
algorithm to choosing good assumptions.

~~~
jibiki
One common set of assumptions in image denoising is that:

1. The pixels in the output image should be close in value to the
corresponding pixels in the input image.

2. Neighboring pixels in the output image should have similar values.

I wonder if anyone has compared compression techniques using these sorts of
assumptions.
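
Written as a single objective to minimise, those two assumptions look roughly
like this (a standard formulation; the weight `lam` is an arbitrary value I
picked):

    # Assumption 1 = data fidelity term, assumption 2 = neighbour smoothness term.
    import numpy as np

    def denoise_objective(x: np.ndarray, y: np.ndarray, lam: float = 0.1) -> float:
        fidelity = np.sum((x - y) ** 2)                 # stay close to the input image y
        smooth_h = np.sum((x[:, 1:] - x[:, :-1]) ** 2)  # horizontal neighbours agree
        smooth_v = np.sum((x[1:, :] - x[:-1, :]) ** 2)  # vertical neighbours agree
        return float(fidelity + lam * (smooth_h + smooth_v))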

------
vincentpants
Interesting how there's only one fitness loop comparing the accuracy of two
polygon steps. This could totally be optimized by adding more comparisons to
the fitness loop. This is awesome, actually, and I have to say this is one of
the better genetic algorithm examples I've seen. 1k! Woo!
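
For reference, the kind of fitness loop being discussed boils down to something
like this (my sketch of the general hill-climbing approach, not the author's
actual code; `render` and `mutate` are hypothetical):

    # Score a candidate polygon render by per-pixel difference from the target,
    # and keep a mutation only if it improves that score.
    import numpy as np

    def fitness(candidate: np.ndarray, target: np.ndarray) -> float:
        diff = candidate.astype(np.int64) - target.astype(np.int64)
        return float(np.sum(diff * diff))     # lower is better

    def evolve_step(genome, target, best_score):
        child = mutate(genome)                    # hypothetical mutation operator
        score = fitness(render(child), target)    # hypothetical rasteriser
        if score < best_score:                    # accept only improvements
            return child, score
        return genome, best_score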

------
ideamonk
I'm getting a Bandwidth Limit Exceeded message. Looks like talking about such
a technology would bring huge traffic ;)

------
globalrev
Do GA and GP behave somewhat predictably, then? Couldn't they, if you are
unlucky, take a very long time to converge?

~~~
jodrellblank
Going by the JavaScript version he was inspired by, it takes a long time to
converge if you are lucky (I kept it running for several days on and off). If
you're unlucky, it doesn't converge before you get bored.

~~~
bd
It's actually not so bad. It's just JavaScript and maybe my code (I wrote the
JS version).

Roger Alsing (who was the original author and inspiration) later did an
optimized, compiled, parallel version that was able to get the Mona Lisa in 1
minute 34 seconds (on a 64-bit, 4-core machine).

[http://rogeralsing.com/2009/01/04/scaling-clustere-evolution...](http://rogeralsing.com/2009/01/04/scaling-clustere-evolution-1-1-4/)

Also there are supposed to be some efforts to run it on GPU, which should be
even faster.

