
I've been wondering for a long time why nothing better than JPEG seems to be catching on for lossy photo compression. It seems like it should be possible to trade CPU cycles for smaller files (the way h.264 does relative to MPEG-2), but I suppose it's really hard to reach critical mass with anything that isn't backwards compatible...



I also wonder if it would be possible to establish a set of well-known reference bits to compress against.

E.g. if you shipped an additional 10 MB of "data" with every browser, could you reduce the average image size on the internet by X% by leveraging that 10 MB?

(I have no idea; just an interesting thought exercise. Could make a cool masters thesis for someone.)
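One cheap way to play with the idea, purely as a sketch: zlib supports a "preset dictionary", so you can measure how much a shared blob of reference bytes actually helps. This is lossless and nothing like a real image codec, and the filenames below are made up, but it shows the mechanics of "everyone already has these bytes, so don't send them".

  import zlib

  # Hypothetical shared blob every client is assumed to already have
  # (in the thought experiment, shipped with the browser).
  # zlib only uses the last 32 KB of a longer dictionary.
  shared_dictionary = open("reference_bits.bin", "rb").read()[-32768:]

  def compress_with_dict(payload: bytes, zdict: bytes) -> bytes:
      c = zlib.compressobj(level=9, zdict=zdict)
      return c.compress(payload) + c.flush()

  def decompress_with_dict(blob: bytes, zdict: bytes) -> bytes:
      d = zlib.decompressobj(zdict=zdict)
      return d.decompress(blob) + d.flush()

  payload = open("page_asset.bin", "rb").read()   # hypothetical asset to send
  plain   = zlib.compress(payload, 9)
  primed  = compress_with_dict(payload, shared_dictionary)
  # compare sizes with and without the shared dictionary
  print(len(payload), len(plain), len(primed))

The win obviously depends entirely on how well the shared blob matches real traffic, which is the hard (and interesting) part.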


I can see how it might work with carefully crafted Bloom filters - lossy, of course! I'm just not sure whether training the set of filters would be worth the effort, because computing them could take a very long time at that level of information complexity. I've actually been toying with something like that over the years - I built some models (not for pictures, though) - but I gave up.
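For anyone unfamiliar with why Bloom filters are inherently lossy, here's a minimal Python sketch (sizes and hash count picked arbitrarily, item names made up). Membership queries can return false positives but never false negatives - that one-sided error is the lossiness; whether it maps onto image data at all is an open question.

  import hashlib

  class BloomFilter:
      """Minimal Bloom filter: k hash functions over an m-bit array."""
      def __init__(self, m_bits: int = 8192, k_hashes: int = 4):
          self.m = m_bits
          self.k = k_hashes
          self.bits = bytearray(m_bits // 8)

      def _positions(self, item: bytes):
          for i in range(self.k):
              h = hashlib.sha256(i.to_bytes(2, "big") + item).digest()
              yield int.from_bytes(h[:8], "big") % self.m

      def add(self, item: bytes):
          for p in self._positions(item):
              self.bits[p // 8] |= 1 << (p % 8)

      def might_contain(self, item: bytes) -> bool:
          # True may be a false positive; False is always correct.
          return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

  bf = BloomFilter()
  bf.add(b"block-1234")
  print(bf.might_contain(b"block-1234"))   # True
  print(bf.might_contain(b"block-9999"))   # usually False, occasionally a false positive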


Sounds like the kind of thing that a company like Google could tackle. It would be especially good in the mobile world, where bandwidth is more limited.


A well-funded organization could do it for sure, and Google would obviously benefit from it. But from my limited research over the years, I'm no longer convinced that any further significant advance will come from continuous mathematics. It might just be personal bias from my own experience, but I have a strong hunch that new approaches based on discrete mathematics will be the key to any serious progress toward Shannon's limit / Kolmogorov complexity (if those bounds hold, and as far as we know they should).
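Not a proof of anything, but a quick way to see what a "limit" means here: the zeroth-order empirical entropy of a byte stream is a lower bound on bits per byte for any coder that models bytes independently. Real codecs exploit far more structure than this, and the filename below is made up - it's only an illustration.

  import math
  from collections import Counter

  def empirical_entropy_bits_per_byte(data: bytes) -> float:
      """Zeroth-order empirical entropy: a lower bound (in bits/byte)
      for any code that encodes each byte independently of context."""
      counts = Counter(data)
      n = len(data)
      return -sum((c / n) * math.log2(c / n) for c in counts.values())

  sample = open("some_image.jpg", "rb").read()   # hypothetical input file
  h = empirical_entropy_bits_per_byte(sample)
  print(f"{h:.3f} bits/byte -> at best ~{h / 8:.1%} of original size "
        "for a memoryless byte model")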


I like it. With the size of hard drives now, it seems like that would be a good trade-off against bandwidth (which is scarcer). It would save bandwidth and speed things up, two things much more desirable than a few more megs of HDD.

I suggest that you make a blog post or Tell HN about this and post it as a top level submission. I'd love to hear more opinions about this.


Go for it if you're passionate about it!

I understand the principle but don't understand enough of how it would actually work to speak for it as much as I'd want to.

Such a technique could also be especially useful for low bandwidth situations, most notably Paul English's proposed Internet for Africa effort, http://joinafrica.org/ .

If you're interested in this pursuit you should also look at SPDY, Google's suggested augmentation of HTTP. It would probably speed up the web more than better compression in many cases.


This is somewhat possible; I thought about it a few years ago. I'm not sure it helps much, or at all, with current compression algorithms, though. I should probably give it a second look some day.


IIRC JPEG 2000 has some patent FUD and JPEG XR has "it's from Microsoft" FUD.


There was some really forward-thinking work on fractal image compression in the late 90s, but largely due to slow encoding times, plummeting storage costs, and patent encumbrances, it never caught on.
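To flesh out what fractal coding actually did, here's a hedged toy sketch of the core PIFS idea in Python/NumPy: each small "range" block is approximated by a contracted, affine-mapped "domain" block from elsewhere in the same image. It assumes a square grayscale array whose side is a multiple of 8, and omits the rotations/flips and quadtree partitions real codecs used; the brute-force domain search also shows exactly why encoding was so slow.

  import numpy as np

  R, D = 4, 8                                  # range and domain block sizes

  def downsample(block):
      # average 2x2 cells so a DxD domain block matches an RxR range block
      return block.reshape(R, 2, R, 2).mean(axis=(1, 3))

  def encode(img):
      h, w = img.shape
      domains = [(dy, dx, downsample(img[dy:dy+D, dx:dx+D].astype(float)))
                 for dy in range(0, h - D + 1, D)
                 for dx in range(0, w - D + 1, D)]
      transforms = []
      for ry in range(0, h, R):
          for rx in range(0, w, R):
              rng = img[ry:ry+R, rx:rx+R].astype(float)
              best = None
              for dy, dx, dom in domains:
                  # least-squares fit rng ~= s*dom + o, with s kept contractive
                  s = ((dom * rng).mean() - dom.mean() * rng.mean()) / (dom.var() + 1e-9)
                  s = float(np.clip(s, -0.9, 0.9))
                  o = float(rng.mean() - s * dom.mean())
                  err = np.sum((s * dom + o - rng) ** 2)
                  if best is None or err < best[0]:
                      best = (err, dy, dx, s, o)
              transforms.append((ry, rx) + best[1:])
      return transforms

  def decode(transforms, shape, iterations=8):
      img = np.full(shape, 128.0)              # start from an arbitrary image
      for _ in range(iterations):
          out = np.empty(shape)
          for ry, rx, dy, dx, s, o in transforms:
              out[ry:ry+R, rx:rx+R] = s * downsample(img[dy:dy+D, dx:dx+D]) + o
          img = out
      return np.clip(img, 0, 255)

  demo = np.random.default_rng(0).random((32, 32)) * 255   # stand-in for a real image
  approx = decode(encode(demo), demo.shape)

Decoding just iterates the stored affine maps from any starting image, so decode speed was never the problem; the exhaustive domain search at encode time was.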



