
Lossless data compression, and information complexity in general, have been something of a hobby of mine for the last 13-14 years, more out of OCD/curiosity than for any practical purpose; like a puzzle, if you will.

From what I've gathered, I think the future of compression lies in how we look at the data patterns themselves: not treating the patterns, or the problem as a whole, as statistical, but taking some new approach via discrete math.

That is, a fresh, "out of this world" approach. I have some ideas about where it might come from, but I don't want to sound like a madman.




Are you talking about Fractal Compression methods?

http://en.wikipedia.org/wiki/Fractal_compression


Well, fractal compression is lossy; it's a reconstructive process that yields something resembling the original, but not the original itself. So, no.

I am talking about lossless compression. There are several results that put hard limits on what can ultimately be done:

http://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem

http://en.wikipedia.org/wiki/Kolmogorov_complexity

http://en.wikipedia.org/wiki/Pigeonhole_principle

those being the most important ones. Current methods are more or less all analysis/statistics/continuous math. From what I've gathered, I think the new approach will come from the area of discrete math, mainly combinatorics, and (this is where the madman part comes in) cellular-automaton-like systems.
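To make the pigeonhole limit concrete, here is a minimal Python sketch of the counting argument (my own illustration, not anything from the thread; counting only nonempty outputs and stopping at n=8 are arbitrary choices). There are 2^n inputs of length n but only 2^n - 2 strictly shorter nonempty bitstrings, so no lossless compressor can shrink every input:

    # Pigeonhole bound on lossless compression: more inputs than
    # strictly-shorter outputs means some inputs cannot shrink.
    def shorter_outputs(n):
        # nonempty bitstrings of length 1..n-1:
        # 2^1 + 2^2 + ... + 2^(n-1) = 2^n - 2
        return 2 ** n - 2

    for n in range(1, 9):
        inputs = 2 ** n
        outputs = shorter_outputs(n)
        # inputs > outputs for every n, so a compressor that shortens
        # some n-bit inputs must lengthen, or collide on, others.
        assert inputs > outputs
        print(f"n={n}: {inputs} inputs, {outputs} shorter outputs")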

Discrete math offers great tools for doing R&D on paper, but it takes serious computing power later on to prove the results right. New research also needs fresh minds, unpolluted by notions of "what can and can't be done", to bring in new ways of thinking. I won't bore you with the details; I've had lots of ideas over the years, but pretty much all of them turned out to be flukes for lossless compression.
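As a toy illustration of why cellular-automaton-like systems are tempting from the Kolmogorov-complexity angle (a sketch of my own, not anything proposed in this thread; the grid width, step count, and use of zlib are arbitrary choices): Rule 30 produces output that a general-purpose statistical compressor leaves in the kilobytes, even though the generating program below is about a dozen lines, i.e. the data's true description length is tiny.

    import zlib

    WIDTH, STEPS = 256, 256
    RULE = 30                 # Wolfram rule number

    row = [0] * WIDTH
    row[WIDTH // 2] = 1       # single live cell in the middle

    bits = []
    for _ in range(STEPS):
        bits.extend(row)
        # next cell = rule bit selected by the 3-cell neighborhood,
        # with wraparound at the edges
        row = [(RULE >> ((row[(i - 1) % WIDTH] << 2)
                         | (row[i] << 1)
                         | row[(i + 1) % WIDTH])) & 1
               for i in range(WIDTH)]

    # pack the 0/1 cells into bytes, 8 cells per byte
    raw = bytes(sum(b << (7 - j) for j, b in enumerate(bits[i:i + 8]))
                for i in range(0, len(bits), 8))
    packed = zlib.compress(raw, 9)
    print(f"raw: {len(raw)} bytes, zlib: {len(packed)} bytes")

A statistical coder only sees byte frequencies and short-range correlations; it has no way to discover that the whole stream is reproducible from a one-byte rule number and a seed, which is exactly the gap Kolmogorov complexity describes.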



