Lossless data compression, and information complexity in general, have been kind of a hobby of mine for the last 13-14 years. More out of OCD/curiosity than practical purposes; like a puzzle, if you will.
From what I've gathered, I think the future of compression lies in the way we look at the data patterns themselves. That is, not looking at patterns statistically, and not framing the problem as a statistical one, but finding some new approach via discrete math.
In other words, a fresh, "out of this world" approach. I have some ideas about where it might come from, but I don't want to sound like a madman.
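To make the "statistical view" I mean concrete: classical entropy coding prices a symbol purely by its frequency, so data with obvious structural patterns can still look maximally random to it. A minimal sketch (the function name and the example string are mine, just for illustration):

```python
from collections import Counter
from math import log2

def shannon_entropy(data: bytes) -> float:
    """Average bits per symbol under the purely statistical view:
    a symbol's cost depends only on how often it occurs, never on where."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * log2(c / n) for c in counts.values())

# A trivially structured sequence: every byte value exactly once, in order.
structured = bytes(range(256))
print(shannon_entropy(structured))  # 8.0 bits/symbol: statistics see no pattern at all
```

An order-0 entropy coder would call that sequence incompressible, even though "0, 1, 2, ..., 255" is about as patterned as data gets; that gap is the kind of thing a non-statistical, discrete-math view would have to exploit.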
Those being the most important ones. Current methods are more or less all analysis/statistics/continuous math. From what I've gathered, I think the new approach will come from the area of discrete math, mainly combinatorics. Also, and this is where the madman part comes in, cellular-automata-like systems.
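For readers who haven't met them, by "cellular-automata-like systems" I mean simple local update rules that generate rich global structure. A minimal sketch of elementary Rule 30 (purely illustrative; this is not a compression algorithm, just the kind of discrete system I mean):

```python
def rule30_step(cells):
    """One step of elementary cellular automaton Rule 30 with wrap-around
    edges: each new cell is left XOR (center OR right)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

# Start from a single live cell and watch structure emerge from a tiny rule.
row = [0] * 15
row[7] = 1
for _ in range(6):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```

The appeal for compression is the inverse direction: if a short rule plus a seed can generate a complex pattern, then finding such a rule for given data would be an extremely compact description. Finding it is the (very) hard part.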
Discrete math in itself offers great tools for doing R&D on paper, but it requires serious computing power later on to prove the results right. New research also needs new minds, unpolluted by notions of "what can and can't be done", in order to bring new ways of thinking. I won't bore you with the details; I've had lots of ideas over the years, but pretty much all of them were flukes for lossless.