

Tokyocabinet + LZ4 = Speedy Gonzales - maxpert
http://blog.creapptives.com/post/25026698783/speeding-up-your-datastore-with-compression
Using a simple compression technique can result in a huge difference.
======
pivo
I got similar results compressing data for a mixed memory/disk based cache
(Ehcache) using Java's in-built GZIP compression. Can't believe it took me so
long to try this.

The cache values were in the 2-8K range and GZIP gave me about 90%
compression. I also "compressed" the cache keys (which could be in the 1-2K
range) by using their MD5 sums instead of the actual keys. The tiny chance of
a collision was hugely outweighed by the memory savings.
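A minimal sketch of the two techniques described above, in plain Java (assumptions: the cache stores `byte[]` values and `String` keys; Ehcache wiring is omitted, and the payload string is made up for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.zip.GZIPOutputStream;

public class CompressedCacheDemo {
    // GZIP-compress a value before putting it in the cache.
    static byte[] gzip(byte[] value) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(value);
        }
        return buf.toByteArray();
    }

    // Replace a potentially 1-2K cache key with its fixed-size MD5 digest
    // (32 hex chars), accepting the tiny collision risk.
    static String md5Key(String key) throws Exception {
        byte[] d = MessageDigest.getInstance("MD5")
                .digest(key.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : d) sb.append(String.format("%02x", b & 0xff));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // A repetitive ~4K value, like the 2-8K cache values mentioned above.
        byte[] value = "some fairly repetitive cached payload... "
                .repeat(100).getBytes(StandardCharsets.UTF_8);
        byte[] packed = gzip(value);
        System.out.println(value.length + " -> " + packed.length + " bytes");

        String longKey = "a".repeat(1500); // stand-in for a 1-2K key
        System.out.println("key digest: " + md5Key(longKey));
    }
}
```

Repetitive text like this compresses very well; real-world ratios depend entirely on the data.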

~~~
maxpert
I totally agree with the MD5 strategy, but it can be a bad idea for range
scans, if you want to use them at all in the future.
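To make the range-scan caveat concrete, here is a small sketch (key names like `user:0001` are hypothetical): once keys are stored under their MD5 digests, sorted digest order has no relation to the original key order, so "every key between `user:0001` and `user:0005`" is no longer a contiguous scan.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;
import java.util.TreeMap;

public class HashedKeyOrder {
    static String md5Hex(String s) throws Exception {
        byte[] d = MessageDigest.getInstance("MD5")
                .digest(s.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : d) sb.append(String.format("%02x", b & 0xff));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Keys a range scan could walk in order: user:0001 .. user:0005.
        List<String> keys = new ArrayList<>();
        for (int i = 1; i <= 5; i++) keys.add(String.format("user:%04d", i));

        // Stored under MD5 digests, the sorted order scatters the keys:
        // neighbours in key space are no longer neighbours on disk.
        TreeMap<String, String> byDigest = new TreeMap<>();
        for (String k : keys) byDigest.put(md5Hex(k), k);
        byDigest.forEach((digest, k) -> System.out.println(digest + " -> " + k));
    }
}
```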

------
bravura
This works well if the value is 3K-4K.

But what about very short keys and values? Things on the order of 10 bytes?

I imagine that the compression header would not allow for meaningful
compression.

What would be an appropriate compression library for this use case of
compressing short texts? Particularly if we know that we are compressing
English text? I have searched for such a library, but cannot find one.
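The header overhead is easy to demonstrate: the GZIP container alone costs 18 bytes (a 10-byte header plus an 8-byte CRC/length trailer), so wrapping a 10-byte value makes it bigger, not smaller. A quick sketch (the payload is made up; for short English text, the usual approach is a shared/preset dictionary, e.g. zlib's `deflateSetDictionary`, or a short-string codec such as smaz):

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class TinyValueOverhead {
    static byte[] gzipBytes(byte[] value) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(value);
        }
        return buf.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        byte[] tiny = "hello user".getBytes(StandardCharsets.UTF_8); // 10 bytes
        byte[] packed = gzipBytes(tiny);
        // The fixed GZIP header + trailer outweigh any savings at this size,
        // so the "compressed" form ends up larger than the input.
        System.out.println(tiny.length + " -> " + packed.length + " bytes");
    }
}
```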

~~~
maxpert
You don't need to worry about compressing values that small.

------
hero-nakamora
How about doing some benchmarks with Redis?

~~~
maxpert
You can try it on your own. What I expect is that it would be much faster,
since you are pushing less data over the wire.

