In fact, this type of linear trade-off is ubiquitous in
known succinct data structures, and in data compression.
The folk wisdom is that if we want to waste one bit per block,
the encoding is so constrained that it cannot help the query
in any way. Thus, the only thing a query can do is to read
the entire block and unpack it.
We break this limitation and show how to use recursion
to improve redundancy. It turns out that if a block is encoded
with two (!) bits of redundancy, we can decode a
single element, and answer many other interesting queries,
in time logarithmic in the block size.
Our technique allows us to revisit classic problems in
succinct data structures, and give surprising new upper
bounds. We also construct a locally-decodable version of
arithmetic coding.
An Alternative to Arithmetic Coding with Local Decodability (2010) [pdf]
Changing base without losing space (2010) [pdf] https://dl.acm.org/citation.cfm?id=1806771 (same paper as above, different title)
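To make the "one bit per block" trade-off described above concrete, here is a minimal sketch (my own illustration, not code from the paper) of the naive approach: a block of k trits is packed into a single integer of ceil(k * log2(3)) bits, so the redundancy is under one bit per block, but decoding even one trit touches the entire encoding — exactly the limitation the paper's recursion breaks.

```python
import math

def encode_trits(trits):
    """Pack a block of base-3 digits into one integer.

    A block of k trits fits in ceil(k * log2(3)) bits, i.e. less than
    one bit of redundancy per block over the k*log2(3) bits of entropy.
    """
    value = 0
    for t in reversed(trits):  # trit 0 ends up least significant
        assert 0 <= t <= 2
        value = value * 3 + t
    return value

def decode_trit(value, i):
    """Decode the i-th trit -- but only by peeling off trits 0..i-1,
    i.e. by unpacking the whole block (the naive scheme's weakness)."""
    for _ in range(i):
        value //= 3
    return value % 3

block = [2, 0, 1, 1, 2, 0, 2, 1]
enc = encode_trits(block)
bits_used = math.ceil(len(block) * math.log2(3))  # 13 bits for 8 trits
assert enc < 2 ** bits_used
assert [decode_trit(enc, i) for i in range(len(block))] == block
```

Eight trits carry about 12.68 bits of information and fit in 13 bits here, versus 16 bits if each trit were stored in its own 2-bit field.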
"The ternary numeral system (also called base 3) has three as its base... a ternary digit (trit) is analogous to a bit. One trit is equivalent to log2 3 (about 1.58496) bits of information."
NB: Base 3 is the integer base with the lowest average radix economy; however, base e has the lowest average radix economy overall.
 Ternary numeral system https://en.wikipedia.org/wiki/Ternary_numeral_system
 Radix economy https://en.wikipedia.org/wiki/Radix_economy
 Three-valued logic https://en.wikipedia.org/wiki/Three-valued_logic
Representing trits (0, 1, 2) efficiently in binary is harder; that is the subject of the paper. (It's a great paper, and we lost a marvelous mind when Mihai died.)
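A classic concrete instance of the difficulty: storing each trit in 2 bits wastes 2 - log2(3), about 0.415 bits per trit, while packing five trits into one byte (3^5 = 243 <= 256) wastes only about 0.015 bits per trit. The catch, as above, is that a single trit can then only be read by unpacking its whole byte. A sketch of the byte-packed scheme (my illustration, not the paper's construction):

```python
import math

def pack5(trits):
    """Pack exactly five trits into one byte: 3**5 = 243 <= 256."""
    assert len(trits) == 5 and all(0 <= t <= 2 for t in trits)
    v = 0
    for t in reversed(trits):  # trit 0 is least significant
        v = v * 3 + t
    return bytes([v])

def unpack5(b):
    """Recover all five trits; there is no cheaper single-trit access."""
    v = b[0]
    out = []
    for _ in range(5):
        out.append(v % 3)
        v //= 3
    return out

# Redundancy: 8 bits spent vs 5 * log2(3) ~ 7.925 bits of information.
waste_per_trit = (8 - 5 * math.log2(3)) / 5
assert math.isclose(waste_per_trit, 0.0150, abs_tol=1e-3)

trits = [1, 0, 2, 2, 1]
assert unpack5(pack5(trits)) == trits
```

Grouping more trits per block drives the waste lower still, at the cost of ever larger blocks to unpack — the trade-off the paper's two-bit-redundancy recursion escapes.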