LZMA tends to be faster and more efficient than bzip2 in all respects.
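If you want to sanity-check that claim on your own data, here is a minimal sketch using Python's stdlib bz2 and lzma modules as stand-ins for the bzip2 and xz tools (the input file name is just a placeholder):

    import bz2
    import lzma
    import time

    def measure(name, compress, data):
        # Time one codec and report its compression ratio.
        start = time.perf_counter()
        out = compress(data)
        print(f"{name}: {len(out) / len(data):.1%} of original, "
              f"{time.perf_counter() - start:.2f}s")

    data = open("sample.bin", "rb").read()  # placeholder input file

    measure("bzip2 -9", lambda d: bz2.compress(d, 9), data)
    measure("xz -6   ", lambda d: lzma.compress(d, preset=6), data)

Decompression speed can be timed the same way with bz2.decompress and lzma.decompress.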
I also wrote a random access program for xz files:
(in the plugins/xz subdirectory)
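For anyone curious how random access into xz data can work at all, the usual approach is to compress the data as independently decodable blocks and keep an index of their offsets. Here is a rough sketch of that general idea with Python's stdlib lzma (the block size and index layout are illustrative, not taken from that program):

    import lzma

    BLOCK = 1 << 20  # 1 MiB of uncompressed data per independent block

    def compress_blocks(data):
        # Each chunk becomes its own .xz stream; the index remembers offsets.
        blocks, index, coff = [], [], 0
        for upos in range(0, len(data), BLOCK):
            chunk = lzma.compress(data[upos:upos + BLOCK])
            index.append((upos, coff, len(chunk)))
            blocks.append(chunk)
            coff += len(chunk)
        return b"".join(blocks), index

    def read_byte(compressed, index, pos):
        # Random access: decompress only the block that contains byte `pos`.
        for upos, coff, clen in index:
            if upos <= pos < upos + BLOCK:
                return lzma.decompress(compressed[coff:coff + clen])[pos - upos]
        raise IndexError(pos)

Real multi-block .xz files already store an index of their blocks at the end of the file, which is what makes seekable readers possible in the first place.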
However, it's true that the same option doesn't seem to work everywhere. At least, it didn't work a few weeks ago on a different system (Ubuntu): the option is accepted but doesn't seem to have any effect.
For compression, let's say that the best possible compressor compresses a file to 30% of its size, and the current compressor reaches 50%. Then an improvement to 45% should not be seen as 'only 5 percentage points', or as '10% smaller', but as '25% of the maximum possible improvement'. A follow-on step that gets you from 45% to 40% would be an even larger improvement, about 33%.
That, IMO, is a reasonable way to somewhat compensate for the fact that the low-hanging fruit gets plucked by those who come first.
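Spelled out as code, the metric is just this (using the numbers from the example above):

    def remaining_improvement(best, before, after):
        # Fraction of the still-achievable improvement captured by this step.
        return (before - after) / (before - best)

    best = 0.30  # assumed best possible ratio (rarely actually known)
    print(remaining_improvement(best, 0.50, 0.45))  # 0.25  -> 25% of what was possible
    print(remaining_improvement(best, 0.45, 0.40))  # 0.333 -> ~33%, a bigger step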
And yes, there is a problem there. That 'best possible compressor', theoretically, can produce extremely small files. Maybe your Wikipedia dump happens to be equal to the binary expansion of sin(1/e + 34/sqrt(PI)) to a few billion digits, but how are you going to find out? So, for most files, we don't really know what that best compression is.
For fast lossless compressors, see this article, which uses an Apple II to compare one of today's best (LZ4) with what was available when that hardware was new.
PAQ8PX is a classic example of this: amazing compression ratios, but ridiculously slow.
There are programs that can losslessly shrink JPEGs by 24%:
NanoZip strikes a very good balance between speed and compression ratio, but it's so unpopular that it's not very practical.
The following link has some interesting (albeit non-exhaustive) benchmarks comparing various compression algorithms, in both serial and parallel implementations.
For strong compression, there are better options than bzip2.
In a sense, bzip2 is the 'worst of both worlds': slower than the fast compressors and weaker than the strong ones.