As an example,
http://nishi.dreamhosters.com/u/gv2_1.rar
858 bytes. It's just the postcoder from bwtmix now, without BWT,
with some tweaks. I'd post the source if anybody wants it,
but it's not really useful imho.
Details (bytes saved per change):
-1 5-bit coding
-1 extra check for >=26
-1 redundant rc cache byte removed
-2 rc flush optimized
-4 file size field removed (hardcoded 1674)
-26 tuning of model's parameters
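To illustrate the first two items: a minimal sketch of what 5-bit coding with a ">=26" validity check could look like, assuming it means packing lowercase letters a-z into 5 bits per symbol, where decoded values 26..31 are invalid (the encode5/decode5 names and the whole implementation are mine, not from gv2_1):

```python
def encode5(text):
    """Pack lowercase letters a-z into a 5-bit-per-symbol bitstream."""
    acc, bits, out = 0, 0, bytearray()
    for ch in text:
        sym = ord(ch) - ord('a')       # 0..25 fits in 5 bits
        assert 0 <= sym < 26
        acc = (acc << 5) | sym
        bits += 5
        while bits >= 8:               # emit whole bytes as they fill up
            bits -= 8
            out.append((acc >> bits) & 0xFF)
    if bits:                           # pad the last partial byte with zeros
        out.append((acc << (8 - bits)) & 0xFF)
    return bytes(out)

def decode5(data, n):
    """Unpack n 5-bit symbols; reject invalid codes >= 26."""
    acc, bits, res = 0, 0, []
    for b in data:
        acc = (acc << 8) | b
        bits += 8
        while bits >= 5 and len(res) < n:
            bits -= 5
            sym = (acc >> bits) & 31
            if sym >= 26:              # the extra ">=26" check
                raise ValueError("invalid 5-bit code")
            res.append(chr(sym + ord('a')))
    return ''.join(res)
```

This alone takes the alphabet from 8 bits to 5 bits per symbol before any modeling; the >=26 check additionally lets the coder exclude the 6 impossible codes from the probability space.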
But anyway, none of this really makes sense in practice.
Improving compression by 5% at 1k scale doesn't mean
that there'd be the same improvement at 1M scale
(in fact, likely the reverse, because tuning for small files
tends to oversimplify the models).
I guess it might still be applicable if there's random access
to small independently compressed records or something,
but again, in that case there are better methods.