Sportman's thread reminded me of something -

one rather common idea among random-compression people is that it's possible to

split random data into substrings without repeated symbols and then compress those.

Obviously it doesn't actually work.

But I kinda liked the idea, and sometimes use it as a "data compression problem" -

given a file of length N, and knowing that each byte value occurs only once within each

aligned 256-byte block, what's the guaranteed compression ratio for this file?

I even made a demo coder for this - http://nishi.dreamhosters.com/u/task1.rar

but it's a purely mathematical problem.
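For reference, here's a minimal sketch of the counting argument (this is my own illustration, not the linked demo coder): each aligned 256-byte block is a permutation of the 256 byte values, so it carries log2(256!) ≈ 1684 bits of information instead of 256*8 = 2048 bits, for a guaranteed ratio of about 82.2%. A Lehmer-code style packer reaches that bound per block:

```python
import math
import random

# Each aligned 256-byte block is a permutation of 0..255, so its
# information content is log2(256!) bits rather than 256*8 = 2048.
bits_per_block = sum(math.log2(k) for k in range(1, 257))  # ~1684 bits
ratio = bits_per_block / 2048                              # ~0.822

def encode_block(block):
    """Map a 256-byte permutation to its Lehmer-code integer (mixed radix)."""
    remaining = list(range(256))
    code = 0
    for b in block:
        r = remaining.index(b)               # rank among values not yet seen
        code = code * len(remaining) + r
        remaining.pop(r)
    return code  # in [0, 256!), fits in ceil(log2(256!)) = 1684 bits

def decode_block(code):
    """Inverse of encode_block: recover the permutation from its code."""
    digits = []
    for radix in range(1, 257):              # last symbol's radix is 1, etc.
        digits.append(code % radix)
        code //= radix
    digits.reverse()
    remaining = list(range(256))
    return bytes(remaining.pop(d) for d in digits)

# round-trip check on a random permutation
perm = bytes(random.sample(range(256), 256))
assert decode_block(encode_block(perm)) == perm
```

Per block this stores exactly ceil(log2(256!)) = 1684 bits, so the guaranteed ratio is 1684/2048 ≈ 0.822 (ignoring a possibly partial last block).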