# Thread: COMPRESSING AES CBC MODE OUTPUT

1. ## COMPRESSING AES CBC MODE OUTPUT

I stumbled on an article that I don't believe, or perhaps I'm misunderstanding its points from my quick reading of it.
Isn't it almost universally held that compressing before encryption is the best way to go for security? But this article seems to imply that compressing after encryption is just about as good. What am I missing? Do any of you have thoughts on this?

http://eprint.iacr.org/2010/477.pdf

2. I skimmed through it...
Thank you, it's very interesting... if it's right, the security consequences would be huge.
You could estimate the entropy of a sufficiently long message, and therefore tell real messages from random noise.
The only defence seems to be bringing the message's size as close as feasible to its entropy, that is, compressing it well before transmission.
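To make the "estimate entropy, then compress toward it" point concrete, here's a quick sketch of my own (a first-order byte-frequency estimate, not the paper's Slepian-Wolf machinery): redundant text sits far below 8 bits/byte, random noise sits right at it, and compression shrinks the message toward its entropy.

```python
# First-order (byte-frequency) entropy estimate: separates redundant
# text from random noise. Illustration only; all names are my own.
import math
import os
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of the byte distribution, in bits per byte."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

text = b"the quick brown fox jumps over the lazy dog " * 500
print(byte_entropy(text))                    # redundant: well below 8
print(byte_entropy(os.urandom(len(text))))   # noise: close to 8
print(len(text), len(zlib.compress(text, 9)))  # compressed size ~ entropy
```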

I wonder what they mean by:

> It was shown in [18], [19] that Slepian-Wolf compression approaches entropy with speed O(sqrt(log(n)/n)), which is considerably slower than the O(1/n) of arithmetic coding.

Also, they only did the maths for small and extremely redundant messages. 128 bytes? A non-negligible compression over 1 TB with a sensible entropy would be much more interesting. It may or may not be possible with their scheme; too bad they didn't check it.
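The gap between those two quoted rates can be eyeballed numerically; a quick sketch of my own (not from the paper) showing the per-symbol overhead each rate implies as the message length n grows:

```python
# Compare the two quoted convergence speeds as message length n grows:
# Slepian-Wolf redundancy ~ sqrt(log(n)/n) vs arithmetic coding ~ 1/n.
import math

for n in (10**3, 10**6, 10**9):
    sw = math.sqrt(math.log(n) / n)   # Slepian-Wolf overhead per symbol
    ac = 1 / n                        # arithmetic coding overhead
    print(f"n={n:>10}: Slepian-Wolf ~ {sw:.2e}, arithmetic ~ {ac:.2e}")
```

Even at n = 10^9 the Slepian-Wolf overhead is still orders of magnitude larger, which is presumably why the paper's examples stay small.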

3. Thinking about it some more, I calmed down a bit.
It doesn't seem to be able to do any modelling, just entropy coding. So all it takes to defeat it is a sufficiently precise coder. The meaning of 'sufficiently' is easy to grasp: there are formulae for the maximum achievable entropy as a function of message size. It seems to me that encryption algorithms could easily be adapted to close this vulnerability.

4. The idea is to "compress" the encrypted data by throwing away some of the bits. The decoder, which knows the key, guesses the discarded bits and tests whether the decrypted plaintext is compressible. This works well in CBC and CFB modes because a wrong guess garbles the decryption beyond the block that was guessed, so it is easy to reject. In ECB mode a wrong guess garbles only the current block (usually 128 bits), and in OFB mode it flips only the guessed bits themselves; the subsequent blocks are not affected.

The obvious disadvantage is that you can't throw away too many bits or else decompression takes too long. Also, you can't throw away more bits than the redundancy of the plaintext. If you did, then you could not decompress even with unlimited computing power because more than one guess would test correct. In general, the compressor does not know either the plaintext or the key, so it cannot know how many bits can be safely discarded. It might be zero if, unknown to the compressor, the plaintext is already compressed.
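The guess-and-test decoding described above can be sketched end to end. This is my own toy illustration, not the paper's construction: a 4-round Feistel "cipher" built from SHA-256 stands in for AES (just enough diffusion that wrong guesses decrypt to garbage), the "compressor" discards the last byte of the CBC ciphertext, and the decoder brute-forces it, keeping the guess whose decryption passes a printable-ASCII redundancy test.

```python
# Toy demo of "discard bits, key-holding decoder guesses them back".
# NOT real AES: a 4-round SHA-256 Feistel network in CBC mode, purely
# illustrative. All names here are my own assumptions.
import hashlib
import os

BLOCK = 16  # bytes

def _f(key: bytes, half: bytes, rnd: int) -> bytes:
    # Feistel round function: keyed hash of one 8-byte half.
    return hashlib.sha256(key + bytes([rnd]) + half).digest()[:8]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def enc_block(key, b):
    L, R = b[:8], b[8:]
    for i in range(4):
        L, R = R, xor(L, _f(key, R, i))
    return L + R

def dec_block(key, b):
    L, R = b[:8], b[8:]
    for i in reversed(range(4)):
        L, R = xor(R, _f(key, L, i)), L
    return L + R

def cbc_encrypt(key, iv, pt):
    out, prev = [], iv
    for i in range(0, len(pt), BLOCK):
        prev = enc_block(key, xor(pt[i:i + BLOCK], prev))
        out.append(prev)
    return b"".join(out)

def cbc_decrypt(key, iv, ct):
    out, prev = [], iv
    for i in range(0, len(ct), BLOCK):
        c = ct[i:i + BLOCK]
        out.append(xor(dec_block(key, c), prev))
        prev = c
    return b"".join(out)

key, iv = os.urandom(16), os.urandom(16)
pt = b"ATTACK AT DAWN. ATTACK AT DAWN. "  # 32 bytes of redundant ASCII
ct = cbc_encrypt(key, iv, pt)

# "Compress" by discarding the last ciphertext byte.
punctured = ct[:-1]

# Decoder (knows the key): try all 256 values for the missing byte and
# keep those whose decryption is all printable ASCII (redundancy test).
candidates = [g for g in range(256)
              if all(32 <= b < 127 for b in
                     cbc_decrypt(key, iv, punctured + bytes([g])))]
assert ct[-1] in candidates
print("surviving guesses:", candidates)
```

Discarding one byte means 256 trial decryptions; discard k bits and the decoder pays 2^k, which is exactly the "can't throw away too many bits" trade-off above.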

Another method, not mentioned in the paper, is that ECB mode is compressible if there are multiple identical blocks of plaintext that encrypt to identical blocks of ciphertext. Of course this is insecure, which is why ECB mode isn't used unless other steps are taken to avoid duplicate plaintext blocks. Compressing the plaintext would be one way to do this. But CBC mode is simpler and guaranteed to work.
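The ECB observation is easy to demonstrate: identical plaintext blocks produce identical ciphertext blocks, so the repetition survives encryption and a generic compressor can exploit it. A minimal sketch of my own, using a keyed HMAC as a stand-in per-block map (it is not invertible, so this shows only the duplicate-block leak, not decryption; a real demo would use actual AES-ECB):

```python
# ECB leaks block-level repetition: equal plaintext blocks map to equal
# ciphertext blocks. HMAC stands in for the block cipher (illustrative
# only -- deterministic and keyed, but not invertible).
import hashlib
import hmac
import os
import zlib

BLOCK = 16
key = os.urandom(16)

def toy_ecb(pt: bytes) -> bytes:
    # Apply the same keyed, deterministic map to each block independently.
    return b"".join(
        hmac.new(key, pt[i:i + BLOCK], hashlib.sha256).digest()[:BLOCK]
        for i in range(0, len(pt), BLOCK))

pt = b"0123456789abcdef" * 1000   # 1000 identical 16-byte blocks
ct = toy_ecb(pt)

# Every ciphertext block is identical, so the ciphertext compresses well.
blocks = {ct[i:i + BLOCK] for i in range(0, len(ct), BLOCK)}
print(len(blocks))                       # 1 distinct ciphertext block
print(len(ct), len(zlib.compress(ct)))   # large -> tiny
```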
