
Thread: COMPRESSING AES CBC MODE OUTPUT

  1. #1
    Member biject.bwts's Avatar
    Join Date
    Jun 2008
    Location
    texas
    Posts
    449
    Thanks
    23
    Thanked 14 Times in 10 Posts

    COMPRESSING AES CBC MODE OUTPUT

    I stumbled on an article that I don't believe, or perhaps I misunderstood its points on a fast reading.
    Isn't it virtually always held that compressing before encryption is the best way to go for security? Yet this article seems to imply that compressing after encryption is just about as good. What am I missing, and do any of you have thoughts on this?

    http://eprint.iacr.org/2010/477.pdf

  2. #2
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    I skimmed through it...
    Thank you, it's very interesting. If it's right, the security consequences would be huge: you could estimate the entropy of a sufficiently long message, and therefore tell real messages from random noise.
    The only defence seems to be to bring the message's size as close as feasible to its entropy, that is, to compress it well before transmission.

    I wonder what they mean by:
    "It was shown in [18], [19] that Slepian-Wolf compression approaches entropy with speed O(sqrt(log(n)/n)), which is considerably slower than the O(1/n) of arithmetic coding."
    Also, they did the maths only for small and extremely redundant messages. 128 bytes? A non-negligible compression of 1 TB with a sensible entropy would be much more interesting. It may or may not be possible with their scheme; too bad they didn't check it.

  3. #3
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Thinking about it some more, I calmed down a bit.
    It doesn't seem to be able to do any modelling, just entropy coding. So all it takes to defeat it is a sufficiently precise coder. The meaning of 'sufficiently' is easy to grasp: there are formulae to calculate the maximum entropy of a message of a given size. It seems to me that encryption algorithms could easily be adapted to bypass this vulnerability.
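The defence described above — compressing the message close to its entropy before encryption — can be illustrated with the standard library (zlib here stands in for a "sufficiently precise coder"; this is my own sketch, not the paper's construction). Once the first pass has removed the redundancy, a second pass finds nothing left to exploit, which is exactly the situation that would starve the attack of a way to tell right guesses from wrong ones.

```python
import zlib

# Highly redundant plaintext: its size is far above its entropy.
plain = b"the quick brown fox jumps over the lazy dog. " * 100

packed = zlib.compress(plain, 9)     # first pass removes the redundancy
repacked = zlib.compress(packed, 9)  # second pass: nothing left to exploit

print(len(plain), len(packed), len(repacked))
```

The first pass shrinks the 4500-byte input dramatically; the second pass does not shrink it further (zlib's framing even adds a few bytes), i.e. the compressed message is already near its entropy.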

  4. #4
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    The idea is to "compress" the encrypted data by throwing away some of the bits. The decoder, which knows the key, guesses the discarded bits and tests that the decrypted plaintext is compressible. This works well in CBC and CFB modes because if the decoder guesses wrong then the rest of the message looks random. In ECB and OFB modes this does not work as well because only the current block (usually 128 bits) looks random. The subsequent blocks are not affected.

    The obvious disadvantage is that you can't throw away too many bits or else decompression takes too long. Also, you can't throw away more bits than the redundancy of the plaintext. If you did, then you could not decompress even with unlimited computing power because more than one guess would test correct. In general, the compressor does not know either the plaintext or the key, so it cannot know how many bits can be safely discarded. It might be zero if, unknown to the compressor, the plaintext is already compressed.
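The guess-and-test decoding described above can be sketched in a few lines. This is my own toy illustration, not the paper's construction: a 4-round Feistel network built from SHA-256 stands in for AES, the sender discards the last byte of the CBC ciphertext, and the keyed decoder tries all 256 completions, keeping the one whose decryption looks like plausible plaintext (here: printable ASCII).

```python
import hashlib

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def f(key, rnd, half):
    # Round function: a keyed hash truncated to half a block.
    return hashlib.sha256(key + bytes([rnd]) + half).digest()[:8]

def enc_block(block, key):
    # Toy 4-round Feistel cipher on 16-byte blocks (stand-in for AES).
    L, R = block[:8], block[8:]
    for i in range(4):
        L, R = R, xor(L, f(key, i, R))
    return L + R

def dec_block(block, key):
    # Invert the Feistel rounds in reverse order.
    L, R = block[:8], block[8:]
    for i in reversed(range(4)):
        L, R = xor(R, f(key, i, L)), L
    return L + R

def cbc_encrypt(pt, key, iv):
    out, prev = b"", iv
    for i in range(0, len(pt), 16):
        prev = enc_block(xor(pt[i:i+16], prev), key)
        out += prev
    return out

def cbc_decrypt(ct, key, iv):
    out, prev = b"", iv
    for i in range(0, len(ct), 16):
        block = ct[i:i+16]
        out += xor(dec_block(block, key), prev)
        prev = block
    return out

key, iv = b"shared-secret", bytes(16)
plaintext = b"this is a highly redundant text."  # 32 bytes, printable ASCII
ct = cbc_encrypt(plaintext, key, iv)

sent = ct[:-1]               # "compress": discard the last ciphertext byte
recovered = None
for guess in range(256):     # decoder guesses the missing byte...
    pt = cbc_decrypt(sent + bytes([guess]), key, iv)
    if all(32 <= b < 127 for b in pt):  # ...and tests for plausible plaintext
        recovered = pt
        break
print(recovered == plaintext)
```

Discarding only the final byte keeps the search space tiny (256 candidates); each additional discarded bit doubles the decoder's work, which is the disadvantage described above.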

    Another method, not mentioned in the paper, is that ECB mode is compressible if there are multiple identical blocks of plaintext that encrypt to identical blocks of ciphertext. Of course this is insecure, which is why ECB mode isn't used unless other steps are taken to avoid duplicate plaintext blocks. Compressing the plaintext would be one way to do this. But CBC mode is simpler and guaranteed to work.
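The ECB observation is easy to demonstrate with a sketch (my own illustration: a keyed hash serves as a stand-in deterministic block map, since only the forward direction matters here). Identical plaintext blocks produce identical ECB ciphertext blocks, which a general-purpose compressor exploits, while CBC-style chaining destroys the repetition.

```python
import hashlib
import zlib

key = b"secret"
plain = b"A" * 16 * 64  # 64 identical 16-byte plaintext blocks

def blockmap(data):
    # Deterministic keyed map standing in for one AES block encryption.
    return hashlib.sha256(key + data).digest()[:16]

# ECB: each block encrypted independently -> repeated blocks survive.
ecb = b"".join(blockmap(plain[i:i+16]) for i in range(0, len(plain), 16))

# CBC-style chaining: each block mixed with the previous ciphertext block.
cbc, prev = b"", bytes(16)
for i in range(0, len(plain), 16):
    prev = blockmap(bytes(x ^ y for x, y in zip(plain[i:i+16], prev)))
    cbc += prev

print(len(zlib.compress(ecb)), len(zlib.compress(cbc)))
```

The ECB "ciphertext" is one 16-byte value repeated 64 times and compresses to a few dozen bytes; the chained version is 1024 pseudorandom bytes and does not compress at all.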

