
Thread: What is best for a pure Entropy Encoder?

  1. #1
    Member biject.bwts's Avatar
    Join Date
    Jun 2008
    Thanked 14 Times in 10 Posts

    Question What is best for a pure Entropy Encoder?

    When people write compression code, they generally have several stages, with the last stage often being an entropy coder. The question is what the best approach is
    for this last stage. Do you assume a nonstationary data stream at this point, so that you don't use a pure entropy coder and instead make up for the mistakes of the models in the earlier stages by tuning it to various files? Or do you make it as pure as possible and tune the preceding stages to give a more stationary stream of data to this final stage of entropy compression? Or does one just take a set of files and try to tune the whole set of passes so it works well on that set of data?
    I am just curious what other people think. I feel most know my thoughts in this area, so please feel free to discuss your own. Or do most people think the same? Also, do you prefer to work in binary for the last stage, or with some larger alphabet?

  2. #2
    Join Date
    Oct 2010
    Thanked 34 Times in 22 Posts
    I think your question aims at something different, but for me compression is always modeling + coding. The coding should be done with arithmetic coding or something similar.
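    For concreteness, here is a minimal sketch of the "coding" half, using exact rationals instead of the fixed-precision arithmetic a real coder would use (the static bit probability `p1` is a stand-in for whatever the model supplies):

    ```python
    from fractions import Fraction

    def encode(bits, p1):
        """Narrow [0, 1) to an interval that uniquely identifies `bits`."""
        low, width = Fraction(0), Fraction(1)
        for b in bits:
            if b == 0:
                width *= 1 - p1            # keep the lower sub-interval
            else:
                low += width * (1 - p1)    # skip past the 0 sub-interval
                width *= p1
        return low, width                  # any x in [low, low + width) decodes back to `bits`

    def decode(x, p1, n):
        low, width = Fraction(0), Fraction(1)
        out = []
        for _ in range(n):
            split = low + width * (1 - p1)
            if x < split:
                out.append(0)
                width *= 1 - p1
            else:
                out.append(1)
                low = split
                width *= p1
        return out

    bits = [1, 0, 1, 1, 0, 0, 0, 1]
    low, width = encode(bits, Fraction(1, 3))
    assert decode(low, Fraction(1, 3), len(bits)) == bits
    ```

    A production coder (range coder, rANS, etc.) performs the same interval narrowing, but with fixed-width integers and periodic renormalization instead of exact rationals.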

  3. #3
    Join Date
    Feb 2011
    St. Albans, England
    Thanked 0 Times in 0 Posts
    I've never understood what "entropy encoding" is, other than encoding a pure-entropy file so it's smaller (which it is not; that means I get to call it whatever I want when I get around to writing it). However, Wikipedia claims "In information theory an entropy encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium," and goes on to state that Huffman and arithmetic coding are entropy encodings.

    I'm assuming that by this definition it's any encoding method that does not use modelling? Please feel free to correct me on this.

  4. #4
    Join Date
    Jun 2009
    Kraków, Poland
    Thanked 136 Times in 104 Posts
    Well, I think an entropy coder is given the probabilities of the symbols in the alphabet at any stage of the compression process and must produce an unambiguous code word from them. It does not have any memory, besides some state independent of the input alphabet (i.e. low, range, compressed output, etc.).
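    That stateless view can be sketched directly: the coder's whole state is the pair (low, range), and the distribution for each step comes from outside (the distributions below are made-up model output, not from any real model):

    ```python
    from fractions import Fraction

    def narrow(low, rng, probs, sym):
        """One coding step: shrink [low, low + rng) to sym's sub-interval.
        probs maps symbol -> probability for *this* step only; the coder
        remembers nothing about past symbols, just (low, rng)."""
        cum = Fraction(0)
        for s in sorted(probs):
            if s == sym:
                return low + rng * cum, rng * probs[s]
            cum += probs[s]
        raise KeyError(sym)

    def identify(low, rng, probs, x):
        """Decoder side of one step: find which sub-interval contains x."""
        cum = Fraction(0)
        for s in sorted(probs):
            lo2, w2 = low + rng * cum, rng * probs[s]
            if x < lo2 + w2:
                return s, lo2, w2
            cum += probs[s]
        raise ValueError(x)

    # The model may hand the coder a different distribution every step.
    dists = [
        {'a': Fraction(1, 2), 'b': Fraction(1, 2)},
        {'a': Fraction(3, 4), 'b': Fraction(1, 8), 'c': Fraction(1, 8)},
    ]
    low, rng = Fraction(0), Fraction(1)
    for d, sym in zip(dists, ['b', 'c']):
        low, rng = narrow(low, rng, d, sym)

    # Decode by redoing the same interval search with the same distributions.
    x, dlow, drng = low, Fraction(0), Fraction(1)
    decoded = []
    for d in dists:
        sym, dlow, drng = identify(dlow, drng, d, x)
        decoded.append(sym)
    assert decoded == ['b', 'c']
    ```

    Note the coder never sees where the probabilities came from; whether the model is order-0, context mixing, or anything else, the interface is the same.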

  5. #5
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Kharkov, Ukraine
    Thanked 1,397 Times in 802 Posts
    Well, psrc has a secondary model in it, but I believe that it's still an entropy coder.
    As I see it, the model outputs quantitative estimates of the amount of information in the data symbols
    (entropy = -information, so it's frequently used as a synonym, to avoid tautology).
    Then we need an algorithm to efficiently transform that information into a number.
    But computers have limited precision, so we need an approximation which would be close
    to optimal for the whole input.
    I guess the algorithm that approximates the Shannon entropy for a given model's output is
    called an entropy coder.
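    This can be checked numerically: the information the model assigns fixes the ideal code length, and an arithmetic coder lands within a couple of bits of it (the per-symbol probabilities here are made-up model output):

    ```python
    import math

    # probability the model assigned to each symbol as it was coded (hypothetical)
    probs = [0.5, 0.25, 0.125, 0.125, 0.5]

    # Shannon information content of the sequence, in bits: sum of -log2(p)
    info = sum(-math.log2(p) for p in probs)
    print(info)  # 10.0

    # An arithmetic coder narrows an interval of width prod(probs), so it needs
    # about ceil(-log2(width)) + 1 bits to name a number inside that interval.
    width = math.prod(probs)
    print(math.ceil(-math.log2(width)) + 1)  # 11
    ```

    So the coder's output length tracks the model's entropy estimate to within a constant, regardless of how good that estimate actually is; the model alone decides the 10 bits.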

