
Thread: Sequential implementation of ANS-like entropy coder

  1. #1
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Kharkov, Ukraine
    Thanked 1,360 Times in 777 Posts

    Sequential implementation of ANS-like entropy coder

    > What is really interesting is combining FIFO with simplicity of ANS for large alphabet

Actually I'm still intending to use rangecoders for a while.
Most rANS optimizations are easily applicable to rc,
and there's basically no difference in performance for binary alphabets.

Also, most non-linear model components (FSM counters, logistic mixing, SSE, etc.)
are not compatible with non-binary alphabets anyway,
so I think it's more important to look for speed optimizations for binary coders
rather than just being happy about vectorized order-0 models.

    But if you want a FIFO ANS, I might have some ideas.
Well, it's obvious that order reversal is inherent if you
want to use state'=state*SCALE+symbol.low updates.
It's basically the same as converting decimal to binary -
you can do FIFO decoding if you know all the coefficients...
which you might actually know if you use a combinatoric model.

However, in that case either the encoding would have to look like
or the decoding like
bit = state/LUT[freq0,freq1]%SCALE > probability;

    So if you require
    encoding: state'=state*SCALE+symbol.low
    decoding: symbol=LUT[state%SCALE]; state/=SCALE;
Then it's obvious that the order would be inverted.
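The inversion is easy to see in a toy sketch (Python; a minimal illustration with made-up names, using decimal digits as the "symbols" to keep the base-conversion analogy explicit):

```python
SCALE = 10  # "decimal digits", so the state reads like an ordinary number

def encode(symbols):
    # state' = state*SCALE + symbol.low; here low is the digit itself
    state = 1  # non-zero initial state marks the end on decode
    for s in symbols:
        state = state * SCALE + s
    return state

def decode(state):
    # symbol = state % SCALE; state /= SCALE -- pops in reverse (LIFO) order
    out = []
    while state > 1:
        out.append(state % SCALE)
        state //= SCALE
    return out

msg = [3, 1, 4, 1, 5]
assert encode(msg) == 131415
assert decode(encode(msg)) == msg[::-1]  # order comes back inverted
```

Decoding pops the most recently encoded digit first, exactly like reading off the digits of a number - hence LIFO.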


    1) We can combine ANS w/o renorm with a high-precision rangecoder, like

    this rc supports 64-bit symbol intervals, so we can encode 64-bit
rANS states with it, and it's possible to pack quite a lot into 64 bits.
    For example, it can be 4 bytes with a 16-bit freqtable.

    Note that the number of ANS instances in this case is potentially unlimited,
    because ANS states would be properly packed.
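As a rough illustration of the packing claim (a minimal Python sketch with an assumed uniform 16-bit frequency table; not the rc-coupled implementation itself), four byte symbols can be pushed into one 64-bit rANS state without renormalization:

```python
SCALE_BITS = 16
SCALE = 1 << SCALE_BITS

# toy 16-bit frequency table over bytes: uniform, freq = SCALE // 256 each
freq = [SCALE // 256] * 256
cum = [0] * 257
for i in range(256):
    cum[i + 1] = cum[i] + freq[i]

def rans_encode(symbols, state=1):
    # state' = (state // freq) * SCALE + cum + (state % freq), no renorm
    for s in symbols:
        state = (state // freq[s]) * SCALE + cum[s] + (state % freq[s])
    return state

def rans_decode(state, n):
    out = []
    for _ in range(n):
        slot = state % SCALE
        # find the symbol whose [cum, cum+freq) contains slot; linear scan for clarity
        s = next(i for i in range(256) if cum[i] <= slot < cum[i + 1])
        state = freq[s] * (state // SCALE) + slot - cum[s]
        out.append(s)
    return out[::-1], state  # rANS pops symbols in reverse order

data = [0x12, 0x34, 0xAB, 0xFF]
st = rans_encode(data)
# with these uniform frequencies each byte costs 8 bits, well within 64;
# even with skewed 16-bit frequencies the worst case is ~16 bits per symbol
assert st < (1 << 64)
assert rans_decode(st, 4)[0] == data
```

The final 64-bit state is the value that could then be handed to a 64-bit-precision rangecoder as a single symbol interval.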

    2) As I already mentioned, the "payload" in ANS state update
    doesn't have to come from this specific state.

    I mean, you always use
state'=(state/symbol.range)*SCALE+symbol.low+(state%symbol.range)

But the important part is actually
state'=(state/symbol.range)*SCALE+symbol.low+payload
where the log2(symbol.range) bits of payload can be taken from anywhere -
like from another block of data, or at least from another interleaved stream.
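A minimal sketch of that payload swap (Python, all names hypothetical): the usual remainder state%range is displaced to "another stream", and payload bits taken from elsewhere are stored in its place:

```python
SCALE = 16

def encode_step(state, low, rng, payload):
    # the usual remainder (state % rng) is displaced to "another stream",
    # and payload bits from elsewhere take its place in the new state
    assert 0 <= payload < rng
    return (state // rng) * SCALE + low + payload, state % rng

def decode_step(state, lut, displaced):
    slot = state % SCALE
    low, rng = lut[slot]            # symbol lookup by slot
    payload = slot - low            # recover the injected payload bits
    return (state // SCALE) * rng + displaced, low, payload

# two symbols: A = [0,8), B = [8,16); lut maps slot -> (low, range)
lut = [(0, 8)] * 8 + [(8, 8)] * 8

state = 1
state, d1 = encode_step(state, 0, 8, 5)   # symbol A, payload 5 from elsewhere
state, d2 = encode_step(state, 8, 8, 3)   # symbol B, payload 3
state, low, payload = decode_step(state, lut, d2)
assert (low, payload) == (8, 3)
state, low, payload = decode_step(state, lut, d1)
assert (low, payload) == (0, 5)
assert state == 1                         # initial state recovered exactly
```

The displaced remainders d1, d2 stand in for whatever stream actually carries those bits; decoding returns both the symbol and the injected payload.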

  2. Thanks (2):

encode (2nd July 2017), RamiroCruzo (2nd July 2017)

  3. #2
    Join Date
    Nov 2013
    Kraków, Poland
    Thanked 262 Times in 161 Posts
Indeed, it would be nice to remove ANS's need for a buffer: a buffer of the actual data for an i.i.d. source or simple context models (e.g. Markov), and additionally of the probabilities for adaptive models.
The advantage of ANS is mainly the large-alphabet case: reducing the number of steps (and removing binarization), like the LZMA -> LZNA transition, which uses a size-16 (4-bit) alphabet with vectorized probability adaptation and symbol search.
I wonder if a similar approach could be used to fill the cliff in the Pareto frontier between zpaq and LZNA - context mixing on 4-bit symbols rather than bits?

Returning to FIFO ANS, prefix codes can be seen as a special case; e.g. A->00, B->010, C->011, D->1 corresponds to the AABCDDDD symbol spread for tANS.
So there is a large family of tANS automata for which we can have FIFO - the question is whether it can be enlarged.
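The correspondence can be checked mechanically: an l-bit codeword implies probability 2^-l, hence L*2^-l states out of L total; for the code above with L = 8 this reproduces the AABCDDDD spread (a small Python sketch):

```python
# prefix code from the post: A->00, B->010, C->011, D->1
code = {"A": "00", "B": "010", "C": "011", "D": "1"}
L = 8  # number of tANS states

# a symbol with an l-bit codeword has probability 2^-l, so it should
# occupy L * 2^-l states in the symbol spread
counts = {s: L >> len(w) for s, w in code.items()}
assert counts == {"A": 2, "B": 1, "C": 1, "D": 4}
assert sum(counts.values()) == L  # Kraft equality: the code is complete

spread = "".join(s * n for s, n in counts.items())
assert spread == "AABCDDDD"
```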

For the binary alphabet there are a few FIFO finite-state-machine approaches, like quasi-arithmetic coding or the M/Q coders, e.g. from CABAC.
However, taking them to a larger alphabet is rather impractical, because memory requirements and preparation costs grow much faster than for tANS.
For this task there is a real need to develop a field combining automata theory with information theory - seeing the states of an automaton as a buffer containing a fractional number of bits, like lg(x) in the tANS automaton.

Another question is whether the FIFO (RC) vs LIFO (rANS) boundary can be crossed for arithmetic coding, but that might be impossible (?)

  4. #3
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Kharkov, Ukraine
    Thanked 1,360 Times in 777 Posts
    > context mixing on not binary but 4bit symbols?

The LZNA approach should be compatible with linear mixing.
For an example of bytewise linear mixing, see
It might also still be possible to apply logistic updates to linear mixer weights.
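As a hedged sketch of what a linear mixer over a 4-bit alphabet could look like (all names and the weight-update rule are made up here, not taken from LZNA):

```python
def mix(p1, p2, w):
    # linear mix of two distributions over the 16-symbol (4-bit) alphabet
    return [w * a + (1 - w) * b for a, b in zip(p1, p2)]

def update(w, p1, p2, sym, rate=0.02):
    # simple heuristic: nudge the weight toward whichever model
    # assigned more probability to the symbol that actually occurred
    w += rate * (p1[sym] - p2[sym])
    return min(max(w, 0.0), 1.0)  # clamp so the mix stays a distribution

uniform = [1 / 16] * 16
peaked = [0.85 if i == 7 else 0.01 for i in range(16)]

w = 0.5
for _ in range(100):           # symbol 7 keeps occurring...
    w = update(w, peaked, uniform, 7)
p = mix(peaked, uniform, w)
assert w > 0.9                 # ...so the peaked model gains weight
assert abs(sum(p) - 1.0) < 1e-9
```

Because the mix is linear and the weights are clamped to [0,1], the mixed vector stays a valid distribution without renormalization; a logistic update of these same weights would be a separate experiment.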
    Some examples of linear vs logistic mixing results with optimized parameters:

    Btw, as to FIFO huffman vs FSE, there's also my "fritcode" idea.
That is, if we work with "bits" that have a non-uniform probability (i.e. p0=0.9, p1=0.1), it becomes
possible to build optimal bitcodes for non-power-of-2 probabilities.
Then the skewed bitcode can be encoded with a secondary entropy coder - with all bits having the same probability, it should be simple/fast enough.
    There's more in this post - .
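To see why skewed "bits" help, note that each one carries a fractional number of bits, so codeword costs need not be integers (a quick Python check, assuming the p0=0.9 / p1=0.1 example above):

```python
import math

p0, p1 = 0.9, 0.1
c0 = -math.log2(p0)   # cost of a "0" in bits, about 0.152
c1 = -math.log2(p1)   # cost of a "1" in bits, about 3.32

def cost(word):
    # total cost of a skewed-bitcode word, in fractional bits
    return sum(c0 if b == "0" else c1 for b in word)

# with uniform bits, "000" always costs 3 bits; with skewed bits it
# costs ~0.456 bits -- a good match for a symbol of probability ~0.73,
# which no power-of-2 codeword length can represent
assert 0.45 < cost("000") < 0.46
assert 2 ** -cost("000") > 0.7
```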

  5. Thanks:

    RamiroCruzo (2nd July 2017)

