
Thread: Recursive LZ

  1. #1
    Member chornobyl (joined May 2008, ua/kiev, 153 posts)

    Recursive LZ

    The idea is simple: create a compressor that can compress its own output
    (not endlessly, of course, but at least several times).
    To make this possible, a few things are needed:
    - avoid entropy coders
    - standardise input and output (bytewise)
    - don't mix different codes together

    As an example we may take Matt's BARF, due to its simplicity
    (its LZ section, of course, not the filename tricks).

    A 2-byte match within a distance of 224 is replaced with 1 byte,
    but all codes are written to a single stream, which is no good.
    This results in only 0.7% compression for the second pass.
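
    To make this concrete, here is a minimal sketch of such a single-stream, byte-oriented LZ pass. It follows the description above (a 2-byte match within 224 bytes replaced by one code byte, literal runs kept short), but the exact code layout and the function name are my own illustration, not Matt's actual BARF format:

    ```cpp
    #include <algorithm>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Toy single-stream, byte-oriented LZ in the spirit of BARF's LZ section.
    // NOTE: hypothetical layout, not the real BARF format:
    //   code 32..255 : copy 2 bytes from distance (code - 31), i.e. 1..224 back
    //   code 0..31   : a run of (code + 1) literal bytes follows (1..32)
    std::vector<std::uint8_t> compress_single_stream(const std::vector<std::uint8_t>& in) {
        std::vector<std::uint8_t> out;
        std::vector<std::uint8_t> lits;   // pending literal run

        auto flush_lits = [&]() {
            std::size_t i = 0;
            while (i < lits.size()) {
                std::size_t n = std::min<std::size_t>(32, lits.size() - i);
                out.push_back(std::uint8_t(n - 1));                 // literal-run code 0..31
                out.insert(out.end(), lits.begin() + i, lits.begin() + i + n);
                i += n;
            }
            lits.clear();
        };

        std::size_t i = 0;
        while (i < in.size()) {
            bool matched = false;
            if (i + 1 < in.size()) {
                std::size_t maxd = std::min<std::size_t>(224, i);
                for (std::size_t d = 1; d <= maxd; ++d) {           // nearest 2-byte match
                    if (in[i] == in[i - d] && in[i + 1] == in[i + 1 - d]) {
                        flush_lits();
                        out.push_back(std::uint8_t(31 + d));        // match code 32..255
                        i += 2;
                        matched = true;
                        break;
                    }
                }
            }
            if (!matched) { lits.push_back(in[i]); ++i; }
        }
        flush_lits();
        return out;
    }
    ```

    Because literal bytes and code bytes are interleaved in `out`, a second pass over it sees exactly the lit+match mixture complained about below.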

    If we separate the literals from the match/literal markers, we get this:
    first-pass compression stays the same, but
    the second pass compresses better, because the stream of literals now looks more like the source file with some 2-byte sequences removed, instead of a mess of mixed literals and match/literal codes.
    It compresses because repeated strings that used to be more than 224 bytes apart now end up closer together and can be matched successfully.
    Moreover, the codes now stand on their own, and long runs of matches of the same length, or of the maximum literal length of 31, occur near each other, so they can be compressed better (with LZ) than by simply entropy coding them.
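
    Here is the same toy coder with the streams separated (again my own illustration, not BARF itself): the code bytes go to one stream and the raw literals to another, so the literal stream reads like the source with the matched 2-byte pairs removed, and each stream keeps its own regularities for the next pass:

    ```cpp
    #include <algorithm>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Same toy coder, but with the streams kept separate:
    //   codes : literal-run codes 0..31 and 2-byte match codes 32..255
    //   lits  : the raw literal bytes, in source order
    // Each stream can then be fed back into the compressor on its own.
    struct TwoStreams {
        std::vector<std::uint8_t> codes;
        std::vector<std::uint8_t> lits;
    };

    TwoStreams compress_two_streams(const std::vector<std::uint8_t>& in) {
        TwoStreams s;
        std::size_t run = 0;              // length of the pending literal run

        auto flush_run = [&]() {
            while (run > 0) {
                std::size_t n = std::min<std::size_t>(32, run);
                s.codes.push_back(std::uint8_t(n - 1));   // only the run length goes here
                run -= n;
            }
        };

        std::size_t i = 0;
        while (i < in.size()) {
            bool matched = false;
            if (i + 1 < in.size()) {
                std::size_t maxd = std::min<std::size_t>(224, i);
                for (std::size_t d = 1; d <= maxd; ++d) {
                    if (in[i] == in[i - d] && in[i + 1] == in[i + 1 - d]) {
                        flush_run();
                        s.codes.push_back(std::uint8_t(31 + d));   // match distance
                        i += 2;
                        matched = true;
                        break;
                    }
                }
            }
            if (!matched) { s.lits.push_back(in[i]); ++run; ++i; }
        }
        flush_run();
        return s;
    }
    ```

    A container would still have to store both streams (for example with a length prefix) so the decoder can interleave them again; that detail is left out of the sketch.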

  2. #2
    Member compgt (joined Sep 2018, Philippines, 38 posts)
    I thought of a recursive compression algorithm before, too.

    I think it was LZ77.
