-
29th September 2008, 16:59
#1
Member
Recursive LZ
The idea is simple: create a compressor that can compress its own output
(not endlessly, of course, but at least several times).
To make this possible, a few things are needed:
- refrain from using entropy coders
- standardise input and output (bytewise)
- don't mix different codes together
As an example we can take Matt's BARF, due to its simplicity
(its LZ section, of course, not the filename tricks).
A 2-byte match within a distance of 224 is replaced with 1 byte,
but all codes are written to a single stream, which is no good.
This results in only 0.7% compression on the second pass.
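The single-stream coding described above can be sketched as a toy byte-aligned LZ. This is a hypothetical format in the spirit of the description, not BARF's actual coder: token bytes 0..223 encode a 2-byte match at distance token+1, tokens 225..255 carry a literal run of 1..31 bytes inline, and token 224 is left unused.

```python
# Toy single-stream byte-aligned LZ (assumption: inspired by the BARF
# description above, NOT BARF's real format).
#   token < 224      -> copy 2 bytes from distance token+1
#   token 225..255   -> literal run of (token-224) bytes follows inline
#   token 224        -> reserved / unused here

def compress(data: bytes) -> bytes:
    out = bytearray()
    lits = bytearray()              # pending literal run

    def flush():
        nonlocal lits
        while lits:
            chunk, lits = lits[:31], lits[31:]
            out.append(224 + len(chunk))    # literal-run token
            out.extend(chunk)

    i = 0
    while i < len(data):
        found = -1
        if i + 1 < len(data):
            lo = max(0, i - 224)
            # rightmost occurrence of the next 2 bytes within distance 224
            pos = data.rfind(data[i:i + 2], lo, i + 1)
            if lo <= pos < i:
                found = i - pos             # distance 1..224
        if found != -1:
            flush()
            out.append(found - 1)           # match token 0..223
            i += 2
        else:
            lits.append(data[i])
            i += 1
    flush()
    return bytes(out)

def decompress(code: bytes) -> bytes:
    out = bytearray()
    i = 0
    while i < len(code):
        t = code[i]; i += 1
        if t < 224:                         # 2-byte match, distance t+1
            d = t + 1
            out.append(out[-d]); out.append(out[-d])
        else:                               # literal run of t-224 bytes
            n = t - 224
            out += code[i:i + n]; i += n
    return bytes(out)
```

Note the key property the post relies on: the output is plain bytes with no entropy coding, so it can be fed straight back into the same compressor.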
If we separate the literals from the match/literal markers, we'll get:
the first-pass compression will be the same, but
the second pass will compress better, because the stream of literals will look more like the source file with some 2-byte sequences removed, instead of a literal+match/literal mess.
It will compress because repeated strings that used to be more than 224 bytes apart now end up closer together and can be matched successfully.
Moreover, the codes now stand independently, and long runs of matches of the same length, or of the maximum literal length of 31, occur near each other, so they can be compressed better (with LZ) than if they were simply entropy coded.
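The proposed separation could look like the sketch below (again hypothetical; `compress2`/`decompress2` are made-up names, and the 4-byte length header is one arbitrary way to join the streams): the same tokens go to one stream and the raw literal bytes to another, so the literal stream stays close to the source text and the token stream keeps its runs together for a second pass.

```python
import struct

# Two-stream variant of the toy scheme above (assumption: a sketch of the
# separation idea, not an existing format). Tokens and raw literals are
# kept apart, then concatenated behind a little-endian length header so
# the decompressor can split them again.

def compress2(data: bytes) -> bytes:
    tokens = bytearray()        # match distances + literal-run lengths
    lits = bytearray()          # raw literal bytes, in order
    run = 0                     # length of the pending literal run

    def flush():
        nonlocal run
        while run:
            n = min(run, 31)
            tokens.append(224 + n)      # literal-run length token
            run -= n

    i = 0
    while i < len(data):
        found = -1
        if i + 1 < len(data):
            lo = max(0, i - 224)
            pos = data.rfind(data[i:i + 2], lo, i + 1)
            if lo <= pos < i:
                found = i - pos
        if found != -1:
            flush()
            tokens.append(found - 1)    # match token 0..223
            i += 2
        else:
            lits.append(data[i])
            run += 1
            i += 1
    flush()
    return struct.pack("<I", len(tokens)) + bytes(tokens) + bytes(lits)

def decompress2(blob: bytes) -> bytes:
    (tlen,) = struct.unpack_from("<I", blob)
    tokens = blob[4:4 + tlen]
    lits = blob[4 + tlen:]
    out = bytearray()
    li = 0
    for t in tokens:
        if t < 224:                     # 2-byte match, distance t+1
            d = t + 1
            out.append(out[-d]); out.append(out[-d])
        else:                           # take t-224 bytes from lits
            n = t - 224
            out += lits[li:li + n]; li += n
    return bytes(out)
```

Because each stream is still plain bytes, either stream (or the whole blob) can be run through the compressor again, which is the recursive behaviour the post is after.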
-
29th September 2018, 13:28
#2
I thought of a recursive compression algorithm before, too.
I think it was LZ77.
Last edited by compgt; 25th July 2019 at 17:27.