The idea is simple: create a compressor that can compress its own output
(not endlessly of course, but at least several times)
To make this possible a few things are needed:
-avoid entropy coders
-standardise input and output (bytewise)
-don't mix different codes together

As an example we may take Matt's BARF due to its simplicity
(its LZ section of course, not the filename tricks)

A 2-byte match within a distance of 224 is replaced with 1 byte,
but all codes are written to a single stream, which is no good:
this results in only 0.7% compression on the second pass

If we separate literals from match/literal markers we'll get:
the first-pass compression stays the same, but
the second pass compresses better, because the stream of literals now looks like the source file with some 2-byte sequences removed, instead of a literal + match/literal mess
It compresses because repeated strings that used to be more than 224 bytes apart move closer together and can now be matched successfully
Moreover, the codes now stand independently, and long runs of matches of the same length, or of the maximum literal length of 31, occur nearby, so they can be compressed better (with LZ) than simply entropy coded.
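The two-stream split can be sketched on top of the same toy scheme (again a hypothetical layout, not BARF's actual format):

```python
def compress_split(data: bytes) -> tuple[bytes, bytes]:
    # Same toy scheme, but literal bytes go to their own stream;
    # the token stream carries only run-length and match codes.
    tokens = bytearray()
    literals = bytearray()
    pending = 0  # literals not yet announced in the token stream

    def flush_literals():
        nonlocal pending
        while pending:
            n = min(pending, 31)
            tokens.append(n)   # run marker only; the bytes live elsewhere
            pending -= n

    i = 0
    while i < len(data):
        if i + 1 < len(data):
            pos = data.rfind(data[i:i + 2], max(0, i - 224), i + 1)
            if 0 <= pos < i:
                flush_literals()
                tokens.append(31 + (i - pos))
                i += 2
                continue
        literals.append(data[i])
        pending += 1
        i += 1
    flush_literals()
    return bytes(tokens), bytes(literals)


def decompress_split(tokens: bytes, literals: bytes) -> bytes:
    out = bytearray()
    j = 0
    for c in tokens:
        if c <= 31:                 # take c bytes from the literal stream
            out += literals[j:j + c]
            j += c
        else:                       # 2-byte match at distance c - 31
            d = c - 31
            out.append(out[-d])
            out.append(out[-d])
    return bytes(out)
```

The literal stream is just the source with the matched 2-byte pairs cut out, so repeats that were more than 224 bytes apart drift within range for the next pass; the token stream is runs of small marker values, which an LZ pass also handles well.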