What's the state of the art in patch/delta compression?
e.g. you previously sent a large (100 MB - 1 GB) archive, and now you want to update a few parts of it. The previous archive can be used as context for compressing the patch.
Back in the day, most of the serious command-line compressors had an option to precondition their model on a file (e.g. ACB, Rkive, PPMZ), but after a bit of searching I don't see it as an option in any of the modern/mainstream tools I checked (RAR, FreeArc, 7z).
The most obvious way to do patch compression is to use a dictionary coder and preload the dictionary with the previous archive. I'm very surprised that the mainstream archivers don't seem to offer this option; maybe I'm just missing it.
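To illustrate the preloaded-dictionary idea at toy scale: Python's stdlib `zlib` does expose a preset dictionary via the `zdict` argument to `compressobj`/`decompressobj`. DEFLATE's window is only 32 KiB (only the tail of the dictionary is usable), so this is a small-scale sketch of the concept, not a solution for 100 MB - 1 GB archives.

```python
import random
import zlib

# "Previous archive": incompressible data small enough to fit DEFLATE's
# 32 KiB window, so the whole thing acts as the preset dictionary.
rng = random.Random(0)
old = rng.randbytes(20000)

# "New archive": the old data with a small edit in the middle.
new = old[:10000] + b"A FEW CHANGED BYTES" + old[10000:]

# Baseline: compress the new data with no context. Random data is
# incompressible, so this is roughly len(new) bytes.
plain = zlib.compress(new, 9)

# Patch: same compressor, but preloaded with the old data as a
# preset dictionary. The new data is coded almost entirely as
# matches against the dictionary.
c = zlib.compressobj(9, zlib.DEFLATED, 15, 9,
                     zlib.Z_DEFAULT_STRATEGY, zdict=old)
patch = c.compress(new) + c.flush()

# The receiver must supply the same dictionary to decode.
d = zlib.decompressobj(zdict=old)
assert d.decompress(patch) == new

print(len(plain), len(patch))  # patch is a small fraction of plain
```

The same scheme at large scale is exactly what's missing from the mainstream archivers: a coder whose match window can be preloaded with gigabytes of prior data.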
Any advice?
A related question: has there been any clever work on very-large-window matching? Say I want to find LZ77 string matches in a 1 GB window, and I'm okay with a large minimum match length, 16 or so. Is there a better way to do that than the standard matchers? And how about out-of-core string matching for files larger than memory? e.g. you have two 1 TB files that are nearly identical and you wish to compress the differences.
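One standard trick that a large minimum match length buys you: you don't need to index every position of the window. If you hash a fixed-size gram at every STEP-th position of the reference, any match of length >= GRAM + STEP - 1 is guaranteed to cover at least one sampled gram, so the index shrinks by a factor of STEP; found matches are then extended forward and backward to their full extent. The same idea streams naturally for out-of-core use, since only (hash, position) pairs need to stay in memory. A minimal sketch (the function names and parameters are mine, not from any particular tool):

```python
import random

GRAM = 16  # bytes hashed per index entry
STEP = 8   # sample every STEP positions of the reference

def build_index(ref: bytes) -> dict:
    # Map gram -> list of reference positions. A real implementation
    # would store a 64-bit hash plus position, not the raw bytes.
    index = {}
    for pos in range(0, len(ref) - GRAM + 1, STEP):
        index.setdefault(ref[pos:pos + GRAM], []).append(pos)
    return index

def find_matches(index, ref, target, min_len=GRAM + STEP - 1):
    # Greedy scan of the target; any match >= min_len must cover
    # a sampled gram, so sampling loses nothing above that length.
    matches = []
    i = 0
    while i + GRAM <= len(target):
        for rpos in index.get(target[i:i + GRAM], ()):
            # extend forward
            length = GRAM
            while (i + length < len(target) and rpos + length < len(ref)
                   and target[i + length] == ref[rpos + length]):
                length += 1
            # extend backward (the sample may land mid-match)
            back = 0
            while (i - back > 0 and rpos - back > 0
                   and target[i - back - 1] == ref[rpos - back - 1]):
                back += 1
            if length + back >= min_len:
                matches.append((i - back, rpos - back, length + back))
                i += length  # skip past the match
                break
        else:
            i += 1
    return matches

# Demo: embed a 500-byte slice of the reference inside unrelated data.
rng = random.Random(42)
ref = rng.randbytes(4096)
other = rng.randbytes(100)
target = other[:50] + ref[100:600] + other[50:]
print(find_matches(build_index(ref), ref, target))
```

For the 1 TB out-of-core case, I'd expect the same shape: one sequential pass over file A emitting sampled hashes, an external sort or big hash table of those, then one pass over file B probing and extending; only seeks at match candidates touch the files non-sequentially.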