I was thinking about the concept of a "practical archiver": a program that lets you choose among several degrees of compression, trading your system's resources and time for ratio.
It is well established that the best approach to lossless compression is to first understand what you are trying to compress, then adapt the method to the data. That's what makes FreeArc, NanoZip or ZPAQ efficient archivers compared to, say, a plain .tar.bz2. And that's also why they say data compression is an AI problem.
Leaving parsing and recognition aside for the moment, I realized that what most archivers do today is far from optimal when it comes to still images. For example, FreeArc uses the TTA algorithm for PCM audio, which is great. But it uses mm+grzip to compress *.BMP, and it does so in NON-SOLID mode, which is a catastrophe when you have a bunch of small images.
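To illustrate why non-solid mode hurts with many small files, here is a minimal sketch using Python's zlib as a stand-in compressor. The fake "images" are hypothetical stand-ins for a folder of similar BMPs; the point is only that compressing each file independently throws away cross-file redundancy, while a solid stream exploits it:

```python
import zlib

# Hypothetical stand-in for a folder of 50 small, similar BMP files.
files = [b"BMHEADER" + bytes([i]) + b"\x00" * 190 for i in range(50)]

# Non-solid: each file compressed independently, so the redundancy
# shared between files is never seen by the compressor.
non_solid = sum(len(zlib.compress(f, 9)) for f in files)

# Solid: all files concatenated into one stream before compression,
# so repeated headers and pixel runs across files compress away.
solid = len(zlib.compress(b"".join(files), 9))

print("non-solid:", non_solid, "bytes; solid:", solid, "bytes")
```

With data this repetitive, the solid stream comes out far smaller than the sum of the per-file streams; real image sets show the same effect to a lesser degree.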
So, here is the question: Given the good number of image compressors out there, which one would be worth trying as a dedicated method in an archiver?
As I see it, the following conditions are needed, or at least desirable, in no particular order:
1) Open source, or available as a library
2) Asymmetric (decompression stays fast even if compression is slow)
3) Multi-threading capable
4) Stable and robust
5) Reasonably fast
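To make the "dedicated method" idea concrete, here is a hypothetical sketch of how an archiver might route files to such a codec by extension. All names, the `Codec` record, and the dispatch table are illustrative assumptions, not taken from FreeArc or any real archiver:

```python
from dataclasses import dataclass

@dataclass
class Codec:
    name: str
    asymmetric: bool      # fast decompression even if compression is slow
    multithreaded: bool

# Illustrative dispatch table: file extension -> dedicated codec.
CODECS = {
    ".bmp": Codec("image-codec", asymmetric=True, multithreaded=True),
    ".wav": Codec("tta", asymmetric=False, multithreaded=True),
}
DEFAULT = Codec("lzma", asymmetric=True, multithreaded=True)

def pick_codec(filename: str) -> Codec:
    """Return the dedicated codec for a file, falling back to a general one."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    return CODECS.get(ext, DEFAULT)

print(pick_codec("photo.BMP").name)   # BMPs go to the dedicated image codec
print(pick_codec("readme.txt").name)  # everything else falls back to lzma
```

In practice the detection would look at file contents rather than extensions, but the dispatch structure is the same.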
Who's your candidate?