I'd like to invite a discussion of this subject.
Speaking about all aspects of an archiver: compression strength, compression and decompression speed, UI, reliability, use of new technologies (SSE, multi-threading, etc.), reasonable memory usage, and so on.
I think there is no ideal archiver, but my favorites are WinRAR and 7-Zip. Their strengths: the best GUIs, powerful CLI versions, the latest CPU optimizations, multi-threading support, support for the most widespread archive formats, extensive tweaking capabilities, and good default settings. In addition, 7-Zip can take advantage of large amounts of memory.
WinRAR's weak point is its small dictionary size, so it doesn't use most of the available memory; I think that's due to backward compatibility.
My favorite these days is 7-Zip. Using fast compression, a 16 MB dictionary, and a word size of 16, it beats WinRAR in both speed and compression most of the time. But its multimedia support is relatively weak (compared to WinRAR, uharc, and sbc), and it can't switch compression algorithms within a single archive during compression. Another bad property is very poor speed when compressing already-compressed data: most algorithms, except LZP and DEFLATE, just get stuck, and compression speed drops significantly.
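For reference, those settings should correspond to roughly the following 7-Zip command line, if I remember the LZMA switches correctly: a=0 selects fast mode, d sets the dictionary size, and fb is what the GUI calls word size. The archive and folder names here are just placeholders:

7z a -t7z archive.7z somedir -m0=LZMA:a=0:d=16m:fb=16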
Furthermore, I'd like to see the "thor ex" (as the speedy option) and "uharc mz" (as the fast option) algorithms included in 7-Zip. I know that LZP has relatively low decompression speed, but that doesn't matter to me.
One more interesting archiver I'd like to mention is FreeArc (http://freearc.narod.ru/). It uses the PPM, LZMA, BWT, and LZP algorithms but, unfortunately, doesn't switch between them dynamically within one archive.
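To make it clear what I mean by switching dynamically, here is a minimal sketch of the idea in Python. This is not code from any real archiver: the block size, codec set, and header layout are arbitrary assumptions, and the standard-library codecs just stand in for real implementations. Each block is tried with every codec, the smallest result wins, and incompressible blocks are stored raw:

import bz2, lzma, zlib

BLOCK_SIZE = 1 << 20  # 1 MB blocks; arbitrary choice for illustration

# Candidate codecs: (id byte, compress function). All from the Python
# standard library; a real archiver would plug in its own algorithms.
CODECS = [
    (b'D', zlib.compress),   # DEFLATE-style
    (b'B', bz2.compress),    # BWT-based
    (b'L', lzma.compress),   # LZMA
]

def compress_adaptive(data: bytes) -> bytes:
    """Compress each block with every codec and keep the smallest result."""
    out = bytearray()
    for start in range(0, len(data), BLOCK_SIZE):
        block = data[start:start + BLOCK_SIZE]
        best_id, best = b'R', block  # 'R' = stored raw (incompressible block)
        for codec_id, fn in CODECS:
            candidate = fn(block)
            if len(candidate) < len(best):
                best_id, best = codec_id, candidate
        # Record which codec won and the payload length, then the payload.
        out += best_id + len(best).to_bytes(4, 'little') + best
    return bytes(out)

As a bonus, the raw fallback also solves the already-compressed-data problem I complained about above: a JPEG or ZIP block simply gets stored instead of choking the compressor.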
So I want to ask: can anyone build one practical archiver that uses the rising power of recent machines more effectively? Yes, it's cool to invent new algorithms and methods, but most people will never use your brilliant programs because of low speed and uncertain compatibility. If someone made a new archiver, or extended 7-Zip, it would be awesome! And please remember: you can hardly find a computer with less than 256 MB of memory; most have 512 MB or more.
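Even splitting the input into independent blocks and compressing them on several cores would be a start. Another hedged sketch, again in Python with the stdlib LZMA codec standing in for a real one (the block size and worker count are arbitrary):

from concurrent.futures import ProcessPoolExecutor
import lzma

def compress_parallel(data: bytes, block_size: int = 1 << 24,
                      jobs: int = 4) -> list[bytes]:
    # Split the input into independent blocks so each can be compressed
    # on its own core; the blocks don't share a dictionary.
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    with ProcessPoolExecutor(max_workers=jobs) as pool:
        return list(pool.map(lzma.compress, blocks))

# On Windows this must be called from under "if __name__ == '__main__':",
# because ProcessPoolExecutor spawns new interpreter processes.

Independent blocks lose the matches that cross block boundaries, so the ratio suffers a little, but the speedup is nearly linear in the number of cores, and with 16 MB blocks even four workers fit comfortably in a 256 MB machine.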
Thanks.