Hello all,
After a bit of experimenting with pipes I found a fairly quick method of compressing huge files at the cost of lots of memory. The attached batches simply run both commands in parallel, which is useful if you've got memory to burn and at least a dual-core machine. It also helps to get around the 2GB per-process memory limit on XP32. At the bottom there are some measurements.
Attached you will find 4 batches for use with rep 1.2a and 7z 9.07/ccm 1.30c.
The batches are really crude, with no error checking whatsoever; maybe somebody who is better at batch scripting can elaborate on them.
Usage is simple: batch file_to_compress.
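The attached batches themselves aren't reproduced here, but the basic idea can be sketched on a Unix-style shell. This is only an illustration: gzip stands in for the two real stages (rep as preprocessor, 7z/ccm as the back end), it is not meant as a replacement for them.

```shell
#!/bin/sh
# Sketch of the pipe idea with placeholder tools: the two compressor
# stages run as separate processes connected by a pipe, so a dual-core
# machine keeps both busy at the same time, and each process has its
# own address space (which is what sidesteps the 2GB limit on XP32).
in="$1"
# stage 1 (stand-in for rep) streams straight into stage 2 (stand-in for 7z)
gzip -1 -c "$in" | gzip -9 -c > "$in.double.gz"
```

Decompression is the same pipeline in reverse, with the decompressors swapped in.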
Unfortunately there are not many compressors/filters that can read and write pipes.
Maybe there is a wrapper somewhere that lets normal compressors use stdin/stdout? I dimly remember something similar existing for DOS/djgpp in the olden days.
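I don't know of a ready-made one, but a minimal wrapper is easy to sketch: spool stdin to a temp file, run the file-only compressor on it, then stream the result back to stdout. Again gzip is only a placeholder here; you would swap in the real file-only compressor.

```shell
#!/bin/sh
# Hypothetical stdin/stdout wrapper for a compressor that only works
# on files: buffer the incoming stream to a temp file, compress it,
# then copy the compressed file to stdout and clean up after ourselves.
tmp=$(mktemp) || exit 1
trap 'rm -f "$tmp" "$tmp.gz"' EXIT
cat > "$tmp"            # buffer the whole stream to disk
gzip -9 "$tmp"          # placeholder for the file-only compressor
cat "$tmp.gz"           # stream the result back out
```

The obvious downside is that you lose the parallelism and pay for a temp-file round trip, so it only helps for fitting a file-only tool into a pipeline, not for speed.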
Hope you find it useful. Best regards,
evg
Source:
Ubuntu karmic-desktop-i386.iso.pcf40, size 1.79GB (.iso 657MB)
made with precomp40 -slow -l3, took some hours

fa 0.52 (Sept 8 2009):
comp: 419984KB, 2097 seconds, memory used ~1.4GB
decomp: 177 seconds
switches: -mrep:a99:b1gb:h24+exe+delta+lzma:128mb:max:a1:273:mc256

rep7z (rep 1.2a + 7za 9.07b):
comp: 419760KB, 1176 seconds, memory used ~2.5GB (Task Manager)
decomp (unrep7z): 133 seconds
similar settings to fa, see batch

All files md5-checked OK.
Timings on Athlon X2 5600 / 4GB RAM, XP32 SP3.