I just tried my development version (that is why this service is popular). I plan to improve binary file compression and then release it.
Yes, I added decompression time and a compare-OK check (file size only) at http://www.metacompressor.com/uploads.aspx, where the results of succeeded jobs are listed.
Originally Posted by Bulat Ziganshin
Pareto frontier (decompression process time). Test file 1 from http://www.metacompressor.com/uploads.aspx.
Code:
SIZE          CTM   CPTM       DTM   DPTM       COMPRESSOR
399,933,429    204  13:18.974    26  04.196     lzturbo 0.92 | -19
374,203,844    212  13:50.440    25  04.570     lzturbo 0.92 | -29
357,376,380    675  11:14.517    20  05.288     cabarc | -m lzx:21 N
267,109,958    240  03:57.433    33  07.659     tornado 0.4a | -11
266,715,072    541  08:58.999    33  08.392     tornado 0.3 | -11
263,651,514    987  16:21.745    32  08.486     tornado 0.3 | -12
248,744,915    688  15:25.569    35  17.097     7za 4.57 | -t7z -mx=9
236,817,357    266  04:28.368   240  03m46.435  arc 0.40 | -m6
232,225,857    292  04:42.861   257  04m02.550  arc 0.40 | -m7
183,236,508   1392  23:10.000  1413  23m32.823  paq9a 9a |
Note: lzturbo uses 4 cores at ~100%, so its process time should normally be divided by 4, making lzturbo the first general-purpose compressor that can decompress at > 1 GB/s (with proper I/O).
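The division described above can be sketched as a quick calculation. Note the assumption here: "process time" (DPTM) is total CPU time summed over all cores, so on 4 fully loaded cores the wall-clock time is roughly a quarter of it.

```python
# Assumption: DPTM is total CPU time across all cores, so wall-clock
# time on 4 fully loaded cores is roughly DPTM / 4.
cores = 4
dptm_cpu_seconds = 4.196  # DPTM for lzturbo 0.92 -19 from the table above

wall_clock = dptm_cpu_seconds / cores
print(f"effective wall-clock decompression time: {wall_clock:.3f} s")
```

With a roughly 1 GB test file decompressed in about a second of wall-clock time, this is where the "> 1 GB/s" figure comes from.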
- Tornado (modes -11 & -12) uses > 1.5 GB of RAM!
Tornado's -7 mode uses 64 MB for decompression and compresses better than cabarc. But your rating is based on time, not memory requirements.
You didn't write how much memory is required for 4-threaded -19 compression - it's also 1.5 GB, yes?
I wanted to report that I have run test 5 with Rings 1.3 several times, but got no result! Does it perhaps crash?
Please help me!
Bulat Ziganshin
lzturbo uses a maximum of ~230 MB per core for compression in mode
-11 (also < 1 GB for this test). The difference is that this mode can
also be used if a user has < 512 MB of RAM (run lzturbo in single-core mode), but to use tornado's -11 & -12 modes you must have 2 GB of RAM installed.
This is only a simple note: memory usage is also important to report.
Sorry, I meant lzturbo modes -19 & -29 (all modes with level 9).
They were all successfully submitted, but if a process runs for more than one hour and compression is not finished, the process is cancelled; there is no feedback for this yet.
Originally Posted by Nania Francesco Antonio
I added peak memory used during compression and peak memory used during decompression to http://www.metacompressor.com/uploads.aspx
I saw you submitted some new tests. After 9,970 seconds (2 hours 46 min) I manually cancelled the Rings 1.4 test version that was busy compressing test file 5; it had already created a 47,700,410,368-byte output file!
Originally Posted by Nania Francesco Antonio
Maybe it's an idea to stop compression and exit the application with an error when the output file size grows xx% bigger than the input file size.
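The suggested guard could look something like this minimal sketch. The 50% expansion limit, the function name, and the file paths are all assumptions for illustration; they are not part of metacompressor's actual implementation, and the real threshold ("xx%") was left unspecified above.

```python
import os

# Hypothetical expansion limit: abort once the output exceeds
# 150% of the input size. The actual threshold is a policy choice.
EXPANSION_LIMIT = 1.5

def should_abort(input_path: str, output_path: str) -> bool:
    """Return True when the growing output file has passed the limit.

    Intended to be polled periodically while the compressor runs.
    """
    in_size = os.path.getsize(input_path)
    out_size = os.path.getsize(output_path)
    return out_size > in_size * EXPANSION_LIMIT
```

The testing harness would poll this check every few seconds and kill the compressor process (reporting an application error) as soon as it returns True, instead of waiting for the one-hour timeout.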
The other tests succeeded and all showed improved compression ratios.
Ok! Thanks!
Just wanted to say big thanks, sportman! Your live-testing environment is awesome. Just a couple of minutes after the request all results were already online.
Can someone test BWT-based compressors (e.g. dark) with a big block size? For unknown reasons, Matt on LTCB uses block sizes that are 1/2, 1/3, or 1/4 of enwik9's size instead of the maximum possible dictionary size (the block size should be maximal unless the file isn't homogeneous; in that case it should be split into maximal homogeneous blocks).
I tried testing dark, but it failed.
That's because Dark v0.51 with p-b390mf gives the error "Not enough memory!"
Originally Posted by donkey7
There is only 2 GB of memory in the test system.
5N is too much for his testing system.
Originally Posted by donkey7
BBB, on the other hand, should handle the whole of enwik9 even on a 2 GB system.
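The 5N figure above can be checked with back-of-the-envelope arithmetic. The assumption is that a BWT implementation with a 32-bit suffix array needs roughly 5 bytes per input byte (a 4-byte index plus 1 data byte), which is why enwik9 does not fit in the 2 GB test system:

```python
# Assumption: BWT with a 32-bit suffix array needs ~5 bytes per
# input byte (4-byte suffix index + 1 data byte), i.e. the 5N figure.
enwik9 = 10**9               # enwik9 is exactly 1,000,000,000 bytes
ram = 2 * 2**30              # the 2 GB test system

bwt_need = 5 * enwik9        # ~5 GB needed for a single enwik9 block
max_block = ram // 5         # largest block a 5N compressor fits in 2 GB

print(bwt_need > ram)        # True: enwik9 in one block does not fit
print(max_block)             # roughly 429 MB, before any OS overhead
```

This matches the observed failure: a 5N compressor on a 2 GB machine is limited to blocks of roughly 400 MB, i.e. well under half of enwik9, whereas a compressor with a lower per-byte memory footprint can take the file in one block.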