
Thread: Compression speed benchmark

  1. #91
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,593
    Thanks
    801
    Thanked 698 Times in 378 Posts
    I just tried my development version (that's why this service is popular). I plan to improve compression of binary files and then release it.

  2. #92
    Member
    Join Date
    Aug 2008
    Location
    Planet Earth
    Posts
    1,040
    Thanks
    104
    Thanked 420 Times in 293 Posts
    Quote Originally Posted by Bulat Ziganshin
    Thanks for the decompression times! But I sent a lot of jobs today and they haven't appeared here.
    Yes, I added decompression time and a compare-OK check (file size only) at http://www.metacompressor.com/uploads.aspx where results of succeeded jobs are listed.

  3. #93
    Member
    Join Date
    Aug 2007
    Posts
    30
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Pareto Frontier (Decompression process time). Testfile 1 from http://www.metacompressor.com/uploads.aspx.

    Code:
     
           SIZE    CTM       CPTM    DTM       DPTM   (TM = elapsed seconds, PTM = process/CPU time)
    399,933,429    204  13:18.974     26     04.196   lzturbo 0.92 | -19
    374,203,844    212  13:50.440     25     04.570   lzturbo 0.92 | -29
    357,376,380    675  11:14.517     20     05.288   cabarc       | -m lzx:21 N
    267,109,958    240  03:57.433     33     07.659   tornado 0.4a | -11
    266,715,072    541  08:58.999     33     08.392   tornado 0.3  | -11
    263,651,514    987  16:21.745     32     08.486   tornado 0.3  | -12
    248,744,915    688  15:25.569     35     17.097   7za 4.57     | -t7z -mx=9
    236,817,357    266  04:28.368    240  03:46.435   arc 0.40     | -m6
    232,225,857    292  04:42.861    257  04:02.550   arc 0.40     | -m7
    183,236,508   1392  23:10.000   1413  23:32.823   paq9a 9a     |

    Note: lzturbo uses 4 cores at ~100%, so its process time should normally be divided by 4, which makes lzturbo the first general-purpose compressor that can decompress at more than 1 GB/s (given proper I/O).
    - Tornado (modes -11 & -12) uses more than 1.5 GB of RAM.
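
    For readers unfamiliar with the term, here is a minimal Python sketch (not part of metacompressor.com) of what "Pareto frontier" means for this table: a result is kept only if no other result is both smaller and faster to decompress. The (size, DPTM) pairs are copied from a few of the rows above; every row of the table already passes this filter, which is why the list is a frontier.

    Code:
# (compressed size in bytes, decompression process time in seconds, name)
results = [
    (399_933_429,    4.196, "lzturbo 0.92 -19"),
    (357_376_380,    5.288, "cabarc -m lzx:21"),
    (267_109_958,    7.659, "tornado 0.4a -11"),
    (248_744_915,   17.097, "7za 4.57 -mx=9"),
    (183_236_508, 1412.823, "paq9a"),
]

def dominated(a, b):
    # b dominates a if it is at least as small and as fast, and strictly better in one
    return b[0] <= a[0] and b[1] <= a[1] and (b[0] < a[0] or b[1] < a[1])

frontier = [r for r in results if not any(dominated(r, o) for o in results)]
for size, dptm, name in frontier:
    print("%13s  %9.3f s  %s" % (format(size, ","), dptm, name))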

  4. #94
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,593
    Thanks
    801
    Thanked 698 Times in 378 Posts
    Tornado's -7 mode uses 64 MB for decompression and compresses better than cabarc, but your rating is based on time, not memory requirements.

    You didn't write how much memory is required for 4-threaded -19 compression - it's also 1.5 GB, yes?

  5. #95
    Tester
    Nania Francesco's Avatar
    Join Date
    May 2008
    Location
    Italy
    Posts
    1,583
    Thanks
    234
    Thanked 160 Times in 90 Posts
    I wanted to report that I have launched Rings 1.3 on test 5 several times, but no result appears! Does it perhaps crash?
    Please help me!

  6. #96
    Member
    Join Date
    Aug 2007
    Posts
    30
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Bulat Ziganshin:
    lzturbo uses at most ~230 MB per core for compression in mode -11 (so also < 1 GB for this test). The difference is that this mode can also be used if a user has < 512 MB of RAM (by running lzturbo in single-core mode), whereas to use Tornado modes -11 & -12 you must have 2 GB of RAM installed.
    This is only a simple note; memory usage is also important to report.

  7. #97
    Member
    Join Date
    Aug 2007
    Posts
    30
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Sorry, I meant lzturbo modes -19 & -29 (all modes with level 9).

  8. #98
    Member
    Join Date
    Aug 2008
    Location
    Planet Earth
    Posts
    1,040
    Thanks
    104
    Thanked 420 Times in 293 Posts
    Quote Originally Posted by Nania Francesco Antonio
    I wanted to report that I have launched Rings 1.3 on test 5 several times, but no result appears! Does it perhaps crash?
    Please help me!
    They were all successfully submitted, but if a process runs for more than one hour and compression is not finished, the process is canceled; there is no feedback for this yet.
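
    For reference, a minimal sketch of that cancellation rule, assuming a Python-style harness; this is an illustration, not how metacompressor.com is actually implemented:

    Code:
import subprocess

def run_job(cmd, limit_seconds=3600):
    # Run one submitted job; cancel it if compression takes longer than the limit.
    try:
        subprocess.run(cmd, timeout=limit_seconds)
        return "finished"
    except subprocess.TimeoutExpired:
        return "canceled after %d s (no feedback is sent to the submitter yet)" % limit_seconds

# Hypothetical command line, for illustration only:
# print(run_job(["rings", "c", "testfile5", "testfile5.out"]))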

  9. #99
    Member
    Join Date
    Aug 2008
    Location
    Planet Earth
    Posts
    1,040
    Thanks
    104
    Thanked 420 Times in 293 Posts
    I added peak memory used during compression and peak memory used during decompression: http://www.metacompressor.com/uploads.aspx

  10. #100
    Member
    Join Date
    Aug 2008
    Location
    Planet Earth
    Posts
    1,040
    Thanks
    104
    Thanked 420 Times in 293 Posts
    Quote Originally Posted by Nania Francesco Antonio
    I wanted to report
    I saw you submitted some new tests. After 9,970 seconds (2 hours 46 min) I manually canceled the Rings 1.4 test version, which was busy compressing test file 5; it had already created a 47,700,410,368-byte output file!

    Maybe it's an idea to stop compression and exit the application with an error when the output file size gets xx% bigger than the input file size.

    The other tests succeeded and all had improved compression ratios.
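
    A rough sketch of that suggestion, again assuming a Python-style harness that polls the output file; the 10% margin and the 5-second polling interval are made-up numbers, not anything the site actually uses:

    Code:
import os
import subprocess
import time

def run_with_size_guard(cmd, in_path, out_path, margin=0.10, poll_seconds=5):
    # Cancel the job once the output file grows past (1 + margin) * input size.
    limit = os.path.getsize(in_path) * (1 + margin)
    proc = subprocess.Popen(cmd)
    while proc.poll() is None:                       # job still running
        time.sleep(poll_seconds)
        if os.path.exists(out_path) and os.path.getsize(out_path) > limit:
            proc.kill()
            return "canceled: output exceeded the input size by more than %.0f%%" % (margin * 100)
    return "finished with exit code %d" % proc.returncode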

  11. #101
    Tester
    Nania Francesco's Avatar
    Join Date
    May 2008
    Location
    Italy
    Posts
    1,583
    Thanks
    234
    Thanked 160 Times in 90 Posts
    Ok! Thanks !

  12. #102
    Programmer
    Join Date
    Feb 2007
    Location
    Germany
    Posts
    420
    Thanks
    28
    Thanked 163 Times in 18 Posts
    Just wanted to say big thanks, sportman! Your live-testing environment is awesome. Just a couple of minutes after the request all results were already online.

  13. #103
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,505
    Thanks
    26
    Thanked 136 Times in 104 Posts
    Can someone test BWT-based compressors (e.g. dark) with a big block size? Matt, on LTCB, for some unknown reason uses block sizes that are 1/2, 1/3, or 1/4 of the enwik9 size instead of the maximum possible dictionary size (the block size should be maximal unless the file isn't homogeneous, in which case it should be divided into maximal homogeneous blocks).

    I tried testing dark but it failed.

  14. #104
    Member
    Join Date
    Aug 2008
    Location
    Planet Earth
    Posts
    1,040
    Thanks
    104
    Thanked 420 Times in 293 Posts
    Quote Originally Posted by donkey7
    I tried testing dark but it failed.
    That's because Dark v0.51 with p-b390mf gives the error "Not enough memory!"

    There is only 2 GB of memory in the test system.

  15. #105
    Member
    Join Date
    Dec 2006
    Posts
    611
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by donkey7
    big block size
    5N is too much for his testing system. BBB should, on the other hand, handle the whole of enwik9 even on a 2 GB system.
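
    A quick back-of-the-envelope check of that claim, assuming dark's one-block BWT needs roughly 5 bytes of RAM per input byte (the "5N" above) and that BBB's slow mode needs about 1.25 bytes per input byte (a figure from BBB's own documentation, not from this thread):

    Code:
N = 10**9                                         # enwik9 is 1,000,000,000 bytes
print("dark, ~5N   :", 5.00 * N / 2**30, "GiB")   # ~4.66 GiB, does not fit in 2 GB RAM
print("bbb,  ~1.25N:", 1.25 * N / 2**30, "GiB")   # ~1.16 GiB, fits in 2 GB RAM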
