Page 10 of 10
Results 271 to 281 of 281

Thread: NanoZip - a new archiver, using bwt, lz, cm, etc...

  1. #271
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    Could be, but we had this conversation in 2011 when I first wrote my book. I wrote a lot of detail about the Calgary corpus because it is cited in papers going back 30 years, and less about newer benchmarks. Sami seemed very sensitive to any perceived criticism by me of his work (which I thought was quite good). I did expand my brief mention to a whole section on his benchmark at his insistence after he accused me of being biased against his work. (I did write a lot about my own benchmarks, so I suppose he was right). But the fact is he has not updated his benchmark since 2012 and nobody else uses it, just like too many other newer benchmarks that were abandoned.

    I don't know what Sami is doing now. With his technical skills I am sure he could be very successful.

  2. #272
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    I just received the sad news in a private email that Sami Runsas died of a brain tumor in Jan. or Feb. 2013. He did not tell anyone about his illness until Dec. 2012 when his condition was hopeless. I think he was about 30 years old.

  3. #273
    Member
    Join Date
    May 2012
    Location
    United States
    Posts
    324
    Thanks
    182
    Thanked 53 Times in 38 Posts
    Quote Originally Posted by Matt Mahoney View Post
    I just received the sad news in a private email that Sami Runsas died of a brain tumor in Jan. or Feb. 2013. He did not tell anyone about his illness until Dec. 2012 when his condition was hopeless. I think he was about 30 years old.
    I had exchanged e-mails with him in September of 2014...
    Last edited by comp1; 20th June 2015 at 19:01.

  4. #274
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,507
    Thanks
    742
    Thanked 665 Times in 359 Posts

  5. #275
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    OK, now I am mystified.

  6. #276
    Tester
    Stephan Busch's Avatar
    Join Date
    May 2008
    Location
    Bremen, Germany
    Posts
    876
    Thanks
    472
    Thanked 175 Times in 85 Posts
    I cannot believe this

  7. #277
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    497
    Thanks
    228
    Thanked 84 Times in 64 Posts
    Quote Originally Posted by comp1 View Post
    I had exchanged e-mails with him in September of 2014...
    I just don't get it... What's going on?

  8. #278
    Member nikkho's Avatar
    Join Date
    Jul 2011
    Location
    Spain
    Posts
    546
    Thanks
    219
    Thanked 164 Times in 105 Posts
    Quote Originally Posted by Gonzalo View Post
    I just don't get it... What's going on?
    Same here. The site says it was updated in April 2014: http://nanozip.net/

  9. #279
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    873
    Thanks
    49
    Thanked 106 Times in 84 Posts
    I liked nanozip for its efficiency; it hit a good sweet spot for me. I just never really got Sami himself: he always seemed hostile when people weren't overpraising his software. Back when Christian (of RZM and CCM(z), I think it was) tried compressing text with a permuted alphabet to see whether nanozip used word-based dictionary prefiltering, and concluded that it looked like it did, Sami was quick to accuse him of just wanting to badmouth his software.
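    A minimal sketch of the permuted-alphabet probe described above (the function, the seed, and the zlib comparison are my own illustration, not Christian's actual test): a fixed bijective letter substitution leaves per-symbol statistics and match structure intact, so a purely statistical or LZ compressor is unaffected, while a word-based dictionary prefilter stops recognizing words and its ratio degrades.

    ```python
    # Hypothetical sketch of the "permuted alphabet" probe: relabel the
    # letters with a fixed random bijection. Entropy per symbol and repeat
    # structure are unchanged, so a compressor without a word dictionary
    # compresses both versions about equally; a word-based prefilter
    # would show up as a clear ratio gap on the permuted copy.
    import random
    import zlib

    def permute_alphabet(data: bytes, seed: int = 42) -> bytes:
        """Relabel a-z/A-Z with a fixed random bijection; other bytes pass through."""
        lower = bytes(range(ord('a'), ord('z') + 1))
        shuffled = list(lower)
        random.Random(seed).shuffle(shuffled)
        table = bytes.maketrans(lower + lower.upper(),
                                bytes(shuffled) + bytes(shuffled).upper())
        return data.translate(table)

    text = b"the quick brown fox jumps over the lazy dog " * 200
    plain = len(zlib.compress(text, 9))
    permuted = len(zlib.compress(permute_alphabet(text), 9))
    # zlib has no word dictionary, so the two sizes come out nearly equal.
    ```

    Running the same comparison through a suspect archiver, instead of zlib, is the whole test: a markedly worse ratio on the permuted file suggests a dictionary-based text prefilter.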

  10. #280
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    873
    Thanks
    49
    Thanked 106 Times in 84 Posts
    Funny note:
    I've just got access to my old backup archive again, and I'm actually now converting all nanozip -cO or -co archives back to 7-zip and optimizing them with m7zrepacker. It actually seems to create slightly smaller archives.
    Compression time is of course off the charts, but with the improved decompression time and fewer steps to handle, I much prefer it this way. It just shows there is still a lot of potential in 7-zip that remains unharnessed by its native compression software.

  11. #281
    Member
    Join Date
    Nov 2015
    Location
    France
    Posts
    7
    Thanks
    2
    Thanked 0 Times in 0 Posts
    I have tested nanozip in the scope of my own specific test, and as I'm still searching for the best archiver for it, I'm OK with splitting the discussion into another topic.

    The "corpus": 10+ GB of game data (thus not distributable), with most parts already compressed (audio/video or game-specific pack formats), hence a compression ratio of 80% with 7zip ultra and a 48 MB dictionary. I want to avoid long install times (updates, net-only data, ...), much like why one would want to back up his system besides documents.

    I'm interested in compression ratio and decoding speed. The benchmark uses 7zip as the anchor: around 84xx of data, 18 minutes to compress, 7 to decompress (no file output, around 27 MB/s of decompressed data). By the way, I'd like speedier decompression; please, no discussion on why it's unnecessary.
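    The quoted anchor figures are roughly self-consistent; a quick arithmetic check, under my own reading of the post ("84xx" taken as ~8.4 GB of archive, the 10+ GB input taken as ~11 GB):

    ```python
    # Rough consistency check of the quoted 7zip anchor figures.
    # Assumed values (the post only gives approximations): ~11 GB of
    # game data, an "84xx" (~8.4 GB) archive, 7 minutes to decompress.
    input_gb = 11.0
    archive_gb = 8.4
    decompress_s = 7 * 60

    ratio = archive_gb / input_gb                  # fraction of original size
    decode_mb_s = input_gb * 1000 / decompress_s   # decompressed-data rate

    print(f"ratio ~{ratio:.2f}, decode ~{decode_mb_s:.0f} MB/s")
    # → ratio ~0.76, decode ~26 MB/s
    ```

    Both numbers land close to the ~80% ratio and ~27 MB/s quoted above, so the figures hang together.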

    The archiving is for my own needs and system: Win64, 8 GB RAM, 6 cores. So memory and encoding speed are not much of an issue: encoding can be done overnight and use most of the memory (say, 6 GB). Decoding may have something running alongside, so 4 GB is probably the upper limit.

    I have looked at various benchmarks like Matt Mahoney's to sort out and keep what looked like the best performers, and produced this list: the *paq* family, 7zip, rar, nanozip, freearc.

    rar (actually, WinRAR 5.3) was impressive: 10 minutes compressing, ~3 minutes decompressing, an 86xx file. I had yet to decide whether I preferred it over 7zip (but then I found nanozip).

    freearc 0.666 was efficient but unfortunately too slow: at the ultra level, 1 h of encoding for an 80xx file, but 18 MB/s output. More asymmetric settings (or rather, speedier decoding) led to a lower compression ratio than 7zip's, with not enough improvement to justify it. If you have better settings or a better version...

    Then, nanozip. Here are more precise benchmarks (can't seem to prettyprint it):
    No code has to be inserted here.
    The results rate nanozip as probably the "best" for my case, certainly thanks to its good multithreading support (which 7zip lacks). Unfortunately, it falls short in several respects: "sustainability" of the data (development discontinued, the author's uncertain situation, past issues) and ease of use (I do prefer a GUI in the end).

    Do you have any suggestions (settings, version, or another archiver)? For now, GUI integration is less important, but availability of source code surely earns bonus points in my test.

    Thanks in advance
    Last edited by kurosu; 29th November 2015 at 11:54.


