
Thread: BCM v0.05 is here! [!]

  1. #1 - encode

    BCM v0.05 is here! [!]

    No comments!

    Enjoy!
    Attached Files

  2. #2 - Simon Berger
    I saw a test of version 0.04 compressing a big, already-compressed video file very well, and now I have tested it myself.
    It seems that BCM works very well on hardly compressible files, or at least on DivX files.

    733,030,400 bytes -> 713,688,280 bytes
    That's a lot in my view. LZMA had not gained a single MB by the middle of the file (I aborted it at that point), and CCM did not gain this much either: 720,320,558 bytes.

    EDIT:
    It seems to be BWT-based. Blizzard got it down to 713,568,995 bytes.

  3. #3 - Moderator


    Thanks Ilia!

  4. #4 - nanoflooder
    Great! Keep up the good work!

  5. #5 - Member


    Updated results of my test on a video DVD:

    2,194,866,176 (2.04 GB) - original ISO
    2,012,827,345 (1.87 GB) - WinRAR 3.80b5 (best, all filters)
    1,943,719,107 (1.80 GB) - FreeArc 0.50 (-mx -ld=1gb)
    1,930,752,565 (1.79 GB) - 7-Zip 4.60 (ultra, 128 MB dict)
    1,861,798,652 (1.73 GB) - CCMx 1.30c (model 7)
    1,822,283,097 (1.69 GB) - BCM 0.04
    1,821,674,227 (1.69 GB) - BCM 0.05 (-b65536)
    1,807,878,526 (1.68 GB) - BCM 0.05 (-b131072)
    1,792,963,738 (1.66 GB) - BCM 0.05 (-b307200)
    1,779,106,230 (1.65 GB) - NanoZip 0.06a (-cc -m64m)
    1,756,546,785 (1.63 GB) - NanoZip 0.06a (-cc -m1.5g)
    1,756,449,884 (1.63 GB) - NanoZip 0.06a (-cc -m2g)

    Very nice!! Especially considering that BCM is about 2-3x faster than nz_cm.

    I'm not an expert, but I was wondering: would it be possible to use solid blocks + a dictionary with this algorithm, like 7-Zip's LZMA?
    My guess is that it could improve the ratio on big files or sets of similar files.

  6. #6 - Simon Berger
    Why don't you test BCM's real competitors? Sure, it's interesting to see programs like CCM and 7z, for example, but it would be better to show results for Blizzard, BBB and MCOMP. I thought NanoZip also had a BWT implementation, but I didn't find anything like that.

  7. #7 - Matt Mahoney

  8. #8 - Shelwien
    I've got 171,857,720 with bcm -b406991 on enwik9.

  9. #9 - encode
    Actually, BCM is the strongest of all BWT compressors. Strangely, in some rare cases Blizzard may win, but more often it loses by a huge gap.

    A10.jpg
    BCM -> 824,920 bytes
    BLIZ -> 825,413 bytes
    BBB -> 827,834 bytes
    NZ -> 832,149 bytes
    MCOMP -> 833,195 bytes
    Original -> 842,468 bytes


  10. #10 - nanoflooder
    Quote Originally Posted by encode:
    Actually, BCM is the strongest of all BWT compressors. Strangely, in some rare cases Blizzard may win, but more often it loses by a huge gap.
    It is very competitive indeed. Though I think nz -cO is still usually better, both in performance and speed... But it's got some clever filters too, so there's room to improve. Do you plan on transforming it into a full-size archiver with solid archives, recursive directories, etc.?

  11. #11 - encode
    Quote Originally Posted by nanoflooder:
    It is very competitive indeed. Though I think nz -cO is still usually better, both in performance and speed... But it's got some clever filters too, so there's room to improve. Do you plan on transforming it into a full-size archiver with solid archives, recursive directories, etc.?
    As a note, as my tests have shown, NanoZip's optimal methods additionally use a few algorithms - i.e. not pure BWT only - along with cool filters, of course. And this is the key: have a bunch of compression methods and filters and choose the best/right one.
    Anyway, I have no such plans for the near future - it's too time-consuming and I have almost no spare time currently. I tested a few filters, like delta, with BCM on multimedia data such as pictures and audio, and it showed very nice results. So, as to BCM, I do plan a set of good filters, maybe a better CM, and some nice-to-have features...
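    For context, a delta filter simply replaces each sample with its difference from the previous sample of the same channel, which turns smooth picture/audio data into small values that compress better. Below is a minimal generic sketch of the idea - it is not BCM's actual filter, and the stride parameter (e.g. 3 for interleaved RGB, 2 for 16-bit mono audio) is only an assumption about how one might align channels.
    Code:
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Generic delta preprocessing: replace each byte by its difference from the
    // byte one "stride" earlier. The transform is exactly reversible.
    std::vector<uint8_t> delta_encode(const std::vector<uint8_t>& in, size_t stride) {
        std::vector<uint8_t> out(in.size());
        for (size_t i = 0; i < in.size(); ++i)
            out[i] = (i < stride) ? in[i] : uint8_t(in[i] - in[i - stride]);
        return out;
    }

    std::vector<uint8_t> delta_decode(const std::vector<uint8_t>& in, size_t stride) {
        std::vector<uint8_t> out(in.size());
        for (size_t i = 0; i < in.size(); ++i)
            out[i] = (i < stride) ? in[i] : uint8_t(in[i] + out[i - stride]);
        return out;
    }
    Running the forward filter before compression and the inverse after decompression is enough to test whether it helps on a given file.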

  12. #12 - Shelwien
    I asked encode to build a modified version, with VirtualAlloc and a different allocation order, and here are the results (enwik9):

    169,396,682 (with -b488282)
    169,428,575 (with -b508752)

    508752 is the maximum possible, as 508752*4/1024 = 1987M, and the first setting is close to half the file size.
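    For reference, the arithmetic behind that ceiling: -bN sets the block size in KiB, and the quoted formula implies roughly 4 bytes of working memory per block byte, so the block size is bounded by the roughly 2 GB of address space a 32-bit process gets. A tiny sketch of that calculation follows; the 4-bytes-per-byte figure is inferred from the post, not from BCM's sources.
    Code:
    #include <cstdio>

    int main() {
        // -bN is the block size in KiB; the post's estimate is N * 4 / 1024 MiB,
        // i.e. about 4 bytes of memory per block byte (assumed, not a spec).
        const long long limit_mib = 1990;          // ~2 GiB of usable 32-bit address space (assumed)
        const long long bytes_per_block_byte = 4;  // from "508752*4/1024 = 1987M"

        long long max_block_kib = limit_mib * 1024 / bytes_per_block_byte;
        std::printf("max -b value ~ %lld KiB\n", max_block_kib);        // ~509440 KiB
        std::printf("-b508752 uses ~%lld MiB\n", 508752LL * 4 / 1024);  // 1987 MiB
        return 0;
    }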

  13. #13 - encode
    I tested BCM with LZP preprocessing. Well, in some cases, like "bookstar" and "pht.psd", LZP may help; in others it does the opposite and seriously hurts compression, as with "world95.txt". Now I understand why Blizzard, with its simpler CM model, may still beat BCM in some cases - its LZP preprocessing is the key...
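    For readers who don't know LZP: it keeps a small table, indexed by a hash of the last few bytes, of where each context last occurred, and replaces long matches found at that predicted position before the main compressor runs. That is why it shines on highly repetitive inputs but can hurt files where it strips redundancy the BWT+CM stage would have modeled better anyway. Below is a toy sketch of the match-detection idea only - it is not Blizzard's or BCM's code, and the table size and minimum match length are made-up values.
    Code:
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Toy LZP pass: hash the previous 4 bytes to look up where that context last
    // occurred; if the data there matches the current data for long enough, a real
    // filter would emit a match length instead of literals. Here we only measure
    // how many bytes such a filter would cover, as a rough "will LZP help?" probe.
    size_t lzp_coverage(const std::vector<uint8_t>& buf) {
        const size_t MIN_MATCH = 32;                 // assumed threshold
        std::vector<uint32_t> table(1 << 16, 0);
        size_t covered = 0;
        for (size_t i = 4; i + MIN_MATCH < buf.size(); ) {
            uint32_t ctx = (uint32_t)buf[i-1] | ((uint32_t)buf[i-2] << 8)
                         | ((uint32_t)buf[i-3] << 16) | ((uint32_t)buf[i-4] << 24);
            uint32_t h = (ctx * 2654435761u) >> 16;  // 16-bit hash of the 4-byte context
            uint32_t p = table[h];
            table[h] = (uint32_t)i;
            size_t len = 0;
            if (p >= 4)
                while (i + len < buf.size() && buf[p + len] == buf[i + len]) ++len;
            if (len >= MIN_MATCH) { covered += len; i += len; } else ++i;
        }
        return covered;
    }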

  14. #14 - osmanturan
    For me, overall performance is what matters - not a single file. If you make it a bit faster while keeping the compression strength, that will be really good. I'm sure there must be some room for further improvements.
    BIT Archiver homepage: www.osmanturan.com

  15. #15 - pat357


    BCM is a very nice compressor! I like it. Congrats to you!
    Keep in mind that Bliz is 6M while BCM is only 5M! (Up to 500M blocks - that's the way I like it.)
    I wonder what sorting model you have used... something like Dark's?

    Do you have an idea of how the time is split between the sorting and the CM
    (on text-like files, for example)?
    (How much does the CM slow things down?)
    If I compare the timings for enwik9 with the timings from Dark, the CM slows things down a bit, but not too much. It seems a good balance to me.

    I hope, just like Osman says, you can squeeze out some time without hurting the good ratio.

    Oh, by the way, my calendar told me there should be a new BIT around by now... did I miss it?

  16. #16 - Black_Fox
    Hmm... encode obviously threw out the EXE filter...

  17. #17 - pat357


    Quote Originally Posted by Black_Fox:
    Hmm... encode obviously threw out the EXE filter...
    Why did he do so?

  18. #18 - osmanturan
    Quote Originally Posted by pat357:
    Oh, by the way, my calendar told me there should be a new BIT around by now... did I miss it?
    The next version will have a totally new design, including data fragmentation, deflate recompression, specialized audio/bitmap codecs, filters and a faster LWCX codec... I'm still working on the fragmentation and deflate recompression. I guess it will be ready by May or June.
    BIT Archiver homepage: www.osmanturan.com

  19. #19 - encode
    Quote Originally Posted by pat357:
    Hmm... encode obviously threw out the EXE filter...
    Why did he do so?
    I already explained that in my previous thread... The current EXE filter is bad on big blocks - I need a brand new EXE filter here.
    Also, it is not possible to make BCM's CM part faster at the same complexity - everything is already optimized as much as possible. Making it less complex but more efficient is probably possible, but I'm not sure. The current CM is a lightweight version of my CM, meaning it is already as simple as possible; adding extra stuff like SSE2 and dynamic mixing would improve compression even further, at the cost of compression/decompression speed - probably a very noticeable loss...
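    For context: "dynamic mixing" normally means adaptively weighting several model predictions, e.g. in the logistic domain, and SSE (secondary symbol estimation) is an adaptive table that refines the resulting probability afterwards. Here is a generic sketch of the mixing half only; it is not BCM's model, and the learning rate and model count are arbitrary.
    Code:
    #include <cmath>

    // Logistic mixing: convert each model's P(bit=1) (strictly between 0 and 1)
    // to the logistic ("stretch") domain, take a weighted sum, squash it back,
    // then nudge the weights toward whatever would have predicted the bit better.
    struct Mixer {
        static const int N = 2;          // number of models being mixed (assumed)
        double w[N] = {0.0, 0.0};        // mixing weights
        double st[N] = {0.0, 0.0};       // stretched inputs from the last mix()
        double p = 0.5;                  // last mixed prediction

        static double stretch(double x) { return std::log(x / (1.0 - x)); }
        static double squash(double x)  { return 1.0 / (1.0 + std::exp(-x)); }

        double mix(const double pr[N]) {
            double dot = 0.0;
            for (int i = 0; i < N; ++i) { st[i] = stretch(pr[i]); dot += w[i] * st[i]; }
            return p = squash(dot);
        }
        void update(int bit, double lr = 0.02) {   // learning rate is an assumption
            double err = bit - p;
            for (int i = 0; i < N; ++i) w[i] += lr * err * st[i];
        }
    };
    Call mix() with the sub-models' probabilities before coding each bit and update() with the bit actually coded; fast CM coders implement the same idea in fixed-point arithmetic, and the extra per-bit work is exactly the speed cost mentioned above.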

  20. #20 - encode

