
Thread: video compression (test)

  1. #1
    Member
    Join Date
    Aug 2009
    Location
    Canada
    Posts
    9
    Thanks
    0
    Thanked 0 Times in 0 Posts

    video compression (test)

    hi all!

    since my videos were taking a lot of space on my hard drive, i decided to see what today's compressors could do with these files

    first of all, i noticed that the compression ratio depends on the codec used in these videos (example: .mkv compresses way less than DivX)

    i will not include paq8 in my results unless there is a way to watch my videos without decompressing the archive (like with WinRar)

    i took a whole folder of .mkv videos and compressed it (size: 1*155*059*549 bytes)
    my results up to now:
    best: -- Nanozip 0.07 -cm -1000mb --- 1*128*418*608 bytes
    -- WinRar best setting solid archive --- 1*144*753*960 bytes
    -- FreeArc Ultra --- 1*150*272*831 bytes

    i tried Precomp 0.4 but it increased the final archive size by a few KB...

    since Nanozip seems to be the best, i'll try to tweak it with different settings,
    but i don't know how: when i try to open the command line, it closes right away. could this be because i have Windows 7 x64?

    any tips and suggestions will be greatly appreciated!

  2. #2
    Member Skymmer's Avatar
    Join Date
    Mar 2009
    Location
    Russia
    Posts
    681
    Thanks
    38
    Thanked 168 Times in 84 Posts
    Try Squeez with:
    Code:
    sqc -fmt SQX2 -m5 -MD32768 -uxx9
    sqc -fmt SQX2 -m5 -MD32768 -uxx9 -FMM2
    As for NanoZIP, try it with:
    Code:
    nz -nm -cc -m2g
    And how do you try to run it? By double-clicking nz.exe, or by running nz from a console window opened in the directory containing nz.exe?

  3. #3
    Tester
    Black_Fox's Avatar
    Join Date
    May 2008
    Location
    [CZE] Czechia
    Posts
    471
    Thanks
    26
    Thanked 9 Times in 8 Posts
    Hello and welcome to the forums!

    There are very few compressors that can compress already-compressed video (which covers pretty much every video you can download nowadays). I actually don't know of even one

    Note that MKV is not a codec, but a container (for example there can be an x264 video track, MP3 audio track and SRT subtitle track in one MKV container forming complete movie).

    The PCF output file from Precomp has to be compressed afterwards, as the data in it is inflated to make it more compressible - just in case you didn't do that - but Precomp presumably won't help here anyway.
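    For the curious, the Precomp idea can be sketched with Python's stdlib (zlib and lzma stand in for the codecs Precomp actually handles; the payload and the level-6 parameter are made up for illustration):

    ```python
    import lzma
    import random
    import zlib

    # A deflate stream whose repeats lie beyond deflate's 32 KB window:
    # deflate barely compresses it, but a stronger codec can.
    block = random.Random(42).randbytes(40_000)
    raw = block * 4
    deflate_stream = zlib.compress(raw, 6)       # what a file might contain

    # Step 1 (what Precomp does): inflate the stream back to raw data.
    recovered = zlib.decompress(deflate_stream)

    # Step 2: compress the raw, better-compressible data with a stronger codec.
    recompressed = lzma.compress(recovered, preset=9)

    # Step 3: restoring the original file bit-identically means re-running
    # deflate with the same parameters (so they must be stored; here: level 6).
    rebuilt = zlib.compress(lzma.decompress(recompressed), 6)
    assert rebuilt == deflate_stream
    print(len(deflate_stream), len(recompressed))  # lzma output is far smaller
    ```

    This is why the .pcf file must be compressed to see any gain, and also why Precomp can't help with H.264 and friends: their streams aren't deflate, so there is nothing for it to inflate.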

    Lastly, to open a command line, run "command prompt" (cmd.exe) from the start menu, go into the directory with NanoZip, then run it and a list of options will be displayed.
    I am... Black_Fox... my discontinued benchmark
    "No one involved in computers would ever say that a certain amount of memory is enough for all time? I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again." -- Bill Gates

  4. #4
    Member
    Join Date
    Aug 2009
    Location
    Canada
    Posts
    9
    Thanks
    0
    Thanked 0 Times in 0 Posts
    ok so i managed to make it work -- thanks a lot

    now, my results with an mkv video using H.264/AVC for video and Ogg Vorbis for audio:

    original -- 91*896*303 bytes

    nanozip 0.07 -cm -1024mb -- 91*851*186 bytes

    nz.exe -nm -cc -m1g -- 93*205*264 bytes


    isn't it strange? -cc and -cm are in fact the same setting!

    and i tried your m2g setting but i get an "out of memory" error, even though i have 5 GB of ram free...

    downloading squeez right now!


    EDIT: yes, i compressed the .pcf file and it's a few KB bigger than nanozip without precomp
    Last edited by Lone_Wolf; 6th January 2010 at 19:52.

  5. #5
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by Black_Fox View Post
    There are very few compressors that can compress already-compressed video (which covers pretty much every video you can download nowadays). I actually don't know of even one
    Ocarina Networks.

  6. #6
    Member
    Join Date
    Aug 2009
    Location
    Canada
    Posts
    9
    Thanks
    0
    Thanked 0 Times in 0 Posts
    i don't think they can do better than nanozip or paq8
    and yeah, businesses WILL save space since most of their files are text...

    nothing new there... and if they can decompress in realtime, i think we can say that their compression ratio is not phenomenal (cough*PAQ8*cough)


    and squeez gives me a larger file than the original...

  7. #7
    Tester
    Black_Fox's Avatar
    Join Date
    May 2008
    Location
    [CZE] Czechia
    Posts
    471
    Thanks
    26
    Thanked 9 Times in 8 Posts
    I believe they can, otherwise they couldn't afford to sell their products (and pay their scientists) when FA, NZ and many others are available for free.

    m^2: Thanks, too bad I can't try it at home...

  8. #8
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,501
    Thanks
    741
    Thanked 664 Times in 358 Posts
    Quote Originally Posted by Black_Fox View Post
    I believe they can, otherwise they couldn't afford to sell their products (and pay for their scientists) when there are FA, NZ and many others for free.
    you know that the commercial winzip and winrar have worse compression than fa/nz/7z

  9. #9
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by Lone_Wolf View Post
    i don't think they can do better than nanozip or paq8
    and yeah, businesses WILL save space since most of their files are text...

    nothing new there... and if they can decompress in realtime, i think we can say that their compression ratio is not phenomenal (cough*PAQ8*cough)


    and squeez gives me a larger file than the original...
    You're wrong, there's one new thing: codecs specializing in lossless recompression of lossy movies.
    If it's done properly, it could be far stronger than PAQ (far, if we count bytes saved, not bytes left). I don't know how it works in practice, though.

  10. #10
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by Bulat Ziganshin View Post
    you know that the commercial winzip and winrar have worse compression than fa/nz/7z
    This is a different case. Businesses evaluate such products, and if it didn't work, nobody would buy it. And they claim to have installations like 20 PB at Kodak.
    I don't know if anybody bought it for video recompression strength, but I think it works.

  11. #11
    Tester
    Black_Fox's Avatar
    Join Date
    May 2008
    Location
    [CZE] Czechia
    Posts
    471
    Thanks
    26
    Thanked 9 Times in 8 Posts
    Quote Originally Posted by Bulat Ziganshin View Post
    you know that the commercial winzip and winrar have worse compression than fa/nz/7z
    Yes, I know; frankly I wonder who still uses WinZip today. They can both survive because WinRAR is a family business (and also has a good GUI), while WinZip is backed by Corel, so neither has money issues. Reminds me of the AMD vs. Intel situation, where at one time AMD was both better and cheaper but people were still buying Intel

    Also, during some brief research I found a blog post where someone "provided a 2GB USB key with a varying array of pre-compressed formats such as jpg, tiff, mp3, Ocarina de-duplication and compression challenge pdf and zipped files. The outcome after being put through Ocarina's 'optimizers' was a 29% saving through the use of de-dupe and compression." - that doesn't sound bad. Now there is of course the question of how well precomp + some common archiver would work when adapted to the kind of hardware Ocarina uses.

  12. #12
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,501
    Thanks
    741
    Thanked 664 Times in 358 Posts
    Quote Originally Posted by m^2 View Post
    This is a different case. Businesses evaluate such products, and if it didn't work, nobody would buy it. And they claim to have installations like 20 PB at Kodak.
    I don't know if anybody bought it for video recompression strength, but I think it works.
    read carefully, i didn't say that it doesn't work, i said that it isn't necessarily better than nz
    Last edited by Bulat Ziganshin; 6th January 2010 at 23:01.

  13. #13
    Member
    Join Date
    Aug 2009
    Location
    Canada
    Posts
    9
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by m^2 View Post
    You're wrong, there's one new thing: codecs specializing in lossless recompression of lossy movies.
    If it's done properly, it could be far stronger than PAQ (far, if we count bytes saved, not bytes left). I don't know how it works in practice, though.
    what is that codec's name? i think it's worth a try

    what's positive about that is that you don't need to decompress it, so even if it's just on par (compression-wise) with PAQ, i'd take it

  14. #14
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by Lone_Wolf View Post
    what is that codec's name? i think it's worth a try

    what's positive about that is that you don't need to decompress it, so even if it's just on par (compression-wise) with PAQ, i'd take it
    That codec is called Native Format Optimization; at least I didn't see them naming individual parts of it. You need to buy the Ocarina suite to get it anyway. Or ask, maybe you'll get a free trial. They organized a pendrive party some time ago, maybe they'll do it again.

    ADDED:
    [OT]
    And BTW, I'm not affiliated with this company. I'm just impressed with their technology.
    And BTW, they probably use ZPAQ-like codecs embedded with data, because they claim that any reader can read any file. Cool.
    [/OT]
    Last edited by m^2; 6th January 2010 at 23:07.

  15. #15
    Member
    Join Date
    Aug 2009
    Location
    Canada
    Posts
    9
    Thanks
    0
    Thanked 0 Times in 0 Posts
    does anybody know of a software that would de-duplicate a file so i could compress the output file afterwards?

    unless compressors like Nanozip and Freearc already do de-duplication...

  16. #16
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by Lone_Wolf View Post
    does anybody know of a software that would de-duplicate a file so i could compress the output file afterwards?

    unless compressors like Nanozip and Freearc already do de-duplication...
    Dedup is just a form of compression. Most popular compressors use some form of it, though specialized dedup software works differently (and has different limitations).
    You could try to run srep before NZ; it's the closest thing you can get, though I don't expect you'll make any savings this way.
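    For intuition, the block-level dedup that specialized tools perform can be sketched like this (the fixed 4 KB chunk size and the sample data are illustrative; srep's actual matching and on-disk format differ):

    ```python
    import hashlib

    CHUNK = 4096  # fixed-size chunks; real dedup tools often use variable ones

    def dedup(data: bytes):
        """Split data into chunks; store each unique chunk only once."""
        store, refs, seen = [], [], {}
        for i in range(0, len(data), CHUNK):
            chunk = data[i:i + CHUNK]
            key = hashlib.sha256(chunk).digest()
            if key not in seen:
                seen[key] = len(store)
                store.append(chunk)
            refs.append(seen[key])      # one small reference per input chunk
        return store, refs

    def rebuild(store, refs) -> bytes:
        return b"".join(store[r] for r in refs)

    # A repeat at a distance far beyond deflate's 32 KB window:
    data = b"A" * 65536 + b"B" * 65536 + b"A" * 65536
    store, refs = dedup(data)
    assert rebuild(store, refs) == data
    print(len(store), len(refs))  # 2 unique chunks, referenced 48 times
    ```

    An archiver's rep filter does much the same thing inline before the main codec runs, which is why running a separate dedup pass first often adds little on top of NZ's own long-range matching.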

  17. #17
    Member
    Join Date
    Sep 2008
    Location
    France
    Posts
    863
    Thanks
    459
    Thanked 257 Times in 105 Posts
    From what i understand of Ocarina de-duplication, the logic behind it is that identical objects can be present several times in a large datacenter, a pretty cool example being a file attached to an email sent to a bunch of mailboxes hosted in the same datacenter. Same thing for an image or any content re-used several times across different websites hosted in the same datacenter.

    So obviously this logic suits large data storage centers very well, but it is less applicable to a single large file, let alone a film.

  18. #18
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    3,322
    Thanks
    209
    Thanked 1,001 Times in 526 Posts
    It might be a good idea to demux the streams from the container.
    In the case with mkv it can be done using
    http://www.bunkus.org/videotools/mkv....0.0-setup.exe
    To be specific, something like
    Code:
    mkvextract tracks 1.mkv 1:1.ogg 2:2.ogg
    mkvmerge -o 1out.mkv 1.ogg 2.ogg
    The first command extracts the streams, the second merges them back into a new mkv.

  19. #19
    Member
    Join Date
    May 2008
    Location
    England
    Posts
    325
    Thanks
    18
    Thanked 6 Times in 5 Posts
    Some tests i did a while back. Because of the nature of the already-compressed video/audio streams in the containers, i didn't bother playing with them during this test but focused on just using preprocessors instead. Plus, i've been playing a lot with rep:## recently and you can get massive gains (several % easily) over just using plain ol' +rep, which can even hurt compression. Numbers around ~24 seem to give the best results on the big tests i've done with JPEGs. In this test i had my scripts set to :24 as it was the best average number i had found in previous tests (just on JPEGs up to that point) and i don't think i tried any other numbers; in general it changes from file to file, though not by a lot. My notes, just as they were written:

    AVIs like m0+delta+rep:24
    MPGs like m0+rep:24
    WMVs like m0+rep:24
    MP4s like m0+delta+rep:24 so far, only tested 1 file
    FLVs depend on what codec is inside it? both m0+rep:24 and m0+delta+rep:24 give best

    I don't think i have any mkv's around, and these weren't massive files, just various misc videos collected over the years off the net from games/trailers/funnies etc. Big gains were seen on AVIs. MKV is a more efficient container, so there would be gains, but not as substantial, i'd imagine.

  20. #20
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,501
    Thanks
    741
    Thanked 664 Times in 358 Posts
    Quote Originally Posted by Intrinsic View Post
    AVIs like m0+delta+rep:24
    MPGs like m0+rep:24
    WMVs like m0+rep:24
    MP4s like m0+delta+rep:24
    "m0+" is redundant

  21. #21
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by Intrinsic View Post
    AVIs like m0+delta+rep:24
    MPGs like m0+rep:24
    WMVs like m0+rep:24
    MP4s like m0+delta+rep:24 so far, only tested 1 file
    FLVs depend on what codec is inside it? both m0+rep:24 and m0+delta+rep:24 give best
    You mention only containers. What matters are codecs. You can losslessly convert MPG to MP4 and the best mode won't change.

  22. #22
    Member
    Join Date
    Aug 2009
    Location
    Canada
    Posts
    9
    Thanks
    0
    Thanked 0 Times in 0 Posts
    sorry guys... i don't think i understood very well (i'm new to all this and i just started programming about 4 months ago, sorry)

    i've had an idea... would it be a good idea to store the positions of either the "0"s or the "1"s (binary) in an array and then generate a formula that gives these positions? we would just have to store the formula and a few other things
    we would only need the positions of the "0"s, for example, since we would fill the empty spaces with "1"s

    even if there is like 1 formula per MB we would still save a lot of space and time
    (the more answers there are, the longer it takes to compute the formula)

    now the only thing missing is a formula generator...
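    For what it's worth, the cost of just the position list can be estimated before even worrying about the formula generator (a back-of-the-envelope sketch; the block size is made up):

    ```python
    import math

    def position_list_bits(n_bits: int, n_zeros: int) -> int:
        # Each stored position needs enough bits to index any of n_bits places.
        index_width = max(1, math.ceil(math.log2(n_bits)))
        return n_zeros * index_width

    def counting_bound_bits(n_bits: int, n_zeros: int) -> float:
        # log2 C(n, k): the minimum bits for ANY scheme that only records
        # which k of the n positions hold a zero.
        return math.log2(math.comb(n_bits, n_zeros))

    n = 10_000          # bits in a small sample block
    k = n // 2          # random-looking data: about half the bits are zero
    print(position_list_bits(n, k))          # 70000 bits: 7x the raw data
    print(round(counting_bound_bits(n, k)))  # ~9993: barely under n itself
    ```

    So on random-looking (i.e. already-compressed) data, the position list alone costs more than storing the bits directly, and the counting bound says no encoding of those positions can beat raw storage; the idea can only win when the zeros are few or very regular, which is exactly the redundancy existing compressors already exploit.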

  23. #23
    Member
    Join Date
    Sep 2008
    Location
    France
    Posts
    863
    Thanks
    459
    Thanked 257 Times in 105 Posts
    now the only thing missing is a formula generator...
    just a little small issue really....

  24. #24
    Member Skymmer's Avatar
    Join Date
    Mar 2009
    Location
    Russia
    Posts
    681
    Thanks
    38
    Thanked 168 Times in 84 Posts
    Quote Originally Posted by Lone_Wolf View Post
    isn't it strange? -cc and -cm are in fact the same setting!

    and i tried your m2g setting but i get an "out of memory" error, even though i have 5 GB of ram free...
    First of all: where did you get this -cm switch? There is no such switch in NanoZIP. When I try to run with it I get:
    Code:
    Unknown argument: -cm
    As for "out of memory" error. Here is the quote from nanozip.net:
    Update Nov-25: Selecting more than 2 GB memory for compression is a known issue. Use less than 2 GB memory to avoid problems.
    Quote Originally Posted by Lone_Wolf View Post
    and squeez gives me a larger file than the original...
    It can happen. But Squeez gives me the best results with BIK format movies.

    Quote Originally Posted by m^2 View Post
    Ocarina Networks.
    Thanks! Never heard about it before.
    Just read some info about it and... what can I say. It's not available for mere mortals and compression enthusiasts, so it's hard to make any conclusions about it. By the way, the Wiki article states:
    Members of the company's technical advisory board include Bill Joy and Matt Mahoney.
    So maybe Matt can shed some light on it.

    Quote Originally Posted by Bulat Ziganshin View Post
    you know that commercial winzip and winrar has worse compression that fa/nz/7z
    This is a false statement and you know it yourself. How can you say such things?
    For example, JPEG compression. Call me pessimistic, but I believe that FA\NZ\7z will never be able to compress JPEGs as well as StuffIt does.
    Also a small BMP test: 120 24-bit BMP files totalling 411 599 424 bytes. Basically groups of similar pictures at different resolutions.
    Code:
    Original                           411 599 424
    StuffiT --recompression-level=2     73 680 612
    RAR -m5 -md4096 -s -mcc+           103 837 720
    ARC -mx                            113 029 145
    ARC -mmm:3*8:o54+grzip:m1           94 730 975
    It also shows the obvious thing: FA can't properly detect and assign good values for mm. Only with the exact option given does the compression ratio become fine, but it is still much worse than StuffIt's.

    And a small WAV test: a 44100\2ch\16bit image of 773 930 348 bytes.
    Code:
    ARC -mtta:m3    511 489 856
    WinZIP -ez      510 886 518
    No surprise here. WavPack always was (and is) better than TTA.

    Quote Originally Posted by m^2 View Post
    I don't know if anybody bought if for video recompression strength, but I think it works.
    Maybe it works, but only for a small number of files. More than 2000 video codecs exist, so can you believe that they created recompression support for even 10% of them?

  25. #25
    Tester
    Black_Fox's Avatar
    Join Date
    May 2008
    Location
    [CZE] Czechia
    Posts
    471
    Thanks
    26
    Thanked 9 Times in 8 Posts
    Skymmer... StuffIt being better than FA doesn't save WinRAR and WinZIP from having worse compression than FA

    Quote Originally Posted by Skymmer View Post
    This is the false statement and you know it for yourself. How you can tell such things?

    (...snip...)

    And small WAV test. 44100\2ch\16bit image of 773 930 348 bytes.
    Code:
    ARC -mtta:m3    511 489 856
    WinZIP -ez      510 886 518
    No surprise here. Wavpack always was (and is) better than TTA.
    WinZIP can be better at WAV files, but is it also comparably fast? I don't know how TTA compares to WV, thus the question.
    I still believe that for compression efficiency and also for maximum compression, FA can beat WinZIP and WinRAR anytime

    Quote Originally Posted by Skymmer View Post
    There are more than 2000 video codecs exist so can you belive that they create recompression support for at least 10% of them?
    That may be true, but I strongly believe no more than 5 of them are used in my entire 400 GB HD collection. Similar for audio; the most prominent codecs stay in mainstream use for a few years.
    Last edited by Black_Fox; 9th January 2010 at 13:59.

  26. #26
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    Quote Originally Posted by Skymmer View Post
    So maybe Matt can shed some light on it.
    I suppose. Ocarina compresses file systems, not files. It runs in the background compressing infrequently used files and decompresses on demand, all transparent to the user (except maybe some delay in retrieving compressed files). We don't have just a regular file compressor or archiver. (There is lots of free software for that).

    We often write custom compressors for individual customers. I've written custom software for several different image types, seismic data, and data from DNA sequencers.

    NFO (native format optimization) is a separate option for lossy compression. So far we only do this with JPEG. We discard headers and thumbnails, optimize the Huffman tables, and optionally reduce the image quality to produce smaller JPEG files. We can combine this with lossless JPEG compression (like PAQ but faster). Others do NFO by different names, like pngcrush or "jpegtran -optimize -progressive".

    We don't compress video yet, but it's something we may have soon.

  27. #27
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    333
    Thanks
    36
    Thanked 36 Times in 21 Posts
    precomp can compress videos, but only MJPEG ones (as i proved before) and no other codec yet, and this feature can easily be configured within FA. So FA can compete in one area. But let me refer to the AVI lossless demuxer i remember Shelwien posted (avip_demo.rar), which can improve compression a little. If it could be expanded to include the other containers mentioned above (also, 7-zip can unpack most FLV files into their streams, but not re-compress them), and taking into consideration that you have mp2 & mp3 preprocessors (slowly evolving), i think there is a chance that all of these together could fulfill the task.

    BTW: SoundSlimmer is due to issue a newer version in Feb 2010; i hope it adds the AAC, WMA and OGG formats

  28. #28
    Member Skymmer's Avatar
    Join Date
    Mar 2009
    Location
    Russia
    Posts
    681
    Thanks
    38
    Thanked 168 Times in 84 Posts
    Quote Originally Posted by Black_Fox View Post
    Skymmer... StuffIt being better than FA doesn't save WinRAR and WinZIP from having worse compression than FA
    Ok ok. Maybe I was a bit captious, but anyway I proved that commercial tools are not always worse than free ones. Anyway, I love FA\NZ\7z too, so please don't take it as offensive

    Quote Originally Posted by Matt Mahoney View Post
    I suppose. ....
    Thanks for the information, Matt! Now we know something at least.

  29. #29
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by Matt Mahoney View Post
    I suppose. Ocarina compresses file systems, not files. It runs in the background compressing infrequently used files and decompresses on demand, all transparent to the user (except maybe some delay in retrieving compressed files). We don't have just a regular file compressor or archiver. (There is lots of free software for that).

    We often write custom compressors for individual customers. I've written custom software for several different image types, seismic data, and data from DNA sequencers.

    NFO (native format optimization) is a separate option for lossy compression. So far we only do this with JPEG. We discard headers and thumbnails, optimize the Huffman tables, and optionally reduce the image quality to produce smaller JPEG files. We can combine this with lossless JPEG compression (like PAQ but faster). Others do NFO by different names, like pngcrush or "jpegtran -optimize -progressive".

    We don't compress video yet, but it's something we may have soon.
    Thanks a lot for the information.
    So are statements like
    Quote Originally Posted by http://www.ocarinanetworks.com/technology/nfo
    Ocarina's team of PhD researchers has carefully developed enhanced image and video coding algorithms
    just shameless lies? Or maybe you developed some algorithms that turned out to be useless and didn't make it to production? Or something else?

  30. #30
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    MPEG is still under development, but I have written compressors for JPEG (lossless and NFO) and some specialized image formats like EXR, SPM and some TIFF formats for Ocarina that are being used by customers.


