
Thread: JPEG Compression Test [April 2010]

  1. #1
    Member Skymmer's Avatar
    Join Date
    Mar 2009
    Location
    Russia
    Posts
    681
    Thanks
    38
    Thanked 168 Times in 84 Posts

    Lightbulb JPEG Compression Test [April 2010]

    Here is the 2nd round of the JPEG Compression Test. Basically, it was made because a new PackJPG has been released. But there are also new versions of every competitor, and we have a new competitor called JPACK from Infima. And after all, I always felt that the previous test was a little bit "unclean". Now I hope that everything has been done properly. Let's go...

    Testbed, system, tools
    As before, I used one photo session taken with a SONY DSC-W50 camera. It consists of 295 JPEG files. All files have 2816x2112 or 2112x2816 resolution, depending on orientation. The only difference is that all files were passed through
    Code:
    jpegtran.exe -copy none
    This was done to avoid the compatibility problems that appeared in the previous test. The original overall size is 599 548 561 bytes.
    None of the files are Progressive, but I'm not sure whether the Huffman tables are optimised or not. I have provided a couple of the files used below, so feel free to take a look and comment.
    The test was conducted on Win XP Pro SP3, AMD64 4000+ (Single Core). It was performed in a clean environment, i.e. with most services turned off and no background tasks running. The ConsMeter tool was used to measure time and memory consumption, more precisely the Total time and Peak working set size values. The Buzzer tool from LovePimple was also used.

    Competitors, versions, settings.
    Code:
    JPACK                                              -x
    PackJPG v2.3c                                      default
    PackJPG v2.4                                       default
    PackJPX v2.4                                       default
    WinZIP console v3.2 (Engine: 25.0.9069.0)          -ez
    PreComp v0.4.0 (PackJPG v2.4 WIP4)                 default
    StuffiT v14.0.0.16 (Engine: 14.0.2383.921)         --jpeg-no-thumbnails
    PAQ8px_68 (SSE2_wildcard from M4ST3R)              -8
    Now it's time to say something about JPACK. Please read here. Pay attention to:
    JPACK SDK is based on Infima's new and innovative neural networks compression method.
    Now let's look at the resulting archive. See the following screenshot. Don't you think we've seen a similar structure somewhere? Even more interesting is that they didn't mention PAQ anywhere. So, to me they look like shameless thieves.

    Resulting table
    Code:
                        size       comp.    comp.mem.    deco.    deco.mem.       %
                    -----------    -----    ---------    -----    ---------    ------
    Original        599 548 561      ***          ***      ***          ***     100 %
    JPACK           500 331 199     1589      9.8 MiB     1518      9.8 MiB    83.5 %
    PackJPG 2.3c    479 209 871      946     31.3 MiB      948     34.0 MiB    80.0 %
    WinZIP          476 073 338      551     16.9 MiB      525     16.1 MiB    79.4 %
    PackJPG 2.4     473 000 941      836     29.7 MiB      847     32.7 MiB    78.9 %
    PreComp 0.4.0   472 267 302     1100     27.9 MiB     1060     30.7 MiB    78.8 %
    PackJPX 2.4     467 048 758      804     33.8 MiB      802     35.2 MiB    77.9 %
    StuffIt         456 661 409      975     20.9 MiB      874     13.0 MiB    76.2 %
    PAQ8px_68       455 557 509    14900    899.2 MiB    14900    899.2 MiB    76.0 %
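    The % column above is just each result's size relative to the original; a quick sketch to reproduce it, with the sizes copied from the table:
    Code:
    ```python
    # Reproduce the % column of the results table: compressed size relative to original.
    original = 599_548_561  # bytes, overall size of the 295-file testbed

    results = {
        "JPACK":         500_331_199,
        "PackJPG 2.3c":  479_209_871,
        "WinZIP":        476_073_338,
        "PackJPG 2.4":   473_000_941,
        "PreComp 0.4.0": 472_267_302,
        "PackJPX 2.4":   467_048_758,
        "StuffIt":       456_661_409,
        "PAQ8px_68":     455_557_509,
    }

    for name, size in results.items():
        ratio = 100.0 * size / original
        print(f"{name:<14} {size:>12,}  {ratio:5.1f} %")
    ```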
    Progressive JPEG support table
    Let's see which competitors support Progressive JPEGs.
    Code:
    JPACK         No
    PackJPG       Yes
    WinZIP        No
    PreComp       Yes
    StuffiT       Yes
    PAQ8px        No
    Random thoughts
    First of all, PAQ now beats StuffIt. Sure, it's too slow and memory hungry, but anyway... Viva PAQ family
    It's also nice to see that PackJPG beats WinZIP. And now the strange things. For some reason PreComp (PackJPG 2.4 WIP4) still packs better than the final PackJPG 2.4. I suppose that's the result of a speed optimization that sacrifices compression ratio. But the most interesting thing is the PackJPX result. The question is:
    Why hasn't the same model been included in PackJPG?
    I saw that Raymond said something about "solid" behaviour. But the term "solid" is inapplicable here. If you look into the resulting SFX file you'll quickly notice the following structure:
    Code:
    SFX stub -> hex: 07 00 00 00 -> filename -> hex: 00 -> compressed data
    You can even easily cut out one of the blocks and the SFX will continue to work. Furthermore, here is another test, conducted on one big 50 533 219 byte JPEG file.
    Code:
    big_building.jpg    50 533 219
    big_building.pjg    43 170 370
    big_building.exe    42 923 450
    So, obviously, there is an improved model built into PackJPX. And I don't understand the reason for including the improved model in PackJPX only.

    Anyway... I hope the test is useful, and thanks for reading!

    EDIT: Almost forgot. Example files can be found here.
    Last edited by Skymmer; 28th April 2010 at 19:02.

  2. #2
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,474
    Thanks
    26
    Thanked 121 Times in 95 Posts
    Skymmer:
    Pay more attention to grammar & spelling. "Maded" is a vey bad mistake.

  3. #3
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    It's sad that you didn't make the test data more diverse.
    Photos taken with the same camera, at the same resolution, with the same compression setting, and probably without further processing make up a very specific case, and I would be very careful about extrapolating the results to all JPEGs.

    About PackJPX:
    The author told me that he put a working version of the 2.5 code in there by mistake.
    Last edited by m^2; 28th April 2010 at 19:58.

  4. #4
    Member
    Join Date
    Oct 2007
    Location
    Germany, Hamburg
    Posts
    408
    Thanks
    0
    Thanked 5 Times in 5 Posts
    Interesting test, thanks.
    One thing no one has tested yet is the lpaqjpgtest.exe you can get from the open source package Shelwien linked in the PackJPG thread. Maybe it can even beat paq8px.

  5. #5
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    566
    Thanks
    217
    Thanked 200 Times in 93 Posts
    Quote Originally Posted by Simon Berger View Post
    Interesting test, thanks.
    One thing no one tested yet is the lpaqjpgtest.exe you can get from the open source package Shelwien linked in the packjpg thread. Maybe this can even beat paq8px.
    I don't think so. Results for A10.jpg:

    Code:
    lpaqJPGtest: 751,026 Time: 52 s
    PackJPG 2.4: 680,430 Time: 4 s
    paq8o8 -3:   643,535 Time: 121 s
    It doesn't even look for JPG streams; e.g., a concatenation of two JPG files wasn't detected in my tests and was compressed with lpaq without preprocessing.

    A good thing about lpaqjpgtest could be to change it (combining the code with the other example program extrJPG) to implement a solid mode. Although it states that only single file processing is possible at the moment, it could be relatively easy to modify it so it decompresses multiple JPG files or even to detect JPG streams inside files.

    But it's questionable whether this makes any sense. Speed is better than with PAQ, but compression ratio is even worse than with PackJPG. Perhaps combining the code with some other compressor would be better for this type of data than using lpaq.
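    Detecting JPG streams inside files, as discussed above, basically comes down to scanning for the SOI marker (FF D8 FF) and the matching EOI marker (FF D9). A minimal sketch of the idea (simplified: a real detector would also have to parse segment lengths so that FF D9 bytes inside entropy-coded data or embedded thumbnails aren't mistaken for the end of a stream):
    Code:
    ```python
    def find_jpeg_streams(data: bytes):
        """Naive scan for JPEG streams: SOI marker FF D8 FF ... EOI marker FF D9.

        Simplified sketch only; real detection must parse segment lengths to
        avoid false EOI hits inside entropy-coded data or thumbnails.
        """
        streams = []
        pos = 0
        while True:
            start = data.find(b"\xff\xd8\xff", pos)
            if start < 0:
                break
            end = data.find(b"\xff\xd9", start + 3)
            if end < 0:
                break
            streams.append((start, end + 2))  # (start offset, end offset past EOI)
            pos = end + 2
        return streams

    # A concatenation of two tiny fake JPEG-like blobs is found as two streams:
    blob = b"\xff\xd8\xff\xe0payload\xff\xd9" + b"\xff\xd8\xff\xe0more\xff\xd9"
    print(find_jpeg_streams(blob))  # -> [(0, 13), (13, 23)]
    ```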
    http://schnaader.info
    Damn kids. They're all alike.

  6. #6
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    Quote Originally Posted by Skymmer View Post
    Now its time to tell something about JPACK. Please read here. Pay attention at:

    Now lets look at the resulting archive. See the following screenshot. Don't you think that we seen similar structure somewhere ? More interesting is that they didn't mentioned the PAQ anywhere. So, for me they look like shameless thiefs.
    It doesn't surprise me that Infima is still stealing PAQ code. They even tried to patent it. http://appft1.uspto.gov/netacgi/nph-...DN/20070233477

    I wrote to the USPTO over a year ago and pointed out how they even plagiarized text right from my technical report. So far no response and the patent is still pending.

    Of course that was all discussed before (starting at post #1).

  7. #7
    Member
    Join Date
    Oct 2007
    Location
    Germany, Hamburg
    Posts
    408
    Thanks
    0
    Thanked 5 Times in 5 Posts
    Thank you schnaader. That's really far away from PackJPG and especially from paq8.
    I knew that lpaq isn't an archiver or anything like that, but I thought it used the same JPEG unpacker/sorter PackJPG has, and that with the lpaq compression engine it would be a bit better.
    But obviously PackJPG has much more under the hood.

  8. #8
    Member Skymmer's Avatar
    Join Date
    Mar 2009
    Location
    Russia
    Posts
    681
    Thanks
    38
    Thanked 168 Times in 84 Posts
    Quote Originally Posted by Piotr Tarsa View Post
    Skymmer:
    Pay more attention to grammar & spelling. "Maded" is a vey bad mistake.
    Yes, its a vey bad mistake
    But seriously, thanks for pointing it out. Obviously English is not my native language and I never took any special courses in it, so I can only hope that my school education and self-education let other people understand my posts.

    Quote Originally Posted by m^2 View Post
    It's sad that you didn't make the test data more diverse.
    Photos taken with the same camera, the same resolution, the same compression setting, probably w/out further processing make up a very specific case and I would be very careful extrapolating the results to all jpegs.

    About PackJPX:
    The author told me that he put working version of 2.5 code in there by mistake.
    Well, maybe you're right, but this test doesn't pretend to be some kind of "truth of life" or anything like that. It's just a test performed on real-life data. So anybody can take a bunch of JPEGs and do the same, I suppose.

    Quote Originally Posted by Matt Mahoney View Post
    It doesn't surprise me that Infima is still stealing PAQ code. They even tried to patent it. http://appft1.uspto.gov/netacgi/nph-...DN/20070233477
    Ah, now I see. I hadn't seen that discussion before. By the way, the JPACK executable is protected with ACProtect. Somehow they try to hide the stolen code.

  9. #9
    Member
    Join Date
    Sep 2008
    Location
    France
    Posts
    863
    Thanks
    461
    Thanked 257 Times in 105 Posts
    It doesn't surprise me that Infima is still stealing PAQ code. They even tried to patent it. http://appft1.uspto.gov/netacgi/nph-...DN/20070233477
    That's just plainly disgusting.

    And apparently, this company has tried to steal other people's code too.
    http://www.c10n.info/archives/415
    That a company like this can still exist after being proven guilty is very instructive about the judicial system.
    They should quite simply be sued in court for copyright infringement and stripped of everything. Alas, there is so little money to be gained from this absurd company that no lawyer will want to take it on. Not a valuable enough target...


    Anyway, ultimately, a patent is only worth something if it allows you to win actions in court.
    And that's where, should the USPTO not do its job properly, "prior art" can be pointed out to the jury (to say the least, since this is just blatant copying), in effect making the patent void.

  10. #10
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    Yes, I also remember a thread on comp.compression where after Infima was caught, their next version encrypted the code to try to hide the fact that they were still stealing code.

  11. #11
    Member Raymond_NGhM's Avatar
    Join Date
    Oct 2008
    Location
    UK
    Posts
    51
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Wink

    Quote Originally Posted by Skymmer View Post
    I saw that Raymond said something about "solid" behaviour. But "solid" term is inapplicable here. If you'll look into the resulting SFX file you'll quickly notice the following structure:
    Code:
    SFX stub -> hex: 07 00 00 00 -> filename -> hex: 00 -> compressed data
    You can even easily cut one of the block and SFX will continue to work. Furthermore, here is an another test, conducted on one big 50 533 219 byte JPEG file.
    Code:
    big_building.jpg    50 533 219
    big_building.pjg    43 170 370
    big_building.exe    42 923 450
    By "solid" I did NOT mean a "solid block" as it is called in 7-Zip or WinRAR;
    in this case I meant that PJA is worked harder ("solid") than PJG.

    Also, for better understanding, here is the corrected PJA header info for Skymmer, for each
    PJG combined in the SFX:

    1. 4 bytes: length of the filename in bytes (maybe in a future version: folder path + filename)
    2. "filename.JPG", terminated with 0x00
    3. 4 bytes: the compressed PJG size (counted from the "JS" signature)
    4. "JS": the PJG header signature
    5. 1 byte: the PackJPG version, with the left 4 bits = major version and the right 4 bits = minor version;
    for example, a PJG packed with v2.4 has the value 18h = 24, which matches 2.4
    6. 1 byte: a fixed value of 0x01
    7. The start of the re/compressed JPEG data.
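    For the curious, the layout above can be sketched as a small parser. This is a hypothetical reconstruction based only on the description in this post: the field names are mine, little-endian byte order is assumed, and the version byte is decoded as a plain decimal value, since 18h = 24 decimal matches v2.4 in the example given.
    Code:
    ```python
    import struct

    def parse_pja_entry(data: bytes, offset: int = 0):
        """Parse one PJA SFX entry per the layout described above (hypothetical).

        Layout: 4-byte filename length, filename + 0x00, 4-byte compressed size
        (counted from the "JS" signature), "JS", version byte, fixed 0x01 byte,
        then the recompressed JPEG data.
        """
        name_len = struct.unpack_from("<I", data, offset)[0]   # endianness assumed
        offset += 4
        name = data[offset:offset + name_len].decode("ascii")
        offset += name_len
        assert data[offset] == 0x00                 # filename terminator
        offset += 1
        pjg_size = struct.unpack_from("<I", data, offset)[0]
        offset += 4
        assert data[offset:offset + 2] == b"JS"     # PJG signature
        version = data[offset + 2]                  # e.g. 24 (18h) for PackJPG v2.4
        assert data[offset + 3] == 0x01             # fixed byte
        payload = data[offset:offset + pjg_size]    # "JS" header + compressed data
        return {"name": name, "version": f"{version // 10}.{version % 10}",
                "size": pjg_size, "payload": payload}

    # Toy entry: "a.jpg" holding a fake 6-byte PJG stream packed with v2.4.
    entry = (struct.pack("<I", 5) + b"a.jpg\x00" +
             struct.pack("<I", 6) + b"JS\x18\x01\xAB\xCD")
    print(parse_pja_entry(entry))
    ```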
    Last edited by Raymond_NGhM; 30th April 2010 at 16:18.

  12. #12
    Member
    Join Date
    May 2010
    Location
    Germany
    Posts
    1
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by Matt Mahoney View Post
    It doesn't surprise me that Infima is still stealing PAQ code. They even tried to patent it. http://appft1.uspto.gov/netacgi/nph-...DN/20070233477

    I wrote to the USPTO over a year ago and pointed out how they even plagiarized text right from my technical report. So far no response and the patent is still pending.
    Matt,
    Your letter did arrive to the USPTO and they did take it into consideration.
    You can check the status here:
    http://portal.uspto.gov/external/portal/pair
    The application number is : 11/420102

    Latest status of the patent is: Abandoned

    Tony

  13. #13
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    Thank you for finding this.

    If you click on the tab "image file wrapper" you can find the letter I sent to the USPTO (02-02-2009 misc. incoming letter) and their rejection letter (08-12-2009 non final rejection). The rejection also cites PAQ6, PAQ7, PAQ8, and a couple of papers on neural network compression by me and others. Infima did not respond to the non final rejection after 6 months so the patent was classified as abandoned.

    The first part of the rejection letter rejected 5 of the 10 claims because of a technicality. You can't patent algorithms or programs, so you have to word it as patenting a computer running the algorithm and they didn't do that. Then it goes on to reject all 10 claims based on prior art.

  14. #14
    Member Surfer's Avatar
    Join Date
    Mar 2009
    Location
    oren
    Posts
    203
    Thanks
    18
    Thanked 7 Times in 1 Post

    Post

    What about the open-source jpegoptim 1.2.3? Lossless/lossy modes are supported.
    The last Windows version is 1.2.2.

  15. #15
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    It looks like you can do the same thing with cjpeg/djpeg and jpegtran from the IJG library. jpegtran -optimize optimizes the Huffman tables, which is pixelwise lossless but not bitwise lossless.

  16. #16
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Yeah... and I can't find any image where it's not weaker than jpegtran.

  17. #17
    Member Surfer's Avatar
    Join Date
    Mar 2009
    Location
    oren
    Posts
    203
    Thanks
    18
    Thanked 7 Times in 1 Post
    Thanks for the explanation.

  18. #18
    Member
    Join Date
    May 2008
    Location
    England
    Posts
    325
    Thanks
    18
    Thanked 6 Times in 5 Posts
    Guess I may as well post this here. New IJG JPEG is out, 8c:

    Version 8c 16-Jan-2011
    -----------------------

    Add option to compression library and cjpeg (-block N) to use
    different DCT block size.
    All N from 1 to 16 are possible. Default is 8 (baseline format).
    Larger values produce higher compression,
    smaller values produce higher quality.
    A SmartScale-capable decoder (introduced with IJG JPEG v8) is required.

  19. #19
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    3,372
    Thanks
    213
    Thanked 1,020 Times in 541 Posts
    Guess I'd link the exes for it - http://nishi.dreamhosters.com/u/libjpeg_v8c.rar

