Thread: PackJPG brute force methods

  1. #1
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    Since I discovered the -dev option in PackJPG, I have always run several compression trials to manually optimize the compression.

    Does someone have a brute-force batch file for this, or do I have to make one myself?

  2. #2
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,475
    Thanks
    26
    Thanked 121 Times in 95 Posts
    Can you post your best results, i.e. examples with the best gains? Because I think it's not worth the time.

  3. #3
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    Whether it's worth the time is a matter of personal judgement.

    I have made my batch file and will add info later. It only goes through 256 trials (C5->C20 & S5->S20).

  4. #4
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    12 JPEG files = 12.845.591 bytes
    PackJPG = 10.888.421 bytes
    Brute force = 10.853.915 bytes

    That's around 2.8 KB saved per file.

    But it takes a very long time doing 256 encodings per file, and since there is a bug in my batch file somewhere, I'm going to redo it in a more intelligent way.

    ---
    @ECHO OFF

    :: Build a helper batch that runs one packJPG trial and keeps whichever
    :: of the old compressed file (%5) and the new one (%6) is smaller.
    ECHO @ECHO OFF > packtemp.bat
    ECHO packjpg -dev %%2 %%3 %%4 >> packtemp.bat
    ECHO if %%~z5 GTR %%~z6 goto del5 >> packtemp.bat
    :: the new file is not smaller (ties included), so discard it
    ECHO del %%6 >> packtemp.bat
    ECHO goto end >> packtemp.bat
    ECHO :del5 >> packtemp.bat
    ECHO del %%5 >> packtemp.bat
    ECHO ren %%6 %%5 >> packtemp.bat
    ECHO :end >> packtemp.bat

    :: Baseline run with default settings; keep it as the current best (.tmp).
    packjpg %1
    ren %~n1.pjg %~n1.tmp

    :: Try all 256 combinations of -s5..-s20 and -c5..-c20, keeping the best.
    for /L %%S in (5,1,20) do for /L %%C in (5,1,20) do call packtemp.bat -dev -s%%S -c%%C %1 %~n1.tmp %~n1.pjg
    ren %~n1.tmp %~n1.pjg
    del packtemp.bat
    ---

    On some files it produces "filename____.pjg" files.
    It's reproducible, so it has to be something about the JPG file or its name.

  5. #5
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    335
    Thanks
    36
    Thanked 36 Times in 21 Posts
    Please, can you let the batch store the encoding settings in a statistics file that describes the original JPEG file (baseline or Huffman-optimized), its size, and the settings used?

    Then, after running this on hundreds or thousands of files, we may get the optimal setting or settings for most cases.
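
    Something like the following could do that logging. This is only a sketch in Python, not the batch itself: it assumes packjpg is on the PATH and writes <name>.pjg next to its input, and the CSV columns are invented for illustration. Telling baseline apart from Huffman-optimized would additionally need a JPEG-marker parser.

    ---
    # Sketch: run one packJPG trial and append settings and sizes to a CSV.
    import csv
    import os
    import subprocess

    def log_trial(jpg, c, s, logfile="packjpg_stats.csv"):
        out = os.path.splitext(jpg)[0] + ".pjg"
        if os.path.exists(out):
            os.remove(out)  # remove any leftover .pjg so the output name is predictable
        subprocess.run(["packjpg", "-dev", f"-c{c}", f"-s{s}", jpg], check=True)
        with open(logfile, "a", newline="") as f:
            csv.writer(f).writerow(
                [jpg, os.path.getsize(jpg), c, s, os.path.getsize(out)])
        os.remove(out)
    ---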

  6. #6
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    Actually, I wouldn't know how to do that, but I might look into it.

    Before using this for any kind of development, though, the bugs have to be ironed out. I'm still not able to find the bug that makes the "filename___.pjg" files on some inputs.

    I was thinking of making a new batch that first searches for the optimal S value and then for the optimal C value, but that requires some kind of pattern in which combinations of S and C values are best.

  7. #7
    Programmer schnaader
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    571
    Thanks
    219
    Thanked 205 Times in 97 Posts
    I just finished my brute force test on A10.jpg from www.maximumcompression.com - the results are available here:

    http://schnaader.info/graph_complete.png
    http://schnaader.info/result.txt

    Having this, I can test various algorithms to determine the quickest progression to the minimal filesize.
    The graph is rather simple, so simple algorithms should be the best.
    I tested a subdivision algorithm:

    Choose a range for C (C1..C2) and a range for S (S1..S2), get the four filesizes for the corners (C1,S1), (C2,S1), (C1,S2), (C2,S2), then take the quarter of that rectangle containing the corner with the minimal filesize and recurse until C1/C2 and S1/S2 are very close to each other.

    Example for C = 5..30, S = 4..30:

    Testing C=5 S=4 : 711536
    Testing C=30 S=4 : 707228
    Testing C=5 S=30 : 704664
    Testing C=30 S=30 : 706093
    Dividing -> C= 5 .. 17 S= 17 .. 30
    Testing C=5 S=17 : 703638
    Testing C=17 S=17 : 698482
    Testing C=5 S=30 : 704664
    Testing C=17 S=30 : 700120
    Dividing -> C= 11 .. 17 S= 17 .. 23
    Testing C=11 S=17 : 696898
    Testing C=17 S=17 : 698482
    Testing C=11 S=23 : 697268
    Testing C=17 S=23 : 699215
    Dividing -> C= 11 .. 14 S= 17 .. 20
    Testing C=11 S=17 : 696898
    Testing C=14 S=17 : 697634
    Testing C=11 S=20 : 697082
    Testing C=14 S=20 : 697911
    Dividing -> C= 11 .. 12 S= 17 .. 18
    Testing C=11 S=17 : 696898
    Testing C=12 S=17 : 696915
    Testing C=11 S=18 : 697041
    Testing C=12 S=18 : 697074

    This would need only 20 packJPG calls (compared with 650 for brute force) and returns the correct minimal filesize of 696.898 bytes for C=11, S=17.
    Of course, it is important to choose the right region. Other results:

    C = 3..30, S= 2..30: 20 steps, C=16, S=17, 697.991 bytes
    C = 5..120, S = 4..60: 28 steps, C=12, S=33, 698.080 bytes
    C = 5..20, S = 5..20: 20 steps, C=12, S=14, 696.915 bytes
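
    The search might be sketched like this in Python (a sketch, not the program used for these tests; it assumes packjpg is on the PATH and accepts the -dev/-cN/-sN switches from the batch earlier in the thread, and it caches corner sizes so shared corners aren't encoded twice):

    ---
    # Sketch of the subdivision search: repeatedly shrink the (C, S)
    # rectangle toward the corner with the smallest compressed size.
    import os
    import subprocess
    from functools import lru_cache

    @lru_cache(maxsize=None)          # shared corners are only encoded once
    def pjg_size(jpg, c, s):
        """Run one packJPG trial, return the .pjg size, clean up."""
        out = os.path.splitext(jpg)[0] + ".pjg"
        if os.path.exists(out):
            os.remove(out)
        subprocess.run(["packjpg", "-dev", f"-c{c}", f"-s{s}", jpg], check=True)
        size = os.path.getsize(out)
        os.remove(out)
        return size

    def subdivide(jpg, c1, c2, s1, s2):
        while c2 - c1 > 1 or s2 - s1 > 1:
            corners = {(c, s): pjg_size(jpg, c, s)
                       for c in (c1, c2) for s in (s1, s2)}
            bc, bs = min(corners, key=corners.get)
            cm, sm = (c1 + c2) // 2, (s1 + s2) // 2
            c1, c2 = (c1, cm) if bc == c1 else (cm, c2)   # keep the best C half
            s1, s2 = (s1, sm) if bs == s1 else (sm, s2)   # keep the best S half
        corners = {(c, s): pjg_size(jpg, c, s)
                   for c in (c1, c2) for s in (s1, s2)}
        best = min(corners, key=corners.get)
        return best, corners[best]

    # subdivide("A10.jpg", 5, 30, 4, 30) follows the trace above and
    # returns ((11, 17), 696898)
    ---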
    http://schnaader.info
    Damn kids. They're all alike.

  8. #8
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    335
    Thanks
    36
    Thanked 36 Times in 21 Posts
    Great, well done. As said, there is a quick approach to reach the minimum, so I hope it gets adopted.

  9. #9
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    #schnaader

    1:
    Does this work for other pictures as well?

    2:
    Why does testing a bigger area decrease compression?
    C = 5..120, S = 4..60: 28 steps, C=12, S=33, 698.080 bytes

    3:
    Where are the batch files for this?

    4:
    If we need to have 4 trials before comparing the sizes (if I understand correctly), wouldn't that be optimal for multi-threading? Doing 2 or 4 encodings at the same time would fully utilize dual-core and quad-core CPUs.

    For a quad core, that would mean only 5 times a single compression time for the 20 steps. With the speed of packJPG, that is something I would consider worth it.

  10. #10
    Programmer schnaader
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    571
    Thanks
    219
    Thanked 205 Times in 97 Posts
    > 1: Does this work for other pictures as well?

    It should, but there should be more "full-force" tests to see whether the size distribution can be extremely different for other picture types (screenshots etc.), so that other regions or even another algorithm would have to be used.
    The main requirement for this algorithm is that the distribution is roughly "linear", meaning that if you find a smaller size, following that direction reduces the filesize further. Best would be a distribution with only one global minimum and no other local minima, but as the given distribution has only small oscillations, these have just a small impact.

    > 2: Why does testing a bigger area decrease compression?
    > C = 5..120, S = 4..60: 28 steps, C=12, S=33, 698.080 bytes

    As explained, the algorithm is misled by some local minimum. The other example (C = 3..30, S = 2..30) shows that the extreme regions (~2..5 for C/S) "confuse" the algorithm.

    > 3: Where are the batch files for this?

    I made the tests with a PowerBASIC program. Perhaps I'll write another one in PowerBASIC or C++ that can be used to test the algorithm for a given image and region.

    > 4: If we need to have 4 trials before comparing the sizes
    > (if I understand correctly), wouldn't that be optimal for
    > multi-threading? Doing 2 or 4 encodings at the same time
    > would fully utilize dual-core and quad-core CPUs.

    Regardless of the algorithm used, tests like this one always involve independent program executions that can easily be assigned to different CPUs.
    But for optimal speed, it would be best if the gained results could be used to improve the packJPG code itself; perhaps it could get a "slow" method or something. When changing packJPG directly, some steps could be cached, like the decompression of the JPEG, so runtime would not be 20 times slower, but only 5-15 times or so. This could then be further optimized for dual- or quad-core CPUs.
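
    The first point might look like this in Python (a sketch, not packJPG code; each worker encodes a private copy of the JPEG so the .pjg output names don't collide):

    ---
    # Sketch: the four corner encodings of one subdivision step are
    # independent processes, so a quad-core can run them simultaneously.
    import os
    import shutil
    import subprocess
    import tempfile
    from concurrent.futures import ThreadPoolExecutor

    def pjg_size(jpg, c, s):
        # encode a private copy so parallel runs don't overwrite each other
        with tempfile.TemporaryDirectory() as d:
            copy = os.path.join(d, os.path.basename(jpg))
            shutil.copy(jpg, copy)
            subprocess.run(["packjpg", "-dev", f"-c{c}", f"-s{s}", copy],
                           check=True)
            return os.path.getsize(os.path.splitext(copy)[0] + ".pjg")

    def corner_sizes(jpg, c1, c2, s1, s2):
        corners = [(c, s) for c in (c1, c2) for s in (s1, s2)]
        with ThreadPoolExecutor(max_workers=4) as pool:
            sizes = pool.map(lambda t: pjg_size(jpg, *t), corners)
            return dict(zip(corners, sizes))
    ---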

    But as said, this is far in the future; for now, there should first be some tests on the size distributions for different image types.
    http://schnaader.info
    Damn kids. They're all alike.

  11. #11
    Member
    Join Date
    Dec 2006
    Posts
    611
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by schnaader
    But for optimal speed, it would be the best if the gained results could be used to improve the packJPG code. Perhaps it could get a slow method or something.
    Normal mode: the same as now; slow: your subdivision algorithm; insane: brute-force testing of all values.

  12. #12
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    BTW, there is now a PackJPG v2.3.

  13. #13
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    335
    Thanks
    36
    Thanked 36 Times in 21 Posts
    packJPG v2.3 is now available here:
    http://www.elektronik.htw-aalen.de/PackJPG/
    or here:
    http://www.packjpg.de.nr

    This time there are no improvements for compression, though. This release focuses on compatibility and speed.

    What's new:
    - compatibility with:
    . JPEG progressive mode
    . JPEG extended sequential mode
    . CMYK color space
    . older CPUs
    - around 15% faster compression & decompression
    - new switch: [-d] (discard meta-info)
    - various bugfixes

    At this time there are no plans to include other JPEG modes (e.g. arithmetic).

    Unnecessary meta information can now be discarded using "-d". This reduces compressed files' sizes. Be warned though, reconstructed files won't be bitwise identical with the original files and meta information will be lost forever. There won't be any loss to image data or quality.

    from maximumcompression

  14. #14
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    335
    Thanks
    36
    Thanked 36 Times in 21 Posts
    Great job, Matthias Stirner. I think it's now well featured.

    Is there going to be support for compressing multiple JPGs into a single PJG file?

    And I noticed there is a "-pgm" command in "-dev", so what's new in the "-dev" section?

    I have tested a JPG file (Huffman-optimized, 2500x3542):

    pale.jpg 1,472,005 bytes 100%
    pale.jpg -arithmetic 1,339,004 bytes 2.5 sec 90.96%
    pale.pjg 1,178,090 bytes 6.3 sec 80.03 %
    pale.paq8o6 -1 1,102,550 bytes 67.70 sec 74.90%
    pale.paq8o6 -2 1,082,580 bytes 68.38 sec 73.54%
    pale.paq8o6 -3 1,067,851 bytes 66.08 sec (faster !) 72.54%
    pale.paq8o6 -4 1,060,054 bytes 63.45 sec (faster !) 72.01%
    pale.paq8o6 -5 1,057,519 bytes 61.81 sec (faster !) 71.84%
    pale.paq8o6 -6 1,057,189 bytes 59.95 sec (faster !) 71.82%
    pale.paq8o6 -7 1,057,254 bytes 60.41 sec 71.82%
    pale.paq8o6 -8 (out of memory)

    To test the CMYK option, here is a webpage that has both RGB and CMYK JPGs.

    packJPG is the best based on time/ratio. Go ahead... fly!

  15. #15
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    They could greatly decrease decoding time if they did some simple multi-core optimization:

    when running "packjpg *.pjg", instead of decoding one file at a time, it should decode several files at the same time.
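
    A sketch of that idea in Python (standing in for what a multi-core packJPG could do internally; it assumes "packjpg file.pjg" restores the JPEG, as in the batch above):

    ---
    # Sketch: decode every .pjg in the current directory four at a time
    # instead of one by one.
    import glob
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    with ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(lambda f: subprocess.run(["packjpg", f], check=True),
                      glob.glob("*.pjg")))
    ---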

  16. #16
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    335
    Thanks
    36
    Thanked 36 Times in 21 Posts
    Answers from Matthias on Maximumcompression:

    > any estimate as to when the pjg format will stabilize
    > enough as not to break compatibility with previous versions?

    I'm still working heavily on better compression ratios. The next version will definitely break compatibility again, but maybe the format will stabilize afterwards.

    > Did you test it on large images? I will do so in the next few days.

    I've tested it myself on thousands of JPEG files of different dimensions and from different sources. Further testing would be great, though.

    > Is there going to be support for compressing multiple JPGs
    > into a single PJG file?

    Yes, there will be in the near future. My former fellow student Holger Mundt is currently working on the packJPG GUI along with packJPG archive (.PJA) format.

  17. #17
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    #maadjordan

    I have just completed compression of around 17.9 GB of hi-res (6-megapixel) pictures. I'm about to make a batch that does it with the subdivision algorithm.

    Could the info be used for further tuning?

    They are pictures of one or more persons in different places (outdoor/indoor).

  18. #18
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    335
    Thanks
    36
    Thanked 36 Times in 21 Posts
    I'll be glad to test it on around 1.0 GB of images (10 megapixels).

  19. #19
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    I'm about to make a batch file for the subdivision method, but let me get it right.

    CvalLo=5
    CvalHi=30
    SvalLo=5
    SvalHi=30

    The corner that produces the single smallest file keeps its Cval and Sval, and the other two bounds are halved toward it, thereby decreasing the area the Cval/Sval corners cover?

    So if the single smallest file were C5 and S30, then CvalHi would be decreased to 17 and SvalLo would be increased to 18. Is this correct?

    Second: is it better to start with an uneven (odd) number of Cval/Sval values or an even number?

    5->30 = 26 values
    4->30 = 27 values

    Using an even number of values, the middle value would be either 17 or 18, depending on whether we are decreasing or increasing a bound. But using an uneven number, the value would be 17 in both cases.

    Looking at it in 2D: starting with an even number of values, you would have "squares" that match up next to each other; using uneven numbers, the borders would be shared.
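
    A quick check of the parity point, in Python:

    ---
    # 5..30 = 26 values: the two halves get different middle values
    lo, hi = 5, 30
    print((lo + hi) // 2, (lo + hi + 1) // 2)   # -> 17 18
    # 4..30 = 27 values: both halves share the middle value 17
    lo, hi = 4, 30
    print((lo + hi) // 2, (lo + hi + 1) // 2)   # -> 17 17
    ---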

  20. #20
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    879
    Thanks
    51
    Thanked 106 Times in 84 Posts
    Hmm, the subdivision gave me worse results than just using packJPG normally:

    Standard JPG: 73.618 bytes
    PackJPG STD (c9/s6): 60.005 bytes
    Subdivision (c17/s10): 60.064 bytes

    Or maybe I did something wrong. I'll check it later today.
