
Thread: Precomp 0.4

  1. #1: schnaader (Programmer; Hessen, Germany; joined May 2008; 630 posts)

    Precomp 0.4

    Hi!

    Finally, Precomp 0.4 is out. Change list:

    • New switch -mjpeg for MJPEG recompression support.
    • Added recursion (aka multi-pass).
    • Added MIME Base64 streams support.
    • Added bZip2 streams support.
    • Added batch file errorlevels.
    • Improved GIF support for partial matches.
    • Linked the zLib library statically; ZLIB1.DLL is not needed anymore.
    • Fixed bug that slowed down Precomp for files larger than 4 GB.


    Have a look at http://schnaader.info

  2. #2: Member (England; joined May 2008; 325 posts)
    Awesome


  3. #3: SvenBent (Member; Denmark; joined Sep 2007; 926 posts)

    Weepeeee

    Will test soon.

    Is the multi-pass automatic or... ?

    hehe
    multipass... Leeloo Dallas mul ti pass.

  4. #4: John (Member; joined Jul 2008; 54 posts)
    Thaaanks!! Long time waiting for this.. time to test..
    PS: Linux version soon?

  5. #5: schnaader (Programmer; Hessen, Germany; joined May 2008; 630 posts)
    Quote Originally Posted by SvenBent
    Is the multi-pass automatic or... ?
    Yes, the default setting is recursion depth 10 (should be enough for most files).

    For example, test with one of the EML files from http://stationeryheaven.com/lund.htm. These are Base64-encoded files containing JPG/GIF files.

    Also note the bZip2 stream support. bZip2 does allow some parameter tuning, but it is almost never used, so bZip2 streams require only one try and will almost always be completely decompressable (so far I haven't found a bz2 file that isn't). Tested with enwik8.bz2 from LTCB:

    enwik8.bz2 (29,008,758 bytes) => enwik8.pcf (100,000,028 bytes) => enwik8_.bz2 (identical )
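
    To sketch why a single try usually suffices (this is not Precomp's actual source, just a minimal C++ illustration): a bZip2 stream starts with the three bytes "BZh" followed by an ASCII digit '1'..'9' that encodes the 100k block size, the only compression parameter the reference encoder exposes, so detection can read the parameter straight from the header:

    Code:
    #include <cstdio>
    #include <vector>

    // Minimal sketch: scan a buffer for bZip2 stream candidates. The header
    // is "BZh" plus an ASCII digit '1'..'9' encoding the 100k block size -
    // the only parameter needed to attempt recompression.
    static std::vector<size_t> find_bzip2_candidates(const unsigned char* buf, size_t len) {
        std::vector<size_t> offsets;
        for (size_t i = 0; i + 4 <= len; ++i) {
            if (buf[i] == 'B' && buf[i + 1] == 'Z' && buf[i + 2] == 'h' &&
                buf[i + 3] >= '1' && buf[i + 3] <= '9')
                offsets.push_back(i);  // candidate; a trial decompression confirms it
        }
        return offsets;
    }

    int main(int argc, char** argv) {
        if (argc < 2) { std::printf("usage: %s file\n", argv[0]); return 1; }
        FILE* f = std::fopen(argv[1], "rb");
        if (!f) return 1;
        std::vector<unsigned char> data;
        unsigned char chunk[65536];
        size_t n;
        while ((n = std::fread(chunk, 1, sizeof(chunk), f)) > 0)
            data.insert(data.end(), chunk, chunk + n);
        std::fclose(f);
        for (size_t off : find_bzip2_candidates(data.data(), data.size()))
            std::printf("bZip2 candidate at offset %zu (block size %c00k)\n",
                        off, data[off + 3]);
        return 0;
    }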

    Quote Originally Posted by John
    PS: Linux version soon?
    Indeed, a Linux version is being forged. I haven't received a PackJPG library for Linux yet, so JPG recompression won't be supported, but everything else should work as in the Windows version.

  6. #6: mstar (Member; Germany; joined Jan 2009; 35 posts)


    Big thanks!

    Which mode is best to use with FreeArc?

  7. #7: schnaader (Programmer; Hessen, Germany; joined May 2008; 630 posts)
    Quote Originally Posted by mstar
    Which mode is best to use with FreeArc?
    Depends on the file you want to compress. Usually, just using Precomp without any additional options will do it, but for file types that aren't supported yet but use zLib (like SIS, 3DM, zeno; see the file types list on my site), slow mode might be better.

    By the way, I just reuploaded Precomp because the progress display was a bit jumpy for higher recursion levels (above 2).

    Additionally, there have been some changes to the site. Results are now created using Google Charts and look better than the previous table, batch error levels are listed, and it is possible to download old versions (0.3 - 0.3..

  8. #8: maadjordan (Member; Kuwait; joined May 2008; 357 posts)
    Quote Originally Posted by schnaader
    Finally, Precomp 0.4 is out. Change list: ...
    Great update.. many thanks..

    I've tested the "-mjpeg+" option and it reduced compression time for my test file (20 MB) from 303 s to 65 s, with decompression at 42 s, so great speed..

    But why isn't it on by default?

  9. #9: Simon Berger (Member; Germany, Hamburg; joined Oct 2007; 409 posts)
    Seems like there is some sort of bug, because mjpeg is on by default.

    Did you get the exact same number of found/recompressed streams?

  10. #10: Black_Fox (Tester; Czechia; joined May 2008; 471 posts)
    Thanks a lot! Will try it.

  11. #11: Moderator (Tristan da Cunha; joined May 2008; 2,034 posts)

    Thanks Christian!

  12. #12: maadjordan (Member; Kuwait; joined May 2008; 357 posts)
    Quote Originally Posted by Simon Berger
    Seems like there is some sort of bug, because mjpeg is on by default.

    Did you get the exact same number of found/recompressed streams?
    I have tested v0.4 against the same file, first with no options and then with "-mjpeg+" added. When added, it's faster (over 4x faster). That's why I said it should always be activated, unless it affects existing JPG files, which it doesn't as far as I tested.

    Furthermore, it seems that bZip2 support does not work on this file:

    http://heanet.dl.sourceforge.net/sou.../7z465.tar.bz2

    Please check it out..

  13. #13: mhajicek (Member; Prague, CZ; joined Mar 2009; 62 posts)
    Thanks for this useful program. I'm looking forward to multiple files/directories support.
    This version crashes when trying to process this file (unzip first):
    It works only with -l0 (multiple -l0 passes also don't work).

    Michal

    BTW, any plans to update precomp.dll used by lprepaq/paq8o8pre?
    Attached Files

  14. #14: Bulat Ziganshin (Programmer; Uzbekistan; joined Mar 2007; 4,593 posts)
    Quote Originally Posted by maadjordan
    Furthermore, it seems that bZip2 support does not work on this file:

    http://heanet.dl.sourceforge.net/sou.../7z465.tar.bz2

    Please check it out..
    This file was created with 7-Zip's own bzip2 compression code. Precomp can only recompress files whose compression algorithms are built into Precomp itself, and 7-Zip's bzip2 algorithm isn't among them.
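
    Precomp's source isn't public, but the principle can be sketched with the standard libbzip2 buffer API (a hedged illustration, not Precomp's actual code): decompress the stream, recompress it with the block size taken from its header, and require a byte-identical result. A stream produced by a different encoder, such as 7-Zip's, decompresses fine but fails the comparison:

    Code:
    #include <bzlib.h>
    #include <cstring>
    #include <vector>

    // Sketch: does recompressing `decompressed` with the given block size
    // reproduce `original` byte for byte? Only then can the stream be
    // replaced by its uncompressed form and restored losslessly later.
    bool recompresses_identically(std::vector<char>& decompressed,
                                  const std::vector<char>& original,
                                  int blockSize100k) {
        // worst-case bzip2 output size: input + 1% + 600 bytes (per the docs)
        unsigned int destLen =
            (unsigned int)(decompressed.size() + decompressed.size() / 100 + 600);
        std::vector<char> recompressed(destLen);
        int rc = BZ2_bzBuffToBuffCompress(recompressed.data(), &destLen,
                                          decompressed.data(),
                                          (unsigned int)decompressed.size(),
                                          blockSize100k,
                                          0 /*verbosity*/, 0 /*default workFactor*/);
        if (rc != BZ_OK) return false;
        return destLen == original.size() &&
               std::memcmp(recompressed.data(), original.data(), destLen) == 0;
    }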

  15. #15: schnaader (Programmer; Hessen, Germany; joined May 2008; 630 posts)
    Quote Originally Posted by maadjordan
    I have tested v0.4 against the same file, first with no options and then with "-mjpeg+" added. When added, it's faster (over 4x faster). That's why I said it should always be activated, unless it affects existing JPG files, which it doesn't as far as I tested.
    This is strange. I'll investigate further, but at the moment I can't explain this behaviour. As Simon said, the mjpeg switch is enabled by default, and even if it weren't, it should slow down the process a bit instead of speeding it up.

    Quote Originally Posted by maadjordan
    Furthermore, it seems that bZip2 support does not work on this file:
    At least the file can be decompressed (5,086,720 bytes), although the recompression fails, as you can see when using debug mode. As I mentioned earlier, there are still some ways to modify/tune the bZip2 algorithm that aren't used at the moment, because there are 250 (!) possibilities. I think I'll do some tests with those bZip2 files and adjust slow mode so it checks those possibilities.

    EDIT: This tuning parameter only affects speed, so there's no way to generate different output using the bzip2 library; it really does seem to be some 7-Zip-specific algorithm here.
    See the bzip2 documentation on the workFactor parameter:

    Note that the compressed output generated is the same regardless of whether or not the fallback algorithm is used.

    Quote Originally Posted by mhajicek
    This version crashes when trying to process this file (unzip first):
    It works only with -l0 (multiple -l0 passes also don't work).
    Thanks, I'll have a look at this crash. It could be a bug either in the recursion or in gZip stream handling.

    Quote Originally Posted by mhajicek
    BTW, any plans to update precomp.dll used by lprepaq/paq8o8pre?
    Yes, this will happen soon, too. For Prepaq v3, I'll use one of the paq8p releases (although I'm not really sure which one to use, either an optimized paq8p1 or the new paq8p2).

  16. #16: nanoflooder (Member; Saint Petersburg, Russia; joined Aug 2008; 216 posts)
    ABBYY uses their own zlib library called AbbyyZlib.dll to compress the Lingvo dictionaries. Unfortunately, it has very different export functions, and the produced file structure seems to be different too, because Precomp can't decompress it even in -brute mode. Could you please look at it and give a verdict on whether or not it is possible to make something out of this? Thanks in advance.
    I have attached the library and a sample dictionary (I've chosen a very small one so that it can be processed in -brute mode).
    Attached Files

  17. #17: Skymmer (Member; Russia; joined Mar 2009; 688 posts)
    Quote Originally Posted by schnaader
    Hi!
    Finally, Precomp 0.4 is out....
    Schnaader, just thank you!!!
    I would like to report not a bug but a little imprecision. It seems that Precomp stores the name of the processed file in lower case. I mean, if I use precomp TEST.DAT, I get test.dat after recompression. It's not critical, but keeping the case is mandatory for some of my tasks. Of course I can use the -r -o solution, but anyway.

    Quote Originally Posted by schnaader
    ... but I hope I can add support for more stream types like CAB in the next versions ...
    That will be just great. As far as I know, the CAB format is used inside MSI files, so making them smaller in a distribution will be nice.

    Quote Originally Posted by nanoflooder
    ABBYY uses their own zlib library called AbbyyZlib.dll to compress the Lingvo dictionaries. Unfortunately, it has very different export functions, and the produced file structure seems to be different too, because Precomp can't decompress it even in -brute mode.
    You won't believe me, but I thought about this just a few days ago. I suppose the DSL Compiler applies not only zlib compression but also some kind of encryption to the resulting .lsd dictionaries. By the way, which Lingvo version are you using? I can't find the AbbyyZlib.dll file in my Lingvo 12.

  18. #18: schnaader (Programmer; Hessen, Germany; joined May 2008; 630 posts)
    Quote Originally Posted by Skymmer
    Schnaader, just thank you!!!
    I would like to report not a bug but a little imprecision. It seems that Precomp stores the name of the processed file in lower case. I mean, if I use precomp TEST.DAT, I get test.dat after recompression. It's not critical, but keeping the case is mandatory for some of my tasks. Of course I can use the -r -o solution, but anyway.
    Conversion to lower case was done to prevent differences in archives when only the filename case differs. This should indeed be corrected; the best behaviour would be not to use the case of the input file parameter, but to get the correct case from the file system instead. This will also be important for the upcoming Linux versions.
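
    On Windows, one way to get the stored case is to ask the file system for the directory entry (a hedged sketch, not Precomp's actual code; FindFirstFileA fills cFileName with the name exactly as it is stored on disk):

    Code:
    #include <windows.h>
    #include <string>

    // Sketch: return the on-disk spelling of a file's base name instead of
    // trusting the (possibly differently-cased) command line argument.
    std::string filesystem_case(const std::string& path) {
        WIN32_FIND_DATAA fd;
        HANDLE h = FindFirstFileA(path.c_str(), &fd);
        if (h == INVALID_HANDLE_VALUE) return path;  // not found: keep as given
        FindClose(h);
        return fd.cFileName;  // base name only, in the case stored on disk
    }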

  19. #19: Member (earth; joined Jun 2010; 2 posts)
    Hi schnaader, I've been using Precomp since the first alphas and the program is excellent and a very nice idea. I have some suggestions to improve it:

    1. I don't know if you know a coder called aluigi (http://www.aluigi.altervista.org); he created some tools for zlib, optimizing the compression thanks to LZMA.
    Maybe you can take a look at his code (it's under the GPL license) and make an improved version.

    2. Why can't Precomp recognize some zlib streams in PNGs?

    3. What about adding a preprocessor for some compression formats, like LZSS, LZO, etc.? Have a look at aluigi's site; there are a lot of compression formats used in games (and yes, GPL license too).

    4. I tried to precomp Slackware 12 (it uses gzip .tgz), but Precomp can't detect this compression.

    5. I tried to precomp Wolfenstein (the new one), but I get 0 precompression, and I know the game uses zlib.

    This is a nice tool; if the code were opened, someone could surely improve it in 1-2 days, like the FreeArc creator or the 7-Zip creator or the paq creator, who knows.. Actually, development looks stalled and the program needs improvements.

    Of course, thanks for this tool.

  20. #20: Member (Netherlands; joined Dec 2009; 39 posts)
    Schnaader, are there any plans for future releases or for releasing the source code? I would be eternally grateful.

  21. #21: schnaader (Programmer; Hessen, Germany; joined May 2008; 630 posts)
    I had some (read: a lot of) trouble with moving to a new flat this year. Also, development is at a critical point where a major rewrite is necessary to fix problems, get the source clean and solve the "not recompressable" problem with zLib.

    There'll perhaps be a small bugfix and minor changes release (0.4.1), but it's also possible that I'll skip it and release a whole new version (0.4.5). I can't say when it will be finished, but I'm working on it.

    Recently I started the rewrite; the source will be much cleaner afterwards, so hopefully I can finally release the source code. Another advantage is a better recursion concept that simplifies recursion and should fix some weird bugs that sometimes occur.

    Another thing I thought about is to additionally release unstable versions to let people test new features earlier.

  22. #22: Bulat Ziganshin (Programmer; Uzbekistan; joined Mar 2007; 4,593 posts)
    BTW, the new FreeArc is able to use external compressors in stdin-to-stdout mode, without creating huge tempfiles. Can you please add support for this to Precomp, at least for decompression?

  23. #23: PAQer (Member; Russia; joined Jan 2010; 22 posts)
    I'm also interested in a stdin/stdout switch. Also, a nice feature would be the detection of duplicated streams ([un]compressed); it could improve speed and potentially reduce the size of the PCF file.

  24. #24: schnaader (Programmer; Hessen, Germany; joined May 2008; 630 posts)
    Quote Originally Posted by Bulat Ziganshin
    BTW, the new FreeArc is able to use external compressors in stdin-to-stdout mode, without creating huge tempfiles. Can you please add support for this to Precomp, at least for decompression?
    Quote Originally Posted by PAQer
    I'm also interested in a stdin/stdout switch.
    After the rewrite I'll have a wonderful precompress_stream function that has virtual read, write and seek functions and should be able to use stdin/stdout easily. For now, I'll have a look at decompression. Large parts of it don't use tempfiles anymore, so perhaps it's possible to use stdout for those and disable the other parts.
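
    A hypothetical sketch of what such an interface could look like (the names IOStream and FileStream and the precompress_stream signature are illustrative assumptions, not Precomp's actual API): everything goes through virtual read/write/seek, and a pipe-backed stream simply reports that it cannot seek:

    Code:
    #include <cstdio>

    // Hypothetical sketch - not Precomp's actual API. The precompressor only
    // sees virtual read/write/seek, so the same code path can run on files,
    // memory buffers, or stdin/stdout (where seeking just fails).
    struct IOStream {
        virtual size_t read(void* buf, size_t n) = 0;
        virtual size_t write(const void* buf, size_t n) = 0;
        virtual bool   seek(long offset) = 0;  // returns false for pipes
        virtual ~IOStream() {}
    };

    struct FileStream : IOStream {
        FILE* f;
        explicit FileStream(FILE* file) : f(file) {}
        size_t read(void* buf, size_t n) override { return fread(buf, 1, n, f); }
        size_t write(const void* buf, size_t n) override { return fwrite(buf, 1, n, f); }
        bool seek(long offset) override {
            // fseek on a stdin/stdout pipe fails, which the caller must tolerate
            return fseek(f, offset, SEEK_SET) == 0;
        }
    };

    // The function described above would then take only streams:
    // int precompress_stream(IOStream& in, IOStream& out);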

    Quote Originally Posted by PAQer
    Also, a nice feature would be the detection of duplicated streams ([un]compressed); it could improve speed and potentially reduce the size of the PCF file.
    Yes, I also thought about this and will implement it, but it has lower priority. For now, a Precomp->SREP->... chain does a good job here. Of course, detecting duplicate compressed streams would give the Precomp part a nice speed boost for some files.
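
    A minimal sketch of the duplicate detection idea (hypothetical; FNV-1a is used only for brevity, and a real implementation would confirm a hash hit with a full byte comparison before trusting it):

    Code:
    #include <cstdint>
    #include <unordered_map>
    #include <vector>

    // Fingerprint a detected stream so exact duplicates can be stored as a
    // back-reference instead of being decompressed and verified again.
    static uint64_t fnv1a(const std::vector<unsigned char>& data) {
        uint64_t h = 1469598103934665603ULL;
        for (unsigned char c : data) { h ^= c; h *= 1099511628211ULL; }
        return h;
    }

    // Maps fingerprint -> index of the first stream that produced it.
    static std::unordered_map<uint64_t, size_t> seen;

    // Returns the index of an earlier identical-looking stream, or `index`
    // itself if this stream is new.
    size_t first_occurrence(const std::vector<unsigned char>& stream, size_t index) {
        uint64_t h = fnv1a(stream);
        auto it = seen.find(h);
        if (it != seen.end()) return it->second;  // candidate duplicate
        seen.emplace(h, index);
        return index;
    }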

  25. #25: Bulat Ziganshin (Programmer; Uzbekistan; joined Mar 2007; 4,593 posts)
    Thanks, it will improve installation speeds a lot!

  26. #26: zody (Member; Germany; joined Aug 2009; 90 posts)
    Keep up the good work! Precomp together with SREP, FreeArc and the new IS script will allow highly compressed and fast installations of games or whatever ;D
    Thanks!

  27. #27: Shelwien (Administrator; Kharkov, Ukraine; joined May 2008; 4,133 posts)
    > wonderful precompress_stream function that has virtual read, write and seek functions

    I'd again suggest considering the coroutine approach. To me it seems much more convenient for codec libraries than ugly callback APIs, and coders based on it are also faster (in my experience, though not compared to plain memory-to-memory).

    Example:
    http://nishi.dreamhosters.com/u/newbrc_2.rar

    The actual codec is hidden in process() here and works memory-to-memory. But when it encounters buffer bounds (or for any other reason, like seeking), it returns a corresponding status and resumes with the next call of the same function.

    \newbrc_2\src_1\coroproc.inc:
    Code:
      void processfile( 
        FILE* f, FILE* g, 
        byte* inpbuf,uint inpbufsize,
        byte* outbuf,uint outbufsize
      ) {
        init();
        // add an output buffer in advance to store headers etc
        addout( outbuf, outbufsize );
        while( 1 ) {
          int r = process(); 
          if( r==0 ) break;
          if( r==1 ) {
            uint l = fread( inpbuf, 1, inpbufsize, f );
            if( l==0 ) break;
            // add input data for status=1
            addinp( inpbuf, l ); 
          } else if( r==2 ) {
            fwrite( outbuf, 1, outbufsize, g );
            // add an output buffer for status=2
            addout( outbuf, outbufsize ); 
          }
        }
        fwrite( outbuf, 1,outptr-outbuf, g ); // flush
      }
    Another example is http://nishi.dreamhosters.com/u/marc_v1.rar
    (it's more complex, but there's a newer coroutine class)

    And a simple demo is this: http://nishi.dreamhosters.com/u/fibo_1.rar
    (with symmetric coroutines, stack switching, and a faster custom setjmp replacement)

    Code:
    struct index : coroutine<index> {
    
      void do_process( void ) {
        uint a=1;
        while(1) {
          yield( a );
          a++;
        }
      }
    
    } F1;
    
    struct fibonacci : coroutine<fibonacci> {
    
      void do_process( void ) {
        uint a=0,b=1;
        while(1) {
          yield( b );
          b = b + a;
          a = b - a;
        }
      }
    
    } F2;
    
    int main( int argc, char** argv ) {
    
      for( int i=0; i<20; i++ ) {
        printf( "%i:%i ", F1.call(), F2.call() );
      } printf( "\n" );
    
      return 0;
    }

  28. #28: Omnikam (Member; Australia; joined Sep 2010; 46 posts)
    for /r %%i in (*.pcf) do (precomp -r "%%i")
    works like a charm, but all files are put in the main dir.
    E.g., running the script from folder Alpha, which contains a folder Zeta with the PCF files: all the PCF files are restored, but they are dumped in the Alpha dir when they should go to Zeta. Any ideas why this is happening?

    PS: Precomp 0.4 is super slow but very nice! I'm running it on an external USB drive, so that might explain the slowness; I'll do more tests. But the folder situation has me stumped.

  29. #29: schnaader (Programmer; Hessen, Germany; joined May 2008; 630 posts)
    Quote Originally Posted by Omnikam
    for /r %%i in (*.pcf) do (precomp -r "%%i")
    works like a charm, but all files are put in the main dir.
    You can solve this by using two batch files. Content of the first one (scr1.bat):

    Code:
    for /r %%i in (*.pcf) do call scr2 "%%~pi" "%%i"
    Content of the second one (scr2.bat):

    Code:
    pushd %1
    precomp -r %2
    popd
    Quote Originally Posted by Omnikam
    PS: Precomp 0.4 is super slow but very nice! I'm running it on an external USB drive, so that might explain the slowness.
    On external drives, the temporary files slow things down massively, yes. But as the temporary files are stored in Precomp's working directory, you can prevent this by running Precomp from a faster disk; the temporary files will then be created there. So, for example, if your external drive has letter F and your fast hard disk has letter C:

    Code:
    C:
    precomp F:\mytestfile.iso
    This will create the temporary files on drive C, which should speed up the process, but will still create the output file on F:\.

  30. #30: Bulat Ziganshin (Programmer; Uzbekistan; joined Mar 2007; 4,593 posts)
    It would be much faster to run Precomp from a RAM disk. BTW, how large is the temporary data written to the current directory?


