Page 4 of 4
Results 91 to 117 of 117

Thread: Precomp 0.3.8

  1. #91
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by schnaader View Post
    Absolutely right. In fact, the next release will be 0.4, because there are many new features and I think it's better to release that version instead of a 0.3.9 that would only be used for a month. But I still don't know the exact release date.
    This is good news! Will it contain [any] CAB support?

  2. #92
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Quote Originally Posted by nanoflooder View Post
    This is good news! Will it contain [any] CAB support?
    Sorry, not yet. CAB, CHM and some other formats will need a lot of time. First attempts using the SDK or hardcoding the file format haven't been successful yet.
    Last edited by schnaader; 2nd January 2009 at 16:18.
    http://schnaader.info
    Damn kids. They're all alike.

  3. #93
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    [something has just died inside of me]

  4. #94
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Another thing that bothers me about Precomp: is there any way to increase the speed (significantly) in -brute mode? If it ran at ~1-5 KB/s, it would be just fine! But right now it is too slow for real-life use.

  5. #95
    Member
    Join Date
    Oct 2007
    Location
    Germany, Hamburg
    Posts
    409
    Thanks
    0
    Thanked 5 Times in 5 Posts
    Brute force is brute force: it tries to find a stream at every byte. You can't really speed it up; see it as a way to test, mainly useful for small files.

  6. #96
    Member
    Join Date
    May 2008
    Location
    England
    Posts
    325
    Thanks
    18
    Thanked 6 Times in 5 Posts
    From my limited testing, the hit rate on -brute is so small that it's not even worth it over -slow. And yeah, -brute is always going to be mainly CPU/memory dependent, so the only easy way to speed it up is to invest in better hardware. Can't remember if it takes advantage of multi-core CPUs, but sharing the load over the available cores would be one way to speed it up.

  7. #97
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    -brute mode does help with InstallShield CABs. It basically almost decompresses them. Unfortunately, I can't test anything bigger than 200 KB for speed reasons. I just thought that maybe the algorithm itself allows some shaving-off or simplification to make it reasonably fast? As I said, 1-5 KB/s would be just fine!
    And, yes, -fast -brute does work, but not always: neither for speed (if the file cannot be decompressed, it still runs the whole brute process) nor for compression (sometimes the file simply has more than one stream to decompress).

  8. #98
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Brute mode could get much faster in some later versions when stream parsing is done by Precomp itself instead of zLib DLL and perhaps could replace slow mode. But this mode is part of the experimental state of Precomp - if you find a file that can be decompressed with it, this format can be analysed and will perhaps be supported by Precomp in one of the next versions (@nanoflooder: this will also happen for CAB, just not in the next version).

    So shortening the time between discovering a new file format and supporting it is much more useful than speeding brute mode up, especially because brute mode often lowers the compression ratio for big files by decompressing wrong streams. This is the real bottleneck I have to work on.

    One possibility would be to allow update DLLs - for example, a special CAB DLL could tell Precomp where the streams in CAB files are. Then you could add this DLL to your Precomp directory to enable CAB support. The required functions for those DLLs would be documented, so they could be written by anyone and I could distribute them on my site. For useful DLLs that work and have passed some testing on several files, the author could send me the source code and it would be very easy to add it to the next Precomp version.
    http://schnaader.info
    Damn kids. They're all alike.

  9. #99
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by schnaader View Post
    Brute mode could get much faster in some later versions when stream parsing is done by Precomp itself instead of zLib DLL and perhaps could replace slow mode.
    This is just what I've been thinking about yesterday
    Quote Originally Posted by schnaader View Post
    But this mode is part of the experimental state of Precomp - if you find a file that can be decompressed with it, this format can be analysed and will perhaps be supported by Precomp in one of the next versions (@nanoflooder: this will also happen for CAB, just not in the next version).
    So shortening the time between discovering a new file format and supporting it is much more useful than speeding brute mode up, especially because brute mode often lowers the compression ratio for big files by decompressing wrong streams. This is the real bottleneck I have to work on.
    This is true. It's much more useful to develop new format support than speeding the -brute mode up.

    All I can say is that Precomp will be a very powerful addition to any archiver (as long as the archiver is better than deflate, of course), because recompressing jpg+gif+png+pdf+cab+zip would increase the resulting compression ratio by whopping numbers. The main reason why I am so obsessed with CABs is how widespread they are in Windows - .cab, .msi, .chm, .msp (especially the last two, because I couldn't find any way to recompress them losslessly myself - even manually, without automation) - and endless program distributions, which waste tons of space every day.

  10. #100
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Just another idea: it would be very useful to have a switch that tells Precomp to delete the .pcf if nothing has been decompressed, and the original file otherwise. It is quite difficult to automate this via .bat files by parsing the "None of the..." string from stdout.

  11. #101
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by nanoflooder View Post
    Just another idea: it would be very useful to have a switch that tells Precomp to delete the .pcf if nothing has been decompressed, and the original file otherwise. It is quite difficult to automate this via .bat files by parsing the "None of the..." string from stdout.
    Difficult? It's more complicated than adding a switch, but with grep it's easy.
    It's actually just one more command.

  12. #102
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Uhm... How would you do that? Even with grep (or "find" in win32)...
    I process the files this way:
    Code:
    echo.>y
    dir /s /b *.* > ..\list.txt
    for /f "usebackq delims=" %i in (..\list.txt) do (
    precomp -slow -o"%i.pcf" "%i"<y>out.txt
    project1 out.txt
    )
    Where project1 is my application that parses out.txt, finds out if precomp was successful and deletes the respective file.

  13. #103
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by nanoflooder View Post
    Uhm... How would you do that? Even with grep (or "find" in win32)...
    I process the files this way:
    Code:
    echo.>y
    dir /s /b *.* > ..\list.txt
    for /f "usebackq delims=" %i in (..\list.txt) do (
    precomp -slow -o"%i.pcf" "%i"<y>out.txt
    project1 out.txt
    )
    Where project1 is my application that parses out.txt, finds out if precomp was successful and deletes the respective file.
    Sorry, not 1 line.
    I meant:
    Code:
    grep "no gain" out.txt>did_smth.txt
    :: Do the rest as you would with a special switch, if did_smth.txt exists...
    But this is more complicated because > creates an empty file.

    Code:
    grep "no gain" out.txt>did_smth.txt
    FOR %%N IN (did_smth.txt) DO SET did_smth=%%~zN
    IF [%did_smth%]==[0] (
        ::did nothing
    ) ELSE (
        ::did something
    )
    Untested, might have some mistakes.
    Warning: it obviously doesn't work well if the file name contains "no gain".

  14. #104
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Quote Originally Posted by nanoflooder View Post
    Just another idea: it would be very useful to have a switch that tells Precomp to delete the .pcf if nothing has been decompressed, and the original file otherwise. It is quite difficult to automate this via .bat files by parsing the "None of the..." string from stdout.
    At the moment, I think Precomp doesn't set batch error levels yet, but they'll be implemented. With them, you can do something like this:

    Code:
        precomp [...]
        if errorlevel 2 goto hd_full
        if errorlevel 1 goto nothing_decompressed
        goto everything_right [not really needed]
    But I think I'll add a switch too, because it could be very useful for big files: you wouldn't have to wait for Precomp to write the PCF file when nothing could be decompressed. In this case, the switch could disable disk writing and increase speed (something like doubling it).
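    For readers on a POSIX shell, a hedged sketch of the same dispatch using process exit codes instead of batch errorlevels. The precomp call is stubbed here, and the codes (2 = disk full, 1 = nothing decompressed) just follow the batch example above:

    ```shell
    #!/bin/bash
    # Stub standing in for the real precomp binary; it only simulates
    # the "nothing decompressed" exit code for this sketch.
    precomp() { return 1; }

    handle_result() {
      precomp "$1"
      case $? in
        2) echo "hd full" ;;
        1) echo "nothing decompressed" ;;
        0) echo "everything right" ;;
      esac
    }

    handle_result somefile    # prints "nothing decompressed" with the stub
    ```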
    Last edited by schnaader; 11th January 2009 at 23:43.
    http://schnaader.info
    Damn kids. They're all alike.

  15. #105
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by m^2 View Post
    Sorry, not 1 line.
    I meant:
    Code:
    grep "no gain" out.txt>did_smth.txt
    :: Do the rest as you would with a special switch, if did_smth.txt exists...
    But this is more complicated because > creates an empty file.

    Code:
    grep "no gain" out.txt>did_smth.txt
    FOR %%N IN (did_smth.txt) DO SET did_smth=%%~zN
    IF [%did_smth%]==[0] (
        ::did nothing
    ) ELSE (
        ::did something
    )
    Untested, might have some mistakes.
    Warning: it obviously doesn't work well if the file name contains "no gain".
    Thanks, I'll try to make a working script

    Quote Originally Posted by schnaader View Post
    At the moment, I think Precomp doesn't set batch error levels yet, but they'll be implemented. With them, you can do something like this:

    Code:
        precomp [...]
        if errorlevel 2 goto hd_full
        if errorlevel 1 goto nothing_decompressed
        goto everything_right [not really needed]
    But I think I'll add a switch too, because it could be very useful for big files: you wouldn't have to wait for Precomp to write the PCF file when nothing could be decompressed. In this case, the switch could disable disk writing and increase speed (something like doubling it).
    Can't wait for the release

  16. #106
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    I hate this crap called bat files.

    Could someone please help me out?
    Code:
    echo.>y
    dir /s /b testset\*.*>..\list.txt
    for /f "usebackq delims=" %%i in (list.txt) do (
    	precomp -slow -o"%%i.pcf" "%%i"<y>out.txt
    	grep "None of the" out.txt>grep.txt
    	for %%n in (grep.txt) do set did=%%~zn
    	if "%did%"=="0" (
    		echo "%%i" can be decompressed
    		pause
    	) else (
    		echo "%%i" cannot be decompressed
    		pause
    	)
    )
    This code should work, but cmd first expands all the variables inside the block to their values and only then executes it, screwing everything up. Like,
    Code:
    D:\>(
    precomp -slow -o"D:\testset\a.pdf.pcf" "D:\testset\a.pdf" 0<y 1>out.txt
    grep "None of the" out.txt 1>grep.txt
    for xn in (grep.txt) do set did=%~zn
    if "63" == "0" (
    echo "D:\testset\a.pdf" can be decompressed
    pause
    ) else (
    echo "D:\testset\a.pdf" cannot be decompressed
    pause
    )
    )
    D:\>set did=0
    "D:\testset\a.pdf" cannot be decompressed
    So it first generates the code that compares "63" with "0" and only then sets the new value of %did%.

    BTW, I think all those who use Windows on a daily basis should sign a petition for including bash as the default shell. I saw their new "PowerShell" one day, and it didn't cheer me up at all.
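    For comparison - and since bash came up - here is a hedged sketch of the same loop in a POSIX shell. The paths are hypothetical, and it assumes Precomp prints "None of the" when no stream could be decompressed, as in the batch version:

    ```shell
    #!/bin/bash
    # Sketch only: 'precomp' and the directory layout are assumptions.
    # Success (exit 0) if the log lacks the "None of the" marker,
    # i.e. at least one stream was decompressed.
    found_streams() {
      ! grep -q "None of the" "$1"
    }

    # Precompress every file under a test set; keep the .pcf on success,
    # delete it otherwise - the switch asked for above, done by hand.
    process_testset() {
      find "$1" -type f | while read -r f; do
        precomp -slow -o"$f.pcf" "$f" > out.txt
        if found_streams out.txt; then
          echo "\"$f\" can be decompressed"
        else
          echo "\"$f\" cannot be decompressed"
          rm -f "$f.pcf"
        fi
      done
    }
    ```

    Using grep's exit status directly avoids the empty-file-size dance entirely, and there is no delayed-expansion trap.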

  17. #107
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Code:
    echo.>y
    dir /s /b _\*.*>list.txt
    for /f "usebackq delims=" %%i in (list.txt) do call :loop1 %%i
    goto :eof
    
    :loop1
    precomp -slow -o"%1.pcf" "%1"<y>out.txt
    grep "None of the" out.txt>grep.txt
    for %%n in (grep.txt) do set did=%%~zn
    if "%did%"=="0" (
        echo "%1" :]
        pause
    ) else (
        echo "%1" :[
        pause
    )
    goto :eof
    There is some inconsistency in your code: once you use ..\list.txt and once without the ..\

    Indeed, batch is the worst practical language I know. Unless you use one magic command:
    Code:
    perl script.pl

  18. #108
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Thanks for the advice with labels
    Quote Originally Posted by m^2
    There is some inconsistency in your code. Once you use ..\list.txt and once w/out ..\
    list.txt contains the list of files to be processed (I made the respective command shorter here than in real life), while out.txt is a temporary output file for each Precomp run.

  19. #109
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by nanoflooder View Post
    Thanks for the advice with labels

    list.txt contains the list of files to be processed (I made the respective command shorter here than in real life), while out.txt is a temporary output file for each Precomp run.
    Yes, I see. But in the code you posted, you create ..\list.txt and then parse list.txt without a cd in between.
    It was a side note - I guess it's just the result of merging two pieces of code - but as you posted it, the error is that list.txt doesn't exist.

  20. #110
    Member
    Join Date
    Aug 2008
    Location
    Saint Petersburg, Russia
    Posts
    215
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by m^2 View Post
    but as you posted - the error is that list.txt doesn't exist.
    Ah, yes, indeed - I had been looking at another line of code
    Quote Originally Posted by m^2 View Post
    I guess that's just merging 2 pieces of code or so
    Yep, it's a stripped-down version with only the important part

  21. #111
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    349
    Thanks
    37
    Thanked 37 Times in 22 Posts
    Any NewS..

  22. #112
    Tester
    Black_Fox's Avatar
    Join Date
    May 2008
    Location
    [CZE] Czechia
    Posts
    471
    Thanks
    26
    Thanked 9 Times in 8 Posts
    I tried recompressing my photos with v0.3.8 (funny, ReactOS of the same version came out recently) for backup purposes, but packJPG fails on them. Isolating the crashing part was very easy... for once! First decompress the RAR before precomping.
    There were only JPG pictures and MOV files (consisting of MJPG, so just more JPG pictures) in the huge archive. BTW, I didn't recall at the time whether the latest Precomp can handle > 2 GB files, so I first 7z-stored everything, then tarred it into 2 GB parts, then, after packJPG's failure, tarred it into smaller parts and cut it down to get as small a file as possible... just to prevent questions.
    Attached Files
    Last edited by Black_Fox; 14th February 2009 at 23:35. Reason: typos
    I am... Black_Fox... my discontinued benchmark
    "No one involved in computers would ever say that a certain amount of memory is enough for all time. I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again." -- Bill Gates

  23. #113
    Member
    Join Date
    Oct 2007
    Location
    Germany, Hamburg
    Posts
    409
    Thanks
    0
    Thanked 5 Times in 5 Posts
    Yes, that's also most likely the known packJPG bug with any sort of very huge (wrong) picture data.
    But sadly, Matthias Stirner doesn't seem to be very active in developing packJPG at the moment. I think schnaader will work around the bug with better JPG stream detection, or more simply with a size limit. But a size limit is no nice workaround when high-quality pictures keep growing and growing.

  24. #114
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    918
    Thanks
    57
    Thanked 113 Times in 90 Posts
    I had several JPEGs that packjpg.exe didn't want to process - something about non-optimal data.

    I ran jpegoptim.exe --strip-all on the files and afterwards it worked perfectly.

  25. #115
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    It seems the error that Black_Fox describes is some sort of strange memory error when processing JPG files. If I rename his file to test.7z, it works. If I copy precomp.exe to a different directory and call it from there, it sometimes works. But I also reproduced the error several times. It even works for me if I add some switches (-v or -t+j).
    lprepaq v1.1 had similar strange crashes because of a memory error in the switch parsing routines. The bad thing is that this sort of error is very hard to find (and as it seems to happen together with JPGs, it could even be hidden in the PackJPG DLL). Anyway, thanks for the bug report, I'll try to investigate this further.
    http://schnaader.info
    Damn kids. They're all alike.

  26. #116
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    By the way:

    Quote Originally Posted by maadjordan View Post
    Any NewS..
    Recursion (multi-pass) is slowly moving forward. Precompression with it already works quite well; now I'll have to adjust the recompression routines.
    http://schnaader.info
    Damn kids. They're all alike.

  27. #117
    Member
    Join Date
    May 2008
    Location
    Kuwait
    Posts
    349
    Thanks
    37
    Thanked 37 Times in 22 Posts
    I found a problem identifying the JPEG stream in the Mac PICT image format.. here is a data strip:
    http://rapidshare.com/files/205931108/001.rar.html

    I found this: "Only two non-proprietary formats, other than JFIF, currently support JPEG-encoded data. The latest version of the Macintosh PICT format prepends a PICT header to a JFIF file stream. Strip off the PICT header (everything before the SOI marker) and any trailing data (everything after the EOI marker) and you have the equivalent of a JFIF file. The other format, TIFF 6.0, also supports JPEG and is discussed in depth in the article on TIFF." here http://www.fileformat.info/format/jpeg/egff.htm
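    The recipe from that quote - strip everything before SOI and after EOI - can be sketched in a POSIX shell. This is a hedged sketch: the file names are hypothetical, it assumes a single embedded JFIF stream, and it needs GNU grep for -b byte offsets:

    ```shell
    #!/bin/bash
    export LC_ALL=C    # byte-wise matching, no UTF-8 interpretation

    # Cut a PICT file down to its embedded JFIF stream by locating the
    # first JPEG SOI marker (FF D8 FF) and the last EOI marker (FF D9).
    extract_jfif() {   # usage: extract_jfif input.pict output.jpg
      local soi eoi
      soi=$(grep -aobm1 $'\xff\xd8\xff' "$1" | cut -d: -f1)
      eoi=$(grep -aob $'\xff\xd9' "$1" | tail -n1 | cut -d: -f1)
      [ -n "$soi" ] && [ -n "$eoi" ] || return 1
      # keep bytes soi..eoi+1 inclusive (tail -c +N is 1-based)
      tail -c +"$((soi + 1))" "$1" | head -c "$((eoi + 2 - soi))" > "$2"
    }
    ```

    A real detector would also validate the markers in context, since FF D9 can occur by chance inside compressed data.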

