Page 2 of 3 (Results 31 to 60 of 72)

Thread: Precomp source code on GitHub

  1. #31
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts


    Quote Originally Posted by schnaader View Post
Sounds like you're using MinGW GCC as the compiler; try adding the options "-static -static-libgcc -static-libstdc++", which will remove the need for the DLLs.



    Try the attached file. It's the latest version (commit 93c1988).



I wondered about that too and will ask Matthias Stirner. My guess is that the MP3s are perfectly valid, but some unusual change in the MP3 frame format happens every now and then that packMP3 can't process correctly at the moment, so the file has to be divided into parts where the frame format stays the same. Also note that your example file is divided into parts without any gaps (position of first part = 2197, corrected length of first part = 398106, 2197 + 398106 = 400303 = position of second part, and so on), so there isn't even any garbage or invalid data between the parts. If my guess is right, packMP3 might get changed so it can process these files, too.
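The contiguity arithmetic above is easy to check mechanically. A quick sketch (illustrative only; the second part's length below is made up, since only its position follows from the quoted numbers):

```python
# Check that consecutive MP3 parts are contiguous (no garbage between them).
def check_contiguous(parts):
    """parts: list of (position, corrected_length) tuples, sorted by position."""
    return all(pos + length == next_pos
               for (pos, length), (next_pos, _) in zip(parts, parts[1:]))

# First pair from the example above; the second part's length (10000) is invented.
parts = [(2197, 398106), (400303, 10000)]
print(check_contiguous(parts))  # True, because 2197 + 398106 == 400303
```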

    Thanks for the compilation hint and for the exe too.

    Some comments on this commit and in general:

1) Most errors are gone. Well done!
2) Some others are still there. See attached. You'll find two examples, checked against both mp3checker and mp3val (logs included in a subfolder for your convenience). I also ran MP3diags on them; it didn't find any errors either.

    Preview:
    Code:
    (9.42%) Possible MP3 found at position 284547, length 2736797
    packMP3 error: corrupted file, compression not possible
    No matches
3) About SWF recompression: unless I'm missing something, precomp should be able to recognize from the header whether an SWF file is compressed or not. Well, maybe your parser is the one missing something. See this:

    Code:
    shar as ARCH *.swf
    ...
    Recompressed streams: 733/844
    JPG streams: 322/376
    MP3 streams: 89/89
    SWF streams: 5/5
    zLib streams (intense mode): 317/374
There are 18 files inside a Shelwien archive (tar-like, no compression).

    If you want I can upload the actual files.
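For what it's worth, compression really is visible right in the SWF header: a 3-byte signature ('FWS' = uncompressed, 'CWS' = zlib, 'ZWS' = LZMA since SWF 13), a version byte, and a 4-byte little-endian uncompressed length. A minimal sketch of the check (illustrative, not precomp's code):

```python
# Classify an SWF file by its header (per the public SWF spec):
#   signature 'FWS' = uncompressed, 'CWS' = zlib, 'ZWS' = LZMA (SWF 13+),
#   then a version byte, then the uncompressed length (4 bytes, little-endian).
import struct

def swf_info(header8):
    sig, version = header8[:3], header8[3]
    length = struct.unpack('<I', header8[4:8])[0]
    kind = {b'FWS': 'uncompressed', b'CWS': 'zlib', b'ZWS': 'lzma'}.get(sig)
    return kind, version, length

print(swf_info(b'CWS\x0a\x00\x10\x00\x00'))  # ('zlib', 10, 4096)
```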
    Attached Files

  2. #32
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Quote Originally Posted by Gonzalo View Post
2) Some others are still there. See attached. You'll find two examples, checked against both mp3checker and mp3val (logs included in a subfolder for your convenience). I also ran MP3diags on them; it didn't find any errors either.
These are two interesting files, thanks. Somehow packMP3 seems to detect bad frames, but there are many possible reasons, so I'll write to Matthias Stirner first and won't spend much time investigating myself for now.

    Quote Originally Posted by Gonzalo View Post
    Code:
    Recompressed streams: 733/844
    JPG streams: 322/376
    MP3 streams: 89/89
    SWF streams: 5/5
    zLib streams (intense mode): 317/374
Most likely, 5 of the files are compressed and the other 13 aren't. Note that uncompressed SWF files won't show up in the statistics, in case you expected to see something like 5/18 there. MP3 and JPG streams are the usual candidates in SWF files, but the intense-mode zLib streams are interesting indeed. And looking at the specs, there are several data types in SWF that are compressed with ZLIB:

    • JPGs with alpha channel (page 139, "DefineBitsJPEG3")
    • Lossless compressed bitmap data (pages 139 + 140, "DefineBitsLossless")
    • Lossless compressed bitmap data with alpha channel (page 142, "DefineBitsLossless2")
    • Alpha data for JPG/PNG/GIF images (page 143, "DefineBitsJPEG4")
    • Video data (page 208 ff., "Screen Video bitstream format", page 212 ff. "Screen Video V2 bitstream format")


I hadn't heard of these so far, and I think most of them (like the video data) are rare, but adding them to the parser is useful. Created issue #30 for this.
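Conceptually, a brute-force zlib scan (which is what intense mode amounts to) only needs a plausible two-byte zlib header before trying to inflate. A rough sketch of the idea, not precomp's actual implementation:

```python
# Conceptual sketch of an "intense mode" style zlib scan (precomp's real
# implementation differs): at every offset, look for a plausible zlib
# header (method 8, CMF/FLG checksum divisible by 31), then try to inflate.
import zlib

def find_zlib_streams(data):
    hits = []
    for i in range(len(data) - 1):
        cmf, flg = data[i], data[i + 1]
        if cmf & 0x0F != 8 or ((cmf << 8) | flg) % 31 != 0:
            continue  # not a valid zlib header, skip cheaply
        try:
            out = zlib.decompressobj().decompress(data[i:])
            if out:
                hits.append((i, len(out)))  # (offset, decompressed size)
        except zlib.error:
            pass  # looked like a header, but didn't inflate
    return hits  # note: O(n^2) worst case - a demo, not production code

blob = b'junk' + zlib.compress(b'payload' * 10) + b'tail'
print(find_zlib_streams(blob))  # includes (4, 70)
```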

    Please upload the files, I'll have a look at it.
    http://schnaader.info
    Damn kids. They're all alike.

  3. #33
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
    Quote Originally Posted by schnaader View Post
    Please upload the files, I'll have a look at it.
    SWF.nz

    This is all I have I think. Although maybe just one is enough.

  4. #34
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Quick progress report:

These were actually two issues. One of them was a check requiring the version number of compressed SWF files to be between 0 and 10 (exclusive). Most of your files have version 10, which is why they weren't detected as compressed SWF. With the latest commit, that issue (#31) is solved by simply ignoring the version number.

Luckily, most of your files also contain ZLIB compressed data of the kinds I mentioned above, so they're helpful for that issue (#30). Along the way, I found a nice tool called SWF Investigator from Adobe that shows the tags used in an SWF file and makes analyzing files easier. Your files contain examples of DefineBitsJPEG3, DefineBitsLossless2 and DefineVideoStream, so thanks! I don't know when I'll have time to work on that issue, so the best advice for now is to use intense mode on SWF files.
    http://schnaader.info
    Damn kids. They're all alike.

  5. #35
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
Well, happy to see this is really moving forward! Next time I'll try to save you the research and give more details.
I also have a few other files that make precomp freak out, but I'll wait until you have some time to fix the current issues first. Otherwise I'll drown you!

  6. #36
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
I can't compile it.
I used the flags, but the problem is the same.

Can anybody compile the last commit, please? (Win32)
Thanks in advance!!

  7. #37
    Member
    Join Date
    Jan 2014
    Location
    Bothell, Washington, USA
    Posts
    695
    Thanks
    153
    Thanked 183 Times in 108 Posts
    Quote Originally Posted by Gonzalo View Post
I can't compile it.
I used the flags, but the problem is the same.

Can anybody compile the last commit, please? (Win32)
Thanks in advance!!
    I typed make from a 32-bit command window and it seemed to work fine.
    Attached Files

  8. Thanks (2):

    Gonzalo (24th March 2016),Stephan Busch (24th March 2016)

  9. #38
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
It seems to me the last commit has overridden the MP3-related bug fixes... See these 3 different runs over the same file with the last 3 versions of Precomp:

    Older one: Only 14 SWFs, many MP3 false positives.

    Code:
    Recompressed streams: 8552/36212
    PDF streams: 6197/6244
    PDF image streams (8-bit): 435/479
    PDF image streams (24-bit): 90/90
    PNG streams: 40/55
    PNG streams (multi): 0/4
    GIF streams: 5/5
    JPG streams: 1002/1073
    MP3 streams: 372/27493
    SWF streams: 14/14
    zLib streams (intense mode): 353/755
Second one: only 14 SWFs, but this time a more realistic MP3 stream count:

    Code:
    Recompressed streams: 8552/9140
    PDF streams: 6197/6244
    PDF image streams (8-bit): 435/479
    PDF image streams (24-bit): 90/90
    PNG streams: 40/55
    PNG streams (multi): 0/4
    GIF streams: 5/5
    JPG streams: 1002/1073
    MP3 streams: 372/421
    SWF streams: 14/14
    zLib streams (intense mode): 353/755
Last commit binary: more SWFs detected, but the same number of mistreated MP3 chunks as before the bug fix...

    Code:
    Recompressed streams: 8552/36212
    PDF streams: 6197/6244
    PDF image streams (8-bit): 435/479
    PDF image streams (24-bit): 90/90
    PNG streams: 40/55
    PNG streams (multi): 0/4
    GIF streams: 5/5
    JPG streams: 1002/1073
    MP3 streams: 372/27493
    SWF streams: 29/29
    zLib streams (intense mode): 338/740
Note that the total number of correctly processed streams is the same in all 3 cases; only the runs with that many false positives are much, much slower.

Thanks in advance for looking into it.

Once it's fixed, I'll rerun my tests, since I have a list of several other cases where precomp crashes outright or struggles with non-existent "defective" frames.

  10. #39
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Quote Originally Posted by Gonzalo View Post
It seems to me the last commit has overridden the MP3-related bug fixes...
Hm... I can't reproduce that - can you check if it also happens for you with the "iwta-beating-stress.mp3" test file? That one still runs fast for me (about 5 seconds, 7/7 MP3 streams), with both the version that Kennon Conrad uploaded and a compile of the latest commit from 3 days ago (commit 68801a2).

I also checked the code changes from the last 7 commits ("Fix for MP3 slowdown" ... "Final merge with packJPG 2.5k"); nothing suspicious there. I attached a binary for the latest commit so you can try that one, too.
    Attached Files
    http://schnaader.info
    Damn kids. They're all alike.

  11. #40
    Member Dimitri's Avatar
    Join Date
    Nov 2015
    Location
    Greece
    Posts
    48
    Thanks
    21
    Thanked 30 Times in 14 Posts

Precomp fails on large prepacked music data

Hello Mr. Schnaader, it's been a long time since I tried precomp.

It has come to my attention that precomp gets stuck on large prepacked data with many, many MP3s inside.

It seems that precomp succeeds on one of the three MP3 packed data files.

On the others, it simply gets stuck at 83% without moving for hours.

Here is a photo just in case!!

Edit: forgot to mention that the precomp used in this photo is the one above!!!
    Attached Thumbnails: Screenshot (2).jpg (459.8 KB)

  12. #41
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
@Dimitri: Use verbose mode (the -v option) to see what's actually happening. Also check CPU usage in the Task Manager; it should be running one thread at full load (25%, for example, if you have a quad-core).

    @Christian: iwta-beating-stress.mp3 is processed just fine with all versions since the bug fix (7/7).
But this is another thing completely. Sadly, I have to say the last binary you provided has the same behaviour as the others. That would be OK if I hadn't seen another version ignore all those bad segments. In the log you will see there are some V2 Layer III files which obviously won't be compressed, plus various other inconsistent streams.
What is driving me crazy is that one version seems not to be aware of that (I think it's
precomp_v045_dev_93c1988, will check right now)

    --------------------------------

    Edit: I found it. Here is the first difference:

    precomp_v045_dev_93c1988 says this:

    Code:
    (2.94%) Possible MP3 found at position 28841523, length 2865894
    Best match: 2865894 bytes, recompressed to 2492483 bytes
    (3.24%) Possible zLib-Stream (intense mode) found at p........


    Last binary attached says this:

    Code:
    (2.94%) Possible MP3 found at position 28841523, length 2865894
    Best match: 2865894 bytes, recompressed to 2492483 bytes
    (3.23%) Unsupported MP3 type found at position 31707812, length 2106624
    Type: MPEG-2 LAYER III


And so it goes on, finding all the other invalid streams, mostly less than a KB in length.
Could it be a bug in the parser, or is precomp now just more sensitive, finding more chunks?

    --------------------------------

Edit 2: I ran a clean -t+3-only instance of both the old and new precomp.exe to make the logs less noisy and help diagnosis. They are attached. As you will see, the same number of chunks is listed at the end of both processes. So I guess the thousands of false positives we are chasing are really below the first recursion depth. But there are no ZIPs in the big file, only JPG, PNG, PDF, SWF, FLV, HTML, XML, MP3 and a few small *.js;*.db... As I see it, there shouldn't be any MP3 compressed with zlib here.

The file itself is pretty big, ~1 GB, but I can upload it if needed.
A second, more practical option is to give you a couple of smaller files I have. They cause similar problems and are far less obscure to analyse than a 1 GB archive filled with hundreds of files. Once you discover what happens to precomp with them, this other issue will likely disappear... As you wish...
    Attached Files
    Last edited by Gonzalo; 9th April 2016 at 05:51. Reason: Logs

  13. #42
    Member Dimitri's Avatar
    Join Date
    Nov 2015
    Location
    Greece
    Posts
    48
    Thanks
    21
    Thanked 30 Times in 14 Posts
There, as requested, with verbose mode.
    Attached Thumbnails: Screenshot (1).jpg (437.7 KB)

  14. #43
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
Ok, it could be one of two things:

1) You have damaged files
2) There's a bug in precomp

Actually, it can be both... BUT be patient, and your file will eventually be processed to 100%.
In the meantime, you could wait until there are no bugs at all in MP3 processing; keep in mind this is the most recent addition to precomp.
If you don't want to wait an eternity for the task to complete, I suggest processing every file separately, not all together. You can run something like this:
    Code:
    for /r %m in (*.mp3) do precomp -cn "%m"
When you see a file causing trouble, kill the process and remember which one it is; you might want to keep it, just in case it can be used for bug hunting. (Note: if you put that loop in a .bat file, use %%m instead of %m.)

    Happy packing

  15. #44
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    OK, just a short post for now, don't have much time this weekend.

    Quote Originally Posted by Dimitri View Post
It seems that precomp succeeds on one of the three MP3 packed data files.
On the others, it simply gets stuck at 83% without moving for hours.
Here is a photo just in case!!
    Thanks for the verbose mode output. Seems there are some more things in the frame header that packMP3 checks, so I'll have to handle all of them. Opened issue #34 for this.

It's very likely that there will be more of these slowdown issues even after that one is fixed, so keep reporting them!

    Quote Originally Posted by Gonzalo View Post
Could it be a bug in the parser, or is precomp now just more sensitive, finding more chunks?
The new version is indeed more sensitive, because I don't ignore unsupported MP3 types anymore. Older versions checked the first frame and, if it was an unsupported type, didn't continue parsing. The new version continues parsing the frames in this case so it can report the length of the stream. Once it has done this, it skips parsing for this type and length to get fast again.

In this case, it seems that there are very short streams of unsupported types that slow things down a bit and bloat the log. I have an idea how to improve this and will try to do so in the next commit (should be available in a few days).
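For reference, the "type" being reported comes down to two bit fields in the 4-byte MPEG frame header (11 sync bits, 2 version-ID bits, 2 layer bits); packMP3 supports only MPEG-1 Layer III. An illustrative sketch of the classification (not precomp's actual code):

```python
# Decode the version/layer fields of a 4-byte MPEG audio frame header.
# packMP3 handles only MPEG-1 Layer III; everything else is "unsupported".
VERSIONS = {0b00: 'MPEG-2.5', 0b10: 'MPEG-2', 0b11: 'MPEG-1'}
LAYERS = {0b01: 'LAYER III', 0b10: 'LAYER II', 0b11: 'LAYER I'}

def frame_type(header4):
    h = int.from_bytes(header4, 'big')
    if h >> 21 != 0x7FF:                   # the 11 sync bits must all be set
        return None
    version = VERSIONS.get((h >> 19) & 3)  # 2 version-ID bits
    layer = LAYERS.get((h >> 17) & 3)      # 2 layer bits
    if version is None or layer is None:
        return None                        # reserved bit patterns
    return f'{version} {layer}'

print(frame_type(b'\xff\xfb\x90\x00'))  # 'MPEG-1 LAYER III'
print(frame_type(b'\xff\xf3\x90\x00'))  # 'MPEG-2 LAYER III'
```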

    Quote Originally Posted by Gonzalo View Post
I ran a clean -t+3-only instance of both the old and new precomp.exe to make the logs less noisy and help diagnosis. They are attached. As you will see, the same number of chunks is listed at the end of both processes. So I guess the thousands of false positives we are chasing are really below the first recursion depth.
Recursion is a special case where MP3 parsing could still be buggy. I checked with a small test file that there is no crash, but the test file didn't have any MP3 streams, so it's likely that the thing I described above (reporting unsupported types and skipping parsing for a while) goes wrong in recursion.

Perhaps you could split your ~1 GB file into smaller chunks (something between 10 and 100 MB) and upload one of these. Downloading the whole file wouldn't be a problem for me either, though, so choose your favorite
    http://schnaader.info
    Damn kids. They're all alike.

  16. #45
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
    Quote Originally Posted by schnaader View Post
Recursion is a special case where MP3 parsing could still be buggy. I checked with a small test file that there is no crash, but the test file didn't have any MP3 streams, so it's likely that the thing I described above (reporting unsupported types and skipping parsing for a while) goes wrong in recursion.

Perhaps you could split your ~1 GB file into smaller chunks (something between 10 and 100 MB) and upload one of these. Downloading the whole file wouldn't be a problem for me either, though, so choose your favorite
    Here is your guy: https://drive.google.com/open?id=0Bz...Et3T0tyYnU3b3c

This is part 13 of 19, only 50 MB. The real slowdown is right here: 10x slower than its counterparts (~30 min vs ~3 min). Or, in other words, 5% of the total size takes 50% of the total time to process.

    Pay special attention starting with line 1692 in the log...
    Attached Files
    Last edited by Gonzalo; 10th April 2016 at 00:57. Reason: Re-uploaded

  17. Thanks:

    schnaader (10th April 2016)

  18. #46
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Thanks for the file, I can now reproduce the problem.

    The slowdown happens because of another packMP3 parsing error ("region size out of bounds"), opened issue #35 for that.

    In commit c8dc5a0 (also see attachment), I relaxed the parsing restrictions for unsupported MP3 streams a bit. This could potentially lead to some more false positives, but most of the time, streams will be longer and the log will be less bloated. For example, for the file you uploaded:

    Code:
    --- Old version - 115 log entries
    
    (13.45%) Unsupported MP3 type found at position 13440, length 3460
    Type: MPEG-1 LAYER I
    [...]
    (13.53%) Unsupported MP3 type found at position 1505932, length 3508
    Type: MPEG-1 LAYER I
    
    --- New version - 1 log entry
    
    (13.44%) Unsupported MP3 type found at position 4, length 1512280
    Type: MPEG-1 LAYER I
    This makes the log 508 lines shorter. Not that much, as it's still 87626 lines long, but a step in the right direction

What surprised me was the time difference. Did you run this on a slow machine? Also, which OS did you use? (On Linux, Precomp sometimes seemed slower for me in the past.)

    Code:
    --- Your log:
    
    New size: 106239630 instead of 52428800
    
    Done.
    Time: 31 minute(s), 48 second(s)
    
    Recompressed streams: 896/28085
    
    --- My log (checked with the "old" version, without the commit described above):
    
    New size: 106239630 instead of 52428800
    
    Done.
    Time: 3 minute(s), 30 second(s)
    
    Recompressed streams: 896/28085
    Attached Files
    Last edited by schnaader; 11th April 2016 at 14:47. Reason: Attached compiled version
    http://schnaader.info
    Damn kids. They're all alike.

  19. #47
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
Thanks for the efforts, and for the binary too.
I made a second run over the file, making sure no other resource-hungry process was running. I also made precomp run at realtime priority. Now the timing is 18 min 0 sec; the last commit shows practically no difference.
My machine runs Win7 on an Intel Atom CPU (1.6 GHz), an average computer you could say. I'm not a hardware expert, but I guess the huge difference in speed is due to hard drive specs. Remember, this kind of false positive leads to an exaggerated use of temp files, and all of those temps are mostly identical. So I'm wondering if better cache management could explain the speed-up on your machine (or the slowdown on mine). I'll try some RAM-drive software to see if this is true. I have Linux installed on another partition of the same machine, just in case.

    --------------------------------

Edit: I saw the modification you made in the last commit. It seems a clean solution, because you white-listed the only supported type and left the others out. That's why I don't understand the attached result... This time the problem comes from "Type: MPEG-2 LAYER III". Maybe a modification is needed in the detection stage itself? Anyway, you can just ignore this as long as it's only a matter of bloated logs; the speed seems affected, but not that much.

Now, if you want to check this out, I'm uploading a set of the 10 most problematic files regarding this "MPEG-2 LAYER III" issue.

    --------------------------------

Edit 2: After iterating through 1999 files, I found where the real troublemaker is... See below
    Attached Files
    Last edited by Gonzalo; 12th April 2016 at 01:48. Reason: New results

  20. #48
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
Note: the makefile is configured to build bzip2, zlib and the others with the -O2 optimization switch. Is that on purpose, or should it be changed to -O3 to speed up Precomp?

  21. #49
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Quote Originally Posted by Gonzalo View Post
I made a second run over the file, making sure no other resource-hungry process was running. I also made precomp run at realtime priority. Now the timing is 18 min 0 sec; the last commit shows practically no difference.
My machine runs Win7 on an Intel Atom CPU (1.6 GHz), an average computer you could say. I'm not a hardware expert, but I guess the huge difference in speed is due to hard drive specs.
OK, 18 minutes sounds more reasonable. That's a factor of 5. My CPU runs at 2.4 GHz, and disk thrashing explains the remaining difference.

It's possible to pass streams to packJPG/packMP3 in memory instead of using files; I'll try this next for "small" streams (up to 16 MB or so). This won't solve the actual issues (better parsing on Precomp's side), but it should reduce the slowdown and disk thrashing.

    Quote Originally Posted by Gonzalo View Post
This time the problem comes from "Type: MPEG-2 LAYER III". Maybe a modification is needed in the detection stage itself? Anyway, you can just ignore this as long as it's only a matter of bloated logs; the speed seems affected, but not that much.
This can't be improved easily; some of my unsupported MP3 test files show the same behaviour. Note that there are gaps between frames. For example here, there's a frame from 11537951...11548400, then a 208-byte gap, then the next frame at 11548608...11558640 (and so on):

    Code:
    (1.91%) Unsupported MP3 type found at position 11537951, length 10449
    Type: MPEG-2 LAYER III
    (1.91%) Unsupported MP3 type found at position 11548608, length 10032
    Type: MPEG-2 LAYER III
I'm not sure why this happens, but if I somehow "merged" these frames (allowing gaps up to 1024 bytes, for example), I'd most likely also merge cases where multiple files are concatenated with small gaps. As you said, there's no slowdown here, just bloated logs, so I won't change anything for now.
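The trade-off can be illustrated with simple interval merging over (position, length) pairs: a gap tolerance joins the two frames from the log above, but would equally join distinct concatenated files. A sketch, not precomp's code:

```python
# Merge detected streams whose gap to the previous stream is at most max_gap.
def merge_streams(streams, max_gap=1024):
    if not streams:
        return []
    merged = [list(streams[0])]
    for pos, length in streams[1:]:
        last = merged[-1]
        gap = pos - (last[0] + last[1])  # bytes between the two streams
        if 0 <= gap <= max_gap:
            last[1] = pos + length - last[0]  # extend the previous stream
        else:
            merged.append([pos, length])
    return [tuple(s) for s in merged]

# The two frames from the log: a 208-byte gap, so they get merged into one.
print(merge_streams([(11537951, 10449), (11548608, 10032)]))
# -> [(11537951, 20689)]
```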

    Quote Originally Posted by Gonzalo View Post
    Note: MAKE is configured to build bz, zlib and others with -O2 optimization switch. Is that on purpose? Or must be changed to -O3 in order to speed-up Precomp?
I had problems with -O3 and zlib in earlier versions of Precomp (somewhere around 0.4.2, IIRC) that led to random crashes. That might be gone with newer compiler and zlib versions, but it has to be tested thoroughly. I might check this after releasing 0.4.5.
    http://schnaader.info
    Damn kids. They're all alike.

  22. #50
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
    Quote Originally Posted by schnaader View Post
It's possible to pass streams to packJPG/packMP3 in memory instead of using files; I'll try this next for "small" streams (up to 16 MB or so).
Yes, this is great news!!! Everything that can be done in memory should be done in memory, I believe. I can't wait to see this implemented.

    Quote Originally Posted by schnaader View Post
This can't be improved easily; some of my unsupported MP3 test files show the same behaviour. Note that there are gaps between frames. For example here, there's a frame from 11537951...11548400, then a 208-byte gap, then the next frame at 11548608...11558640 (and so on):

    ...

I'm not sure why this happens, but if I somehow "merged" these frames (allowing gaps up to 1024 bytes, for example), I'd most likely also merge cases where multiple files are concatenated with small gaps.
Well, I don't know if this is a problem at all, because legit files tell you exactly how long they are, and on unsupported files precomp has nothing left to do, so treating them all as just one doesn't matter. A few days ago I deliberately corrupted an MP3 file to see how precomp handles it, and it didn't even throw a warning. I checked the decompression with fc, so I guess packMP3 is more resilient than we think.
But, as you said, maybe a better understanding of MP3 format internals could be the answer here. These numbers in the logs: are they produced by precomp itself, or by the packMP3 library?

  23. #51
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
I ran some numbers to figure out exactly what is to blame for the slowdown: HD thrashing, bad parsing, etc. I don't know if I wrote this down the right way; I hope you understand what I mean.
    Here are the results:


    Code:
    Hard Drive:              9.9 secs
    Hard Drive, -intense:  349.0 secs
    Ram Drive:               8.3 secs
    Ram Drive, -intense:   227.0 secs

    This means:
    Normal mode: 16% faster on Ram
    Intense mode: 35% faster on Ram


Now, what I really wanted to see is how precomp performs without all the false positives: how well it can do when all the parsing issues are gone. Since all valid MP3 streams are at recursion depth 0 and no deeper, the optimal result can be achieved this way:


    Code:
    precomp -cn GapFillDragAndDrop.swf.online
    
    precomp -cn -intense -t-3 GapFillDragAndDrop.swf.pcf
This combination's timing:


    Hard Drive: 28 secs
    Ram Drive: 20 secs


So, in the optimal scenario (all parsing issues fixed and all streams processed in memory): 20 seconds instead of 5 min 49 sec, 94% faster (on this particular file).
Even without getting rid of disk thrashing, 28 seconds alone is a 92% speed-up if you can fix the rest.
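The quoted percentages follow directly from the timings (5 min 49 sec = 349 s); a quick check:

```python
# Recompute the quoted speed-ups from the raw timings (in seconds).
def speedup(before, after):
    return round(100 * (before - after) / before)

print(speedup(9.9, 8.3))  # 16  (normal mode, HD vs RAM drive)
print(speedup(349, 227))  # 35  (intense mode, HD vs RAM drive)
print(speedup(349, 20))   # 94  (all parsing fixed, RAM drive)
print(speedup(349, 28))   # 92  (all parsing fixed, still on HD)
```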

  24. #52
    Member Dimitri's Avatar
    Join Date
    Nov 2015
    Location
    Greece
    Posts
    48
    Thanks
    21
    Thanked 30 Times in 14 Posts
Sorry for the late reply, I have been busy with stuff and work so I forgot to post :P

I unpacked that music.arc file, made it a tar, and ran precomp on it again, and it worked: all MP3 streams were found and compressed accordingly!!!

  25. #53
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
Oh boy, I'm about to lose my mind! I launch make.bat over and over again, and it always gives me this error:

    Code:
    precomp.cpp:60:19: fatal error: conio.h: No such file or directory
I copied conio.h to every folder named "include". I also put it in the same folder as precomp.cpp and changed the angle brackets to quotes, but the same error shows up time after time.
Can anyone save my day and give me some clues about what's happening? Thanks in advance!

    --------------------------------

BTW: I know there are binaries of this, but I changed the compilation options from -O2 to -O3 to compare performance and see if those old crashes still appear.

  26. #54
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
Sounds like your compiler configuration is broken. I assume you're using Windows and MinGW g++/gcc; if you're on Unix/Linux, ignore the following and add -DUNIX, but it would be strange to use make.bat instead of the Makefile in that case.

conio.h is not a standard C header, but most compilers targeting Windows should have it in their standard includes, without your having to copy anything anywhere. You can try the following:

    Code:
    g++ -print-search-dirs
This shows all the directories where g++ searches for include files. In my case, the beginning of the long output looks like this:

    Code:
    install: c:\program files (x86)\mingw-builds\x32-4.8.1-posix-dwarf-rev2\mingw32\bin\../lib/gcc/i686-w64-mingw32/4.8.1/
    [much more output...]
    So I searched for conio.h in "C:\Program Files (x86)\mingw-builds" and there it is (subdirectory x32-4.8.1-posix-dwarf-rev2\mingw32\i686-w64-mingw32\include). I didn't check the output for this path, but it should be in there. But for any path that is not in the output, you can use the -B switch:

    Code:
      -B <directory>           Add <directory> to the compiler's search paths
    So you can look up where conio.h is and add this switch to make.bat (don't forget to surround the path with quotes if there's a space in it).

    If all of this goes wrong, please run "g++ --version" and post the output here, so we know what compiler you're working with.
    http://schnaader.info
    Damn kids. They're all alike.

  27. #55
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
Here are binaries for commit 752c2a. There have been 4 commits since the last version: 2 minor ones, and 2 major ones that do all the MP3 stuff in memory for files up to 64 MB.

I also attached a version where both zlib and precomp are compiled with -O3 for testing. In a quick test (compressed and recompressed a file containing PNG, JPG, MP3 and SWF, using the built-in bZip2 compression) there was no significant speed difference, but no crash either. On the other hand, compiler warnings ("...may be used uninitialized here...") appeared in bzip2\compress.c, so maybe bZip2 should be tested more thoroughly.
    Attached Files
    http://schnaader.info
    Damn kids. They're all alike.

  28. Thanks:

    Gonzalo (19th April 2016)

  29. #56
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
    I can confirm there is at least one crash among my files with the optimized version. I still have to try the specific binary you posted, but for the previous commit, Linux builds compiled with -O3 crash on "kubuntu-15.10-desktop-amd64.iso".

    I can also confirm that precomp compiles just fine with -m64 -march=x86-64 on Linux, and it's also faster, up to 12% on my machine.

    The last commit performs up to 15% faster on MP3 files. The Visual Studio 2015 exe is also about 6-7% faster than the MinGW binaries on my machine, on MP3. I'll take a closer look in general when I have the time.
    Attached Files
    Last edited by Gonzalo; 19th April 2016 at 23:00. Reason: MSVS2015 executable

  30. Thanks:

    schnaader (20th April 2016)

  31. #57
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
    @all testers: Just informing...

    The last commit introduced in-memory JPEG processing instead of using temp files. The result? About a 17% speed improvement according to my tests. A long-awaited improvement!
    By the way, @schnaader: some headers are missing from the ZIP GitHub provides as the main download. Nothing to worry too much about, since they're in the past-versions folders anyway.
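
    The general pattern behind this kind of speedup can be sketched in a few lines (a toy illustration only, not precomp's actual code: zlib here is just a stand-in codec, where precomp really hands JPEGs to packJPG). Routing a stream through an in-memory buffer produces the same bytes as round-tripping it through a temp file, while skipping all the disk I/O:

    ```python
    import io
    import os
    import tempfile
    import zlib

    def compress_via_tempfile(data: bytes) -> bytes:
        # Old-style approach: round-trip the compressed stream through a
        # temporary file on disk, then read it back.
        fd, path = tempfile.mkstemp()
        try:
            with os.fdopen(fd, "wb") as f:
                f.write(zlib.compress(data))
            with open(path, "rb") as f:
                return f.read()
        finally:
            os.remove(path)

    def compress_in_memory(data: bytes) -> bytes:
        # In-memory approach: identical result, no disk I/O at all.
        buf = io.BytesIO()
        buf.write(zlib.compress(data))
        return buf.getvalue()

    payload = b"example stream payload " * 1000
    assert compress_via_tempfile(payload) == compress_in_memory(payload)
    ```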
    Attached Files

  32. #58
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Quote Originally Posted by Gonzalo View Post
    By the way, @schnaader: some headers are missing from the ZIP GitHub provides as the main download. Nothing to worry too much about, since they're in the past-versions folders anyway.
    Strange, it worked for me: I could just download the ZIP file and run make.bat, as well as build the project in VS2012. Perhaps this is some VS2015 issue?
    http://schnaader.info
    Damn kids. They're all alike.

  33. #59
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
    Maybe your version replaced them with its own copies on the PATH... I guess that's what gcc did for me on Linux, because it compiled just fine. Or maybe I just deleted them inadvertently. I'll have to check later. Right now I'm dying to get some sleep.

    PS: Any plans to process zlib streams in memory these days?

  34. #60
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    612
    Thanks
    250
    Thanked 240 Times in 119 Posts
    Quote Originally Posted by Gonzalo View Post
    PS: Any plans to process zlib streams in memory these days?
    I plan to put some work into the remaining MP3 issues (slowdown/false positives) and release 0.4.5 after that. MP3 recompression, the fixed packJPG crashes and in-memory processing of both MP3 and JPG are valuable enough changes to justify a 0.4.5 release on their own. After that, I'll start working on processing zlib streams in memory right away, and when that's finished, it will become version 0.4.6.
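
    For anyone curious what "processing zlib streams" means at its core, here is a minimal sketch of the general technique (using Python's zlib module purely for illustration; precomp's real implementation is C++ and handles far more parameter combinations): decompress the stream in memory, then search for compression settings that reproduce the original bytes exactly, so the raw data can be stored instead and the stream rebuilt losslessly later.

    ```python
    import zlib

    def try_recompress(stream: bytes):
        """Decompress a zlib stream fully in memory, then try to reproduce
        the original bytes by recompressing at each compression level.
        Returns (raw_data, level) on a byte-identical match, else None."""
        raw = zlib.decompress(stream)
        for level in range(1, 10):
            if zlib.compress(raw, level) == stream:
                return raw, level
        return None

    # Demo: a stream we made ourselves must be reproducible.
    data = b"some repetitive payload " * 100
    original = zlib.compress(data, 6)
    result = try_recompress(original)
    assert result is not None
    raw, level = result
    assert raw == data
    assert zlib.compress(raw, level) == original
    ```

    Storing `raw` (which a strong compressor handles much better than the already-deflated stream) plus the matching level is enough to restore the original file bit for bit.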
    http://schnaader.info
    Damn kids. They're all alike.

  35. Thanks:

    Gonzalo (24th April 2016)


