Please take a look at http://www.nosltd.com/ and post what you think. This installer is used with Acrobat Reader 9.0: compressed it is 33.4 MB, uncompressed 113 MB. With WinRAR it got to 44.9 MB.
I remember such an installer. It stored identical files as cross-references ("xrefs") in the archive. Also, it used modified UPX code, which means a GPL violation. Does anyone remember it?
Edit: spelling...
Adobe have used this (NetOpSystems FEAD, so I think they've just renamed it?) for years, and I even think we had a big discussion about it on the previous forum, or maybe it was another compression site or on Usenet, I forget. It's even a lossy compressor to some extent (depending on the switches used): as long as it all works once decompressed, it doesn't care what the data originally was.
It is mainly an EXE compressor, which is actually based on UPX (hence there have been a fair few arguments as to whether it violates the GPL) but with some enhancements. You can even patch UPX very easily to decompress the FEAD SFX installers.
Just google for NetOpSystems FEAD and you'll get stacks of info and experiments done with it.
WinRK v3.11 (normal): 39.8 MB
7-Zip: 40.6 MB
StuffIt 12: 52.1 MB
There was some discussion about it on the older forum; I found only a small topic, though.
Yes, I was talking about NetOpSystems FEAD.
Actually, there aren't any enhancements; they just changed some UPX constants (like the UPX signature etc.). They used UPX for packing the whole installer. The rest of the notable compression comes from putting references in place of exactly identical original files. That can't count as an executable-packer improvement for me, because an executable packer only compresses a single executable file. And the whole installer's compression was very poor on single files, IIRC.
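To make that concrete, the "references instead of identical files" idea can be sketched in a few lines (a minimal illustration assuming content-hash matching; the manifest format and the function name are my own, not FEAD's actual layout):

Code:
import hashlib
import os

def build_manifest(root):
    """For each file, record either its payload (first occurrence)
    or a reference to an earlier identical file - the 'xref' idea."""
    seen = {}       # content hash -> path of the first file with that content
    manifest = {}   # path -> ("store", hash) or ("ref", original path)
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen:
                manifest[path] = ("ref", seen[digest])  # duplicate: store a reference only
            else:
                seen[digest] = path
                manifest[path] = ("store", digest)      # first copy: store the payload
    return manifest

The archiver then only has to compress each payload once, so an installer full of repeated files shrinks dramatically before any real compression even starts.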
I think FEAD is partly integrated into the company's development toolchain, and NOS will also tailor (hand-tune) their compression to the individual company's program, so I really don't think you can compare it to conventional general-purpose compressors like RAR etc.
It's sorta like comparing Crinkler to UPX: you can't just give an exe to Crinkler like you can with UPX, you have to have it in your toolchain and program with it in mind, but once you do, it shits over anything, probably even FEAD.
kkrunchy is the best UPX-like command-line exe compressor you'll find, but it's limited to certain types of exes, i.e. no exports or TLS data. It's also a one-way compressor, as it'll chop and change the file to get the best compression.
Quick example of it on an uncompressed sbc.exe:
Latest UPX -9: 144384 bytes
Latest kkrunchy --best: 118272 bytes
Edit: @osmanturan: Heh, I was writing my post out at the same time as you, so I only noticed after I'd posted that you had too. "Enhancements" was a poor choice of wording on my part; "minor changes" is more correct, as you pointed out.
IIRC, kkrunchy, or a compressor with a similar name, was designed for 4 KB demos (I'm talking about those eye-candy demoscene productions - I really like them). AFAIK, it uses a statistical method (I suspect a PPM model) for compressing executables. The author(s) pointed out that their compression technique, being based on a contextual approach, was superior. But it was not good for large executables due to the long decompression time (demoscene parties generally have a time limit for each production). As you mentioned before, it applies some changes to the executable and also has some limitations (such as no exports etc.).
Aye, kkrunchy is aimed at the demoscene, something I've been involved with in some form or another since 1991.
Info on it from the horse's mouth, so to speak (he's not that ugly):
variant A (i.e. 0.23alpha) uses a pretty basic LZ+arithmetic algorithm (no, not LZMA; worse pack ratio, but smaller depacker). it's slow to compress but decompresses a lot faster (very roughly 2MB/sec on my P4 2.4GHz for typical data).
A2 (0.23alpha2) is a context mixing based algorithm, akin to PAQ and crinkler (in fact I use the neural network mixer from PAQ7 with some simpler custom models). it's usually faster to pack than variant A, but takes just as long to depack (the algorithm is symmetric). i've spent quite some time (and sacrificed size too) to make average depack time reasonable for 64ks, but it's quite useless for anything much larger.
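For the curious, that "neural network mixer from PAQ7" boils down to logistic mixing: each model's bit probability is stretched into the logit domain, combined with learned weights, and squashed back. A minimal sketch (the class name and learning rate are mine; real PAQ works in fixed point with many context-selected weight sets):

Code:
import math

def stretch(p):                     # probability in (0,1) -> logit
    return math.log(p / (1.0 - p))

def squash(x):                      # logit -> probability
    return 1.0 / (1.0 + math.exp(-x))

class LogisticMixer:
    def __init__(self, n_models, lr=0.02):
        self.w = [0.0] * n_models   # one weight per model
        self.lr = lr

    def predict(self, probs):
        """Combine the models' P(bit=1) estimates into one prediction."""
        self.t = [stretch(p) for p in probs]
        self.p = squash(sum(w * t for w, t in zip(self.w, self.t)))
        return self.p

    def update(self, bit):
        """After coding the actual bit (0 or 1), nudge the weights
        toward the models that predicted it well (online gradient step)."""
        err = bit - self.p
        self.w = [w + self.lr * err * t
                  for w, t in zip(self.w, self.t)]

The arithmetic coder codes each bit with the mixed probability and update() is called with the real bit afterwards; that per-bit loop runs identically when depacking, which is why the algorithm is symmetric.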
4ks is where Crinkler rules the world currently; it dethroned kkrunchy.
Most notably the procedural gfx 4ks by a guy called iq:
download: http://pouet.net/prod.php?which=50068
Video: http://www.vimeo.com/950576
That's one example, and his most recent one is:
http://pouet.net/prod.php?which=51074
which again is just mindblowing.
All crunched with Crinkler, so yeah, 1 gig of data out of a 4k exe isn't too bad, I'd say ;p
The scene also gave us what AFAIK is the smallest compressor and decompressor, just 256 bytes in size. Bit-exact in and out - impressive stuff.
This is becoming very off-topic. Sorry...
My favorite demoscene group:
http://www.theprodukkt.com/
They have a high-end game which computes most of the game resources procedurally (textures, geometry, even music!!!). It's only 96 KB!!!
http://www.theprodukkt.com/kkrieger
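The trick that makes 96 KB possible is that the assets are code rather than data: textures, meshes and music are synthesized when the game starts. A toy illustration of the principle (my own example, nothing to do with the real werkkzeug pipeline):

Code:
import math

def plasma_texture(size=256):
    """A few lines of math expanding into size*size bytes of
    grayscale texture data - assets as code, not data."""
    tex = bytearray(size * size)
    for y in range(size):
        for x in range(size):
            v = (math.sin(x * 0.06) + math.sin(y * 0.08)
                 + math.sin((x + y) * 0.04))            # v is in [-3, 3]
            tex[y * size + x] = int((v + 3.0) / 6.0 * 255)
    return tex

tex = plasma_texture()   # 65,536 bytes of texture from ~10 lines of source

Scale that up to layered noise, filters and blend operations and you can see how 96 KB unfolds into the ~180 MB mentioned below.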
And lastly, here is my first demoscene entry, written in a very short time using OpenGL. At that time I had no framework of my own in C/C++ for games (I only had one in Delphi). The game was written in Dev-C++ and includes lots of bugs.
http://www.pouet.net/prod.php?which=29996
The .produkkt is just Farbrausch's other (commercial) site where they flaunt their software, the name coming from one of their early demos, which was the first real test for the new tools they developed (which they now sell as werkkzeug, though freebie versions are available): http://pouet.net/prod.php?which=1221
A lot of outsiders only really know of Farbrausch after fr-08, because they have good media & industry contacts, but apart from Debris they're really just above average these days. Because people still follow the hype from the early days, though, they're still "popular", and in terms of raw talent they're one of the best groups for sure (Chaos being one of the best coders you'll find anywhere); they can still do great stuff, but a lot more time is spent on commercial work, which distracts them. The new top dogs for demos are ASD, with the astounding Lifeforce http://pouet.net/prod.php?which=31571 and their other masterpiece Iconoclast http://pouet.net/prod.php?which=18350. Conspiracy prolly rule the 64k area, and rgba the 4ks atm on PC. TBL are still top dogs on Amiga (my platform of choice, and still beating PC prods at various parties), and Ephidrena for 4ks.
kkrieger generates about 180 MB of data, FYI.
And I had thought I'd recognised some names from here/elsewhere; now I know for sure! But yeah, slightly off-topic, heh.
The other day I downloaded Acrobat Reader and got interested in the strange NOSSO sign in the bottom-left corner of the program window.
Time passed, but I could find almost nothing about their technology.
Then I decided to make a test including NOSSO and some other well-known archivers.
So I started the Acrobat Reader SFX, then copied the unpacked data from the Temp folder and took a glance at it. First thing I noticed was another NOSSO SFX archive with Adobe AIR, which I unpacked too. Then I cleaned the data up and started my benchmarks.
So I hope it will be interesting for all of you to take a look at the results.
You can download the results in an Excel sheet here. There are many interesting things to read in the notes (red arrows in top-right corners) as well.
And, yes, if you want me to test and include any other archiver - just let me know!
Search for NetOpSystems (and/or FEAD) and you'll find a lot more info on it, FYI.
Oh thanks, gonna do it now
BTW, just for the record - so what did it appear to be - an archiver, or just some manually, uh, compiled SFX'es?
I'm asking this assuming that you already performed all the searches and now know what's what ^^
Thanks, very nice test you've done
@nanoflooder: can you check if there are duplicate files somewhere in the unpacked directories? (The file names can be different, though!)
If there are no duplicates, I still don't understand how Nosso can get such a high compression ratio.
Thanks! Amazing, isn't it, that WinRK in ROLZ3 mode (modified by me to use more memory) can still outcompress a manually prepared package, despite being an asymmetric archiver.
Well, I should say, if I were asked by Adobe to prepare a package for one of their products, the first thing I would do is remove all the duplicates.
Obviously, there aren't any among the unpacked stuff, and I checked that, uh, two or three times I think. And, hey, aren't archivers supposed to resolve duplicates in the data they face? That's their job, at least I always thought it was.
The whole unpacked package weighs about 138 MB, so it's not that much for them to miss anything really important - most of the top-ranked (by compression) archivers do make solid archives, and the dictionary sizes they use are huge! That's why, for example, the difference between the #1 program - WinRK - and the original package is a whopping 5 MB!
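That's easy to demonstrate: in a solid archive the match finder sees all files as one stream, so a duplicated block in a later file becomes a cheap back-reference as long as the dictionary reaches that far. A rough sketch with Python's lzma module (the data and the 64 MiB dictionary are just for illustration):

Code:
import lzma, os

a = os.urandom(1 << 20)                   # 1 MiB of incompressible data
b = a[:900_000] + os.urandom(100_000)     # a second "file", 90% identical to a

filters = [{"id": lzma.FILTER_LZMA2, "dict_size": 1 << 26}]   # 64 MiB dictionary

separate = sum(len(lzma.compress(x, format=lzma.FORMAT_XZ, filters=filters))
               for x in (a, b))           # each file compressed on its own
solid = len(lzma.compress(a + b, format=lzma.FORMAT_XZ, filters=filters))

print(separate, solid)   # solid is ~0.9 MB smaller: the duplicate is one long match

Which is exactly why a solid archiver with a big enough dictionary should catch duplicate files on its own.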
I like your test. This kind of test data is also very good because we know the actual data is really being used by lots of people.
I'm not a WinRK expert, so I'm not sure: is this ROLZ3 mode the "best asymmetric" mode? In the benchmarks I have seen, nz -cO produces generally smaller sizes, especially on exes and other such data. Redundant bitmap files, source code, already-compressed files, and mixed binary and text sometimes appear to be handled better by "rolz" currently.
I downloaded Acrobat Reader and decompressed data1.cab, but I got only 103 MB of data. I tested compressing a couple of exes and dlls, but I didn't find anything special where -cO would perform weakly. I'm still interested in whether there are some specific files (other than those mentioned above) I could study, where -cO tends to produce larger files than WinRK's asymmetric modes.
Thanks!
Acrobat Reader was a sample of a binary-spec package. I'm preparing another 2 tests at the moment: Re-Volt game setup (lots of BMP/WAV, typical for games) and Dicto application (lots of UTF8/English text, typical for dictionaries).
Yes, it is, and I have slightly modified it via WinRK.ini to make sure it uses as much memory as possible on my computer.
You can download my package here (direct download, no need to wait) - this is an SFX archive brought to you by that ROLZ3 method.
Hopefully it's reliable, although I didn't test the entire archive for data integrity. I'm too lazy for that.
I have also updated the table (fixed my brain bug and some errors in DNA) - you can always access it here.
The archive does not extract on my system: it kept asking about dictionaries.
Luckily I have a copy of WinRK v3.0.3, so I used it and got the following errors on extraction:

Code:
Error extracting '' - (.\compression\FastDecompressor.cpp:595) (uint32)offset<=tpos[rcntxt]
CRC failed extracting 'G:\test\Nosso\Setup Files\READER9\AcroRead.msi'
CRC failed extracting 'G:\test\Nosso\Setup Files\READER9\Data1\Checkers.api'
CRC failed extracting 'G:\test\Nosso\Setup Files\READER9\Data1\AcroForm.api__NON_OPT'
CRC failed extracting 'G:\test\Nosso\Setup Files\AIR\Adobe AIR Installer\AIR\mauby\META-INF\AIR\application.xml'
CRC failed extracting 'G:\test\Nosso\Setup Files\READER9\Data1\Annots.api'
Error extracting '' - (c:\programs\winrk\librk\compression\ArithmeticCoding.h:179) Error in data stream, unable to complete decoding
.......

What version of WinRK did you use to pack?
It seems 3.1.1 is the one used throughout the sheet.
It's either a problem with your WinRK or with this SFX itself, but anyway I just wanted to save some size by uploading an RK archive instead of anything else.
Here you go with a .7z: http://rapidshare.com/files/135634579/acrord9.7z
Thanks, got it. Now I've got something to play with!
I guess there wasn't anything wrong with the WinRK SFX; it's just that RK used dictionaries to compress the few text files.
I've tried putting the dicts in the same folder, but the RK-sfx couldn't "find" them.
Next time you compress with WinRK, you might consider disabling the use of the dicts in case you plan to give the SFX to someone else.
I noticed from your xls that you used WinRK 3.1.1, which is not backwards compatible with v3.0.3.
This explains the errors I got when trying to extract.
BTW: if you make a comparison of WinRK with other archivers/compressors like NZ, consider disabling the dicts in WinRK; IMHO it is not fair to compare programs with and without external dictionaries.
Oh, I'm sorry, I actually didn't even know that (I mean, if you look at the "Use text dictionary" tooltip, it says "blah-blah-blah, ..., not compatible with SFX archives", but when you create one, it doesn't say a word about it) - I'm going to retest WinRK right now with no static text dictionaries.
Hehe, in fact, turning the dictionary usage off makes WinRK crash in any mode I tested. So maybe the solution is to add the dictionary sizes to the final archive size and leave it at that.
Update: I realized that Office 2007 is crap, at least at converting .xlsx to .xls - all my color coding simply didn't work in the converted version. So I have made a table in Office 2003 with all the coloring (basically it is now easier to trace algorithm symmetry) and compression/decompression efficiency (using the same old formula: 2^(((size_X / size_TOP) - 1) / 0.1) * time_X).
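In code form, that score reads as follows (assuming my reading of the original parentheses is right; lower is better, since being larger than the best result is penalized exponentially and the time taken scales the whole thing):

Code:
def efficiency(size_x, size_top, time_x):
    """2^(((size_x / size_top) - 1) / 0.1) * time_x - every 10% of extra
    size relative to the best result doubles the penalty."""
    return 2 ** (((size_x / size_top) - 1) / 0.1) * time_x

# e.g. 5% larger than the best result and 120 s: 2**0.5 * 120 = ~169.7
print(efficiency(105.0, 100.0, 120.0))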
I wrote about them last year: http://trixter.wordpress.com/2007/08...-netopsystems/
Not as technical as people here would like, unfortunately, but at least the conclusion is the same (benefits the provider more than the customer).
There is no mystery in what Nosso does.
Just take all the "crap" Adobe is putting in CABs.
Why do they do that in the first place? They are bound to product chains - InstallShield and stuff...
Now take that stuff - the CABs (InstallShield CABs) - and unpack them. You get more stuff: CABs from Microsoft, nicely packed with LZX, MSZIP, and other algorithms. Furthermore you get a lot of MSCFs, Word files, and other stuff. PDFs are a nice example, and JPGs, PNGs and, even better, BMPs. Take all that extracted stuff and recompress it.
Don't tell me "I did it but didn't reach the compression ratio they did", because if you did it the way they did, you would have!
Just keep going until you cannot get much deeper. Tear apart every single file you can and recompress all that stuff with 7z in solid mode; a 64 MB dictionary size will be enough. Put some data sorter in front of it, like bc2, and you get the same result they do. It's the work they get paid for - that's OK.
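In concrete terms, that recipe is roughly this 7-Zip invocation, sketched via Python's subprocess (the bc2-style sorting step is left out, and the paths are placeholders):

Code:
import subprocess

# Solid 7z archive with a 64 MB LZMA dictionary over the torn-apart files.
# "unpacked/" stands for the directory of fully extracted installer contents.
subprocess.run(
    ["7z", "a", "-t7z",       # create a .7z archive
     "-m0=lzma", "-md=64m",   # LZMA with a 64 MB dictionary
     "-ms=on",                # solid mode: all files share one stream
     "recompressed.7z", "unpacked/"],
    check=True,
)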
Good Idea though!
> Dont tell me: "I did it but did not reach the compression rate they did"
> because if you did it the way they did you would have!
I don't know if I should laugh or cry.
I cannot understand why so many people (here) think it's the right approach to blame Nosso all they can and belittle the service they provide. I didn't test it myself trying to beat them, but do you really think a company like Adobe would pay them if they weren't good?
Try to think before you write please!