http://www.nicodevries.com/nico/all-uc2.zip
In 2007 it wasn't available. Or did I overlook it?
Thank you. I remember Ultra Compressor II well from the 1990s, and its amazing super-optimizer.
Is it possible to make a Win32 compile of UC2 from those sources?
Here is a 1996 compile from my deep archive. Happy testing.
Sincerely,
FatBit
UC2 uses a lot of assembler and hardware-specific things (EMS/XMS memory, 386-specific instructions), and I'm not sure everything can easily be replaced with C++ code. After all, if it were possible, a UC2 Win32 port would probably have been on the market in the 90s.
@FatBit: those UC2 compiles are not 32-bit builds but MS-DOS executables.
Creator of UC2 here.
No, you didn't overlook it. I released those sources under the GPL in 2009. Around 2002 I gave them to Greg Flint from Purdue University, who attempted to port UC2 to Unix.
Not straight away. It would require eliminating the machine code (it's not much; most of the program is in C) and all kinds of DOS-specific logic.
Borland C++ is still around, but I have no idea what would happen if you tried to compile the source code.
Nico, I'm glad to see you here!!
Can you briefly describe the UC2 algorithm? I mean its shared dictionary and other original ideas.
In general UC2 uses LZHUF compression, comparable with ZIP, ARJ and RAR. A few things made UC2 special for its time.
- It used a 64-kilobyte LZ buffer at a time when everyone used 32 kilobytes. This is what the machine code handles, as efficiently keeping 65536 2-byte pointers on an 8086 CPU is a small disaster. When everyone moved to 32 bits it became much easier to use large LZ buffers.
- It was (AFAIK) the first archiver to use "damage protection". XOR based, just like RAID (a minimal sketch follows this list). In the age of the floppy disk this was a useful feature. I think RAR and ARJ copied it in later versions, but I doubt the feature is very useful with modern storage. Robert Jung (the author of ARJ) and I worked together on some things, e.g. reverse engineering PKZIP, and when I gave up on UC2 I donated some code to him for ARJ. It's awesome that he is still doing ARJ.
- It had a special multimedia mode for better (lossless) compression of audio and uncompressed images. This was based on a simple delta algorithm (also sketched below).
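A minimal C sketch of the XOR-based damage protection idea (illustrative only, not UC2's actual code; the sector size and grouping are assumptions). As with RAID, one parity sector can rebuild any single damaged sector in its group:

    #include <stddef.h>

    #define SECTOR 512  /* assumed sector size, for illustration */

    /* Build the parity sector for a group of n data sectors. */
    void make_parity(const unsigned char data[][SECTOR], size_t n,
                     unsigned char parity[SECTOR])
    {
        for (size_t i = 0; i < SECTOR; i++) {
            unsigned char x = 0;
            for (size_t j = 0; j < n; j++)
                x ^= data[j][i];
            parity[i] = x;
        }
    }

    /* Recover one damaged sector in place by XOR-ing the parity with
       all surviving sectors. Works only if exactly one sector is lost. */
    void recover_sector(unsigned char data[][SECTOR], size_t n, size_t bad,
                        const unsigned char parity[SECTOR])
    {
        for (size_t i = 0; i < SECTOR; i++) {
            unsigned char x = parity[i];
            for (size_t j = 0; j < n; j++)
                if (j != bad)
                    x ^= data[j][i];
            data[bad][i] = x;
        }
    }

The delta algorithm behind the multimedia mode can be sketched just as briefly: store the difference between consecutive sample bytes, so smooth audio or image data turns into runs of small values that LZ/Huffman coding handles far better. Again a guess at the general technique, not the actual UC2 filter (which may use a different sample width or predictor):

    #include <stddef.h>

    /* Forward filter: buf[i] becomes buf[i] - buf[i-1], modulo 256. */
    void delta_encode(unsigned char *buf, size_t len)
    {
        unsigned char prev = 0;
        for (size_t i = 0; i < len; i++) {
            unsigned char cur = buf[i];
            buf[i] = (unsigned char)(cur - prev);
            prev = cur;
        }
    }

    /* Inverse filter: a running sum restores the original samples. */
    void delta_decode(unsigned char *buf, size_t len)
    {
        unsigned char prev = 0;
        for (size_t i = 0; i < len; i++) {
            prev = (unsigned char)(prev + buf[i]);
            buf[i] = prev;
        }
    }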
The shared dictionaries. They allow compression ratios close to what solid (RAR) or TGZ compression can achieve, without the disadvantage of having to re-create the entire archive to add a single file. Extraction of a single file is also much faster, for similar reasons.
The trick is to collect 64 kilobytes of data (e.g. the first 64kb of all files to be compressed) and make that a separate entity in the archive. Then all files are compressed individually, using these 64 kilobytes to initialize the LZ buffer. When files that are actually in the dictionary are compressed, they become extremely small, as only one LZ reference is needed. And files that are similar to the content of the dictionary compress better, much like in solid archives such as TGZ. UC2 also had a default dictionary containing English text, C source code and some other stuff, improving compression of certain document types.
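The same preset-dictionary trick survives in modern APIs, so it is easy to sketch with zlib standing in for UC2's own coder (note zlib's window is 32 KB rather than 64 KB, so at most the last 32 KB of the dictionary is used; error handling trimmed for brevity):

    #include <string.h>
    #include <zlib.h>

    /* Compress one file with the LZ window pre-loaded from a shared
       dictionary, so back-references into it work from the first byte.
       'out' must be large enough (e.g. sized with compressBound()). */
    int compress_with_dict(const unsigned char *dict, unsigned dictlen,
                           const unsigned char *in, unsigned inlen,
                           unsigned char *out, unsigned long *outlen)
    {
        z_stream s;
        memset(&s, 0, sizeof s);
        if (deflateInit(&s, Z_BEST_COMPRESSION) != Z_OK)
            return -1;
        deflateSetDictionary(&s, dict, dictlen);  /* the shared dictionary */
        s.next_in  = (Bytef *)in;   s.avail_in  = inlen;
        s.next_out = out;           s.avail_out = (uInt)*outlen;
        int rc = deflate(&s, Z_FINISH);           /* whole file in one call */
        *outlen = s.total_out;
        deflateEnd(&s);
        return rc == Z_STREAM_END ? 0 : -1;
    }

A file that literally appears inside the dictionary then collapses to a handful of back-references, which is exactly the effect described above.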
Some additional tricks were used (e.g. sorting files by extension before processing them) to optimize the content of the dictionary. It works really well for source code.
A side effect of the dictionary approach was that it was very easy to add versioning to UC2. So it had a lot of options for version management of source code, including generation of header files containing version numbers, etc.
There is also the "super-optimizer", which reordered files randomly, trying to find the optimal dictionary for a certain file collection.
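That search loop is simple to sketch in C, assuming a hypothetical archive_size() callback that runs a full compression pass and returns the resulting archive size:

    #include <stdlib.h>
    #include <string.h>

    extern size_t archive_size(char **files, size_t n);  /* hypothetical */

    /* Fisher-Yates shuffle of the file order. */
    static void shuffle(char **files, size_t n)
    {
        if (n < 2)
            return;
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = (size_t)rand() % (i + 1);
            char *t = files[i]; files[i] = files[j]; files[j] = t;
        }
    }

    /* Try random orderings, keeping whichever yields the smallest archive. */
    void super_optimize(char **files, size_t n, int tries)
    {
        char **best = malloc(n * sizeof *best);
        size_t best_size = archive_size(files, n);
        memcpy(best, files, n * sizeof *best);
        while (tries-- > 0) {
            shuffle(files, n);
            size_t sz = archive_size(files, n);
            if (sz < best_size) {
                best_size = sz;
                memcpy(best, files, n * sizeof *best);
            }
        }
        memcpy(files, best, n * sizeof *best);  /* restore the best order */
        free(best);
    }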
Vladislav Sagunov asked me to release the UC2 source code under the LGPL (I released it under the GPL a while ago), which I think is an excellent idea.
The LGPL version is available from here: http://www.nicodevries.com/nico/uc2-lgpl.zip
It contains all source code, executables, documentation, etc.
What about UCEXE, Nico? Can you describe it a bit?
I used it because of its fast compression and compression ratios better than PKLite's, but I guess extraction was slower.
Here is my little tribute to UC2. You can use Google Translate if you can't read Spanish:
http://www.javiergutierrezchamorro.c...ressor-ii/2753
Wow, awesome!!! I LOOOOVED using UC2 back in the day (it was my #1 compressor, with SQZ, LHA, ZOO, and ARJ also high on my list). I read your long explanation of how it worked, but I still must ask: ANY chance of getting a 32-bit or 64-bit tool built from the same code, or... no?
(Don't need a GUI -- just a CLI version... heh)
Anyway, on a tangent: my FAVORITE thing to do was to take a whole directory, TARBALL it with either TAR or CRUSH (an old DOS program that was like GNU TAR), and then SUPER-OPTIMIZE the ORDER of the files in the CRUSH file, calling out to UC2 on each try, to experiment with how to get the smallest compression (a minimal sketch of this scoring idea is at the end of this post).
Is anyone here familiar with the old CRUSH optimizer? (It's not a compressor; it's used to optimize the order of the files in a tarball BEFORE compression, to help the compressor be more efficient.)
NOTE: the CRUSH tool (v1.8 from long ago) that I am talking about is NOT the same as this one (on this board):
https://encode.su/threads/1289-CRUSH-0-01-is-here!
but is actually THIS one:

CRUSH18.ZIP    69K    02-28-1995
CRUSH v1.8 - {ASP} Super-compressor for DOS.
Fed up with limited compression performance
of Stacker, PKZIP, UC2, ZOO, ARJ and LHA?
CRUSH will usually give 5%-50% improved
compression over any other DOS compression
tool, & yet allows the user to continue using
the archiver already in use. CRUSH is fast &
the ideal choice for users keen to save disk
space. Too good to be true? No! Try it!
PocketD v4.1 compat. New: no 1000 file limit
LOCATED here:
http://archives.thebbs.org/ra31a.htm
What do you think of doing things that way? It actually helped achieve smaller compression with ALL archivers (whereas NOT using CRUSH with any of them would still give great, tight compression, but not as good).
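For anyone curious how that kind of experiment can be scored, here is a small C sketch of the general idea (a guess at the approach, not CRUSH's actual code): concatenate the files in a candidate order and measure the compressed size, with zlib standing in for UC2 and a hypothetical read_file() helper; error handling is omitted for brevity.

    #include <stdlib.h>
    #include <string.h>
    #include <zlib.h>

    extern unsigned char *read_file(const char *path, size_t *len);  /* hypothetical */

    /* Score an ordering: the deflate-compressed size of the concatenation.
       Smaller is better; feed this to a shuffle-and-keep-best loop. */
    size_t ordering_cost(char **files, size_t n)
    {
        size_t total = 0, cap = 0;
        unsigned char *buf = NULL;
        for (size_t i = 0; i < n; i++) {
            size_t len;
            unsigned char *data = read_file(files[i], &len);
            if (total + len > cap) {              /* grow the staging buffer */
                cap = (total + len) * 2;
                buf = realloc(buf, cap);
            }
            memcpy(buf + total, data, len);
            total += len;
            free(data);
        }
        uLongf outlen = compressBound(total);
        unsigned char *out = malloc(outlen);
        compress2(out, &outlen, buf, total, 9);   /* deflate at max level */
        free(buf);
        free(out);
        return (size_t)outlen;
    }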