
Thread: StuffIt X Format

  1. #1 - Member (Kuwait)

    StuffIt X Format

    I'm waiting for the StuffIt 12 results on MaximumCompression, but nevertheless I found this link http://my.smithmicro.com/stuffitcomp...itxformat.html that describes which formats it recompresses.

    StuffIt 12 includes custom recompressors for these file formats:
    JPEG (.jpg, .jpeg)
    JPEG lossless (.jls)
    JPEG 2000 (.j2k)
    Note: recompression is only applied when a .j2k file is losslessly encoded
    Bitmap (.bmp)
    GIF (.gif)
    TIFF (.tif, .tiff)
    PSD (.psd)
    PNG (.png)
    PICT (.pict, .pct)
    PXM (.pbm, .pgm, .ppm)
    MP3 (.mp3)
    Zip (.zip, .cbz)
    Microsoft Office 2007 & 2008 documents (.docx, .xlsx, .pptx)
    PDF (.pdf)

    So it supports:
    1. RLE compression of BMP, PBM/PGM/PPM, some TIFF, some PICT, PSD, some PDF
    2. JPEG 2000 arithmetic coding
    3. JPEG-LS arithmetic coding
    4. MP3 Huffman coding

    So I think v13 will add more formats, but which would be useful? I don't think they'd go for "LZX", as MS might give them trouble, nor "RAR". Maybe ISO/BIN/NRG, or OGG/MP4/MPG/AVI.

    The problem is that it's slow... very slow for a compression task, like Precomp speed, which loses practicality. Unless they provide a viewer, not just an extractor (2.5 MB compressed, which is too much), that lets you read/edit/save images/documents/audio in SITX format directly. So what do you think, please?

  2. #2 - Member (Denmark)
    I like the idea of file-specific recompressors.

    But for me, recompression of BMP is useless, as I would rather use PNG or BMF for that.

    However, JPEG, MP3, and OGG recompression would help me when I back up games.



    But I would also like to see bit-level-lossy / content-lossless recompression, like with pngout and mp3repack: not the same file, but the same information.

    I hope that PackJPG could someday discard the information about the original Huffman table, to save space in the compressed file.

    I'm still experimenting with precomp -slow on game CD images, but it seems it only improves compression on rare occasions.

  3. #3 - encode, The Founder (Moscow, Russia)
    BMP files are mostly uncompressed, so there is nothing to recompress, although BMP can be compressed with simple algorithms like RLE.

    AFAIK, OGG makes use of an arithmetic encoder, so there is nothing to recompress, unlike MP3, which uses a Huffman encoder.

    IMHO, recompressing ZIP and other archives is cheating. A general-purpose archiver should not do such things to save the last bit.
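    As an aside, the RLE scheme mentioned above is simple enough to sketch. This is a minimal illustration of BMP-style RLE8 "encoded mode" (run pairs only; absolute mode and end-of-line/end-of-bitmap markers are omitted, so it is not a full BMP codec):

```python
def rle8_encode(row: bytes) -> bytes:
    """Encode a row of 8-bit pixels as (count, value) pairs,
    as in the 'encoded mode' of BMP RLE8 (runs capped at 255)."""
    out = bytearray()
    i = 0
    while i < len(row):
        run = 1
        while i + run < len(row) and row[i + run] == row[i] and run < 255:
            run += 1
        out += bytes((run, row[i]))
        i += run
    return bytes(out)

def rle8_decode(data: bytes) -> bytes:
    """Expand (count, value) pairs back to raw pixels."""
    out = bytearray()
    for count, value in zip(data[::2], data[1::2]):
        out += bytes([value]) * count
    return bytes(out)
```

    On mostly-flat images this wins big; on noisy photographic data it roughly doubles the size, which is why RLE is only used for simple bitmaps.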


  4. #4 - Member (Denmark)
    I have to disagree.

    The old program rehuff was for optimizing the Huffman coding of Vorbis files, as far as I remember, so I believe it's Huffman-based.

    But still, couldn't we decompress the ari level of Vorbis, save some small amount of data to restore the ari coding bit-perfect, and then compress the Vorbis file with a stronger compressor?

  5. #5 - Black_Fox, Tester (Czechia)
    Quote Originally Posted by maadjordan View Post
    StuffIt 12 includes custom recompressors for these file formats:
    Microsoft Office 2007 & 2008 documents (.docx, .xlsx, .pptx)
    MS Office 2007+ and OpenOffice save their files just as renamed ZIPs; there's no extra magic in it!

    Quote Originally Posted by maadjordan View Post
    But still, couldn't we decompress the ari level of Vorbis, save some small amount of data to restore the ari coding bit-perfect, and then compress the Vorbis file with a stronger compressor?
    Not much, ari is already quite strong (not to mention it's the strongest one in Stuffit).
    I am... Black_Fox... my discontinued benchmark
    "No one involved in computers would ever say that a certain amount of memory is enough for all time? I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again." -- Bill Gates

  6. #6 - Member (Denmark)
    Let's say for a minute that Vorbis used Huffman due to patent issues with arithmetic coding.

    Couldn't we then save the Huffman tree, remove the Huffman compression, and use arithmetic coding instead?
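    That idea (decode the Huffman layer, keep the table, re-encode the symbols with something stronger) can be sketched in a few lines. This is a toy illustration with bz2 standing in for the stronger back-end coder, not anything Vorbis-specific:

```python
import bz2
import heapq
import itertools
from collections import Counter

def huffman_codes(data: bytes) -> dict:
    # Build a Huffman code table (symbol -> bit string) from frequencies.
    tie = itertools.count(256)          # unique tie-breaker for the heap
    heap = [(f, s, (s,)) for s, f in Counter(data).items()]
    heapq.heapify(heap)
    codes = {s: "" for s in set(data)}
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)
        f2, _, b = heapq.heappop(heap)
        for s in a:
            codes[s] = "0" + codes[s]
        for s in b:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (f1 + f2, next(tie), a + b))
    return codes

def recompress(bits: str, codes: dict) -> bytes:
    # Undo the Huffman layer, then squeeze the raw symbol stream harder.
    inv = {c: s for s, c in codes.items()}
    syms, cur = bytearray(), ""
    for bit in bits:
        cur += bit
        if cur in inv:                  # prefix-free, so greedy match is safe
            syms.append(inv[cur])
            cur = ""
    return bz2.compress(bytes(syms))

def restore(blob: bytes, codes: dict) -> str:
    # Bit-exact reconstruction: re-encode with the saved Huffman table.
    return "".join(codes[s] for s in bz2.decompress(blob))
```

    Keeping the code table is what makes the round-trip bit-exact; a real format would also need to store padding bits, block boundaries, and table-switch points.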

  7. #7 - Shelwien, Administrator (Kharkov, Ukraine)
    It's a matter of modelling; the entropy coding used doesn't matter much. I mean, paq8 could be implemented with dynamic Huffman codes assigned to long enough strings, and still have about the same compression quality without using arithmetic coding.

    And my experience says that all the widespread media formats have relatively weak compression, probably due to hardware implementation considerations.

  8. #8 - Member (Kuwait)
    I found that there is an EXE SFX for the SITX format, but the builder (SEA creator) takes too much time to convert a large SITX into an EXE, and the stub does not support all v12 features.

    Yesterday I had the chance to test StuffIt Wireless, which compresses to SITX on Windows Mobile. Of the supported formats (above), it handles only JPEG. It worked rather too slowly on my O2 Atom Life with WM6.0.

    Another product they have, which I did not test, is the Photoshop plugin. They want the user to save into JPEG, which is then compressed into SITX, so my design ends up in a lossy format rather than a lossless one (PSD).

  9. #9 - Fallon, Member (The Netherlands)

    Quote Originally Posted by Black_Fox View Post
    MS Office 2007+ and OpenOffice save their files just as renamed zip, there's no extra magic in it!
    As ZIP becomes the norm for container-like .docx files, won't recompression become inevitable? Who will care about an old .doc file benchmark when that file format is getting outdated?

    Future processors will again have more power -duh-. In 2010, processing power could double with the arrival of Intel's Sandy Bridge architecture. So maybe it's a matter of time: with more speed, recompression options could also become more usable.

  10. #10 - Black_Fox, Tester (Czechia)
    I don't mean it's unusable or not needed, but that it's more PR than actual algorithmic improvement: Office 2007 .*x files are just renamed ZIPs, PSD recompression is IMHO just recompressing the JPEG thumbnail inside, and so on.

  11. #11 - encode, The Founder (Moscow, Russia)

    Actually, I don't like the recompression idea in general. It would be better if Micro$oft and others used stronger compression initially, so we wouldn't need any recompression tricks...

  12. #12 - schnaader, Programmer (Hessen, Germany)
    Quote Originally Posted by encode View Post
    Actually, I don't like the recompression idea in general. Better if Micro$oft and others will use stronger compression initially, so we may not use any recompresison tricks...
    That would be fine, indeed, but you'd feel the downside when loading some of your documents. For example, compressing pages separately in a PDF file is really bad from a compression point of view. But if that changed, both speed and memory requirements would suffer, and you'd wait some minutes before you could even look at the first page of your document.

    This doesn't mean it's not worth it - sometimes you can get better results for precompressed files even with ultra-fast compressors like THOR - but it often collides with more important things.

    There are two things I dislike about recompression today. One is that lossy recompression is done even in some archivers which are typically lossless; the other is the speed, as mentioned in the very first post by maadjordan.
    Fast lossless recompression is definitely possible. It's more complicated on the coding side, but not impossible.

    Another benefit of recompression is having everything in decompressed state and being able to spot patterns across different streams or even files. For example, I recently stumbled upon Microsoft's new SharePoint documentation:

    http://www.microsoft.com/downloads/d...displaylang=en

    The documentation consists of 152 PDFs and is also available as a ZIP archive, 134 MB in size. After precompressing all the PDFs, everything expands to 661 MB and can get a lot smaller: 75 MB using THOR, 32 MB (!!) using CCM. Creating an SFX with this would slow down decompression a bit (recompressing takes 3-4x longer than ZIP extraction, plus THOR/CCM decompression; using RZM would be nice here), but it would dramatically reduce Microsoft's traffic and the download time for users.
    http://schnaader.info
    Damn kids. They're all alike.
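    The precompression step schnaader describes can be sketched for a single zlib stream: inflate it, then verify that deflating the raw data again reproduces the original bytes bit-exactly, so only the raw data plus the parameters need to be stored. Real tools such as Precomp try many parameter combinations; this simplified sketch only tries compression levels 1-9:

```python
import zlib

def precompress(deflated: bytes):
    """Inflate a zlib stream and look for a level that re-deflates it
    bit-exactly. On success, store only (level, raw data); the stronger
    outer compressor then sees the redundant raw bytes."""
    raw = zlib.decompress(deflated)
    for level in range(1, 10):
        if zlib.compress(raw, level) == deflated:
            return level, raw          # restorable: keep raw + level
    return None, deflated              # give up: keep the original bytes

def restore_stream(level, payload):
    # Bit-exact reconstruction of the original stream.
    return zlib.compress(payload, level) if level is not None else payload
```

    The fallback branch matters: streams produced by other deflate implementations often cannot be reproduced by zlib's nine levels, which is exactly why real recompressors carry extra reconstruction data.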

  13. #13 - Member (Kuwait)
    Quote Originally Posted by schnaader View Post
    That would be fine, indeed, but you'd get the downside of this when loading some of your documents. [...]

    Acrobat 7/8 (PDF 1.6/1.7) can compress the entire document, but the gain is minor. If bz2 is used instead of Deflate (like Multivalent does, http://multivalent.sourceforge.net/), more gain is achieved with no delay on a modern computer (especially if you enable the "web view" option). But if you preprocess and compress a PDF into a container format and build a reader that views it in that form, the result is perfect.

    I tried this case: uncompress a PDF with no security options, then pack it with RAR. I could gain a 10-20% size reduction, and if a PDF reader supported reading directly from the RARed PDF, viewing time wouldn't be that different.

    1. data -> compression -> PDF ------------------ internet -> PDF reader
    2. data ------------------ PDF -> compression -> internet -> PDF reader

    Another example, for MS-CAB, is CHM files: the maximum dictionary size I could reach is LZX:18, which is 256 KB, while the CAB format allows a dictionary size of 2 MB (LZX:21), and we all know how fast decompression is with LZX:21. I tried to patch the CHM maker (hhc.exe, hha.dll) to allow LZX:21, with no luck, so I can't say how much I could gain. (CHM with LZX:21 is already supported in Windows... strange!)

    So in general, it would be best if the recompressed files could be read/viewed directly, especially if decompression is fast.

    BTW: I'm still looking for a way to build a preprocessor for LZX compression.
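    The dictionary-size effect behind the LZX:18 vs LZX:21 comparison can be demonstrated with Deflate, where the wbits parameter plays the same role (Deflate's window maxes out at 32 KB, so this only illustrates the principle, not LZX:21 itself):

```python
import random
import zlib

def deflate_size(data: bytes, wbits: int) -> int:
    # Compress with a window of 2**wbits bytes and return the output size.
    c = zlib.compressobj(9, zlib.DEFLATED, wbits)
    return len(c.compress(data) + c.flush())

# A 30 KB incompressible chunk repeated 3 times: the repeats are only
# visible to a window large enough to reach back to the previous copy.
rng = random.Random(0)
chunk = bytes(rng.randrange(256) for _ in range(30000))
data = chunk * 3

small = deflate_size(data, 9)    # 512-byte window: misses the repeats
large = deflate_size(data, 15)   # 32 KB window: sees and exploits them
```

    The same logic explains why a 2 MB LZX dictionary helps so much on CHM/CAB contents: far-apart repeats across files become reachable matches.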

  14. #14 - Member (Kuwait)
    Here is the StuffIt plugin for Office & Photoshop. Remove the 123s from the link:

    http://m123y.smith123micro.com/down1...123Plugins.exe (11 MB)

    It works very well on PSD files but compresses worse when ICC profiles are present.

  15. #15 - Member (England)
    I've been playing with a demo version of StuffIt 12 Deluxe and comparing it to my own version 9. I only use StuffIt for JPEGs, and the sad news is that JPEG compression is WORSE in 12 than in 9 - not by a huge percentage, but still worse. It is slightly faster, so I guess they wanted a tradeoff. But at least for me, I won't be purchasing it while v9 does what I want better.

    Also, the command-line functionality has changed, and it seems to freak out when I use wildcards.

    In 9 I use: stuff /c --jpeg-method=2 --jpeg-level=2 --jpeg-no-thumbnails *.jpg

    In 12 I have to use: console_stuff.exe /c --jpeg-no-thumbnails .
    And instead of placing the archive in the current dir like v9, it dumps it into the parent directory. That isn't a huge issue, as it's what I'd do anyway. But unless you put the . there, or list individual files, you can't use *.jpg/*.jpeg. If you try, it just gives:

    Source file(s):
    1. *.jpg

    Archiving... (sitx format)
    0
    done
    Unable to archive files - engine error: OS error - The filename, directory name,
    or volume label syntax is incorrect.
    - error code is 123

  16. #16 - Member (Kuwait)
    PackJPG is best for me, as it's only about 100 KB for both compressing and decompressing, and it could be even less if an SFX stub mode were added. I can't wait for v2.4.

  17. #17 - Member (England)
    Sorry, I don't understand. Are you saying PackJPG is better at compressing JPEGs than StuffIt? In all the tests I've done, StuffIt is far superior. The only problem, I'd say, is that it can't be easily integrated with FreeArc. But maybe you mean the PackJPG "package" size? In that case, yes, it is a lot smaller.

  18. #18 - Member (Antwerp, Belgium)
    Quote Originally Posted by maadjordan View Post
    PackJPG is best for me, as it's only about 100 KB for both compressing and decompressing, and it could be even less if an SFX stub mode were added. I can't wait for v2.4.
    PackJPG is a very nice tool, but compared to StuffIt it is 4-5 times slower and packs worse.
    The most important reason for StuffIt's much higher speed is multicore support: on my C2Q, I guess StuffIt packs 4 files in parallel.
    Another reason is probably resources: StuffIt can put more resources into their product than Matthias can put into PackJPG.

    Result from a test I did some time ago:
    31 JPG files / 156 MB total

    StuffIt 12: 56.7 s / ratio: 75.4%
    PackJPG v2.4WIP: 240 s / ratio: 79.3%

  19. #19 - Member (Kuwait)
    StuffIt is higher in compression than PackJPG (even paq is much higher), and PackJPG still needs more work to catch up, but it's improving and has much wider JPEG support than paq and StuffIt.

    For example, graphic designers often use CMYK color mode in their JPEGs, which only PackJPG supports.

    Also, all of these packers work on a file-by-file basis, while similar images (e.g. an AVI with the MJPEG codec - refer to my findings in the Precomp 0.38 topic) have large matching areas, so compression could be greatly improved if the data were combined, or reduced before combining (maybe delta-coding or BWT over the frames would do it). I could test this later; it's already done for animated GIF, and the principles are the same for JPEG (refer to http://www.webreference.com/dev/gifanim/frame.html).
    I call this feature "catalog" compression. Try combining four JPGs of the same image with jpegjoin (http://jpegclub.org/jpegjoin.zip) and compressing the result (I'll check this).

    And regarding the SFX stub I mentioned: if introduced, I could email PackJPG images and the user would just double-click to unpack (if he trusted the sender, of course).
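    The "catalog" effect is easy to demonstrate on decompressed data: compressing two similar buffers together lets the second be coded as references into the first. A sketch with zlib as a stand-in (for JPEGs you would first have to undo the entropy coding, which is the hard part):

```python
import random
import zlib

# Two "frames" that share most of their content, standing in for
# consecutive decompressed MJPEG frames or near-identical photos.
rng = random.Random(0)
frame_a = bytes(rng.randrange(256) for _ in range(20000))
frame_b = frame_a[:1000] + bytes(100) + frame_a[1100:]  # small local change

# File-by-file compression vs. compressing the "catalog" as one stream.
separate = len(zlib.compress(frame_a)) + len(zlib.compress(frame_b))
combined = len(zlib.compress(frame_a + frame_b))
```

    With solid ("catalog") compression, the second frame costs almost nothing, because the compressor's window still contains the first one.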

  20. #20 - Member (England)
    I don't know any graphics guys who would use a CMYK JPEG if it's being sent to the printers (unless of course it's saved as a lossless JPEG) - you don't want your prints to be full of crappy JPEG artifacts ;p Not to mention the loss of colour resolution/clarity, even with the best subsampling rates. And none of the ones I know use anything beyond ZIP to send stuff around; getting them to use PackJPG would just baffle them, heh.

    PackJPG support for IrfanView is hopefully due out at some point, according to Matthias' website (I think I read that there, anyway), which could be very handy.

    I do use PackJPG (through Precomp) most of the time, purely because it integrates easily with various archivers. But for bunches of JPEGs (with no or very few other file types), StuffIt always gets used on my systems. Great speed and compression.

    NB: Unless an image is pretty large, StuffIt is a lot faster and gives better compression if the image has an optimized Huffman table and is a non-progressive JPEG.
    In general, for JPEGs, as long as the image stays pixel-exact, I'll run it through jpegtran with -copy none -optimize -progressive, but my script also goes through several stages, both with and without -progressive, then compares the file sizes and uses the smallest. If they have ICC profiles, I have to use -copy all.
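    A multi-stage script like the one described might look roughly like this. jpegtran and its -copy/-optimize/-progressive flags are real, but the surrounding wrapper is a hypothetical reconstruction, not the poster's actual script:

```python
import subprocess

# jpegtran flag sets to try; the smallest valid output wins.
# Use ["-copy", "all", ...] variants instead when ICC profiles must survive.
VARIANTS = [
    ["-copy", "none", "-optimize"],
    ["-copy", "none", "-optimize", "-progressive"],
]

def pick_smallest(candidates: list) -> bytes:
    # Keep the shortest candidate; the original is always among them,
    # so this step can never make the file larger.
    return min(candidates, key=len)

def squeeze(path: str) -> bytes:
    """Run jpegtran with each flag set and keep the smallest output
    that jpegtran produced successfully (exit code 0)."""
    with open(path, "rb") as f:
        candidates = [f.read()]          # fallback: the original file
    for flags in VARIANTS:
        p = subprocess.run(["jpegtran", *flags, path], capture_output=True)
        if p.returncode == 0 and p.stdout:
            candidates.append(p.stdout)
    return pick_smallest(candidates)
```

    All the transforms here are pixel-exact, matching the constraint in the post; only metadata handling (-copy) changes what is kept.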
