
Thread: RAZOR - strong LZ-based archiver

  1. #61
    Member
    Join Date
    May 2017
    Location
    Australia
    Posts
    126
    Thanks
    92
    Thanked 32 Times in 21 Posts
    Quote Originally Posted by Gonzalo View Post
    As of today, I think WCX Totalcommander plugins are the de facto standard for data compression dlls. They have a well-defined way of handling most if not all parameters somebody would need to include their functionality in a separate program. Various commander-style programs include support for those plugins, even other apps like PowerArchiver. So, if released as a WCX plugin, Razor becomes automatically available to an enormous number of people _and_ gets a GUI for free. (Archives in those 'commander' apps are treated almost like a normal folder.)

    I also feel the same way about Razor. It would be a great replacement for LZMA in installers like those of NSIS, for example. I also think a non-archiver version that can be used as a codec could have a very promising future.

    See this for an example of WCX uses:
    https://totalcmd.net/directory/packer.html
    Interesting feedback.

    Of course, a DiskZIP plug-in has the advantage of making all Razor archives appear as folders in Windows File Explorer - without even needing a third-party 'commander'-class tool at all.

    I'll take a look at the WCX APIs to see if they are compatible with DiskZIP's requirements for its own client apps, and especially the Windows File Explorer "archives as folders" integration. Drag&drop/copy&paste requirements can be pretty stringent, but as long as the API is reasonably flexible, I don't anticipate many problems.

  2. #62
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    3,908
    Thanks
    291
    Thanked 1,271 Times in 718 Posts
    Tested 7z vs 7zdll/PA vs rz:
    Code:
    41,262,256 // powerarc.exe (x64)
    
     6,994,751 10.093s 0.515s // 7z a -mx=9 -myx=9 1.7z powerarc.exe
     6,899,936 10.093s 0.515s // 7z a -mx=9 -mf=off -m0=bcj2:d40M -m1=lzma:a1:d26:lc8:pb2:lp0:fb273:mc999 ...
     6,622,904 67.268s 0.686s // rz.exe a 1.rz powerarc.exe
     6,599,964 18.767s 0.983s // 7zdll_vF5 x64flt3/deltb/lzma:mt2
     6,471,954 27.050s 3.198s // 7zdll_vF5 x64flt3/deltb/plzma4:mt1
     6,459,780 30.904s 3.666s // 7zdll_vF5 x64flt3/deltb/plzma
       
     6,622,904 67.268s 0.686s // rz.exe a 1.rz powerarc.exe
     6,607,610 69.889s 0.717s // rz.exe a 1.rz *.bin (dumped x64flt3 output from 7zdll)
     6,653,256 69.358s 0.749s // rz.exe a 1a.rz *.bin (zeroed MZ/PE signatures)
     6,511,817 70.060s 0.718s // rz.exe a 1.rz *.bin (dumped x64flt3+deltb output from 7zdll)
     6,510,018 69.889s 0.671s // rz.exe a 1a.rz *.bin (zeroed MZ/PE signatures)
    Observations:
    1. x64flt3/deltaB/lzma gives better compression with 4x faster encoding and 30% slower decoding.
    7zdll also has MT (including decoding), so decoding speed comparison is not really relevant.
    2. x64flt3+deltaB is a better exe preprocessor than whatever is built into rz.
    Also, rz uses MZ/PE exe signatures to enable some preprocessing, which can hurt compression (e.g. on already preprocessed files).
    Other tests also showed that srep is a better dedup filter.
    3. rz may be a good integrated solution for some cases, but from a developer's p.o.v. it doesn't really win over lzma.

    @diskzip:
    1. "DiskZIP standard data-set" consists of exe files (mostly windows update archives), unpacked and in all kinds of containers (msi/msu/msp, cab, 7z stored)
    2. "DiskZIP's 7-Zip plug-in at the super-high compression settings." gives the same result as 7-zip -mx=9 -myx=9 -mqs -md=1536M
    3. It's certainly an interesting dataset, which is good for testing exe preprocessing, deflate/LZX/cab/7z recompression, dedup preprocessing, and some other things.
    But it's hardly a relevant set for archiver users.

  3. Thanks (4):

    Bekk (1st April 2018),comp1 (20th September 2017),Mike (20th September 2017),Simorq (30th January 2018)

  4. #63
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    918
    Thanks
    57
    Thanked 113 Times in 90 Posts
    Quote Originally Posted by avitar View Post
    Some of you seem to be making a big deal re 32 bits - surely the point is, if it can be done easily, why not?
    I don't think anyone ever said not to do it.


    @Shelwien
    interesting test



    Code:
    C&C Red Alert 3: iso + 1 cue/bin CD image, ECM prefiltered
    
    original     2.02 GB (2,173,803,698 bytes)
    M7Repacker    864 MB (906,830,654 bytes)
    Razor         798 MB (837,767,739 bytes)

  5. #64
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
    Quote Originally Posted by diskzip View Post
    I'll take a look at the WCX API's to see if they are compatible with ... the Windows File Explorer "archives as folders" integration.
    Yes, they are. I know because it is already implemented in tc4shell. You can see the screenshots on its website.
    The author is here in the forum and goes by the name of Aniskin.

    Quote Originally Posted by Shelwien View Post
    3. It's certainly an interesting dataset, which is good for testing exe preprocessing, deflate/LZX/cab/7z recompression, dedup preprocessing, and some other things.
    But it's hardly a relevant set for archiver users.
    Well, it certainly is for me. I'm not a developer, just a random user - a power user, if you like. But the main use case for which I use a compressor involves exactly those things: documents compressed with deflate like PDFs and MS/LO files, LZX recompression, and practical long-term archiving of several GB of data, which definitely needs a deduplication stage. And I really think that I'm not the only one, by a long shot. What do people compress and send over the net? Documents and photos to colleagues and friends. Software to customers. More documents and photographs to online backup services. What kind of data does one care about enough to make an offline backup to an external drive? Operating system / user folder... Again, deflate, JPG, MP3 and executable files...

  6. #65
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    3,908
    Thanks
    291
    Thanked 1,271 Times in 718 Posts
    7zdll/PA has deflate (reflate), jpeg (lepton,jojpeg,precomp), mp3 (packmp3b/c), exe (x64flt*) preprocessing and dedup (rep1).
    And nobody has LZX recompression at this point.
    But all of this doesn't matter with the diskzip set (reflate has a minor effect) - what matters is applying exe preprocessing to all files
    (who'd normally preprocess .7z files? well, there're 7z/cab/msi with stored exes there) and using d1536M for lzma.
    To clarify, you need 18-20GB of RAM to use even one instance of lzma:d1536M, thus no MT there.
    So yeah, it's hard to force PowerArchiver GUI to beat diskzip's record for this set (not impossible, just lots of manual settings), but so what.

  7. #66
    Member
    Join Date
    May 2017
    Location
    Australia
    Posts
    126
    Thanks
    92
    Thanked 32 Times in 21 Posts
    Quote Originally Posted by Shelwien View Post
    7zdll/PA has deflate (reflate), jpeg (lepton,jojpeg,precomp), mp3 (packmp3b/c), exe (x64flt*) preprocessing and dedup (rep1).
    And nobody has LZX recompression at this point.
    But all of this doesn't matter with the diskzip set (reflate has a minor effect) - what matters is applying exe preprocessing to all files
    (who'd normally preprocess .7z files? well, there're 7z/cab/msi with stored exes there) and using d1536M for lzma.
    To clarify, you need 18-20GB of RAM to use even one instance of lzma:d1536M, thus no MT there.
    So yeah, it's hard to force PowerArchiver GUI to beat diskzip's record for this set (not impossible, just lots of manual settings), but so what.
    The funny thing with PA was, we got bogged down in a discussion of whether I used 7z.exe or DiskZIP itself in creating the archive, which was very counterproductive. And I do find it surprising that they were unable to provide actual instructions to defeat DiskZIP's 1.5 GB dictionary compression rates - this is their own technical support, mind you.

    As for RAM requirements, DiskZIP uses 2 threads (adding more hurts compression ratios aggressively), and total memory used is about 17.5 GB maximum during processing, which is a lot, of course - but not as bad as you thought it would be.

    Why, do you think, did Razor do so well on this data set? I am still blown away, although at half the compression speed, thinking more soberly, the limited compression savings do not look as attractive. On the plus side, a lot less memory is required than with 7-Zip; one does wonder what Razor might achieve with a 1.5 GB dictionary in turn.

    I realize my data set is not for everyone, but if you do have any practical instructions that defeat DiskZIP on this dataset, preferably with lower time (and memory) requirements than Razor (or 7-Zip), I would love to hear of them. 7-Zip fortified with additional codecs sounds like a good idea, especially if the codecs are also open source like 7-Zip; but to date my efforts have not been successful. Which is mainly why I am so in awe of Razor right now, as it was the only thing that defeated DiskZIP fair and square!

    Thank you very much for your help with my research.

  8. #67
    Member
    Join Date
    May 2017
    Location
    Australia
    Posts
    126
    Thanks
    92
    Thanked 32 Times in 21 Posts
    Quote Originally Posted by Gonzalo View Post
    Yes, they are. I know because it is already implemented in tc4shell. You can see the screenshots on its website.
    The author is here in the forum and goes by the name of Aniskin.



    Well, it certainly is for me. I'm not a developer, just a random user - a power user, if you like. But the main use case for which I use a compressor involves exactly those things: documents compressed with deflate like PDFs and MS/LO files, LZX recompression, and practical long-term archiving of several GB of data, which definitely needs a deduplication stage. And I really think that I'm not the only one, by a long shot. What do people compress and send over the net? Documents and photos to colleagues and friends. Software to customers. More documents and photographs to online backup services. What kind of data does one care about enough to make an offline backup to an external drive? Operating system / user folder... Again, deflate, JPG, MP3 and executable files...
    Wow, tc4shell! I had seen this before, I will have to check it again.

    Thank you for bringing all of these interesting developments to my attention.

    Usually the biggest problems with shell namespace extensions are:

    1. Stability
    2. Speed
    3. Copy&paste
    4. "Seamlessness" of the overall implementation (supporting things like double-click to launch, edit, and update documents inside archives, etc.)

    I actually have a very large archive to test it on, I will let you know how it goes (and I will upload the archive for feedback as well).

  9. #68
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    3,908
    Thanks
    291
    Thanked 1,271 Times in 718 Posts
    > And I do find it surprising that they were unable to provide an actual
    > instruction to defeat DiskZIP's 1.5 GB dictionary compression rates - this
    > is their own technical support, mind you.

    They're trying to find an actual solution - something that normal users
    could actually do, instead of writing a complex commandline override.
    I'm not sure if it was successful at this point - I've seen some screenshots
    with better results, but it involved manual renaming of certain file extensions
    or something (to get exe preprocessing automatically applied to them).

    > How come, do you think, Razor did so well on this data set?

    It simply has good preprocessing (exe, delta, multimedia at least) and dedup.
    You'd likely get similar results using Bulat's external filters (delta,mm,srep)
    on an uncompressed archive of your set, then compressing with lzma
    (with a more practical dictionary size, even).
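    The final lzma stage of such a pipeline can be sketched with Python's stdlib lzma module as a stand-in (a hedged illustration: the delta distance and the 16 MiB dictionary here are arbitrary assumptions, and the external delta/mm/srep preprocessing tools are not shown):

```python
import lzma

# Illustrative filter chain: a simple delta stage followed by LZMA2.
# The dictionary size is deliberately small here (16 MiB); in practice
# you would scale it up toward the sizes discussed in the thread.
FILTERS = [
    {"id": lzma.FILTER_DELTA, "dist": 4},
    {"id": lzma.FILTER_LZMA2, "preset": 9, "dict_size": 1 << 24},
]

def compress(data: bytes) -> bytes:
    return lzma.compress(data, format=lzma.FORMAT_XZ, filters=FILTERS)

def decompress(blob: bytes) -> bytes:
    # The .xz container records the filter chain, so none is needed here.
    return lzma.decompress(blob, format=lzma.FORMAT_XZ)

if __name__ == "__main__":
    sample = bytes(range(256)) * 4096   # regular data, compresses well
    packed = compress(sample)
    assert decompress(packed) == sample
    print(f"{len(sample)} -> {len(packed)} bytes")
```

    The point is only that the lzma stage itself is freely configurable; on a set like this, the heavy lifting comes from the preprocessing stages in front of it.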

    > On the plus side, a lot less memory is required than 7-Zip, one does wonder
    > what Razor might achieve with a 1.5 GB dictionary in turn.

    Well, that's also configurable, actually.
    For example, you can use the hc4 matchfinder with 7z and get 6.5n instead of 11.5n
    memory usage.
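    As a rough sanity check, those coefficients (~11.5n for the default bt4 match finder vs ~6.5n for hc4, as quoted above) can be plugged into a small estimator; the coefficients are approximations from this thread, not exact figures from the 7-Zip documentation:

```python
# Approximate LZMA encoder memory as coef * dict_size, with match-finder
# coefficients taken from the discussion above (assumed, not exact).
COEF = {"bt4": 11.5, "hc4": 6.5}

def encoder_mem_gib(dict_bytes: int, matchfinder: str = "bt4") -> float:
    return COEF[matchfinder] * dict_bytes / 2**30

if __name__ == "__main__":
    d = 1536 * 2**20  # the 1.5 GiB dictionary discussed in the thread
    print(f"bt4: {encoder_mem_gib(d, 'bt4'):.2f} GiB")  # ~17.25 GiB
    print(f"hc4: {encoder_mem_gib(d, 'hc4'):.2f} GiB")  # ~9.75 GiB
```

    The bt4 figure lines up with the roughly 17.5 GB peak diskzip reported for the 1.5 GB dictionary.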

    > I realize my data set is not for everyone, but if you do have any practical
    > instructions that defeat DiskZIP on this dataset, with preferably less time
    > (and memory) requirements than Razor (or 7-Zip), I would love to hear of them.

    There's a commandline version of .pa engine used by PowerArchiver (7zdll).
    It's based on 7z with all 7z codecs, so it should be able to at least reproduce
    your result.
    And then, we could construct a commandline with plzma,x64flt3,reflate and
    whatever else necessary, and it would certainly improve that result.

    Just that testing this takes a lot of time (20GB of data without blockwise MT),
    and the set is not interesting enough for me to spend a few days on it.

    Well, one simple way, I guess, would be to take your archive and process
    it with lzmarec - that'd automatically reduce your result by 2% or so
    (you'd need a custom version though; the normal one supports up to a 128M dictionary).

    > Which is mainly why I am so in awe of Razor right now, as it was the only
    > thing which defeated DiskZIP square and fair!

    Did you test nanozip with a large dictionary?
    http://web.archive.org/web/201402161.../download.html

    There're actually plenty of solutions, if you don't care about speed and memory.

  10. Thanks (2):

    Bekk (1st April 2018),diskzip (20th September 2017)

  11. #69
    Member
    Join Date
    May 2017
    Location
    Australia
    Posts
    126
    Thanks
    92
    Thanked 32 Times in 21 Posts
    So I have some preliminary feedback.

    It seems tc4shell has come a long way, but still has a bit more to go.

    You can find my super large archive here:

    https://mega.nz/#!MTQmxS6L!5DkfO3QiY...UBLM3CyAozSuhc

    It's actually part of the leaked Windows NT 4.0 source code, from way back when. Hopefully nobody is offended... The reason I use this is that it has tens of thousands of files inside it - the archive's byte size isn't that great, but the number of files inside it is truly incredible.

    So on each metric:

    1. Stability: Seems fine. If I recall correctly, earlier releases used to crash on this archive.

    2. Speed: Faster than Windows when browsing this ultra large archive (try Windows's own before installing tc4shell or DiskZIP), but still slower than DiskZIP.

    3. Copy&Paste: Seems to extract files singly, which is very slow compared to DiskZIP. Same problem with Drag&Drop. Also tested it with a 7-Zip solid archive, which killed performance - single-file extraction (instead of at-once extraction of everything selected) is a big mistake for solid archives. Extracting small header files of less than 10K each takes multiple seconds apiece with this approach.

    4. Seamlessness: Extraction freezes main File Explorer window. No support for in-place edits/updates of files inside archives. However, I like how they bind actions to the native File Explorer ribbon UI.

    I'd be happy to cross-pollinate my efforts with his or hers; it would only help make both products better.

  12. #70
    Programmer
    Join Date
    Feb 2007
    Location
    Germany
    Posts
    420
    Thanks
    28
    Thanked 160 Times in 18 Posts
    Thank you very much for your feedback.

    I've had a long talk with Stephan last weekend. I'll continue to work on razor. But please, don't overwhelm me with requests. I do have my own road-map for razor. To clear this up: I won't publicly release a 32-bit compile, not now. Maybe at a later time.

    I have a completely compression-unrelated fulltime job, a pregnant wife, family, friends, other hobbies, ... I need to slow down development. Compression is one (of many) hobbies - it should be fun. I freely share with you my work and therefore, my free time. And writing a good, new compressor IS work - for which I don't get paid at all.

    The next thing on my todo-list is to improve/clean the internal codec-API. This should facilitate exposing a useful external API and adding new elements (e.g. recompression: internal or via an API).

    Some thoughts about razor:
    -Razor was and is intended to become an all-in-one, easy-to-use compression engine. And much later, a GUI-based archiver.
    -Decompression speed and memory is very important.
    -Its aim is to be good in most situations. It does not aim to be better than highly configured solutions or scripts. It just works.
    -razor's exe-processing is forced by a valid signature. Detection works without signatures, too. But using already preprocessed files with a valid signature can harm the ratio.
    -On average, rz's lz/rolz-engine (without dedupe, exe, wav, ...) is stronger than 7zip & rzm.
    -razor's dedupe-engine is probably different from srep. rz's engine is block-level - but with variable block-size as the data volume streams in. It scales well with very little memory.
    -Compression speed: razor's parser is slow and very complex. I'm sure it can be sped up a lot - but this takes time. Sometimes I only spend one hour a week on razor - but I do want to spend this time on new features, not on the parser or fending off requests.
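    The variable-block dedupe idea can be illustrated with a generic content-defined-chunking sketch (this is NOT razor's actual engine; the Buzhash window, the ~1 KiB average block size, and the minimum chunk size are arbitrary assumptions):

```python
import hashlib

# Content-defined chunking sketch (generic technique, not razor's engine):
# a Buzhash rolling hash over a 16-byte window picks chunk boundaries, so
# block sizes vary with the data, and duplicate regions produce identical
# chunks even when they shift position in the stream.
WINDOW = 16
MASK = (1 << 10) - 1   # boundary when low 10 bits are zero -> ~1 KiB average
MIN_CHUNK = 64

TABLE = [int.from_bytes(hashlib.sha256(bytes([i])).digest()[:4], "big")
         for i in range(256)]

def _rotl(x: int, r: int) -> int:
    r %= 32
    return ((x << r) | (x >> (32 - r))) & 0xFFFFFFFF

def chunks(data: bytes):
    h, start = 0, 0
    for i, b in enumerate(data):
        h = _rotl(h, 1) ^ TABLE[b]
        if i >= WINDOW:
            h ^= _rotl(TABLE[data[i - WINDOW]], WINDOW)   # slide the window
        if i - start + 1 >= MIN_CHUNK and (h & MASK) == 0:
            yield data[start:i + 1]
            start = i + 1
    if start < len(data):
        yield data[start:]

def dedup_stats(data: bytes):
    total, unique = 0, set()
    for c in chunks(data):
        total += 1
        unique.add(hashlib.sha256(c).digest())
    return total, len(unique)

if __name__ == "__main__":
    period = hashlib.sha256(b"seed").digest()
    blob = bytearray()
    while len(blob) < 8192:               # deterministic pseudo-random block
        period = hashlib.sha256(period).digest()
        blob += period
    sample = bytes(blob) * 8              # 8 repeats of the same 8 KiB
    total, unique = dedup_stats(sample)
    print(total, "chunks,", unique, "unique")
```

    Because boundaries depend only on a small content window, repeated regions produce identical chunks wherever they occur in the stream, which is what makes block-level dedup with variable block sizes work.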

    WCX plugins: I have never used Totalcommander. But this sounds interesting as an intermediate solution. I'll look into it. Anyway, since I want to improve the internal codec-API, this might be helpful.
    -Are there legal things which must be considered when providing a WCX-plugin?
    -Which 'commander'-Apps can be recommended?
    -Can someone please point me to an official WCX API-documentation? I did not find any.
    -Are there other good compression APIs?

  13. Thanks (5):

    Bekk (1st April 2018),JamesB (21st September 2017),msat59 (21st September 2017),spwolf (23rd September 2017),Stephan Busch (21st September 2017)

  14. #71
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    536
    Thanks
    236
    Thanked 90 Times in 70 Posts
    Quote Originally Posted by Christian View Post
    Thank you very much for your feedback.

    I've had a long talk with Stephan last weekend. I'll continue to work on razor. But please, don't overwhelm me with requests. I do have my own road-map for razor. To clear this up: I won't publicly release a 32-bit compile, not now. Maybe at a later time.

    I have a completely compression-unrelated fulltime job, a pregnant wife, family, friends, other hobbies, ... I need to slow down development. Compression is one (of many) hobbies - it should be fun. I freely share with you my work and therefore, my free time. And writing a good, new compressor IS work - for which I don't get paid at all.
    Fair enough, man. We sometimes forget that you are human too.


    -Are there legal things which must be considered when providing a WCX-plugin?

    Not that I'm aware of. It has been used by lots of projects with very different licenses.

    Packer-Plugin writer's guide 2.21se
    Description of the plugin interface - write your own plugin!

    -Which 'commander'-Apps can be recommended?

    The original one is Totalcommander. Its inventor is also the 'father' of WCX plugins. But it is a matter of personal preference, I guess. Me, I set aside Totalcommander years ago in favour of other similar programs. Right now I'm using Linux, so I guess it doesn't apply any more.
    Some alternatives are:
    Freecommander
    DoubleCommander
    Unrealcommander
    nomad.net

    I can confirm these support WCX plugins. There are also other programs that do not use a 'commander' style but support the plugins too, for example tc4shell, PowerArchiver, and even a tiny command-line decompression-only utility that is used as a backend to extract some formats in Universal Extractor.


    -Can someone please point me to an official WCX API-documentation? I did not find any.

    https://totalcmd.net/plugring/packer_interface.html
    https://totalcmd.net/plugring/WCXTest.html


    -Are there other good compression APIs?

    7-zip and FreeArc have their own DLL-based plugin interfaces. Note that FA is discontinued and no real plugins have been made with its API, AFAIK.

  15. Thanks (6):

    avitar (21st September 2017),Bekk (1st April 2018),Bulat Ziganshin (21st September 2017),Crispin (5th February 2018),Hacker (19th October 2017),jibz (21st September 2017)

  16. #72
    Member jibz's Avatar
    Join Date
    Jan 2015
    Location
    Denmark
    Posts
    124
    Thanks
    106
    Thanked 71 Times in 51 Posts
    Quote Originally Posted by Gonzalo View Post
    The original one is Totalcommander. Its inventor is also the 'father' of WCX plugins. But it is a matter of personal preference, I guess. Me, I set aside Totalcommander years ago in favour of other similar programs. Right now I'm using Linux, so I guess it doesn't apply any more.
    For something less commander and more explorer like, xplorer2 also added support for WCX plugins recently.

  17. Thanks:

    Hacker (19th October 2017)

  18. #73
    Member
    Join Date
    May 2017
    Location
    Australia
    Posts
    126
    Thanks
    92
    Thanked 32 Times in 21 Posts
    Quote Originally Posted by Gonzalo View Post
    7-zip and FreeArc have their own DLL-based plugin interfaces. Note that FA is discontinued and no real plugins have been made with its API, AFAIK.
    DiskZIP also has its own plug-in interface, for both clients (apps consuming plug-ins, such as the Windows File Explorer integration, which remains the fastest and most seamless available) and servers (DLLs and EXEs providing data compression services to clients).

  19. #74
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    918
    Thanks
    57
    Thanked 113 Times in 90 Posts
    Code:
    Titan Quest + Immortal Throne (iso + Alcohol DVD image)
    
    Original            5.26 GB (5,654,699,581 bytes)
    7-zip ultra 1.5GB   5.10 GB (5,479,302,048 bytes)
    Razor  1023MB       5.06 GB (5,437,533,161 bytes)

  20. #75
    Member
    Join Date
    Dec 2016
    Location
    Location
    Posts
    1
    Thanks
    71
    Thanked 0 Times in 0 Posts
    Christian, thank you for the program - this is the greatest compression tool I've seen.
    Please make a 32-bit version, and add the ability to create SFX archives.

  21. #76
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    918
    Thanks
    57
    Thanked 113 Times in 90 Posts
    Another example of where Razor really takes the crown:

    Code:
    Quake 2 + Expansions (1 iso + 2 cue/bin ECM filtered CD images)
    
    Original        2.69 GB (2,895,511,173 bytes)
    Winrar 4        1.59 GB (1,712,511,589 bytes)
    Winrar 5        1.22 GB (1,315,399,512 bytes)
    7-zip           782 MB (820,524,746 bytes)
    Razor           641 MB (673,086,685 bytes)

  22. #77
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    918
    Thanks
    57
    Thanked 113 Times in 90 Posts
    Code:
    Fable - The Lost Chapters (3x iso, 1x CloneCD image, ECM filtered)
    
    original     2.16 GB (2,328,979,260 bytes)
    m7repacker   2.08 GB (2,240,516,679 bytes)
    razor        2.08 GB (2,239,385,562 bytes)

  23. #78
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    918
    Thanks
    57
    Thanked 113 Times in 90 Posts
    Code:
    Hexen 2 + Expansion (2x cue/bin ECM prefiltered CD images)
    
    NanoZip* 597 MB (626,591,921 bytes)
    Razor    551 MB (578,221,802 bytes)


    * The two main files were compressed with nanozip individually (-Co or -CO), then stored in a 7zip container, with the smaller files being compressed inside this 7zip container.
    nz.exe was also stored compressed inside this 7zip container.

  24. #79
    Member
    Join Date
    Aug 2016
    Location
    Russia
    Posts
    105
    Thanks
    6
    Thanked 70 Times in 37 Posts
    Quote Originally Posted by diskzip View Post
    So I have some preliminary feedback.

    3. Copy&Paste: Seems to extract files singly, which is very slow, compared to DiskZIP. Same problem with Drag&Drop. Also tested it with a 7-Zip solid archive, which killed performance - single file extraction (instead of at-once extraction of everything selected) is a big mistake for solid archives. Extracting small header files each less than 10K takes multiple seconds each with this approach.
    This is a limitation of the shell subsystem. Drag&drop and copy&paste operations request an IDataObject from the source. In my implementation of IDataObject I always return streams (FileGroupDescriptorW/FileContents formats). And for every single stream I have to open the archive, enumerate its objects, and create the stream. This approach reduces extraction speed during drag&drop and copy&paste operations but has several advantages. For example, it is very easy to support thumbnails and preview panels. The user can preview the content of compressed files directly in an Explorer window with the thumbnail view mode and the preview pane (with the help of the system thumbnail and preview handlers). Also, some apps like Windows Photo Viewer can browse and show the content of archives. As far as I know, all archive apps extract the required files to a temp folder and pass the path of the temp folder to the shell. This method gives maximal speed, but it has a small problem with temp files - we cannot detect the moment when we can delete them. You can find the following notes in the TC4Shell help:

    Important! You can also extract files from an archive using the drag&drop functionality, but it is not recommended, since extraction can be extremely slow in certain cases.

    Important! Yet another alternative is to use the copy&paste functionality for extraction. However, it is recommended that you use the Paste from archive command instead of Paste in order to speed up the extraction process.

    So I strongly recommend that TC4Shell users use the "Extract to" function. In that case you get the maximum available speed.
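    The solid-archive cost described above can be demonstrated with a stand-in: a .tar.gz is "solid" in the sense that reaching member N means decompressing everything before it, so per-file requests that reopen the archive (as per-stream IDataObject requests effectively do) redo that work every time:

```python
import io
import tarfile
import time

# Stand-in demo: extract members from a gzip-compressed tar one by one
# (reopening the archive per request) vs in a single pass.

def make_solid_archive(n_files: int, size: int = 4096) -> bytes:
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tf:
        for i in range(n_files):
            payload = bytes([i % 256]) * size
            info = tarfile.TarInfo(name=f"file{i:03}.bin")
            info.size = len(payload)
            tf.addfile(info, io.BytesIO(payload))
    return buf.getvalue()

def extract_one_by_one(blob: bytes, names) -> dict:
    out = {}
    for name in names:                      # one full open per request
        with tarfile.open(fileobj=io.BytesIO(blob), mode="r:gz") as tf:
            out[name] = tf.extractfile(name).read()
    return out

def extract_single_pass(blob: bytes) -> dict:
    with tarfile.open(fileobj=io.BytesIO(blob), mode="r:gz") as tf:
        return {m.name: tf.extractfile(m).read()
                for m in tf.getmembers() if m.isfile()}

if __name__ == "__main__":
    blob = make_solid_archive(100)
    names = [f"file{i:03}.bin" for i in range(100)]
    t0 = time.perf_counter(); a = extract_one_by_one(blob, names)
    t1 = time.perf_counter(); b = extract_single_pass(blob)
    t2 = time.perf_counter()
    assert a == b
    print(f"one-by-one: {t1-t0:.3f}s  single pass: {t2-t1:.3f}s")
```

    With per-request opens, the total work grows quadratically in the number of members, which matches the multi-second-per-file behaviour reported above for solid archives.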

    Quote Originally Posted by diskzip View Post
    4. Seamlessness: Extraction freezes main File Explorer window.
    That is correct only for drag&drop and copy&paste operations. In the case of "Extract to", extraction always occurs in a separate thread (except when the CMIC_MASK_NOASYNC flag is passed).

    Quote Originally Posted by diskzip View Post
    No support for in-place edits/updates of files inside archives.
    Cannot confirm. There is an Edit command in the context menu of files inside archives. You can even edit files in archives that are inside other archives.

  25. Thanks (2):

    Bekk (1st April 2018),diskzip (25th September 2017)

  26. #80
    Member
    Join Date
    Aug 2016
    Location
    Russia
    Posts
    105
    Thanks
    6
    Thanked 70 Times in 37 Posts
    Quote Originally Posted by Gonzalo View Post
    As of today, I think WCX Totalcommander plugins are the de facto standard of data compression dlls.
    Imho the WCX standard is a bad choice. I (as a developer) can say that it has tons of limitations. The API of the 7z File Manager is much more flexible. And this API is supported by many external apps too.

  27. Thanks (2):

    Bekk (1st April 2018),diskzip (25th September 2017)

  28. #81
    Member
    Join Date
    May 2017
    Location
    Australia
    Posts
    126
    Thanks
    92
    Thanked 32 Times in 21 Posts
    Hi Aniskin,

    It's very nice to meet you!

    I would love to help you with any technical questions you may have - feel free to email me directly. As I wrote in my first post, I strongly believe there's much to be gained from both products when we cross-pollinate ideas and features with one another.

    Quote Originally Posted by Aniskin View Post
    This is a limitation of the shell subsystem. Drag&drop and copy&paste operations request an IDataObject from the source. In my implementation of IDataObject I always return streams (FileGroupDescriptorW/FileContents formats). And for every single stream I have to open the archive, enumerate its objects, and create the stream. This approach reduces extraction speed during drag&drop and copy&paste operations but has several advantages. For example, it is very easy to support thumbnails and preview panels. The user can preview the content of compressed files directly in an Explorer window with the thumbnail view mode and the preview pane (with the help of the system thumbnail and preview handlers). Also, some apps like Windows Photo Viewer can browse and show the content of archives. As far as I know, all archive apps extract the required files to a temp folder and pass the path of the temp folder to the shell. This method gives maximal speed, but it has a small problem with temp files - we cannot detect the moment when we can delete them. You can find the following notes in the TC4Shell help:
    This is not a limitation of the shell subsystem, as evidenced by software like DiskZIP, or even Windows's own default implementation for ZIP and CAB files. If you still have any doubts, just install DiskZIP and see for yourself. It is truly great that you are offering content previews, but one-file-at-a-time extraction is a massive performance hit for solid archives, on the order of thousands of times slower extraction (especially as you get closer to the end of the archive!)

    Quote Originally Posted by Aniskin View Post
    So I strongly recommend that TC4Shell users use the "Extract to" function. In that case you get the maximum available speed.
    That breaks the illusion of seamless integration, and it is *not* something I would recommend for the best user experience. Technical challenges aside, of course.

    Quote Originally Posted by Aniskin View Post
    That is correct only for drag&drop and copy&paste operations. In the case of "Extract to", extraction always occurs in a separate thread (except when the CMIC_MASK_NOASYNC flag is passed).
    This is another area where you have much room for improvement. Look at DiskZIP (which also substantially outperforms Windows's default implementation for ZIP and CAB files, by the way), which performs all operations without blocking the main Windows File Explorer window. This is very similar to how Windows File Explorer doesn't block the window when you're copying files. So, to maintain the illusion of seamless integration, you certainly want to emulate how Windows File Explorer itself handles ordinary file copy ops - which means non-blocking.

    Quote Originally Posted by Aniskin View Post
    Cannot confirm. There is an Edit command in the context menu of files inside archives. You can even edit files in archives that are inside other archives.
    Ah, I did not notice that context menu. The editing I am referring to is, once again, the seamless type: you double-click a document, DiskZIP extracts it and launches it with its native application; when that application is closed, DiskZIP checks for changes, and prompts the user when it detects any - then offering the option to subsume the changes back into the original archive. This is very similar to how Windows File Explorer works for regular folders, so again, if the goal is to ensure seamlessness, it's something that's very nice to have for end-users.
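    The edit-detection workflow described here can be sketched roughly (illustrative only, not DiskZIP's implementation; a ZIP archive and a hash comparison stand in for the real archive format and change detection):

```python
import hashlib
import os
import shutil
import tempfile
import zipfile

# Sketch of the seamless-edit workflow: extract the member to a temp file,
# hand it to its editor, then compare a content hash on exit to decide
# whether to offer writing the change back into the archive.

def edit_in_archive(archive: str, member: str, editor) -> bool:
    with zipfile.ZipFile(archive) as zf:
        data = zf.read(member)
    before = hashlib.sha256(data).hexdigest()

    fd, tmp = tempfile.mkstemp(suffix="-" + os.path.basename(member))
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        editor(tmp)                    # real tool: launch app, wait for exit
        with open(tmp, "rb") as f:
            edited = f.read()
    finally:
        os.unlink(tmp)

    if hashlib.sha256(edited).hexdigest() == before:
        return False                   # unchanged, nothing to do
    # Changed: a real tool would prompt the user, then rewrite the archive.
    # zipfile cannot replace a member in place, so this sketch appends a
    # new entry under a hypothetical ".updated" name instead.
    with zipfile.ZipFile(archive, "a") as zf:
        zf.writestr(member + ".updated", edited)
    return True

if __name__ == "__main__":
    workdir = tempfile.mkdtemp()
    arc = os.path.join(workdir, "t.zip")
    with zipfile.ZipFile(arc, "w") as zf:
        zf.writestr("doc.txt", b"hello")
    changed = edit_in_archive(arc, "doc.txt",
                              lambda p: open(p, "ab").write(b" world"))
    print("changed:", changed)         # -> changed: True
    shutil.rmtree(workdir)
```

    The awkward part in practice is the one DiskZIP handles with the prompt: knowing when the editing application has really finished with the file.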

    I do hope this feedback is helpful. I am happy to take any detailed technical discussion offline (or start a new thread for it, lest we hijack this Razor thread) and to help you find ways to improve your product. Of course, I'd also be glad to raise a few questions of my own for you - I particularly like your ribbon toolbar integration and your file preview features, which are missing from DiskZIP's own shell namespace extension.

  29. #82
    Member
    Join Date
    Mar 2016
    Location
    Croatia
    Posts
    189
    Thanks
    81
    Thanked 13 Times in 12 Posts
    Is it possible to use Razor inside 7-Zip-zstd (https://github.com/mcmilk/7-Zip-zstd)?

  30. #83
    Member
    Join Date
    Sep 2011
    Location
    uk
    Posts
    238
    Thanks
    188
    Thanked 17 Times in 12 Posts
    As I've said before, ccm/ccmx are the best compressors for my data (very small files for you guys!) of any on this forum, all of which I've tried. ccmx level 3 is just about optimal for memory and compression. My data is very repetitive XML ASCII output from an environmental power monitor, but even so ccmx is about a factor of 1.5-2(!) better in compression, so thanks Christian!

    ccmx 3 tmp.dat tmp.ccm
    CCMx 1.30c (Apr 24 2008) - Copyright (c) 2007-2008 by Christian Martelock
    Allocated 146 MiB of memory.
    3629.98 KiB -> 66.39 KiB (ratio 1.83%, speed 1033 KiB/s)
    This runs on an old X40 laptop (Win32, XP, 1.4 GHz, only 1.5 GB RAM) and takes about 3 s.

    I don't have 64-bit, so I can't compare this with Razor unless/until we get a Win32 version, but is Razor likely to give comparable compression once allowance is made for the amount of memory I have available?

    TIA for any thoughts...

  31. #84
    Member
    Join Date
    Sep 2011
    Location
    uk
    Posts
    238
    Thanks
    188
    Thanked 17 Times in 12 Posts
    Win 7 x64, 4 processors, 8 GB memory; using rz to compress an 8.2 GB VirtualBox VDI.

    arc -mx -mt4 gives 1.6 GB and takes 2000 s, which is the overall best.
    rz gives 1.2 GB (the best size!) but takes 15000 s with -d 128M.

    Is this difference because rz only uses 1 processor whereas arc uses 4? What else can I try to make it faster? -d 256 or above crashes (not enough memory) and, I guess, would be slower anyway.
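The core count does account for much of the wall-clock gap: rz is single-threaded, while arc -mt4 keeps four cores busy. When the input is many independent files (rather than one big VDI), one rough workaround is to run several single-threaded compressor instances concurrently, one per file. A sketch of the idea, using Python's zlib as a stand-in codec since an actual rz invocation can't be demonstrated here:

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def compress_chunk(data):
    # zlib releases the GIL during compression, so threads give real
    # parallelism here; with an external compressor like rz you would
    # launch one subprocess per input file instead.
    return zlib.compress(data, 9)

# toy stand-ins for independent input files
chunks = [bytes([i % 251]) * 100_000 for i in range(8)]

# serial baseline
serial = [compress_chunk(c) for c in chunks]

# four workers, mirroring the four cores that `arc -mt4` uses
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(compress_chunk, chunks))
```

This trades some compression ratio for speed (each instance sees only its own file, so cross-file matches are lost), which is part of why rz on the whole VDI finds 0.4 GB more redundancy than arc.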

    TIA for help.

  32. #85
    Member
    Join Date
    Jan 2017
    Location
    Germany
    Posts
    63
    Thanks
    31
    Thanked 14 Times in 11 Posts
    I see enormous potential in this compressor. A method that combines strong compression with fast(!) decompression fits the compress-once, decompress-many-times use case perfectly.
    The drawback of most strong compression methods is slow decompression caused by their computational complexity; this method does not have that problem.

  33. Thanks (4):

    Bekk (1st April 2018),Christian (16th October 2017),diskzip (16th October 2017),oltjon (16th October 2017)

  34. #86
    Programmer
    Join Date
    Feb 2007
    Location
    Germany
    Posts
    420
    Thanks
    28
    Thanked 160 Times in 18 Posts
    Hi everyone,

    Just letting you know: I'm working on a redesign of the internal APIs. I'm still in the conceptual phase. When done, there will be some nice features:

    - On compression/decompression, all components of the codec tree will run in parallel.
    - The codec tree will be fully configurable. I don't know if I'll expose this, though.
    - The archive format will support versioning and encryption, and will have a better layout.
    - Implementing a codec will be extremely simple. I'll be able to simplify existing codecs, too.

    Don't expect a release anytime soon.

  35. Thanks (13):

    78372 (16th October 2017),Bekk (1st April 2018),dado023 (17th October 2017),diskzip (16th October 2017),encode (20th October 2017),hunman (16th October 2017),JamesB (16th October 2017),Mike (17th October 2017),Nania Francesco (16th October 2017),oltjon (16th October 2017),PrinceGupta (16th October 2017),Stephan Busch (16th October 2017),WinnieW (16th October 2017)

  36. #87
    Tester
    Nania Francesco's Avatar
    Join Date
    May 2008
    Location
    Italy
    Posts
    1,565
    Thanks
    222
    Thanked 146 Times in 83 Posts
    Nice Christian !
    Thanks !

  37. #88
    Member
    Join Date
    Oct 2017
    Location
    argentina
    Posts
    1
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Hello, a very good compressor from what I've been reading.
    Does anyone know how I could use its decompression for installation in Inno Setup?

    Regards

  38. #89
    Member
    Join Date
    Feb 2018
    Location
    Moscow
    Posts
    3
    Thanks
    0
    Thanked 7 Times in 2 Posts
    Is there an option/command to skip certain files during compression? For example, I would like to skip all *.rz archives in the folder being compressed, because I batch-compress the same folder with different window parameters.

    Is it possible to extract to a folder with the same name as the archive?

    As for compression, does the window size have to be a power of 2 (64, 128, 256, 512, 1023(!)), or does it not matter, so any size can be used?
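I don't know of a built-in rz filter for this, but the first two requests above are easy to script around: skip *.rz files when collecting inputs, and derive the extraction folder from the archive's stem. A sketch with pathlib (the actual rz invocations are left as a comment, since their exact syntax is not shown here):

```python
from pathlib import Path

def files_to_compress(folder):
    """Everything in `folder` except already-made .rz archives."""
    return sorted(p for p in Path(folder).rglob("*")
                  if p.is_file() and p.suffix.lower() != ".rz")

def extraction_dir(archive):
    """'backup.rz' -> sibling folder 'backup', like 'Extract to <name>/'."""
    return Path(archive).with_suffix("")  # drops the .rz suffix

# demo on a throwaway folder
d = Path("rz_demo"); d.mkdir(exist_ok=True)
(d / "a.txt").write_text("x"); (d / "old.rz").write_text("y")
inputs = files_to_compress(d)          # old.rz is excluded
# for each window w in (64, 128, 256): run rz with -d {w}M over `inputs`
target = extraction_dir("backup.rz")   # -> Path("backup")
for p in d.iterdir(): p.unlink()
d.rmdir()
```

Wrapping the batch runs this way keeps earlier outputs from being swallowed into later archives, which is exactly the problem described above.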


  39. #90
    Member Crispin's Avatar
    Join Date
    Feb 2018
    Location
    Serenia
    Posts
    7
    Thanks
    10
    Thanked 5 Times in 2 Posts
    Quote Originally Posted by Christian View Post
    -Are there other good compression APIs?
    PeaZip.
    http://www.peazip.org/

    Free, open source (LGPL v3), multiplatform ("PeaZip is meant to be as desktop neutral as possible"), multilanguage (localized in 29 languages).

    For example, I easily added PAQ8pxd v13, my own Linux build (it was only for Windows before).

    Giorgio Tani (the author) is a nice person and will be glad to help you, I'm sure.

    As a quick (universal) solution (from Giorgio):
    It is possible to use a custom compressor/decompressor in PeaZip: selecting the "Custom" format during compression lets you customize the syntax in the Advanced tab; likewise, for extraction you can enable "Extract unsupported file types..." in the Advanced tab to use a custom decompressor with user-defined syntax. In either case, the commands can be fine-tuned in the Console tab.
    Christian, thank you for Razor; it is great, the next big thing IMHO. All the best.

  40. Thanks (3):

    Bekk (1st April 2018),Christian (11th March 2018),diskzip (7th February 2018)
