
Thread: Packers in active development


  1. #1
    Member
    Join Date
    Oct 2016
    Location
    Slovakia
    Posts
    32
    Thanks
    41
    Thanked 3 Times in 3 Posts

    Packers in active development

    Since Squeeze Chart is not being updated anymore, I wonder which packers are actively developed, i.e. have had at least one new version in the last six months?

    • PAQ and its gazillion forks are actively developed but mostly experimental
    • 7-Zip, RAR go on as usual
    • The development of EMMA, MCM, NanoZip, PackJPG, PackRAW, Razor, UDA, ZCM and ZPAQ seems to be on hold or has been abandoned altogether

    Are any other packers actively developed, please?

    Thank you.

  2. #2
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    573
    Thanks
    245
    Thanked 98 Times in 77 Posts
    Production-ready archivers? None come to mind.
    You can look at https://github.com/mcmilk/7-Zip-zstd for an actively developed 7z fork with extra codecs, and at https://www.tc4shell.com/ for its take on explorer.exe as an archiver. The author is making lots of plugins that improve over baseline 7z; you can find them here: https://www.tc4shell.com/en/7zip/

    There is CSArc too (custom format, open source, no GUI), but the author doesn't seem to follow a periodic development cycle, only pushing a major version from time to time. The last one is from 2016 and has unresolved issues.
    Màrcio just shared his code for fairytale, but it's unclear whether it will eventually become a full-fledged archiver or just a playground for new codecs.

    Some libraries are active too. Lepton / brunsli are the next packJPG, zstd and brotli are highly asymmetric LZ compressors, there's Illya's bcm for block sorting and jpeg-xl for images, plus lots of competing codecs for video and the plethora of extremely fine-tuned compressors from Oodle (proprietary, purchase required AFAIK).

    Personally? I just write my little wrapper scripts. wimlib provides me with full archiving capabilities, including a streamable format, unix metadata and file deduplication. Bulat's fazip has some very nice filters/preprocessors, including a crazy fast rep finder; srep takes care of long-range deduplication; and fxz utils does the actual compression, or zstd if I know the data is not really compressible. All in memory with no temp files (except for srep, but that's OK). Way faster and almost always a better ratio than any commercial or integrated solution. It's not pretty, but it works.
    And if I really need to save space, or just because I can, there's precomp too, which can take ratios to a whole other level.
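
    As a rough illustration of the "pipe everything, no temp files" idea, here is a minimal sketch in Python. It substitutes tar plus xz/zstd (whose switches are well known) for wimlib/fazip/srep/fxz, so the tools and flags are stand-ins rather than the actual script:
    Code:
    # Minimal sketch of a no-temp-files pipeline, assuming tar, xz and zstd are on PATH.
    # wimlib/fazip/srep/fxz would slot into the same Popen chain; their exact flags
    # are not assumed here.
    import subprocess
    import sys

    def pack(src_dir, out_path, compressible=True):
        """Stream src_dir through an archiver and a compressor, no temp files."""
        # Stage 1: archive to stdout (stand-in for the wimlib capture step).
        tar = subprocess.Popen(["tar", "-cf", "-", src_dir], stdout=subprocess.PIPE)
        # Stage 2: compress stdin to a file (stand-in for fxz, or zstd for data
        # that is not really compressible).
        comp_cmd = ["xz", "-9", "-e", "-T0", "-c"] if compressible else ["zstd", "-3", "-T0", "-c"]
        with open(out_path, "wb") as out:
            comp = subprocess.Popen(comp_cmd, stdin=tar.stdout, stdout=out)
            tar.stdout.close()  # so tar gets SIGPIPE if the compressor dies
            if comp.wait() != 0 or tar.wait() != 0:
                sys.exit("pipeline failed")

    if __name__ == "__main__":
        pack("my_data", "my_data.tar.xz")
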
    Attached thumbnail: imagen.png
    Last edited by Gonzalo; 17th February 2021 at 01:33. Reason: fairytale

  3. Thanks (2):

    Hacker (17th February 2021),xinix (17th February 2021)

  4. #3
    Member
    Join Date
    Dec 2013
    Location
    Italy
    Posts
    517
    Thanks
    25
    Thanked 45 Times in 37 Posts
    Quote Originally Posted by Gonzalo View Post
    Production-ready archivers? None come to mind.
    I do.
    ZPAQ
    It just... works.
    The most important problem is to check that the archive is not corrupted in the event of an unexpected interruption (e.g. out of disk space).

    zpaqfranz completes it with some useful missing functions, but it's currently in testing.

  5. #4
    Member
    Join Date
    Oct 2016
    Location
    Slovakia
    Posts
    32
    Thanks
    41
    Thanked 3 Times in 3 Posts
    Gonzalo,
    Quote Originally Posted by Gonzalo View Post
    You can look at https://github.com/mcmilk/7-Zip-zstd for an actively developed 7z fork with extra codecs
    Yup, but none of the codecs seems to surpass standard LZMA2 in general compression ratio, unfortunately, so for my archiving purposes it's not really useful.

    Quote Originally Posted by Gonzalo View Post
    There is CSArc too (custom format, open source, no GUI) but the author doesn't seem to be following a periodic development cycle, only pushing some major version from time to time. Last one is from 2016 and has unresolved issues.
    Indeed, four years old. Seems dead.

    Quote Originally Posted by Gonzalo View Post
    There are some libraries active too. Lepton / brunsli are the next packJPG. zstd and brotli are highly asymmetric lz compressors, there's Illya's bcm for block sorting, xpeg-xl for images
    Libraries are great but unless their compression is made available in some packer they cannot be easily used.

    Quote Originally Posted by Gonzalo View Post
    Personally? I just write my little wrapper scripts. wimlib provides me with full archiving capabilities, including streamable format, unix metadata and file deduplication. Bulat's fazip's got some very nice filters/preprocessors including a crazy fast rep finder, srep takes care of long-range deduplication and fxz utils does the actual compression, or zstd if I know the data is not really compressible. All in memory with no temp files (except srep but that's OK). Way faster and almost always better ratio than any commercial or integrated solution.
    Is it better than, say, 7-Zip or RAR at maximum settings? What does the ratio look like without deduplication?

  6. #5
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    573
    Thanks
    245
    Thanked 98 Times in 77 Posts
    Quote Originally Posted by Hacker View Post
    Gonzalo,

    Yup, but none of the codecs seems to surpass standard LZMA2 in general compression ratio, unfortunately, so for my archiving purposes it's not really useful.


    Indeed, four years old. Seems dead.


    Libraries are great but unless their compression is made available in some packer they cannot be easily used.


    Is it better than, say, 7-Zip or RAR at maximum settings? What does the ratio look like without deduplication?

    1) The only one that would be competitive is fast-lzma2. In theory it has a slightly worse ratio, but in practice, due to multi-threading and RAM constraints, it usually ends up stronger than lzma2 (keep in mind that flzma2 is actually lzma2 with a different match finder).

    2) Yup. But it works nonetheless. And it is especially good for already compressed data. It can compress it way faster than 7z, and for what it's worth, a little bit stronger too.

    3) AFAIK all these libraries have at least one working command line exe.

    4) Yes, it is better. There are exceptions, of course; after all, they are in the same tier. But for me the best advantage is that my script reaches the same or better level of compression while doing it 3x faster.
    Without deduplication, the ratio is comparable. Here is a test I made for precomp, comparing its lzma codec with flzma2 (a sketch of how such weighted figures can be computed follows the summary).
    Summary:

    Weighted average speedup for fxz: 195%
    Weighted average speedup for fxz -e: 168%
    Weighted average ratio gain for fxz: 2.03%
    Weighted average ratio gain for fxz -e: 2.20%
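
    For reference, weighted figures like these can be computed by weighting each file's speedup and ratio gain by its original size; the exact weighting behind the numbers above is an assumption here:
    Code:
    # A sketch of computing size-weighted averages from per-file benchmark results.
    # The weighting actually used for the figures above is an assumption.
    def weighted_speedup(results):
        """results: (orig_size, baseline_time, candidate_time) per file -> percent."""
        total = sum(size for size, _, _ in results)
        return 100.0 * sum(size * (t_base / t_cand)
                           for size, t_base, t_cand in results) / total

    def weighted_ratio_gain(results):
        """results: (orig_size, baseline_compressed, candidate_compressed) per file -> percent."""
        total = sum(size for size, _, _ in results)
        return 100.0 * sum(size * (1.0 - c_cand / c_base)
                           for size, c_base, c_cand in results) / total

    # Example: a single 100 MB file, compressed twice as fast and 2% smaller.
    print(weighted_speedup([(100e6, 20.0, 10.0)]))       # 200.0
    print(weighted_ratio_gain([(100e6, 40e6, 39.2e6)]))  # ~2.0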

    Now, you don't want to get rid of the preprocessors. They are key to improving ratio, compression speed and sometimes even decompression speed. AND they can significantly reduce RAM usage. srep in particular can make memory consumption drop, but the combination of file sorting, file deduplication, the rep filter, the srep filter and sometimes lzp can make memory requirements for lzma way lower than they would otherwise be. That's because you're actually compressing a lot less information. For large datasets you can end up compressing only a quarter of the original size (real-life example: the "dztest" folder is 24% of its original size after the dedupers; only that enters the lzma stage).
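
    To see why the dedup stages take so much pressure off lzma, here is a toy sketch with fixed-size blocks; srep actually finds long repeated ranges at arbitrary offsets, so this only illustrates the effect, not the algorithm:
    Code:
    # Toy fixed-block deduplication: the later compressor only ever sees unique blocks.
    import hashlib

    def dedup(data, block=4096):
        unique, refs, seen = [], [], {}
        for i in range(0, len(data), block):
            chunk = data[i:i + block]
            key = hashlib.sha256(chunk).digest()
            if key not in seen:
                seen[key] = len(unique)
                unique.append(chunk)
            refs.append(seen[key])
        return unique, refs

    payload = (b"A" * 4096 + b"B" * 4096) * 100   # highly repetitive input
    unique, refs = dedup(payload)
    # 819200 -> 8192: only ~1% of the original needs the lzma stage's memory and time.
    print(len(payload), "->", sum(len(c) for c in unique))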

  7. #6
    Member
    Join Date
    Oct 2016
    Location
    Slovakia
    Posts
    32
    Thanks
    41
    Thanked 3 Times in 3 Posts
    Gonzalo,
    Quote Originally Posted by Gonzalo View Post
    it is especially good for already compressed data. It can compress it way faster than 7z, and for what it's worth, a little bit stronger too.
    Thanks, I'll give it a chance.

    Quote Originally Posted by Gonzalo View Post
    3) AFAIK all these libraries have at least one working command line exe.
    Let me rephrase it - until they are more widely supported (e.g. in image viewers or in cameras) they are nice for experimenting, but we should have had a JPG successor for at least 20 years now and we still don't. Apple did something with HEIC, but that's it, unfortunately.

    Quote Originally Posted by Gonzalo View Post
    4) Yes, it is better.
    OK, sounds good enough to give it a try as well. Any thoughts about future compatibility? Will I be able to open the archives in ten years? Are the involved programs standalone downloadable exes, or are they some sort of internal part of Windows 10?

  8. #7
    Member
    Join Date
    Oct 2016
    Location
    Slovakia
    Posts
    32
    Thanks
    41
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Hacker View Post
    Are the involved programs standalone downloadable exes or are they some sort of internal part of Windows 10?
    Ah, found the executables, I was blind, sorry.

  9. #8
    Member
    Join Date
    Aug 2014
    Location
    Argentina
    Posts
    573
    Thanks
    245
    Thanked 98 Times in 77 Posts
    Quote Originally Posted by Hacker View Post
    Gonzalo,

    Thanks, I'll give it a chance.


    Let me rephrase it - until they are more widely supported (e.g. in image viewers or in cameras) they are nice for experimenting, but we should have had a JPG successor for at least 20 years now and we still don't. Apple did something with HEIC, but that's it, unfortunately.


    OK, sounds good enough to give it a try as well. Any thoughts about future compatibility? Will I be able to open the archives in ten years? Are the involved programs standalone downloadable exes, or are they some sort of internal part of Windows 10?
    There is a great replacement for jpg right now. It's called jpeg-xl. There's a lot of chat about it on this forum.

    About future compatibility? Yes and no. wimlib, tar and fxz are open source. Srep and fazip too, but I haven't had much luck compiling them. And they're abandonware, so not great.

  10. #9
    Member
    Join Date
    Feb 2015
    Location
    United Kingdom
    Posts
    183
    Thanks
    30
    Thanked 80 Times in 47 Posts
    Tossing my hat in the ring, I’m currently developing a commercial grade archiver with custom codecs built from the ground up with diffing in mind, a bit like a version controller, but with serious performance.

  11. Thanks (2):

    Hacker (17th February 2021),SolidComp (17th February 2021)

  12. #10
    Member
    Join Date
    Dec 2013
    Location
    Italy
    Posts
    517
    Thanks
    25
    Thanked 45 Times in 37 Posts
    Quote Originally Posted by Lucas View Post
    Tossing my hat in the ring, I’m currently developing a commercial grade archiver with custom codecs built from the ground up with diffing in mind, a bit like a version controller, but with serious performance.
    Opensource?

  13. #11
    Member
    Join Date
    Feb 2015
    Location
    United Kingdom
    Posts
    183
    Thanks
    30
    Thanked 80 Times in 47 Posts
    Closed source, still in development, maybe one day it'll be open sourced.

  14. #12
    Member SolidComp's Avatar
    Join Date
    Jun 2015
    Location
    USA
    Posts
    369
    Thanks
    133
    Thanked 57 Times in 40 Posts
    Quote Originally Posted by Lucas View Post
    Tossing my hat in the ring, I’m currently developing a commercial grade archiver with custom codecs built from the ground up with diffing in mind, a bit like a version controller, but with serious performance.
    Good to hear. I hope you make serious money from it. Will it be a journaling deal like zpaq, where it only appends new or changed files?

    What's your angle on the custom codecs? Are you developing general-purpose codecs like Zstd, 7z, etc., or tailored codecs for things like images, videos, etc.? I guess it would make sense to have a basket o' codecs that would be tailored to different kinds of data, and I vaguely remember other projects doing something like that (cmix? mcm? I don't remember). Are you trying to beat the state of the art open-source codecs like Zstd and LZMA? That would be impressive. Well, I guess LZTurbo might be the benchmark for ratio + performance, but I haven't looked lately.
    Last edited by SolidComp; 18th February 2021 at 03:50.

  15. #13
    Member
    Join Date
    Feb 2015
    Location
    United Kingdom
    Posts
    183
    Thanks
    30
    Thanked 80 Times in 47 Posts
    Quote Originally Posted by SolidComp View Post
    Good to hear. I hope you make serious money from it. Will it be a journaling deal like zpaq, where it only appends new or changed files?

    What's your angle on the custom codecs? Are you developing general-purpose codecs like Zstd, 7z, etc., or tailored codecs for things like images, videos, etc.? I guess it would make sense to have a basket o' codecs that would be tailored to different kinds of data, and I vaguely remember other projects doing something like that (cmix? mcm? I don't remember). Are you trying to beat the state of the art open-source codecs like Zstd and LZMA? That would be impressive. Well, I guess LZTurbo might be the benchmark for ratio + performance, but I haven't looked lately.
    Thanks for your interest, SolidComp. I am targeting commercial and personal use, but commercial features come first. I have a business partner and another developer working to help bring my idea to life.

    As for codecs, it's all bespoke new codecs; 32-bit (x86) support has been dropped, it's all x64 for maximum performance. Having been developing compression solutions full time, I realized most off-the-shelf codecs are not designed to utilise all available parts of the ISA (e.g. AVX2, AVX512, let alone more than one thread on a single stream, etc.). My old work on parallel encoding and decoding of LZ is what will power most of the internal codecs.

    I did scratch the surface with a new filter implementation, which now contains an optimal parser for figuring out what structure the input has (no header scanning) and a smart heuristic which can teach itself mid-side coding from raw L+R audio; it's also capable of learning the YCoCg colourspace from an RGB input.
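
    For context, the standard reversible transforms such a filter would converge on look like this; the sketch below is the textbook math, not the filter's actual code:
    Code:
    # Standard reversible mid/side and YCoCg-R transforms (illustration only).

    def mid_side(l, r):
        """Lossless mid/side: what a filter would learn from correlated L/R audio."""
        return (l + r) >> 1, l - r

    def mid_side_inverse(mid, side):
        l = mid + ((side + 1) >> 1)
        return l, l - side

    def rgb_to_ycocg_r(r, g, b):
        """Reversible (lifting-based) YCoCg-R colourspace transform."""
        co = r - b
        t = b + (co >> 1)
        cg = g - t
        y = t + (cg >> 1)
        return y, co, cg

    def ycocg_r_to_rgb(y, co, cg):
        t = y - (cg >> 1)
        g = cg + t
        b = t - (co >> 1)
        return b + co, g, b

    assert mid_side_inverse(*mid_side(1234, -567)) == (1234, -567)
    assert ycocg_r_to_rgb(*rgb_to_ycocg_r(100, 150, 200)) == (100, 150, 200)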

    Parallel deduplication and archival is already implemented.

    The codec itself is going to be LZ77-based, and I've already written a proof-of-concept single-threaded codec with speeds on par with zstd but compression ratios within a few percent of lzma. I'm focusing primarily on striking a good balance between decode speed and compression ratio, with optional high-throughput encoding, because not everyone wants to wait forever to compress something.

  16. Thanks (3):

    Bulat Ziganshin (18th February 2021),fcorbelli (18th February 2021),Mike (18th February 2021)

  17. #14
    Member
    Join Date
    Dec 2013
    Location
    Italy
    Posts
    517
    Thanks
    25
    Thanked 45 Times in 37 Posts
    Quote Originally Posted by Lucas View Post
    ...codecs are not designed to utilise all available parts of the ISA (eg: AVX2, AVX512...
    It's true, indeed.
    But be careful: some extensions (like SHA, for example) aren't really ubiquitous.
    Quite recent processors (like the 6-core i7-8700K) do not fully support them.

  18. #15
    Member
    Join Date
    Dec 2013
    Location
    Italy
    Posts
    517
    Thanks
    25
    Thanked 45 Times in 37 Posts
    Quote Originally Posted by Hacker View Post
    Since Squeeze Chart is not being updated anymore, I wonder which packers are actively developed, i.e. have had at least one new version in the last six months?...
    ZPAQ seems to be on hold or has been abandoned altogether... Are any other packers actively developed, please?

    Thank you.
    zpaqfranz, about one for week/day

  19. Thanks:

    Hacker (17th February 2021)

  20. #16
    Member
    Join Date
    Oct 2016
    Location
    Slovakia
    Posts
    32
    Thanks
    41
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by fcorbelli View Post
    zpaqfranz, about one for week/day

    Yes, I noticed your versions; they are still a little bit too experimental for my personal needs.

  21. #17
    Member
    Join Date
    Dec 2013
    Location
    Italy
    Posts
    517
    Thanks
    25
    Thanked 45 Times in 37 Posts
    Quote Originally Posted by Hacker View Post
    Yes, I noticed your versions; they are still a little bit too experimental for my personal needs.
    For me too, but with almost zero feedback it is hard to find bugs.

    I am using it in parallel with the older versions (which did not contain the CRC32 storage of the files).

    So far it has allowed me to find a corrupted (multivolume) ZPAQ archive, which I would not have been able to do with the "normal" version.

    The main problem is diffusion: how many users of the "strange" packers X, Y, Z are there? 1, 5, 10 in the world?
    Not many.

  22. #18
    Member
    Join Date
    Oct 2016
    Location
    Slovakia
    Posts
    32
    Thanks
    41
    Thanked 3 Times in 3 Posts
    fcorbelli,
    Quote Originally Posted by fcorbelli View Post
    The main problem is diffusion: how many users of the "strange" packers X, Y, Z are there? 1, 5, 10 in the world?
    Not many.
    It depends. I understand most programmers create new packers for the challenge, to learn something and perhaps to improve something, not to create a large user base. I guess only 7-Zip managed that, and to a lesser extent RAR (and ACE, rest in peace).

  23. #19
    Member
    Join Date
    Dec 2013
    Location
    Italy
    Posts
    517
    Thanks
    25
    Thanked 45 Times in 37 Posts
    Quote Originally Posted by Hacker View Post
    fcorbelli,

    It depends. I understand most programmers create new packers for the challenge, to learn something and perhaps to improve something, not to create a large user base. I guess only 7-Zip managed that, and to a lesser extent RAR (and ACE, rest in peace).
    Not everybody.
    In my case, for example, it's the classic old-fashioned hacker way:

    "If you need something that doesn't exist, first try to find it already done;
    it will almost always already exist.
    Otherwise, do it yourself."

    For me they are working tools for making backups.

    If I had a large number of users I would be more confident about bugs.

  24. #20
    Member SolidComp's Avatar
    Join Date
    Jun 2015
    Location
    USA
    Posts
    369
    Thanks
    133
    Thanked 57 Times in 40 Posts
    Quote Originally Posted by fcorbelli View Post
    zpaqfranz, about one for week/day

    Interesting project. What does "about one for week/day" mean? Is that referring to frequency of releases? I don't see any releases at all on GitHub (https://github.com/fcorbelli/zpaqfranz/releases).

    (In English you would say "one per week" if you mean weekly, not "one for week". Or one per day. If you meant that it's sometimes weekly and sometimes daily, then your "week/day" term is fine. If you meant one release every weekday, meaning Monday through Friday, there's no slash in the word. Anyway, sembra un progetto interessante.)

  25. #21
    Member
    Join Date
    Dec 2013
    Location
    Italy
    Posts
    517
    Thanks
    25
    Thanked 45 Times in 37 Posts
    Quote Originally Posted by SolidComp View Post
    Interesting project. What does "about one for week/day" mean? Is that referring to frequency of releases? I don't see any releases at all on GitHub (https://github.com/fcorbelli/zpaqfranz/releases).
    Having had virtually zero feedback, I don't publish releases anymore.

    "per" is really interesting.
    It's Latin.


    If you meant that it's sometimes weekly and sometimes daily, then your "week/day" term is fine.
    So it's fine

  26. #22
    Member SolidComp's Avatar
    Join Date
    Jun 2015
    Location
    USA
    Posts
    369
    Thanks
    133
    Thanked 57 Times in 40 Posts
    Quote Originally Posted by fcorbelli View Post
    zpaqfranz, about one for week/day
    Do you think it will compile in Visual Studio 2019?

  27. #23
    Member
    Join Date
    Dec 2013
    Location
    Italy
    Posts
    517
    Thanks
    25
    Thanked 45 Times in 37 Posts
    Quote Originally Posted by SolidComp View Post
    Do you think it will compile in Visual Studio 2019?
    I think some changes are needed.
    I don't know specifically; I don't use that type of compiler.

  28. #24
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,593
    Thanks
    801
    Thanked 698 Times in 378 Posts
    SHA-NI (as well as AVX512) is available starting from Ice Lake.

  29. #25
    Member SolidComp's Avatar
    Join Date
    Jun 2015
    Location
    USA
    Posts
    369
    Thanks
    133
    Thanked 57 Times in 40 Posts
    I've wondered whether the Bit Manipulation Instructions will be useful for compression. They were introduced in Haswell, and by AMD maybe even before Intel. I never hear of anyone using them.

  30. #26
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    4,135
    Thanks
    320
    Thanked 1,397 Times in 802 Posts
    https://en.wikipedia.org/wiki/Bit_ma...nstruction_set
    They're useful, but not for new operations. Well, maybe POPCNT.
    There are more efficient forms of some common instructions (like BSR->LZCNT, SHL->SHLX, etc.), and compilers frequently use them as is.
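
    As a small illustration of where these operations show up in a compressor, here is a sketch using Python's built-in equivalents (bit_length() corresponds to BSR/LZCNT and bit_count() to POPCNT in native code; bit_count() needs Python 3.10+):
    Code:
    # Two typical uses of these operations in compression code.

    def offset_slot(offset):
        """LZ match offsets are usually binned by bit length before entropy coding."""
        return offset.bit_length()

    def rank(bitmap, pos):
        """Count set bits below `pos`, e.g. indexing into a sparse, bitmap-keyed table."""
        return (bitmap & ((1 << pos) - 1)).bit_count()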

  31. #27
    Member
    Join Date
    Aug 2015
    Location
    indonesia
    Posts
    514
    Thanks
    63
    Thanked 96 Times in 75 Posts
    Quote Originally Posted by Hacker View Post
    Since Squeeze Chart is not being updated anymore, I wonder which packers are actively developed, i.e. have had at least one new version in the last six months?

    • PAQ and its gazillion forks are actively developed but mostly experimental
    • 7-Zip, RAR go on as usual
    • The development of EMMA, MCM, NanoZip, PackJPG, PackRAW, Razor, UDA, ZCM and ZPAQ seems to be on hold or has been abandoned altogether

    Are any other packers actively developed, please?

    Thank you.
    Paq8sk is still being developed.

  32. #28
    Member
    Join Date
    Oct 2016
    Location
    Slovakia
    Posts
    32
    Thanks
    41
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by suryakandau@yahoo.co.id View Post
    Paq8sk is still being developed.
    Yes, as I mentioned in my first bullet point. Thank you too for adding to PAQ development.

  33. #29
    Member
    Join Date
    Aug 2015
    Location
    The Earth
    Posts
    13
    Thanks
    3
    Thanked 22 Times in 8 Posts

  34. #30
    Member
    Join Date
    Oct 2016
    Location
    Slovakia
    Posts
    32
    Thanks
    41
    Thanked 3 Times in 3 Posts
    data man,
    Quote Originally Posted by data man View Post
    No executable though?
