Most disk compression you can do now is dedup. People leave multiple copies of files all over the place. So for good backup performance you want dedup and quick recognition of already compressed files so you can just copy them.
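Just to make that concrete, here is a rough, hypothetical sketch (in C, not taken from any real backup tool) of the dedup half of the idea: hash each file's contents and flag colliding hashes as duplicate candidates. A real dedup pass would verify matches byte for byte and usually work on blocks rather than whole files.

/* dedup_scan.c (hypothetical example): flag duplicate-file candidates
   by hashing whole files with 64-bit FNV-1a. */
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

static uint64_t fnv1a_file(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f) return 0;                          /* treat unreadable files as "no hash" */
    uint64_t h = 14695981039346656037ULL;      /* FNV-1a offset basis */
    unsigned char buf[65536];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0) {
        for (size_t i = 0; i < n; i++) {
            h ^= buf[i];
            h *= 1099511628211ULL;             /* FNV-1a prime */
        }
    }
    fclose(f);
    return h;
}

int main(int argc, char **argv)
{
    uint64_t *hashes = calloc((size_t)argc, sizeof *hashes);
    if (!hashes) return 1;
    for (int i = 1; i < argc; i++)
        hashes[i] = fnv1a_file(argv[i]);
    /* quadratic scan is fine for a sketch; a real tool would use a hash table */
    for (int i = 1; i < argc; i++)
        for (int j = i + 1; j < argc; j++)
            if (hashes[i] && hashes[i] == hashes[j])
                printf("duplicate candidates: %s and %s\n", argv[i], argv[j]);
    free(hashes);
    return 0;
}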
Yep. For a complete HD backup I'd use zpaq, but if I only want to send my friend last weekend's photos... I do want a jpg archiver, for God's sake!!!
Haha!!! Don't bother, I just want to point out there are very different scenarios where different compression approaches might be wanted.
This is a proven fact and works fine for WinZip, for instance.
Now, here we could really make something else. Since we have a good, solid open-source archiver, and people with great ideas have already developed real code, it is very possible to build from the current FA an archiver that ranges from really heavy packing, down to the last free byte when you want it that way, to very fast dedup behaviour when you just want a periodical backup. NanoZip does it like that: from a method faster than my drive to a CM mode that compares to lpaq8.
I mean, there are a lot of very, very good open-source tools scattered around this forum and places like GitHub which would be more useful if they were brought together in a single software distribution.
Nanozip is fast with good compression, but you can't do incremental backups with it. You can only create a new archive and not update an existing one. It would be a good choice for posting downloads, except that everyone uses zip because that's what everyone has. Nanozip is closed source and abandoned, unfortunately.
Those are all good things about FreeArc. Deflate also has an API (zlib) that is permissively licensed and easy enough to use that it is everywhere: .pdf, .docx, .jar, etc.
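To show how little code that API needs, here is a small illustrative sketch; compress2() and uncompress() are the real zlib calls, everything else (file name, buffer sizes, the test string) is made up for the example. Build with something like: cc zlib_demo.c -lz

/* zlib_demo.c (illustrative only): one call to deflate a buffer, one to inflate it back. */
#include <stdio.h>
#include <string.h>
#include <zlib.h>

int main(void)
{
    const char *text = "deflate is everywhere: zip, gzip, png, pdf, docx, jar...";
    uLong src_len = (uLong)strlen(text) + 1;

    Bytef packed[256];                 /* 256 is plenty here; compressBound(src_len) gives the worst case */
    uLongf packed_len = sizeof packed;
    if (compress2(packed, &packed_len, (const Bytef *)text, src_len, Z_BEST_COMPRESSION) != Z_OK) {
        fprintf(stderr, "compress2 failed\n");
        return 1;
    }
    printf("%lu bytes -> %lu bytes\n", (unsigned long)src_len, (unsigned long)packed_len);

    Bytef restored[256];
    uLongf restored_len = sizeof restored;
    if (uncompress(restored, &restored_len, packed, packed_len) != Z_OK) {
        fprintf(stderr, "uncompress failed\n");
        return 1;
    }
    printf("round trip: %s\n", restored);
    return 0;
}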
Oh, I see. Sorry if I wasn't clear enough... let's blame the language barrier.
I was saying that most of the effort usually goes into making better algorithms rather than end-user friendly products. Which, I must say, is not bad at all. It's just that I feel a GUI archiver open to the public, as FA is, could make very good use of these command-line utilities to improve its behaviour.
Bulat, I hope you have plans to speed up development, because 7z development is idle and FreeArc could rise now and maybe even kill RAR if it's successful.
Bulat, is there a way to use the -a- switch of srep in FreeArc? My laptop at work has "only" 10 GB RAM, so if I'm working on Windows, using a Linux VM, and trying to (re)compress old, huge backup files (>70 GB or so) in the background, the process fails due to insufficient memory.
-a0 may be used like any other switch; what's the problem?
I will try that tomorrow. So a "-" after a switch is always the same as level 0?
Hello,
Can someone explain to me the use of the -vmfile and -temp switches? When should they be used during decompression, and what are their advantages? Unfortunately, I cannot work this out.
Thank you in advance.
Hi, Bulat! What is the license of the libunarc source code?
You can use it freely for extraction of FreeArc archives.
"Free" here means "without payment".
Can the source be modified and embedded into proprietary software/libraries?
AFAIK BandiZip's ARK library can unpack FreeArc archives, but I don't know how they implemented it.
Yes, as long as they are used to extract FreeArc archives.
I'm using FreeArc 0.67 Alpha on Windows Embedded 8.1 Industry Enterprise 64-bit, on a Lenovo Y50-70; the rig is a 4th-gen i7 @ 3.5 GHz, 8 GB RAM and a 1 TB HDD @ 5400 RPM.
I'm always getting this error: "ERROR: can't allocate memory required for (de)compression in lzma:176mb:normal:bt4:128, use -lc/-ld to limit memory usage".
The text after lzma:... varies with the size of the file; for this test I chose a file of 36 MB, and the syntax I use for compression via the FreeArc GUI is "Maximum: -mx -ld1600m -mc:lzma2/lzma2:max:512mb -mc:rep/maxsrep -mc$default,$obj:+maxprecompj"
because that offers me the best compression ratio. Also, I haven't changed any srep or lzma files, because I don't know where to download them or how to do that (I'm kind of a newbie with this software).
But I face this issue on a laptop with 8 GB RAM!!! Is this software serious? I normally have 6.5 GB free while using FreeArc, and just for this software's sake I even tried it in safe mode, when my OS was using only ~700 MB and the remaining ~7.3 GB was free, and it still says it cannot allocate enough memory to compress. Is there any fix?
Also, I wouldn't mind updating/upgrading any of my binaries/dictionaries/srep stuff, I just need step-by-step instructions, because I'm a newbie at this.
One more question, if you guys don't mind:
How should I update srep in FreeArc?
do you really need that?
jpg/mp3/pnm compressors are open-sourced, so anyone can do it. Me too, it's just a question of priorities. Today I will publish the open-sourced fa'next; maybe it will help people experiment with implementing new FA methods.
With LIB there are 3 problems: no Linux support (so we can't enable this compression by default), copyrights (we should ask CS whether he allows us to link precomp to freearc/sfx), and the lack of decompression-only code, which obviously increases the SFX size.
It's a weak point of the current auto-detection algorithm: once it finds that files are incompressible, it tries to put them into the $precomp or $compressed group and nothing else. I've found how the algorithm may be improved, but I will need to find time to implement that.
I consider this example correct behaviour: FreeArc detected that the files don't contain text. Unlike that, the $mp3 group really does include incompressible files; they just have a better compression algorithm than the one used for the $compressed group.
sample# = 9rep+#xb /$text=dict+ppmd
For text files, LZMA is used to compress the .txt files, which means that the FreeArc filetype auto-detection takes precedence over the /$text filter. I had no problem with v0.666!
When I renamed the filter to /$texte in both arc.ini and arc.groups, the .txt files were compressed with PPMD.
what's your idea of auto-detection?
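For anyone following along, here is a toy illustration (in C) of the kind of compressibility check being discussed. It is not FreeArc's actual detector: it only estimates the byte entropy of a file's first block, and the 7.9 bits-per-byte threshold is an arbitrary choice for the sketch. Build with something like: cc detect_compressed.c -lm

/* detect_compressed.c (toy illustration): guess from a 64 KB sample whether
   data is already compressed, by estimating its byte entropy. */
#include <stdio.h>
#include <math.h>

static int looks_compressed(const unsigned char *buf, size_t n)
{
    size_t count[256] = {0};
    for (size_t i = 0; i < n; i++)
        count[buf[i]]++;

    double bits = 0.0;                 /* Shannon entropy in bits per byte, max 8 */
    for (int b = 0; b < 256; b++)
        if (count[b]) {
            double p = (double)count[b] / (double)n;
            bits -= p * log2(p);
        }
    return bits > 7.9;                 /* arbitrary threshold for the sketch */
}

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s file\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror(argv[1]); return 1; }

    unsigned char sample[65536];
    size_t n = fread(sample, 1, sizeof sample, f);
    fclose(f);
    if (n == 0) { puts("empty file"); return 0; }

    puts(looks_compressed(sample, n)
         ? "looks already compressed: just store/copy it"
         : "looks compressible: worth handing to a real compressor");
    return 0;
}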
Yes, actually when I compress .mp4, .mkv, .avi and other video files, they get "compressed" from 3.99 GB to 4.03 GB in about 30-40 minutes.
I use this command line for files larger than 1 GB: "Maximum: -mx -ld1600m -mc:lzma2/lzma2:max:512mb -mc:exe/exe2 -mc:rep/maxsrep", because using precomp intense+jpeg just adds another 10-15 minutes for an 800-900 MB file.
For now I only use the GUI, because I don't know how to use the CLI version; if you have written an article on how to do so, I can follow it.
I've also done a lot of research, and after that my choices narrowed: no doubt FreeArc is the only file compression utility which offers this level of compression ratio (on other files, compared with any other compressor), so I thought I'd try it. I tried adding the lines to arc.ini and the new srep binaries to my FreeArc 0.67 installation folder (Win 8.1 x64), but that just corrupted my installation, so I had to reinstall it.
If there's any other way to do so, I can follow it. My rig is fit enough to compress a huge amount of data (4th-gen i7 @ 3.5 GHz, 8 GB RAM and a 1 TB HDD @ 5400 RPM), so it can handle it. Do I have any other choice?
I asked why you need to update the srep version in your FreeArc installation. It doesn't make a difference, so you will just spend time uselessly.
In case you don't know, FreeArc already includes srep 3.20 or so.
I need to update my srep because I use the srep algorithm to compress the data, but when I try to compress videos (files like .mp4, .mkv) they don't get compressed; instead I get negative compression. For example, if I compress a 1 GB file, it comes out as a 1.1 GB .arc file.
I won't mind the time spent, but I do want to compress a lot of video files, and using this version of srep is not helping me at all. I would be in your debt if you helped me update srep, or suggested any other algorithm that gives positive compression for video files.
But... why do you want to spend hundreds of hours to gain 0.something? Modern video codecs already compress very well.
If you want a very big shrink, at the cost of quality loss, use x265 to repack: that would be a more useful way to spend CPU time, IMHO.