Thanks Bulat for the clarifications.
"Generally speaking, I recommend GPL if you want to earn money from it (of course using a secondary commercial license), LGPL if you don't need the money but want to keep your ideas protected (from use in closed-source software), and BSD otherwise."
Although tempting, I don't see how Etincelle could bring in any revenue.
Therefore, my main worry is about license side-effects for third-party devs.
I want GPL code to be able to use Etincelle without any risk to its own license.
"In combination with the slow SSD drive and its drivers? On-the-fly file system compression?"
Probably not the best use case for Etincelle, which takes advantage of long-range matches; compressing each small block independently, by definition, does not allow that.
"How does it compare to the alternatives for on-the-fly file system compression of independent blocks?"
I would need to create a special version of Etincelle targeting small files (in the ~64K range). Nothing impossible, if there is a need for it.
Now, for independent small-block compression, I may have something better within the next few months!
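To make the trade-off concrete, here is a minimal sketch of what independent small-block compression looks like in an on-the-fly filesystem layer. Etincelle has no public API, so LZ4's real block API stands in purely to illustrate the structure; the framing (size prefix per block) and the helper name compress_file_blockwise are my own assumptions, not anyone's shipping format. The point is that each 64 KB block is compressed with no history, so matches can never reach back into earlier blocks, which is exactly the long-range redundancy Etincelle exploits.

```c
/* Sketch: compress a file in independent 64 KB blocks, the way an
 * on-the-fly filesystem compression layer typically works.
 * Each block starts from a clean state, so no long-range matches. */
#include <stdio.h>
#include <stdlib.h>
#include "lz4.h"

#define BLOCK_SIZE (64 * 1024)

int compress_file_blockwise(FILE* in, FILE* out)
{
    char* src = malloc(BLOCK_SIZE);
    char* dst = malloc(LZ4_compressBound(BLOCK_SIZE));
    if (!src || !dst) { free(src); free(dst); return -1; }

    size_t readSize;
    while ((readSize = fread(src, 1, BLOCK_SIZE, in)) > 0) {
        /* Independent compression: no dictionary, no carried-over history. */
        int cSize = LZ4_compress_default(src, dst, (int)readSize,
                                         LZ4_compressBound(BLOCK_SIZE));
        if (cSize <= 0) { free(src); free(dst); return -1; }
        /* Naive framing: store the compressed size, then the payload. */
        fwrite(&cSize, sizeof(cSize), 1, out);
        fwrite(dst, 1, (size_t)cSize, out);
    }
    free(src); free(dst);
    return 0;
}
```

Any compressor dropped into that loop faces the same constraint: its window is effectively capped at 64 KB, which is why a long-range design gains little here.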
I would like Etincelle's source code. Where can I download it?
There is none. The project of open-sourcing Etincelle never reached completion.
Considering my current free time, with all of it currently gobbled up by the LZ4 framing layer, and the next stages planned to concentrate on the next version of Zhuff, there is very little chance this item will get done anytime soon. Developing in free time only makes for a quite limited workforce.
But opening the source doesn't require a lot of work either, unless you have something to hide (patented algos inside?), i.e. something that would need to be replaced first.
I think LZ4 is much slower than Etincelle, and Etincelle RC2 can already be used in real work, so I would really like the current Etincelle RC2 code. I could convert it to stream mode and use it in my work. I'm Chinese and my English is poor, sorry. Maybe you could provide a DLL for Etincelle RC2? Thank you very much!
Opening source is easy. The expensive part is all the time spent answering questions from people trying to understand your code. Probably nothing is documented, and it has to be for open source to be useful. Who wants to spend time doing that for abandoned code?
So was the "incompressible segment detection" algorithm described? Is it a precise algorithm that says something like "cut off right here", as opposed to simply "tread at a faster pace"?
It was a simpler "tread at a faster pace" strategy.
Worked great for speed, and also produced a nice little win for compression ratio.
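For readers curious what "tread at a faster pace" usually looks like in an LZ-style match finder, here is a minimal, hypothetical sketch (not Etincelle's actual code): the forward step grows as consecutive match attempts fail, so incompressible regions are traversed quickly, and the step resets as soon as a match is found. The names find_match, scan, and the SKIP_STRENGTH tuning constant are illustrative assumptions.

```c
/* Hypothetical sketch of the skip-acceleration heuristic used by many
 * LZ-style compressors: the search step grows while matches keep
 * failing, so incompressible data is crossed quickly. */
#include <stddef.h>
#include <string.h>

#define SKIP_STRENGTH 6   /* step grows every 2^SKIP_STRENGTH failures (assumed tuning) */

/* Toy stand-in for the real hash-table lookup: reports a 4-byte "match"
 * when the bytes at `pos` repeat the 4 bytes just before them.
 * A real match finder is far more involved; this only drives the demo. */
static size_t find_match(const unsigned char* data, size_t pos, size_t size)
{
    if (pos >= 4 && pos + 4 <= size &&
        memcmp(data + pos, data + pos - 4, 4) == 0)
        return 4;
    return 0;
}

void scan(const unsigned char* data, size_t size)
{
    size_t pos = 0;
    unsigned failures = 0;

    while (pos < size) {
        size_t matchLength = find_match(data, pos, size);
        if (matchLength > 0) {
            /* Match found: emit it (omitted) and resume normal pace. */
            failures = 0;
            pos += matchLength;
        } else {
            /* No match: step forward, and step further the longer we fail. */
            failures++;
            pos += 1 + (failures >> SKIP_STRENGTH);
        }
    }
}
```

Because the step only accelerates, there is no hard "cut off right here" decision; the match finder simply spends fewer probes per byte inside data that refuses to compress.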
After an already-compressed/incompressible block is skipped by the match finder, the matches that resume afterwards come at a greater price: distances back into the preceding block are now much longer. Thus an advanced algorithm targeting high compression ratios should try to "excise", rather than just "skip", the incompressible blocks. Was something like this ever attempted?
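One way to picture the "excise" idea in code, purely as a hypothetical illustration and not something Etincelle (or any codec I know of) actually ships: offsets are computed in a logical stream from which excised bytes have been removed, so a match reaching back across a stored-raw block costs a short distance instead of a long one. The ExcisedRange structure and the function names below are invented for this sketch, and it assumes matches never point into excised data.

```c
/* Hypothetical illustration of "excising" rather than "skipping" an
 * incompressible block: match offsets are measured in a logical stream
 * from which excised bytes are removed. */
#include <stddef.h>

#define MAX_EXCISED 64

typedef struct {
    size_t start;   /* physical start of an excised (stored-raw) block */
    size_t length;  /* number of excised bytes */
} ExcisedRange;

static ExcisedRange excised[MAX_EXCISED];
static int excisedCount = 0;

/* Physical position -> logical position (excised bytes do not count). */
static size_t logical_pos(size_t physical)
{
    size_t removed = 0;
    for (int i = 0; i < excisedCount; i++) {
        if (excised[i].start + excised[i].length <= physical)
            removed += excised[i].length;
    }
    return physical - removed;
}

/* Distance actually encoded for a match from `curPos` back to `matchPos`:
 * with plain "skip", this would be curPos - matchPos (long across the gap);
 * with "excise", the gap's bytes disappear from the offset space. */
size_t match_distance(size_t curPos, size_t matchPos)
{
    return logical_pos(curPos) - logical_pos(matchPos);
}
```

The obvious cost is that the decoder needs the same excision list to resolve offsets, which is presumably part of why most format designs settle for plain skipping.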