Hi,
I had an idea for creating compressed virtual partitions easily, so a friend and I started programming. We are writing a Linux kernel module first (because it's easy using loop devices), and once we have a working alpha we will start on a Windows version too. A rough sketch of the per-block layout we have in mind follows the criteria list below. Now to the reason I'm writing this: we need source code for some compression algorithms. If they are fast enough, the compressed partition can actually be faster than a normal HDD, since less data has to be read from disk. The criteria are:
- (extremely) fast decompression
- a range of compression speeds (fast modes with weak compression up through slow modes with strong compression)
- good performance on small files/small solid blocks (about 1-32 MB)
- it doesn't matter which kinds of files they handle well, as long as they aren't much slower on other files (weaker compression there is no problem)
- the ability to update archives would be handy, but it isn't important (again because of the small files/small solid blocks)
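
To make the idea more concrete, here is a minimal userspace sketch of the kind of per-block layout we are experimenting with. Everything in it is an assumption for illustration: the format (an offset table followed by individually compressed 1 MB blocks) and zlib as a placeholder codec; the module will of course use whatever codec wins the tests. The point is that reading one logical block decompresses exactly one small solid block, which is why the 1-32 MB criterion above matters:

#define _FILE_OFFSET_BITS 64
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>
#include <zlib.h>

#define BLOCK_SIZE (1 << 20)   /* 1 MB logical blocks */

/* Hypothetical image layout: offset[i]..offset[i+1] delimits the
 * compressed bytes of logical block i, so the table holds
 * nblocks + 1 entries. The real on-disk header is still undecided. */
struct image {
    FILE     *f;
    uint64_t  nblocks;
    uint64_t *offset;
};

/* Decompress logical block idx into buf (BLOCK_SIZE bytes). */
static int read_block(struct image *img, uint64_t idx, unsigned char *buf)
{
    uint64_t clen = img->offset[idx + 1] - img->offset[idx];
    unsigned char *cbuf = malloc(clen);
    uLongf dlen = BLOCK_SIZE;
    int ok;

    if (!cbuf)
        return -1;
    ok = fseeko(img->f, (off_t)img->offset[idx], SEEK_SET) == 0 &&
         fread(cbuf, 1, clen, img->f) == clen &&
         uncompress(buf, &dlen, cbuf, clen) == Z_OK &&  /* zlib = stand-in */
         dlen == BLOCK_SIZE;
    free(cbuf);
    return ok ? 0 : -1;
}

Per-block compression keeps random reads cheap and would later make in-place updates of single blocks possible, which is why updating whole archives isn't important to us.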
I already performed some tests with large mixed files, and tor, lzx, gzip, lzma, and maybe FreeArc (a bit too slow when decompressing) are approximately what I'm searching for. Thor would maybe be great as well, but I would need a Linux version, or better yet the sources... I hope Oscar plans to release them. Do you have any suggestions for other (de)compressors I could try out?
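
In case someone wants to reproduce the kind of test I mean, here is a rough timing harness (a sketch, not my real test code): zlib only stands in for whatever codec is being measured, clock() is crude, and a serious run should repeat each (de)compression many times on real disk contents. Levels 1..9 map onto the fast/weak versus slow/strong axis from the criteria above:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <zlib.h>

#define BLOCK (1 << 20)   /* one 1 MB solid block, the small end of our range */

int main(void)
{
    unsigned char *in   = malloc(BLOCK);
    uLong          cap  = compressBound(BLOCK);
    unsigned char *out  = malloc(cap);
    unsigned char *back = malloc(BLOCK);
    int level;

    if (!in || !out || !back)
        return 1;

    /* Semi-compressible dummy data (runs of 64 equal bytes);
     * use real disk contents for meaningful results. */
    for (int i = 0; i < BLOCK; i++)
        in[i] = (unsigned char)(i >> 6);

    for (level = 1; level <= 9; level++) {
        uLongf clen = cap, dlen = BLOCK;
        clock_t t0 = clock();
        compress2(out, &clen, in, BLOCK, level);
        clock_t t1 = clock();
        uncompress(back, &dlen, out, clen);
        clock_t t2 = clock();
        printf("level %d: %d -> %lu bytes, comp %.1f ms, decomp %.1f ms\n",
               level, BLOCK, (unsigned long)clen,
               (t1 - t0) * 1000.0 / CLOCKS_PER_SEC,
               (t2 - t1) * 1000.0 / CLOCKS_PER_SEC);
    }
    free(in); free(out); free(back);
    return 0;
}

Compile with something like "gcc -O2 bench.c -lz". Swapping in another codec's one-shot compress/decompress calls is usually only a few lines of change.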
Another question, not related to this topic but still related to compression: is someone interested in extremely fast array sorting/searching algorithms? I've got an idea that could maybe be faster than quicksort.
Mimos