
Thread: compression speed VS decomp speed: which is more important?

  1. #1
    Lone_Wolf236 (Member) · Canada · joined Aug 2009 · 13 posts

    compression speed VS decomp speed: which is more important?

    I'm currently working on an algorithm, starting from scratch.
    The compression speed will theoretically be extremely fast, but the decompression will need much more processing.
    The good point is that the decompression speed can be improved fairly easily later on, once I get everything working correctly :S


    SO, in your opinion, for software you would use regularly, is decompression speed really an issue?
    I have no numbers to give you ATM, since the algorithm is still giving me a lot of trouble and headaches lol


    Thanks everyone!

  2. #2
    Bulat Ziganshin (Programmer) · Uzbekistan · joined Mar 2007 · 4,511 posts
    hash compression?

  3. #3
    Member · Germany · joined May 2008 · 410 posts
    @Lone_Wolf236

    1. It is wonderful to hear that you are developing
    a new compressor with extremely fast compression speed.

    2. Decompression speed matters:

    a) because at some point we need the compressed data back
    b) in some cases decompression speed is the most valuable thing:
    we compress once but decompress often,
    for example program archives.

    3. I think:
    write the code and we will test it here, and I am sure
    (there are many compressor experts in this forum) we can make the decompression faster.

    4. What about the memory requirements?
    I think this could really be a big problem
    - if decompression needs very much memory (for example more than 1500 MByte),
    then in many cases it will be impossible to decompress the archives = unusable !!


    best regards

    that's only my 2 cents

  4. #4
    Member · Nordic · joined Feb 2010 · 200 posts
    I've noticed that lots of DCT-based lossy compressors are faster to compress than to decompress; as phones get 5 MP+ cameras this becomes a big deal. Android actually caches VGA-sized versions of all photos taken because of this.

  5. #5
    Bulat Ziganshin (Programmer) · Uzbekistan · joined Mar 2007 · 4,511 posts
    People, don't be so naive! He said enough to recognize his "algorithm".

  6. #6
    giorgiotani (Programmer) · Italy · joined May 2008 · 166 posts
    That depends on the specific needs of the user: a content distributor, for example, would prefer to spend more time compressing the package once, favouring extraction speed so that thousands of end users extract the package as fast as possible each time.
    Conversely, a sysadmin would prefer faster compression to back up many GB of data quickly, and would accept a slower extraction speed since they would reasonably need to restore far less data, and (hopefully) less often.
    Between these two extremes, a good balance of extraction and compression speed is generally well accepted by a generic user.

  7. #7
    Member · Canada · joined Mar 2010 · 6 posts
    Quote Originally Posted by Bulat Ziganshin View Post
    hash compression?
    xD

    For me, decompression speed is all that matters. I most often use compression when sharing files with people, and so low memory and fast decompression are an absolute must, because it's almost guaranteed that their computer won't be as good as mine.

    Also, when I archive files, I like to be able to access them very quickly. Archiving can always be done in the background while I'm doing other work. However, when I need to get at a file, I have to wait for it to decompress before I can continue. Therefore, decompression is more important to an average user like me.

  8. #8
    Vacon (Member) · Germany · joined May 2008 · 523 posts
    Hello everyone,

    Quote Originally Posted by ForPosterity View Post
    [...] Therefore, decompression is more important to an average user like me.
    Yup, same for me.

    Best regards!

  9. #9
    Lone_Wolf236 (Member) · Canada · joined Aug 2009 · 13 posts
    Thanks guys! I think I will just write the code, post it here and optimise it as much as I can. Each algorithm has its pluses and minuses, and none can suit everybody... Anyway, it will be one more option for you, and hopefully among the best ones! :P

    Btw: I didn't give details of the algo because I still need to figure out which of the few ideas I have about how to decompress the file will be the best one, and I need to be sure that every compressed file can be decompressed properly (duh!)


    If this idea doesn't work, I will find another.

  10. #10
    m^2 (Member) · Ślůnsk, PL · joined Sep 2008 · 1,611 posts
    Quote Originally Posted by giorgiotani View Post
    Conversely, a sysadmin would prefer a faster compression to backup quickly many GB of data, and would accept a slower extraction speed since would reasonably need to restore far less data, and (hopefully) less often.
    Not really. In practically all cases, backup restoration time is critical.
    I've seen only one case where compression time was more important than decompression:
    Data compressed on tiny embedded devices and decompressed on devs' PCs.

  11. #11
    Member · Poland · joined May 2007 · 91 posts
    Quote Originally Posted by m^2 View Post
    Data compressed on tiny embedded devices and decompressed on devs' PCs.
    Yes, for example NASA would probably be interested

  12. #12
    Member · Denmark · joined Sep 2007 · 878 posts
    Quote Originally Posted by ForPosterity View Post
    xD

    For me, decompression speed is all that matters. I most often use compression when sharing files with people, and so low memory and fast decompression are an absolute must, because it's almost guaranteed that their computer won't be as good as mine.
    Oh, I so agree with this. I can spend hours brute-forcing different compression programs/settings to find the one that decompresses the fastest with close to the best compression ratio.
    Compression is something I seldom have to finish before/within a certain time, but with decompression I'm just waiting for my data and being impatient.
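
    That kind of brute-force comparison is easy to script. Here is a minimal sketch in Python, using only the standard-library codecs zlib, bz2 and lzma as stand-ins for external compressors (the file name testfile.bin is hypothetical; real archivers could be timed the same way through subprocess):

    import bz2
    import lzma
    import time
    import zlib

    def benchmark(name, compress, decompress, data):
        """Compress once, then time the decompression and verify the round trip."""
        packed = compress(data)
        start = time.perf_counter()
        restored = decompress(packed)
        elapsed = time.perf_counter() - start
        assert restored == data, "round-trip failed"
        ratio = len(packed) / len(data)
        print(f"{name:4s}  ratio={ratio:5.3f}  decompression={elapsed * 1000:8.1f} ms")

    if __name__ == "__main__":
        with open("testfile.bin", "rb") as f:  # any sample file you care about
            data = f.read()
        benchmark("zlib", lambda d: zlib.compress(d, 9), zlib.decompress, data)
        benchmark("bz2",  lambda d: bz2.compress(d, 9), bz2.decompress, data)
        benchmark("lzma", lambda d: lzma.compress(d, preset=9), lzma.decompress, data)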

  13. #13
    m^2 (Member) · Ślůnsk, PL · joined Sep 2008 · 1,611 posts
    Quote Originally Posted by jethro View Post
    Yes, for example NASA would probably be interested
    What I mentioned was a real case, and I don't see where NASA comes into it.

  14. #14
    Member · Poland · joined May 2007 · 91 posts

    Quote Originally Posted by m^2 View Post
    What I mentioned was a real case and I don't know where do you see NASA in it.
    I wasn't joking. Google 'Galileo low-gain antenna' and compression: 10 bits per second!

  15. #15
    giorgiotani (Programmer) · Italy · joined May 2008 · 166 posts
    We are usually talking about many GB of data to back up, so I implicitly assumed we are staying in the range of fast or very fast compression/decompression algorithms, if compression is involved at all.

    IMHO in a good system the backup should be the last line of defense, after clustering, RAID and good versioning systems wherever possible, against (some) human errors, so that data can be restored nearly transparently to users, a condition that even a fast restore from backup can hardly satisfy: in that sense a good data availability design should be the most important factor in minimising lost time in the most frequent cases.

    When restoring from a backup is the only chance to get the data back (bad luck, bad users, sabotage: the first rule of backups is that there is never a good reason to skip the backup, which is why I said "when" and not "if"...), I agree the extraction speed should be as fast as possible.
    Not only should the extraction algorithm be reasonably fast, but solid compression should be avoided, or carefully applied with reasonably sized chunks, and the physical storage (including SAN/NAS where needed) should be chosen carefully for the user's needs so as not to add any bottleneck to the process; this is important for the backup creation part too (see the sketch after this post on the solid-compression point).

    The backup creation part is critical too: especially if users from different countries access the data around the clock, the backup creation should be as transparent as possible (able to avoid locking files and databases, keep system resource usage low, etc.), so the faster/lighter/smarter the better.

    Obviously, needs vary from case to case: some will need to restore more often (especially if versioning is not possible for most of the data, if users are not skilled, if the hardware is entry level, or the infrastructure is poor...), while others will prefer to complete the backup sooner and impact the system less during ordinary operation.

    I think that as long as we stay in the range of reasonably performing algorithms, the most important thing is to build a good data availability system to reduce as much as possible the cases where restoring from backup is the only option (however, I repeat: nothing, never, in any way, is a substitute for the backup!), and to avoid some critical errors (wrong physical media, wrong use of solid compression).
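
    On the solid-compression point above, here is a rough sketch (hypothetical file names and data, with Python's standard-library lzma module standing in for a real archiver) of why restoring a single file from a solid archive costs so much more than from a per-file or small-chunk archive: the whole solid stream has to be decompressed to reach the wanted member, while per-file compression only needs that one member decompressed. Real archivers mitigate this with a configurable solid block size.

    import lzma

    # Hypothetical data set: eight small, highly compressible files.
    files = {f"file{i}.dat": (f"payload of file {i}\n" * 60000).encode() for i in range(8)}
    order = list(files)

    # Per-file compression: every member can be restored on its own.
    per_file = {name: lzma.compress(data, preset=1) for name, data in files.items()}

    # "Solid" compression: one stream over the concatenation of all members.
    solid = lzma.compress(b"".join(files[name] for name in order), preset=1)

    target = order[-1]  # we only want the last file back

    # Per-file restore: decompress exactly one member.
    restored = lzma.decompress(per_file[target])

    # Solid restore: decompress the whole stream, then slice out the member.
    offset = sum(len(files[name]) for name in order[:-1])
    restored_solid = lzma.decompress(solid)[offset:offset + len(files[target])]

    assert restored == files[target] and restored_solid == files[target]
    print("per-file restore decompressed", len(per_file[target]), "compressed bytes")
    print("solid restore decompressed   ", len(solid), "compressed bytes")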

