
Thread: World's slowest image compressor

  1. #1
    Member SolidComp's Avatar
    Join Date
    Jun 2015
    Location
    USA
    Posts
    244
    Thanks
    97
    Thanked 48 Times in 32 Posts

    World's slowest image compressor

    Hello everyone,

    We tend to focus on the speed of image compression tools from the perspective of impatient desktop users who for some reason need the compressed image right now. I think it would be very cool to have an application that sips CPU and electricity while running in the far background compressing and reducing a bunch of images over a span of many hours or days.

    Is there anything like this out there for common formats like JPEG and PNG? I was reading about huffmix again, and it made me wonder about just taking the time to run through lots of trials and equivalent operations, depending on the format. In a lot of cases we only need to optimize an image once (at least for a given size or device context like desktop, iPhone, and the enormous screens of recent Android phones and the iPhone Plus for the Wun Wun market). And there's no rush to have the compression done in two seconds; if it saved money, I think a lot of site owners would be happy to let the program run over a weekend.

    I can't find any superslow, energy-sipping utilities. (In another life, maybe I'd get rich creating one.) Two related questions:

    1. Are there any image optimizers using distributed computing, where the job can be divided among many computers (e.g. SETI@home, Asteroids@home, Folding@home)? BOINC is a common platform for such projects, but I didn't find any image compression applications outside of a couple of specialized scientific image projects.

    2. Do you think the math would actually work out such that a super-slow compressor would save energy in the end, compared to just running conventional fast compressors? It sounds like a good idea, but for all I know the physics and programming won't actually work out. I don't know the right methodology for approaching this question, or if good methodologies exist.

    Thanks.

  2. #2
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    3,771
    Thanks
    275
    Thanked 1,205 Times in 671 Posts

  3. Thanks:

    SolidComp (19th July 2016)

  4. #3
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    895
    Thanks
    54
    Thanked 109 Times in 86 Posts
    Compression tools have two speeds:

    compression
    decompression

    Increasing one does not necessarily increase the other, so you can easily do a lot of extra work to improve compression while decompression stays the same. It might not be the optimal speed/compression trade-off, but it's an option.
    If we are talking about .PNG files, you can look into my PNGbest.bat script; it has the ability to do some severe brute-forcing of PNG compression.

    Some of my compression-optimization runs with that script have gone on for weeks, but the receiver of the PNG file still decodes it fast.
    However, as always, there are diminishing returns as you increase how much you brute-force.

    If you think site owners really care about optimal compression, you will be sadly disappointed. The site I was writing hardware-comparison articles for didn't change anything, even after I showed a 13% efficiency gain on picture data from minimal compression optimization.
    A lot of people just like a quick and dirty solution.


    1:
    Not that I know of, but it's technically possible. Looking at my PNGbest.bat (shameless self-promotion), the part with the PNG /r trials could easily be distributed to multiple computers and the results then just huffmixed together; a rough sketch of the idea is at the end of this post.
    I even tried giving the batch script a cluster mode for this, but came up short. I might look into it again once brutepng gets finished.

    2: It all depends on how many people download it.

    The extra energy spent on compression vs. (energy saved per decode * number of users). The more users you have, the more time/resources you can spend on the compression.
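
    Roughly, the trial-and-mix loop could look like the sketch below. This is not the actual PNGbest.bat: it assumes a pngout-style tool whose /r switch randomizes the initial tables and a huffmix call that takes two candidate PNGs plus an output path, so treat the exact flags as guesses and adjust them to whatever your tools really accept. Each trial is independent, which is what makes splitting them across machines easy.

    Code:
    import os
    import shutil
    import subprocess
    import tempfile

    PNGOUT = "pngout"     # assumed: a tool with a /r "randomized tables" trial mode
    HUFFMIX = "huffmix"   # assumed: takes two candidate PNGs and writes a mixed output

    def run_trials(src, n_trials=16):
        # Each iteration is independent, so each one could run on a different machine.
        candidates = []
        for i in range(n_trials):
            out = os.path.join(tempfile.gettempdir(), "trial_%d.png" % i)
            shutil.copyfile(src, out)
            # Tools like this often exit nonzero when they find no gain, so don't abort.
            subprocess.run([PNGOUT, "/r", "/y", out], check=False)
            candidates.append(out)
        return candidates

    def mix_down(candidates, final_out):
        # Fold the candidates together pairwise, keeping the mixed result each time.
        best = candidates[0]
        for i, cand in enumerate(candidates[1:]):
            mixed = os.path.join(tempfile.gettempdir(), "mixed_%d.png" % i)
            subprocess.run([HUFFMIX, best, cand, mixed], check=False)
            if os.path.exists(mixed):
                best = mixed
        shutil.copyfile(best, final_out)

    if __name__ == "__main__":
        mix_down(run_trials("input.png"), "input_best.png")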

  5. Thanks:

    SolidComp (19th July 2016)

  6. #4
    Member
    Join Date
    Jul 2013
    Location
    United States
    Posts
    194
    Thanks
    44
    Thanked 140 Times in 69 Posts
    Quote Originally Posted by SolidComp View Post
    Are there any image optimizers using distributed computing, where the job can be divided among many computers (e.g. SETI@home, Asteroids@home, Folding@home)? BOINC is a common platform for such projects, but I didn't find any image compression applications outside of a couple of specialized scientific image projects.
    One of the nice things about images is that they're generally small enough that I don't think you need to think about this much; just let each node handle a separate image. All you would really need to do is throw together a quick load balancing broker, which you could do with ØMQ in a few minutes (or RabbitMQ, Qpid, etc.). I suppose if you're going to try a brute-force approach like pngcrush you could split it up so each configuration would potentially run on a different node, but it seems to me that overcoming the overhead of distributing multiple copies of the uncompressed image would be difficult.
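
    For illustration, here is about the smallest version of that broker I can think of, using pyzmq: one image path per message, and PUSH/PULL does the balancing across whatever workers are connected. The port number and the optimizer command are placeholders, not anything from a real setup.

    Code:
    import subprocess
    import sys
    import zmq  # pyzmq

    def broker(image_paths, endpoint="tcp://*:5557"):
        # PUSH fair-queues messages across the connected workers,
        # so the "load balancing" comes for free.
        ctx = zmq.Context()
        push = ctx.socket(zmq.PUSH)
        push.bind(endpoint)
        for path in image_paths:
            push.send_string(path)

    def worker(endpoint="tcp://broker-host:5557"):
        ctx = zmq.Context()
        pull = ctx.socket(zmq.PULL)
        pull.connect(endpoint)
        while True:
            path = pull.recv_string()
            # Placeholder optimizer call; substitute pngcrush, zopflipng, etc.
            subprocess.run(["zopflipng", "-y", path, path + ".opt.png"], check=False)

    if __name__ == "__main__":
        # "python jobs.py broker img1.png img2.png ..." on one box,
        # "python jobs.py worker" on each of the others.
        if sys.argv[1] == "broker":
            broker(sys.argv[2:])
        else:
            worker()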

    Quote Originally Posted by SolidComp View Post
    2. Do you think the math would actually work out such that a super-slow compressor would save energy in the end, compared to just running conventional fast compressors? It sounds like a good idea, but for all I know the physics and programming won't actually work out. I don't know the right methodology for approaching this question, or if good methodologies exist.
    I could be wrong, but my understanding is that it is generally much more efficient to just max out the CPU briefly so it can go back to a lower power state as soon as possible.
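
    To put made-up numbers on that intuition (every figure below is invented; only the shape of the arithmetic matters): the platform draws some baseline power for as long as the job keeps it awake, so stretching the same work out over more hours can cost more energy overall, not less.

    Code:
    # Every number here is invented; only the shape of the arithmetic matters.
    WORK_SECONDS_AT_FULL_SPEED = 3600   # one hour of work if the CPU is maxed out
    P_BASELINE = 20.0                   # watts the machine draws just being awake
    P_CPU_FAST = 60.0                   # extra watts with the CPU maxed out
    P_CPU_SLOW = 10.0                   # extra watts when throttled to 25% speed

    def watt_hours(duration_s, cpu_watts):
        return (P_BASELINE + cpu_watts) * duration_s / 3600.0

    fast = watt_hours(WORK_SECONDS_AT_FULL_SPEED, P_CPU_FAST)      # finish, then sleep
    slow = watt_hours(WORK_SECONDS_AT_FULL_SPEED * 4, P_CPU_SLOW)  # 4x longer, awake the whole time
    print("race to idle: %.0f Wh, throttled: %.0f Wh" % (fast, slow))  # 80 vs 120

    Of course, if the box is an always-on server whose baseline power is paid regardless, only the CPU term differs and the throttled run looks better; that is really the crux of the question.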

    I think your best bet would be to just set the process to a very low priority and let the OS and CPU decide how to handle it. On POSIX you can use the sched_setscheduler function (or the chrt utility on Linux); Linux has a SCHED_IDLE policy and a SCHED_BATCH policy, while on BSD/OS X you would probably fall back to plain priority adjustment. There is also nice (both a utility [man 1 nice] and a function [man 2 nice]), which also affects scheduling. For Windows… well, there is probably something vaguely similar (likely buried in a terrible API, and without a CLI).
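
    As a concrete example of the low-priority route, a small Python wrapper (the compressor command itself is whatever you pass on the command line): on Linux it asks for SCHED_IDLE, and everywhere else it just drops to maximum niceness.

    Code:
    import os
    import subprocess
    import sys

    def make_idle():
        # Runs in the child just before exec: ask the scheduler to run us last.
        try:
            os.sched_setscheduler(0, os.SCHED_IDLE, os.sched_param(0))  # Linux only
        except (AttributeError, OSError):
            os.nice(19)  # portable fallback: maximum niceness

    if __name__ == "__main__":
        # Usage: python idle_run.py <whatever compressor command you like>
        # (On Windows you would pass creationflags=subprocess.IDLE_PRIORITY_CLASS
        # instead of preexec_fn, which is POSIX-only.)
        subprocess.run(sys.argv[1:], preexec_fn=make_idle, check=False)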

    Another thing to keep in mind is that slower compression algorithms are already at the point where there are diminishing returns. Even background processing isn't free; you still have to pay for electricity. Otherwise we would all be running bitcoin miners in the background. Once you reach the point where the electricity costs outweigh the savings from reduced bandwidth and storage, why bother?
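
    That break-even point is easy to estimate on the back of an envelope. All of the figures below are invented placeholders; only the arithmetic matters.

    Code:
    # Every figure below is an invented placeholder; only the arithmetic matters.
    EXTRA_CPU_HOURS_PER_IMAGE = 2.0       # extra optimization time per image
    EXTRA_CPU_KILOWATTS       = 0.08      # 80 W of additional draw while optimizing
    DOLLARS_PER_KWH           = 0.15
    BYTES_SAVED_PER_IMAGE     = 20_000    # 20 kB shaved off the file
    LIFETIME_DOWNLOADS        = 50_000
    DOLLARS_PER_GB_TRANSFER   = 0.05

    electricity = EXTRA_CPU_HOURS_PER_IMAGE * EXTRA_CPU_KILOWATTS * DOLLARS_PER_KWH
    bandwidth   = BYTES_SAVED_PER_IMAGE * LIFETIME_DOWNLOADS / 1e9 * DOLLARS_PER_GB_TRANSFER
    print("electricity: $%.4f  bandwidth saved: $%.4f" % (electricity, bandwidth))
    # With these made-up numbers: $0.0240 vs $0.0500 per image, so it pays off,
    # but only because of the 50,000 downloads.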

  7. Thanks:

    SolidComp (19th July 2016)

  8. #5
    Member SolidComp's Avatar
    Join Date
    Jun 2015
    Location
    USA
    Posts
    244
    Thanks
    97
    Thanked 48 Times in 32 Posts
    Quote Originally Posted by SvenBent View Post
    2: It all depends on how many people download it.

    The extra energy spent on compression vs. (energy saved per decode * number of users). The more users you have, the more time/resources you can spend on the compression.
    I meant the amount of energy it takes to compress slowly vs. the amount of energy it takes to compress quickly. I'm not thinking about energy needed to decode – that's a separate issue, with a whole lot of other considerations.

    So what I was getting at was: if compression time is not important, is there a way to write a compression utility such that it uses much less energy than typical (instant) compression utilities? For example, it's generally the case that if you drive somewhere at 50 mph, you'll use less gasoline than if you drove at 70 mph. (This probably isn't linear all the way down – I don't know if driving at 10 mph saves gasoline, given the lower gearing, etc.) Could a superslow compressor save energy, or would it actually use more energy?

    Also, you're right that a lot of site owners don't care about optimization. For this question, I'm really thinking only about organizations and companies that have at least tens of thousands of images, and perhaps hundreds of thousands, or millions. I assume that at this scale, people would care about how much CPU/servers/energy it took to optimize images, since it's probably consequential in money terms. For example, I was just looking at http://imageresizing.net/ – what they do is pretty massive, it looks like. I was surprised not to find any work on the most efficient image processing possible, which I assume might be very slow. Of course, I could be wrong – maybe there is no optimal slow implementation. A process has to run, so it comes down to what kinds of processes can be built given our programming languages, CPUs, and algorithms.

  9. #6
    Member SolidComp's Avatar
    Join Date
    Jun 2015
    Location
    USA
    Posts
    244
    Thanks
    97
    Thanked 48 Times in 32 Posts
    Quote Originally Posted by Shelwien View Post
    Whoa, what is this!? I've never heard of DLI. What the hell? The numbers look outstanding. What's the catch? What's its status? I don't understand how I missed this.

    Is it supposed to be super efficient in terms of CPU or energy? Is it slow? All I see are compression level numbers, which are very impressive.

