Hello everyone,
We tend to focus on the speed of image compression tools from the perspective of impatient desktop users who, for some reason, need the compressed image right now. I think it would be very cool to have an application that sips CPU and electricity while running in the far background, compressing a batch of images over a span of many hours or days.
Is there anything like this out there for common formats like JPEG and PNG? I was reading about huffmix again, and it made me wonder about just taking the time to run lots of trials (and the equivalent brute-force operations for other formats). In a lot of cases we only need to optimize an image once (at least for a given size or device context like desktop, iPhone, and the enormous screens of recent Android phones and the iPhone Plus for the Wun Wun market). And there's no rush to have the compression done in two seconds; if it saved money, I think a lot of site owners would be happy to let the program run over a weekend.
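To make the idea concrete, here's a minimal sketch of the kind of tool I'm picturing: a Python script that crawls a folder and shells out to jpegtran and zopflipng at the lowest CPU scheduling priority. The particular flags, the one-second dawdle between files, and the assumption that both tools (plus "nice") are installed are all just my guesses, and of course lowering priority only yields the CPU to other work rather than guaranteeing lower total energy use.

```python
#!/usr/bin/env python3
"""Rough sketch of a 'far background' batch optimizer.

Assumes jpegtran and zopflipng are on PATH and that the 'nice' command
is available; the flag choices are placeholders, not recommendations.
"""
import subprocess
import sys
import time
from pathlib import Path

NICE = ["nice", "-n", "19"]  # lowest CPU scheduling priority

def optimize(path: Path) -> None:
    tmp = path.with_name(path.name + ".opt")
    suffix = path.suffix.lower()
    if suffix in (".jpg", ".jpeg"):
        cmd = NICE + ["jpegtran", "-copy", "none", "-optimize",
                      "-progressive", "-outfile", str(tmp), str(path)]
    elif suffix == ".png":
        # zopflipng -m means "compress more": many extra trials,
        # i.e. exactly the slow, thorough path this post is asking about.
        cmd = NICE + ["zopflipng", "-m", "-y", str(path), str(tmp)]
    else:
        return
    subprocess.run(cmd, check=True)
    # Keep the result only if it actually got smaller.
    if tmp.stat().st_size < path.stat().st_size:
        tmp.replace(path)
    else:
        tmp.unlink()

def main(root: str) -> None:
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            optimize(path)
            time.sleep(1)  # deliberately dawdle between files

if __name__ == "__main__":
    main(sys.argv[1])
```

Obviously a real tool would checkpoint its progress and skip files it has already handled, but that's the flavor of it.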
I can't find any super-slow, energy-sipping utilities. (In another life, maybe I'd get rich creating one.) Two related questions:
1. Are there any image optimizers using distributed computing, where the job can be divided among many computers (e.g. SETI@home, Asteroids@home, Folding@home)? BOINC is a common platform for such projects, but I didn't find any image compression applications outside of a couple of specialized scientific imaging projects.
2. Do you think the math would actually work out such that a super-slow compressor saves energy in the end, compared to just running a conventional fast compressor? It sounds like a good idea, but for all I know the physics and the programming don't actually work out that way. I don't know the right methodology for approaching this question, or whether good methodologies even exist; the sketch below is roughly the kind of estimate I have in mind.
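For what it's worth, the only approach I can imagine (but can't vouch for) is a break-even comparison: the extra joules the slow compressor burns versus the joules saved by every subsequent transfer of the smaller file. Every constant in this sketch is a made-up placeholder, not a measurement:

```python
# Back-of-envelope break-even sketch. All numbers are made-up placeholders.
extra_cpu_seconds = 600       # extra runtime of the slow compressor, per image
cpu_watts = 15                # assumed average power draw while compressing
extra_joules = extra_cpu_seconds * cpu_watts

bytes_saved = 20_000          # extra size reduction vs. a fast compressor
joules_per_byte = 1e-5        # assumed end-to-end cost of moving one byte

joules_saved_per_download = bytes_saved * joules_per_byte
break_even_downloads = extra_joules / joules_saved_per_download

print(f"extra energy spent compressing: {extra_joules:.0f} J")
print(f"energy saved per download:      {joules_saved_per_download:.2f} J")
print(f"break-even downloads:           {break_even_downloads:,.0f}")
```

With those placeholder numbers it breaks even after a few tens of thousands of downloads, but the answer swings wildly with the assumed joules-per-byte figure, which is exactly the part I don't know how to pin down.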
Thanks.