Hi,
WebP is "VP8 applied to pictures".
http://code.google.com/speed/webp/
Best regards,
And it sucks; I've been reviewing an article on it this morning.
http://x264dev.multimedia.cx/?p=541#more-541
then UCI is "x264 applied to pictures".
http://tieba.baidu.com/f?kz=839366347 (Chinese article)
Another article on WebP, this time in its favour. But he's using very low file sizes, like ~45K for 1920×1280 images, which is not "real world" imo.
http://englishhard.com/2010/10/01/re...bp-versus-jpg/
Gonna stick this here instead of a new thread. Another new image format, hipix.
http://www.hipixpro.com/index.html
Supported by Google, too?
Opera does on-the-fly JPEG->WebP transcoding to speed up viewing:
Thanks in part to the inclusion of WebP, Opera says, it has boosted the speed of Turbo while improving image quality. Turbo uses proxy servers to compress web pages before sending them down to the browser. In Opera's lab tests, the new Turbo provides 35 per cent smaller pages and is 15 per cent faster than the version included with Opera 11.
Yeah, I read that with the release of 11.10, but I've yet to try it. Waiting on the Opera@USB 11.10 version.
Google updated WebP lossless mode: http://code.google.com/speed/webp/do...y.html#results (link from /.).
Some extra info:
http://googlecode.blogspot.com/2011/...coding-in.html
I wasn't that impressed by WebP lossy in previous tests vs JPEG, but the new lossless mode is pretty amazing compared to PNG on 24-bit images. On 8-bit paletted images you lose your palette, as it converts everything to its 24-bit format. On average I see at least a ~80% improvement; at best it's astounding, taking some files down to ~15% of their original size even after my brute-force PNG scripts have been run against them. I will say this though: it is sloooow at compressing, but decompressing is super fast, so it must brute-force its way to these sizes.
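To see what that palette expansion means in practice, here is a minimal Pillow sketch (icon.png is just a placeholder name): a paletted PNG stores one index byte per pixel, while a 24-bit-only format ends up encoding three channel bytes per pixel before any compression.

from PIL import Image

im = Image.open("icon.png")          # hypothetical 8-bit paletted PNG
print(im.mode)                       # 'P' -> indexed, one palette index per pixel
rgb = im.convert("RGB")              # what a 24-bit-only format effectively stores
print(len(im.tobytes()), "vs", len(rgb.tobytes()))  # 1 vs 3 bytes per pixel, pre-compression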
I tried it and I'm very disappointed, though I don't have too much data.
1. It's extremely memory hungry: on a 0.25 MB file it needed 40 MB; on a 27 MB one it crashed after asking for probably > 1 GB.
2. It's extremely slow. On my old Pentium D, on a 0.25 MB file it needed 3 minutes. On a 27 MB one it crashed after over 6 hours, which means it worked even slower.
3. I have very little data, so take it with a huge grain of salt, but it seems about as strong as BCIF, which was 900 times faster on the small file.
More info:
http://extrememoderate.wordpress.com...t-impressions/
I am tempted to do a more detailed comparison; it can't be that bad...
But I don't think I'll do it, it's too slow and takes too much memory. I would be willing to test it anyway if the preliminary results were promising, but they were not.
Last edited by m^2; 20th November 2011 at 23:09.
I did a quick test on 3 optimized Firefox icons (standard, aurora and nightly):
standard
png = 49502, webpll = 38085 bytes (0.769363)
bmf = 30764
aurora
png = 51842, webpll = 40958 bytes (0.790054)
bmf = 30908
nightly
png = 49171, webpll = 38493 bytes (0.782839)
bmf = 29504
On the 3 sample files I've used for Huffmix.
bigmac
png = 33864, webpll = 28684 bytes (0.847035)
bmf = 27096
getadrink
png = 65225, webpll = 49543 bytes (0.759571)
bmf = 41096
mouse
png = 104740, webpll = 73559 bytes (0.702301)
bmf = 59804
It's not that slow considering the intended target (web graphics), where you are supposed to compress limited-size files once and download them thousands of times... webpll still has to be optimized for speed.
PNG has some design flaws: interleaving the alpha channel with the RGB data was a bad idea (as I have demonstrated here) and Deflate is pretty old now, so outperforming PNG by 15% is not a surprise.
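A rough way to see the interleaving cost, as a minimal sketch with Pillow and zlib (no PNG filtering applied, sample.png is a placeholder for any RGBA image): Deflate often does noticeably better when the alpha plane is kept separate from the colour data.

import zlib
from PIL import Image

im = Image.open("sample.png").convert("RGBA")    # placeholder RGBA test image
interleaved = im.tobytes()                       # R,G,B,A,R,G,B,A,... as PNG stores it

r, g, b, a = im.split()                          # peel off the alpha plane
planar = Image.merge("RGB", (r, g, b)).tobytes() + a.tobytes()

print("interleaved RGBA:", len(zlib.compress(interleaved, 9)))
print("RGB + separate A:", len(zlib.compress(planar, 9)))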
I've tried different levels of compression; pushing the compression level higher does not always give better results (a sweep script sketch follows below):
On my first sample file.
41842 c00.webpll
40699 c05.webpll
40139 c10.webpll
39916 c20.webpll
39901 c15.webpll
39860 c25.webpll
39860 c30.webpll
39800 c40.webpll
39794 c45.webpll
39749 c35.webpll
38227 c80.webpll
38226 c65.webpll
38163 c100.webpll
38151 c85.webpll
38127 c50.webpll
38109 c60.webpll
38105 c90.webpll
38096 c70.webpll
38094 c75.webpll
38085 c95.webpll
38085 default.webpll
38067 c55.webpll
Here the smallest file is produced at level 55!
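The sweep above is easy to script. A rough sketch, assuming the experimental png2webpll binary takes the input file, an output name and a -c <level> switch as the file names above suggest; check the tool's own usage string, the real argument order may well differ.

import os
import subprocess

# Hypothetical command line: adjust to whatever your png2webpll build expects.
for level in range(0, 101, 5):
    out = f"c{level:02d}.webpll"
    subprocess.run(["png2webpll", "-c", str(level), "1.png", out], check=True)
    print(os.path.getsize(out), out)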
I could not try BCIF on these files:
ERROR: Image must have a 24 bit color depth instead of 32
Last edited by caveman; 21st November 2011 at 06:14. Reason: Added BMF -s -q9 results for comparison
I noticed some odd dirty-transparency patterns in some images from Google's WebP gallery:
I thought it was there to optimize WebP prediction, but at least with webpll it actually hurts compression compared to other schemes.
Image 4 also has some artifacts in the background, though different ones. I didn't analyse it.
Any ideas?
That's the classic fingerprint of a PNG optimiser. Perhaps they converted PNGs to WEBP that had already been optimised?
It looks like the files produced by some versions of Adobe Photoshop around version 7.
The Chameleon picture I used to show what CryoPNG did behind the curtain had the same kind of patterns:
At first I thought it was there to improve compression, but in fact it's closer to a bug.
(what CryoPNG produces)
png2webpll itself does not clean dirty transparent pixels, I applied CryoPNG to the rose sample file:
121363 1.png (original file)
170325 f0.png
119161 f1.png
122218 f2.png
115611 f3.png
114681 f4.png
ran png2webpll:
90196 1.webpll
84421 f0.webpll
84230 f1.webpll
83706 f2.webpll
91290 f3.webpll
84093 f4.webpll
and BMF:
80344 1.bmf
72988 f0.bmf
76336 f1.bmf
75928 f2.bmf
76772 f3.bmf
77508 f4.bmf
BMF still produces smaller files and is way faster.
Well, if you zero some delta-encoded elements, then the decoded stream won't necessarily have zeros in those places. In fully transparent areas there can be anything, and that won't make any visible difference, so one can develop a technique which inserts some weird values along the border of the invisible image portion to potentially improve compression. It would bring some insight if you showed the filtered images directly, e.g. filtered by x-delta, y-delta, xy-delta, the Paeth filter and the fifth one (?), with (fully and partially) visible pixels masked out, so we'd see only what the compressor does with the invisible filtered pixels.
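For reference, the simplest possible cleaner, as a Pillow sketch: it just zeroes everything under fully transparent pixels so the filters and the entropy coder see uniform runs instead of "dirty" data. This is not what CryoPNG or Google's optimiser actually do (they apparently pick values the predictors like better); the file names are placeholders.

from PIL import Image

def zero_invisible(src, dst):
    """Replace the RGB values of fully transparent pixels with 0."""
    im = Image.open(src).convert("RGBA")
    px = im.load()
    w, h = im.size
    for y in range(h):
        for x in range(w):
            r, g, b, a = px[x, y]
            if a == 0:
                px[x, y] = (0, 0, 0, 0)
    im.save(dst, optimize=True)

zero_invisible("rose.png", "rose_clean.png")   # placeholder file names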
2 Piotr:
Yes and CryoPNG does it.
Flower_foveon from http://www.imagecompression.info/test_images
3267397 - PNG
2659507 - webpll -c 30
2441574 - webpll -c 50
2441453 - webpll -c 70
2440932 - webpll -c 90 -- ~6 hours
2440804 - webpll -c 100 -- ~6 hours
1783812 - BMF -s -q9
1763982 - GraLIC 1.11.demo -- 4 seconds
Last edited by Alexander Rhatushnyak; 24th November 2011 at 01:35. Reason: added webpll -c 100
This newsgroup is dedicated to image compression:
http://linkedin.com/groups/Image-Compression-3363256
Waiting for Shelwien to hack the decoder to output uncompressed PNGs :] Maybe WebP's strength is decompression speed & memory requirements. Otherwise, WebP looks very inferior even to BCIF.
*THE* big part of the WebM idea is not the best possible compression; it's at best comparable compression. The big thing is Google owning or avoiding any patents. Presumably this is a driving force behind the technology choices in WebP too.
I think Google could own BCIF for less than they spent on WebP lossless.
Are you aware of any patents covering BCIF? I'm not, though it means little because I'm not into it...
ADDED:
BTW, webpll seems to be heavily optimized for tiny images. I'm doing a test right now; I don't have any BCIF results yet, but I do have partial webpll and optimized PNG results.
On files up to 1.2 KB webp saved 27%.
1.2-2.4 KB - 22%
2.4-4.3 KB - 14%
By file size I mean the size of a PNG as I got it, mostly unoptimized. Please note that there's a small selection bias in putting files into these 3 buckets: the smaller the file, the higher the likelihood that it's already optimised, and the less there is to gain. But optimized images are rare, so the effect is small. In the first bucket, that's just 3.8%.
Part of the reason is a very lightweight container; the smallest webpll file that I got takes just 6 bytes (one 32-bit pixel). The same image as an optimized PNG is 70 bytes. But even if I added 64 bytes to each webpll file, the saved percentage would still shrink with size.
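The PNG side of that comparison is easy to reproduce. A sketch that hand-builds a 1x1 RGBA PNG shows how much of those ~70 bytes is pure container: 8-byte signature, 25-byte IHDR chunk and 12-byte IEND chunk before any pixel data.

import struct, zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    # length + type + data + CRC over type and data
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data) & 0xFFFFFFFF))

raw = b"\x00" + bytes([255, 0, 0, 255])          # filter byte + one RGBA pixel
png = (b"\x89PNG\r\n\x1a\n"                      # 8-byte signature
       + chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 6, 0, 0, 0))  # 25 bytes
       + chunk(b"IDAT", zlib.compress(raw, 9))
       + chunk(b"IEND", b""))                    # 12 bytes
print(len(png), "bytes for a single pixel")      # roughly 68-70 bytes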
ADDED:
4.3-8.2 KB - 19.6%, so maybe it's just random variability
Last edited by m^2; 24th November 2011 at 13:42.
There is a bug in webpll. It silently corrupts 64-bit PNGs by converting them to 32-bit ones.
I can't tell them because there is no way of contact other than by using a Google Account, which, fuck you Google, I don't want to have. It would be nice if somebody passed the information on to them.
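Until that's fixed, a small guard that reads the bit depth straight out of the IHDR chunk can at least flag 16-bit-per-channel ("64-bit") PNGs before they go through the converter. A minimal sketch; input.png is a placeholder.

import struct

def png_depth_and_colortype(path):
    """Read bit depth and colour type from IHDR (always the first chunk)."""
    with open(path, "rb") as f:
        f.seek(16)                         # 8-byte signature + 4-byte length + 'IHDR'
        w, h, depth, ctype = struct.unpack(">IIBB", f.read(10))
    return depth, ctype

depth, ctype = png_depth_and_colortype("input.png")   # placeholder file name
if depth == 16:
    print("16 bits per channel: webpll will silently truncate this to 8")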
thx
Though I certainly wouldn't call it a feature.
Lossless means lossless.
I tested webpll and some other codecs on web png data.
https://extrememoderate.wordpress.co...ion-benchmark/