Page 1 of 3
Results 1 to 30 of 63

Thread: WebP (lossy image compression)

  1. #1
    Member
    Join Date
    May 2008
    Location
    France
    Posts
    48
    Thanks
    1
    Thanked 1 Time in 1 Post

    WebP (lossy image compression)

    Hi,

    WebP is "VP8 applied to pictures".

    http://code.google.com/speed/webp/

    Best regards,

  2. #2
    Member
    Join Date
    May 2008
    Location
    England
    Posts
    325
    Thanks
    18
    Thanked 6 Times in 5 Posts
    And it sucks; I've been reviewing an article on it this morning.

    http://x264dev.multimedia.cx/?p=541#more-541

  3. #3
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by Intrinsic View Post
    And it sucks; I've been reviewing an article on it this morning.

    http://x264dev.multimedia.cx/?p=541#more-541
    Thanks for the link. I wondered why Google didn't post any comparison with JPEG 2000 and the like...

  4. #4
    Member
    Join Date
    May 2008
    Location
    HK
    Posts
    160
    Thanks
    4
    Thanked 25 Times in 15 Posts
    Then UCI is "x264 applied to pictures".
    http://tieba.baidu.com/f?kz=839366347 (Chinese article)

  5. #5
    Member
    Join Date
    May 2008
    Location
    England
    Posts
    325
    Thanks
    18
    Thanked 6 Times in 5 Posts
    Another article on WebP, this time in its favour. But he's using very small file sizes, like ~45 KB for 1920×1280 images, which is not "real world" IMO.

    http://englishhard.com/2010/10/01/re...bp-versus-jpg/

  6. #6
    Member
    Join Date
    May 2008
    Location
    England
    Posts
    325
    Thanks
    18
    Thanked 6 Times in 5 Posts
    Gonna stick this here instead of a new thread. Another new image format, hipix.

    http://www.hipixpro.com/index.html

  7. #7
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,505
    Thanks
    741
    Thanked 665 Times in 359 Posts
    Supported by Google, too?

  8. #8
    Programmer osmanturan's Avatar
    Join Date
    May 2008
    Location
    Mersin, Turkiye
    Posts
    651
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by Intrinsic View Post
    Gonna stick this here instead of a new thread. Another new image format, hipix.

    http://www.hipixpro.com/index.html
    Another x264-based codec. I wonder why everyone is getting lazy.
    BIT Archiver homepage: www.osmanturan.com

  9. #9
    Member Surfer's Avatar
    Join Date
    Mar 2009
    Location
    oren
    Posts
    203
    Thanks
    18
    Thanked 7 Times in 1 Post
    Quote Originally Posted by Intrinsic View Post
    Gonna stick this here instead of a new thread. Another new image format, hipix.

    http://www.hipixpro.com/index.html
    This sh*t only requires the .NET Framework 3.5 SP1.

  10. #10

  11. #11
    Member
    Join Date
    May 2008
    Location
    England
    Posts
    325
    Thanks
    18
    Thanked 6 Times in 5 Posts
    Yeah, I read about that with the 11.10 release; yet to try it though. Waiting on the Opera@USB 11.10 version.

  12. #12
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,474
    Thanks
    26
    Thanked 121 Times in 95 Posts
    Google has added a lossless mode to WebP: http://code.google.com/speed/webp/do...y.html#results (link from Slashdot).

  13. #13
    Member
    Join Date
    Aug 2008
    Location
    Planet Earth
    Posts
    788
    Thanks
    64
    Thanked 274 Times in 192 Posts

  14. #14
    Member
    Join Date
    May 2008
    Location
    England
    Posts
    325
    Thanks
    18
    Thanked 6 Times in 5 Posts
    I wasn't that impressed by WebP lossy in previous tests vs JPEG, but the new lossless mode is pretty amazing compared to PNG on 24-bit images. On 8-bit paletted images you lose your palette, since it converts everything to its 24-bit format. On average I see at least an ~80% improvement; at best it's astounding, taking some files down to ~15% of their original size even after my brute-force PNG scripts have been run against them. I will say this though: it is sloooooow at compressing, but decompressing is super fast, so it must be brute-forcing its way to these sizes.
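    (If anyone wants to reproduce this kind of PNG vs. lossless-WebP size comparison without the standalone tools, here is a minimal Python sketch using Pillow, which can encode lossless WebP; the file name is a placeholder and method=6 is simply the slowest/strongest effort setting.)
    Code:
    # Minimal sketch: compare a PNG against its lossless WebP re-encode.
    # Assumes Pillow was built with WebP support; "sample.png" is a placeholder.
    import os
    from PIL import Image

    def lossless_webp_size(png_path):
        """Return (png_size, webp_size) in bytes for a lossless WebP re-encode."""
        webp_path = png_path + ".webp"
        img = Image.open(png_path)
        # method=6 asks for the slowest, strongest effort, which matches the
        # "slow to compress, fast to decompress" observation above.
        img.save(webp_path, format="WEBP", lossless=True, method=6)
        return os.path.getsize(png_path), os.path.getsize(webp_path)

    png, webp = lossless_webp_size("sample.png")
    print("PNG: %d bytes, lossless WebP: %d bytes (%.1f%%)" % (png, webp, 100.0 * webp / png))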

  15. #15
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    I tried it and I'm very disappointed, though I don't have much data.
    1. It's extremely memory-hungry: on a 0.25 MB file it needed 40 MB, and on a 27 MB one it crashed after asking for probably > 1 GB.
    2. It's extremely slow. On my old Pentium D, the 0.25 MB file took 3 minutes. The 27 MB one crashed after more than 6 hours, which means it was running even slower.
    3. I have very little data, so take this with a huge grain of salt, but it seems about as strong as BCIF, which was 900 times faster on the small file.

    More info:
    http://extrememoderate.wordpress.com...t-impressions/

    I am tempted to do a more detailed comparison; it can't be that bad...

    But I don't think I will: it's too slow and takes too much memory. I would be willing to test it anyway if the preliminary results were promising, but they were not.
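    (For anyone repeating these measurements, here is a rough Python sketch of one way to record the wall time and peak memory of an external encoder on Linux/macOS; the png2webpll command line is just a placeholder for whatever the tool actually expects.)
    Code:
    # Rough sketch: time an external encoder and report its peak RSS.
    # Unix-only (uses the resource module); the command line is a placeholder.
    import resource
    import subprocess
    import time

    cmd = ["png2webpll", "input.png", "output.webpll"]  # hypothetical invocation
    start = time.time()
    subprocess.run(cmd, check=True)
    elapsed = time.time() - start
    # ru_maxrss is reported in kilobytes on Linux and in bytes on macOS.
    peak = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss
    print("wall time: %.1f s, peak RSS of child processes: %d" % (elapsed, peak))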
    Last edited by m^2; 20th November 2011 at 23:09.

  16. #16
    Member caveman's Avatar
    Join Date
    Jul 2009
    Location
    Strasbourg, France
    Posts
    190
    Thanks
    8
    Thanked 62 Times in 33 Posts
    I did a quick test on 3 optimized Firefox icons (standard, aurora and nightly):
    standard
    png = 49502, webpll = 38085 bytes (0.769363)
    bmf = 30764

    aurora
    png = 51842, webpll = 40958 bytes (0.790054)
    bmf = 30908

    nightly
    png = 49171, webpll = 38493 bytes (0.782839)
    bmf = 29504

    And on the 3 sample files I've used for Huffmix:
    bigmac
    png = 33864, webpll = 28684 bytes (0.847035)
    bmf = 27096

    getadrink
    png = 65225, webpll = 49543 bytes (0.759571)
    bmf = 41096

    mouse
    png = 104740, webpll = 73559 bytes (0.702301)
    bmf = 59804

    It's not that slow considering the intended target (web graphics), where you're supposed to compress limited-size files once and download them thousands of times... webpll still has to be optimized for speed.

    PNG has some design flaws: interleaving the alpha channel with the RGB data was a bad idea (as I have demonstrated here), and Deflate is pretty old now, so outperforming PNG by 15% is not a surprise.
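
    (One quick way to see how much the interleaving matters on a given image: the sketch below deflate-compresses the raw RGBA pixels once interleaved and once as separate RGB and alpha planes. It skips PNG's per-row filters entirely, so it only hints at the effect and says nothing about how webpll itself works; the file name is a placeholder.)
    Code:
    # Illustration only: does splitting the alpha plane away from RGB help Deflate?
    # Raw pixels, no PNG row filters; "sample.png" is a placeholder.
    import zlib
    from PIL import Image

    img = Image.open("sample.png").convert("RGBA")
    rgba = img.tobytes()                      # interleaved R,G,B,A,R,G,B,A,...

    rgb = bytes(b for i, b in enumerate(rgba) if i % 4 != 3)
    alpha = rgba[3::4]

    interleaved = len(zlib.compress(rgba, 9))
    separated = len(zlib.compress(rgb, 9)) + len(zlib.compress(alpha, 9))
    print("interleaved RGBA: %d bytes, RGB + alpha planes: %d bytes" % (interleaved, separated))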

    I've tried different compression levels; pushing the level higher does not always give better results.
    On my first sample file:
    41842 c00.webpll
    40699 c05.webpll
    40139 c10.webpll
    39916 c20.webpll
    39901 c15.webpll
    39860 c25.webpll
    39860 c30.webpll
    39800 c40.webpll
    39794 c45.webpll
    39749 c35.webpll
    38227 c80.webpll
    38226 c65.webpll
    38163 c100.webpll
    38151 c85.webpll
    38127 c50.webpll
    38109 c60.webpll
    38105 c90.webpll
    38096 c70.webpll
    38094 c75.webpll
    38085 c95.webpll
    38085 default.webpll
    38067 c55.webpll

    Here the smallest file is produced at level 55!
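
    (The non-monotonic behaviour is easy to reproduce with a simple sweep. Below is a sketch using Pillow's lossless WebP encoder; I'm assuming its quality parameter acts as an effort knob in lossless mode, roughly like png2webpll's -c levels, and the file name is a placeholder.)
    Code:
    # Sketch: sweep lossless-WebP effort settings and print the resulting sizes.
    # quality is assumed to act as an effort level in lossless mode.
    import io
    from PIL import Image

    img = Image.open("firefox_icon.png")      # hypothetical input
    for q in range(0, 101, 5):
        buf = io.BytesIO()
        img.save(buf, format="WEBP", lossless=True, quality=q, method=6)
        print("effort %3d: %d bytes" % (q, buf.tell()))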

    I could not try BCIF on these files:
    ERROR: Image must have a 24 bit color depth instead of 32
    Last edited by caveman; 21st November 2011 at 06:14. Reason: Added BMF -s -q9 results for comparison

  17. #17
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    I noticed an odd dirty-transparency pattern in some images from Google's WebP gallery:

    I thought it was there to optimize WebP prediction, but at least with webpll it actually hurts compression compared to other schemes.
    Image 4 also has something in the background, though different. I didn't analyse it.
    Any ideas?
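    (If anyone wants to check their own files for this, the sketch below counts "dirty" pixels, i.e. fully transparent pixels whose RGB values are not zero; the file name is a placeholder.)
    Code:
    # Sketch: count fully transparent pixels that still carry non-zero RGB values.
    # Such pixels are invisible but still influence prediction and compression.
    from PIL import Image

    img = Image.open("gallery_image.png").convert("RGBA")   # placeholder path
    dirty = sum(1 for r, g, b, a in img.getdata() if a == 0 and (r, g, b) != (0, 0, 0))
    total = img.width * img.height
    print("%d of %d pixels are fully transparent with non-zero RGB" % (dirty, total))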

  18. #18
    Member
    Join Date
    Feb 2010
    Location
    Nordic
    Posts
    200
    Thanks
    41
    Thanked 36 Times in 12 Posts
    That's the classic fingerprint of a PNG optimiser. Perhaps they converted PNGs to WEBP that had already been optimised?

  19. #19
    Member caveman's Avatar
    Join Date
    Jul 2009
    Location
    Strasbourg, France
    Posts
    190
    Thanks
    8
    Thanked 62 Times in 33 Posts
    It looks like the files produced by some versions of Adobe Photoshop around version 7.
    The Chameleon picture I used to show what CryoPNG did behind the curtain had the same kind of patterns:

    At first I thought it was there to improve compression, but in fact it's closer to a bug.

    (what CryoPNG produces)

    Quote Originally Posted by m^2 View Post
    I noticed an odd dirty-transparency pattern in some images from Google's WebP gallery:

    I thought it was there to optimize WebP prediction, but at least with webpll it actually hurts compression compared to other schemes.
    Image 4 also has something in the background, though different. I didn't analyse it.
    Any ideas?

  20. #20
    Member caveman's Avatar
    Join Date
    Jul 2009
    Location
    Strasbourg, France
    Posts
    190
    Thanks
    8
    Thanked 62 Times in 33 Posts
    png2webpll itself does not clean dirty transparent pixels. I applied CryoPNG to the rose sample file:
    121363 1.png (original file)
    170325 f0.png
    119161 f1.png
    122218 f2.png
    115611 f3.png
    114681 f4.png

    ran png2webpll:
    90196 1.webpll
    84421 f0.webpll
    84230 f1.webpll
    83706 f2.webpll
    91290 f3.webpll
    84093 f4.webpll

    and BMF:
    80344 1.bmf
    72988 f0.bmf
    76336 f1.bmf
    75928 f2.bmf
    76772 f3.bmf
    77508 f4.bmf

    BMF still produces smaller files, and much faster.

  21. #21
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,474
    Thanks
    26
    Thanked 121 Times in 95 Posts
    Well, if you zero some delta-encoded elements, the decoded stream won't necessarily have zeros in those places. In fully transparent areas there can be anything without making any visible difference, so one could develop a technique that inserts some weird values along the border of the invisible image portion to potentially improve compression. It would bring some insight if you showed the filtered images directly, e.g. filtered by x-delta, y-delta, xy-delta, the Paeth filter and the fifth one (?), with (fully and partially) visible pixels masked out, so we would see only what the compressor does with the invisible filtered pixels.
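    (A sketch of what such a visualisation could look like: it applies the standard PNG-style predictors (Sub, Up, Average, Paeth) to one channel and keeps only the residuals that fall under fully transparent pixels. It is just an illustration of the suggestion, not an existing tool, and the file name is a placeholder.)
    Code:
    # Sketch: PNG-style filter residuals restricted to invisible (alpha == 0) pixels.
    import numpy as np
    from PIL import Image

    def paeth(a, b, c):
        """PNG Paeth predictor: left (a), above (b), upper-left (c)."""
        p = a + b - c
        pa, pb, pc = abs(p - a), abs(p - b), abs(p - c)
        if pa <= pb and pa <= pc:
            return a
        return b if pb <= pc else c

    img = np.asarray(Image.open("sample.png").convert("RGBA"), dtype=np.int16)
    chan, alpha = img[..., 0], img[..., 3]            # red channel as an example

    left = np.roll(chan, 1, axis=1); left[:, 0] = 0   # pixel to the left (0 at edges)
    up = np.roll(chan, 1, axis=0); up[0, :] = 0       # pixel above
    ul = np.roll(left, 1, axis=0); ul[0, :] = 0       # pixel above-left

    residuals = {
        "sub": chan - left,
        "up": chan - up,
        "avg": chan - (left + up) // 2,
        "paeth": chan - np.vectorize(paeth)(left, up, ul),
    }
    for name, res in residuals.items():
        hidden = res[alpha == 0]                      # residuals under invisible pixels
        if hidden.size:
            print("%5s: %d hidden residuals, mean |r| = %.2f"
                  % (name, hidden.size, np.abs(hidden).mean()))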

  22. #22
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    To Piotr:
    Yes, and CryoPNG does it.

  23. #23
    Member Alexander Rhatushnyak's Avatar
    Join Date
    Oct 2007
    Location
    Canada
    Posts
    237
    Thanks
    39
    Thanked 92 Times in 48 Posts
    Flower_foveon from http://www.imagecompression.info/test_images
    3267397 - PNG
    2659507 - webpll -c 30
    2441574 - webpll -c 50
    2441453 - webpll -c 70
    2440932 - webpll -c 90 -- ~6 hours
    2440804 - webpll -c 100 -- ~6 hours
    1783812 - BMF -s -q9
    1763982 - GraLIC 1.11.demo -- 4 seconds
    Last edited by Alexander Rhatushnyak; 24th November 2011 at 01:35. Reason: added webpll -c 100

    This newsgroup is dedicated to image compression:
    http://linkedin.com/groups/Image-Compression-3363256

  24. #24
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,474
    Thanks
    26
    Thanked 121 Times in 95 Posts
    Waiting for Shelwien to hack the decoder to output uncompressed PNGs :] Maybe WebP's strengths are decompression speed and memory requirements. Otherwise, WebP looks clearly inferior even to BCIF.

  25. #25
    Member
    Join Date
    Feb 2010
    Location
    Nordic
    Posts
    200
    Thanks
    41
    Thanked 36 Times in 12 Posts
    *THE* big part of the WebM idea is not the best possible compression; it's at best comparable compression. The big thing is Google owning or avoiding any patents. Presumably this is a driving force behind the technology choices in WebP too.

  26. #26
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    Quote Originally Posted by willvarfar View Post
    *THE* big part of the WebM idea is not the best possible compression; it's at best comparable compression. The big thing is Google owning or avoiding any patents. Presumably this is a driving force behind the technology choices in WebP too.
    I think Google could own BCIF for less than they spent on WebP lossless.
    Are you aware of any patents covering BCIF? I'm not, though that means little because I'm not into it...

    ADDED:

    BTW, webpll seems to be heavily optimized for tiny images. I'm doing a test right now; I don't have any BCIF results yet, but I do have partial webpll and optimized PNG numbers.
    On files up to 1.2 KB WebP saved 27%.
    1.2-2.4 KB - 22%
    2.4-4.3 KB - 14%

    By file size I mean the size of a PNG as I got it, mostly unoptimized. Please note that there's a small selection bias in putting files into these 3 buckets: the smaller the file, the higher the likelihood that it's already optimised, and therefore the less there is to gain. But optimized images are rare, so the effect is small. In the first bucket, it's just 3.8%.

    Part of the reason is a very lightweight container: the smallest webpll file that I got takes just 6 bytes (one 32-bit pixel). The same image as an optimized PNG is 70 bytes. But even if I added 64 bytes to each webpll file, the saved percentage would still shrink with size.

    ADDED:
    4.3-8.2 KB - 19.6%, so maybe it's just random variability.
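    (For reference, a sketch of how such a bucketed comparison can be computed; the in/ and out/ directory layout and the .webpll extension are assumptions made for the example.)
    Code:
    # Sketch: total saving of lossless WebP over PNG per PNG-size bucket.
    # Assumes out/<name>.webpll exists for every in/<name>.png; layout is hypothetical.
    import os
    from collections import defaultdict

    buckets = [1200, 2400, 4300, 8200]          # byte thresholds from the post
    totals = defaultdict(lambda: [0, 0])        # threshold -> [png_bytes, webp_bytes]

    for name in os.listdir("in"):
        if not name.endswith(".png"):
            continue
        png = os.path.getsize(os.path.join("in", name))
        webp = os.path.getsize(os.path.join("out", name[:-4] + ".webpll"))
        for t in buckets:
            if png <= t:
                totals[t][0] += png
                totals[t][1] += webp
                break                           # larger files are ignored here

    for t in sorted(totals):
        p, w = totals[t]
        print("PNGs up to %d bytes: saved %.1f%%" % (t, 100.0 * (1 - float(w) / p)))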
    Last edited by m^2; 24th November 2011 at 13:42.

  27. #27
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    There is a bug in webpll: it silently corrupts 64-bit PNGs by converting them to 32-bit ones.
    I can't report it to them because there is no way of contact other than a Google Account, which, fuck you Google, I don't want to have. It would be nice if somebody passed the information on to them.
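    (As a workaround until that's fixed, one can check the bit depth in the PNG's IHDR chunk before feeding files to the converter; a small sketch is below, with a placeholder file name.)
    Code:
    # Sketch: read the bit depth from a PNG's IHDR chunk, so 16-bit-per-channel
    # ("64-bit" RGBA) files can be skipped instead of being silently downsampled.
    import struct

    def png_bit_depth(path):
        with open(path, "rb") as f:
            header = f.read(29)
        if header[:8] != b"\x89PNG\r\n\x1a\n" or header[12:16] != b"IHDR":
            raise ValueError("not a PNG file")
        width, height, bit_depth, colour_type = struct.unpack(">IIBB", header[16:26])
        return bit_depth

    if png_bit_depth("input.png") > 8:          # placeholder path
        print("16 bits per channel: a 32-bit RGBA conversion would lose data")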

  28. #28
    Member
    Join Date
    Nov 2011
    Location
    france
    Posts
    42
    Thanks
    2
    Thanked 27 Times in 19 Posts
    Quote Originally Posted by m^2 View Post
    There is a bug in webpll: it silently corrupts 64-bit PNGs by converting them to 32-bit ones.
    => http://code.google.com/p/webp/issues/detail?id=96

  29. #29
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    thx

    Though I certainly wouldn't call it a feature.
    Lossless means lossless.

  30. #30
    Member m^2's Avatar
    Join Date
    Sep 2008
    Location
    Ślůnsk, PL
    Posts
    1,611
    Thanks
    30
    Thanked 65 Times in 47 Posts
    I tested webpll and some other codecs on web PNG data.
    https://extrememoderate.wordpress.co...ion-benchmark/


