
Originally Posted by SolidComp
Hi all -- Typical smartphone-acquired images are 8–16 megapixels, with some up to 20. Typical display resolutions on smartphones and desktops/laptops are 2–4 megapixels; 4K reaches about 8 megapixels and 5K approaches 15, while most iPhone displays are only about 1 megapixel. (Both 4K and 5K displays are still quite rare; fewer than 10 percent of desktop users have them.)
I don't understand how downscaling actually works on displays. For example, I don't know how a 1080p video stream is scaled to a 720p display, or how a 16 megapixel photo is scaled to a 3.7 megapixel display (1440 × 2560, as on many Android flagships).
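(For a rough mental model of the question above: the simplest downscaler is a box filter, where each output pixel is the average of a block of input pixels. Below is a minimal sketch, assuming a grayscale image stored as a 2D NumPy array and an integer scale factor; real display and GPU pipelines generally use fractional resampling such as bilinear or Lanczos filtering, since e.g. 1080p to 720p is a non-integer 1.5x reduction.)

```python
import numpy as np

def box_downscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Downscale a 2D image by an integer factor: each output pixel is the
    mean of a factor-by-factor block of input pixels (a simple box filter)."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor   # crop so dimensions divide evenly
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Example: a 1920x1080 frame halved in each dimension to 960x540.
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
small = box_downscale(frame.astype(np.float64), 2)
print(small.shape)   # (540, 960)
```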
Questions:
1) Do different image compression codecs behave differently, or show different tendencies, when their images are downscaled?
2) I think all or most of our image comparison tests are on downscaled images. Is this a problem?
3) Would it be helpful for compression codecs to be aware, in some sense, of downscaling and end-user display resolutions when they compress images? Could images be optimized for downscaling, or for the bounds of end-user display resolution?