
Thread: (Probably) New Image Compression Algorithm written in Rust

  1. #1
    Member
    Join Date
    Jan 2021
    Location
    Germany
    Posts
    3
    Thanks
    0
    Thanked 0 Times in 0 Posts

    (Probably) New Image Compression Algorithm written in Rust

    EDIT: The explanation below is a bit outdated. For the newer and more complete version, please consider visiting the GitHub repository; it's quite hard to keep this post in sync with the changes.
    EDIT: Due to the latest switch from deflate in favor of Zlib, the algorithm manages to surpass PNG in some cases.


    Hello,
    I'm a newbie to compression and this forum, so please be patient.
    I recently wrote an image compression algorithm and put a prototype on GitHub.

    I've written a long explanation there, but for convenience, here it is:

    Image Compression Algorithm

    A new image compression algorithm.
    In this release, the algorithm performs worse than PNG in most cases. In fact, the only image where it outperforms PNG is the white void of img_3.png. However, the algorithm produces files only slightly larger than PNG's: for example, img_2.png is about 12.8 MB, and the resulting binary is 12.9 MB.

    How the system works

    Clustering
    The first step in the system is clustering the pixels. This happens in 5 dimensions, using the R, G, B, x and y values of every pixel. The x and y values are normalized to the 0-255 range in order to balance the pixel position against the color values. This normalization is a possible point for improvement.
    With the current settings, k-means is used to find 3 dominant clusters. More clusters are possible, but the computation time increases rapidly with the number of clusters. The encoding supports up to 255 clusters, but that is probably overkill.
    After defining the clusters, we compute a cluster map, which discards the color values and records only which cluster each pixel belongs to. A visualization of this would look like this: [cluster map image]
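    To make this step concrete, here is a minimal sketch (my own illustration, not the repository's code) of the 5-dimensional feature construction and the nearest-centroid assignment that produces the cluster map. It assumes row-major pixel order and an image larger than one pixel per side:

    Code:
    /// Build one [r, g, b, x', y'] feature per pixel, with x' and y'
    /// scaled into 0..=255 so position and color carry comparable weight.
    fn features(rgb: &[[u8; 3]], width: usize, height: usize) -> Vec<[f32; 5]> {
        let (w, h) = (width as f32, height as f32);
        rgb.iter()
            .enumerate()
            .map(|(i, &[r, g, b])| {
                let (x, y) = ((i % width) as f32, (i / width) as f32);
                [r as f32, g as f32, b as f32,
                 x / (w - 1.0) * 255.0,  // x normalized into 0..=255
                 y / (h - 1.0) * 255.0]  // y normalized into 0..=255
            })
            .collect()
    }

    /// One k-means assignment step: map every pixel to its nearest
    /// centroid, yielding the cluster map (cluster indices, no colors).
    fn assign(features: &[[f32; 5]], centroids: &[[f32; 5]]) -> Vec<u8> {
        features
            .iter()
            .map(|f| {
                let dist = |c: &[f32; 5]| -> f32 {
                    f.iter().zip(c).map(|(p, q)| (p - q) * (p - q)).sum()
                };
                (0..centroids.len())
                    .min_by(|&a, &b| {
                        dist(&centroids[a]).partial_cmp(&dist(&centroids[b])).unwrap()
                    })
                    .unwrap() as u8
            })
            .collect()
    }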


    Grid

    In the next step we lay a grid on top of the cluster map. The grid's chunks are not of fixed size; they vary in size near the edges. For every chunk, we check whether all pixels in it belong to the same cluster. If so, the chunk is encoded relatively, otherwise absolutely. For every chunk, the grid stores a value that either identifies the cluster or marks the chunk as one that has to be encoded absolutely. Here is an illustration of this grid map, in which every white pixel symbolizes an absolute chunk: [grid map image]
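    A sketch of that uniformity test, under two assumptions of mine (a nominal chunk edge of 8 pixels, shrinking at the right and bottom edges, and 255 as the "encode absolutely" marker):

    Code:
    const CHUNK: usize = 8;   // assumed nominal chunk edge length
    const ABSOLUTE: u8 = 255; // assumed marker for "encode this chunk absolutely"

    /// For each chunk, store its cluster index if all of its pixels agree,
    /// or ABSOLUTE if the chunk mixes clusters. Edge chunks shrink to fit.
    fn grid_map(cluster_map: &[u8], width: usize, height: usize) -> Vec<u8> {
        let mut grid = Vec::new();
        for cy in (0..height).step_by(CHUNK) {
            for cx in (0..width).step_by(CHUNK) {
                let first = cluster_map[cy * width + cx];
                let uniform = (cy..(cy + CHUNK).min(height)).all(|y| {
                    (cx..(cx + CHUNK).min(width))
                        .all(|x| cluster_map[y * width + x] == first)
                });
                grid.push(if uniform { first } else { ABSOLUTE });
            }
        }
        grid
    }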


    Calculating Lists

    In this step we finally calculate the pixel values that are later written into the file. Every chunk is encoded as relative or absolute, according to the grid. Each chunk's pixel values are appended to a super list of relative or absolute pixel values. The pixels are traversed in wiggly lines. Every relative chunk has a minimum pixel value, taken from the minimum R, G and B values in that chunk. The original pixel value is the sum of this chunk minimum and the encoded pixel value.
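    Here is how I read this step (a sketch, not the repository's code); the traversal is simplified to plain scan order instead of the wiggly lines, and all names are illustrative:

    Code:
    /// Append one chunk's pixels to the relative or absolute super list.
    /// A relative chunk stores its per-channel minima followed by per-pixel
    /// offsets, so the decoder recovers pixel = minimum + offset.
    fn encode_chunk(
        pixels: &[[u8; 3]], // one chunk's pixels, in (simplified) scan order
        is_relative: bool,
        relative: &mut Vec<u8>,
        absolute: &mut Vec<u8>,
    ) {
        if is_relative {
            // Per-channel minimum over the chunk (the "chunk value").
            let mut min = [u8::MAX; 3];
            for p in pixels {
                for c in 0..3 {
                    min[c] = min[c].min(p[c]);
                }
            }
            relative.extend_from_slice(&min);
            // Offsets relative to the minimum; small values compress well.
            for p in pixels {
                for c in 0..3 {
                    relative.push(p[c] - min[c]);
                }
            }
        } else {
            // Mixed chunk: raw pixel values go into the absolute list.
            for p in pixels {
                absolute.extend_from_slice(p);
            }
        }
    }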

    Flatten and Byte conversion

    The grid, the cluster colors and the lines are converted into vectors of u8 and then into bytes.

    Zstd

    The byte representations of the grid and the lines are compressed with the Zstd algorithm. This is where the actual compression happens, and it leaves room for optimization.
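    A minimal sketch of this step, assuming the zstd crate (the compression level here is arbitrary):

    Code:
    /// Compress one section's bytes with Zstd.
    fn compress(bytes: &[u8]) -> std::io::Result<Vec<u8>> {
        zstd::encode_all(bytes, 3) // level 3; pick any level for the sketch
    }

    /// Inverse of `compress`, used by the decoder.
    fn decompress(bytes: &[u8]) -> std::io::Result<Vec<u8>> {
        zstd::decode_all(bytes)
    }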

    Write File

    The resulting binary is just a list of the relevant compressed objects.
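    One plausible container layout (an assumption of mine; the repository defines the real format): write each compressed object behind a u32 length prefix, so individual sections, and thus specific chunks, can be located without reading the whole file:

    Code:
    use std::io::{self, Write};

    /// Write the sections (e.g. grid, cluster colors, pixel lists) one
    /// after another, each prefixed with its length as a little-endian u32.
    fn write_sections<W: Write>(mut out: W, sections: &[Vec<u8>]) -> io::Result<()> {
        for section in sections {
            out.write_all(&(section.len() as u32).to_le_bytes())?;
            out.write_all(section)?;
        }
        Ok(())
    }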

    Advantages compared to PNG

    Because of the grid, it's possible to load specific chunks without loading the entire image. With further improvements it might be possible to surpass PNG's compression ratio, but I can't prove that.

    Disadvantages compared to PNG
    Because of the clustering, it takes quite a long time to compute a result. It might be possible to improve that, although it would probably require replacing k-means with another clustering algorithm. One solution could be a neural net.


    Help and ideas are much appreciated, especially contributions on GitHub.
    Thanks for your time!
    Last edited by umgefahren; 20th January 2021 at 13:59.

  2. #2
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    4,138
    Thanks
    321
    Thanked 1,402 Times in 804 Posts
    Welcome!
    Can you post some test results vs other lossless codecs?
    http://imagecompression.info/gralic/LPCB.html

  3. #3
    Member
    Join Date
    Jan 2021
    Location
    Germany
    Posts
    3
    Thanks
    0
    Thanked 0 Times in 0 Posts
    I would love to. Give me some time.
    Last edited by umgefahren; 20th January 2021 at 10:51.

  4. #4
    Member
    Join Date
    Jan 2021
    Location
    Germany
    Posts
    3
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Detailed Test Results

    I have my results!
    The whole folder with all the compressed images is 305,598,693 bytes in size. It took about 405.28 seconds to compress the images and about 9.86 seconds to decompress them. I used this image set: RGB 8 bit
    My compression ratio over the whole image set is about 1.54 (uncompressed size divided by compressed size).
    The compression ratios of the individual images are:

    Code:
    | File Name              | Compression Ratio  |
    |------------------------|--------------------|
    | spider_web.ppm         | 2.14235671557331   |
    | deer.ppm               | 1.2424318516015507 |
    | fireworks.ppm          | 3.642381743674327  |
    | artificial.ppm         | 12.25476523000428  |
    | bridge.ppm             | 1.2273064711294759 |
    | flower_foveon.ppm      | 2.4469685311217293 |
    | big_tree.ppm           | 1.2789847127858722 |
    | cathedral.ppm          | 1.5089509013690656 |
    | hdr.ppm                | 1.9960575653205344 |
    | leaves_iso_1600.ppm    | 1.203903570936856  |
    | big_building.ppm       | 1.3922857035699863 |
    | nightshot_iso_1600.ppm | 1.501047996887146  |
    | nightshot_iso_100.ppm  | 2.251600481220427  |
    | leaves_iso_200.ppm     | 1.3158267828823695 |


    The table function seems broken; I hope this is fine.


  5. #5
    Member
    Join Date
    Jan 2014
    Location
    USA
    Posts
    14
    Thanks
    13
    Thanked 4 Times in 3 Posts
    New approaches to compression and image processing are exciting and invigorating! I hope they stimulate others who have image compression specialization to consider new perspectives as well!

    Watching this method evolve will be a lot of fun! Thanks for introducing us to your Rust image compression library!

  6. #6
    Member
    Join Date
    Jun 2015
    Location
    Switzerland
    Posts
    976
    Thanks
    266
    Thanked 350 Times in 221 Posts
    Quote Originally Posted by umgefahren View Post
    The first step in the system is clustering the pixels. This happens in 5 dimensions, using the R, G, B, x and y values of every pixel. The x and y values are normalized to the 0-255 range in order to balance the pixel position against the color values.
    Funnily enough, this was a starting point for Brunsli. In the first six weeks of Brunsli development (back in 2014) we migrated away from clustering, because context modeling (and even prediction) was stronger for our JPEG recompression corpus.

    That doesn't mean it couldn't be more effective now in another domain. It's just that, back then, Zoltan and I were not able to get fantastic results with this approach for JPEG recompression modeling. We got about -14 % of JPEG with this kind of idea (k-means in five dimensions to drive the entropy coding) and -22 % with the ideas expressed in Brunsli and JPEG XL.

    In WebP lossless we sometimes have 700+ entropy codes for different ergodic processes, so having 3 to 255 is not necessarily excessive.

