Thread: new DXTn texture compressor/transcoder library released

  1. #1
    Member
    Join Date
    Aug 2010
    Location
    Seattle, WA
    Posts
    79
    Thanks
    6
    Thanked 67 Times in 27 Posts

    new DXTn texture compressor/transcoder library released

    I've open sourced a new DXTn mipmapped texture compressor and transcoding library:
    http://code.google.com/p/crunch/


    It uses the ZLIB license. It's primarily of interest to game developers, or anyone who needs to distribute lots of DXTn compressed texture data but can't afford to compress to DXTn in real-time. (In many games I've worked on, DXTn texture data was the #1 or #2 consumer of optical disc space, making it a big target for alternative storage schemes.) It basically implements a different way of compressing to DXTn. Most DXTn compressors only work at the 4x4 block level, try to optimize for the best possible quality, and output raw (fixed block size) DXTn bits. This codec operates at a much higher level: on the entire texture (i.e. the whole mip chain) at once.
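    For contrast, here is roughly what the conventional 4x4 block-level approach looks like. This is a minimal sketch, not crunch's code: the endpoint choice (per-channel min/max) is the crudest possible, and a real encoder would derive the palette from the decoded 565 endpoints and search much harder for good ones.

        #include <algorithm>
        #include <climits>
        #include <cstdint>

        // One raw DXT1 block: 8 bytes for a 4x4 tile (4bpp).
        struct Dxt1Block {
            uint16_t color0, color1; // RGB565 endpoints
            uint32_t selectors;      // 2 bits per texel, texel 0 in the LSBs
        };

        static uint16_t pack565(int r, int g, int b) {
            return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
        }

        // Encode one 4x4 tile (RGB, 3 bytes/texel): take per-channel min/max
        // as endpoints, then pick the nearest of the 4 implied palette
        // entries for each texel.
        Dxt1Block encodeBlockNaive(const uint8_t rgb[16][3]) {
            int lo[3] = {255, 255, 255}, hi[3] = {0, 0, 0};
            for (int i = 0; i < 16; i++)
                for (int c = 0; c < 3; c++) {
                    lo[c] = std::min(lo[c], (int)rgb[i][c]);
                    hi[c] = std::max(hi[c], (int)rgb[i][c]);
                }
            Dxt1Block blk;
            blk.color0 = pack565(hi[0], hi[1], hi[2]); // color0 > color1 selects
            blk.color1 = pack565(lo[0], lo[1], lo[2]); // 4-color (opaque) mode
            // (A solid tile ties the endpoints, which falls into 3-color mode;
            // the selector search below still picks 0, so it decodes fine.)
            int pal[4][3];
            for (int c = 0; c < 3; c++) {
                pal[0][c] = hi[c];
                pal[1][c] = lo[c];
                pal[2][c] = (2 * hi[c] + lo[c]) / 3;
                pal[3][c] = (hi[c] + 2 * lo[c]) / 3;
            }
            blk.selectors = 0;
            for (int i = 0; i < 16; i++) {
                int best = 0, bestErr = INT_MAX;
                for (int s = 0; s < 4; s++) {
                    int err = 0;
                    for (int c = 0; c < 3; c++) {
                        int d = pal[s][c] - rgb[i][c];
                        err += d * d;
                    }
                    if (err < bestErr) { bestErr = err; best = s; }
                }
                blk.selectors |= (uint32_t)best << (2 * i);
            }
            return blk;
        }

    Every block is exactly 8 bytes and is encoded independently of its neighbors, which is why a whole-texture codec can find redundancy that block-level encoders leave on the table.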

    Note that DXTn's quality isn't that great to begin with, as Charles Bloom points out here http://cbloomrants.blogspot.com/2008...tc-part-2.html, but the various DXTn formats are supported in hardware by practically all PC/console GPUs and by D3D/OpenGL, so it's useful to have a custom compressed texture format that can be quickly transcoded to DXTn. crunch's DXTn compressor outputs an intermediate format (.CRN) that was designed from the beginning to do just that.

    It can also create standard .DDS texture files that are much more compressible by a lossless compressor such as LZMA (or Deflate, LZO, etc.), trading off quality by reducing the number of unique endpoints/selectors written to the resulting .DDS file. (Basically, it's able to trade off rate vs. distortion when compressing to DXTn.)
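    To give a feel for the "fewer unique endpoints" idea, here is a toy stand-in (not crunch's algorithm, which does proper global clustering of both endpoints and selectors): keep only the N most frequent endpoint pairs in an encoded surface and remap every other block to the nearest kept pair. The result still decodes on any GPU, but contains far more repeated byte patterns for LZMA/Deflate to exploit.

        #include <algorithm>
        #include <climits>
        #include <cstdint>
        #include <map>
        #include <utility>
        #include <vector>

        struct Dxt1Block { uint16_t color0, color1; uint32_t selectors; };

        static void unpack565(uint16_t c, int rgb[3]) {
            rgb[0] = (c >> 11) & 31; rgb[1] = (c >> 5) & 63; rgb[2] = c & 31;
        }

        // Squared distance between two (color0, color1) endpoint pairs, each
        // packed as color0 | (color1 << 16).
        static int pairDist(uint32_t a, uint32_t b) {
            int d = 0, pa[3], pb[3];
            for (int half = 0; half < 2; half++) {
                unpack565((uint16_t)(a >> (16 * half)), pa);
                unpack565((uint16_t)(b >> (16 * half)), pb);
                for (int c = 0; c < 3; c++) d += (pa[c] - pb[c]) * (pa[c] - pb[c]);
            }
            return d;
        }

        // Keep the maxPairs most frequent endpoint pairs; remap every other
        // block to the nearest survivor. Lower maxPairs = lower quality,
        // smaller losslessly-compressed output. (A real codec would also
        // re-derive each block's selectors against its new endpoints.)
        void quantizeEndpoints(std::vector<Dxt1Block>& blocks, size_t maxPairs) {
            std::map<uint32_t, int> freq;
            for (const auto& b : blocks)
                freq[(uint32_t)b.color0 | ((uint32_t)b.color1 << 16)]++;
            std::vector<std::pair<int, uint32_t>> byFreq;
            for (const auto& kv : freq) byFreq.push_back({kv.second, kv.first});
            std::sort(byFreq.rbegin(), byFreq.rend()); // descending frequency
            if (byFreq.size() > maxPairs) byFreq.resize(maxPairs);
            for (auto& b : blocks) {
                uint32_t key = (uint32_t)b.color0 | ((uint32_t)b.color1 << 16);
                uint32_t best = 0; int bestD = INT_MAX;
                for (const auto& kv : byFreq) {
                    int d = pairDist(key, kv.second);
                    if (d < bestD) { bestD = d; best = kv.second; }
                }
                b.color0 = (uint16_t)best;
                b.color1 = (uint16_t)(best >> 16);
            }
        }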

    -Rich

  2. #2
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,475
    Thanks
    26
    Thanked 121 Times in 95 Posts
    What about newer texture compression formats? See here for example: http://coderchameleon.blogspot.com/2...7-texture.html

  3. #3
    Member
    Join Date
    Feb 2010
    Location
    Nordic
    Posts
    200
    Thanks
    41
    Thanked 36 Times in 12 Posts
    +1 thank you rgeldreich!

  4. #4
    Member
    Join Date
    Aug 2010
    Location
    Seattle, WA
    Posts
    79
    Thanks
    6
    Thanked 67 Times in 27 Posts
    The library currently only supports BC1 through BC5. I would like to support the newer BC6H/7 at some point, but ran out of time. BC6H (the HDR format) seems really useful.

    For what it's worth, from a developer point of view I'm not excited by BC7. At 8bpp it's just too big compared to BC1/DXT1 (4bpp). Maybe I could see using it for title/loading screens (I hate DXT1-compressed loading screens!), but for general texturing I would rather have 2x as many DXT1s, or the same number of DXT1s at 2x the resolution on one axis, vs. BC7. Or, if you really need better quality, you can use luma-chroma BC3/DXT5 (with Y in alpha, 8bpp), which can be made to work on all GPUs.
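    The luma-chroma DXT5 trick works because DXT5's alpha block is encoded separately from its color block and with higher precision, so luma goes where the quality is. A sketch of the usual swizzle (the channel assignment follows the common YCoCg-DXT5 layout, Co->R, Cg->G, Y->A; a pixel shader reverses the transform at sample time):

        #include <algorithm>
        #include <cstdint>

        static uint8_t clamp8(int v) { return (uint8_t)std::min(255, std::max(0, v)); }

        // In-place RGBA -> (Co, Cg, unused, Y) swizzle for one texel, done
        // before DXT5 compression. Luma lands in alpha, which DXT5 encodes
        // with its own interpolated 8-value block; the two biased chroma
        // channels ride in the color block.
        void rgbaToYCoCgForDxt5(uint8_t px[4]) {
            int r = px[0], g = px[1], b = px[2];
            int y  = (r + 2 * g + b + 2) / 4;       // luma
            int co = (r - b) / 2 + 128;             // orange chroma, biased
            int cg = (2 * g - r - b + 2) / 4 + 128; // green chroma, biased
            px[0] = clamp8(co); // R <- Co
            px[1] = clamp8(cg); // G <- Cg
            px[2] = 0;          // B unused (some implementations store a scale here)
            px[3] = clamp8(y);  // A <- Y
        }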

    What we really need is a new 4bpp or 2bpp format that breaks away from colorcell-based approaches and uses something more modern. Something like this:
    http://cbloomrants.blogspot.com/2009...edded-dct.html

  5. #5
    Member
    Join Date
    Feb 2010
    Location
    Nordic
    Posts
    200
    Thanks
    41
    Thanked 36 Times in 12 Posts
    You've worked in games, rgeldreich? The textures I've been compressing for models tend to contain absolute buckets of symmetry, typically mirroring (as do the meshes). Is that true of mainstream game data too? Are there any compressors that take advantage of this when it occurs?
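    (For what it's worth, a pipeline could test for this cheaply before compression. A hypothetical pre-pass, not something crunch or any shipping compressor I know of does: measure the left/right mirror error, and if it's near zero, store only half the texture and reconstruct the other half at load time.)

        #include <cstdint>
        #include <cstdlib>

        // Mean absolute per-channel difference between an RGBA image and its
        // horizontal mirror. Values near zero suggest the right half is
        // redundant and could be dropped before compression.
        double mirrorError(const uint8_t* rgba, int w, int h) {
            long long err = 0;
            for (int y = 0; y < h; y++)
                for (int x = 0; x < w / 2; x++)
                    for (int c = 0; c < 4; c++) {
                        int a = rgba[(y * w + x) * 4 + c];
                        int b = rgba[(y * w + (w - 1 - x)) * 4 + c];
                        err += std::abs(a - b);
                    }
            return (double)err / ((double)(w / 2) * h * 4);
        }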

  6. #6
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,475
    Thanks
    26
    Thanked 121 Times in 95 Posts
    I think that once you use something as complicated as DCT, it starts to be feasible to use variable-length block encoding. We could use ordinary JPEGs almost unmodified (of course we'd have to change the DCs from delta-coded to independently coded, but that's not a big deal) and then build a table with a pointer to each macroblock. With such an approach the caches would have to hold the Huffman tables, block pointers, and compressed bitstream at the same time, instead of just fixed-size compressed blocks, but I think that's doable, and there are plenty of well-optimized JPEG coders. New/upcoming-generation GPGPUs are said to include memory paging/address translation and so on, so maybe that will become standard in the next major DirectX or OpenGL version.
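    Baseline JPEG actually has a hook for exactly this: restart markers reset the DC predictors, so every restart interval is independently decodable (the "independently coded DCs" above), and a one-pass scan of the entropy-coded data yields the pointer table. A minimal sketch, assuming scan points at the data following the SOS header:

        #include <cstddef>
        #include <cstdint>
        #include <vector>

        // Build a random-access index of restart intervals in a JPEG scan.
        // Restart markers 0xFFD0..0xFFD7 reset the DC predictors, so each
        // interval can be entropy-decoded on its own. (0xFF followed by 0x00
        // is byte stuffing, which the range check below ignores.)
        std::vector<size_t> buildRestartIndex(const uint8_t* scan, size_t len) {
            std::vector<size_t> index;
            index.push_back(0); // first interval starts at the top of the scan
            for (size_t i = 0; i + 1 < len; i++)
                if (scan[i] == 0xFF && scan[i + 1] >= 0xD0 && scan[i + 1] <= 0xD7)
                    index.push_back(i + 2); // next interval starts after the marker
            return index;
        }

    With the restart interval set to one MCU (or one MCU row), index[i] is the byte offset of macroblock group i, which is the pointer table described above.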

