What's wrong with lossless texture compression?

Anarchist420

Diamond Member
Feb 13, 2010
The new DX11 compression formats for FP32 textures are 8:1. That's way too high, IMO.

I don't know why no one has ever developed a lossless TC algorithm (other than for memory footprint).

I know that the ratios for a lot of textures would barely approach a full 2:1, and developers wouldn't be able to use textures nearly as large, but at least there would be no artifacts and no blotchiness.

Any thoughts on this?

I've always been a proponent of lossless compression, whether for video, audio, textures, or whatever, so maybe I'm biased. I guess I'm more sensitive to artifacts than most people.

Some lossy algorithms are better than others (VQ is better than DXT), but lossless is the way to go, IMO.
 

BFG10K

Lifer
Aug 14, 2000
I’d expect game developers already ship textures with lossless compression (e.g. PNG, TGA), so more lossless compression would achieve little to nothing.
 

VirtualLarry

No Lifer
Aug 25, 2001
You would think that with the high-end video decoding hardware on GPUs these days, they could use that hardware for texture decompression. Imagine textures stored in MPEG-4/H.264 frame format or something akin to that.
 

Skurge

Diamond Member
Aug 17, 2009
You would think that with the high-end video decoding hardware on GPUs these days, they could use that hardware for texture decompression. Imagine textures stored in MPEG-4/H.264 frame format or something akin to that.

Doesn't Civ5 do something like that with DirectCompute?
 

nenforcer

Golden Member
Aug 26, 2008
I think the decompression time of lossless textures may not be acceptable in a real-time game unless there is plenty of VRAM available. This is probably one reason why I can't play StarCraft 2 with Ultra settings on my 512MB 9800GTs.

There are other options available, including file formats which map very nicely to GPUs:

OpenEXR
http://www.openexr.com

This is Industrial Light & Magic's lossless format, used for cinema rendering.

I don't know if this format has ever been used in a real time game, however.
 

BFG10K

Lifer
Aug 14, 2000
No, I'm talking about in HW. DXT isn't a lossless format.
I don’t think you understand what I’m saying. The GPU is given textures that have already been losslessly compressed at the source.

What you’re asking for is akin to taking a zip file and zipping it again; you’ll gain practically nothing by doing this.
 

taltamir

Lifer
Mar 21, 2004
What you’re asking for is akin to taking a zip file and zipping it again; you’ll gain practically nothing by doing this.

You would actually lose. I tested this extensively back in the day; different files compress better with RAR, ZIP, or ACE (and with different compression programs).
While you could gain a little more space by taking a poorly compressed file and compressing it again in another program, you will get better results by simply compressing the original with a better program at a higher setting (i.e., the maximum compression setting).

If you take the best-compressed file and start compressing it in other programs, you will get a file that is LARGER than the original (no further compression, plus more overhead), AND you waste more CPU time decompressing it. It is a pure loss with no benefits.
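To see this for yourself, here's a rough sketch (assuming zlib is installed; the sample buffer is just made-up data standing in for an ordinary file): it DEFLATEs a buffer once at the maximum setting, then DEFLATEs the result again, so you can watch the second pass gain essentially nothing.

```cpp
// Sketch: "zipping a zip". Compress a buffer once at zlib's maximum
// setting, then compress the output again and compare sizes.
#include <zlib.h>
#include <cstdio>
#include <vector>

static std::vector<unsigned char> deflate_once(const std::vector<unsigned char>& in) {
    uLongf out_len = compressBound(static_cast<uLong>(in.size()));
    std::vector<unsigned char> out(out_len);
    if (compress2(out.data(), &out_len, in.data(),
                  static_cast<uLong>(in.size()), 9) != Z_OK)  // 9 = max compression
        out_len = 0;  // keep the sketch simple: treat failure as empty output
    out.resize(out_len);
    return out;
}

int main() {
    // Moderately redundant sample data standing in for a typical file.
    std::vector<unsigned char> data(1 << 20);
    for (size_t i = 0; i < data.size(); ++i)
        data[i] = static_cast<unsigned char>((i * 31) & 0xFF);

    std::vector<unsigned char> once  = deflate_once(data);
    std::vector<unsigned char> twice = deflate_once(once);   // second pass: no real gain

    std::printf("original:         %zu bytes\n", data.size());
    std::printf("compressed once:  %zu bytes\n", once.size());
    std::printf("compressed twice: %zu bytes\n", twice.size());
    return 0;
}
```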

However, I don't believe that is what the OP is advocating. I think he is suggesting the use of lossless compression over lossy compression, i.e., use PNG rather than JPEG, use FLAC rather than MP3, etc.
 

Cogman

Lifer
Sep 19, 2000
I don’t think you understand what I’m saying. The GPU is given textures that have already been losslessly compressed at the source.

What you’re asking for is akin to taking a zip file and zipping it again; you’ll gain practically nothing by doing this.

Well, sort of. AFAIK, the GPU handles only raw bitmaps (no compression). You can store your images in whatever format you want. DirectX has the tools to decompress some of its own formats, but ultimately you don't HAVE to use one of the built-in formats. So long as you can decompress whatever it is you want to use, you can use it.
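To make that concrete, here's a minimal sketch of handing D3D11 a bitmap that has already been decoded on the CPU from whatever offline format you like (PNG, TGA, your own container). The names device, pixels, width and height are placeholders for your own setup and image-loading code; the GPU only ever sees raw RGBA8 texels.

```cpp
// Minimal sketch: create an uncompressed RGBA8 texture from CPU-side pixels.
// How those pixels were stored on disk is entirely up to the application.
#include <d3d11.h>

ID3D11Texture2D* CreateRGBA8Texture(ID3D11Device* device,
                                    const unsigned char* pixels,
                                    UINT width, UINT height)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;                           // no mip chain, for brevity
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;  // plain uncompressed texels
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem     = pixels;
    init.SysMemPitch = width * 4;   // 4 bytes per RGBA8 texel

    ID3D11Texture2D* tex = nullptr;
    device->CreateTexture2D(&desc, &init, &tex);  // real code should check the HRESULT
    return tex;
}
```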

That being said, there is no reason TO use lossless data compression in games. What are you trying to preserve? Do you really think that texture is going to be compressed again and again?

Lossy compression offers much higher compression ratios and lets developers/artists decide when something is compressed enough. Thus, they are able to ship much smaller game files (a big plus).

People unfairly give lossy compression the shaft. There is an excellent reason it is used: it works so well. The only legitimate complaint against lossy compression is when someone wants the media to be recompressed into different formats later. That's not a concern for games.
 

nenforcer

Golden Member
Aug 26, 2008
People unfairly give lossy compression the shaft. There is an excellent reason it is used: it works so well. The only legitimate complaint against lossy compression is when someone wants the media to be recompressed into different formats later. That's not a concern for games.

This. I think the performance improvements (even 2:1 is huge) clearly outweigh any image quality losses and most gamers couldn't tell the difference.

I read that Far Cry uses OpenEXR, and maybe the newer Splinter Cell games do too.

I don't know if they use the lossless modes of OpenEXR, but I doubt it.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Well, sort of. AFAIK, the GPU handles only raw bitmaps (no compression). You can store your images in whatever format you want. DirectX has the tools to decompress some of its own formats, but ultimately you don't HAVE to use one of the built-in formats. So long as you can decompress whatever it is you want to use, you can use it.
Internally the GPU ultimately needs to decompress any given texture to use it in a rendering operation. However D3D's texture compression formats (BC1-BC7) are extremely cheap to decompress, so compressed textures are always held in compressed form in the GPU's RAM and simply decompressed as an extra stage in the rendering process*. A good texel cache would further offset any overhead, as texels are cached uncompressed.
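For a feel of why that decompression stage is so cheap, here's a rough software sketch of decoding one BC1/DXT1 block: two RGB565 endpoints, two colors interpolated from them, and a 2-bit selector per texel. This is purely illustrative (the spec's exact rounding rules are glossed over); the hardware does the equivalent per texture fetch.

```cpp
// Rough sketch of decoding a single 4x4 BC1 (DXT1) block: 8 bytes in,
// 16 RGBA8 texels out. Rounding details of the real spec are ignored.
#include <cstdint>

struct RGBA8 { uint8_t r, g, b, a; };

static RGBA8 Expand565(uint16_t c) {
    // Expand 5:6:5 to 8:8:8 by bit replication.
    uint8_t r = (c >> 11) & 0x1F, g = (c >> 5) & 0x3F, b = c & 0x1F;
    return { uint8_t((r << 3) | (r >> 2)),
             uint8_t((g << 2) | (g >> 4)),
             uint8_t((b << 3) | (b >> 2)), 255 };
}

void DecodeBC1Block(const uint8_t block[8], RGBA8 out[16]) {
    uint16_t c0 = uint16_t(block[0] | (block[1] << 8));   // endpoint colors, little-endian
    uint16_t c1 = uint16_t(block[2] | (block[3] << 8));
    RGBA8 p[4] = { Expand565(c0), Expand565(c1) };
    if (c0 > c1) {   // four-color mode: two interpolated colors
        p[2] = { uint8_t((2*p[0].r + p[1].r) / 3), uint8_t((2*p[0].g + p[1].g) / 3),
                 uint8_t((2*p[0].b + p[1].b) / 3), 255 };
        p[3] = { uint8_t((p[0].r + 2*p[1].r) / 3), uint8_t((p[0].g + 2*p[1].g) / 3),
                 uint8_t((p[0].b + 2*p[1].b) / 3), 255 };
    } else {         // three-color mode plus transparent black
        p[2] = { uint8_t((p[0].r + p[1].r) / 2), uint8_t((p[0].g + p[1].g) / 2),
                 uint8_t((p[0].b + p[1].b) / 2), 255 };
        p[3] = { 0, 0, 0, 0 };
    }
    for (int i = 0; i < 16; ++i) {
        int idx = (block[4 + i / 4] >> ((i % 4) * 2)) & 0x3;  // 2-bit selector per texel
        out[i] = p[idx];
    }
}
```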

Otherwise you're right. For the purposes of offline storage of a texture, the texture can be in any format. The application need only be able to decompress it so that it can feed it to the GPU in an uncompressed format.

For anyone curious, for D3D11 here's the master format chart. It includes shader, render target, and texture formats, so not everything here is a texture format (but most things are). The important bit is that anything with an X is required to be supported in hardware; all 7 BC formats are among those required.

Microsoft's own recommendations are to use compressed texture formats in all situations where you aren't directly mapped to the screen. "While lossy, block compression works well and is recommended for all textures that get transformed and filtered by the pipeline. Textures that are directly mapped to the screen (UI elements like icons and text) are not good choices for compression because artifacts are more noticeable." Thus as far as the original question is concerned, it's safe to say Microsoft doesn't consider a native lossless compression format to be necessary. After all, directly mapped textures are used extremely rarely and for everything else they want you to use lossy compression.

* Side note: the fact that it's decompressed on the fly is what led to NV2x's terrible IQ with S3TC textures.
 

JimmiG

Platinum Member
Feb 24, 2005
I remember texture compression looking really bad in Quake 3, but I don't remember ever noticing the artifacts in any other games. Unreal Tournament looked awesome on my Voodoo5 with the S3TC patch :) Since then texture compression has really become the norm, and I don't mind. I'd rather have a 2048x2048 texture with lossy compression than a 512x512 one with lossless.
 

Ross Ridge

Senior member
Dec 21, 2009
Internally the GPU ultimately needs to decompress any given texture to use it in a rendering operation. However D3D's texture compression formats (BC1-BC7) are extremely cheap to decompress, so compressed textures are always held in compressed form in the GPU's RAM and simply decompressed as an extra stage in the rendering process*. A good texel cache would further offset any overhead, as texels are cached uncompressed.

Yeah, it would be way too expensive in terms of die space and computation speed to implement any sort of lossless compression algorithm on the GPU. Lossless compression algorithms also don't get good compression ratios; even 2:1 is very unlikely for the continuous-tone images that make up most textures. You won't be saving much video memory, and the extra work would hurt performance even more; you'd end up getting better performance just using uncompressed textures.
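If anyone wants to sanity-check that ratio claim on their own data, here's a small harness (assuming zlib): it DEFLATEs a raw, uncompressed texel dump (e.g. RGBA8 pixels exported from an image tool) at maximum effort and prints the ratio. No numbers claimed here; the point is just that it's easy to measure.

```cpp
// Sanity-check harness: DEFLATE a raw texel dump and report the ratio.
#include <zlib.h>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s raw_texel_file\n", argv[0]); return 1; }

    std::FILE* f = std::fopen(argv[1], "rb");
    if (!f) { std::perror("fopen"); return 1; }
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    std::vector<unsigned char> raw(size > 0 ? size : 0);
    if (std::fread(raw.data(), 1, raw.size(), f) != raw.size()) { std::fclose(f); return 1; }
    std::fclose(f);

    uLongf packed = compressBound(static_cast<uLong>(raw.size()));
    std::vector<unsigned char> out(packed);
    compress2(out.data(), &packed, raw.data(), static_cast<uLong>(raw.size()), 9); // max effort

    std::printf("raw: %zu bytes, deflated: %lu bytes, ratio: %.2f:1\n",
                raw.size(), static_cast<unsigned long>(packed),
                double(raw.size()) / double(packed));
    return 0;
}
```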
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Yeah, it would be way too expensive in terms of die space and computation speed to implement any sort of lossless compression algorithm on the GPU. Lossless compression algorithms also don't get good compression ratios; even 2:1 is very unlikely for the continuous-tone images that make up most textures. You won't be saving much video memory, and the extra work would hurt performance even more; you'd end up getting better performance just using uncompressed textures.
Does anyone have any real data backing up the assertion that it would take too many resources to do lossless texture decompression? It doesn't seem like LZW/LZMA/DEFLATE would be used due to dictionary size, so it would have to be something quite different, which would make it hard to compare.
 

Throckmorton

Lifer
Aug 23, 2007
Can you store your textures in the lossy D3D formats in the first place (instead of TGA/PNG/BMP) so that they don't get converted twice and take up less space?
 

Mark R

Diamond Member
Oct 9, 1999
Does anyone have any real data backing up the assertion that it would take too many resources to do lossless texture decompression? It doesn't seem like LZW/LZMA/DEFLATE would be used due to dictionary size, so it would have to be something quite different, which would make it hard to compare.

I think the issue is that compression ratios with lossless compression are very poor compared to lossy compression. For example, lossless JPEG (which is quite a sophisticated lossless image compression algorithm, working on spatial differencing within image tiles, which is more sophisticated than what PNG offers) gives only a 10-20% file size saving for photographic images, whereas lossy JPEG can offer 60-80% savings for a minimal decrease in image quality. Computationally, lossless JPEG is considerably simpler, but it simply doesn't offer a useful compression ratio. As a result, lossless JPEG has been lost in the mists of time (not only is it unsupported by any modern software, I can't even find source code, or even a pseudo-code explanation of how it works).
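As a rough illustration of that spatial-differencing idea, here's a sketch of the predictor stage for one 8-bit channel, in the spirit of one of lossless JPEG's predictors (left + above - upper-left): each pixel is predicted from already-seen neighbours, and only the residual would go on to the entropy coder. On photographic content the residuals shrink, but not dramatically.

```cpp
// Sketch of predictive differencing for lossless image coding: predict each
// pixel from decoded neighbours, keep only the residual for entropy coding.
#include <cstdint>
#include <vector>

std::vector<int16_t> PredictResiduals(const std::vector<uint8_t>& img,
                                      int width, int height) {
    std::vector<int16_t> residual(img.size());
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            int left  = (x > 0)          ? img[y * width + (x - 1)]       : 0;
            int above = (y > 0)          ? img[(y - 1) * width + x]       : 0;
            int ul    = (x > 0 && y > 0) ? img[(y - 1) * width + (x - 1)] : 0;
            int pred  = left + above - ul;  // simple planar predictor
            residual[y * width + x] = int16_t(img[y * width + x] - pred);
        }
    }
    return residual;  // these would be fed to a Huffman/arithmetic coder
}
```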

So with lossy compression there may be significant performance benefits, due to less texture swapping and reduced VRAM bandwidth consumption, which more than outweigh the computational cost of decompression.

With lossless compression, the reduced compression ratio may not be favorable enough to make it worth the effort.
 

Cerb

Elite Member
Aug 26, 2000
Does anyone have any real data backing up the assertion that it would take too many resources to do lossless texture decompression? It doesn't seem like LZW/LZMA/DEFLATE would be used due to dictionary size, so it would have to be something quite different, which would make it hard to compare.
The problem is one of unknowns. With both lossy and lossless, you can have an unknown amount of memory needed to decompress, an unknown amount of time to decompress, and an unknown amount of space that the compressed input data will take up (or an unknown output size, but for textures or other imagery that would not make sense). It's less that such data couldn't exist than that it's not generally going to be seen as a worthwhile effort--at least not for gaming hardware. If you can only save a little bit of space anyway, why even bother compressing it? It will just waste GPU resources.

On top of that, if you fix the decompression complexity, you guarantee that you will not get as much compression. So now, instead of saving 30% of your space, you save 15%. Wow. Meanwhile, lossy formats are giving you 2x, 4x, 8x, etc. the pixels to work with in any given amount of video memory. Even with 1-2GB being common, video memory is not free.

It's much simpler to know how much space the input data will take, know how much space will be needed to do the decompression, and have a minimum of complexity in decompressing it (part of which is making the complexity a fixed value for any given chunk of compressed data). Sure, you end up with a single image that looks worse than the lossless version, but you can have many times the pixels of that lossless image to work with, and do so with less time and power consumed (it's basically free on modern GPUs). The minimal complexity results in basically no impact on performance, and the larger data set allows more detailed textures, which then give more samples per output pixel for a given amount of memory space and bandwidth consumed, letting the game devs deliver better IQ than with no compression or with lossless.
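A tiny sketch of that last point, as it applies to the BC formats: because BC1 always maps a 4x4 texel block to exactly 8 bytes, the block holding any texel sits at an address you can compute with plain arithmetic, so a texture fetch can jump straight to it. A variable-rate lossless stream has no such formula; you'd need an index, or you'd have to decompress everything before the texel you want. (The constant and function names here are just illustrative.)

```cpp
// Why a fixed compression ratio matters to the hardware: constant-time
// addressing of the block that holds any given texel.
#include <cstddef>
#include <cstdint>

// Byte offset of the BC1 block containing texel (x, y), for a texture
// 'width' texels wide (width assumed to be a multiple of 4 here).
size_t BC1BlockOffset(uint32_t x, uint32_t y, uint32_t width) {
    const size_t kBytesPerBlock = 8;               // fixed for BC1 (16 for BC3/BC7)
    size_t blocksPerRow = width / 4;
    size_t blockIndex   = (y / 4) * blocksPerRow + (x / 4);
    return blockIndex * kBytesPerBlock;            // no index table, no prior decode
}
```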

If you are seeing artifacts from compression, the texture is too small and/or has gone from lossy to lossy (nothing like plain old JPEGs to work from... ugh!). If you are seeing blotchiness, it's probably undersampled by a great deal. If you are seeing aliasing with AA on, the shaders are probably written in a way that leaves them undersampled (not necessarily something texture compression can fix, and a common IQ problem in console ports). If you have many texture pixels to sample for each pixel on your display, you are far better off than having one pixel to sample per display pixel, and even that is better than having a single pixel fuzzed out across tens of pixels of your display.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Can you store your textures in the lossy D3D formats in the first place (instead of TGA/PNG/BMP) so that they don't get converted twice and take up less space?
Yes, in fact it's the ideal method for handling texture compression. It means each texture can be matched up with the best compression format ahead of time and specifically compressed for the best quality, whereas runtime compression requires quick & dirty guessing. To be honest I can't think of any instances where you'd want to use runtime compression these days - it was mostly used in the early days before compressed textures became widely used, particularly since S3 outfitted the Savage line with less RAM believing they could use S3TC to cut hardware costs.

As for when pre-compressed textures became mainstream, I'd have to go back and double-check the DX documentation. I believe full hardware support for BC1-BC3 (S3TC) has been required since DX8, while BC4/5 are required for DX10, and BC6/7 for DX11. The ability goes back farther than that though; one of the earliest games to ship with pre-compressed textures was UT99, which shipped them on a second CD at the behest of S3 as a perk for Savage owners.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
The problem is one of unknowns. With both lossy and lossless, you can have an unknown amount of memory needed to decompress, an unknown amount of time to decompress, and an unknown amount of space that the compressed input data will take up (or an unknown output size, but for textures or other imagery that would not make sense). It's less that such data couldn't exist than that it's not generally going to be seen as a worthwhile effort--at least not for gaming hardware. If you can only save a little bit of space anyway, why even bother compressing it? It will just waste GPU resources.

On top of that, if you fix the decompression complexity, you guarantee that you will not get as much compression. So now, instead of saving 30% of your space, you save 15%. Wow. Meanwhile, lossy formats are giving you 2x, 4x, 8x, etc. the pixels to work with in any given amount of video memory. Even with 1-2GB being common, video memory is not free.

It's much simpler to know how much space the input data will take, know how much space will be needed to do the decompression, and have a minimum of complexity in decompressing it (part of which is making the complexity a fixed value for any given chunk of compressed data). Sure, you end up with a single image that looks worse than the lossless version, but you can have many times the pixels of that lossless image to work with, and do so with less time and power consumed (it's basically free on modern GPUs). The minimal complexity results in basically no impact on performance, and the larger data set allows more detailed textures, which then give more samples per output pixel for a given amount of memory space and bandwidth consumed, letting the game devs deliver better IQ than with no compression or with lossless.
Ahh, now there's a very good point I never even thought of. All of the BC formats have a fixed compression ratio. That would certainly make lossless compression difficult at best.

And actually, one more point.

The new DX11 compression formats for FP32 textures are 8:1. That's way too high, IMO.
Anarchist, where are you seeing 8:1 compression? BC4/5 are for normal maps & such, BC6 is for HDR textures and is only 6:1, and BC7 is for LDR textures and is 3:1 for RGB and 4:1 for RGBA. The only thing that compresses 8:1 is BC1, and it's old-school S3TC which means it only takes integer textures, so that's 8:1 for an R8G8B8A8 texture.
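For anyone who wants the arithmetic behind those ratios: every BC format stores a 4x4 block in either 8 or 16 bytes, so the ratio is just the source bits per texel divided by the compressed bits per texel. A quick sketch:

```cpp
// Worked numbers: compression ratio = source bpp / (block bytes * 8 / 16 texels).
#include <cstdio>

int main() {
    struct { const char* name; int srcBitsPerTexel; int blockBytes; } fmt[] = {
        { "BC1  vs R8G8B8A8 (32bpp)", 32,  8 },  // 4 bpp compressed -> 8:1
        { "BC6H vs FP16 RGB (48bpp)", 48, 16 },  // 8 bpp compressed -> 6:1
        { "BC7  vs R8G8B8   (24bpp)", 24, 16 },  // 8 bpp compressed -> 3:1
        { "BC7  vs R8G8B8A8 (32bpp)", 32, 16 },  // 8 bpp compressed -> 4:1
    };
    for (const auto& f : fmt) {
        double compressedBpp = f.blockBytes * 8.0 / 16.0;  // 16 texels per block
        std::printf("%-26s %.0f:1\n", f.name, f.srcBitsPerTexel / compressedBpp);
    }
    return 0;
}
```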
 

Ross Ridge

Senior member
Dec 21, 2009
I believe full hardware support for BC1-BC3 (S3TC) has been required since DX8, while BC4/5 are required for DX10, and BC6/7 for DX11.

It wasn't until DirectX 10 that Microsoft imposed any minimum standards on video cards. Before that, every format was optional. Direct3D 9 was designed to work with any video card imaginable. The original Voodoo 1 could've gotten a WHQL-certified, DirectX 9.0c-compatible driver if 3dfx still existed and bothered to write one.

It looks like ATI and NVIDIA started supporting the block-compressed texture formats (BC1-BC3/DXT1-DXT5) with the Rage 128 Pro and GeForce 256.
 

Anarchist420

Diamond Member
Feb 13, 2010
Ahh, now there's a very good point I never even thought of. All of the BC formats have a fixed compression ratio. That would certainly make lossless compression difficult at best.

And actually, one more point.

Anarchist, where are you seeing 8:1 compression? BC4/5 are for normal maps & such, BC6 is for HDR textures and is only 6:1, and BC7 is for LDR textures and is 3:1 for RGB and 4:1 for RGBA. The only thing that compresses 8:1 is BC1, and it's old-school S3TC which means it only takes integer textures, so that's 8:1 for an R8G8B8A8 texture.
I had thought the HDR textures were compressed 8:1; thanks for pointing that out. :)

But since lossless texture compression would kill performance, it will probably never happen. The average lossless compression ratio would probably be around 8:5 (1.6:1), which is only 40% of a 4:1 ratio. So textures would, on average, take up 2.5 times as much RAM and require GPUs capable of much higher fillrate. Still, I'd always rather have less detail and no artifacts than more detail with artifacts.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
I had thought the HDR textures were compressed 8:1; thanks for pointing that out. :)

But since lossless texture compression would kill performance, it will probably never happen. The average lossless compression ratio would probably be around 8:5 (1.6:1), which is only 40% of a 4:1 ratio. So textures would, on average, take up 2.5 times as much RAM and require GPUs capable of much higher fillrate. Still, I'd always rather have less detail and no artifacts than more detail with artifacts.
Actually the fillrate wouldn't be any different. The textures have to be decompressed before processing, so the number of texels processed remains the same. The penalty would be storing the relatively larger textures, and the memory bandwidth needed to work with them.