Texture Compression settings

Doxymarz

Junior Member
Apr 11, 2003
1
0
0
Hi
I am wondering if "Texture Compression" should be enabled or disabled to improve FPS performance. It's disabled in Powerstrip for both D3D and GL by default.
I have noticed that some games give you the choice of using 16-bit or 32-bit textures.
How do these features, "Texture Compression" and "16/32-bit textures," relate to one another? Do I want Texture Compression on or off with a Radeon 9000 Pro?
And what would be the best choices for these settings, in Powerstrip and/or in-game, to get smoother gameplay? I prefer frame rate over graphical quality. I normally run games at 800x600 or 1024x768 with 16-bit color.

Thanks
 

notfred

Lifer
Feb 12, 2001
38,241
4
0
Turn on, run benchmark, turn off, run benchmark, compare. There, you've got your answer as to which way is faster.
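In other words, average a few runs per setting and compare. A minimal sketch of that comparison (the FPS numbers below are made-up placeholders; substitute whatever your benchmark of choice reports):

```python
# Sketch of the on/off comparison: average several benchmark runs
# per setting and see which comes out ahead. All FPS values here
# are hypothetical -- plug in your own benchmark results.

def average_fps(samples):
    """Mean frames-per-second over a list of benchmark runs."""
    return sum(samples) / len(samples)

# Hypothetical results from repeated runs with each setting:
fps_compression_on = [92.1, 93.4, 91.8]
fps_compression_off = [85.0, 84.6, 86.2]

on_avg = average_fps(fps_compression_on)
off_avg = average_fps(fps_compression_off)

faster = "on" if on_avg > off_avg else "off"
print(f"Texture compression {faster} is faster "
      f"({on_avg:.1f} vs {off_avg:.1f} FPS)")
```

Running the benchmark more than once per setting matters, since single runs can vary by a few FPS from caching and background tasks.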
 

VBboy

Diamond Member
Nov 12, 2000
5,793
0
0
Originally posted by: notfred
Turn on, run benchmark, turn off, run benchmark, compare. There, you've got your answer as to which way is faster.

LOL, I was going to recommend the same thing.

With the old S3 texture compression, you would see major quality loss. With newer compression, there may be little or no visible loss, yet much better performance.