nV Image Quality Optimizations

NokiaDude

Diamond Member
Oct 13, 2002
3,966
0
0
I currently use a GeForce 6800NU 256MB card from Asus. I'm new to the NVIDIA driver optimizations. I noticed that when I switched to "performance" settings, I got a 3DMark03 score of ~9700. When set to "high quality", I get ~7600. Almost a two-thousand-point difference. What I want to know is: what is the best trade-off between performance and quality? I've also noticed that ever since I switched to an LCD monitor running 1280x1024, my FPS in games has taken a good hit.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
1280x1024 is a nonstandard resolution (not 4:3) that ATi helped to popularise. Try running at 1280x960 instead.

What is hugely annoying is that even people who should know better, like John Carmack, have supported 1280x1024 while leaving out support for the correct 1280x960. Grrr.
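For reference, the aspect-ratio arithmetic behind the complaint is easy to check — here's a quick sketch (Python; just reducing each mode by its GCD):

```python
from math import gcd

def aspect(width, height):
    """Reduce a resolution to its simplest aspect ratio."""
    g = gcd(width, height)
    return (width // g, height // g)

print(aspect(1280, 1024))  # (5, 4) -- not 4:3
print(aspect(1280, 960))   # (4, 3) -- matches a standard 4:3 CRT
```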

When it comes to optimizations, choose those that look good to you. Personally I find I hardly have to turn optimizations on at all (using the 76.41 XG drivers).
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
ATI singularly "helped popularise" 1280x1024? I didn't realize they were responsible for every CRT maker available listing 1280x1024 as the "optimum" resolution for 19" tubes.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Telling someone what they want to hear, even when it's wrong, doesn't become right just because someone else is saying it too.

My Philips 19" is supposedly "optimal" at 1280x1024 @ 86 Hz, but I run it at 1280x960 @ 100 Hz.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,008
126
Originally posted by: Gstanfor
John Carmack have supported 1280x1024 while leaving out support for the correct 1280x960
Every id game from GLQuake onwards supports 1280x960, so I'm not sure where you're getting this from.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Gstanfor
Telling someone what they want to hear, even when it's wrong, doesn't become right just because someone else is saying it too.
I'm not sure if that's directed at me, but I run 4:3 resolutions on my 4:3 CRTs (and I believe IHVs added 12x9 to their drivers quite some time ago). What I was interested in were any references you might have for singling out ATI as responsible for 5:4 res on 4:3 CRTs. In all the many, many threads created about 12x10 (and NokiaDude, you might want to run a search in these forums for that res), I've never heard of ATI named as the driving force behind it. The most logical explanation I've heard was that 12x10 maximized power-of-two memory usage back when every kB counted.

BTW, 12x10 can now be considered a standard resolution, thanks to LCDs. I'm curious whether older CRTs were in fact 5:4, or whether 17-19" LCDs seized on 12x10 simply because of its ubiquity in big CRTs rather than because of any inherent technical or manufacturing advantage.
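The power-of-two explanation is speculation, but the numbers behind it are easy to check: 1024 is exactly 2^10, so a 1280x1024 framebuffer lands on round binary sizes at common color depths. A quick sketch (Python; the `fb_bytes` helper is just for illustration):

```python
def fb_bytes(width, height, bpp):
    """Raw framebuffer size in bytes for one screen at a given color depth."""
    return width * height * bpp // 8

# 1024 lines = 2**10, so row addressing is a simple bit shift on old
# hardware, and the totals come out to tidy binary sizes:
for bpp in (8, 16, 32):
    mib = fb_bytes(1280, 1024, bpp) / 2**20
    print(f"1280x1024 @ {bpp}bpp: {mib} MiB")  # 1.25, 2.5, 5.0 MiB
```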
 

NokiaDude

Diamond Member
Oct 13, 2002
3,966
0
0
I'm using a Viewsonic VX910. The native resolution is 1280x1024. If I set it to 1280x960, it looks like CRAP! I've set the slider to Quality; I'll report back if I notice any glitches.