
nV Image Quality Optimizations

NokiaDude

Diamond Member
I currently use a GeForce 6800NU 256MB card from Asus. I'm new to the nVidia driver optimizations. I noticed that when I switched to "performance" settings, I got a 3DMark03 score of ~9700. When set to "high quality", I get ~7600 — almost a 2,000-point difference. What I want to know is: what is the best trade-off between performance and quality? I've also noticed that ever since I switched to an LCD monitor that runs at 1280x1024, my FPS in games has taken a good hit.
 
1280x1024 is a nonstandard resolution (5:4, not 4:3) that ATi helped to popularise. Try running at 1280x960 instead.
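As an aside (my own arithmetic, not from the poster), reducing each resolution by its greatest common divisor confirms the ratios being argued about here — 1280x1024 really is 5:4, while 1280x960 is true 4:3:

```python
from math import gcd

def aspect_ratio(w, h):
    """Reduce a width x height resolution to its simplest ratio."""
    g = gcd(w, h)
    return (w // g, h // g)

print(aspect_ratio(1280, 1024))  # (5, 4) -- the "nonstandard" LCD ratio
print(aspect_ratio(1280, 960))   # (4, 3) -- matches a 4:3 CRT
```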

What is hugely annoying is that even people who should know better, like John Carmack, have supported 1280x1024 while leaving out support for the correct 1280x960. Grrr.

When it comes to optimizations, choose those that look good to you. Personally I find I hardly have to turn optimizations on at all (using the 76.41 XG drivers).
 
ATI singularly "helped popularise" 1280x1024? I didn't realize they were responsible for every CRT maker available listing 1280x1024 as the "optimum" resolution for 19" tubes.
 
Telling someone what they want to hear, even when it's wrong, doesn't make it right just because someone else is saying it too.

My Philips 19" is supposedly "optimal" @ 1280x1024x86 hz, but I use it set to 1280x960x100 hz.
 
John Carmack have supported 1280x1024 while leaving out support for the correct 1280x960
Every ID game from GLQuake onwards supports 1280x960 so I'm not sure where you're getting this from.
 
Originally posted by: Gstanfor
Telling someone what they want to hear even when it's wrong and someone else is saying it doesn't make it right.
I'm not sure if that's directed at me, but I run 4:3 resolutions on my 4:3 CRTs (and I believe IHVs have added 12x9 to their drivers quite some time ago). What I was interested in were any references you might have relating to your singling out ATI as responsible for 5:4 res on 4:3 CRTs. In all the many, many threads created about 12x10 (and NokiaDude, you might want to run a search in these forums for that res), I've never heard of ATI as the driving force behind it. The most logical explanation I've heard was that 12x10 maximized power of two memory usage back when every kB counted.
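To illustrate that power-of-two point (my own back-of-the-envelope sketch, not anything claimed in the thread): because 1024 is 2**10, a 1280x1024 framebuffer divides evenly into binary memory sizes, while 1280x960 does not:

```python
def framebuffer_bytes(w, h, bpp):
    """Total framebuffer size in bytes for a w x h mode at bpp bits per pixel."""
    return w * h * bpp // 8

# At 32bpp, 1280x1024 is exactly 5 MiB; 1280x960 is an awkward 4.6875 MiB.
print(framebuffer_bytes(1280, 1024, 32) / 2**20)  # 5.0
print(framebuffer_bytes(1280, 960, 32) / 2**20)   # 4.6875
```

So when video memory came in binary-sized chunks and every kB counted, the 5:4 mode wasted nothing, which fits the explanation above.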

BTW, 12x10 can now be considered a standard resolution, thanks to LCDs. I'm curious if older CRTs were in fact 5:4, or if 17-19" LCDs seized on 12x10 just because of its ubiquity in big CRTs, and not because of any inherent current technical or manufacturing advantage.
 
I'm using a Viewsonic VX910. The native resolution is 1280x1024. If I set it to 1280x960 it looks like CRAP! I've set the slider to Quality, I'll report back if I notice any glitches.
 