Calling all nVidia 7 series owners

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
As anyone who spends time in the Video forum knows, over the past few days the differences in performance and IQ between the various driver settings have become a topic of significant discussion. To try to settle this issue, would any nVidia 7 series owner who has time benchmark, take video, or take screenshots (although screenshots reportedly don't show the difference very well) at the different driver settings: High Performance, Performance, Quality (the default), and High Quality?

For videos you can use Megaupload.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Google Video won't show anything either; quality doesn't really get retained ;)
 

Fadey

Senior member
Oct 8, 2005
410
6
81
I've owned a Ti 4200, a 7800 GTX, and now a 7950, and there's almost no change when the quality settings are changed. There's a slight difference from lowest to highest, but unless you've got a decent monitor you won't even notice it.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Thanks Fadey, but I'm looking for videos or screenshots so people can judge for themselves.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Do you have any games in mind that you want to screenshot or video? I also don't know about getting a video to show the quality, errr, un-quality of some of the nvidia settings. I have a 7800GT @ 540/1250 if you want me to bench or upload some games.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: redbox
Do you have any games in mind that you want to screenshot or video? I also don't know about getting a video to show the quality, errr, un-quality of some of the nvidia settings. I have a 7800GT @ 540/1250 if you want me to bench or upload some games.

Most of the discussion is about AF, so scenes that show off AF prowess are good.

This is a good spot in HL2.
and here is a good spot in Oblivion.
 

mylok

Senior member
Nov 1, 2004
265
0
0
Why would nvidia put the HQ setting in there if there were no difference? I believe most people agree there is a speed difference between Q and HQ (some say it is small and some say the performance hit is larger, but there is some kind of hit). So why would nvidia put that option in there unless there was a difference? Makes no sense to me. I own both a 7900GTX and a 1900XTX, so I will run some tests as soon as MSI returns my GTX. As best as I can remember, 3DMark06 dropped about 200-300 points switching from Q to HQ, but again, I will run some tests when (if) MSI ever ships my card back. :)
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Redbox, people have said that some of the worst offenders are BF2 and WoW, but testing just those two might invite claims of bias, so I'm looking for a relatively wide selection of games. Basically, test whatever you want to test.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: mylok
Why would nvidia put the HQ setting in there if there were no difference? I believe most people agree there is a speed difference between Q and HQ (some say it is small and some say the performance hit is larger, but there is some kind of hit). So why would nvidia put that option in there unless there was a difference? Makes no sense to me. I own both a 7900GTX and a 1900XTX, so I will run some tests as soon as MSI returns my GTX. As best as I can remember, 3DMark06 dropped about 200-300 points switching from Q to HQ, but again, I will run some tests when (if) MSI ever ships my card back. :)

Yes, there is a speed difference going from Q to HQ (although with 7900GTs in SLI I don't really notice it :))

One thing that can be noticed, though, is that the shimmering is almost non-existent when you use HQ and turn off all the optimizations (sometimes the NV CP says they're off, but I always use RivaTuner just to make sure).

I have 2x 7900GTs in SLI @ 700/1620, so I can afford to use HQ at basically any res with 4xAA/16xAF (as well as some other things :)), but if you want, I'll take some screenshots.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Maybe I should start my own Nvidia IQ comparison thread? lol
Took me a while, but I got some screenshots for ya. All were originally .png but were too big to host on ImageShack, so I compressed them in Photoshop with the JPEG quality slider all the way up to avoid any JPEG compression issues.
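As an aside, that PNG-to-JPEG re-encode can be scripted instead of done by hand in Photoshop. A minimal sketch using the Pillow library (the function name and filenames are made up for illustration):

```python
# Sketch: re-encode a PNG screenshot as a maximum-quality JPEG so it
# fits under image-host size limits with minimal visible artifacts.
from PIL import Image

def reencode(png_path: str, jpg_path: str) -> None:
    img = Image.open(png_path).convert("RGB")  # JPEG has no alpha channel
    # quality=95 is Pillow's recommended ceiling; subsampling=0 keeps
    # full chroma resolution, which preserves fine texture detail
    img.save(jpg_path, "JPEG", quality=95, subsampling=0)
```

Pillow's docs advise against quality values above 95, since they balloon file size for almost no visual gain.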

I only did Oblivion, but it took me a half hour so don't complain :)

Be patient, as each picture is about 1 meg, so they may take a while to load.

Alright, so everyone can get a reference image to use, here is Oblivion set to HQ in the CP, everything maxed, no AA and no AF: Text

That's just there so there's something to compare 16xAF against (which is a huge improvement over it), so here we go (the HQ and Q indicate which option was used in the driver):

Oblivion HQ 0xAA/0xAF bloom no opt. (same as above)

Oblivion HQ 4xAA/0xAF bloom no opt.

Oblivion HQ 4xAA/8xAF bloom no opt.

Oblivion HQ 4xAA/16xAF bloom no opt.

Oblivion Q 4xAA/16xAF bloom 2 opt. on (first and last opt. middle one off)

Oblivion Q 4xAA/16xAF bloom all 3 opt. on (all opt on)

Also for comparison, an HDR picture: Oblivion HDR HQ 0xAA/16xAF no opt.

So is this what the OP wanted? Someone, please comment as it took a while to do this.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
There is very little difference in IQ from Quality with no optimizations to HQ with no optimizations with my 7600GT. I'm not on that computer right now, so I'll post some screenshots later :)
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Thanks wizboy. What I was really looking for were the last three (before the HDR one), to show the difference the optimizations make, which so far looks minimal.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91

Originally posted by: fierydemise
Thanks wizboy. What I was really looking for were the last three (before the HDR one), to show the difference the optimizations make, which so far looks minimal.

It is pretty hard to tell the difference in the screenshots.

wizboy, can you notice more of a difference in motion?
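For the stills, a pixel diff might show it better than eyeballing two captures side by side. A rough Pillow sketch (function name and filenames are placeholders) that amplifies the per-pixel difference between two screenshots:

```python
# Sketch: amplified per-pixel difference of two screenshots taken at
# different driver settings, to make subtle filtering changes stand out.
from PIL import Image, ImageChops

def diff_image(path_a: str, path_b: str, gain: int = 8) -> Image.Image:
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    diff = ImageChops.difference(a, b)               # |a - b| per channel
    return diff.point(lambda v: min(255, v * gain))  # boost small deltas
```

Areas where the optimizations change the filtering would show up as bright patches; identical regions stay black.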
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
Games shimmer much more when one uses High Performance in contrast to High Quality.

In addition, the color seems a little off when using High Performance, especially in NFS:MW.

Framerate-wise, I notice that I gain up to 10% in minimum framerates when using SLI.

I experienced the same thing using 7800 GT SLI, a 7800 GTX, and 7900 GTX SLI.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
moonboy, thanks for your input, but would you be willing to take some video, screenshots, or benchmarks so we can see exactly what you're talking about and judge for ourselves?
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: fierydemise
moonboy, thanks for your input, but would you be willing to take some video, screenshots, or benchmarks so we can see exactly what you're talking about and judge for ourselves?

What would you want us to capture video with? The quality is bound to be less than if you were at the computer screen yourself.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: redbox
What would you want us to capture video with? The quality is bound to be less than if you were at the computer screen yourself.
I don't really know; I haven't done much of that, but I know various forum members and review sites have relatively high-quality videos. Hopefully someone with more experience can clear things up for us.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: wizboy11
Maybe I should start my own Nvidia IQ comparison thread? lol
Took me a while, but I got some screenshots for ya. All were originally .png but were too big to host on ImageShack, so I compressed them in Photoshop with the JPEG quality slider all the way up to avoid any JPEG compression issues.

I only did Oblivion, but it took me a half hour so don't complain :)

Be patient, as each picture is about 1 meg, so they may take a while to load.

Alright, so everyone can get a reference image to use, here is Oblivion set to HQ in the CP, everything maxed, no AA and no AF: Text

That's just there so there's something to compare 16xAF against (which is a huge improvement over it), so here we go (the HQ and Q indicate which option was used in the driver):

Oblivion HQ 0xAA/0xAF bloom no opt. (same as above)

Oblivion HQ 4xAA/0xAF bloom no opt.

Oblivion HQ 4xAA/8xAF bloom no opt.

Oblivion HQ 4xAA/16xAF bloom no opt.

Oblivion Q 4xAA/16xAF bloom 2 opt. on (first and last opt. middle one off)

Oblivion Q 4xAA/16xAF bloom all 3 opt. on (all opt on)

Also for comparison, an HDR picture: Oblivion HDR HQ 0xAA/16xAF no opt.

So is this what the OP wanted? Someone, please comment as it took a while to do this.

Your HQ 4xAA/16xAF, no opts screenshot actually looks like it has no AA. Is that another hidden "optimization"?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Your HQ 4xAA/16xAF, no opts screenshot actually looks like it has no AA. Is that another hidden "optimization"?

You sure you aren't looking at the HDR pic? I can definitely see AA being applied in the no opts/bloom capture compared to the no opts/HDR capture.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: nitromullet
Your HQ 4xAA/16xAF, no opts screenshot actually looks like it has no AA. Is that another hidden "optimization"?

You sure you aren't looking at the HDR pic? I can definitely see AA being applied in the no opts/bloom capture compared to the no opts/HDR capture.

I'm talking about this one:
Oblivion HQ 4xAA/16xAF bloom no opt.

Look at where the edges of the sidewalk meet the walls, or at the tower ahead. I can definitely see plenty of jaggies there compared to the other AA screenshots.