A 7900GTX at 700MHz/1800MHz always gets beaten by an X1900XTX at stock

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
The only exception being Quake 4

This seems to go overlooked by a lot of people, but it's important.

Legit Reviews did a review of an XFX 7900GTX overclocked to 700MHz/1800MHz versus several ATI and Nvidia cards.


The difference in this review is that they finally ran both cards at equal image quality settings:

"Please note that for all tests, excluding 3D Mark 2006, Nvidia image quality settings were set from "Quality" to "High Quality." ATI image quality settings were left at "High Quality" with "High Quality Anisotropic Filtering" enabled in Catalyst Control Panel."

As you can see from the review, the GTX, despite the overclock, gets beaten in more or less every game by around 10-25%.


IMO people (including myself) buying these cards from Nvidia based upon benchmarks with "Quality" enabled are being fooled, because no one pays 500 bucks for a card only to leave it at lower-quality image settings.

From what I've read, ATI engineers focus quite a bit on keeping the performance impact low at high image quality settings, which might explain why their cards perform so much better here.


But frankly, those buying based on "stock driver settings" benchmarks are making a big mistake IMO. I just wish they had benched Oblivion, BF2, and Source games.


 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
In my experience, NV cards do take a rather large hit when set to HQ, because it turns the optimizations off. It's pretty common knowledge. Sadly, this was a must for me, because shimmering is much worse in Quality than in High Quality.

And to add to the first post, ATi defaults to the highest mipmap detail. There are four settings, and ATi has a default of 4. NV, on the other hand, defaults to the [b]third[/b] highest setting, or 75%. They too have four settings, and the default is 3/4. Virtually every review site uses "default" settings for drivers. It is not an apples-to-apples comparison. But then, it's pretty hard to do that with the way the different drivers are set up. Personally, I would love for reviewers to run the cards at their absolute highest driver settings (other than AA), with all optimizations turned off. That's how I play games, and that's how I would like to see cards compared.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Uh, you do realize they only tested 6 games, right? Out of the six games, the 7900gtx is close to or beats the x1900xtx in 3 of them :confused: All you accomplished with this post was to start another huge flamewar. Testing six games can hardly be called "all" games :disgust:

Q4- 7900gtx
Serious Sam- x1900xtx
FEAR-x1900xtx
X3- x1900xtx
CoD2- tie
3dmark06- 7900gtx
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Most people run Nvidia cards on the Quality setting, because the difference isn't even visible. As for shimmering, that's a problem that neither setting can fix, but I have never experienced it.

I don't see those tests as being as credible as I normally would, only because it's recommended to run Nvidia at the Quality setting. Really, there is no difference. I've seen comparisons.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Ackmed

And to add to the first post, ATi defaults to the highest mipmap detail. There are four settings, and ATi has a default of 4. NV, on the other hand, defaults to the [b]third[/b] highest setting, or 75%. They too have four settings, and the default is 3/4.

:confused:

here's a little test I did

RTHDRIBL 640x480
P4 2.53GHz
1GB of RAM
6800NU @ 12 pipes, 6 vertex shaders, 300MHz core/700MHz mem
all driver options set to HIGH except mipmap filtering

default test (one w/marbles)

Mip map filtering set to trilinear: 40.78 FPS
Mip map filtering set to none: 40.89 FPS

a less than 1% difference, hardly noticeable in game or in a benchmark :confused:

edit: also, ill post some Quality mode VS HQ mode screenshots in a bit
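(For reference, the "less than 1%" figure above falls out of a simple relative-difference calculation; this sketch just plugs in the FPS numbers from the test, and the helper name is illustrative only.)

```python
def pct_faster(baseline_fps: float, other_fps: float) -> float:
    """Percent difference of other_fps relative to baseline_fps."""
    return (other_fps - baseline_fps) / baseline_fps * 100.0

# Trilinear (40.78 FPS) vs. no mip filtering (40.89 FPS)
print(round(pct_faster(40.78, 40.89), 2))  # ~0.27% -- well under 1%
```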
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: schneiderguy
Uh, you do realize they only tested 6 games, right? Out of the six games, the 7900gtx is close to or beats the x1900xtx in 3 of them :confused: All you accomplished with this post was to start another huge flamewar. Testing six games can hardly be called "all" games :disgust:

Q4- 7900gtx
Serious Sam- x1900xtx
FEAR-x1900xtx
X3- x1900xtx
CoD2- tie
3dmark06- 7900gtx

While I agree it shouldn't be called "all" or "every" with only a handful of games tested, and from just one review, I would like more reviews to use the highest driver settings; as I said, that's how I play games. And having one card set to the highest and the other set lower really isn't the same. However, your count is off. First, you call 3DMark a game... sigh.

You give the GTX a "win" for being 6 points faster in the overall score, out of almost 6000. What resolution are you going by in the other games? In Q4, at 1600x1200 the GTX is only 1fps faster, and 8fps faster at 1280x1024. You call that a "win" for the GTX. Yet with CoD2, the XTX is 3fps, 1fps, and 2fps faster, and that's a "tie". If that's a tie, why isn't 6 points overall a tie in 3DMark06? If you add the overall score, the SM2 score, and the SM3 score, the GTX ends up with a whopping 3 points more: 10,550 to 10,547. And that's a "win"? Yet the same difference in CoD2 is a "tie"? C'mon, at least be consistent...
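(To put numbers on the inconsistency being pointed at here, a quick sketch using the figures from the post; `rel_pct` is just an illustrative helper, and the ~6000 overall score is rounded as in the post.)

```python
def rel_pct(winner: float, loser: float) -> float:
    """How much higher winner is than loser, in percent."""
    return (winner - loser) / loser * 100.0

# 3DMark06 overall: GTX ahead by 6 points out of almost 6000
print(round(rel_pct(6000, 5994), 2))    # roughly a 0.1% "win"
# Combined overall + SM2 + SM3 scores: 10,550 vs. 10,547
print(round(rel_pct(10550, 10547), 3))  # roughly a 0.028% "win"
```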

 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: hans030390
Most people run Nvidia cards on the Quality setting, because the difference isn't even visible. As for shimmering, that's a problem that neither setting can fix, but I have never experienced it.

I don't see those tests as being as credible as I normally would, only because it's recommended to run Nvidia at the Quality setting. Really, there is no difference. I've seen comparisons.

Yes, both cards shimmer. However, NV shimmers much worse than ATi depending on the monitor, and much worse in Q than in HQ. Of course NV wants cards reviewed at Q; it's faster. You don't really think that's because NV believes the cards shouldn't be reviewed with the best IQ? Go go marketing!

Originally posted by: schneiderguy
Originally posted by: Ackmed

And to add to the first post, ATi defaults to the highest mipmap detail. There are four settings, and ATi has a default of 4. NV, on the other hand, defaults to the [b]third[/b] highest setting, or 75%. They too have four settings, and the default is 3/4.

:confused:

heres a little test I did

RTHDRIBL 640*480
P4 2.53ghz
1 gig of ram
6800nu @12pipes 6 vertex shaders, 300mhzcore/700mhzmem
all driver options to set HIGH except mip map filtering

default test (one w/marbles)

Mip map filtering set to trilinear 40.78 FPS
Mip map filtering set to none 40.89 FPS

a less than 1% difference, hardly noticeable in game or in a benchmark :confused:

edit: also, ill post some Quality mode VS HQ mode screenshots in a bit

Right.. rthdribl is a game..

From BFG10K, here are some numbers with HQ settings, in actual games: http://episteme.arstechnica.com/groupee/forums/a/tpc/f/67909965/m/999005329731

As you can see, there are some rather large performance drops when going from Q to HQ. But according to hans030390, they don't count, because NV doesn't want cards tested like that. Never mind the fact that people actually use cards like that.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Ackmed

Right.. rthdribl is a game..

From BFG10K, here are some numbers with HQ settings, in actual games: http://episteme.arstechnica.com/groupee/forums/a/tpc/f/67909965/m/999005329731

As you can see, there are some rather large performance drops when going from Q to HQ. But according to hans030390, they don't count, because NV doesn't want cards tested like that. Never mind the fact that people actually use cards like that.

those weren't Q vs. HQ

go and read my post again :confused:

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
ooooookay HQ vs Q screenshots as promised

Same settings and the same testing machine were used as above with the mipmap quality test, except of course Quality mode was used for the "Quality" screenshot and HQ for the "HQ" screenshot :Q

Quality Mode

HQ mode

sorry the crosshair wasn't in the EXACT same spot; it was close enough :)
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Two things:

1) I have never played any games on either a G7x card or an R5xx card at anything other than the highest quality AF. It stands to reason that if I spend $600+ on a 7800GTX or X1800XT, I want to use the highest quality settings.

2)
Originally posted by: schneiderguy
Originally posted by: Ackmed

And to add to the first post, ATi defaults to the highest mipmap detail. There are four settings, and ATi has a default of 4. NV, on the other hand, defaults to the [b]third[/b] highest setting, or 75%. They too have four settings, and the default is 3/4.

:confused:

heres a little test I did

RTHDRIBL 640*480
P4 2.53ghz
1 gig of ram
6800nu @12pipes 6 vertex shaders, 300mhzcore/700mhzmem
all driver options to set HIGH except mip map filtering

default test (one w/marbles)

Mip map filtering set to trilinear 40.78 FPS
Mip map filtering set to none 40.89 FPS

a less than 1% difference, hardly noticeable in game or in a benchmark :confused:

edit: also, ill post some Quality mode VS HQ mode screenshots in a bit

You're confused...? How do you think we feel? What are you trying to prove with a 640x480 "test" of a piece of software that isn't even a game? Is there even much AF going on in RTHDRIBL? I thought the backgrounds were skyboxes or some kind of static image... RTHDRIBL is all about lighting, not textures. Sorry, I just don't understand your rationale at all. Maybe you can be a bit more specific...
 

ruijorge

Member
May 12, 2006
53
0
0
Haven't you all noticed that the review date is 09/03/2006?

LOLOL

Many things have changed since then:

- improvements in drivers
- patches for the games tested
- ...

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: nitromullet
Two things:

1) I have never played any games on either a G7x card or an R5xx card at anything other than the highest quality AF. It stands to reason that if I spend $600+ on a 7800GTX or X1800XT, I want to use the highest quality settings.

2)
Originally posted by: schneiderguy
Originally posted by: Ackmed

And to add to the first post, ATi defaults to the highest mipmap detail. There are four settings, and ATi has a default of 4. NV, on the other hand, defaults to the [b]third[/b] highest setting, or 75%. They too have four settings, and the default is 3/4.

:confused:

heres a little test I did

RTHDRIBL 640*480
P4 2.53ghz
1 gig of ram
6800nu @12pipes 6 vertex shaders, 300mhzcore/700mhzmem
all driver options to set HIGH except mip map filtering

default test (one w/marbles)

Mip map filtering set to trilinear 40.78 FPS
Mip map filtering set to none 40.89 FPS

a less than 1% difference, hardly noticeable in game or in a benchmark :confused:

edit: also, ill post some Quality mode VS HQ mode screenshots in a bit

You're confused...? How do you think we feel? What are you trying to prove with a 640x480 "test" of a piece of software that isn't even a game? Is there even much AF going on in RTHDRIBL? I thought the backgrounds were skyboxes or some kind of static image... RTHDRIBL is all about lighting, not textures. Sorry, I just don't understand your rationale at all. Maybe you can be a bit more specific...

fine, I will do a test in HL2 also; just know that there will be next to no performance difference, despite what ackmed was implying
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Originally posted by: schneiderguy
ooooookay HQ vs Q screenshots as promised

same settings and same testing machine were used like above with the mipmap quality test. except of course Quality mode was used for the "Quality" screenshot and HQ was used for the "HQ" screenshot :Q

Quality Mode

HQ mode

sorry the crosshair wasnt in the EXACT same spot. it was close enough :)

I see the light is missing in the Q pic, but I don't know if that's due to the different IQ settings or something else.

Could you try some different games if you have time?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
fine, I will do a test in HL2 also, just know that there will be next to no performance difference like ackmed was implying
Fair enough, but since you already seem to know the conclusion before you run the test, I will take your results with a grain of salt, if you don't mind.

Back to the original topic. The thing about HQ AF on NVIDIA cards is that even when you do enable HQ AF, it still isn't as good as ATI's. That being said, I personally feel that NV's AA is a tad better.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: schneiderguy
Originally posted by: Ackmed

Right.. rthdribl is a game..

From BFG10K, here are some numbers with HQ settings, in actual games: http://episteme.arstechnica.com/groupee/forums/a/tpc/f/67909965/m/999005329731

As you can see, there are some rather large performance drops when going from Q to HQ. But according to hans030390, they don't count, because NV doesn't want cards tested like that. Never mind the fact that people actually use cards like that.

those weren't Q vs. HQ

go and read my post again :confused:


Sure looked like it. Then what was the point of your post? BFG10K did run some numbers, and the difference is pretty large.

Originally posted by: schneiderguy

fine, I will do a test in HL2 also, just know that there will be next to no performance difference like ackmed was implying

I'm not implying there is a difference, I'm flat-out saying that there is. If you want to see a quality difference, look here: http://www.3dcenter.org/artikel/neue_filtertricks/index2.php Sadly, hardly anyone on this side of the Atlantic does such comparisons, so it's in German. I have no problems reading it, but I'm sure others will.

Why would you think there wouldn't be a performance or quality difference from one setting to another? It's not like stretchy pants; they don't have them "just for fun".
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0


Q4- 7900gtx
Serious Sam- x1900xtx
FEAR-x1900xtx
X3- x1900xtx
CoD2- 1900XTX by 6%
3dmark06- Tie


Man you are totally full of crap.

You call a 6% win by the XTX in CoD2 a 'tie', but call a .2% win (literally) by an (overclocked) 7900GTX in 3DMark06 a win for the GTX? hahaha


And again, the GTX gets beaten pretty handily in the other benchmarks, particularly considering it's a substantially overclocked GTX at 700/1800 versus a STOCK X1900XTX.

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: nitromullet
fine, I will do a test in HL2 also, just know that there will be next to no performance difference like ackmed was implying
Fair enough, but since you already seem to know the conclusion before you run the test I will take your results with a grain of salt if you don't mind.

fine, take it with a grain of salt :)

anyway, same settings, HQ everything, blah blah blah. 1280x1024, 4x AA, 16x AF, CS:Source VST

Mipmap no filtering: 41.06 FPS
Mipmap trilinear filtered: 39.96 FPS

I stand corrected: a 2.6% performance decrease going from no filtering to trilinear. STILL not significant enough of an impact in benchmarks that reviewers should "have" to turn it on
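(For what it's worth, the exact figure from those two runs works out slightly higher, closer to 2.7%; a quick sketch with the numbers above:)

```python
def slowdown_pct(fast_fps: float, slow_fps: float) -> float:
    """Performance drop going from fast_fps to slow_fps, in percent."""
    return (fast_fps - slow_fps) / fast_fps * 100.0

# No mip filtering (41.06 FPS) -> trilinear (39.96 FPS)
print(round(slowdown_pct(41.06, 39.96), 1))  # ~2.7%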
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
You got this from the same spot where you were before? Wow, lots of action going on there. Sorry, I trust BFG10K more than you; he shows a rather large difference. Not to mention the two AF optimizations I'm sure you still have on.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Ackmed

I'm not implying there is a difference, I'm flat-out saying that there is. If you want to see a quality difference, look here: http://www.3dcenter.org/artikel/neue_filtertricks/index2.php Sadly, hardly anyone on this side of the Atlantic does such comparisons, so it's in German. I have no problems reading it, but I'm sure others will.

Why would you think there wouldn't be a performance or quality difference from one setting to another? It's not like stretchy pants; they don't have them "just for fun".

Wow, that's a pretty big difference in quality (no sarcasm here)
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Ackmed
You got this from the same spot where you were before? Wow, lots of action going on there. Sorry, I trust BFG10K more than you; he shows a rather large difference. Not to mention the two AF optimizations I'm sure you still have on.

please read my posts before replying to them :confused:

CS Source VST

also, nice job arguing with a person who is agreeing with you and admitting there is a performance decrease with trilinear mipmaps on
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Frackal


Q4- 7900gtx
Serious Sam- x1900xtx
FEAR-x1900xtx
X3- x1900xtx
CoD2- 1900XTX by 6%
3dmark06- Tie


Man you are totally full of crap.

You call a 6% win by the XTX in COD-2 a 'tie', but call a .2% win (literally) by (overclocked) 7900GTX in 3DMark06 a win for the GTX? hahaha


And again, the GTX gets beaten pretty handily in the other benchmarks, particularly considering it's a fairly substantially Overclocked GTX at 700/1800 versus a STOCK x1900 XTX




from the review you posted, under the CoD2 graphs

"Playing a game you would not be able to detect the difference between the three of them."

if that's not the definition of a tie, I don't know what is :confused:
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
fine, take it with a grain of salt

anyway, same settings, HQ everything blah blah blah blah blah. 1280*1024 4x AA 16x AF CS Source VST

Mipmap no filtering: 41.06
Mipmap Trilinear filtered: 39.96 FPS

I stand corrected: a 2.6% performance decrease going from no filtering to trilinear. STILL not significant enough of an impact in benchmarks that reviewers should "have" to turn it on

Wait a minute... your numbers suck for that game... OK, you're running a 6800NU (missed that)... No offense, but can we get someone with a somewhat current NV GPU (G7x series) to test this on a more current game?
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: schneiderguy
Originally posted by: Frackal


Q4- 7900gtx
Serious Sam- x1900xtx
FEAR-x1900xtx
X3- x1900xtx
CoD2- 1900XTX by 6%
3dmark06- Tie


Man you are totally full of crap.

You call a 6% win by the XTX in COD-2 a 'tie', but call a .2% win (literally) by (overclocked) 7900GTX in 3DMark06 a win for the GTX? hahaha


And again, the GTX gets beaten pretty handily in the other benchmarks, particularly considering it's a fairly substantially Overclocked GTX at 700/1800 versus a STOCK x1900 XTX




from the review you posted, under the CoD2 graphs

"Playing a game you would not be able to detect the difference between the three of them."


That's weird, the sentence from the website seems to say, when copied in its entirety:

"In Call of Duty 2 we see that the 7900GTX has nearly caught up with the performance of both X1900 cards. Playing a game you would not be able to detect the difference between the three of them."



if that's not the definition of a tie, I don't know what is :confused:

The XTX wins by 6%, period. You called that a "tie".

In 3DMark06, the GTX wins by literally .2%, and you call that a "win" for the GTX. Ridiculous.


6% is enough to call it a win IMO, particularly when the GTX has to be at 700/1800 to achieve that. Add 10% from OCing the XTX, and that 6% turns into roughly 16%.
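(That last step compounds the two gains multiplicatively rather than just adding them; a quick sketch, assuming the XTX really would gain a further 10% from an overclock:)

```python
def compound_lead_pct(base_lead_pct: float, extra_gain_pct: float) -> float:
    """Combine an existing percentage lead with a further percentage gain."""
    combined = (1 + base_lead_pct / 100.0) * (1 + extra_gain_pct / 100.0)
    return (combined - 1) * 100.0

# A 6% stock lead compounded with a 10% overclock gain
print(round(compound_lead_pct(6, 10), 1))  # ~16.6%
```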