A 7900GTX at 700MHz/1800MHz always gets beaten by an X1900XTX at stock


Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: hans030390
Most people run Nvidia cards on the Quality setting, because the difference isn't even visible. As for shimmering, that's a problem that neither setting can fix, but I have never experienced it.

I don't find those tests as credible as I normally would, only because it's recommended to run Nvidia at the Quality setting. Really, there is no difference. I've seen comparisons.

Actually, there's a big difference, especially in HL2 and CS:S. I tried it out on my friend's 7800GT, and he would have returned the card if I hadn't shown him how to reduce the shimmering with HQ mode. You think all those extra fps in "Quality" mode come for free?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: schneiderguy
from the review you posted, under the CoD2 graphs

"Playing a game you would not be able to detect the difference between the three of them."

if that's not the definition of a tie, I don't know what is :confused:
If we could just rely on subjective analysis, we wouldn't ask for benchmarks. Obviously, both the X1900XTX and the 7900GTX are nice cards, and both will play most games very well, but there has to be some more concrete data than "they're nice".
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Very old benches. Both cards should have been tested at the drivers' standard default settings, not with optimizations disabled on one card and not the other.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: coldpower27
Very old benches. Both cards should have been tested at the drivers' standard default settings, not with optimizations disabled on one card and not the other.

Sure, then ATI can release drivers defaulting to "high performance" and suddenly an XTX is faster than a 7950GX2
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
So, what are you guys trying to prove with all of this? It certainly sounds a bit anal to me, like someone's going to notice a 1-10 fps difference when you are over 50 fps?!

Some people just prefer ATI and others NV..... Give it up!

To me, ATI is the follower and NV the innovator. Let's face it, ATI only started getting into 3D performance once 3DFX finished. ATI has gone a different and more complex route with their latest offerings, and until unified shader technology arrives with the next DX10 games, no one's going to know who has got it right.

I'm not going to try and stuff my opinion down anyone else's throat.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Frackal
Originally posted by: coldpower27
Very old benches. Both cards should have been tested at the drivers' standard default settings, not with optimizations disabled on one card and not the other.

Sure, then ATI can release drivers defaulting to "high performance" and suddenly an XTX is faster than a 7950GX2

and Nvidia could release drivers defaulting to "high performance" and suddenly a 7900GT would be faster than an X1900XTX
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Frackal
Originally posted by: coldpower27
Very old benches. Both cards should have been tested at the drivers' standard default settings, not with optimizations disabled on one card and not the other.

Sure, then ATI can release drivers defaulting to "high performance" and suddenly an XTX is faster than a 7950GX2

So does "high performance" look same to "quality"?

I mean, you are comparing HQ AF of ATi with HQ on NV. Arent those two different things? Even worse if we start doing this, how about TRAA and AAA where AAA actually gets rid of the fencing on HL2? what now?

Then theres the HDR difference between NV and ATi cards in titles like FC, SC CT, Oblivion and etc.

Frackal try to find a more current benchmark. That would be nice.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: coldpower27
Very old benches. Both cards should have been tested at the drivers' standard default settings, not with optimizations disabled on one card and not the other.

I agree. I have yet to see any of the major sites note a visual difference between the two companies' standard settings.

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Cookie Monster

I mean, you are comparing ATi's HQ AF with HQ on NV. Aren't those two different things? Even worse, if we start doing this, how about TRAA and AAA, where AAA actually gets rid of the fencing in HL2? What now?

What???? I never said anything about AF :confused: Yeah, ATI's AF is better, Nvidia's AA is better.

:confused: What do you mean by AAA getting rid of the fence? I'm not sure I understand most of your post :)
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I wasn't aiming the post at you. I was directing it at Frackal.

edit - Lol.. people get confused between HQ mode on NV and HQ "AF" on ATi.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Cookie Monster
I wasn't aiming the post at you. I was directing it at Frackal.

edit - Lol.. people get confused between HQ mode on NV and HQ "AF" on ATi.

Oh, I guess I assumed that since it was the post after mine it was directed at me. Sorry for the misunderstanding :)
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: schneiderguy
Originally posted by: Cookie Monster
I wasn't aiming the post at you. I was directing it at Frackal.

edit - Lol.. people get confused between HQ mode on NV and HQ "AF" on ATi.

Oh, I guess I assumed that since it was the post after mine it was directed at me. Sorry for the misunderstanding :)

:beer: no problemo



 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: schneiderguy
Originally posted by: Cookie Monster

I mean, you are comparing ATi's HQ AF with HQ on NV. Aren't those two different things? Even worse, if we start doing this, how about TRAA and AAA, where AAA actually gets rid of the fencing in HL2? What now?

What???? I never said anything about AF :confused: Yeah, ATI's AF is better, Nvidia's AA is better.

:confused: What do you mean by AAA getting rid of the fence? I'm not sure I understand most of your post :)

He's talking about some screen shots I posted a while back that compared HQ screenies from a GX2 and an XTX. In one of the HL2 shots, the chainlink fence just sort of disappears with the XTX, but not with the GX2. I took the original shot with AAA, but it happens with just regular AA as well. It is kind of odd once you notice it.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: nitromullet
He's talking about some screen shots I posted a while back that compared HQ screenies from a GX2 and an XTX. In one of the HL2 shots, the chainlink fence just sort of disappears with the XTX, but not with the GX2. I took the original shot with AAA, but it happens with just regular AA as well. It is kind of odd once you notice it.
http://www.rage3d.com/board/showpost.php?p=1333964749&postcount=10
http://www.hothardware.com/viewarticle.aspx?page=8&articleid=777&cid=2
If you open each of the standard shots individually and skip through them quickly, you're likely to notice a bit more detail in the shots taken with the GeForce 7800 GTX versus those taken with the Radeon using its standard angular dependant anisotropic filtering mode, disregarding artifacts produced by the JPG compression.

The same seemed to be true when inspecting the 16x aniso images. Of course, image quality analysis is subjective by its nature, but based on these images, we think the GeForce 7800 GTX has the better image quality as it relates to anisotropic filtering when standard "optimized" aniso is used.

It seems to be an issue with an HL2 setting that has yet to be fixed.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Wreckage
Originally posted by: nitromullet
He's talking about some screen shots I posted a while back that compared HQ screenies from a GX2 and an XTX. In one of the HL2 shots, the chainlink fence just sort of disappears with the XTX, but not with the GX2. I took the original shot with AAA, but it happens with just regular AA as well. It is kind of odd once you notice it.
http://www.rage3d.com/board/showpost.php?p=1333964749&postcount=10
http://www.hothardware.com/viewarticle.aspx?page=8&articleid=777&cid=2
If you open each of the standard shots individually and skip through them quickly, you're likely to notice a bit more detail in the shots taken with the GeForce 7800 GTX versus those taken with the Radeon using its standard angular dependant anisotropic filtering mode, disregarding artifacts produced by the JPG compression.

The same seemed to be true when inspecting the 16x aniso images. Of course, image quality analysis is subjective by its nature, but based on these images, we think the GeForce 7800 GTX has the better image quality as it relates to anisotropic filtering when standard "optimized" aniso is used.

It seems to be an issue with an HL2 setting that has yet to be fixed.

We were talking about AA, not AF. Plus, who cares about angle-dependent AF on an X1900 card? I don't think I've ever used it before. Besides, the default on the Radeon is HQ angle-independent, so you actually have to ask for reduced IQ... Why would anyone ever do that?
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: SolMiester
So, what are you guys trying to prove with all of this? It certainly sounds a bit anal to me, like someone's going to notice a 1-10 fps difference when you are over 50 fps?!

Some people just prefer ATI and others NV..... Give it up!

To me, ATI is the follower and NV the innovator. Let's face it, ATI only started getting into 3D performance once 3DFX finished. ATI has gone a different and more complex route with their latest offerings, and until unified shader technology arrives with the next DX10 games, no one's going to know who has got it right.

I'm not going to try and stuff my opinion down anyone else's throat.


That a 7900GTX overclocked to 700/1800 loses to a stock X1900XTX by around 15-25% in the benchmarks shown, once both cards' image quality settings are at "High Quality", is a big deal. Drop that overclock and a 10 fps gap becomes 15+, which means going from 50 fps to 35 fps. That would be noticeable.
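
To make that arithmetic concrete, here is a minimal sketch of how a percentage deficit maps to frame rates. All of the numbers and names in it are illustrative assumptions, not measurements: a hypothetical 50 fps baseline for the stock X1900XTX, the 15-25% deficit range quoted above, and naive linear scaling with core clock when backing the 7900GTX down from its 700 MHz overclock to the 650 MHz stock speed.

```python
# Back-of-the-envelope check of the fps claim above. Illustrative assumptions only:
# a 50 fps stock X1900XTX baseline, the 15-25% deficit quoted in the thread, and
# naive linear scaling with core clock (700 MHz overclock -> 650 MHz stock).

def fps_from_deficit(baseline_fps: float, deficit: float) -> float:
    """Frame rate of a card that trails the baseline by `deficit` (0.25 = 25% slower)."""
    return baseline_fps * (1.0 - deficit)

xtx_fps = 50.0                       # assumed stock X1900XTX frame rate
for deficit in (0.15, 0.20, 0.25):   # deficit range quoted above
    oc_gtx = fps_from_deficit(xtx_fps, deficit)   # overclocked 7900GTX estimate
    stock_gtx = oc_gtx * (650.0 / 700.0)          # crudely remove the core overclock
    print(f"{deficit:.0%} deficit: OC GTX ~{oc_gtx:.1f} fps, stock GTX ~{stock_gtx:.1f} fps")
```

At the 25% end of that range the overclocked card already comes out around 37-38 fps, and backing out the overclock lands near 35 fps, which lines up with the 50-to-35 fps figure above.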

 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: Cookie Monster
Originally posted by: Frackal
Originally posted by: coldpower27
Very old benches. Both cards should have been tested at the drivers' standard default settings, not with optimizations disabled on one card and not the other.

Sure, then ATI can release drivers defaulting to "high performance" and suddenly an XTX is faster than a 7950GX2

So does "high performance" look same to "quality"?

I mean, you are comparing HQ AF of ATi with HQ on NV. Arent those two different things? Even worse if we start doing this, how about TRAA and AAA where AAA actually gets rid of the fencing on HL2? what now?

Then theres the HDR difference between NV and ATi cards in titles like FC, SC CT, Oblivion and etc.

Frackal try to find a more current benchmark. That would be nice.


Read the review and then respond. "HQ AF" is not particularly relevant to the point being made in this review.

Old benchies, are you kidding?

You're telling me that a 7900GTX @ 700/1800, which loses by 15-25% to a stock XTX in nearly every game, is suddenly going to make up that difference with a couple of new driver releases? Give me a break. GTX drivers are more mature than XTX drivers (or should be), since the GTX came out around June of '05 (I know, I just switched from a BFG GTX). The XTX is being tested there on first- or second-iteration drivers since its release, so if anything, one would expect greater performance increases for the XTX than for the GTX. (The 6.6 Cats are a good example of that.)




 

n7

Elite Member
Jan 4, 2004
21,281
4
81
The funny thing is how teh nVidiots find the quality at defaults the same, & those who have actually used both, like the OP, do not ;)

I think that speaks for itself.

Wreckage & co. will continue to spread as much FUD as possible though, since they are horrified at the thought of people seeing the difference.

Now, that being said, I can't really see any difference personally, but I also don't notice ghosting, and while not colorblind at all, I apparently cannot see colors quite as accurately as most, so I'm really not the best judge.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Frackal
Originally posted by: SolMiester
So, what are you guys trying to prove with all of this? It certainly sounds a bit anal to me, like someone's going to notice a 1-10 fps difference when you are over 50 fps?!

Some people just prefer ATI and others NV..... Give it up!

To me, ATI is the follower and NV the innovator. Let's face it, ATI only started getting into 3D performance once 3DFX finished. ATI has gone a different and more complex route with their latest offerings, and until unified shader technology arrives with the next DX10 games, no one's going to know who has got it right.

I'm not going to try and stuff my opinion down anyone else's throat.

That a 7900GTX overclocked to 700/1800 loses to a stock X1900XTX by around 15-25% in the benchmarks shown, once both cards' image quality settings are at "High Quality", is a big deal. Drop that overclock and a 10 fps gap becomes 15+, which means going from 50 fps to 35 fps. That would be noticeable.

:confused: Your article doesn't even back your "facts" up. I don't see the X1900XTX beating the 7900GTX by 15% in CoD2, Q4, or 3DMark06.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: schneiderguy
Originally posted by: Frackal
Originally posted by: SolMiester
So, what are you guys trying to prove with all of this? It certainly sounds a bit anal to me, like someone's going to notice a 1-10 fps difference when you are over 50 fps?!

Some people just prefer ATI and others NV..... Give it up!

To me, ATI is the follower and NV the innovator. Let's face it, ATI only started getting into 3D performance once 3DFX finished. ATI has gone a different and more complex route with their latest offerings, and until unified shader technology arrives with the next DX10 games, no one's going to know who has got it right.

I'm not going to try and stuff my opinion down anyone else's throat.

That a 7900GTX overclocked to 700/1800 loses to a stock X1900XTX by around 15-25% in the benchmarks shown, once both cards' image quality settings are at "High Quality", is a big deal. Drop that overclock and a 10 fps gap becomes 15+, which means going from 50 fps to 35 fps. That would be noticeable.

:confused: Your article doesn't even back your "facts" up. I don't see the X1900XTX beating the 7900GTX by 15% in CoD2, Q4, or 3DMark06.


3DMark06 isn't a game. I already said in the OP that it doesn't win in Q4. CoD2 is 6%, not 10%, in the XTX's favor; the other games tested, FEAR, SS, and X3 Reunion, do show the XTX winning by those margins depending on resolution (I tend to use 1600x1200).

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Frackal
Originally posted by: schneiderguy
Originally posted by: Frackal
Originally posted by: SolMiester
So, what are you guys trying to prove with all of this? It certainly sounds a bit anal to me, like someone's going to notice a 1-10 fps difference when you are over 50 fps?!

Some people just prefer ATI and others NV..... Give it up!

To me, ATI is the follower and NV the innovator. Let's face it, ATI only started getting into 3D performance once 3DFX finished. ATI has gone a different and more complex route with their latest offerings, and until unified shader technology arrives with the next DX10 games, no one's going to know who has got it right.

I'm not going to try and stuff my opinion down anyone else's throat.

That a 7900GTX overclocked to 700/1800 loses to a stock X1900XTX by around 15-25% in the benchmarks shown, once both cards' image quality settings are at "High Quality", is a big deal. Drop that overclock and a 10 fps gap becomes 15+, which means going from 50 fps to 35 fps. That would be noticeable.

:confused: Your article doesn't even back your "facts" up. I don't see the X1900XTX beating the 7900GTX by 15% in CoD2, Q4, or 3DMark06.


3DMark06 isn't a game. I already said in the OP that it doesn't win in Q4. CoD2 is very close; the other games tested, FEAR, SS, and X3 Reunion, do show the XTX winning by those margins depending on resolution (I tend to use 1600x1200).

How does the 7900GTX not win in Q4???? It looks like it wins to me.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Oh, I thought when you said "I already said in the OP that it doesn't win in Q4" you were talking about the 7900GTX.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Most people run Nvidia cards on the Quality setting, because the difference isn't even visible.
I don't believe that's the case; rather, I believe they run those settings because they don't know any better. If they tried HQ for an afternoon or so, they would immediately see how bad Quality is.

Those settings are visibly worse than a default X800XL, and R5xx hardware has even better IQ because of better trilinear and better AF.

Don't get me wrong, my 7900 GTX rocks for many reasons, but let's not pretend Quality is an acceptable mode to play games under.

ooooookay HQ vs Q screenshots as promised
Screenshots are useless because the problem is only visible during movement; you can't see texture wiggling, shimmering and/or crawling in screenshots.

If you go to the bottom of my benchmarks, there are two links to 3DCenter articles that have comparison videos and also show the performance hit, with numbers that jibe with my own.