Quantitative and qualitative comparison of overclocked ATi X1900XT and nVidia 7900GT


5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: ST
Image quality assessments up on 2nd post in thread.


I'm still waiting for an explanation of why you didn't use HQ AF when testing the 7900 GT when you did use it for the X1900. More importantly, I want to see what kind of hit the 7900 GT takes in Oblivion with HQ AF enabled, though I already have a pretty good idea. BTW, those shimmer shots you posted don't really show any shimmering for either card from what I could see (though I'm using my laptop screen atm). A better example to highlight moire/shimmer would be to load up HL2 and go to the prison level where the tiles are - there's a very noticeable difference in moire/shimmer between ATi and nV there. The best way to catch shimmering, though, is a small uncompressed FRAPS video, since it's typically seen during movement and not in still shots.
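Shimmer can also be quantified rather than just eyeballed. A minimal sketch, assuming a set of consecutive uncompressed frame grabs taken from a fixed camera position and that NumPy and Pillow are available (the filename pattern is hypothetical):

```python
# Rough shimmer metric: mean absolute per-pixel change between consecutive
# frames captured from a stationary camera. A static scene should score
# near zero; texture crawl/shimmer shows up as a persistent nonzero delta.
# Assumes frames were grabbed as frames/frame_000.png, frame_001.png, ...
import glob

import numpy as np
from PIL import Image

def shimmer_score(pattern="frames/frame_*.png"):
    paths = sorted(glob.glob(pattern))
    prev, deltas = None, []
    for path in paths:
        frame = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
        if prev is not None:
            deltas.append(np.abs(frame - prev).mean())
        prev = frame
    return float(np.mean(deltas))  # higher = more temporal aliasing

print(f"mean frame-to-frame delta: {shimmer_score():.3f}")
```

Running the same fly-through on both cards and comparing scores would give a number to argue about instead of screenshots.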
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
But it just so happens that NVIDIA shimmers a bit more

Sorry, but depending on hardware, that is simply false. What monitor are you using? Yes, both shimmer, but NV shimmers much worse on my 2405FPW - just like HardOCP said.

 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: 5150Joker
Originally posted by: ST
Image quality assessments up on 2nd post in thread.


I'm still waiting for an explanation of why you didn't use HQ AF when testing the 7900 GT when you did use it for the X1900. More importantly, I want to see what kind of hit the 7900 GT takes in Oblivion with HQ AF enabled, though I already have a pretty good idea. BTW, those shimmer shots you posted don't really show any shimmering for either card from what I could see (though I'm using my laptop screen atm). A better example to highlight moire/shimmer would be to load up HL2 and go to the prison level where the tiles are - there's a very noticeable difference in moire/shimmer between ATi and nV there. The best way to catch shimmering, though, is a small uncompressed FRAPS video, since it's typically seen during movement and not in still shots.


"ATI as well as NVIDIA has some decent implentations on their standard Anistropic Filtering as depicted below. One thing to note is that ATI and NVIDIA do have different different image quality modes for an even better visual experience. On the 7900GT, the difference between Performance and Quality setting is readily apparent as I noted some unusual image quality glitches with the Performance setting I did not readily see on ATI's AF solution. Stepping up to the highest High Quality setting though doesn't yield a discernable visual impact from Quality. This is one of the reasons I left the performance benchmarks to Quality settings on the 7900GT. The same holds true for ATI's High Quality (HQ) setting: I could not distinguish any visible improvements, even at angles, which ATI trumps. One thing though that really tips the scales in ATI's favor is the performance impact of these settings. NVIDIAs G71 GPU imposes a framerate loss when set to higher AF modes, with up to a 20% loss from lowest (Performance) to highest (High Quality) visual quality setting, but still has an acceptable FPS at the higher (Quality) setting. Probably because of the R580's advanced 48 pixel shader pipeline, there was hardly ANY discernable framerate difference between the various AF configurations at all! "

You are correct, shimmering isn't readily shown in still captures. In motion, there is a discernible difference in visual quality that I saw on the City Isle bridge.
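For reference, the 20% figure quoted above is just a relative FPS delta between two driver settings; a trivial sketch (the 60/48 FPS values below are hypothetical placeholders, not numbers from the review):

```python
# Relative performance loss between two driver AF settings,
# computed from average FPS. Values are hypothetical placeholders.
def af_penalty(fps_low_setting: float, fps_high_setting: float) -> float:
    """Fractional FPS loss going from the lower to the higher IQ setting."""
    return (fps_low_setting - fps_high_setting) / fps_low_setting

# e.g. 60 FPS at Performance vs 48 FPS at High Quality -> 20% loss
print(f"{af_penalty(60.0, 48.0):.0%}")
```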
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: ST
Originally posted by: 5150Joker
Originally posted by: ST
Image quality assessments up on 2nd post in thread.


I'm still waiting for an explanation of why you didn't use HQ AF when testing the 7900 GT when you did use it for the X1900. More importantly, I want to see what kind of hit the 7900 GT takes in Oblivion with HQ AF enabled, though I already have a pretty good idea. BTW, those shimmer shots you posted don't really show any shimmering for either card from what I could see (though I'm using my laptop screen atm). A better example to highlight moire/shimmer would be to load up HL2 and go to the prison level where the tiles are - there's a very noticeable difference in moire/shimmer between ATi and nV there. The best way to catch shimmering, though, is a small uncompressed FRAPS video, since it's typically seen during movement and not in still shots.


"ATI as well as NVIDIA has some decent implentations on their standard Anistropic Filtering as depicted below. One thing to note is that ATI and NVIDIA do have different different image quality modes for an even better visual experience. On the 7900GT, the difference between Performance and Quality setting is readily apparent as I noted some unusual image quality glitches with the Performance setting I did not readily see on ATI's AF solution. Stepping up to the highest High Quality setting though doesn't yield a discernable visual impact from Quality. This is one of the reasons I left the performance benchmarks to Quality settings on the 7900GT. The same holds true for ATI's High Quality (HQ) setting: I could not distinguish any visible improvements, even at angles, which ATI trumps. One thing though that really tips the scales in ATI's favor is the performance impact of these settings. NVIDIAs G71 GPU imposes a framerate loss when set to higher AF modes, with up to a 20% loss from lowest (Performance) to highest (High Quality) visual quality setting, but still has an acceptable FPS at the higher (Quality) setting. Probably because of the R580's advanced 48 pixel shader pipeline, there was hardly ANY discernable framerate difference between the various AF configurations at all! "

You are correct, shimmering isn't readily shown in still captures. In motion, there is a discernible difference in visual quality that I saw on the City Isle bridge.



You're doing an apples to apples comparison, yet you claim you cannot see a quality difference between HQ AF and Q AF on the nVidia card and thus don't benchmark with it? What kind of shoddy reviewing is that? You claim a 20% loss but don't provide any real numbers to back up that theory - you need to provide actual FRAPS numbers so others can verify your results. Furthermore, at the end of your review you should post your save files for Oblivion so others can validate or refute your findings.
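Publishing raw FRAPS logs would make verification mechanical. A minimal sketch, assuming FRAPS's per-run "frametimes" CSV, which (as far as I recall) lists one cumulative timestamp in milliseconds per rendered frame; the filename and header layout are assumptions:

```python
# Compute average/min/max FPS from a FRAPS "frametimes" log, which
# (assumption) holds one cumulative frame-end time in ms per row.
import csv

def fps_stats(path="oblivion frametimes.csv"):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    # Skip the header row; column 1 is the cumulative time in ms (assumed).
    times_ms = [float(r[1]) for r in rows[1:]]
    frame_ms = [b - a for a, b in zip(times_ms, times_ms[1:])]
    avg = 1000.0 * len(frame_ms) / (times_ms[-1] - times_ms[0])
    return avg, 1000.0 / max(frame_ms), 1000.0 / min(frame_ms)

avg, fps_min, fps_max = fps_stats()
print(f"avg {avg:.1f} / min {fps_min:.1f} / max {fps_max:.1f} FPS")
```

With the logs and save files posted, anyone could rerun this and confirm or refute the averages.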
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: 5150Joker

You're doing an apples to apples comparison, yet you claim you cannot see a quality difference between HQ AF and Q AF on the nVidia card and thus don't benchmark with it? What kind of shoddy reviewing is that? You claim a 20% loss but don't provide any real numbers to back up that theory - you need to provide actual FRAPS numbers so others can verify your results. Furthermore, at the end of your review you should post your save files for Oblivion so others can validate or refute your findings.

NV's quality settings do not correlate directly with ATI's, and vice versa. There will be an appreciable difference between the settings, even at the default Performance. I tried to make the image quality comparable between the two, and since ATI's HQ solution did not yield a performance delta, I kept it as such (this is actually a positive of the ATI solution that I pointed out). The actual FRAPS numbers are posted in the results. You're welcome to validate or refute the findings at your leisure.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: ST
Originally posted by: 5150Joker

You're doing an apples to apples comparison, yet you claim you cannot see a quality difference between HQ AF and Q AF on the nVidia card and thus don't benchmark with it? What kind of shoddy reviewing is that? You claim a 20% loss but don't provide any real numbers to back up that theory - you need to provide actual FRAPS numbers so others can verify your results. Furthermore, at the end of your review you should post your save files for Oblivion so others can validate or refute your findings.

NV's quality settings do not correlate directly with ATI's, and vice versa. There will be an appreciable difference between the settings, even at the default Performance. I tried to make the image quality comparable between the two, and since ATI's HQ solution did not yield a performance delta, I kept it as such (this is actually a positive of the ATI solution that I pointed out). The actual FRAPS numbers are posted in the results. You're welcome to validate or refute the findings at your leisure.


It doesn't matter if nV's HQ AF matches ATi's or not. The point of a direct comparison is that you set both cards to their maximum AF quality and then compare the results. What you did is set up a "best playable settings" comparison and then pass judgement on it as if it were an apples to apples comparison - that's shoddy reviewing. What's wrong? Did nVidia take too much of a hit and make your "david" look bad, so you left that out? Oh, and I can't validate your findings if you don't post the save files and the exact methodology that goes with them.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Stepping up to the highest High Quality setting, though, doesn't yield a discernible visual improvement over Quality. This is one of the reasons I kept the performance benchmarks at the Quality setting on the 7900GT
I'm sorry, but this is just plain nonsense. At the default quality settings nVidia's cards have horrendous shimmering in many games, spanning old and new titles. Even if you untick all three optimizations you can still get wiggling textures in games like Serious Sam. The only solution is to use high quality, but be prepared for a performance drop.

On a related note, most online benchmarks of the 6xxx and 7xxx series of cards are invalid for this very reason. The shimmering with default driver quality is unplayably bad in many games.
 

SilentRunning

Golden Member
Aug 8, 2001
1,493
0
76
Originally posted by: ST

On the 7900GT, the difference between the Performance and Quality settings is readily apparent, as I noted some unusual image quality glitches with the Performance setting that I did not readily see on ATI's AF solution. Stepping up to the highest High Quality setting, though, doesn't yield a discernible visual improvement over Quality. This is one of the reasons I kept the performance benchmarks at the Quality setting on the 7900GT. The same holds true for ATI's High Quality (HQ) setting: I could not distinguish any visible improvements, even at angles, where ATI trumps NVIDIA. One thing, though, that really tips the scales in ATI's favor is the performance impact of these settings. NVIDIA's G71 GPU imposes a framerate loss when set to higher AF modes, with up to a 20% loss from the lowest (Performance) to the highest (High Quality) visual quality setting, but still has an acceptable FPS at the higher (Quality) setting. Probably because of the R580's advanced 48 pixel shader pipeline, there was hardly ANY discernible framerate difference between the various AF configurations at all!



NV's quality settings do not correlate directly with ATI's, and vice versa. There will be an appreciable difference between the settings, even at the default Performance. I tried to make the image quality comparable between the two, and since ATI's HQ solution did not yield a performance delta, I kept it as such (this is actually a positive of the ATI solution that I pointed out). The actual FRAPS numbers are posted in the results. You're welcome to validate or refute the findings at your leisure.

You don't really believe what you are saying?

Card A - I see no image quality difference in AF settings but I see a performance hit with higher settings so I choose to test it at a lower setting

Card B - I see no image quality difference in the AF settings but the performance penalty of going higher is not much so I choose to test it at a higher setting.


Or is that not what you just said?
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: BFG10K
Stepping up to the highest High Quality setting, though, doesn't yield a discernible visual improvement over Quality. This is one of the reasons I kept the performance benchmarks at the Quality setting on the 7900GT
I'm sorry, but this is just plain nonsense. At the default quality settings nVidia's cards have horrendous shimmering in many games spanning old and new titles. Even if you untick all three optimizations you can still get wiggling textures in games like Serious Sam. The only solution is to use high quality but be prepared for a performance drop.

If you could show this discernible visual delta in Oblivion, I would welcome it. I noted the idiosyncrasies I saw, and the delta between the Quality and High Quality settings, accordingly, with pics to boot.

 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: SilentRunning
You don't really believe what you are saying?

Card A - I see no image quality difference in AF settings but I see a performance hit with higher settings so I choose to test it at a lower setting

Card B - I see no image quality difference in the AF settings but the performance penalty of going higher is not much so I choose to test it at a higher setting.


Or is that not what you just said?

Switching between ATI's HQ AF and non-HQ AF has zero impact on performance. I have tested this on MANY occasions. I could just as easily switch the numbers and post only the non-HQ numbers (as I did on some occasions) and they would be virtually equal. But I noted no discernible advantage in visual quality with HQ AF either.

Again, you're welcome to check it yourself. The methodology and settings are pretty self-explanatory.
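A "zero impact" claim is itself checkable: a small sketch that calls two settings virtually equal only when the gap between their mean FPS sits inside the run-to-run spread. All numbers are illustrative, not measurements from the review:

```python
# Decide whether two AF settings perform "virtually equal" by comparing
# the gap between their mean FPS to the spread within repeated runs.
from statistics import mean, stdev

def virtually_equal(runs_a, runs_b):
    gap = abs(mean(runs_a) - mean(runs_b))
    noise = max(stdev(runs_a), stdev(runs_b))
    return gap <= noise  # within run-to-run variance

hq_af  = [51.8, 52.4, 52.1]  # e.g. X1900XT, HQ AF on (made-up numbers)
std_af = [52.0, 52.6, 51.9]  # e.g. X1900XT, HQ AF off (made-up numbers)
print(virtually_equal(hq_af, std_af))  # True: the delta is inside the noise
```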
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: ST
Originally posted by: SilentRunning
You don't really believe what you are saying?

Card A - I see no image quality difference in AF settings but I see a performance hit with higher settings so I choose to test it at a lower setting

Card B - I see no image quality difference in the AF settings but the performance penalty of going higher is not much so I choose to test it at a higher setting.


Or is that not what you just said?

Switching between ATI's HQ AF and non-HQ AF has zero impact on performance. I have tested this on MANY occasions. I could just as easily switch the numbers and post only the non-HQ numbers (as I did on some occasions) and they would be virtually equal. But I noted no discernible advantage in visual quality with HQ AF either.

Again, you're welcome to check it yourself. The methodology and settings are pretty self-explanatory.


Since the intent of this "review" was to do a direct comparison between the two cards, resorting to this subjective "best playable settings" method to give nVidia higher numbers just proves to us that you clearly have an agenda; either that or you just don't have a clue.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: 5150Joker

It doesn't matter if nV's HQ AF matches ATi's or not. The point of a direct comparison is that you set both cards to their maximum AF quality and then compare the results. What you did is set up a "best playable settings" comparison and then pass judgement on it as if it were an apples to apples comparison - that's shoddy reviewing. What's wrong? Did nVidia take too much of a hit and make your "david" look bad, so you left that out? Oh, and I can't validate your findings if you don't post the save files and the exact methodology that goes with them.

Interesting - so would you compare ATI's non-HQ settings to NV's Performance setting instead, which would then heavily favor the 7900GT? The numbers (as well as the settings, methodology, etc.) are again embedded in the performance review, and you're welcome to challenge them at your leisure. I guess you didn't enjoy the heavy sarcasm in the post about David, Goliath, and Godzilla. :(
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
Great review... It seems the only people who didn't like it are the ones with an obvious ATi or nVidia agenda - pretty typical. This place has really gone downhill over the years. If you guys think you can do a better job, write your own review.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: ST
Originally posted by: 5150Joker

It doesn't matter if nV's HQ AF matches ATi's or not. The point of a direct comparison is that you set both cards to their maximum AF quality and then compare the results. What you did is set up a "best playable settings" comparison and then pass judgement on it as if it were an apples to apples comparison - that's shoddy reviewing. What's wrong? Did nVidia take too much of a hit and make your "david" look bad, so you left that out? Oh, and I can't validate your findings if you don't post the save files and the exact methodology that goes with them.

Interesting - so would you compare ATI's non-HQ settings to NV's Performance setting instead, which would then heavily favor the 7900GT? The numbers (as well as the settings, methodology, etc.) are again embedded in the performance review, and you're welcome to challenge them at your leisure. I guess you didn't enjoy the heavy sarcasm in the post about David, Goliath, and Godzilla. :(



I don't care if you add perf. vs. perf. numbers. The problem is that you purposely neglected to use HQ AF on the 7900 GT because you probably noticed its framerate took a nosedive, so you came up with your "no discernible IQ difference between Quality and HQ AF" b.s. So much for your unbiased "review".
 

SilentRunning

Golden Member
Aug 8, 2001
1,493
0
76
Originally posted by: extra
Great review... It seems the only people who didn't like it are the ones with an obvious ATi or nVidia agenda - pretty typical. This place has really gone downhill over the years. If you guys think you can do a better job, write your own review.


Well, the AEG crusade to spread crap in the forum left a lot of wreckage behind.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: 5150Joker
I don't care if you add perf. vs. perf. numbers. The problem is that you purposely neglected to use HQ AF on the 7900 GT because you probably noticed its framerate took a nosedive, so you came up with your "no discernible IQ difference between Quality and HQ AF" b.s. So much for your unbiased "review".

Purposely, eh? Note that I cited it in the final text and comments as one of the downfalls of the 7900GT. Guess I'm biased toward ATI then? :( Can I at least get paid like you, please?! ;)


 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: ST
Originally posted by: 5150Joker
I don't care if you add perf. vs. perf. numbers. The problem is that you purposely neglected to use HQ AF on the 7900 GT because you probably noticed its framerate took a nosedive, so you came up with your "no discernible IQ difference between Quality and HQ AF" b.s. So much for your unbiased "review".

Purposely, eh? Note that I cited it in the final text and comments as one of the downfalls of the 7900GT. Guess I'm biased toward ATI then? :(


Once again you sidestep the issue. You left out the HQ AF numbers due to a ridiculous subjective claim. Those are the facts.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: 5150Joker
Once again you sidestep the issue. You left out the HQ AF numbers due to a ridiculous subjective claim. Those are the facts.

"Preface: Although there have been vast comparisons between X1900XT and 7900GT offerings, most of them have involved just plain performance numbers without much insight into image quality, which can be equally important. This brief post does not attempt to corral the endless number of configurations on overclocks and image options that are available for both solutions, but rather attempt to classify the performance from a visual standpoint in terms of framerates and quality with the more simplistic and achievable clocks and settings."

Note I left out ATI's AAA numbers as well, which were anemic at best, and that's being kind. I will, however, reconsider looking at it again if you PayPal me some of your ATI money. ;)
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: ST
Originally posted by: 5150Joker
Once again you sidestep the issue. You left out the HQ AF numbers due to a ridiculous subjective claim. Those are the facts.

"Preface: Although there have been vast comparisons between X1900XT and 7900GT offerings, most of them have involved just plain performance numbers without much insight into image quality, which can be equally important. This brief post does not attempt to corral the endless number of configurations on overclocks and image options that are available for both solutions, but rather attempt to classify the performance from a visual standpoint in terms of framerates and quality with the more simplistic and achievable clocks and settings."

Note I left out ATI's AAA numbers as well, which were anemic at best, and that's being kind. I will, however, reconsider looking at it again if you PayPal me some of your ATI money. ;)


Oh, I think your nVidia sponsors pay you enough; I don't need to PayPal you anything. ATi's AAA numbers wouldn't be any more "anemic" than nVidia's if you used TRSSAA, since Oblivion uses alpha textures heavily outdoors. Your thread title says "Quantitative and qualitative comparison..". Clearly that's b.s., because you're leaving out crucial information to put the nVidia card in a better light - I already expected b.s. from you, but now everyone can see it for themselves.
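For background on why alpha textures shimmer without AAA/TRSSAA: alpha testing makes a single binary keep/discard decision per pixel, while transparency supersampling averages several alpha tests across the pixel into fractional coverage. A toy sketch of the difference (the 1-D "texture" and sample offsets are made up):

```python
# Toy illustration of alpha-test aliasing: one alpha test per pixel yields
# hard 0/1 coverage, while supersampling the test (what AAA/TRSSAA do)
# yields fractional coverage that ramps smoothly as the texture shifts
# under the pixel. All values are made up for illustration.

def alpha(u: float) -> float:
    # Fake 1-D "leaf texture": opaque on the left, transparent on the right.
    return 1.0 if u < 0.5 else 0.0

def alpha_test(u: float) -> float:
    return 1.0 if alpha(u) > 0.5 else 0.0          # one binary decision

def supersampled(u: float, n: int = 4, footprint: float = 0.1) -> float:
    # Average n alpha tests spread across the pixel footprint.
    samples = [alpha(u + footprint * (i / (n - 1) - 0.5)) for i in range(n)]
    return sum(samples) / n

for u in (0.44, 0.48, 0.52, 0.56):
    print(f"u={u}: test={alpha_test(u):.2f}  ssaa={supersampled(u):.2f}")
# The binary test snaps from 1 to 0; the supersampled coverage ramps down.
```

That snapping is what reads as crawl on foliage and fences in motion, which is why leaving transparency AA off on one card but not the other skews an IQ comparison.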
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: 5150Joker
Oh, I think your nVidia sponsors pay you enough; I don't need to PayPal you anything. ATi's AAA numbers wouldn't be any more "anemic" than nVidia's if you used TRSSAA, since Oblivion uses alpha textures heavily outdoors. Your thread title says "Quantitative and qualitative comparison..". Clearly that's b.s., because you're leaving out crucial information to put the nVidia card in a better light - I already expected b.s. from you, but now everyone can see it for themselves.

I'm sorry to hear about your paranoia... I guess all the reviews, including AT's own, are B.S. paid for by nVidia :( You're ALWAYS welcome to buy a 7900GT yourself or to inquire with other folks ANYTIME. I have newfound respect for ATI's solution from doing exactly that myself... too bad their paid BSers still have no clue. It's rather interesting that you keep commenting on nVIDIA, yet won't even touch the question of why ATI's HQ AF doesn't do much in Oblivion... interesting.
 

Dman877

Platinum Member
Jan 15, 2004
2,707
0
0
Except every hardware review site shows the X1900XT dominating the competition...
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
ST, you were a total NVIDIA fanboy a month ago. You had an NVIDIA sig, and your first thread on this b.s. was typed in NVIDIA's favor, which you edited after you were flamed badly. Why should I even believe you did fair testing?
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: 5150Joker

I don't care if you add perf. vs. perf. numbers. Problem is you purposely neglected to use HQ AF on the 7900 GT because you probably noticed it's framerate took a nosedive so you came up with your "no discernable IQ difference between quality and HQ AF" b.s. So much for your unbiased "review".

How can you dismiss ST's review as b.s. when you don't have any first-hand experience with these cards?