nVidia vs ATi - Beyond the Benchmarks


CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Originally posted by: xtknight
Originally posted by: CP5670
As far as I know, there is no way to get completely perfect AF on any of the modern cards.

I think it would be impossible for it to be perfect. You can't 'perfectly' map a texture to the ground without having some kind of issue viewing it at a different angle. It's just a matter of how well the cards can do it whilst maintaining a decent framerate. Personally I think what they have now is a perfect balance. I wouldn't be willing to sacrifice any more performance to reduce something that doesn't even bother me most of the time.

I guess by perfect AF I was referring to what the old Geforce 3/4 cards did. I keep an old computer for playing retro DOS and Win95 games and can't see any shimmering effects at all on the Geforce 3 in there, although that card takes a hefty performance hit with any AF.

The high quality modes from both companies look quite acceptable to me in most games, but there are a few games where the shimmering still stands out, so it would be good to have the option. It generally tends to occur in old games (which have generic level geometry), where there is plenty of video card power to spare for whatever performance hit the "perfect" AF might have.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
What's funny is that "Crusader" (crusader of spreading misinformation and bias) is accusing someone of having an agenda. His sig is biased, ignorant, and in some respects plain wrong. Clearly he has had an agenda since he got here. It's really too bad these forums are so old that you can't ignore someone properly.

Originally posted by: hemmy
Originally posted by: Ackmed
Originally posted by: dagabs
Very Interesting.

I didn't realize High Quality removed shimmering; I might give it a go.

It doesn't. It simply reduces it. There is some misinformation being spread that it "fixes" it, when it doesn't.

It depends on the game; in some games there is none.

No, it depends on the hardware the game is played on. Setting the drivers to HQ from Q does not "fix" shimmering in a game where it's easily seen. Big LCDs show shimmering worse than smaller LCDs or any CRT.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: nitromullet
Interesting point and certainly valid, but your results are riddled with inconsistent resolutions and settings. Why would you bother with HQ AF, then not use AA at 1600x1200? I think you're cherry-picking your results to prove your point. Your benchmarks would be far more credible IMO if you just said you ran these benchmarks at 1280x1024, 1600x1200, and 1920x1440 resolutions with 4xAA/16xAF in both Q and HQ modes for NV cards, and presented all the results.

I agree with nitro here. I can definitely see some "cherry picking" happening, but remember, the 7800 was a generation after its competition. The tester wasn't trying to show that an X800XL was better than a 7800GT; he was just showing that when HQ is enabled on an Nvidia card, you can sometimes get results in games that are very close to, if not below, an X800XL's results. People like Crusader get belligerent and begin claiming "the test wasn't right! Who uses a 7800 anymore anyway!?" Well, of course the test isn't completely failsafe, but he wasn't trying to make it so. The X800XL is simply the reference point for showing the performance hit HQ can sometimes bring and the shimmering it reduces. Whether this review threatens you or not, it has opened a lot of people's eyes--both about most other benchmarks and about Nvidia's AF--and that's all I think he wanted to do. It's okay, Crusader, you can still pet your Nvidia logo as you fall asleep tonight knowing that it still proved the victor here.
 

CKXP

Senior member
Nov 20, 2005
926
0
0
I ran a few benchmarks on my 7800GT (490/1200) between Quality and High Quality with no AA/AF.

All tests were done at 1280x1024, except for F.E.A.R. at 1280x960, using FW 84.56 v2.

CSS stress test (Q=182/HQ=176)
HL2 LC stress test (Q=106/HQ=104)

F.E.A.R (Q=83/HQ=81)
min. 50/46
avg. 83/81
max. 178/176

A slight performance drop between the two settings, but nothing dramatic.
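
For what it's worth, the relative hit can be worked out straight from those averages; a quick throwaway Python sketch, using nothing but the numbers above:

```python
# Q vs HQ average FPS from the run above (no AA/AF)
results = {
    "CSS stress test": (182, 176),
    "HL2 LC stress test": (106, 104),
    "F.E.A.R.": (83, 81),
}

for game, (q, hq) in results.items():
    drop = (q - hq) / q * 100  # percent lost going from Quality to High Quality
    print(f"{game}: {q} -> {hq} FPS ({drop:.1f}% drop)")
```

That comes out to roughly a 2-3% drop in every test, which is what "nothing dramatic" looks like in numbers.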
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: CKXP
I ran a few benchmarks on my 7800GT (490/1200) between Quality and High Quality with no AA/AF.

All tests were done at 1280x1024, except for F.E.A.R. at 1280x960, using FW 84.56 v2.

CSS stress test (Q=182/HQ=176)
HL2 LC stress test (Q=106/HQ=104)

F.E.A.R (Q=83/HQ=81)
min. 50/46
avg. 83/81
max. 178/176

A slight performance drop between the two settings, but nothing dramatic.
Exactly. It seems most of us doing our own testing are seeing VERY little impact on performance.

 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
It's probably because you have no AF on. Try it at 16x. The main difference between the Quality and HQ modes is the AF.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: CP5670
It might also be interesting to see how the Geforce 6 series cards stack up. Although I never did any direct comparisons, my old 6800 GT shimmered a lot less than the 7800 GTs I was using until recently. I never really noticed it unless I was specifically looking for it on the 6800 GT, but it stuck out like a sore thumb immediately when I first started a game with the 7800s.
That's a very good point, though I could never really get an answer to it.

When this whole "shimmering" thing started, I was like... "WTF? You guys are whacked." I just never saw it (at least it was no worse than what I found on my ATI card) - but I was using a 6800GT...
 

CKXP

Senior member
Nov 20, 2005
926
0
0
Originally posted by: CP5670
It's probably because you have no AF on. Try it at 16x. The main difference between the Quality and HQ modes is the AF.

Okay, here are the results with 16xAF applied (labeled below alongside the earlier no-AF numbers):

All tests were done at 1280x1024, except for F.E.A.R. at 1280x960, using FW 84.56 v2.

CSS stress test: no AF (Q=182/HQ=176), 16xAF (Q=172/HQ=158)
HL2 LC stress test: no AF (Q=106/HQ=104), 16xAF (Q=99/HQ=89)

F.E.A.R.: no AF (Q=83/HQ=81), 16xAF (Q=78/HQ=72)
min. 50/46 (no AF), 44/40 (16xAF)
avg. 83/81 (no AF), 78/72 (16xAF)
max. 178/176 (no AF), 163/160 (16xAF)


In F.E.A.R. the difference is only marginal, but in CSS and HL2 Lost Coast there is more of a performance loss.
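
Putting the two runs side by side shows where the cost actually comes from; a minimal sketch along the same lines as before, again using only the figures above:

```python
# Q -> HQ percentage drop, without AF and with 16xAF (figures from the two runs above)
no_af = {"CSS": (182, 176), "HL2 LC": (106, 104), "F.E.A.R.": (83, 81)}
af_16x = {"CSS": (172, 158), "HL2 LC": (99, 89), "F.E.A.R.": (78, 72)}

def drop_pct(q, hq):
    return (q - hq) / q * 100

for game in no_af:
    print(f"{game}: {drop_pct(*no_af[game]):.1f}% without AF vs "
          f"{drop_pct(*af_16x[game]):.1f}% with 16xAF")
```

The drop grows from about 2-3% without AF to about 8-10% with 16xAF, which fits the earlier point that the main difference between Q and HQ is how the AF is handled.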

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
ATi also has two optimizations available in Direct3D for both trilinear and anisotropic filtering, but unlike nVidia's respective settings, ATi's versions can safely be enabled with negligible impact on image quality.

While I would take issue with the "negligible" image quality comment, what about Catalyst AI and all of the - using your definition - cheating that it enables by default? Did you disable Catalyst AI before testing the X800XL?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: CaiNaM
Originally posted by: CP5670
It might also be interesting to see how the Geforce 6 series cards stack up. Although I never did any direct comparisons, my old 6800 GT shimmered a lot less than the 7800 GTs I was using until recently. I never really noticed it unless I was specifically looking for it on the 6800 GT, but it stuck out like a sore thumb immediately when I first started a game with the 7800s.
That's a very good point, though I could never really get an answer to it.

When this whole "shimmering" thing started, I was like... "WTF? You guys are whacked." I just never saw it (at least it was no worse than what I found on my ATI card) - but I was using a 6800GT...


All too true. I didn't really notice any shimmering at all. I can't even see it now... lucky me :D
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: BenSkywalker
ATi also has two optimizations available in Direct3D for both trilinear and anisotropic filtering, but unlike nVidia's respective settings, ATi's versions can safely be enabled with negligible impact on image quality.

While I would take issue with the "negligible" image quality comment, what about Catalyst AI and all of the - using your definition - cheating that it enables by default? Did you disable Catalyst AI before testing the X800XL?



Catalyst AI at its low setting has very little to no effect on image quality. Secondly, nVidia has no comparable setting since their drivers are hard-coded for game optimizations, so if he did that, he'd be forcing the ATi card to run in unoptimized mode vs. full optimizations on the nVidia card.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I also wonder why BF2 benchmarks weren't done :?
Because I don't own the game.

The benchmarks are a little off, because he has only chosen one specific resolution per game to prove the X800XL can beat or be close to the 7800GT.
The purpose of the X800XL is to act as a reference point to see how far HQ drops the performance on the 7800GT. You can use this to guess how the respective comparisons of other cards would shift when nVidia is running under high quality.

Even with his testing methods, and High Quality AF, the COD2, UT2004, Riddick, Quake 4 benchmarks clearly show the 7800GT would be a worthy upgrade from the X800XL.
That was never under debate. The article isn't a comparison of an X800XL to a 7800GT on the merits of upgrading; it simply demonstrates the performance hit that nVidia suffers from HQ.

(not to mention we're on 7900GTs these days)
And likewise the 7900GT can be made to act like the previous generation X1800XL when running under high quality.

The simple fact is these results are based on a 7800, so these results are not valid... regardless of my accusations of BFG's preferences. Most everyone will agree this test is flawed, as well as out of date...
Why is it invalid, flawed and/or out of date?

nor does any credible reviewer back it up.
Did you miss the two 3dcenter articles I linked to that discussed the issue more thoroughly?

In case you also missed the benchmarks.

Their Serious Sam 2 scores drop by about the same percentage as mine do, but their Quake 4 score drops by almost one-third! My 11% performance drop is actually very kind to nVidia.
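
To put those percentages in perspective, here is a back-of-the-envelope sketch with a purely hypothetical 60 FPS Quality-mode baseline (not a figure from either article):

```python
# Hypothetical 60 FPS Quality-mode baseline, only to show the scale of an ~11% vs ~33% drop
baseline = 60.0
for label, drop in [("~11% drop", 0.11), ("~33% drop (almost one third)", 1 / 3)]:
    print(f"{label}: {baseline:.0f} -> {baseline * (1 - drop):.1f} FPS")
```

On that baseline, an 11% hit leaves about 53 FPS, while a one-third hit lands at 40 FPS.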

Interesting point and certainly valid, but your results are riddled with inconsistent resolutions and settings.
The reason for this is mainly to show the best playable settings, as I tried to keep the results close to or above 60 FPS whenever possible. Bottom line: if AA isn't enabled, it's because it would have generated a slideshow if it was.

The low FPS in Call of Duty 2 is because I didn't expect the DirectX 9 path to be so slow, and I didn't want to swap the cards and re-test at lower resolutions. Even at 1280x960 it would be lunacy to enable AA in that game on the two cards I tested.

what about Catalyst AI and all of the- using your definition- cheating that they have enabled with that by default? Did you disable Catalyst AI before testing the x800xl?
Hi Ben. :)

The answer to your question is no, I did not disable Catalyst AI, because it has little to no impact on image quality.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
LMAO! The ATI paperboy thinks it was too kind to nV, and the nV crusader thinks it had a pro-ATI agenda. Congrats BFG, you evidently wrote a very balanced review to have both camps unhappy.
Heh heh, yes, whenever I can get both camps to complain I consider it a job well done. :)

I'm just surprised nobody has picked up on my comments about nVidia's superiority with AA. I have some sample patterns saved up in reserve just in case. :p

I agree with nitro here. I can definitely see some "cherry picking" happening, but remember, the 7800 was a generation after its competition. The tester wasn't trying to show that an X800XL was better than a 7800GT; he was just showing that when HQ is enabled on an Nvidia card, you can sometimes get results in games that are very close to, if not below, an X800XL's results. People like Crusader get belligerent and begin claiming "the test wasn't right! Who uses a 7800 anymore anyway!?" Well, of course the test isn't completely failsafe, but he wasn't trying to make it so. The X800XL is simply the reference point for showing the performance hit HQ can sometimes bring and the shimmering it reduces. Whether this review threatens you or not, it has opened a lot of people's eyes--both about most other benchmarks and about Nvidia's AF--and that's all I think he wanted to do. It's okay, Crusader, you can still pet your Nvidia logo as you fall asleep tonight knowing that it still proved the victor here.
josh6079, you get a thumbs up and a cookie from me. :thumbsup: :cookie:

That post of yours perfectly describes what I was trying to achieve when I wrote that article. :)

Edit: spelling mistakes.
 

dagabs

Junior Member
May 17, 2006
21
0
0
I'm just surprised nobody has picked up on my comments about nVidia's superiority with AA. I have some sample patterns saved up in reserve just in case.

Can we see those samples? 'Cause I'm upgrading my 6800GT soon and was leaning towards ATi; I just want to see how the AA compares. Thanks in advance, BFG10K.
 

dagabs

Junior Member
May 17, 2006
21
0
0
What about ATi 4xAA vs NV 4xAA on an image with a lot of jaggies? 'Cause I won't be running 16x, 8x, or 6x AA in any game. 4x maximum.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: dagabs
I'm just surprised nobody has picked up on my comments about nVidia's superiority with AA. I have some sample patterns saved up in reserve just in case.

Can we see those samples? 'Cause I'm upgrading my 6800GT soon and was leaning towards ATi; I just want to see how the AA compares. Thanks in advance, BFG10K.

I went from a GT to an XT, and honestly, if you're not running an older title that has the performance overhead to allow for 8xS, the AA quality is comparable between the two.

HQ AF, on the other hand, doesn't have that huge performance impact and is usable in everything - and it's a significant upgrade in IQ from the GT...
 

Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
Good read. People are too defensive; it was just a simple study - not an ATI vs Nvidia war.