ATI cheating in AF?


ChkSix

Member
May 5, 2004
192
0
0
You can see at the bottom of your browser (left-hand side) which is which. The dimmer "Objectives Updated" message is the R420 (the transition is worse), and the NV40 has the brighter "Objectives Updated" message (smoother transition).
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
It actually seems like the R420 has the IQ edge at long distances; look at the trees and the fence.

Also, you can't really see brilinear unless you are in motion, and grass isn't really a good surface to look for banding on anyway.
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: ChkSix
Originally posted by: Ackmed
You're obviously ignorant about what is going on.

http://www.beyond3d.com/forum/viewtopic.php?t=12587

Open your eyes, and stop trying to be so biased.

Here is a much better example showing the clear differences between the R420 and NV40 filtering.

http://www.reflectonreality.com/nv40r420/

alp
Is this http://www.ixbt.com/video2/images/r420xt/r420-anis0x.jpg (copy and paste it) bilinear or bri/trilinear as it is supposed to be? I heard it is possibly a bug in CoD causing it to set filtering to bilinear rather than a really bad trilinear filtering method. Is this true?

Andy/Raja
This we believe is tester error, and the X800 images appear to have been obtained using only a bilinear filter. We have been unable to reproduce this internally. Also, note that the game defaults to bilinear when a new card is installed, and this may explain the tester error.


From the interview today... (my bolding of comments; note that the image is the same one being used in the comparison)
edit: I should also add that CoD is a game that I still can't run perfectly on my AIW 9700 Pro unless I use the 3.10 Cats. ATI has had quite a few problems running CoD properly, so this may not be the best game to test AF quality on.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Okay fine, but if it was a legitimate optimization why didn't they include it in the feature listing or tell us of their "advancement" before giving people a chance to bitch about it?
We've known about adaptive AF since the original Radeon days, so it's nothing new. Likewise, nVidia never told us about their adaptive AF in the FX series, yet we figured it out. That's why it pays to do some research before buying cards.

I don't, but the degradation on the FX series is much less than on the R3.xx/R420 series
The degradation at certain angles is less. Likewise, ATi's version looks better at other angles because they are taking up to double the number of samples.

Remember that with 360 degrees across three axes, you get 360 × 360 × 360 = 46,656,000 possible planes on which to analyze filtering. It's foolish to just take two or three colourwheel screenshots and use them as the end-all for judging filtering quality.

We can see that the 90 degree angle portions have the full level of filtering applied, while the 22 degree portions are left at about 8x; the rest of the filtering appears to be at about 2x.
That isn't correct even by his own evidence. I did make two typos in my post so here is the correct version:

Full-strength AF is applied at 90-degree increments, and it then degrades as the angle moves away from them until 2xAF is applied at the 22.5-degree increments, rising again as it approaches the next 90-degree increment. Basically it's the R200 behaviour split in half, with the Z rotation issue fixed.
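
(For illustration, here's a toy Python model of that kind of angle-dependent AF. The 45-degree period follows the "R200 split in half" reading above, and the linear falloff between full AF and 2x is purely my assumption; neither vendor has published its actual selection curve.)

def adaptive_af_degree(angle_deg: float, max_af: int = 16, min_af: int = 2) -> float:
    """Toy model of angle-dependent (adaptive) AF.

    Full-strength AF at multiples of 45 degrees, degrading linearly to
    min_af at the 22.5-degree offsets in between. The linear ramp is an
    illustrative assumption, not either vendor's real algorithm.
    """
    # Angular distance to the nearest multiple of 45 degrees (0..22.5).
    dist = 22.5 - abs((angle_deg % 45.0) - 22.5)
    t = dist / 22.5  # 0 -> on a 45-degree increment, 1 -> worst-case offset
    return max_af + (min_af - max_af) * t

if __name__ == "__main__":
    for a in (0, 10, 22.5, 45, 67.5, 90):
        print(f"{a:5.1f} deg -> {adaptive_af_degree(a):4.1f}x AF")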

BFG doesn't apply the same standards to both companies.
Yes I do.

Now he says adaptive AF is fine, no problem. When he only knew nVidia was doing it, it was a "cheat". See my quotes/links if you don't believe me, he said it, not me.
You are talking utter BS, Rollo. Even without looking at my comments, we can logically deduce that what you said was false, because ATi have been performing adaptive AF since the original Radeon.

I can dig up more quotes.
Go for it. Dig up what you like. You will never find me complaining about adaptive AF, except maybe in the R100/R200 days, because that implementation had visible IQ problems at times. But then again, I always singled out nVidia's poor performance on the NV25, so that's hardly grounds to call me a zealot in either camp.

As for trilinear, my complaints in the past have been about nVidia's brilinear, not the adaptive version that is being discussed here.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: ChkSix
Originally posted by: Ackmed
You're obviously ignorant about what is going on.

http://www.beyond3d.com/forum/viewtopic.php?t=12587

Open your eyes, and stop trying to be so biased.

Here is a much better example showing the clear differences between the R420 and NV40 filtering.

http://www.reflectonreality.com/nv40r420/


Once again, games are what matters. Not some synthetic test. You can't tell the difference in games, so what does it matter?
 

webmal

Banned
Dec 31, 2003
144
0
0
Originally posted by: Ackmed

Once again, games are what matters. Not some synthetic test. You can't tell the difference in games, so what does it matter?


Ahmed, YOU missed the point [yet another dumbass].
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
BFG:
So you're saying there's some big difference in ATI and nVidia's filtering based on that marketing spin posted above? That you believe them when they say "it's true trilinear, no image degradation" even after seeing the articles showing image degradation?

I can't say I'm surprised.

Did you hear the X800 cured cancer and implemented peace in the Middle East as well?
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: ChkSix
My response: Engineers don't write those .pdfs; they build and design hardware and drivers. And once a .pdf is written, it is usually checked and rechecked by the company's lawyers for legal purposes before it ever makes it out the front door and into the hands of the consumer.

Want to bet on that one? :D (looking at that bloody Adobe icon in Word and those documentum links in the menu bar) The only times a lawyer reviewed my documentation was for RFPs and security notices. Of course, I don't design hardware, I just abuse it.

Cut and paste rules, but copy and rename is even better! :D

Relevance to this discussion?
As Connie Booth said, "But it was my only line." (The Mattress, errr, Dog Kennel sketch from Monty Python's Flying Circus.)
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Yes Ackmed, the POINT is ATI is doing things faster with no penalty to IQ, therefore they are cheating because NVidia's attempt to do the same thing didn't net them as much of a performance gain. Just accept it and move on! :p How dare you buy a bleeding edge card to play GAMES. Tsk tsk. I'm using mine to make waffles. Tasty tasty waffles.

Sarcasm aside, this discussion is pretty ridiculous. Even with all the "you said, I said" BS set aside, people are forgoing rational thought in order to "be right" or some such nonsense.
Someone should do an IQ comparison between the X800 Pro/XT and the 9800 Pro/XT with trilinear AF on, and make an honest comparison of the images. No mucking about with Call of Duty, which seems to have caught some off guard with its resetting-to-bilinear-by-default after a card has been switched. Surely there are people on this forum who have both cards (CaiNaM?) and could do this sort of test. Until someone does that, I fail to see how this discussion can constructively move forward. It seems to me that this is mostly being made into a huge issue by people who haven't seen reliable comparisons and/or haven't fronted the cash to pick up one of these cards.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
In that one picture there are clear differences in the mipmap boundaries. The ATI card is not doing trilinear.
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
Actually, I remember back when BFG10K considered ATI cards junk. Since the 9700 came to life he seems to like ATI. He does not seem to be a fanboy at all; he simply prefers ATI now because he considers it the better tech at this time.


There are a lot of people in here now who claim the benchmarks are invalid because comparisons are not "apples to apples" anymore. Well, here is a shocker: there have never been apples-to-apples comparisons, EVER! Nvidia and ATI both implement different optimizations and LOD, and always have.

IMO, Nvidia lost their way after the Ti series of cards. The current NV40 regained a lot of lost ground on ATI, but it still has a way to go, not due to speed or features, but to overall implementation. The NV40 is heavy, hot, and thirsty right now. The NV45 may correct the NV40's shortcomings, and I hope it does. Right now, though, ATI seems to get the speed and quality with a lot less effort than the Nvidia product.
 

imported_Aelius

Golden Member
Apr 25, 2004
1,988
0
0
Well, after all that has been said and done, I'm still not convinced that I shouldn't buy the X800 XT from ATI.

Unless something a little more earth-shattering occurs, I will likely go ahead with my purchase in a month or two, once socket 939 boards and CPUs are out, along with better RAM and hopefully lower RAM prices.

One can hope, no?
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: webmal
Originally posted by: Ackmed

Once again, games are what matters. Not some synthetic test. You can't tell the difference in games, so what does it matter?


Ahmed, YOU missed the point [yet another dumbass].

No, I didn't. You can't tell the difference in games.

But you are getting no respect for your opinion when you act like a child with your name-calling.
 

TStep

Platinum Member
Feb 16, 2003
2,460
10
81
I'm not a wizard on AF, just an end user. I checked out the screenshots of the R420 vs NV40.

I can notice the slightly sharper transition in the grass at about mid-screen height on the R420. From what I understand, this smoothing is what trilinear AF is all about.

But I have two questions:

1) Why is there less detail in the leaves on the trees in the distance with the NV40? Does this have to do with AF?

2) Why is the foreground on the R420 sharper than on the NV40? I had thought AF was only for smoothing distant objects in the screenshot. It is really noticeable in the surface details on the side of the fence post in the lower right corner of the screenshot.

Without continuing a flame war, I am just looking for answers to these questions. It seems to me that the R420 is more detailed overall and the NV40 is slightly blurred over the whole screen, not just in the distance. What causes this, or is this each manufacturer's interpretation of the game? If the NV40 has a blurrier foreground to begin with, will it not transition into the background less sharply?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: TStep
I'm not a wizard on AF, just an end user. I checked out the screenshots of the R420 vs NV40.

I can notice the slightly sharper transition in the grass at about mid-screen height on the R420. From what I understand, this smoothing is what trilinear AF is all about.

But I have two questions:

1) Why is there less detail in the leaves on the trees in the distance with the NV40? Does this have to do with AF?

2) Why is the foreground on the R420 sharper than on the NV40? I had thought AF was only for smoothing distant objects in the screenshot. It is really noticeable on the side of the fence post in the lower right corner of the screenshot.

It looks (although I can't be sure, because I don't have one) like the R420 shots in that comparison are actually using bilinear filtering, not trilinear. This would explain the graphical differences in the textures. ATI apparently can't reproduce this effect (that very sharp texture transition you can see), and nobody on here with an R420 has posted anything like it either.

Also note that AF is separate from bilinear/trilinear texture filtering. You will see slight differences between NVIDIA and ATI's AF, since their adaptive algorithms are not identical. AF helps mostly with surfaces that are tilted relative to the viewer -- these are often distant surfaces, but it can help on close-up textures as well.
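
(If it helps untangle the two, here's a minimal Python sketch. Trilinear blends between the two mip levels bracketing the computed LOD, while AF takes extra samples when a surface is tilted relative to the viewer. The function names and the 1/cos footprint model are my own illustrative assumptions, not either vendor's actual pipeline.)

import math

def trilinear_blend(lod: float) -> tuple[int, int, float]:
    """Trilinear filtering: blend the two mip levels bracketing the LOD.

    Returns (lower mip, upper mip, blend weight). Plain bilinear would
    snap to a single mip level instead, which is what produces the
    visible mip boundaries discussed in this thread.
    """
    lo = math.floor(lod)
    return lo, lo + 1, lod - lo

def af_sample_count(tilt_deg: float, max_af: int = 16) -> int:
    """Anisotropic filtering is a separate mechanism: the more a surface
    is tilted away from the viewer, the more elongated the pixel's
    texture footprint, and the more samples are taken along its major
    axis. The 1/cos(tilt) footprint estimate is a simplification used
    only for illustration.
    """
    stretch = 1.0 / max(math.cos(math.radians(tilt_deg)), 1e-6)
    return min(max_af, max(1, round(stretch)))

So a floor stretching away from you gets both effects at once: trilinear smooths the transitions between mip levels, and AF keeps the tilted texture from blurring. That's also why a tilted, close-up surface like the fence post can still look different between the two cards.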
 

TStep

Platinum Member
Feb 16, 2003
2,460
10
81
Thank you for the sane response. Is the difference in detail on the fence post due to trilinear filtering? Or is it just the manufacturer's interpretation? Also, why are so many leaves lost on the trees on the NV40?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
From reading the chat, I became more and more curious whether it would be possible to run ATi's newest optimizations on the 9700/9800s, and what the performance gain would be. If the "findings" made by ComputerBase.de are true, and clock for clock the 9800 and X800 are roughly the same, then isn't $400-$500 a bit steep for nothing more than a driver upgrade? On the upside, it would be awesome to get near-X800 performance from a $200 9800 Pro by simply upgrading your Cats. The 9800 Pro is the price/performance leader right now, and this would certainly sweeten the deal.

For those of you who don't know your video card history, ATi has done a driver upgrade before and rebranded it as a new/updated card.

from: http://www.firingsquad.com/features/atihistory/default.asp
The 3D Rage Pro
ATI released the 3D Rage Pro in April of 1997. As one of the first AGP accelerators, the Rage Pro was a well-designed chip. The Rage Pro offered a fill rate of 45Mpixels/second (equal to the original 3dfx Voodoo Graphics), VQ texture compression, and a 1.2 M triangles/sec full triangle setup engine. The chip had single-pass trilinear filtering as well as a complete set of texture blending options. Unfortunately, the drivers severely limited the shipping product and the card was unable to overtake 3dfx's Voodoo Graphics. NVIDIA shipped the Riva128 later that year with a 100 Mpixels/sec fill rate and 5M polygon/sec setup engine, putting ATI even further behind.

Though the Rage Pro was an enormous success with OEMs (thanks to its DVD performance and low cost), it never reached its full potential with gamers due to poor drivers. There was once a time when people considered ATI products to be the most stable products in the world. This was the time of the Mach64. Despite the Rage Pro's great paper specifications, the chip didn't reach its full potential until it was already too late.

Released in February 1998, ATI's TURBO drivers promised a 40% speed boost. ATI even renamed the chip the Rage Pro Turbo. In practice, however, the first Turbo drivers only improved 3D WinBench 98 scores. 3dfx released the Voodoo2 at the end of February, and ATI was all but forgotten.

Ironically, when the Rage Pro's final driver set was released in May 1999, well after the next graphics generation was on the market, gaming performance had indeed improved 20-40% over the original shipping drivers. Had the Rage Pro shipped with these drivers, the 3D market might have been very different.

(Underlining mine; not present in the original text.)
 

Viper96720

Diamond Member
Jul 15, 2002
4,390
0
0
It wouldn't matter for 9800 Pros, because they DON'T run at X800 speeds. Who cares if the X800 performs like a 9800 when it's underclocked? It isn't run at that speed, and it has more pipelines. Those extra pipes are sure to make a difference in some way.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
For those of you who don't know your video card history, ATi has done a driver upgrade before and rebranded it as a new/updated card.

Seriously, the issue here is trilinear filtering techniques, not anything even remotely equivalent to 9800 Pro w/ driver upgrade = X800 Pro.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: nitromullet
From reading the chat, I became more and more curious whether it would be possible to run ATi's newest optimizations on the 9700/9800s, and what the performance gain would be. If the "findings" made by ComputerBase.de are true, and clock for clock the 9800 and X800 are roughly the same, then isn't $400-$500 a bit steep for nothing more than a driver upgrade? On the upside, it would be awesome to get near-X800 performance from a $200 9800 Pro by simply upgrading your Cats. The 9800 Pro is the price/performance leader right now, and this would certainly sweeten the deal.

ATI didn't explicitly say it, but there was an implication that there is a hardware component to this filtering:

Q (crushinator): Is this Algorythm (sic) implemented in hardware? who's analysing texture maps, is it just the driver doing that or is it the chip?
A (Andy/Raja): The image analysis is performed by the driver to chose (sic) the appropriate hardware algorithm. This allows us to continually improve the quality and performance with future drivers.
(emphasis added)
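
(To make that Q&A concrete, here's a rough Python sketch of what such driver-side analysis might look like. The decision rule, the threshold, and the function names are all my assumptions -- ATI never published the algorithm -- but they fit the general idea that full trilinear blending only visibly matters when a mip level is not simply a filtered-down copy of the level above it.)

import numpy as np

def box_downsample(level: np.ndarray) -> np.ndarray:
    """2x2 box filter: the 'expected' next mip level for a standard chain."""
    h = level.shape[0] // 2 * 2
    w = level.shape[1] // 2 * 2
    src = level[:h, :w].astype(float)
    return (src[0::2, 0::2] + src[0::2, 1::2] +
            src[1::2, 0::2] + src[1::2, 1::2]) / 4.0

def choose_filter(mip_chain: list[np.ndarray], threshold: float = 2.0) -> str:
    """Per-texture analysis choosing a hardware filtering mode.

    mip_chain: grayscale mip levels, largest first, values 0..255.
    If every level is (nearly) a box-filtered copy of the one above it,
    assume a reduced trilinear blend is visually indistinguishable from
    full trilinear; otherwise play it safe. The threshold is an
    illustrative assumption.
    """
    for upper, lower in zip(mip_chain, mip_chain[1:]):
        expected = box_downsample(upper)
        if expected.shape != lower.shape:
            return "full_trilinear"  # nonstandard chain: don't optimize
        if np.abs(expected - lower.astype(float)).mean() > threshold:
            return "full_trilinear"  # hand-authored mips differ: blend fully
    return "reduced_trilinear"  # standard box-filtered mips: optimize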

I personally find it unlikely that, if they've had this working for nearly a *year* on the 9600 cards, they wouldn't have also enabled it for the 9700/9800 cards if they could. Bizarre conspiracy theories to the contrary ("They've been crippling 9800 performance for a year so now they can sell more X800 cards!"), there's no good reason for them to hold back a performance-enhancing feature when it would presumably increase their lead over the GeForce FX cards when using AA/AF.

Also, turning this on wouldn't give you "near X800 performance" -- not unless you found a way to get your 9800 Pro to ~500/550 clocks, or to give it 12 pipelines and 6 vertex shaders! They also seemed to imply in the chat that the figure given for the performance boost by ComputerBase.de (~20%) is much higher than their own estimate:

Q (chris): What performance boost does this give you, anyway?
A (Andy/Raja): It's a very mild optimization at the default levels, so of the order of a few percent. But this is a neat algorithm - we encourage you to take the texture slider all the way towards "performance" - we don't think you'll be able to see any image change until the very end, and the performance increases are significant.
(again, emphasis added)

For those of you who don't know your video card history, ATi has done a driver upgrade before and rebranded it as a new/updated card.

Yes, but the X800 is clearly a different hardware platform, even if the baseline clock-for-clock performance is the same for non-shader rendering. Compare with other hardware developments -- the Pentium 4 is *slower* clock-for-clock than the Pentium 3, but because it reaches higher clockspeeds and offers other improvements, its overall performance is noticeably better.