ATI cheating in AF?


Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ChkSix
I should add that the only reason I would choose Nvidia here over ATi is architecture. Unless the ATi offering were knocking the daylights out of Nvidia's across the board (which isn't the case; they are basically even throughout the spectrum), I cannot see spending 500 dollars on something that is roughly two years old with enhancements such as additional pipelines and speed increases on its core. I love bleeding-edge technology, and I am aware that there will always be problems to iron out with anything brand spanking new. But that's how it works in our hobby/profession. Technology should always progress forward, even if the designs are wrong and don't work like they should the first time around, because you can always learn from the mistakes of something new and perfect it, which then leads you to another bold push in another direction. You can't learn anything new if you just tinker with the same thing over and over again, already knowing it inside and out, backwards and forwards.

One could also make the point that, if you're unsure about your design, you shouldn't ask your customers to pay $500 to beta-test it for you. The tendency to rush incomplete or unfinished products out the door is something I don't like about the computer industry (in terms of both hardware and software). Now, obviously the 6800 works, so this may not really apply to NVIDIA in this case. But if they come out with a vastly improved NV45 in six months, anyone who jumped on the NV40 bandwagon now is not going to be happy. Of course, the same thing could happen with ATI and R480, which is why I'm not buying *anything* right now. It rarely pays to be an early adopter (ATI's 9700Pro and NVIDIA's GF4Ti cards being rare examples of first-gen hardware that worked and didn't become obsolete quickly -- folks who bought 5800Ultras, or the rev1 5600s, were probably not as thrilled).

Progress is good, but progress for the sake of progress is not necessarily a good thing.

Turbine engines would never have come about if the prop hadn't been completely perfected and someone hadn't said, OK, out with the old, let's try something bold and new and walk down a path never taken before. The same holds true for the scramjet and hypersonic engines being developed now, because the turbine has reached a point of perfection. Quantum mechanics and the uncertainty principle came at the turn of the century, which led to microprocessors and the computer revolution. If no one had the courage to push the envelope to its breaking point, we would all still be using punch cards or that basic Chinese computer (forget the name) to do our mathematical computing for us instead of desktops, laptops, pocket PCs and the like.

Progression can only come if one is willing to take a chance and dive into the unknown. And this time around, ATi disappointed me by not doing that and instead prolonging a seriously old design, at least if one thinks of it in terms of technology.

The abacus. Man, that thing was great. Of course, it only got like .000001FPS in Wolf3D, and your arm got sore after a while doing the texture calculations. :p

You're right that when one technology is tapped out, you have to turn and try something else if you want to get further. That's how engineering works -- you have something that works, you tweak and refine it until it's as good as it can be, and then if that's still not good enough, you look for alternatives. But doing so prematurely can be disastrous if the new technology doesn't turn out right. NVIDIA didn't have a choice -- the GeForceFX cards didn't look like they were going to scale much further, and probably wouldn't compete with ATI's new products without a serious overhaul. But ATI was able to give 50-100% more performance over their 'old' R360 cards with the R420 -- obviously there was still some headroom there. And they're working on a (supposedly) entirely new architecture for next year, the R500. It's not that they're standing still, it's that they saw they could get another generation's worth of performance out of their current design, and so it didn't make sense to rush out an entirely new architecture. Right now, while the NV40 has potential to use its advanced features (such as SM3.0 and its video encoder) in the future, it can't leverage those advantages over the R420.
 

caz67

Golden Member
Jan 4, 2004
1,369
0
0
ATi or Nvidia!

Both are great products.

The next gen cards offer enough firepower for the average enthusiast and the extreme gamer.

Let's all be thankful that they are going to produce better products in the future.

We are all winners.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: caz67
ATi or Nvidia!

Both are great products.

The next gen cards offer enough firepower for the average enthusiast and the extreme gamer.

Let's all be thankful that they are going to produce better products in the future.

We are all winners.

:beer:
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: obsidian
So he claims NVIDIA's brilinear has quality degradation, but ATI's doesn't? Please...

Prove ATi has suffered IQ loss. If you can't, you're talking out of your rear.
 

ChkSix

Member
May 5, 2004
192
0
0
Thanks for the response Matthias. And I agree with you completely.

They are both wonderful products in the end, and both are equally worthy of consideration when purchasing the latest graphics card for top 3D performance.


:beer:
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
NVIDIA will be a must-have over ATi for people with NF3 boards. The proprietary performance increases are as high as 30% in games.

And it can only get faster :)
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
You got a link for that? I've read that a few times, but haven't seen a review or any articles to show it. I have a P4 now... but am REALLY considering an A64 for my next upgrade, when Socket 939 comes out. And the NF3 250 looks to be really good.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: caz67
ATi or Nvidia!

Both are great products.

The next gen cards offer enough firepower for the average enthusiast and the extreme gamer.

Let's all be thankful that they are going to produce better products in the future.

We are all winners.

Damn, this guy is good.
Why can't everyone just have his outlook?

:beer: :beer: :beer:
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Lumping a few responses together to save space; sorry if some of this retreads already-resolved issues:

Originally posted by: VisableAssassin
NV came up with a fast method...their brilinear. It ran fairly fast but reduced IQ slightly...some notice, some don't, but the IQ was slightly less than what it should have been.
But when NV did this, people b1tched up and down about it. Now ATi comes along and pulls the same stunt...sure, it's harder to catch right off, but in my opinion it's the same stunt...and nobody raises as much hell as they did with NV, which is why I said NV does it, it's wrong...ATi does it, it's OK...I don't get it.
Ppl did report that mipmap transitions were still distinct with brilinear at its debut, so in that sense they were right to complain (well, that and the fact that nV introduced it with only UT2K3, so it was an app-specific and hand-coded "optimization"). ATi's "trylinear," OTOH, both doesn't seem to betray mipmap transitions and is a generic (not app-specific) optimization, so I don't think it warrants as much indignation on a technical level. I'm sure nV bore the brunt of the storm by being the first "caught" with it, though, and I agree that ATi is as guilty as nV from an ethical perspective by hiding it from end-users until caught.
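For illustration only, here's a rough Python sketch of the kind of difference being described. Neither vendor has published its actual filtering logic; the function names (full_trilinear, reduced_blend), the bilinear_sample callback, and the blend-band width are all invented for this example, not ATi's or NVIDIA's real heuristics.

# Full trilinear always blends two bilinear fetches by the LOD fraction.
def full_trilinear(bilinear_sample, lod):
    lo = int(lod)                  # lower of the two mip levels
    frac = lod - lo                # how far we are toward the next level
    return (1.0 - frac) * bilinear_sample(lo) + frac * bilinear_sample(lo + 1)

# A "brilinear"-style shortcut: only blend inside a narrow band around each
# mipmap transition, and fall back to a single bilinear fetch elsewhere.
# If the band is too narrow, the transition lines mentioned above can
# become visible on screen.
def reduced_blend(bilinear_sample, lod, band=0.15):   # band width is made up
    lo = int(lod)
    frac = lod - lo
    if frac < band:
        return bilinear_sample(lo)          # plain bilinear: half the texel reads
    if frac > 1.0 - band:
        return bilinear_sample(lo + 1)
    t = (frac - band) / (1.0 - 2.0 * band)  # remap the remaining range to 0..1
    return (1.0 - t) * bilinear_sample(lo) + t * bilinear_sample(lo + 1)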

Originally posted by: DAPUNISHER
That's a hell of a typo Pete.
Well, he just typed "22" instead of "45." Given the time pressure he's usually under with his reviews, I find it an understandable mistake. Besides, B3D readers are more than likely adept enough to catch the error, particularly since the contradictory picture is right there.

Originally posted by: lithium726
So if ATi has been using this new and improved trilinear on its 9600 series, why hasn't it been used on the 9500, 9700, or 9800? Wouldn't that improve performance? I have a 9600 Pro and I think it's a great card for the 180 dollars I paid for it in October 2003, but why only on the 9600 and X800?
R3x0 probably doesn't have whatever hardware tweaks are required to run "trylinear." RV3x0 does, and R420 seems to have been built off of RV360 (both are 130nm & low-k, whereas R3x0 is 150nm and not low-k).

Originally posted by: Bar81
ATI has made no indication that it is a hardware feature; rather, they seem to imply it's a software algorithm in their drivers.
Software can only exploit what's present in hardware, and the current assumption is that R3x0 hardware doesn't allow for "trylinear." If "try" is in fact as good-looking as tri but faster, ATi has no reason not to want to enable it now.

Originally posted by: ChkSix
My response: Engineers don't write those PDFs; they build and design hardware and drivers. And once a PDF is written, it is usually checked and rechecked by the company's lawyers for legal purposes before it ever makes it out the front door and into the hands of the consumer.
Yes, well, that's never stopped any company in the history of man before, has it? Specifically and currently, if Apple can get away with calling their PCs the most powerful desktop supercomputers and the first 64-bit CPUs, then it's pretty much free rein for chaos. ;)

If they did do this with their 9800 cards, no one on God's green earth would buy an X800 for 500 dollars when they can get the same performance and IQ out of a card costing almost half as much today.
I don't see everyday 9800s clocking at 475/450 with an extra four pipelines, an extra two vertex shaders, and better pixel and vertex shader performance. :)

I think many others might disagree here, bro. Personally, I think it makes their benchmarking invalid. But like you said, it does become extremely difficult as well as controversial now to get accurate results.
Actually, Damage himself (the author of that TechReport "ATi X800 filter games" article) used the 61.11 drivers with the 6800s when he benched them against the X800s, so he was in fact comparing "trylinear" to "brilinear" scores. So, technically, I think his numbers are still valid.

In short, I agree with Damage that slipping this optimization in under the radar kind of levels the playing field. Both ATi's and nV's marketing/PR must now be regarded with the same suspicion. But I haven't yet seen an analysis that conclusively proves that ATi's "trylinear" degrades IQ. I'm sure we'll see a raft of IQ articles in another week or two, after ppl with X800s have had a chance to figure out good tests, so I'll reserve judgement on the IQ impact until then.

Q:
Originally posted by: gsellis
Brilinear?
A: Brilinear. Note that Dave @ B3D has said bri has improved since its debut, so it may not be as bad at hiding Mipmap transitions as the six-month-old 3DC article describes. In fact, he seemed to imply in one recent B3D post that bri and try may now look very similar.
 

stickybytes

Golden Member
Sep 3, 2003
1,043
0
0
Originally posted by: Matthias99

One could also make the point that, if you're unsure about your design, you shouldn't ask your customers to pay $500 to beta-test it for you. The tendency to rush incomplete or unfinished products out the door is something I don't like about the computer industry (in terms of both hardware and software). Now, obviously the 6800 works, so this may not really apply to NVIDIA in this case. But if they come out with a vastly improved NV45 in six months, anyone who jumped on the NV40 bandwagon now is not going to be happy. Of course, the same thing could happen with ATI and R480, which is why I'm not buying *anything* right now. It rarely pays to be an early adopter (ATI's 9700Pro and NVIDIA's GF4Ti cards being rare examples of first-gen hardware that worked and didn't become obsolete quickly -- folks who bought 5800Ultras, or the rev1 5600s, were probably not as thrilled).

Progress is good, but progress for the sake of progress is not necessarily a good thing.

So should the wait-and-see approach be taken with the 6800 GT as well? Will there be a refresh for the GTs?

I really want the GT when it comes out.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: stickybytes
So should the wait-and-see approach be taken with the 6800 GT as well? Will there be a refresh for the GTs?

Your guess is as good as mine. *Generally* refresh parts only come in at the top -- but NVIDIA might also introduce a whole line of boards based on NV45. I mean, during the last generation they put out the 5800/5800U, then replaced those with the 5900/5900U (and added the 5900SE and 5900XT), and then later put out the 5950U. They might do something similar here, they might not. I haven't seen or heard any firm plans yet.

I'm just gonna sit tight with my OCed 9800Pro until everything is on the market, prices can stabilize, and my magic 8-ball stops saying "Situation Unclear". :p
 

stickybytes

Golden Member
Sep 3, 2003
1,043
0
0
Originally posted by: Matthias99
Originally posted by: stickybytes
So should the wait-and-see approach be taken with the 6800 GT as well? Will there be a refresh for the GTs?

Your guess is as good as mine. *Generally* refresh parts only come in at the top -- but NVIDIA might also introduce a whole line of boards based on NV45. I mean, during the last generation they put out the 5800/5800U, then replaced those with the 5900/5900U (and added the 5900SE and 5900XT), and then later put out the 5950U. They might do something similar here, they might not. I haven't seen or heard any firm plans yet.

I'm just gonna sit tight with my OCed 9800Pro until everything is on the market, prices can stabilize, and my magic 8-ball stops saying "Situation Unclear". :p

Does anyone have their hands on a 6800 Ultra yet?
 

Robor

Elite Member
Oct 9, 1999
16,979
0
76
I posted this in the "ATI's Radeon X800 texture filtering game at Tech Report" thread but think it applies better here...

Okay, maybe I don't know what I'm talking about - and that's entirely possible - but in my opinion ATI isn't really cheating. The way I understand it, depending on what is being rendered, they sometimes use a shortcut and the IQ result is the same at a higher frame rate. If that's really the case, who cares?

And as far as benchmarks being unfair, maybe, maybe not. If ATI's "workaround rendering" results in equal or better IQ than Nvidia's full rendering, then it's not unfair. I look at it as a feature rather than a cheat, and just because Nvidia's cards can't do it doesn't mean ATI should have to turn it off in benchmarks.

I've read a lot of posts that don't agree with that but I don't understand the reasoning behind it. I want to see proof that ATI's "cheating" results in lower IQ than Nvidia's rendering. If it's true I'm sure the Nvidia camp would've exposed this by now and AFAIK they haven't. Until then I think ATI's only mistake was not being up front with exactly what's going on.

and

This is where I disagree. If ATI has a feature that Nvidia does not, why should they be forced to turn it off just to "level the playing field"? It would be no different from asking Nvidia to turn off their PS 3.0 support in tests because ATI doesn't offer it. If ATI can produce equal IQ with and without the feature enabled, then they shouldn't be forced to disable it in tests. Just my opinion, though...
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
So he claims NVIDIA's brilinear has quality degradation, but ATI's doesn't? Please...
The issue being discussed is adaptive trilinear, not brilinear. Don't confuse the two.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: BFG10K
So he claims NVIDIA's brilinear has quality degradation, but ATI's doesn't? Please...
The issue being discussed is adaptive trilinear, not brilinear. Don't confuse the two.

If it only runs full trilinear when it feels like it, it's cheating, and I'd consider it brilinear, BFG.
If ATi told you the sky was purple in a PR write-up, would you believe it?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
If it only runs full trilinear when it feels like it, it's cheating, and I'd consider it brilinear, BFG.
It takes eight samples when it requires it. If it can get away with less without impacting IQ then it does so.

This isn't brilinear and if you keep mixing the two it'll only continue to cause confusion.
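To put that point in rough pseudocode (Python): full trilinear means two bilinear footprints of four texels each, eight samples total, and an adaptive scheme only pays for both footprints when the blend would actually change the result. ATi has not published its actual heuristic; the function name adaptive_fetches, the level_difference measure, and the threshold below are invented purely for illustration.

def adaptive_fetches(mip_lo, mip_hi, lod_frac, level_difference, threshold=0.02):
    """Return the (mip level, weight) bilinear fetches to perform."""
    # If the two mip levels are nearly identical in this region, or we sit
    # right on a level, one bilinear footprint (4 samples) looks the same.
    if level_difference < threshold or lod_frac < 1e-3 or lod_frac > 1.0 - 1e-3:
        nearest = mip_lo if lod_frac < 0.5 else mip_hi
        return [(nearest, 1.0)]
    # Otherwise take the full trilinear blend: 4 + 4 = 8 texel samples.
    return [(mip_lo, 1.0 - lod_frac), (mip_hi, lod_frac)]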