Underhanded ATI


BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I fail to see how an internal company presentation can be classed underhanded

If it were from nVidia, some fanatics would be screaming for a public apology from nV executives, implying this was a foul example of the kind of corrupt and manipulative corporation nVidia is. That isn't a hypothetical situation either; it's happened multiple times (check out the B3D forums).

I was being facetious in calling JC an nV lackey, and in doing so I was mocking your calling Newell an ATi lackey.

What if Carmack had pulled a Newell and made all reviewers only use the latest official Cats when they did their Doom3 benches? That would make your comparison valid. nV is faster than ATi at DooM3, just as ATi is faster than nVidia at HL2. The difference between Newell and Carmack is that Carmack wanted people to get as realistic an assessment as possible, while Newell was working for his employer to sell their product.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Wow, so you'd be happy to cross the Atlantic Ocean in a balloon when you have a glider available?

Would you also take a bicycle over a scooter because a Ferrari is much faster than either?
Would you accept a cardboard box instead of a single room in a house because a mansion is much better than either?

No offense but your logic is bizarre.

I'd say my logic is good! Here's why:
Your analogies are too extreme; they don't make sense.

A 5950 and a 9800XT are cards with fairly comparable performance in most respects. A cardboard box and a room in a house aren't comparable at all really. The box is free-standing, flimsy, and impervious to the elements for minutes, whereas a room in a house shares all the advantages of the house.

Like I said, when ATI's best is losing half its performance running PS2 effects, and nVidia's best is losing 60-70% of its performance running PS2 effects, that only tells me one thing:
That I won't be using either to run PS2 effects. I'll wait till May or June and buy a card that has more realistic PS2 performance, especially since all I'm missing out on for now is Far Cry. Seems straightforward enough to me?


LOL BTW BFG, all these months and months you've been yelling about DX9 performance; doesn't it strike you as at all silly that there's still only a game or two where it matters?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
If it were from nVidia, some fanatics would be screaming for a public apology from nV executives, implying this was a foul example of the kind of corrupt and manipulative corporation nVidia is.
If ATi's presentation said something like "we're going to cheat and not tell anyone about it" then you bet I'd have a problem with it, much like nVidia's "optimization guidelines" that they promptly backpedalled on because they realized they couldn't continue to operate as they had been if they followed them. This presentation appears to be mostly harmless.

A cardboard box and a room in a house aren't comparable at all really.
It's true, the box was probably a bad example, but what about the other two?

Instead, you can also compare an apartment to a house, with a mansion as the third party. Are you saying you wouldn't take a free-standing house on its own section of land over an apartment because the mansion is much better than either?

Like I said, when ATI's best is losing half its performance running PS2 effects, and nVidia's best is losing 60-70% of its performance running PS2 effects
That isn't the best; the best is when ATi has double the performance of nVidia when running full PS 2.0 (e.g. Far Cry, Half-Life 2).

That I won't be using either to run PS2 effects.
Have you removed your excess RAM and HD space from your system yet? Have you downgraded your power supply so that it is providing the absolute minimum wattage your system needs?

Last time we had this discussion you agreed that you had no use for something you weren't currently using, much like the PS2 performance advantage on the R3xx.

So, how's it coming along then? If I looked at your system, could I expect to see zero RAM and HD space being reported as free? Can I expect to see the weakest PSU possible to support your system?

You claim to follow logic but your actions and reasoning are anything but.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: BenSkywalker

What if Carmack had pulled a Newell and made all reviewers only use the latest official Cats when they did their Doom3 benches? That would make your comparison valid. nV is faster than ATi at DooM3, just as ATi is faster than nVidia at HL2. The difference between Newell and Carmack is that Carmack wanted people to get as realistic an assessment as possible, while Newell was working for his employer to sell their product.

Newell excluded newer nV drivers on the basis of rendering errors, which he thought would affect benchmark scores. As the extensive testing of the HL2 leak at XbitLabs and elsewhere has shown, newer drivers can't hide the fact that nV is just slower than ATi in HL2 ... with that build.

I just think people are quick to call lackey based on second- and nth-hand info. Sure, I found both the D3 and HL2 tests suspicious in that they seemed tainted by marketing, but I think the general conclusions we can draw from them are valid. Besides, I still haven't seen screenshots or Valve Bink vids showing the differences between NV in DX8.2 mode and ATi in full DX9 mode, so all this may be misplaced indignation. What if nV looks about as good as ATi at about the same speed? Then there's really no problem. Sure, we can grumble about ATi offering more for the money or being a smarter buy, but I doubt DX8.2 mode will severely compromise FX owners' ability to enjoy HL2.

The D3 numbers are another story, though, as ATi was shown to be much slower than nV, period. We can bicker over IQ differences all we want, but framerates are pretty much indisputable. I hope early drivers were the culprit in ATi's disappointing D3 alpha performance, or ATi owners will have a lot more to grumble about over D3 than FX owners will over HL2.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Sorry BFG, I just don't think video cards lend themselves to housing and travel analogies.

The main point of contention you seem to have is that ATI cards run PS2 games twice as well. I've never disputed that fact, only the importance of it. Do you even have or play Far Cry? What other games out right now run worse on nVidia cards?

Why is it that PS2 is so important when only one game worth playing uses it, and nVidia's better Linux, XP64, and OpenGL performance is meaningless?

Far Cry must be all that and a popsicle.....
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
* Looks at dead horse.

Nope, I just can't bring myself to flog the how-about-waiting-before-crowning-the-next-king horse anymore...


PS, Rollo, NVidia may have overall better Linux performance, but IME, ATI's drivers do a MUCH better job of supporting the features of the cards (i.e. Dual Out, TV Out, etc).
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: chsh1ca
* Looks at dead horse.

Nope, I just can't bring myself to flog the how-about-waiting-before-crowning-the-next-king horse anymore...


PS, Rollo, NVidia may have overall better Linux performance, but IME, ATI's drivers do a MUCH better job of supporting the features of the cards (i.e. Dual Out, TV Out, etc).

HAHA! Try multimonitoring OGL ;)
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Never needed to, though I am sure there are issues. I am beginning to actually believe some of the people around here when they say that to get decent Linux drivers we'll have to have NV/ATI open up their drivers, which I doubt is happening anytime in the next couple of years at any rate. ;)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Newell excluded newer nV drivers on the basis of rendering errors, which he thought would affect benchmark scores.

Newell hadn't touched the drivers nV wanted reviewers to use. He also went on extensively about how much they had optimized the engine particularly for nV, while they hadn't bothered to recompile the code using MS's own compiler. And then we have the whole September 30th, 2003 date, which he still hasn't retracted from what I've seen. That date was for two different things: according to him in the middle of that month, the game was supposed to ship and the public bench of the game was supposed to be released on that date.

nVidia has their own PR monkeys in the dev world; look at Core for an example. Newell is in ATi's camp and waves the banner proudly and loudly.

As the extensive testing of the HL2 leak at XbitLabs and elsewhere has shown, newer drivers can't hide the fact that nV is just slower than ATi in HL2 ... with that build.

I expect the final game to show the same, at any rate.

I just think people are quick to call lackey based on second- and nth-hand info.

Read through his statements from Shader Day and then read everything that B3D posted with follow-up info on the game. He misled and flat-out lied on different points.

Comparing him to Carmack is utterly absurd in this case. Carmack has proven himself as the premier engine developer in the gaming industry time and time and time again. Newell hasn't proven he can do anything besides show off a tech demo on the engine development side.

BFG-

If ATi's presentation said something like "we're going to cheat and not tell anyone about it" then you bet I'd have a problem with it, much like nVidia's "optimization guidelines" that they promptly backpedalled on because they realized they couldn't continue to operate as they had been if they followed them.

It has been explained and shown to you repeatedly how ATi 'cheats' by your exact, to-the-letter definition.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Why is it that PS2 is so important when only one game worth playing uses it, and nVidia's better Linux, XP64, and OpenGL performance is meaningless?
(1) Those features are important to someone who wants them, no doubt about it.
(2) nVidia is not inherently better than ATi in OpenGL.

It has been explained and shown to you repeatedly how ATi 'cheats' by your exact, to-the-letter definition.
I'm afraid it hasn't.
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
IMHO, saying NVidia has better Linux support USED to be true, but it no longer is. Both NVidia and ATI could take a page from Matrox's book when it comes to Linux support...