interesting.. counterpoint to d3: hl2 30% faster on ati hardware?


DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,102
32,655
146
Originally posted by: Draco
Originally posted by: DAPUNISHER
Originally posted by: Draco
no way is current ATI hardware 30% faster than NVidia's 6800 line. Yes, this may have been true when we were comparing 9800 Pros and NVidia 5900s, but not now. If this turns out to be the case when Half Life 2 ships (whenever that may be), I would highly question Valve/ATI's relationship and their motives.
You are questioning id & nV's then, correct?


Yes, they are all shady!
Precisely. <automated response> Trust corporations to attempt to part you from your hard-earned bank and for nothing else.</>
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Draco
no way is current ATI hardware 30% faster than NVidia's 6800 line. Yes, this may have been true when we were comparing 9800 Pros and NVidia 5900s, but not now. If this turns out to be the case when Half Life 2 ships (whenever that may be), I would highly question Valve/ATI's relationship and their motives.

but you wouldn't question id/nvidia's relationship?

edit: lol.. dapunisher beat me to it!
 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
A couple points:

First, neither the NV/iD nor the ATi/Valve performance differences are intended to make the "other" graphics card perform poorly. Instead, you have each SW developer going down a different coding path and each HW developer making a card with different strengths and weaknesses.

If you look @ Far Cry, where ATI has an advantage, you will see a couple of things. First, it is using DX with extensive shader routines (at least compared to existing games). Second, all the new cards perform similarly until you activate AF, at which point NV performance takes a dive compared to ATI performance. Since it has been mentioned that NV sacrifices some shader paths in order to run AF, while ATI has a separate path for AF, the fact that NV loses out when AF is activated should not be surprising. Therefore there is no evidence that Far Cry has attempted to optimize performance on one card and hinder performance on another.
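For reference, "activating AF" on the application side is just a per-texture sampling hint; the driver and the texture units do all the work. Here's a minimal sketch in C using the standard EXT_texture_filter_anisotropic extension (myTexture is a hypothetical texture id; a current GL context and the glext.h tokens are assumed):

/* Query the maximum degree of anisotropy the hardware supports,
   then request it on a trilinear-filtered texture. */
GLfloat maxAniso = 1.0f;
glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);   /* e.g. 8.0 or 16.0 */

glBindTexture(GL_TEXTURE_2D, myTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);

Since the request is that simple, any gap you see once AF is on comes down to how each card's hardware handles the extra texture samples, not to anything the game does differently per vendor.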

I bring up Farcry because it is probably the DX9 game that is most comparable to HL2, and therefore will probably best represent how HL2 will perform on NV relative to ATI when it is released.

If you look @ iD, where NV has an advantage, you will see that they are using OpenGL without the shader routines. NV has historically had very strong OpenGL performance, and has also optimized their HW design around that type of rendering. While iD will try to make their game run as well as possible on NV cards, they will also work with ATI towards the same goal. Now, I don't know enough about Doom3 to know how AF is implemented and how that impacts NV's performance; however, if Doom3 does not make extensive use of the shader paths, then it is very possible that NV will not take as great a performance hit as they do under DX and will therefore perform better.

There is no evidence of any software company deliberately optimizing their code for one graphics card while sacrificing performance on another. Instead, you have one set of HW that has an advantage in shader-intensive games with AF enabled, and another set of HW that is better @ rendering Doom3. In both cases the software companies and the HW companies work together to get the best performance from any hardware combination.

Finally, all the high end cards play the newest games at very playable framerates, so the only reason to worry about which one ends up on top in which game is so you can brag about your HW.

-D'oh!
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Actually they admitted it wasn't ready and that the theft only exacerbated matters.

Did they admit it at shadey days? I didn't think so; instead they pushed the marketing presentation knowing full well the game was in no way, shape, or form ready for release in 2 weeks.

Where did that time frame come from? I have never heard that before

Our friend Dave Baumann admitted this on this very msgboard last winter when everybody was flaming Valve. He apparently had a conversation with somebody at Valve who indicated they didn't start optimizing until June.

More importantly, it provided a much needed infusion of capital to continue game development as they were running low on funds as I understand it.

What, their publisher wouldn't have bailed them out?

My money is on HL2+mods. My reasoning is that the gameplay, particularly the degree of interaction with the environment possible, will make this title and its spin-offs unmatched for both single & multiplayer. We will have the same bashers D3 has, most of whom are the rival GPU maker's overzealous minions, displeased with their product's performance in the title and spouting FUD and propaganda in an effort to undermine the game's success. Sad but true.

Quake is a cult, don't forget it ;)

If they spent 2 months optimizing every series of GPU the game would never get finished.

*cough*

but you wouldn't question id/nvidia's relationship?

Did Carmack give a marketing presentation for Nvidia, including graphs that show the best bang for the buck?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,102
32,655
146
Genx87,

You will not get an argument from me over the collusion that took place between Gabe & co. and ATi. As to the question about the publisher bailing them out, I don't know; you tell me. BTW, thanks for clearing up how you came to know how long they spent on coding for the NV3x.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: AnnoyedGrunt
If you look @ iD, where NV has an advantage, you will see that they are using OpenGL without the shader routines. NV has historically had very strong OpenGL performance, and has also optimized their HW design around that type of rendering. While iD will try to make their game run as well as possible on NV cards, they will also work with ATI towards the same goal. Now, I don't know enough about Doom3 to know how AF is implemented and how that impacts NV's performance; however, if Doom3 does not make extensive use of the shader paths, then it is very possible that NV will not take as great a performance hit as they do under DX and will therefore perform better.

Doom3 uses OpenGL fragment and vertex programs (essentially, pixel and vertex shaders) heavily, and makes extensive use of stencil shadows, which NVIDIA does VERY quickly in hardware (the 5900 and 6800 get essentially double the throughput in their pixel pipelines when doing this kind of calculation). However, I'm not sure whether AF takes performance away from stencil shadowing; if it doesn't, AF's penalty on the NV40 running Doom3 would be considerably smaller than in other shader-heavy games.
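To illustrate why pure z/stencil fill rate matters so much here, this is a minimal sketch (not Doom3's actual code) of a depth-fail stencil shadow volume pass in OpenGL; drawShadowVolume() is a hypothetical helper that submits the volume geometry, the scene's depth buffer is assumed to already be filled, and GL_INCR_WRAP_EXT/GL_DECR_WRAP_EXT come from the EXT_stencil_wrap extension:

/* Volume pass: writes only stencil - no color and no depth. */
glEnable(GL_CULL_FACE);
glEnable(GL_STENCIL_TEST);
glDepthMask(GL_FALSE);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glStencilFunc(GL_ALWAYS, 0, ~0u);

/* Back faces: increment stencil where the volume fails the depth test. */
glCullFace(GL_FRONT);
glStencilOp(GL_KEEP, GL_INCR_WRAP_EXT, GL_KEEP);
drawShadowVolume();

/* Front faces: decrement stencil where the volume fails the depth test. */
glCullFace(GL_BACK);
glStencilOp(GL_KEEP, GL_DECR_WRAP_EXT, GL_KEEP);
drawShadowVolume();

/* The lighting pass then only touches pixels left at stencil == 0 (unshadowed). */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glStencilFunc(GL_EQUAL, 0, ~0u);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);

Because the volume passes disable color writes entirely, hardware that can double its pixel rate in z/stencil-only rendering gets an outsized win, which is exactly the advantage described above.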

There is no evidence of any software company deliberately optimizing their code for one graphics card while sacrificing performance on another. Instead, you have one set of HW that has an advantage in shader-intensive games with AF enabled, and another set of HW that is better @ rendering Doom3. In both cases the software companies and the HW companies work together to get the best performance from any hardware combination.

I have to agree with your conclusion. I do like how some posters seem to think it's expected for NVIDIA to be faster at Doom3, but if ATI is faster at HL2, it *must* be Valve screwing over NVIDIA. :p
 

Illissius

Senior member
May 8, 2004
246
0
0
nVidia being faster at Doom 3 was expected and logical, as ATi's OpenGL drivers are slow (at least for the X800 series - oddly, I don't see this as much with the previous generation), as has been shown with many other games previously (Q3, CoD, that Jedi game), plus nVidia has UltraShadow. With HL2 I know of nothing that would inhibit nVidia's performance in this way, aside from possibly 3Dc, which is why I would be surprised if they ended up considerably slower, clock for clock and pipe for pipe (i.e., if the XT is faster than the Ultra, that's fine, but I'm not expecting that with the Pro and GT).

Oh, and whatever happened to ATi also having a 32x0 mode that was only available with AA on? I remember them saying they had this in an interview somewhere, but I'm not seeing them take a smaller hit with AA on than nVidia.