OGL vs DirectX performance differences between ATI/nVidia

TStep

Platinum Member
Feb 16, 2003
Don't know much about the fine details, but have been meaning to ask:

Seeing how nVidia usually performs better in OGL-based games and ATI is slightly favored in DirectX, is any of this hardware-based? Meaning, during the design phase of a core/card, are there hardware design choices made that would favor OGL over DirectX or vice versa? Or are all the differences just in the drivers?

edit: title
 

Munky

Diamond Member
Feb 5, 2005
It's the drivers. At the most basic hardware level, all OGL and DX functions get mapped to the same transistors. But certain games tend to favor certain cards as well - Doom 3, for example, uses a shadow technique that favors NV cards.
 

Pete

Diamond Member
Oct 10, 1999
Some of the difference is in the hardware, some in the drivers, and some in the way the dev chooses to program the game (possibly tuned to certain hardware). ATI and NV also have unique hardware features of their own that games can take specific advantage of (UltraShadow, PCF).

The latest theory is that ATI is slower in Doom 3 because its hierZ culling fails with D3's shadow algorithm, so it ends up having to draw extra pixels that later get thrown out; that part would be a hardware issue. Similarly, ATI apparently boosted 4xAA framerates in D3 by tweaking the way their drivers tell the hardware to work in that mode with that game.
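For context, D3's stencil shadows are widely reported to use the depth-fail ("z-fail") variant, which keys the stencil update off a failed depth test. Here's a rough OpenGL sketch of that kind of pass; drawShadowVolume() is a made-up helper, and the idea that this particular state combination is what defeats hierZ is just the theory above, not something ATI has confirmed:

    #include <GL/gl.h>

    /* drawShadowVolume() is a hypothetical helper that submits the
     * extruded shadow-volume geometry for the current light. */
    extern void drawShadowVolume(void);

    /* Depth-fail ("z-fail") stencil pass, roughly as D3 is reported to
     * do it. Depth was already filled by the ambient pass; only the
     * stencil buffer is written here. */
    static void stencilShadowPass(void)
    {
        glEnable(GL_STENCIL_TEST);
        glDepthMask(GL_FALSE);
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        glDepthFunc(GL_LESS);
        glStencilFunc(GL_ALWAYS, 0, ~0u);

        /* Back faces of the volume: increment stencil where the depth
         * test fails. */
        glCullFace(GL_FRONT);
        glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);   /* sfail, zfail, zpass */
        drawShadowVolume();

        /* Front faces: decrement on depth fail. Pixels left with a
         * nonzero stencil count are in shadow for this light. */
        glCullFace(GL_BACK);
        glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
        drawShadowVolume();
    }

The guess would be that basing the stencil update on the z-fail result forces the chip to resolve exact per-pixel depth, which is where a coarse hierZ rejection could stop helping.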

I don't know what's happening with Riddick, though.

AFAIK, with OGL, hardware can be created first and a custom (NV_ or ATI_) OGL extension written specifically for it, with generic (ARB_) implementations added later. D3D is regulated by MS, so IHVs (ATI, NV, etc.) will want to know what MS is shooting for before spending time and money on hardware features. OGL is regulated by committee, so a new model or framework may take longer to implement. OTOH, MS has the final word with D3D, so it can apparently move faster, but IHVs may not get their way.
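In practice that's why OGL engines probe the extension string at startup and pick a vendor path when one is advertised, falling back to a generic path otherwise. A minimal sketch of the pattern (the extensions named here are just examples of the NV_/ATI_/ARB_ idea, not necessarily what any given game checks for):

    #include <string.h>
    #include <GL/gl.h>

    enum { PATH_FIXED, PATH_ARB, PATH_ATI, PATH_NV };

    /* Returns nonzero if the driver advertises the named extension.
     * (Simple substring check, the way a lot of engines of that era
     * did it.) */
    static int has_ext(const char *name)
    {
        const char *all = (const char *)glGetString(GL_EXTENSIONS);
        return all != NULL && strstr(all, name) != NULL;
    }

    /* Pick the best-supported render path at startup. */
    static int choose_render_path(void)
    {
        if (has_ext("GL_NV_register_combiners"))
            return PATH_NV;      /* vendor path, shipped with the hardware */
        if (has_ext("GL_ATI_fragment_shader"))
            return PATH_ATI;     /* vendor path for the other IHV */
        if (has_ext("GL_ARB_fragment_program"))
            return PATH_ARB;     /* generic path, standardized later */
        return PATH_FIXED;       /* lowest common denominator */
    }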
 

Sureshot324

Diamond Member
Feb 4, 2003
If it's a driver issue, then I don't see how ATI hasn't fixed it after all this time. If Nvidia's Doom 3 advantage were eliminated for all ATI cards right now, that would be a huge boost for ATI, so it's definitely worth a lot of time and money for them to fix their OGL drivers.

I think Carmack just did a half-assed job optimizing Doom 3 for ATI cards. AFAIK he pretty much wrote the Doom 3 engine all by himself and had an Nvidia card in his main development machine.
 

route66

Senior member
Sep 8, 2005
Did Carmack program Doom 3? In the credits he is listed as Technical Director, and there are a few programmers listed after him. So, does a Technical Director at id program? Or does he spend all of his time now making rockets?
 

Sureshot324

Diamond Member
Feb 4, 2003
Originally posted by: crazySOB297
He was the lead programmer for the graphics engine AFAIK

Yeah, in interviews when he talks about the graphics engine he keeps referring to it as 'my code'. There are of course other programmers for other parts of the game, but I think he mostly did the graphics engine himself.
 

Rock Hydra

Diamond Member
Dec 13, 2004
IIRC, nVIDIA has generally (do not mistake for always, as many people seem to do) performed better in OpenGL games. Correct me if I'm wrong, please.
 

Pete

Diamond Member
Oct 10, 1999
There are relatively few OGL games compared to D3D ones, thus ATI may not want to commit scarce resources to improving performance in a small segment of the market (similar to their Linux excuse). While it may be a valid reason, I'm still not satisfied with it. :)

Rewriting drivers that have to take into account a decade or more of backwards compatibility across generations of GPUs is no small task.

I believe Tim Sweeney is also listed as "Technical Director" for UE3. They set the architecture and probably delegate much of the implementation to other programmers. Game engines are probably too complex to be coded mostly by one person, especially if you're reselling them as platforms for other games to be developed on.

Yes, nV has generally had an edge in both OGL performance and (according to Carmack's past .plans) robustness/correctness. I believe they were also the first with a full OGL driver (an "ICD," IIRC), rather than a mini-driver like 3dfx's (which supported only the commands Quake used and actually mapped them to Glide, IIRC).

But both the original Radeon (2x3) and the 9700 (8x1) may not have had pipeline layouts as well suited to mainly-multitexture Quake 3-engine games (for which nV's typically higher-clocked 4x2 GPUs may have been at least slightly better suited, even before drivers enter the picture). If hierZ is in fact letting the 9700 series down with D3's shadows, then again it may be mainly a fixed hardware issue. The X1800's speed gain in D3 with AA seems to be a programmable hardware issue, or something ATI can improve after the fact.
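To put rough numbers on the 4x2 vs. 8x1 point (the clocks below are hypothetical, just to show the arithmetic): on a dual-textured surface a 4x2 part lays down both textures on 4 pixels per clock, while an 8x1 part covers 8 pixels but needs a second loop-back clock for the second texture, so it also averages 4 per clock; the higher core clock then decides it.

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical core clocks, just to illustrate the comparison. */
        const double clk_4x2 = 400e6;   /* 4 pipes x 2 TMUs */
        const double clk_8x1 = 325e6;   /* 8 pipes x 1 TMU  */

        /* Dual-textured pixels laid down per clock:
         * 4x2: 4 pixels get both textures in one clock          -> 4
         * 8x1: 8 pixels, but 2 loop-back clocks for 2 textures  -> 4 */
        const double dual_px_per_clk = 4.0;

        printf("4x2 @ %.0f MHz: %.0f Mpixels/s dual-textured\n",
               clk_4x2 / 1e6, clk_4x2 * dual_px_per_clk / 1e6);
        printf("8x1 @ %.0f MHz: %.0f Mpixels/s dual-textured\n",
               clk_8x1 / 1e6, clk_8x1 * dual_px_per_clk / 1e6);
        return 0;
    }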