Good "Editor Day" article at Firing Squad

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
It was all right, but I'm not sure it was entirely accurate, and it glossed over some issues. (For instance, I don't recall Valve/Gabe having said that FP16 was insufficient for HDR--to the contrary, B3D's frontpage recently featured a short Q&A with Valve that said they were implementing FP16 HDR for nVidia. This is a moot point, though, as AFAIK nV still hasn't exposed the DX9 features necessary for HDR in their drivers. And the author fails to mention that the first-gen FXs appear to be deficient WRT FP16/32 performance, while all R3x0 cards are solid.) I suppose I'm not as impressed because I've already heard Tim's opinions at B3D and elsewhere, and the other quoted devs didn't have much of note to add.

(Sorry if this is a bit of a downer, I just had two cents burning a hole in my pocket.)
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
I also thought it was an interesting read. A few points:

-Nvidia really *really* seems to have swept the FX5800 under the rug. For owners of this card, good luck.

-Nvidia seems to be beating this "ATI doesn't have enough room for shader instructions" horse into the ground. Shaders slow the he!! out of any video card right now. For example, even in Halo with its crappy DX9 implementation, the glossy DX9 walls drop my framerate from 160 to 60 fps. Will developers hit a wall on ATI hardware by late 2004? It's arguable - they might. By 2005? No question. However, in those future fully DX9 (9.1+) games, the shader routines will be so long that they'll bog the hardware down anyway, making the instruction limit a moot point. Half-Life 2 already runs at ~60 fps on a Radeon 9800 Pro, and future games will only be more demanding, so we will need very powerful GPUs to get playable framerates in future, shader-heavy games.

-Nvidia is talking the talk about driver optimizations. Now comes the difficult part: walking the walk. They seem to be headed in the right direction (the new 52.xx drivers seem to be much improved); however, they have said this same thing before about aggressive optimizations.

One of the more interesting points Jen-Hsun tried to make was that ATI apparently spends far too much of its time worrying about what NVIDIA does

This is a very questionable comment. If Jen-Hsun did indeed admit to some cheats (or "overly aggressive optimizations"), then complaining that ATI is picking through their drivers is nonsensical. ATI is in direct competition with Nvidia, and Nvidia has been caught cheating in the past. Why wouldn't they keep looking for cheats, at least until Nvidia proves they can stay clean for a while? There are still a few games where "overly aggressive" cheats are apparent (e.g., check the Wolfenstein pics in this review).

-Nvidia certainly isn't going anywhere in the video market. They have the best driver team, clever engineers, good hardware design, an aggressive 6-month product cycle, etc. ATI is giving them stiff competition, no question, but Nvidia certainly has tons of fight left in them. And that's a good thing, because Nvidia has innovated much more in the video market than ATI has (IMO).

-Nvidia looks to be at least a bit frightened by ATI. Either that, or they aren't and they should be. Their casual dismissal of ATI as a competitor might be because they are supremely confident they will kick their butts in the next generation, but more likely they have been blindsided by the success of the 9700/9800 cards and want to keep even the acronym "ATI" as far out of everyone's mind as possible.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Their casual dismissal of ATI as a competitor might be because they are supremely confident they will kick their butts in the next generation, but more likely they have been blindsided by the success of the 9700/9800 cards and want to keep even the acronym "ATI" as far out of everyone's mind as possible.

I don't think this is the case. It could be they just didn't host the event to make a direct attack on their competition like ATI did with "shader day". I trust these developers FAR more than Gabe for 5 million reasons*. I don't doubt that 24-bit precision in DX9 is better than 16-bit or mixed mode, but if the end result is water being a little shinier, and no cards can run it at what I consider acceptable speeds, who cares?

Anyway, I thought it was good to read the comments from some other prominent developers.


*I know Gabe could be "totally impartial," but I'm also betting the man whose company received $5m to partner with ATI is not going to be speaking the whole truth at an ATI-sponsored event. I must be one of those "conspiracy theory" guys.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Rollo
Their casual dismissal of ATI as a competitor might be because they are supremely confident they will kick their butts in the next generation, but more likely they have been blindsided by the success of the 9700/9800 cards and want to keep even the acronym "ATI" as far out of everyone's mind as possible.

I don't think this is the case. It could be they just didn't host the event to make a direct attack on their competition like ATI did with "shader day". I trust these developers FAR more than Gabe for 5 million reasons*. I don't doubt that 24-bit precision in DX9 is better than 16-bit or mixed mode, but if the end result is water being a little shinier, and no cards can run it at what I consider acceptable speeds, who cares?

It doesn't matter if 16-bit is only a little less shiny, or whatever you want to call it. 24-bits is the minimum precision for the DirectX9 spec. Anything below that is non-compliance with DX9.

Nvidia has lobbied "mixed mode" into DX9 (and thankfully, by the grace of God, Microsoft allows mixed mode in DX9); however, when an application requests 24-bit minimum precision, Nvidia will have no choice but to go with their 32-bit solution, which, unfortunately, performs horribly.
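To put rough numbers on the precision gap (just a back-of-envelope Python sketch, assuming the bit layouts people usually cite: FP16 with a 10-bit mantissa, ATI's FP24 with a 16-bit mantissa, FP32 with a 23-bit mantissa):

# Rough relative precision (one ulp just below 1.0) for each shader format.
# Assumed mantissa widths: FP16 = 10 bits, ATI's FP24 = 16 bits, FP32 = 23 bits.
formats = {"FP16": 10, "FP24": 16, "FP32": 23}
for name, mantissa_bits in formats.items():
    ulp = 2.0 ** -(mantissa_bits + 1)  # spacing of representable values in [0.5, 1)
    print(f"{name}: ~{ulp:.2e} (about 1 part in {2 ** (mantissa_bits + 1):,})")
# FP16: ~4.88e-04 (1 part in 2,048)
# FP24: ~7.63e-06 (1 part in 131,072)
# FP32: ~5.96e-08 (1 part in 16,777,216)

So dropping from FP24 to FP16 costs roughly a factor of 64 in precision, which is why "a little less shiny" can turn into visible artifacts once shaders get longer.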

Check this thread here for more info on Nvidia's non-compliance with DX9, etc. (credit is due to Pete for the link):


 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Nvidia seems to be beating this "ATI doesn't have enough room for shader instructions" horse into the ground. Shaders slow the he!! out of any video card right now. For example, even in Halo with its crappy DX9 implementation, the glossy DX9 walls drop my framerate from 160 to 60 fps. Will developers hit a wall on ATI hardware by late 2004? It's arguable - they might. By 2005? No question. However, in those future fully DX9 (9.1+) games, the shader routines will be so long that they'll bog the hardware down anyway, making the instruction limit a moot point. Half-Life 2 already runs at ~60 fps on a Radeon 9800 Pro, and future games will only be more demanding, so we will need very powerful GPUs to get playable framerates in future, shader-heavy games.


I don't think game developers will, but... Nvidia is also marketing these cards for professionals. Imagine taking the shader programs you wrote, which needed a CPU farm to run at .5 fps, and now being able to do it on the card at .5 fps. The longer shader length at this point in time, IMO, has nothing to do with games. I don't think any of the hardware can run a shader program 50 instructions deep at anything we could consider playable frame rates. But what about a situation where you don't require playable frame rates but can cut down on hardware costs or even time?

Maybe in 5 years we'll have cards that can run a 1000-instruction-long shader program at 60 FPS. But not on today's hardware.

It doesn't matter if 16-bit is only a little less shiny, or whatever you want to call it. 24-bits is the minimum precision for the DirectX9 spec. Anything below that is non-compliance with DX9.


I thought they had a partial precision mode for DX9? Microsoft apparently felt FP16 was worthy of something if they included it in the spec. Full precision is FP24.

Check this thread here for more info on Nvidia's non-compliance with DX9, etc. (credit is due to Pete for the link):


I am sure there are a lot of things ATI is not compliant with. I am sure DX9 is a far-reaching spec with many parts. Just because you don't support every single one of them to a T doesn't mean your card is not DX9 compliant.


Edit: Also I thought for HDR you had to have FP targets available? I think the 52.xx series of drivers opened up that functionality.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
24-bits is the minimum precision for the DirectX9 spec. Anything below that is non-compliance with DX9.
Gasp! Oh no! That is definitely a big deal these days and in the near future!
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I don't think this is the case. It could be they just didn't host the event to make a direct attack on their competition like ATI did with "shader day". I trust these developers FAR more than Gabe for 5 million reasons*.
You know that UT2K3 and all EA games run under the TWIMTBP banner, right? And Shader Day(s) weren't held to attack nV. In fact, did Gabe even mention nV by name when listing those driver issues? The only nV-specific claim was that their Det 5x's didn't render everything correctly, thus were excluded from benchmarking. I also don't think we've had official confirmation of how much ATi paid Valve.

Nvidia is also marketing these cards for professionals. Imagine taking the shader programs you wrote, which needed a CPU farm to run at .5 fps, and now being able to do it on the card at .5 fps.
Well, ATi has ASHLI to help convert longer shaders into smaller ones. I suppose nV's support for 32-bit, assuming it's IEEE-compliant, gives it an edge if people will be using 3D cards to render final scenes. But I suspect 3D cards will be used more for previews for a little while yet. Maybe PCI Express will accelerate things, making it far faster to swap data b/w the CPU and video card.
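(The multipass idea is conceptually simple: chop a long shader into chunks that each fit under a card's per-pass instruction limit, writing intermediate results out to a render target between passes. A toy Python sketch of the concept follows; it is not ASHLI's actual algorithm, and it ignores the data dependencies a real compiler has to respect.)

# Toy illustration of multipass shader splitting -- not ASHLI's real algorithm,
# and it ignores data dependencies between instructions.
def split_into_passes(instructions, per_pass_limit=96):
    # Each chunk would become one rendering pass, with intermediate
    # results saved to a texture/render target for the next pass.
    return [instructions[i:i + per_pass_limit]
            for i in range(0, len(instructions), per_pass_limit)]

long_shader = [f"op_{n}" for n in range(300)]  # pretend 300-instruction shader
for n, rendering_pass in enumerate(split_into_passes(long_shader), 1):
    print(f"pass {n}: {len(rendering_pass)} instructions")
# passes 1-3: 96 instructions each, pass 4: 12 instructions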

Edit: Also I thought for HDR you had to have FP targets available? I think the 52.xx series of drivers opened up that functionality.
I think they're listed as being so in the release notes, but people at B3D are saying that functionality is still not exposed.

Edit: Spelling.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Genx87
Nvidia seems to be beating this "ATI doesn't have enough room for shader instructions" horse into the ground. Shaders slow the he!! out of any video card right now. For example, even in Halo with its crappy DX9 implementation, the glossy DX9 walls drop my framerate from 160 to 60 fps. Will developers hit a wall on ATI hardware by late 2004? It's arguable - they might. By 2005? No question. However, in those future fully DX9 (9.1+) games, the shader routines will be so long that they'll bog the hardware down anyway, making the instruction limit a moot point. Half-Life 2 already runs at ~60 fps on a Radeon 9800 Pro, and future games will only be more demanding, so we will need very powerful GPUs to get playable framerates in future, shader-heavy games.


I don't think game developers will, but... Nvidia is also marketing these cards for professionals. Imagine taking the shader programs you wrote, which needed a CPU farm to run at .5 fps, and now being able to do it on the card at .5 fps. The longer shader length at this point in time, IMO, has nothing to do with games. I don't think any of the hardware can run a shader program 50 instructions deep at anything we could consider playable frame rates. But what about a situation where you don't require playable frame rates but can cut down on hardware costs or even time?

The FX series is primarily a gaming card, and that is their largest market by several orders of magnitude.

Maybe in 5 years we'll have cards that can run a 1000-instruction-long shader program at 60 FPS. But not on today's hardware.

Exactly! Which is why Nvidia designing the FX series with room for so many instructions is (at the moment) pointless - the base DX9 PS 2.0 spec only calls for 64 arithmetic plus 32 texture instructions (96 total).

It doesn't matter if 16-bit is only a little less shiny, or whatever you want to call it. 24-bits is the minimum precision for the DirectX9 spec. Anything below that is non-compliance with DX9.


I thought they had a partial precision mode for DX9? Microsoft apparently felt FP16 was worthy of something if they included it in the spec. Full precision is FP24.

Not initially. FP24 was the standard. After the FX series came out, Nvidia was screwed, so they lobbied for mixed mode. Thankfully for them, Microsoft put it in.

I am sure there are a lot of things ATI is not compliant with. I am sure DX9 is a far-reaching spec with many parts. Just because you don't support every single one of them to a T doesn't mean your card is not DX9 compliant.

No, the Radeon 9800 is 100% compliant with DirectX9 (the FX series is not). Neither company is 100% compliant with DX9.1 yet.

Originally posted by: Rollo
24-bits is the minimum precision for the DirectX9 spec. Anything below that is non-compliance with DX9.
Gasp! Oh no! That is definitely a big deal these days and in the near future!

It's already an issue! See: Half Life 2.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Well, ATi has ASHLI to help convert longer shaders into smaller ones. I suppose nV's support for 32-bit, assuming it's IEEE-compliant, gives it an edge if people will be using 3D cards to render final scenes. But I suspect 3D cards will be used more for previews for a little while yet. Maybe PCI Express will accelerate things, making it far faster to swap data b/w the CPU and video card.


It isn't necessarily just the shader length but also the precision. 24-bit is fine for small shaders in this regard, but what happens when you multiply a lower-precision number over and over? Errors will accumulate. I'll have to dig up a thread on Beyond3D where they talked about this. There were some fascinating things being tossed around.
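(To illustrate, a quick Python sketch using half precision on the CPU as a stand-in for FP16 shader math -- FP24 isn't a native CPU type, but it would land somewhere in between:)

# Error build-up at low precision: repeatedly scale a value in FP16 and FP32
# and watch the FP16 result drift away from the FP32 one.
import numpy as np

x16 = np.float16(1.0)
x32 = np.float32(1.0)
scale = 1.001  # a small per-step factor, like an iterated blend/lighting term

for _ in range(200):
    x16 = np.float16(x16) * np.float16(scale)
    x32 = np.float32(x32) * np.float32(scale)

print(x16, x32)                      # the two values have visibly diverged
print(abs(float(x16) - float(x32)))  # the accumulated error after 200 steps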

The FX series is primarily a gaming card, and that is their largest market by several orders of magnitude.


This is correct, but Nvidia is also selling the exact same chip in their Quadro line of cards. And they do it at 5 times the price.

Not initially. FP24 was the standard. After the FX series came out, Nvidia was screwed, so they lobbied for mixed mode. Thankfully for them, Microsoft put it in.


Do you have a link on when FP24 was made the standard? The best I could come up with was that as of July 12th, 2002, FP16 was still being tossed around as the minimum precision.

No, the Radeon 9800 is 100% compliant with DirectX9 (the FX series is not). Neither company is 100% compliant with DX9.1 yet.


Got a link?

It's already an issue! See: Half Life 2.


Is it? I don't even see HL2 out yet. And the screenshots I saw comparing a DX9 shader and DX8 looked almost identical.

 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Genx87
Well, ATi has ASHLI to help convert longer shaders into smaller ones. I suppose nV's support for 32-bit, assuming it's IEEE-compliant, gives it an edge if people will be using 3D cards to render final scenes. But I suspect 3D cards will be used more for previews for a little while yet. Maybe PCI Express will accelerate things, making it far faster to swap data b/w the CPU and video card.


It isn't necessarily just the shader length but also the precision. 24-bit is fine for small shaders in this regard, but what happens when you multiply a lower-precision number over and over? Errors will accumulate. I'll have to dig up a thread on Beyond3D where they talked about this. There were some fascinating things being tossed around.

The FX series is primarily a gaming card, and that is their largest market by several orders of magnitude.


This is correct, but Nvidia is also selling the exact same chip in their Quadro line of cards. And they do it at 5 times the price.

Just because they charge 5 times the price does not mean they are making equal money from both the professional and the gaming sectors. They sell many, many more boards as gaming cards than professional ones. ATI also sells a professional board. What's your point? We're talking about the FX as a gaming card here.

Not initially. FP24 was the standard. After the FX series came out, Nvidia was screwed, so they lobbied for mixed mode. Thankfully for them, Microsoft put it in.


Do you have a link on when FP24 was made the standard? The best I could come up with was that as of July 12th, 2002, FP16 was still being tossed around as the minimum precision.

Well here's an interview with Nvidia about it.

According to DirectX 9 spec, FP24 is allowed, but we at NVIDIA believe it is insufficient to avoid errors for a variety of texture and geometry calculations. So we designed our products to use FP32. For example, NVIDIA uses FP32 for texture address calculation. In the case of dependent texture reads (e.g. a bumpy shiny object with a reflection map in a scene), full precision (FP32) for the texture address calculation is critical for getting a high-quality result.

^ Can you not see how FP32 has totally backfired on Nvidia? They originally called FP24 "insufficient," and yet now they use FP16 for many purposes. Isn't that hypocritical?
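(For what it's worth, the texture-addressing point in that quote is easy to demonstrate: with FP16's ~11 significant bits, coordinates for adjacent texels in a large texture can round to the same value, which is why higher precision matters for things like dependent texture reads. A quick Python sketch using numpy's half type as a stand-in:)

# FP16 can't always tell adjacent texel coordinates apart in a big texture:
# in [0.5, 1) half-precision values are spaced 2^-11 apart, but an
# 8192-texel texture needs coordinate steps of 2^-13.
import numpy as np

tex_size = 8192
u_a = 4096 / tex_size  # coordinate of texel 4096
u_b = 4097 / tex_size  # coordinate of the very next texel

print(np.float16(u_a) == np.float16(u_b))  # True  -> the two texels collapse in FP16
print(np.float32(u_a) == np.float32(u_b))  # False -> still distinct in FP32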

No, the Radeon 9800 is 100% compliant with DirectX9 (the FX series is not). Neither company is 100% compliant with DX9.1 yet.


Got a link?

Check the most recent Microsoft whitepapers on DX9 to see what they say about it. I posted a link to Microsoft earlier where they call the Radeon 9700 Pro the first fully compliant DX9 card, but every hardware site already says this. The only thing keeping the FX cards from being 100% compliant is the lack of FP24 support. They could alleviate this by running everything at FP32, but they don't because the performance is atrocious.

It's already an issue! See: Half Life 2.


Is it? I don't even see HL2 out yet. And the screenshots I saw comparing a DX9 shader and DX8 looked almost identical.

Valve has talked about the pains they took to create the mixed mode path time and time again. This is due to (brace yourself for it) Nvidia's non-compliance with DX9, i.e. their lack of FP24 support. I've said the same thing in a clear and concise manner - what part don't you understand?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
It's already an issue! See: Half Life 2.
I'd love to, but I can't, because of the holes in Valve's security that allowed thieves to delay it even further. Maybe next summer I'll see HL2.

Pete:
I know some of those companies partner with nVidia, but id doesn't. My point was that it was interesting to see the other side of the story on DX9, as I don't expect Gabe to point out the highlights of nVidia or the low points of ATI.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Well, nVidia implemented some seemingly (for the moment) id-specific suggestions into FX hardware (stencil shadows), and id had that nVidia-sponsored Doom 3 benchmark event. So I don't know if id as a company is totally impartial (Todd H said he showed up b/c nV is "a good business partner of ours"), though I believe JC is. I agree that the article may be interesting in its potentially disinterested take on things. But you have to keep in mind the interviews were conducted at an nVidia-sponsored event. :) I also believe some of the things the author glossed over impinge on the article's neutral stance, though his errors/omissions were probably not purposeful (as he himself stated in the article, and later in B3D). As I said in that B3D thread on the article, perhaps I'm unrealistically expecting perfection. It was a good article, but I often gloss over the pleasantries to get at the perceived errors.