Originally posted by: Rollo
Kneel before your green nVidia OVERLORD, Pete!
LOL
This thread is not meant to be an ATI vs nVidia thread, I was honestly attempting to answer Gururu's question: Why did nVidia implement FP32 before it worked well on the card?
While I'm just regurgitating what I've read, my understanding is that when the NV30 was in development, nVidia basically bet wrong that MS would make 16-bit the standard precision for DX9, and implemented FP32 and much longer instruction limits with a focus toward developers and the Quadro line.
If I'm on crack and someone has a link to an interview where they state, "It was costly and foolish to make those first steps toward FP32, we should have waited for 2005 like ATI," I'll gladly recant.
You're right, 20-30% won't close the gap between nV3x and R300, but it is a huge gain.
These threads turn into flame fests whenever someone says something disagreeable to others, but I'll contribute what I read.
When Nvidia designed the NV3x line (which was several years before it launched, as you are well aware), they seemed to be in their 3dfx phase of "forcefeed the industry what we think it needs," i.e. with Cg and the like. From what I heard, Nvidia essentially said 16/32-bit precision is "what the industry needs," and designed FP16 into their cores knowing full well FP24 was the standard for full precision (or was going to become the standard). Moreover, with their design of NV30, FP24 performance may have been inferior (possibly drastically) to FP16 (more on this below), so they went with the FP16/FP32 design, intending to use FP16 now and FP32 on future generations of cards. But hey, they'd still support FP32, so you'd feel future-proof.
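To make the precision gap concrete, here's a quick Python sketch of my own (nothing from either vendor's actual hardware) using the standard struct module, which can store a value at FP16 ('e') or FP32 ('f') precision. FP24 has no software format to demo, so its 16-bit mantissa is only noted in a comment:

```python
import struct

def round_trip(value, fmt):
    """Store a Python float at a given binary precision and read it back:
    'e' = FP16 (10-bit mantissa), 'f' = FP32 (23-bit mantissa).
    ATI's FP24 (16-bit mantissa) falls in between, but struct has no code for it."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 1.0001  # a tiny per-pixel delta, the kind shader math accumulates

print(round_trip(x, 'e'))  # FP16: the delta is rounded away entirely -> 1.0
print(round_trip(x, 'f'))  # FP32: the delta survives almost exactly
```

Small errors like that compound over a long shader, which is why precision matters more as instruction counts grow.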
ATI, on the other side, was mimicking Nvidia's past success by "designing a GPU around Microsoft's DirectX standards" and went with a strict FP24 design (which was part of DirectX 9.0's spec for "full precision"; I believe Microsoft has amended DX9.0c to now include FP32 for "full precision," although this is several years after DirectX 9.0 launched).
One of the biggest surprises of the 9700 Pro (and one of the reasons it's one of the longest-surviving cards in the top performance brackets) is its 256-bit memory architecture, which, as it turns out, matched very well with the FP24 spec (although honestly, full DX9 games like Half-Life 2 are best played on an X800 or 6800 series card). This last point, for the record, I'm totally pulling out of my @ss, so I'm probably wrong or at least drastically stretching the truth. Alas, since I'm not an engineer for either company, I don't possess intimate knowledge of the inner workings of the GPUs (forgive me, please!).
Anyway, back to Nvidia: unless you believe the DustbusterFX cooling was their original intention, Nvidia had less memory bandwidth to work with, being on the standard 128-bit bus, and the GPU was probably designed to launch at a slower core clock than the 500 MHz the DustbusterFX-equipped, Rollo-approved 5800U debuted with.
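The bandwidth arithmetic is simple enough to sketch; the memory clocks below are the launch specs as I remember them, so double-check the exact numbers:

```python
def bandwidth_gb_s(bus_bits, mem_clock_mhz, pumps=2):
    """Theoretical peak memory bandwidth in GB/s:
    bus width in bytes * memory clock * transfers per clock (2 for DDR)."""
    return bus_bits / 8 * mem_clock_mhz * pumps * 1e6 / 1e9

# Approximate launch specs (from my memory, not official datasheets):
print(bandwidth_gb_s(128, 500))  # GeForce FX 5800 Ultra, 500 MHz DDR -> 16.0 GB/s
print(bandwidth_gb_s(256, 310))  # Radeon 9700 Pro, 310 MHz DDR      -> 19.84 GB/s
```

So even with much faster memory, the 128-bit bus leaves the 5800U behind the 9700 Pro on raw bandwidth.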
---------------------------------------------------------
This is all hearsay, but from what I gathered it's as close to the truth as I can find. ATI did gamble correctly by going with FP24 for their 9700/9800 series cards (sticking to it on their current generation of cards is an entirely different argument), and Nvidia did, apparently, make a misstep in adopting FP16/32 so early.
Of course, FP16/FP32 is a necessary transition for the industry, and will most definitely be the standard next generation (or is it already? Regardless, all of the competitors will support it next generation, unless Rollo deems ATI unfit to call competition). Crytek, for one, already uses FP blending for HDR (which uses FP32 AFAIK), and thus it only works on Nvidia cards. HDR doesn't exactly perform that great even on current generation cards, though, so again ATI's choice of FP24 (at least a generation plus a refresh ago) looks wise.
--------------------------------------------------------
I have a couple more points I'd like to discuss later, regarding HDR in HL2 (or the fact that it may be missing) and regarding how FP24 became the initial standard for DX9, but I'll save them for a later thread because I've spread enough hearsay for one post.