^ Those specs were invented by a hardware guru named
Chairmansteve in the fall of 2002. They have gotten more press than pretty much any other hardware guess I've seen, probably because they're close to ballpark accurate (albeit optimistic).
Here's the original post of the
specs.
About NV40. There are two approaches Nvidia can take. Stick with 32-bit (which will obviously have much better performance on next-gen hardware) or scrap the mixed-mode for the time being and adopt 24-bit like ATI has. Both companies are being tight-lipped about their next-gen hardware so we really don't know what they will be doing aside from rumours.
There are a few factors to consider for guessing which route they take:
1) Is DX 9.1 really going to use FP32 as a minimum? I think this is doubtful for a few reasons. First, it would alienate all current hardware from DX9.1 games, and we haven't really had any DX9 games to speak of yet (I think the total is ~3 games so far). Second, performance is much slower with FP32, and for the moment, the IQ "improvement" is non-existent.
2) Nvidia made an engineering decision to go with 32-bit back with the NV30. They could not redesign a new GPU around 24-bit in time for the NV35; that would've taken much longer than the NV35's short turnaround. However, will being "stuck" with 32-bit carry over into NV40? It might, and that might be a mixed blessing: ATI will have to switch to pure FP32 sometime anyway, possibly as soon as the next generation, and Nvidia has already made that move. *However*, with DX9 still so new, I wouldn't be surprised if FP32 was not a necessity until the generation after NV40/R420. I mean, Doom3 is set to launch around the time NV40 and R420 are supposed to be released, and it still has some FP16 code!! (at least for the FX cards) To say nothing of HL2 and other future games, which will use the DX9 standard, FP24.
3) It also depends on performance. If next-gen hardware is powerful enough to run FP32 at speeds similar to FP24, then perhaps both ATI and Nvidia will switch to FP32. Again, we don't know whether it will be. Unless the new GPUs are drastically more powerful than current hardware, though, it looks like FP32 will still carry a very large performance penalty.
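For a sense of what these precision modes actually mean, here's a quick sketch in plain Python, using the struct module's IEEE half- and single-precision packing as stand-ins for FP16 and FP32. (FP24 has no host-side equivalent; it sits between the two, with a 16-bit mantissa versus FP16's 10 bits and FP32's 23.) A small colour offset that FP32 keeps gets rounded away entirely in FP16:

```python
import struct

def roundtrip_fp16(x: float) -> float:
    """Round a Python float through IEEE half precision (10 mantissa bits)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def roundtrip_fp32(x: float) -> float:
    """Round a Python float through IEEE single precision (23 mantissa bits)."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# An offset of 1/2048 near 1.0 needs 11 mantissa bits: fits in FP32, not FP16.
x = 1.0 + 1.0 / 2048.0

print(roundtrip_fp32(x) - 1.0)  # 0.00048828125 -- the offset survives FP32
print(roundtrip_fp16(x) - 1.0)  # 0.0 -- FP16 rounds it away completely
```

Whether losses like this are visible on screen depends on how long the shader is; errors like this accumulate across many instructions, which is why FP16 artifacts show up mainly in long DX9 shaders.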
4) Nvidia seems to have taken a gamble with Cg and failed. They tried their luck at a proprietary shading language and it blew up in their face. Hopefully they will go back to trying to run the standard APIs faster than the competition (DirectX and OpenGL, the latter of which they already have a leg up on ATI in). This is what got Nvidia in the lead in the first place: they kept up with the DirectX specs and beat 3dfx's Voodoo cards in DX and OpenGL. Plus they supported all of the DX features while 3dfx was only partially compliant (Nvidia had 32-bit colour support and later stuff like bump mapping in DX7, while 3dfx kept trying their own proprietary stuff, like motion blur).
If you look at the GeForce3, Nvidia worked very closely with Microsoft and followed the spec exactly, and they launched the GF3 way before ATI had a DX8 card. Nvidia had a huge performance lead over existing hardware at the time, and the GF3 was a wild success. Instead of using the same strategy for DX9, they seem to have gotten a little too clever and tried to do their own thing, while ATI followed Nvidia's old playbook to a T. For the R9700 Pro, it was ATI who worked very closely with MS to have their card follow the DX9 spec to the letter.