Well, if you've read the description of the current nv3x code path, many shaders have already been degraded from 24-bit to 16-bit precision (1/256 the potential accuracy) and performance is still poor. Based on how nV has optimized for 3DMark, UT2K3, etc., there is a fair amount of existing evidence that nV will degrade the IQ further to get the speed up.
(ed) but you're right that I haven't seen the code in question and haven't even written any 3D rendering code since grad school, so I can't say with certainty that nV won't find some major optimizations that don't hurt image quality. Considering that Valve has already spent "5x the time" on the nV code path, it doesn't seem too likely though.
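For reference, NV3x's FP16 is an s10e5 format (10 mantissa bits) while ATI's FP24 is s16e7 (16 mantissa bits). Here's a minimal Python sketch (the quantize_mantissa helper is just illustrative) that truncates a float32 mantissa down to those widths, ignoring the narrower exponent ranges of the real formats, just to put rough numbers on the per-operation error being argued about:

import struct

def quantize_mantissa(x: float, mantissa_bits: int) -> float:
    """Truncate a float32 value's 23-bit mantissa to `mantissa_bits`.

    Crude simulation only: it ignores the narrower exponent ranges
    of the real FP16 (s10e5) and FP24 (s16e7) formats.
    """
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    drop = 23 - mantissa_bits               # low mantissa bits to discard
    mask = 0xFFFFFFFF & ~((1 << drop) - 1)
    return struct.unpack("<f", struct.pack("<I", bits & mask))[0]

x = 0.123456789
for name, bits in [("FP16", 10), ("FP24", 16)]:
    q = quantize_mantissa(x, bits)
    print(f"{name}: {q!r}  error {abs(x - q):.1e}")
# FP16 error comes out on the order of 1e-5..1e-4 per operation,
# FP24 on the order of 1e-7..1e-6 -- tiny on their own, but shader
# errors compound across every instruction in the program.

Per value the mantissa gap is only 2^6 = 64x, and either error is invisible in a single operation; the question is what happens after it compounds through a long DX9 shader, which is presumably where HL2 and Doom3 diverge.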
John Carmack has said repeatedly that there is almost no visible difference between FP16 and FP32. So what makes you think FP16 --> FP24 would be such a glaring difference? In the screenies I have seen, I can't even tell the difference between FP24 (full DX9) and 12-bit integer (DX8). They look the same to me... so I highly doubt I'll be seeing a difference between FP16 and FP24.
And while we're on it, let's talk about ATi degrading picture quality in the same benchmark Nvidia did, and the whole Quake mess. Get over it: they all do it. It doesn't make it right, but to think ATI is above this is hilarious.
And no, performance is not poor using Nvidia's code path. John Carmack states it something like this, if I remember right:
ATI > Nvidia on the standard code path
ATI > Nvidia when Nvidia runs at FP32
Nvidia > ATI on the Nvidia-specific code path
The fact that Valve can't seem to write decent-performing shader programs using the Nvidia code path does not mean it will be like this for every game. HardOCP has some Doom3 preview benchies, and apparently John can get the Nvidia hardware running pretty nicely. The scores I remember seeing were something like Nvidia ~60 FPS, R350 ~35 FPS.