hehe, I meant string!! But even that usage might not be correct, so I took it out.
To my knowledge, Nvidia would use a 32-bit register if it were working with a 24-bit number.
From what I can remember, they used the 16-bit registers where Nvidia felt they could afford to lose a bit of precision! I don't think it mattered whether it was 24-bit or lower. Think about it: if the number or string is less than 24 bits, the GPU won't pad it up to 24-bit, it will work with what it's given.
Just because it's the standard doesn't mean every single shader will require 24 bits of precision. You may have shaders that need lower or higher; it just depends on the shader. Nvidia is trying to say they cover all the avenues here versus the competition's single-precision pipeline. Whether or not it really matters is up to the developers. You have some like Valve who say they need 24-bit, and some like Carmack and id who say you don't.
Right... not every shader will require that, but DX9 does require 24-bit; if it's less than that, it's not DX9! Remember, you're dealing with the level of shading: DX9 is Shader Model 2.0, which runs at a minimum of 24-bit, while the others are 1.4, 1.1, 1.0 or so, running at fewer bits, can't remember exactly. The shader version correlates to the DX version (i.e. DX9 = shader 2.0, DX8.1 = shader 1.4, etc.).
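To put some rough numbers on why the bit width matters (a minimal sketch in Python; the mantissa widths are the commonly cited s10e5 / s16e7 / s23e8 layouts, and the coordinate value is made up purely for illustration):

import math

# Stored mantissa bits for each format (assumed layouts, see note above).
FORMATS = {"FP16": 10, "FP24": 16, "FP32": 23}

def quantize(value, mantissa_bits):
    # Round `value` to the nearest number representable with this many mantissa bits.
    if value == 0.0:
        return 0.0
    exponent = math.floor(math.log2(abs(value)))
    step = 2.0 ** (exponent - mantissa_bits)  # spacing of representable values near `value`
    return round(value / step) * step

coord = 1234.567  # e.g. a texel address into a 2048-wide texture
for name, bits in FORMATS.items():
    q = quantize(coord, bits)
    print(f"{name}: stored as {q:.6f} (error {abs(q - coord):.6f})")

Running that shows FP16 is already off by almost half a texel at that range, FP24 is off by a few thousandths, and FP32 barely moves, which is roughly the gap the 2.0 spec is worried about.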
As for ATI's 24-bit being a limitation: it isn't. If it's given a 16-bit shader, the 24-bit register will still work with it; there will just be 8 bits of the register that go unused, or wasted power!! This is why Nvidia does so badly at DX9: it combines its 16-bit registers to make 32-bit ones, a shortcut that costs them performance!! Using 32-bit takes up a lot of space, resulting in a large die, so Nvidia was limited in how many 32-bit registers could fit in there. So in the end you're left with fewer registers to handle 24-bit computations, and with wasted bits too!
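Here's the register-budget point as toy arithmetic (the register-file size below is completely made up just to show the ratio; I don't have the real NV3x/R300 figures):

# Hypothetical per-pixel register file, in bits. Not a real chip's number.
REGISTER_FILE_BITS = 8192

for name, bits_per_component in [("FP16", 16), ("FP24", 24), ("FP32", 32)]:
    # Each temp register holds 4 components (RGBA / xyzw).
    temps = REGISTER_FILE_BITS // (bits_per_component * 4)
    print(f"{name}: room for {temps} four-component temp registers")

Whatever the real size is, the ratio is the point: the same storage holds twice as many FP16 temps as FP32 ones, with FP24 sitting in between, so running everything at 32-bit eats the register budget fastest.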