
Nvidia shows the whole picture.

"The panel then discussed Nvidia's comprehensive internal QA policy on optimizations, which states that the company refuses to optimize its drivers for specific benchmarks that emphasize features not found in real games, which is, as the representatives suggested, the reason why the most recent cards haven't shown universally high performance in recent benchmarks"

ROFLMAO
 

Pete

Diamond Member
Oct 10, 1999
They're generous with their wiggle room, aren't they? :)

Edit - Finally able to load the page and read the article. Seemed like a bunch of marketspeak, but this was an interesting allegation:

"The company also reiterated its commitment to image fidelity--rather than opt not to draw certain parts of a scene, GeForce FX cards draw every last part and effect. As an example, the panel showed two screenshots of an explosion from an overdraw benchmark, in which the GeForce card drew the entire explosion as a bright white flare, ATI Radeon card didn't draw every layer of the explosion (the upper-right corner had a slight reddish tinge)."

I'm sure I'll hear more about this later today, as people get a chance to read and react to it.

I find it laughable that nV hopes for one driver update per year, given their track record with the FX, which is currently up to more than one HARDWARE update a year. (I know, cheap shot. :D But nV can afford it, as they appear to be bouncing back.)
 

jiffylube1024

Diamond Member
Feb 17, 2002
The company's current line of cards based on the GeForce FX architecture (which is included in the GeForce FX 5800 and 5900 series), attempts to maximize high-end graphics performance by supporting both 16-bit and 32-bit per-color-channel shaders--most DirectX 9-based games use a combination of 16-bit and 32-bit calculations, since the former provides speed at the cost of inflexibility, while the latter provides a greater level of programming control at the cost of processing cycles. The panel went on to explain that 24-bit calculations, such as those used by the Radeon 9800's pixel shaders, often aren't enough for more-complex calculations, which can require 32-bit math.

So... 16-bit is good because it's flexible, 32-bit is good because it gives more programming control, but 24-bit isn't enough for complex calculations. That makes sense. :rolleyes: Valve found it quite easy to work with ATI's "limited" 24-bit shaders, and much more difficult to work with Nvidia's hardware.

Nobody's arguing that 32-bit doesn't give you the best quality. But this is what engineering is all about - tradeoffs. It appears that in this generation of hardware, 32-bit provides too big of a performance hit to be useful in a fully 32-bit shader mode. 24-bit appears to be the perfect balance of performance and video quality, while 16-bit is clearly not enough.
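
For anyone who wants numbers to go with that tradeoff, here's a quick Python sketch (assuming the commonly cited layouts: s10e5 for FP16, s16e7 for ATI's FP24, s23e8 for FP32) of roughly how fine-grained each format is near 1.0:

# Relative precision near 1.0 for each per-channel float format.
# Mantissa widths below are the commonly cited ones, not vendor-confirmed specs.
formats = {
    "FP16 (NV3x partial precision)": 10,   # s10e5
    "FP24 (R3x0 full precision)":    16,   # s16e7
    "FP32 (NV3x full precision)":    23,   # s23e8
}

for name, mantissa_bits in formats.items():
    # spacing between adjacent representable values in [1.0, 2.0)
    print(f"{name}: about 1 part in {2**mantissa_bits:,}")

That works out to roughly 1 part in 1,024 for FP16, 1 in 65,536 for FP24, and about 1 in 8.4 million for FP32, which is why FP24 reads as the middle ground being argued over here.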

I bet they're going to be having fun with this one at Beyond3d.
 

Robor

Elite Member
Oct 9, 1999
Originally posted by: shady06
yeah, this is basically all PR
Spin, spin, spin... ;) ATI had "shader day" and now Nvidia found something that makes ATI look bad. Whatever! I'll trust an *unbiased* opinion before buying the lip service from the PR staff of Nvidia or ATI.

 

Goose77

Senior member
Aug 25, 2000
God I love PR, it gives a good laugh every time!!!

My favorite part!
The panel then discussed Nvidia's comprehensive internal QA policy on optimizations, which states that the company refuses to optimize its drivers for specific benchmarks that emphasize features not found in real games

weren't they caught cheating on benchmarks???


This one is good too ;)
The company also reiterated its commitment to image fidelity--rather than opt not to draw certain parts of a scene

attempts to maximize high-end graphics performance by supporting both 16-bit and 32-bit per-color-channel shaders

If you're using 16-bit regs for 24-bit games, don't you leave some of the scene out???? Correct me if I'm wrong, but a 24-bit value will not fit in a 16-bit reg!



attempts to maximize high-end graphics performance by supporting both 16-bit and 32-bit per-color-channel shaders--most DirectX 9-based games use a combination of 16-bit and 32-bit calculations

I'm sorry, I don't understand this one. Isn't the DX9 standard 24 bits from MS??? Where are all these games they're talking about? You would think that if games did support 16/32-bit they would heavily advertise it with links and packaged deals!!!! Hmmm... ATI & HL2 comes to mind...


Anyways... I just love all this kaka!

edit: words
 

Zk1

Junior Member
Sep 20, 2003
Say what you want about NV, but at least they have the ability to make us laugh :)
 

Genx87

Lifer
Apr 8, 2002
If you're using 16-bit regs for a 24-bit sting game, don't you leave some of the scene out???? Correct me if I'm wrong, but a 24-bit sting will not fit in a 16-bit reg!


To my knowledge Nvidia would use a 32bit reg if it was using a 24bit number.


I'm sorry, I don't understand this one. Isn't the DX9 standard 24 bits from MS??? Where are all these games they're talking about? You would think that if games did support 16/32-bit they would heavily advertise it with links and packaged deals!!!! Hmmm... ATI & HL2 comes to mind...


Just because it is a standard doesn't mean every single shader will require 24 bits of precision. You may have shaders that need lower or higher precision; it just depends on the shader. Nvidia is trying to say they cover all avenues here vs. the competition's limited single-precision pipeline. Whether or not it really matters is up to the developers. You have some, like Valve, who say they need 24-bit, and some, like Carmack and id, who say you don't.

 

Goose77

Senior member
Aug 25, 2000
Hehe, meant string!! But even that usage might not be correct, so I took it out ;)

To my knowledge Nvidia would use a 32bit reg if it was using a 24bit number.


From what I can remember, they used 16-bit where NV felt they could lose a bit of precision! I don't think it mattered whether it was 24-bit or lower! Think about it: if the number or string is less than 24 bits, the GPU will not change it to 24-bit, it will work with what it is given.

Just because it is a standard doesn't mean every single shader will require 24 bits of precision. You may have shaders that need lower or higher precision; it just depends on the shader. Nvidia is trying to say they cover all avenues here vs. the competition's limited single-precision pipeline. Whether or not it really matters is up to the developers. You have some, like Valve, who say they need 24-bit, and some, like Carmack and id, who say you don't.

Right... not every shader will require that, but DX9 does require 24-bit; if it's less than that, it's not DX9! Remember, you're dealing with the level of shading: DX9 is shader 2.0, which runs at 24-bit; the others are 1.4, and 1.1 or 1.0 or something running fewer bits, can't remember exactly. Shader version correlates to the DX version (i.e. DX9 = shader 2.0, DX8.1 = shader 1.4, etc.).

As for the limitation of ATI's 24-bit, it's not limited: if it's given a 16-bit shader, the 24-bit reg will work with it; there will just be 8 bits of the reg that go unused, or wasted power!! This is why NV does so badly at DX9: it combines its 16-bit regs to make 32-bit ones... it's a shortcut that costs them power!! Using 32-bit takes up a lot of space, resulting in a large die, so NV was limited in the number of 32-bit regs that could fit in there. So in the end you're left with fewer registers to handle 24-bit computations, with wasted bits too!
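
The general point about register space is easy to illustrate, though, even if the specific NV3x details are fuzzier. A toy sketch in Python, with a completely made-up per-pixel register budget (not a real NV3x or R3x0 figure), just to show why wider temporaries leave you with fewer of them:

# Hypothetical register-file budget per pixel, purely for illustration.
REGISTER_FILE_BITS = 1024

# Each temporary register holds 4 components (RGBA / xyzw).
for fmt, bits_per_component in [("FP16", 16), ("FP32", 32)]:
    temps = REGISTER_FILE_BITS // (bits_per_component * 4)
    print(f"{fmt}: room for {temps} temporaries per pixel")

Doubling the width of every temp halves how many fit, which is the flavor of tradeoff being argued about, whatever the real on-die numbers are.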
 

Pete

Diamond Member
Oct 10, 1999
Yes, the DX9 spec supposedly pegs full precision at 24 bits minimum, but the spec also includes a provision for partial precision, which only the FX architecture implements.

I couldn't care less whether a game uses FX12, FP16, FP24, or FP32. I do care if the IQ differs among the settings, and we haven't seen pics showing any differences among them. No comparative D3 pics, and none for HL2.
 

Goose77

Senior member
Aug 25, 2000
Originally posted by: BenSkywalker
Right... not every shader will require that, but DX9 does require 24-bit; if it's less than that, it's not DX9!

FP16 is part of the DX9 spec.

Well, if that's true, my mistake. I need to reread some articles. Ty for the info and for correcting me.
 

SilentRunning

Golden Member
Aug 8, 2001
Originally posted by: BenSkywalker
Right... not every shader will require that, but DX9 does require 24-bit; if it's less than that, it's not DX9!

FP16 is part of the DX9 spec.

FP16 is part of the DX9 spec in the same way that DX8 is part of the DX9 spec. It is provided as a hint, which means that software developers have the option to use it. Developers also have the option to program in DX8 and it will work under DX9, but that does not mean it complies with the DX9 specifications.
 

BenSkywalker

Diamond Member
Oct 9, 1999
FP16 is part of the DX9 spec in the same way that DX8 is part of the DX9 spec.

No, FP16 is new to DX9. Partial precision will produce identical results for a great many shaders where running full precision would be wasted resources on hardware that supports the PP spec. The same is true for shaders that require FP24 but for which FP32 is overkill: wasted resources when forced to run in full precision mode (although no current board supports all three standards).
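
To make the "identical results" point concrete, here's a minimal Python sketch (using the standard library's IEEE half-precision packing as a stand-in for FP16 shader math; the input values are made up): a plain colour modulate comes out the same once it's written to an 8-bit framebuffer, while large texture-coordinate arithmetic visibly loses precision at FP16.

import struct

def to_fp16(x):
    # Round a Python float to IEEE half precision and back (struct format 'e').
    return struct.unpack('e', struct.pack('e', x))[0]

def to_8bit(x):
    # Quantize a [0,1] shader result to an 8-bit-per-channel framebuffer value.
    return round(max(0.0, min(1.0, x)) * 255)

# A typical colour modulate: the FP16 rounding error disappears in the
# 8-bit output, so partial precision gives the exact same pixel.
diffuse, lightmap = 0.73, 0.41
full    = to_8bit(diffuse * lightmap)
partial = to_8bit(to_fp16(to_fp16(diffuse) * to_fp16(lightmap)))
print(full, partial)                              # 76 76

# A big texture coordinate: above 512 the FP16 step is 0.5, so the
# fractional part (which is what addresses the texture) comes out wrong.
texcoord = 513.37
print(texcoord % 1.0, to_fp16(texcoord) % 1.0)    # ~0.37 vs 0.5

Which shaders land in which bucket is exactly the per-shader judgment call being described above.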
 

Goose77

Senior member
Aug 25, 2000
OK, can someone put up some links? This is something that I would love to understand!!! This info might shine some light on another reason why NV did the 16/32-bit core.

Edit: and all that depends on whether FP16 was added at the beginning or after the NV fiasco!