
Shader Replacement

In the AT article, Derek talked about this being the first GPU since the GeForce 4 that Nvidia has not had to use shader replacement for.

Can someone explain this? I also assume this is a good thing, so what are the benefits to this?

-Kevin
 
Shader replacement is a technique for improving performance by rewriting the shader programs that ship with a game (most commonly pixel shaders, but potentially also vertex shaders) to run better on a specific piece of hardware or shader architecture.

The only proven example of shader replacement so far is ATi's Doom3 replacement shader, which substitutes math instructions (which the ATi shader architecture is good at) for the original texture lookup instructions (which ATi is poor at compared to competing shader architectures).
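To make the math-for-texture-lookup trade concrete, here is a toy sketch in Python (not actual driver or shader code; the lookup size and specular exponent are assumptions): Doom 3 computed its specular term via a power-function lookup texture, and a replacement can compute the same value directly with ALU math.

```python
import math

# Toy model of a specular lookup texture: a 256-entry 1D "texture"
# precomputed as pow(x, 16), sampled with nearest-neighbor lookup.
LUT_SIZE = 256
SPECULAR_EXP = 16.0
lut = [math.pow(i / (LUT_SIZE - 1), SPECULAR_EXP) for i in range(LUT_SIZE)]

def specular_via_texture(n_dot_h):
    """Original approach: fetch pow(x, 16) from the lookup texture."""
    idx = min(int(n_dot_h * (LUT_SIZE - 1) + 0.5), LUT_SIZE - 1)
    return lut[idx]

def specular_via_math(n_dot_h):
    """Replacement approach: compute pow(x, 16) with ALU math
    (four multiplies: x^2 -> x^4 -> x^8 -> x^16)."""
    x = n_dot_h * n_dot_h   # x^2
    x = x * x               # x^4
    x = x * x               # x^8
    return x * x            # x^16

# The two agree only to within the lookup table's precision, which is
# why such a replacement is close to, but not bit-identical with,
# the original shader's output.
for v in (0.0, 0.25, 0.5, 0.9, 1.0):
    print(f"{v:4.2f}: lut={specular_via_texture(v):.6f} math={specular_via_math(v):.6f}")
```

Which version is faster depends on the hardware: an architecture with cheap math and expensive texture fetches wins with the second version, which is the whole point of the substitution.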

Most of the fuss and controversy surrounding the issue was fuelled by ATi and its supporters alleging that their competition makes widespread use of shader replacement. As stated above, ATi's Doom3 shader is the only proven example so far...
 
The only proven example of shader replacement so far is ATi's Doom3 replacement shader
If you seriously think nVidia is not guilty of shader substitution I can only assume you're either blatantly trolling or have been living under a rock for the last few years.

But based on your "ATi doesn't do application profiles" comments I'd be inclined to believe it's a combination of the two.
 
Tommti-Systems' 3D Analyzer has the ability to save all the shader programs in an application to file.

This allows the captured shaders to be compared against the original shader source code (where available) and against the output from other shader architectures.

3D Analyzer also features an anti-detect mode that prevents driver detection of shaders.

Those not living under rocks would know that such information is invaluable when detecting shader replacements...
 
This allows the captured shaders to be compared against the original shader source code (where available) and against the output from other shader architectures.
In many cases the visual output is identical.

3D Analyzer also features an anti-detect mode that prevents driver detection of shaders.
Uh-huh, and how many reviewers use this? For that matter how many games have you used it on?

Not to mention RivaTuner used to have anti-cheat measures, but nVidia defeated them by encrypting the drivers. Seems mighty strange to do something like that if you've got nothing to hide, don't you think?

Unwinder also found over 40 applications being detected at the driver (i.e. non-profile) level. Couple that with the comments from Carmack, 3DMark's audit report, and nVidia's general inconsistencies when applications are renamed, and you'd have to be a loon to make your original comment.
 
There is nothing wrong with shader replacements that are mathematically and visually identical.

There is also nothing wrong with compiler optimization of shaders (reordering of instructions to best suit the particular architecture) and both ATi and nVidia do this.
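The instruction-reordering kind of optimization mentioned above can be sketched with a toy scheduler (all opcodes and register names here are made up for illustration): hoist an instruction that doesn't depend on a texture fetch so it executes while the fetch is in flight, without changing the result.

```python
# Toy instruction scheduler: hoist an independent instruction between a
# texture fetch and the math that consumes it, hiding fetch latency.
# Instructions are (dest, op, sources); all names are hypothetical.
program = [
    ("r0", "TEX", ["t0"]),        # texture fetch (high latency)
    ("r1", "MUL", ["r0", "c0"]),  # depends on the fetch result r0
    ("r2", "ADD", ["c1", "c2"]),  # independent of the fetch
]

def schedule(prog):
    """If the instruction right after a TEX depends on its result but a
    later one does not, swap them so math runs during the fetch."""
    prog = list(prog)  # work on a copy; leave the input untouched
    for i, (dest, op, _) in enumerate(prog):
        if op == "TEX" and i + 2 < len(prog):
            nxt, later = prog[i + 1], prog[i + 2]
            if dest in nxt[2] and dest not in later[2]:
                prog[i + 1], prog[i + 2] = later, nxt
    return prog

for dest, op, srcs in schedule(program):
    print(f"{op} {dest}, {', '.join(srcs)}")
```

Because only instructions with no data dependency are moved, the reordered program computes exactly the same values, which is what makes this class of optimization uncontroversial.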

There is a problem when the replacement shader is not visually or mathematically identical, such as with the ATi Doom3 shader. In fact, visual differences are exactly what ATi tried to accuse nVidia of introducing with its alleged shader replacements, when in fact THEY are the ones providing a replacement shader (Doom 3) that produces incorrect (or unintended by the shader's author) output.
 
There is nothing wrong with shader replacements that are mathematically and visually identical.
I agree totally, if you can do the replacement without relying on any prior knowledge such as application detection.

And if the replacement is generic enough to be able to replace similar shaders in different games.

If you can't it's not a legitimate optimization at all but a cheat.

Do you think when Intel or AMD design their processor architectures they put in "if game = quake 3 and shader = x do this"?
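The per-application special-casing being criticized here can be sketched like this (a hypothetical model, not code from any real driver; the shader strings and app names are made up): the driver hashes the incoming shader, and if the (application, hash) pair matches a hard-coded table, it silently swaps in a hand-tuned replacement.

```python
import hashlib

# Hypothetical driver-side replacement table keyed by a hash of the
# incoming shader text: "if game = X and shader = Y, swap in Z".
ORIGINAL_SHADER = "MUL r0, t0, c0; TEX r1, t1;"  # made-up shader text
REPLACEMENT_SHADER = "MAD r0, t0, c0, c1;"       # made-up "faster" version

def shader_hash(source: str) -> str:
    return hashlib.md5(source.encode()).hexdigest()

REPLACEMENTS = {
    ("quake3.exe", shader_hash(ORIGINAL_SHADER)): REPLACEMENT_SHADER,
}

def compile_shader(app_name: str, source: str) -> str:
    """Return a hand-tuned replacement if this exact (app, shader) pair
    is in the table; otherwise compile the shader as submitted."""
    return REPLACEMENTS.get((app_name, shader_hash(source)), source)

print(compile_shader("quake3.exe", ORIGINAL_SHADER))   # swapped
print(compile_shader("renamed.exe", ORIGINAL_SHADER))  # untouched
```

Note that renaming the executable misses the table entry and the replacement vanishes, which is exactly the renamed-application inconsistency mentioned earlier in the thread.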

There is also nothing wrong with compiler optimization of shaders (reordering of instructions to best suit the particular architecture) and both ATi and nVidia do this.
Exactly the same as above.
 
Catalyst A.I. is shader replacement, and NVIDIA has an as-yet-unnamed shader replacer for the rest of its cards. It's there to improve performance on a per-game basis, which some consider cheating. As long as the differences in visual output are negligible, I couldn't care less. Kudos to NVIDIA for releasing a legit GPU, per se. Improving processing efficiency is always the best way to do things... and it doesn't hurt quality.
 
There is NO application detection going on where nVidia is concerned, unlike ATi, who detect Doom3 then replace the shader!!!
 
It would seem ATi is growing more and more desperate by the day.

Check out these ATi shader related "issues" that Tech-Report has uncovered in ATi's latest drivers (you know - the "performance increasing" ones... bet I know how they increased the performance too! 😉 )

Tech Report 7800 GTX review - Shadermark page1
7800 GTX review - Shadermark page2
7800 GTX review - Shadermark page3
7800 GTX review - Shadermark page4

Considering how much the ATi supporters used to love to use Shadermark to bash nVidia this is beautifully ironic IMO :laugh: :evil:

What goes around comes around...
 