Nvidia has updated its NVAPI to support shader intrinsics.

greatnoob

Senior member
Jan 6, 2014
968
395
136
More Gimpworks? No thanks.

Compiler optimisations. Basically it's describing converting some code like this (note that this is NOT shader code or shader assembly; it's an example of CPU code inlining):
Code:
int max(int i, int j) {
  return i > j ? i : j;
}

int a = max(5, 10);

to this:

Code:
int a = 5 > 10 ? 5 : 10;

except for GPUs and for 'niche' functions using Nvidia's shader extensions.

The example above is pretty crap tbh because it's 1) not in a shading language and 2) doesn't show the assembly output, which is where the difference in instructions actually shows up. But that's the gist of what's happening: reducing lines of code (assembly instructions, in reality) into a simplified, optimised version that uses instructions Nvidia GPUs can execute directly in hardware.
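As a rough CPU-side analogy of what an intrinsic buys you (a hypothetical example, not from the post or from Nvidia's headers): the generic version below compiles to a loop of shifts, masks and branches, while the intrinsic version typically compiles to a single hardware instruction on CPUs that have one.

Code:
#include <stdint.h>

/* Generic version: the compiler emits a loop of shifts, masks and adds. */
int popcount_generic(uint32_t x) {
    int n = 0;
    while (x) {
        n += x & 1;
        x >>= 1;
    }
    return n;
}

/* Intrinsic version: on GCC/Clang this usually becomes one POPCNT
   instruction when the target CPU supports it. */
int popcount_intrinsic(uint32_t x) {
    return __builtin_popcount(x);
}

Nvidia's shader extensions do the same kind of thing for warp-level operations (shuffle, vote, lane ID, etc.) that the GPU already has instructions for but plain HLSL can't express.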

These instructions were already being used by Nvidia's DX11 drivers. This is just a way to open up and document the hidden instructions the driver team uses, so game developers can use them in DX12 too via shader extensions.
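For anyone curious what "using them via shader extensions" looks like on the app side, here's a minimal sketch of the D3D11 opt-in path. The function names and the NV_EXTN_OP_SHFL constant are from the public nvapi.h / nvShaderExtnEnums.h headers as I remember them, and the UAV slot number (u7) is an arbitrary pick for illustration, so treat the details as approximate rather than copy-paste ready.

Code:
#include "nvapi.h"   // public NVAPI headers from Nvidia's developer site
#include <d3d11.h>

// Returns true if the driver accepted the shader-extension setup.
bool EnableNvShaderExtensions(ID3D11Device* device)
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return false;                      // not running on an Nvidia driver

    // Ask the driver whether a given "hidden" opcode (warp shuffle here)
    // is actually supported on this GPU/driver combo.
    bool supported = false;
    if (NvAPI_D3D11_IsNvShaderExtnOpCodeSupported(device, NV_EXTN_OP_SHFL,
                                                  &supported) != NVAPI_OK || !supported)
        return false;

    // Reserve a fake UAV slot. The HLSL side defines the same slot
    // (#define NV_SHADER_EXTN_SLOT u7 before including nvHLSLExtns.h), and the
    // driver recognises accesses to it as intrinsic calls and patches in the
    // real hardware instructions when the shader is compiled.
    return NvAPI_D3D11_SetNvShaderExtnSlot(device, 7) == NVAPI_OK;
}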

EDIT: under DX12 the driver has much less room to optimise, so developers become responsible for the work Nvidia's DX11 drivers have been doing behind the scenes; these extensions exist for optimisation's sake.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
This is something very, very minor and doesn't need a thread of its own.

I disagree. We need to find out exactly, in layman's terms of course, how this helps nVidia performance in DX12. If it is truly minor, we need to know.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
This isn't even newsworthy when you consider that these extensions have been available for a VERY LONG time through NVAPI ...