(Heise.de) Nvidia Kepler GPUs are not fully compatible with DirectX 11.1


blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
If upcoming consoles are DX11 rather than DX11.1 then DX11.1 will be even more irrelevant.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,690
1,278
136
I would be surprised if at least one wasn't some form of GCN. If not, then AMD missed a huge opportunity.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Sure, because you like AMD more than nVidia, so for you there's a "very obvious difference between supporting a feature and running it well".

But it seems for AMD there is no difference:
2009 - Tessellation is the most important DX11 feature
2010 - Went public and spread false information about tessellation and its implementation in HAWX 2. Now AMD is the force behind the near non-existence of tessellation, apart from PN-tessellation for low-poly characters in games.

Why should I care about 11.1 when not even 11.0 gets full support from both vendors?

It's consoles that are holding back tessellation. Game models need to be designed for tessellation from the start for it to be implemented properly. Adding tessellation to relatively high-poly models just adds unneeded triangles and kills performance rather than helping it, which is the whole point of tessellation. Using tessellation for edge smoothing is the best implementation of it at the moment.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
Consoles are DX9 ATM, so anything higher should have been totally irrelevant too.

I think the consoles are often blamed too much nowadays; a big reason DX9 lived for so long was that it was the only way to target XP users.
The same will likely apply to Windows 7 and the longevity of DX11.0.

And it's well known that consoles have more efficient APIs and expose more hardware features than the same hardware would on the PC.

Anyway, I don't see why anyone would expect the PS4/720 not to have DX11.1 class hardware.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
It's consoles that are holding back tessellation. Game models need to be designed for tessellation from the start for it to be implemented properly. Adding tessellation to relatively high-poly models just adds unneeded triangles and kills performance rather than helping it, which is the whole point of tessellation. Using tessellation for edge smoothing is the best implementation of it at the moment.

Check Assassin's Creed III and its use of tessellation.
Never again will a game be published without fully deformable, tessellated snow.


 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
It's consoles that are holding back tessellation. Game models need to be designed for tessellation from the start for it to be implemented properly. Adding tessellation to relatively high-poly models just adds unneeded triangles and kills performance rather than helping it, which is the whole point of tessellation. Using tessellation for edge smoothing is the best implementation of it at the moment.

So, I guess you never saw a car tyre, right?
http://www.abload.de/img/hma2012-11-2312-39-105wpsl.jpg

If you think that tessellation should only be used for "edge smoothing", then I think we only need the eight new optional features for level 11_0 that were introduced by the 11.1 runtime.

Problem solved. Everyone is happy.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
It's consoles that are holding back tessellation. Game models need to be designed for tessellation from the start for it to be implemented properly. Adding tessellation to relatively high-poly models just adds unneeded triangles and kills performance rather than helping it, which is the whole point of tessellation. Using tessellation for edge smoothing is the best implementation of it at the moment.
Sadly, with DX11 tessellation the objects already need to be decently tessellated to get proper results.
You simply cannot add it to a huge wall and expect it to work; you need to divide the mesh into much smaller quads. (Remember, you can only go as high as 64 subdivisions per patch edge, which really isn't a lot of detail.)
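To put that 64× cap in perspective, here is a rough back-of-the-envelope sketch (plain Python; the wall and quad sizes are illustrative assumptions, not engine data). Since a patch edge can be split into at most 64 segments, fine detail on a large surface only comes from pre-dividing it into many smaller patches:

```python
# DX11 caps the tessellation factor at 64 segments per patch edge.
MAX_TESS_FACTOR = 64

def finest_detail(patch_width_m, tess_factor=MAX_TESS_FACTOR):
    """Smallest feature size (in metres) that tessellation alone can
    resolve across a quad patch of the given width."""
    return patch_width_m / tess_factor

# One huge 10 m wall patch: detail no finer than ~15.6 cm.
print(finest_detail(10.0))   # 0.15625

# Pre-divide the wall into 0.5 m quads: detail down to ~7.8 mm.
print(finest_detail(0.5))    # 0.0078125
```

This is exactly the point above: the artist has to hand the GPU a mesh that is already reasonably dense before the tessellator can add anything useful.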
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Kepler supports UAVs in pixel and compute shaders, but not in the tessellation-related (hull and domain) shaders or any other shader type before the rasterizer. And in my opinion, that is gaming relevant. UAVs in vertex, geometry, hull and domain shaders have no potential use for anything outside of gaming. Where else would one use these shader types? So nV's claim that they support all gaming-relevant DX11.1 features is plain wrong in my opinion.
If I'm not mistaken, they don't support a single feature of level 11_1 which is not optional for level 11_0.

It might just be that Microsoft didn't want UAV access by all shader types to be a new option for level 11_0. If Kepler doesn't support 64 UAVs or TIR or anything else from level 11_1, there is no way for Nvidia to expose UAV access by all shader types.

A UAV is a random-access (read/write) view on a buffer. You can do scattered writes to a UAV, for example. You can already render out into UAVs if you want in 11.0.
I think you cannot go without RT _and_ DS (+UAV) in 11.0, because then the whole pipeline basically goes to sleep. I'd say the "feature" is really only a guarantee that nothing just turns off; you should be able to do the feature itself without problems if your hardware isn't a bit inflexible.
nVidia's problem is likely the 64 UAVs, not that you can't turn off RT & DS together.

Edit: You also lose the resolution information when you don't have an RT & DS bound, so it goes hand in hand with the "Target-Independent Rasterization" feature of DirectX 11.1.

http://forum.beyond3d.com/showthread.php?t=58668&page=233


When it comes to facts there are two parts: 1) establishing the facts, and 2) whether they matter to the individual once known. You don't need part 2 to have the right to part 1 if the individual desires it, because everyone has a right to the facts.


1) If I pay for a 116dB SNR product but rumours say it in fact only has 115dB SNR components, I have a right to establish the facts and decide whether I want to take action. 2) Would I notice the difference? Very unlikely, and based on that I would probably not do anything unless I had paid a premium for the 116dB SNR product.
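For what it's worth, the 1 dB gap in that hypothetical is tiny. A quick sketch of the arithmetic (plain Python, purely illustrative of the analogy, not of any actual product):

```python
def db_to_amplitude_ratio(db):
    """Convert a dB difference to a linear amplitude (voltage) ratio."""
    return 10 ** (db / 20)

def db_to_power_ratio(db):
    """Convert a dB difference to a linear power ratio."""
    return 10 ** (db / 10)

# 116 dB vs 115 dB SNR: the noise floor differs by 1 dB, i.e. about
# 12% in amplitude or 26% in power -- far below what listening tests
# typically resolve when the noise is already ~115 dB down.
print(round(db_to_amplitude_ratio(1), 3))  # 1.122
print(round(db_to_power_ratio(1), 3))      # 1.259
```

Which is the point of the analogy: the spec discrepancy is real and worth establishing, even though the practical difference is almost certainly imperceptible.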

Personally I like to know the facts in this case, but I would not worry, as I'm sure the next round of cards will be out before a single DX11.1 game is. It may matter for people who plan to hold on to their current graphics cards for a considerable time.

It's not always about a visual difference but a performance difference, just like Assassin's Creed with DX10.1.


First impressions stick (that's where AMD went wrong with the 7xxx launch, right?). The press release and initial public statements claimed DX11.1, and that is what stays in people's minds; releasing another press release stating it's in fact only DX11 while the competition has DX11.1 is not good for the marketing image. So they just put DX11 on the box of the final product, because specifications can change at any time without notice and most people won't read all of it before buying.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Honestly I don't see a difference between DX9 and DX11.

What is 11.1 gonna bring? Who cares... All games right now are DX9 and DX10.

No worries, Kepler peeps...

False... DX9 and DX11 do have visual and performance differences.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
False... DX9 and DX11 do have visual and performance differences.

Indeed, depending on the game. Tessellation (which you don't get at all in DX9) is sometimes very noticeable, as are the lack of some AA modes and effects in some games, and the sharpness differences in the Dirt 2 DX9 and DX10 comparisons.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
So I read that article as best I could but I do not really see what the fuss is about. What does this deny me as a gamer who is buying video cards for my PC games? I sure don't buy cards to run flash and stare at the 2D desktop all day. If that's my main concern I'd just use my iGPU.

So someone please break this down without your crap slinging about "nvidia lied" or "AMD lied about tessellation before". Explain how this matters to a gamer and also explain to me why it matters if my card does or doesn't do some of these things that even Nvidia said aren't relevant to gaming since they only apply to 2D.

This is a thread designed to troll anyway; that much is obvious. It's just hidden under the guise of a discussion. Whenever any argument is made to explain Nvidia's side, or to point out that AMD hasn't always been compliant when they claimed to be, it's greeted with veiled hostility. We've had enough of that here, so someone please explain why I should care what this article says.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
So I read that article as best I could but I do not really see what the fuss is about. What does this deny me as a gamer who is buying video cards for my PC games? I sure don't buy cards to run flash and stare at the 2D desktop all day. If that's my main concern I'd just use my iGPU.

So someone please break this down without your crap slinging about "nvidia lied" or "AMD lied about tessellation before". Explain how this matters to a gamer and also explain to me why it matters if my card does or doesn't do some of these things that even Nvidia said aren't relevant to gaming since they only apply to 2D.

This is a thread designed to troll anyway; that much is obvious. It's just hidden under the guise of a discussion. Whenever any argument is made to explain Nvidia's side, or to point out that AMD hasn't always been compliant when they claimed to be, it's greeted with veiled hostility. We've had enough of that here, so someone please explain why I should care what this article says.

If you read the thread you would know, as your questions have already been answered, and it has been pointed out that what NV is missing is relevant to gaming despite NV's claim that it's only relevant to 2D.
Whether you should care or not is up to you.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
If you read the thread you would know as your questions have been answered already and it has been pointed out that what NV is missing is relevant to gaming despite NV's claims.

Nice try... I did read the thread. Nobody here has described why the 2D stuff matters in UE4, for example.

Nvidia is claiming that Kepler can do what matters to game developers at this point, and this article is only looking at a checklist of features. Nobody has told me why the unsupported features will matter in the next Frostbite or Crytek engine.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
No, none of it has been explained at all. Nice try... I did read the thread. Nobody here has described why 2D rasterizing matters in UE4, for example.

The thread was not about UE4.

Kepler supports UAVs in pixel and compute shaders, but not in the tessellation-related (hull and domain) shaders or any other shader type before the rasterizer. And in my opinion, that is gaming relevant. UAVs in vertex, geometry, hull and domain shaders have no potential use for anything outside of gaming. Where else would one use these shader types? So nV's claim that they support all gaming-relevant DX11.1 features is plain wrong in my opinion.
If I'm not mistaken, they don't support a single feature of level 11_1 which is not optional for level 11_0.

It might just be that Microsoft didn't want UAV access by all shader types to be a new option for level 11_0. If Kepler doesn't support 64 UAVs or TIR or anything else from level 11_1, there is no way for Nvidia to expose UAV access by all shader types.

A UAV is a random-access (read/write) view on a buffer. You can do scattered writes to a UAV, for example. You can already render out into UAVs if you want in 11.0.
I think you cannot go without RT _and_ DS (+UAV) in 11.0, because then the whole pipeline basically goes to sleep. I'd say the "feature" is really only a guarantee that nothing just turns off; you should be able to do the feature itself without problems if your hardware isn't a bit inflexible.
nVidia's problem is likely the 64 UAVs, not that you can't turn off RT & DS together.

Edit: You also lose the resolution information when you don't have an RT & DS bound, so it goes hand in hand with the "Target-Independent Rasterization" feature of DirectX 11.1.

http://forum.beyond3d.com/showthread.php?t=58668&page=233
Make of it what you will.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
That's simply an opinion on a feature, and unless this person is working on a game, we don't know that this is what UAVs are going to be used for.

Personally, until a developer comes out and explains why such-and-such a feature won't work on Nvidia cards, this doesn't matter.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
That's simply an opinion on a feature, and unless this person is working on a game, we don't know that this is what UAVs are going to be used for.

Personally, until a developer comes out and explains why such-and-such a feature won't work on Nvidia cards, this doesn't matter.

Second, to improve performance when rendering irregular geometry (e.g. geographical borders on a map), we use a new graphics hardware feature called Target Independent Rasterization, or TIR.

TIR enables Direct2D to spend fewer CPU cycles on tessellation, so it can give drawing instructions to the GPU more quickly and efficiently, without sacrificing visual quality. TIR is available in new GPU hardware designed for Windows 8 that supports DirectX 11.1.

Below is a chart showing the performance improvement for rendering anti-aliased geometry from a variety of SVG files on a DirectX 11.1 GPU supporting TIR:
http://blogs.msdn.com/b/b8/archive/...celerating-everything-windows-8-graphics.aspx
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Well, yeah, I was wondering what effect this would have in actual games. As far as I know, Direct2D is not related to games. I'm still waiting to see why Nvidia is wrong when they say they support all the gaming-related features.

Maybe we won't ever really know.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
You showed him 2D. He asked for 3D.

Tessellation is used for 3D, and just because we personally don't know a use for a feature does not mean it should all be dismissed because we don't understand it.

As my previous link shows, there are opinions from others who seem to understand more, who say it does matter to 3D gaming regardless of the fact that there is also a 2D function.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Tessellation is used for 3D, and just because we personally don't know a use for a feature does not mean it should all be dismissed because we don't understand it.

As my previous link shows, there are opinions from others who seem to understand more, who say it does matter to 3D gaming regardless of the fact that there is also a 2D function.

It specifically states Direct2D, not Direct3D. Again, I am under the impression that Direct2D is not used in games today. Maybe it is and I just don't understand exactly how.