What does DX11 bring?

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Feb 14, 2010
As long as developers don't make some cool games using the best of DX11, it will always be BS.

I still prefer running games on DX9: best performance without sacrificing much graphical quality.

I do wonder sometimes, can you actually tell the difference in the graphics between DX9 and DX10 - in game? Well I can't, but I do notice a huge performance hit running games in DX10.
 

Scali

Banned
Dec 3, 2004
I do wonder sometimes, can you actually tell the difference in the graphics between DX9 and DX10 - in game?

Again, D3D is just an API, it all depends on what the developer does with it.
If you run the exact same code, shaders etc in DX10 or DX11, then it will look exactly the same as DX9. After all, the GPU doesn't know the difference between APIs. That's all handled by the drivers.
In theory it should be faster in DX10/11 than in DX9 though, because the driver model for DX10/11 is more efficient than DX9. But I've found that even when running identical code, DX9 used to be faster, until only a few months ago, when apparently AMD and nVidia finally managed to get their DX10/11 drivers close enough to get to the level of DX9 performance. Intel drivers are still a disaster in DX10/11.
 

taltamir

Lifer
Mar 21, 2004
As long as developers don't make some cool games using the best of DX11, it will always be BS.

I still prefer running games on DX9: best performance without sacrificing much graphical quality.

I do wonder sometimes, can you actually tell the difference in the graphics between DX9 and DX10 - in game? Well I can't, but I do notice a huge performance hit running games in DX10.

AFAIK, DX10 allows you to create the exact same graphics with less programming effort and at a higher FPS.
Some rare few games did exactly that. Most went ahead and piled on extra graphical quality though, because newer hardware is capable of that much more.
 

Scali

Banned
Dec 3, 2004
Most went ahead and piled on extra graphical quality though, because newer hardware is capable of that much more.

Yea, but that doesn't have much to do with the API itself.
If you look at the first DX9 games compared to the latest DX9 games, there's a huge difference in graphical quality as well, and there's no way you can run one of the latest DX9 games with full detail on an early DX9 card.
 

taltamir

Lifer
Mar 21, 2004
Yea, but that doesn't have much to do with the API itself.
If you look at the first DX9 games compared to the latest DX9 games, there's a huge difference in graphical quality as well, and there's no way you can run one of the latest DX9 games with full detail on an early DX9 card.

exactly. I am in total agreement there.
 

evolucion8

Platinum Member
Jun 17, 2005
Yea, but that doesn't have much to do with the API itself.
If you look at the first DX9 games compared to the latest DX9 games, there's a huge difference in graphical quality as well, and there's no way you can run one of the latest DX9 games with full detail on an early DX9 card.

That's true, but DX10 was slowed down considerably due to Vista's slow adoption. The first title that got updated to DX10 and looked much better was Call of Juarez. Other games like Resident Evil 5, Crysis Warhead, BioShock and Lost Planet offered marginal image quality improvements with DX10, especially with the HDR, thanks to the higher floating-point precision used, but the killer DX10 game hasn't arrived yet. DX11 is certainly faster than DX10, and the only previous generation of cards that can get most of its benefits are the DX10.1 cards.

The first DX9 game AFAIK was Tomb Raider: Angel of Darkness, and it indeed looks ugly compared to the latest DX9 games. I think the DX9 API can't get any better without dipping in performance; that's where DX10 was supposed to take over, but it failed, and hopefully DX11 will continue that journey.
 

taltamir

Lifer
Mar 21, 2004
The killer DX10 game will never arrive, because the industry is vomiting up a whole slew of tired old crap.
The only ones who make original good games are indie developers. And they don't use high end graphics.
 

Matrices

Golden Member
Aug 9, 2003
The GPU features important to Xbox 720 and PS4 performance will be what's most important to PC gaming performance, as console development drives the graphics scene these days.

(Picture a stalled car at the edge of a cliff that will teleport itself forward in maybe 2012).

Until then, DX11 is just a checkbox used by PR to sell more GPUs.
 

Scali

Banned
Dec 3, 2004
That's true, but DX10 was slowed down considerably due to Vista's slow adoption. The first title that got updated to DX10 and looked much better was Call of Juarez. Other games like Resident Evil 5, Crysis Warhead, BioShock and Lost Planet offered marginal image quality improvements with DX10, especially with the HDR, thanks to the higher floating-point precision used, but the killer DX10 game hasn't arrived yet.

It never will either.
DX10 is more about a total redesign of the driver and API than about better image quality.
I personally don't think Call of Juarez looks better than Crysis anyway (for starters, CoJ doesn't have shadowmapping indoors...).
I'd say Crysis *is* the killer DX10 game. The problem is, Crysis is ALSO the killer DX9 game.

DX11 is certainly faster than DX10 and the only previous generation of cards that can get most of it benefits are the DX10.1 cards.

DX11 isn't faster than DX10 at all. The API is virtually identical, and with drivers for DX10/10.1 hardware, there's no difference at all.
In fact, my own engine, which can run DX9, 10 and 11, is actually marginally slower with DX11 than with DX10, because there are extra states to manage (because of the new tessellation and compute shaders).
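The "extra states" mentioned above are the pipeline stages DX11 adds on top of DX10. As an illustration (this is a generic sketch, not Scali's actual engine code): even a renderer that never uses tessellation or compute has to account for the new stages, e.g. by explicitly unbinding them so stale state can't leak between passes. Windows-only fragment, assuming an existing `ID3D11DeviceContext* ctx`:

```cpp
#include <d3d11.h>

// Clear the DX11-only pipeline stages before a plain DX10-style draw.
// These stages simply don't exist in the DX10 API, so a DX11 renderer
// has extra state to track and reset that a DX10 renderer doesn't.
void ResetDx11OnlyStages(ID3D11DeviceContext* ctx)
{
    ctx->HSSetShader(nullptr, nullptr, 0);  // hull shader (tessellation)
    ctx->DSSetShader(nullptr, nullptr, 0);  // domain shader (tessellation)
    ctx->CSSetShader(nullptr, nullptr, 0);  // compute shader
}
```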
 

evolucion8

Platinum Member
Jun 17, 2005
It never will either.
DX10 is more about a total redesign of the driver and API than about better image quality.
I personally don't think Call of Juarez looks better than Crysis anyway (for starters, CoJ doesn't have shadowmapping indoors...).
I'd say Crysis *is* the killer DX10 game. The problem is, Crysis is ALSO the killer DX9 game.

I never compared Crysis with Call of Juarez. Crysis looks almost identical between DX9 and DX10; Call of Juarez doesn't.

DX11 isn't faster than DX10 at all. The API is virtually identical, and with drivers for DX10/10.1 hardware, there's no difference at all.
In fact, my own engine, which can run DX9, 10 and 11, is actually marginally slower with DX11 than with DX10, because there are extra states to manage (because of the new tessellation and compute shaders).

I don't believe in the results of an engine created by a single individual like you. Sorry, it's nothing personal. It's just that the results obtained by developers of games with sophisticated engines prove you wrong. For example, Metro 2033 runs faster on DX11 than DX10 when tessellation is off, Far Cry 2 runs faster on DX10 than DX9, STALKER: Clear Sky runs faster on DX10.1 than on DX10, and DiRT 2 runs DX11 as fast as DX9 when tessellation is off.
 

Scali

Banned
Dec 3, 2004
Crysis looks almost identical between DX9 and DX10; Call of Juarez doesn't.

That doesn't say much about the DX10 quality of a game though.
As I said, Crysis is also the killer DX9 game.

I don't believe in the results of an engine created by a single individual like you. Sorry, it's nothing personal. It's just that the results obtained by developers of games with sophisticated engines prove you wrong. For example, Metro 2033 runs faster on DX11 than DX10 when tessellation is off, Far Cry 2 runs faster on DX10 than DX9, STALKER: Clear Sky runs faster on DX10.1 than on DX10, and DiRT 2 runs DX11 as fast as DX9 when tessellation is off.

Well, this is technology, not religion, 'belief' has nothing to do with it.
The problem with your comparisons is that they aren't apples-to-apples.
These games are tuned to specific DX versions, assuming that newer hardware is faster, and thus applies more effects.
I was talking about running the EXACT same code in DX10 and DX11. That is not a scenario you are likely to encounter in any game (which is why the 'sophisticated engines' don't prove me wrong). Then there is virtually no difference between DX10 and DX11 as I said.
 

Scali

Banned
Dec 3, 2004
What does Dx11 bring? -> The need to buy new hardware.

Ironically, no.
DX11 also works on DX9 and DX10 hardware, giving you most of the benefits (multithreading, compute shader for DX10 hardware, more efficient API compared to DX9).
So if anything, DX11 gives your existing hardware a shot in the arm.
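The mechanism behind this is DX11's "feature levels": one DX11 codebase can create a device on downlevel hardware by offering a list of acceptable levels and taking the best one the GPU supports. A minimal Windows-only sketch (error handling omitted, names per the real d3d11.h API):

```cpp
#include <d3d11.h>

// Create a DX11 device on whatever hardware is installed,
// from a full DX11 card down to DX9-class hardware.
bool CreateBestDevice(ID3D11Device** device, ID3D11DeviceContext** context)
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,  // DX11 card
        D3D_FEATURE_LEVEL_10_1,  // DX10.1 card
        D3D_FEATURE_LEVEL_10_0,  // DX10 card
        D3D_FEATURE_LEVEL_9_3,   // DX9 card (SM3)
        D3D_FEATURE_LEVEL_9_1,   // DX9 card (SM2)
    };
    D3D_FEATURE_LEVEL got;  // which level we actually received
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
        device, &got, context);
    return SUCCEEDED(hr);
}
```

The engine then branches on the returned feature level, using the single DX11 API throughout.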
 

Dadofamunky

Platinum Member
Jan 4, 2005
What does Dx11 bring? -> The need to buy new hardware.

No, it's the Programmer Full Employment Act. Just keeps some disaffected Microsoft coders off the streets and out of trouble. You didn't think you ***needed*** any of this crap, did you?
 

Scali

Banned
Dec 3, 2004
Heh, actually I know some developers who got lured into a false sense of security by XP and DX9's longevity. People of the 'older generation' know that MS may churn out new APIs at an alarming rate sometimes, forcing you to rewrite your code every time.
But the newer generation didn't see it coming with Vista... when DirectSound was no longer hardware-accelerated and only supported for legacy reasons. And DX10 was a complete culture shock to people who had only ever used one D3D API and had no idea of the volatility of it all.

I know one guy who just gave up and moved to OpenGL (and then OpenGL 4.0 happened, deprecating a whole lot, pretty much also requiring you to rewrite most of your code, heh).
 

3DVagabond

Lifer
Aug 10, 2009
Ironically, no.
DX11 also works on DX9 and DX10 hardware, giving you most of the benefits (multithreading, compute shader for DX10 hardware, more efficient API compared to DX9).
So if anything, DX11 gives your existing hardware a shot in the arm.

So, Dx11 gives your Dx9/10/10.1 hardware Dx11 features? Or, do you need to buy new hardware (and a new OS if you're on XP)?

The bottom line is, if they don't come out with anything new we don't have the "need" to buy new stuff. It's what makes the world go around.
 

Scali

Banned
Dec 3, 2004
So, Dx11 gives your Dx9/10/10.1 hardware Dx11 features?

No, that's not what I said, is it? (not hardware features anyway, I said that there are some DX11 API features/benefits. Repeat after me: API and hardware are completely different things).

Or, do you need to buy new hardware?

No, I think it's pretty clear that is exactly what I said, isn't it?

(and a new OS if you're on XP)

That is pretty obvious.

The bottom line is, if they don't come out with anything new we don't have the "need" to buy new stuff. It's what makes the world go around.

I'm just saying that if people have Vista or Windows 7 and a DX9 card or better, they can benefit from the new DX11 API. They don't necessarily NEED a DX11 card for DX11 games. DX11 is a superset of DX10/10.1, and extends support to DX9 hardware as well.
This makes DX10/10.1 itself completely useless to support any longer. And DX9 needs to be supported for XP only. This will probably disappear in a while as well (some games are already DX10+/Vista+ only).
 

3DVagabond

Lifer
Aug 10, 2009
LOL I can't believe you resorted to a quote fest for that.

I'm just saying that if people have Vista or Windows 7 and a DX9 card or better, they can benefit from the new DX11 API. They don't necessarily NEED a DX11 card for DX11 games. DX11 is a superset of DX10/10.1, and extends support to DX9 hardware as well.
This makes DX10/10.1 itself completely useless to support any longer. And DX9 needs to be supported for XP only. This will probably disappear in a while as well (some games are already DX10+/Vista+ only).

Back to the beginning now. Making your old stuff obsolete. Unless, of course, you want to install Win7 on your old system that has a Dx9 card in it. Assuming it meets minimum specs (not likely). If not... then it's new hardware time.
 

Scali

Banned
Dec 3, 2004
LOL I can't believe you resorted to a quote fest for that.

I did that because I wanted to separate several different issues that you were lumping together, 'LOL'.

Back to the beginning now. Making your old stuff obsolete. Except, of course, unless you want to install Win7 on your old system that has a Dx9 card in it. Assuming it meets minimum specs (not likely). If not.... then it's new hardware time.

Not really. DX11 also works on Vista, it's not a Windows 7-exclusive. A Vista system with DX9 hardware is not unlikely, as Vista has been around for about as long as DX10 hardware. So initially a lot of Vista systems were sold with DX9 hardware, as DX10 was only for the high-end. Hence it's not a case of installing a new OS on an old system. It's a software-update allowing better use of your current hardware.
 

evolucion8

Platinum Member
Jun 17, 2005
That doesn't say much about the DX10 quality of a game though.
As I said, Crysis is also the killer DX9 game.



Well, this is technology, not religion, 'belief' has nothing to do with it.
The problem with your comparisons is that they aren't apples-to-apples.
These games are tuned to specific DX versions, assuming that newer hardware is faster, and thus applies more effects.
I was talking about running the EXACT same code in DX10 and DX11. That is not a scenario you are likely to encounter in any game (which is why the 'sophisticated engines' don't prove me wrong). Then there is virtually no difference between DX10 and DX11 as I said.

Don't be a jackass with your smartass attitude. What the hell does religion have to do with technology? Just because the word 'believe' is often used in religion doesn't mean I can't use it for other topics. Fine, then I will retract it and use your rhetorical way of thinking: "Sorry, but I don't think that you are right, or you are wrong, or I don't trust you." Are you satisfied now? Thank you.

DirectX 11 introduced major features like higher texture resolution support, HDR compression formats, compute functionality like DirectCompute, Tessellation and some more features for improved performance, and none of those features are available under DX10. You can run the exact same code under DX9 and DX10 and of course both will run the same. But if you try to create, for example, specific DX10 code and optimize it, it will simply not run on DX9. :rolleyes:

It isn't like code will magically get a boost in performance because of a simple change of API. Metro 2033 proved to run faster under DX11 than DX10, Far Cry 2 proved to run faster under DX10 than DX9, Assassin's Creed proved to run faster under DX10.1 than DX10, and we all know that DX11 is a superset of DX10.1; the same goes for DX9.0c being a superset of DX9.0b, but both introduced optimizations to boost speed. If you are gonna use the newer API, you will make sure to boost performance using optimizations and tweak your code; only lazy developers will port the same exact code to different APIs and voilà! That's why PC gaming is stalled.

It would be a nice idea if you tweaked your code with DX-specific tweaks to see how much of a gain you can get, especially with DX10 or DX11 compared to DX9, and posted your results here. :)
 

Scali

Banned
Dec 3, 2004
DirectX 11 introduced major features like higher texture resolution support, HDR compression formats, compute functionality like DirectCompute, Tessellation and some more features for improved performance, and none of those features are available under DX10.

Various features, including DirectCompute, are now available on DX10 hardware through DX11 (most DX10 hardware exceeded the DX10 feature list, and those features can now be used through DX11).
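That "exceeded the feature list" part is queryable at runtime: on a feature-level 10.x device, an application can ask the driver whether it exposes compute shaders (CS 4.x) with raw/structured buffers. A Windows-only sketch against the real d3d11.h API, assuming a valid `ID3D11Device* device`:

```cpp
#include <d3d11.h>

// On a 10.0/10.1 feature-level device, DirectCompute is optional:
// query the driver instead of assuming support.
bool SupportsCs4x(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS, &opts, sizeof(opts))))
        return false;
    return opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x != 0;
}
```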

You can run the exact same code under DX9 and DX10 and of course both will run the same.

Uhh, my point was exactly that they do NOT run the same.
DX10/11 have a completely new driver/API design, which reduces CPU overhead compared to DX9. On the other hand, DX9 drivers and runtime are extremely optimized, DX10/11 are not fully mature yet. For example, on Intel IGPs I notice that shaders are a lot less efficient when compiled against DX10/11, compared to DX9.
There are just a lot of factors at work here. Is it the API? Is it the driver? Is it the code?

It isn't like code will magically get a boost in performance because of a simple change of API.

Yes, it is, in some cases. Or a drop in performance, like for example the port of Source to OpenGL on Mac.

Metro 2033 proved to run faster under DX11 than DX10, Far Cry 2 proved to run faster under DX10 than DX9, Assassin's Creed proved to run faster under DX10.1 than DX10, and we all know that DX11 is a superset of DX10.1; the same goes for DX9.0c being a superset of DX9.0b, but both introduced optimizations to boost speed.

Thing is, you never know what you compare.
In Crysis the new DX10 functionality is mainly used for better image quality. In BioShock it is mainly used for better performance.
So comparisons of APIs based on games are useless.
As I already said before, it's not the API, it's what the developer does with the API that matters.

If you are gonna use the newer API, you will make sure to boost performance using optimizations and tweak your code; only lazy developers will port the same exact code to different APIs and voilà! That's why PC gaming is stalled.

Many early DX10 games were actually rather 'lazy' ports of DX9, where DX10 was mainly used to run some extra eyecandy... at the cost of performance. This gave DX10 a bad name.
A proper DX10 engine needs to be redesigned from scratch, as the state management is completely different from DX9, and retro-fitting DX9-code into a DX10 framework is going to be horribly inefficient... yet this is what various early 'DX10' games did.

I approached it from the opposite direction... I started a new DX10 framework with a clean slate, then I retrofitted DX9 into that, backporting some of the new DX10 interfaces to DX9. This means that DX10/11 performance is optimal, and DX9 overhead is marginal. There's just some DX9 functionality that is 'cut off', because it doesn't exist in the newer APIs anymore, like fixed function T&L (although I have put in a backdoor so you can still use it in DX9... but then you can't use the other APIs anymore).
But proper DX9 engines have been purely shader-based for years anyway.

It would be a nice idea if you tweaked your code with DX-specific tweaks to see how much of a gain you can get, especially with DX10 or DX11 compared to DX9. :)

But that's exactly what I do. I create an engine that is optimized to the extreme to get the most from any of the three APIs (if it wasn't, there was no way DX10/11 could be anywhere near as fast as DX9... DX10/11 are considerably more difficult to use efficiently, as part of the runtime is just 'missing').
But I do compare the same algorithms, because as soon as you compare different algorithms, it no longer says anything about the API. And I was talking about the API.
Everything else comes down to 'what you do with it', as I already said. And then you can go both ways... better performance, or better quality.
 

evolucion8

Platinum Member
Jun 17, 2005
Oh well, what can we expect from Sleazysoft I mean, Microsoft :rolleyes:

I have a question: did you use multisample readbacks to boost anti-aliasing and cube map arrays to improve shadow performance? How many instructions on average do your shaders have? I heard that many short shaders are better than a few long shaders. I love to learn about these things even though I hate algebra.
 

Scali

Banned
Dec 3, 2004
Oh well, what can we expect from Sleazysoft I mean, Microsoft :rolleyes:

Microsoft is the only one you CAN'T blame this on.
DX10/11 problems are mainly caused by poor driver implementations and poor understanding of the new API by developers.
Microsoft is the only one who had its stuff together.

I have a question: did you use multisample readbacks to boost anti-aliasing

Multi-sample readback doesn't boost anti-aliasing. It can enable it in some cases where AA wasn't possible before (eg deferred rendering), or it can allow custom algorithms for AA. But neither improve performance.
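To make that concrete: multisample readback means a shader can address the individual samples of an MSAA surface (a `Texture2DMS` resource), which is what makes MSAA usable with deferred shading: you light each sample and then average, instead of resolving first. A small HLSL sketch; `Shade()` stands in for the application's lighting function and is hypothetical:

```hlsl
Texture2DMS<float4, 4> gbuffer;   // 4x MSAA G-buffer, per-sample readback

float4 ResolveLitPixel(uint2 pos)
{
    // Light each sample, THEN average. A plain resolve before lighting
    // would average G-buffer attributes and break shading at edges.
    float4 sum = 0;
    [unroll]
    for (uint s = 0; s < 4; ++s)
        sum += Shade(gbuffer.Load(pos, s));
    return sum * 0.25;
}
```

Note this does per-pixel lighting work per sample, so it enables AA here rather than accelerating it.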

How many instructions on average do your shaders have?

There's no such thing as an average shader.

I heard that many short shaders are better than a few long shaders.

There's no way you can make a general claim like that. It depends on far too many factors.
 

evolucion8

Platinum Member
Jun 17, 2005
Multi-sample readback doesn't boost anti-aliasing. It can enable it in some cases where AA wasn't possible before (eg deferred rendering), or it can allow custom algorithms for AA. But neither improve performance.

I wonder how Assassin's Creed got a boost in performance when anti-aliasing was on, like in Far Cry 2, compared to the standard DX10 path...

There's no such thing as an average shader.

Average instruction count... Far Cry's longest shader was 50 instructions, which was used on the water surface.

There's no way you can make a general claim like that. It depends on far too many factors.

I read ATi's X800 whitepaper a long time ago (Save the Nanosecond) and it stated that lots of short shaders were faster than a few long shaders.
 

Scali

Banned
Dec 3, 2004
I wonder how Assassin's Creed got a boost in performance when anti-aliasing was on, like in Far Cry 2, compared to the standard DX10 path...

Because they had a workaround in DX10 involving an extra renderpass. It's difficult to compare, since they aren't equivalent paths.

Average instruction count... Far Cry longest shader was 50 instructions which was used on the water surface.

What does the average instruction count say?
Some shaders may be hundreds of instructions, others might be only a handful. An average is completely meaningless.

I read ATi's X800 whitepaper a long time ago (Save the Nanosecond) and it stated that lots of short shaders were faster than a few long shaders.

And you think it's wise to assume that what held true for ATi's X800 years ago, with DX9, is still going to be valid for today's DX11 hardware with a completely different architecture, completely different API, running completely different types of rendering algorithms?

I don't, so as I say: There's no way you can make a general claim like that. It depends on far too many factors.