AT has dx12 drivers?

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
This was a GREAT read. I haven't been this excited for a DX version since DX9. :)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The benchmark for quick view:
71450.png

71451.png
 

96Firebird

Diamond Member
Nov 8, 2010
5,749
345
126
Looking forward to seeing how developers leverage the performance increases with DX12. Will we get better-performing games, or better-looking games that utilize the extra performance?

I don't think I am very CPU limited in any of the games I play right now, except maybe AC Unity.

A little off-topic, but I don't see the fun in that game. Maybe it just isn't my type of game, but there is so much going on it's hard to keep track. Is this just a benchmark? How does the game actually work?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
A couple of questions spring to mind: why is the 980 that much faster? Is it due to NV focusing on DX12 and AMD not, or is it more likely the 980's partial DX12 support?

Look at the power consumption - there goes the efficiency!

71452.png

Power consumption looks fine. Remember, it's mainly a CPU-limited benchmark, originally designed to demo the advantage of Mantle. So obviously total system load will increase when the GPU is used more than the CPU.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,122
622
136
This caught my eye: "However as awesome as Mantle is though, it is currently a proprietary AMD API, which means it can only be used with AMD GPUs". I thought all the AMD proponents were saying Mantle is open for anyone to use? The same people who blast NVIDIA for using proprietary technology... hmm, double standards.

As much as I like thread derails like this.../s


IIRC, Mantle is still in beta and is not going to be released to others until it's out of beta. Maybe Ryan can chime in on this.
 

Rvenger

Elite Member <br> Super Moderator <br> Video Cards
Apr 6, 2004
6,283
5
81
If I see any more thread crapping, you will be immediately infracted after this warning.

Remember, this is a discussion of an API that hasn't even been released to the public yet. There is no reason to start bashing either company when the final product isn't out.

-Rvenger
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Seeing these "benchmarks", I can't help but feel AMD has left a lot of potential on the table with DX11. Hopefully, once they sort out the early kinks with DX12, they will unleash everything.

The biggest thing that stands out to me in the chart is how close the 285 & 750 Ti are in some tests; hopefully that's just down to driver/API immaturity and not a sign of something darker lurking in there.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
So, in other news: what is supposed to show us that some devs are really close to releasing a DX12/Mantle game, and how close those two APIs are, is instead being used to shovel the point that, for a change, a newer GPU is better than a one-year-older one. VC & C, pointing out the obvious for all of us :awe:
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
If that graph is representative of the cards as well as actual speedups in games (it's probably not, due to other bottlenecks), then DX12 is actually going to do AMD a favor, because the GTX 980 did not benefit as much as the 290X, proportionately. Then again, the 3xx series is different, so who knows. But I know that if I were using a 29x card, I'd be grateful for DX12.
 
Feb 19, 2009
10,457
10
76
I think as time goes on, AMD will have no choice but to ditch Mantle. It's just too much to expect their driver team to optimize for DX11, Mantle AND DX12.

Overall it's a great sign for DX12 moving forward.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Power consumption looks fine. Remember, it's mainly a CPU-limited benchmark, originally designed to demo the advantage of Mantle. So obviously total system load will increase when the GPU is used more than the CPU.

Did you even read the article? It's not at all a CPU-limited benchmark. That's literally the point of DX12. In fact, it takes a GTX 980 to not be GPU limited with 2 cores.

It looks like the 980 is doing a ton of work and using a lot of power right now. I wonder where the performance will be by the time DX12 drops.
 

geoxile

Senior member
Sep 23, 2014
327
25
91
Rather than a hardware bottleneck, it looks like AMD's drivers are the bottleneck here, with batch submission times taking much longer than Nvidia's.

Looking forward to seeing how developers leverage the performance increases with DX12. Will we get better-performing games, or better-looking games that utilize the extra performance?

I don't think I am very CPU limited in any of the games I play right now, except maybe AC Unity.

A little off-topic, but I don't see the fun in that game. Maybe it just isn't my type of game, but there is so much going on it's hard to keep track. Is this just a benchmark? How does the game actually work?

More variety, hopefully. IIRC, unique textures, shaders, meshes, light passes, etc. all use draw calls, and while there are optimization techniques like batching and occlusion culling, having a significantly higher capacity for draw calls should mean there can be more unique stuff, for lack of a better word, on screen at a time.
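The batching trade-off mentioned above can be sketched in a few lines. This is a toy model, not real graphics code: the idea is that meshes sharing a material can be merged into one draw call, which is exactly why a bigger draw-call budget allows more *unique* materials on screen. All names (`scene`, the materials) are made up for illustration.

```python
# Toy model of draw-call batching: each mesh normally costs one draw call,
# but meshes sharing the same material can be merged into a single batch.
from collections import defaultdict

def draw_calls_unbatched(meshes):
    # One draw call per mesh.
    return len(meshes)

def draw_calls_batched(meshes):
    # Group meshes by material; each group is submitted as one draw call.
    groups = defaultdict(list)
    for mesh, material in meshes:
        groups[material].append(mesh)
    return len(groups)

# A scene with lots of repeated materials batches extremely well...
scene = [("rock%d" % i, "stone") for i in range(500)] + \
        [("tree%d" % i, "bark") for i in range(300)] + \
        [("hero", "unique_hero_material")]

print(draw_calls_unbatched(scene))  # 801
print(draw_calls_batched(scene))    # 3
```

...but a scene where every object had a unique material couldn't be batched at all, which is where the raw draw-call throughput of DX12/Mantle matters.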
 
Feb 19, 2009
10,457
10
76
Rather than a hardware bottleneck, it looks like AMD's drivers are the bottleneck here, with batch submission times taking much longer than Nvidia's.

Yup, and they will need to focus on it, so Mantle is dead - or if it isn't, it should be. It's not clever to spread driver development time among too many APIs.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Did you even read the article? It's not at all a CPU-limited benchmark. That's literally the point of DX12. In fact, it takes a GTX 980 to not be GPU limited with 2 cores.

It looks like the 980 is doing a ton of work and using a lot of power right now. I wonder where the performance will be by the time DX12 drops.

The GPU has a higher power draw than the CPU. So if you move the bottleneck from the CPU to the GPU, your system power usage will increase.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
This was a GREAT read. I haven't been this excited for a DX version since DX9. :)

Agree - I bought an NV 6800 just to play Doom 3 because of the great DX9 graphics. Lol.
And it's great we get support back to Fermi-class cards with DX12. It means a new generation of games and far better gameplay on weaker mobile CPUs, all playable on older graphics cards.
Thank you, Mantle - MS moved their fat butt :)

And an excellent write-up by Ryan, btw. Same for the Denver description in the Nexus review.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
AMD needs to tweak DX12 a bit; those submission times are obviously affecting the overall framerate. However, it's doing OK, and the difference should not be noticeable in real-world scenarios.
I hope that with DX12 both companies won't need to tweak drivers on a per-game basis as they have until now (more work for developers).
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
I hope that with DX12 both companies won't need to tweak drivers on a per-game basis as they have until now (more work for developers).
That will be one of the benefits. But the devs then have to work more on the engine and the game.
The thinner API is only cost-effective because there is a consolidation of engines in the market.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Based on theoretical performance, as an AMD user, I'd much rather have DX12 than Mantle. All games benefiting is absolutely more desirable than only Mantle games benefiting, so thanks to AMD, I guess, for the increased focus on the API. As an Nvidia user, the theoretical performance increase looks great. Win-win all around.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
Based on theoretical performance, as an AMD user, I'd much rather have DX12 than Mantle. All games benefiting is absolutely more desirable than only Mantle games benefiting, so thanks to AMD, I guess, for the increased focus on the API. As an Nvidia user, the theoretical performance increase looks great. Win-win all around.

As AMD will never have the same per-core perf as Intel, it was crucial to get an API that actually uses more cores effectively. DX12 does that, just like Mantle.
So it's not a winner for Intel. And the MS monopoly is tied to the Intel x86 monopoly, so a slap with a stick was needed to get them moving. No wonder.
Besides, a new, thinner API also opens the door to other OSes. That's not in MS's interest either - another reason for the stick and for MS's move.
But for gamers it's a win all around.
 
Feb 19, 2009
10,457
10
76
As AMD will never have the same per-core perf as Intel, it was crucial to get an API that actually uses more cores effectively. DX12 does that, just like Mantle.
So it's not a winner for Intel. And the MS monopoly is tied to the Intel x86 monopoly, so a slap with a stick was needed to get them moving. No wonder.
Besides, a new, thinner API also opens the door to other OSes. That's not in MS's interest either - another reason for the stick.

It's a win for Intel too. Whatever makes slower CPUs perform better also makes faster CPUs perform better.

Intel has the node advantage; they can cram in more cores if they want to.

While a single-GPU setup on a fast Intel CPU may not see massive gains (in normal games, not pure draw-call-limited benches such as Star Swarm), multi-GPU setups have seen major boosts in performance. To drive multi-card setups, a strong CPU is still required. This automatically means an Intel CPU is the default for high-end rigs.
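The "API that uses more cores" point being argued here boils down to a pattern DX12 and Mantle both enable: worker threads record their own command lists in parallel, and only final submission stays serial. Here is a minimal toy sketch of that pattern in Python; the function names (`record_command_list`, `submit_frame`) are made up for illustration and don't correspond to any real graphics API.

```python
# Toy sketch of multi-threaded command recording: each worker thread records
# its own slice of the frame's draws into a private command list, and the
# main thread then submits the lists in order.
from concurrent.futures import ThreadPoolExecutor

def record_command_list(draws):
    # Each worker records its slice of draws into a private command list.
    return ["draw:%s" % d for d in draws]

def submit_frame(draws, workers=4):
    # Split the frame's draws into one chunk per worker thread.
    chunk = (len(draws) + workers - 1) // workers
    slices = [draws[i:i + chunk] for i in range(0, len(draws), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves chunk order, so the frame is submitted in order
        # even though recording happened concurrently.
        command_lists = list(pool.map(record_command_list, slices))
    # Submission itself stays single-threaded; only recording was parallel.
    return [cmd for cl in command_lists for cmd in cl]

frame = submit_frame(list(range(10000)), workers=4)
print(len(frame))  # 10000
```

Under DX11 the equivalent of `record_command_list` mostly had to happen on one thread inside the driver, which is why per-core CPU speed mattered so much there.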
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Developers may, as previously in history, simply use the newly found CPU power for something else, and then we are back to the status quo on CPU requirements.

I do not expect AAA games (and others) to let this opportunity pass to add more features, AI, etc. and make their games even better.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
That will be one of the benefits. But the devs then have to work more on the engine and the game.
The thinner API is only cost-effective because there is a consolidation of engines in the market.


Maybe in the PC space, but console devs have been exposed to this kind of environment for decades.