Kepler, Maxwell, Pascal - Performance comparison from new drivers (353 -> 376)

Spjut

Senior member
Apr 9, 2011
928
149
106
Finally someone looked into it properly. The only thing missing is Vulkan performance, which is what stands out for Kepler.

Also nice to see DX12 performance seems to be good in Tomb Raider after all the patches.
 
  • Like
Reactions: deepthikv

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
I wonder if people will finally stop saying nVIDIA doesn't support Kepler. It seems its lackluster performance in some modern games is due to architectural deficiencies (as I and others suspected).
 
  • Like
Reactions: Arachnotronic

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I wonder if people will finally stop saying nVIDIA doesn't support Kepler. It seems its lackluster performance in some modern games is due to architectural deficiencies (as I and others suspected).

That and the fact that most PC games are developed primarily on two AMD-based consoles. As I've said time and again, look at titles that are only PS4/PC, XB1/PC, or PC-exclusive, and Kepler holds up much better.
 
  • Like
Reactions: Arachnotronic

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Also nice to see DX12 performance seems to be good in Tomb Raider after all the patches.
Huh? Are you looking at the graphs? Ashes, Tomb Raider and Deus Ex DX12 are all slower across the board compared to DX11.

Likewise, Doom Vulkan is slower across the board than OpenGL.

How many more such games do we need to see before people accept low-level APIs are a failure?
 

nurturedhate

Golden Member
Aug 27, 2011
1,743
676
136
Huh? Are you looking at the graphs? Ashes, Tomb Raider and Deus Ex DX12 are all slower across the board compared to DX11.

Likewise, Doom Vulkan is slower across the board than OpenGL.

How many more such games do we need to see before people accept low-level APIs are a failure?
How many DX transitions do we need to repeat the same things over and over before people stop? People said the same about basically every new DX transition: this new one is bad, I like the old ways. Suppose it's just human nature.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
Huh? Are you looking at the graphs? Ashes, Tomb Raider and Deus Ex DX12 are all slower across the board compared to DX11.

Likewise, Doom Vulkan is slower across the board than OpenGL.

How many more such games do we need to see before people accept low-level APIs are a failure?

The differences between DX11 and DX12 look similar for all three cards, and they are mostly minimal. In the case of Tomb Raider, only the 1080 got an almost 4 FPS improvement in DX11. Perhaps I'm misremembering, but didn't DX12 lag behind DX11 significantly when those games got their first DX12 patch? One could also get into DX11 vs DX12 frame-pacing tests to be more thorough.

Right now, DX12 and Vulkan are only useless if you are fully GPU limited. The CPU used in that test was an i7 6700K at 4.5 GHz.
It's easy to declare low-level APIs a failure if you have a Skylake i7 at 4+ GHz. Most users don't, and even on Anandtech you see users clinging to their Sandy Bridge i5/i7 CPUs.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
The differences between DX11 and DX12 look similar for all three cards, and they are mostly minimal. In the case of Tomb Raider, only the 1080 got an almost 4 FPS improvement in DX11. Perhaps I'm misremembering, but didn't DX12 lag behind DX11 significantly when those games got their first DX12 patch? One could also get into DX11 vs DX12 frame-pacing tests to be more thorough.
Low-level APIs require a massive amount of extra developer resources. So if they aren't totally dominating the traditional APIs, they're a failure. Saying "it's okay because they're almost as fast" is lunacy.

If I spend $1000 and get 90% while someone spends $100 and gets 100%, I've failed if I could've taken the second option.

Right now, DX12 and Vulkan are only useless if you are fully GPU limited. The CPU used in that test was an i7 6700K at 4.5 GHz.
It's easy to declare low-level APIs a failure if you have a Skylake i7 at 4+ GHz. Most users don't, and even on Anandtech you see users clinging to their Sandy Bridge i5/i7 CPUs.
It's interesting how you point out not everyone is using a fast CPU but seem to believe everyone has a 1080.

The top five GPUs on Steam are 970, 960, 750 Ti, Intel HD Graphics 4000 and Intel Haswell: http://store.steampowered.com/hwsurvey/videocard/

Actually, six Intel iGPUs come above a 1080. Why don't you show us how well these low-level APIs work on a 750 Ti and all those Intel iGPUs?
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
I bet if they even tested a non-reference 780 Ti vs the 290X, people would stop believing the 290X magically got faster in DX11. A non-reference 780 Ti is ahead of the 390X in GTA 5 1080p, AC Unity 1440p, and BF4 1440p. Don't forget the 390X is nothing more than a factory-overclocked 290X with more RAM. https://www.youtube.com/watch?v=OwIRyJ2WqHI
 

Spjut

Senior member
Apr 9, 2011
928
149
106
Low-level APIs require a massive amount of extra developer resources. So if they aren't totally dominating the traditional APIs, they're a failure. Saying "it's okay because they're almost as fast" is lunacy.

If I spend $1000 and get 90% while someone spends $100 and gets 100%, I've failed if I could've taken the second option.


It's interesting how you point out not everyone is using a fast CPU but seem to believe everyone has a 1080.

The top five GPUs on Steam are 970, 960, 750 Ti, Intel HD Graphics 4000 and Intel Haswell: http://store.steampowered.com/hwsurvey/videocard/

Actually, six Intel iGPUs come above a 1080. Why don't you show us how well these low-level APIs work on a 750 Ti and all those Intel iGPUs?

Feel free to bring up the benchmarks you find for the 750 Ti and Intel iGPUs if you want to. Are they getting superb performance in DX11, btw? The neat thing for 750 Ti users, though, is that if they want to get a better graphics card this year or the next, they won't have to be as anxious about their aged CPUs if the games use Vulkan or DX12. Even a 1060 would be a nice upgrade over the 750 Ti.

This test was done in July 2016. The GTX 1060, paired with either a Phenom II X4 955 or an i5 750 at stock speeds, came close to the i7 6700K 4.5 GHz + 1060 combo when using Vulkan in DOOM.
https://www.hardwareunboxed.com/gtx-1060-vs-rx-480-in-6-year-old-amd-and-intel-computers/
 
  • Like
Reactions: Bacon1

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Low-level APIs require a massive amount of extra developer resources. So if they aren't totally dominating the traditional APIs, they're a failure. Saying "it's okay because they're almost as fast" is lunacy.

This is only true if the developers have to build and/or optimize the engine on their own. Tons of developers use third-party engines these days (e.g. UE, Unity), and once those engines have been ported and optimized for DX12/Vulkan, it really won't take a significant amount of resources for people using them to implement DX12.
 
  • Like
Reactions: Rifter

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
"Low level APIs"

They're not programming in assembly. Direct3D 12/Vulkan/Mantle are to Direct3D 11, what C++ is to LUA. If you know what you're doing, there is a huge amount of performance to be gained.

I think it was Zlatan that said the proper term for them is "Explicit APIs".

Moreover, show me a game making tens of thousands of draw calls and compare its performance between D3D11 and D3D12. An MMO like The Elder Scrolls Online would be a prime candidate.

You'll find that ol' 11 is left in the dust, supposing the API has a proper driver implementation.

Case in point, the consoles have had explicit APIs since their inception. Hell, the GameCube's draw calls were so fast that it was far cheaper (CPU-side) to draw objects than it was to cull them.

But even on today's PC hardware and software, you'd be a buffoon not to have object culling, since draw calls are slow as all hell.
 

SirDinadan

Member
Jul 11, 2016
108
64
71
boostclock.com
Feel free to bring up the benchmarks you find for the 750 Ti and intel IGPUs if you want to.
We made some API comparisons on integrated graphics (A10-7860K and i3-6100) with proper frame time measurement.
We found:
- nice improvements - DOOM (A10)
- some improved frame times - HITMAN (A10)
- nothing or minimal uplift - BF1 (A10), RotTR (A10)
- regression - BF1 (i3)

Hopefully we can add the new Bristol Ridge to our hardware selection soon; we plan to revisit the topic then.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I wonder if people will finally stop saying nVIDIA doesn't support Kepler. It seems its lackluster performance in some modern games is due to architectural deficiencies (as I and others suspected).

That and the fact that most PC games are developed primarily on two AMD-based consoles. As I've said time and again, look at titles that are only PS4/PC, XB1/PC, or PC-exclusive, and Kepler holds up much better.

Not happening. Far too much time and keystrokes invested in trashing Kepler. Nobody is going back on that commitment.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
We made some API comparisons on integrated graphics (A10-7860K and i3-6100) with proper frame time measurement.
We found:
- nice improvements - DOOM (A10)
- some improved frame times - HITMAN (A10)
- nothing or minimal uplift - BF1 (A10), RotTR (A10)
- regression - BF1 (i3)

Hopefully we can add the new Bristol Ridge to our hardware selection soon; we plan to revisit the topic then.

Thanks!
Do you have the opportunity to test discrete cards as well? An i3 4130 and GTX 750 Ti build was quite a popular budget build three years ago. Digital Foundry, for example, tried the i3 4130 paired with various cards in DOOM OpenGL, but couldn't lock performance at 60 FPS, declaring it inferior to the PS4 version. Sadly, they never tested that i3 again with Vulkan.
 

nurturedhate

Golden Member
Aug 27, 2011
1,743
676
136
Not happening. Far too much time and keystrokes invested in trashing Kepler. Nobody is going back on that commitment.
Neither of those two statements (Hi-Fi Man and architectural deficiencies, tviceman and consoles affecting development style) refutes the claim that Kepler has aged poorly. I own a non-reference 780 Ti. I own a 390X. I own a 1070. The 780 Ti does just fine in older games. The 390X is faster in a lot of newer stuff. Kepler has a pretty different SM setup compared to Maxwell/Pascal. Kepler is now three generations back. Kepler was released in 2012. AMD stayed with roughly the same architecture from 2012 through today. All of that combined makes it easy to understand why Kepler doesn't look as good today as it did in 2012.

The people claiming that Kepler is falling behind are over the top, and they're just pointing out the obvious nature of hardware progression. The people spending time defending Kepler and saying this isn't the case are baffling to me. Both sides need to let old hardware die a peaceful death.
 
  • Like
Reactions: Rifter
Mar 10, 2006
11,715
2,012
126
Sweet, big jump in Doom performance in both Vulkan and OpenGL. The Vulkan path is still slower than, or no better than, the OpenGL one on the 1080, though.

Surprisingly huge gain for the 980 and 780 Ti in the OpenGL path.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Neither of those two statements (Hi-Fi Man and architectural deficiencies, tviceman and consoles affecting development style) refutes the claim that Kepler has aged poorly. I own a non-reference 780 Ti. I own a 390X. I own a 1070. The 780 Ti does just fine in older games. The 390X is faster in a lot of newer stuff. Kepler has a pretty different SM setup compared to Maxwell/Pascal. Kepler is now three generations back. Kepler was released in 2012. AMD stayed with roughly the same architecture from 2012 through today. All of that combined makes it easy to understand why Kepler doesn't look as good today as it did in 2012.

The people claiming that Kepler is falling behind are over the top, and they're just pointing out the obvious nature of hardware progression. The people spending time defending Kepler and saying this isn't the case are baffling to me. Both sides need to let old hardware die a peaceful death.

Bold^ Couldn't agree more! But I'm sure it's obvious that this "schtick" is being used to discourage people from buying CURRENT Nvidia products saying they will age badly. I'm all for letting old hardware be just that. Old hardware. But it's part of the AMD marketing machine here in the forums and around the web. Sucks. But whatcha gonna do eh? LOL.
 

nurturedhate

Golden Member
Aug 27, 2011
1,743
676
136
Bold^ Couldn't agree more! But I'm sure it's obvious that this "schtick" is being used to discourage people from buying CURRENT Nvidia products saying they will age badly. I'm all for letting old hardware be just that. Old hardware. But it's part of the AMD marketing machine here in the forums and around the web. Sucks. But whatcha gonna do eh? LOL.
But you have to agree that both sides have their "schtick" to discourage buying CURRENT products from the other company. I'm in no way implying that anyone is a shill but there is indirect marketing all over this and other forums from both sides. It all needs to die.
 

SirDinadan

Member
Jul 11, 2016
108
64
71
boostclock.com
Do you have the opportunity to test discrete cards as well? An i3 4130 and GTX 750 Ti build was quite a popular budget build three years ago. Digital Foundry, for example, tried the i3 4130 paired with various cards in DOOM OpenGL, but couldn't lock performance at 60 FPS, declaring it inferior to the PS4 version. Sadly, they never tested that i3 again with Vulkan.
Well, we just started benchmarking dGPUs, but we are lacking on the CPU side.
Can you recommend an equivalent budget build with a Kaby Lake CPU? Some users on the forum were quite upbeat about the G4560. Pairing it with a 460 / 1050 would be something along the same lines, I guess. Unfortunately, no Kaby Lake CPUs are available at the moment in my country.

Sorry for derailing the thread.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Not happening. Far too much time and keystrokes invested in trashing Kepler. Nobody is going back on that commitment.

And it's so silly. Instead of praising a particular company for having the position and foresight to win big contracts (like the PS4 and XB1), people instead want to seek out negativity because it fits their narrative that one company is evil and the other is good.

Look at AoTS, Shadow Warrior 2, Quantum Break Steam version (which performs better on ALL GPUs, not just Nvidia), No Man's Sky, ARK, XCOM 2, Civ 6, Rise of the Tomb Raider (which was an XB1 exclusive for 10-11 months before PS4), Vanishing of Ethan Carter, SOMA, and EVERY.SINGLE.PC.EXCLUSIVE.MMO. Those are all games that were released on only one console (except RotTR, which still had an 11-month XB1 exclusivity) or were PC-only, and the 780 Ti either ties or beats the R9 290X.

AMD had great foresight to win the console contracts and scored a tailwind of performance upticks for games released on all three platforms because of it. Kepler AND Maxwell suffered more because of this than anything else.
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
But you have to agree that both sides have their "schtick" to discourage buying CURRENT products from the other company. I'm in no way implying that anyone is a shill but there is indirect marketing all over this and other forums from both sides. It all needs to die.
I thought we were talking about Kepler? Why is it now about schticks? Please don't try to exonerate AMD marketers' Kepler schtick by saying everyone has a schtick. Might be true that everyone has one, but that would handily steer off topic. Wouldn't it?
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I was fully expecting all the Kepler slides to look like this:
F7IhEM5.jpg
 
  • Like
Reactions: Headfoot