Mid-to-high-end cards on low-end CPUs - an issue that needs some light shed


bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Pre-scripted benchmarks can be accurate for the GPU in many cases, but VERY inaccurate for the CPU. The biggest place this is an issue is online games.

Take BF4 for instance: the "scripted benchmark" shows low-end CPUs are just fine! But jump on a 64-man server and suddenly your FPS tanks and the CPU is pegged. Even my 4690K @ 4.5GHz will run at ~85% utilization. My old Phenom II 95BE @ 4GHz would run at 95-98% utilization and cap my FPS in the mid-30s in DirectX mode, but allow me to get 50-60 in Mantle.

Like I said, they are gameplay benchmarks; they just pick one spot, and a single-player one at that. They may not tell the whole story, as multiplayer will often be worse, as you noted, and other areas may be better. They just give you a window, like any other type of benchmark. Sites can't play for hours to produce benchmarks, as that is too difficult and time-consuming.
 
Feb 19, 2009
10,457
10
76
Assetto Corsa - Completely GPU limited.
DayZ - Nvidia does better except for 2C @ 3.5.
Wreckfest - Nvidia solidly dominates.
Project Cars - Nvidia dominates.
Star Citizen - Slight Nvidia edge.
Talos Principle - GPU limited.


Interestingly, it looks like AMD does better in 2C setups; add HT or more cores and Nvidia tends to scale better (alternatively, you could think of it as Nvidia tanking on 2C CPUs). Nvidia also seems to have problems with HT on 4C CPUs.

You are looking at it as a function of scaling, which is only part of the problem. You must also look at absolute performance: if one driver is 30% more efficient than another and gets similar core scaling, then the scaling as you change the CPU grunt looks the same, but one GPU gets 30% better frame rates.

Looking at Wreckfest, for instance, AMD exhibits better scaling than Nvidia; however, the framerates are higher across the board for Nvidia. At 4C @ 2.5GHz, AMD gets 22 fps and Nvidia 39.8. Moving up to 4C @ 3.5GHz, Nvidia goes to 44.6 fps while AMD improves to 29.8 fps. AMD scaled much better, but it is clear that at 4C @ 2.5GHz both are CPU limited and Nvidia is getting massively better FPS.

I'm not trying to get into any AMD vs. Nvidia debate; my point is that relying solely on scaling as a measure of driver efficiency is wrong.
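To put numbers on that, here is a minimal sketch using only the Wreckfest figures quoted above; it shows how the scaling ratio and the absolute frame rate tell opposite stories:

```python
# Wreckfest figures quoted above: (fps at 4C @ 2.5GHz, fps at 4C @ 3.5GHz).
results = {
    "R9 290X": (22.0, 29.8),
    "GTX 980": (39.8, 44.6),
}

for gpu, (fps_slow, fps_fast) in results.items():
    scaling = fps_fast / fps_slow  # gain from the 2.5 -> 3.5GHz clock bump
    print(f"{gpu}: {scaling:.2f}x scaling, {fps_slow:.1f} fps at 4C @ 2.5GHz")

# R9 290X: 1.35x scaling, 22.0 fps at 4C @ 2.5GHz
# GTX 980: 1.12x scaling, 39.8 fps at 4C @ 2.5GHz
# The card with the "worse" scaling ratio is nearly twice as fast in
# absolute terms - the overhead difference the ratio alone hides.
```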

Also note that these were all early access games when tested and may not be indicative of patched performance, so data from these tests is questionable.

That's not exactly how you analyze "driver overhead". Some of those games favor NV GPUs a lot, as you can see by the huge gap that the 980 has. The comparison isn't overall fps, it's the fps loss as the CPU gets weaker or loses cores. There's practically no difference, and it also shows AMD doing just fine on 2 cores, which is the premise of this OP: that AMD performs worse with low-end Intel CPUs like the i3.

I've shown in GTA V and Witcher 3 that it isn't true, far from it. I've also shown recent tests by computerbase.de that prove AMD performs fine with i3s.

Any odd result is entirely due to NV-sponsored games like COD, where this was blown up into an issue when it's just a lack of optimization.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
That's not exactly how you analyze "driver overhead". Some of those games favor NV GPUs a lot, as you can see by the huge gap that the 980 has. The comparison isn't overall fps, it's the fps loss as the CPU gets weaker or loses cores. There's practically no difference, and it also shows AMD doing just fine on 2 cores, which is the premise of this OP: that AMD performs worse with low-end Intel CPUs like the i3.

I've shown in GTA V and Witcher 3 that it isn't true, far from it. I've also shown recent tests by computerbase.de that prove AMD performs fine with i3s.

Any odd result is entirely due to NV-sponsored games like COD, where this was blown up into an issue when it's just a lack of optimization.

And this is part of the whole issue, but not all of it. You must account for absolute performance as well, for the reasons I outlined. Needless to say, regardless of scaling, in some of the games (Wreckfest) the 980 is playable at 4C @ 2.5GHz at ~40 fps while the AMD 290X gets only 22 fps. Both are CPU limited, and it is apparent, regardless of how well the driver scales with cores and clockspeed, that the 980 is dealing with less overhead, as it is getting better fps.


Analyzing how performance scales is NOT explicitly measuring driver overhead. You could have some magic driver that gave 100 fps on a single 2.5GHz core (compared to AMD's and Nvidia's 22-40 fps on 4C) with absolutely no scaling to two cores. Does the magic driver scale well? No, it does not. Is it more efficient, with less overhead? Of course it is.
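A toy version of that thought experiment, with the "typical driver" numbers made up purely for contrast:

```python
# Thought experiment from above: (fps on 1 core @ 2.5GHz, fps on 2 cores).
# The "typical driver" figures are hypothetical, chosen only to contrast.
drivers = {
    "magic driver": (100.0, 100.0),  # huge fps, zero scaling to a 2nd core
    "typical driver": (15.0, 28.0),  # near-2x core scaling, low absolute fps
}

for name, (one_core, two_cores) in drivers.items():
    scaling = two_cores / one_core
    print(f"{name}: {scaling:.2f}x core scaling, {two_cores:.0f} fps on 2 cores")

# magic driver: 1.00x core scaling, 100 fps on 2 cores
# typical driver: 1.87x core scaling, 28 fps on 2 cores
# Ranked by scaling alone, the magic driver looks worst; ranked by
# absolute fps at the same CPU budget, it is obviously the most efficient.
```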
 
Feb 19, 2009
10,457
10
76
Wreckfest is severely GPU limited on Radeons; a GTX 760 is outperforming an R9 290X, which only gets ~35 fps at 1080p. The 980 is literally double that at ~60 fps.

But is it because the Radeons are CPU bottlenecked? Nope. Why? Because look at the 720p test: the R9 290X manages 45 fps, again maxing out.

In a CPU-limited scenario, fps would not go up when you lower the resolution. The game is just heavily NV biased, and the CPU tests show equal performance loss as a ratio, so there is no evidence there to suggest there's more driver overhead for AMD, none whatsoever.

One would have to find better examples than these, because the CPU penalty doesn't exist outside of NV-sponsored titles.
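The resolution test described above amounts to a simple heuristic. A minimal sketch, using the quoted R9 290X numbers (the 5% tolerance is an assumption for illustration):

```python
# Heuristic from the post above: if fps rises when resolution drops, the
# card was GPU limited at the higher resolution; if fps stays flat, the
# CPU (or driver) is the wall.
def likely_bottleneck(fps_high_res: float, fps_low_res: float,
                      tolerance: float = 0.05) -> str:
    if fps_low_res > fps_high_res * (1 + tolerance):
        return "GPU limited at the higher resolution"
    return "CPU/driver limited (fps did not rise at the lower resolution)"

# Quoted figures: R9 290X gets ~35 fps at 1080p and ~45 fps at 720p.
print(likely_bottleneck(35.0, 45.0))  # -> GPU limited at the higher resolution
```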
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Wreckfest is severely GPU limited on Radeons; a GTX 760 is outperforming an R9 290X, which only gets ~35 fps at 1080p. The 980 is literally double that at ~60 fps.

But is it because the Radeons are CPU bottlenecked? Nope. Why? Because look at the 720p test: the R9 290X manages 45 fps, again maxing out.

In a CPU-limited scenario, fps would not go up when you lower the resolution. The game is just heavily NV biased, and the CPU tests show equal performance loss as a ratio, so there is no evidence there to suggest there's more driver overhead for AMD, none whatsoever.

One would have to find better examples than these, because the CPU penalty doesn't exist outside of NV-sponsored titles.

My first post already questioned the accuracy of the data.

However, Wreckfest is showing some CPU scaling on the Radeon 290X: 3.5GHz 4C gets more fps than 2.5GHz 4C, so the 2.5GHz 4C setup is CPU limited.

My post was more about methodology than a specific case. My point is that core scaling alone does not show the entire picture of driver overhead; absolute fps at a given CPU performance level is also required. I mentioned that the data seemed weird.
 

Prefix-NA

Junior Member
May 17, 2015
8
0
36
Wreckfest is severely GPU limited on Radeons; a GTX 760 is outperforming an R9 290X, which only gets ~35 fps at 1080p. The 980 is literally double that at ~60 fps.

But is it because the Radeons are CPU bottlenecked? Nope. Why? Because look at the 720p test: the R9 290X manages 45 fps, again maxing out.

In a CPU-limited scenario, fps would not go up when you lower the resolution. The game is just heavily NV biased, and the CPU tests show equal performance loss as a ratio, so there is no evidence there to suggest there's more driver overhead for AMD, none whatsoever.

One would have to find better examples than these, because the CPU penalty doesn't exist outside of NV-sponsored titles.

Digital Foundry's videos were found to be faked, running different settings on the AMD cards. There is an issue, but they overstated it to make it look extreme for attention.

Also, AMD's Windows 10 drivers show huge improvements.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,893
5,825
136
Digital Foundry's videos were found to be faked, running different settings on the AMD cards. There is an issue, but they overstated it to make it look extreme for attention.

Also, AMD's Windows 10 drivers show huge improvements.

That's a big accusation to make without supporting evidence.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,893
5,825
136

It sounds like BS to me; I don't believe they have any kind of Nvidia bias. I was looking at the R9 280 before, and one of the big reasons I wanted it was that Digital Foundry showed how much better it performed than the higher-priced GTX 760. Of course, they were doing all their testing with i7s back then.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
How about for guys like me with my CPU? I was gonna buy a used 7970 for $150, but it might be better for me to buy the Nvidia equivalent.

I don't think it's going to make a tremendous amount of difference what you get for $150.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127760

You could go for that and call it a day: new card, lifetime warranty, and all your CPU could possibly handle. I really don't think a GTX 960 would do any better, more efficient driver overhead or not.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
I blew it and linked the wrong one - that's the one I meant to link, LTC ^^^
 

ibex333

Diamond Member
Mar 26, 2005
4,091
119
106
I don't get why some people are stressing the importance of quad core CPUs so much... I remember, about two years ago, maybe more, I was arguing with some folks on this forum about how quad core CPUs are irrelevant for gaming because so few games truly utilize them. Those folks said: "Just wait and see, in a few years your dual core will be garbage and quad cores will rule."


A few years passed and the overall situation is still the same. About 90% of games DO NOT benefit from quad core CPUs to any serious extent (serious, to me, meaning the game is unplayable on a dual core but playable on a quad core).

As a gamer on a budget, my thinking is this: "If a game 'requires' a quad core CPU for good performance, I simply don't play it." This could be a stupid train of thought, but there are so few such games that it really doesn't matter, in a sea of other games that will run just fine on a dual core.

What is better: avoiding that one game that needs a quad core, or spending $100-200 more on a quad core CPU? I think the answer is obvious. Why not play that one game a few years from now? It's not the only good game out there!

Besides, by avoiding such games, you are telling the devs that you won't have their BS, and that they should spend more time optimizing said games for dual cores, as opposed to rushing quad core technology that so few games and apps benefit from.

Say what you want, but my 2nd PC is an overclocked E6300 Conroe, and it's just as fast as my overclocked 2500K in most applications. Games are a different story, but about 50% of all games run just as well on the E6300.

When it comes to my 2500K, there are plenty of people out there who say it's time to upgrade. But why?! There is not a single game out there that my 2500K can't handle at at least 40 fps paired with a good GPU. Why in the world do I need a quad core?

One particular game that comes to mind, which everyone was touting a few years back as really benefiting from a quad core CPU, is Supreme Commander. Does it benefit from a quad core? Maybe. But does it run perfectly fine on a dual core? Absolutely!

Do we play games to have them run "fast enough" to be enjoyable, or do we play them to compare the difference in fps with each other?

If I am running a game at 30 fps and you are running it at 60 fps, does it really matter? 60 fps may be better, but 30 fps is plenty "good enough".
 

maddogmcgee

Senior member
Apr 20, 2015
384
303
136
When it comes to my 2500K, there are plenty of people out there who say it's time to upgrade. But why?! There is not a single game out there that my 2500K can't handle at at least 40 fps paired with a good GPU. Why in the world do I need a quad core? .... If I am running a game at 30 fps and you are running it at 60 fps, does it really matter? 60 fps may be better, but 30 fps is plenty "good enough".

1. The 2500K has 4 cores, not 2.
2. If 30 fps is fine for you and the games you play, that is great, but I like 60-110 if possible. To each our own.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I don't get why some people are stressing the importance of quad core CPUs so much... I remember, about two years ago, maybe more, I was arguing with some folks on this forum about how quad core CPUs are irrelevant for gaming because so few games truly utilize them. Those folks said: "Just wait and see, in a few years your dual core will be garbage and quad cores will rule."

A few years passed and the overall situation is still the same. About 90% of games DO NOT benefit from quad core CPUs to any serious extent (serious, to me, meaning the game is unplayable on a dual core but playable on a quad core).

As a gamer on a budget, my thinking is this: "If a game 'requires' a quad core CPU for good performance, I simply don't play it." This could be a stupid train of thought, but there are so few such games that it really doesn't matter, in a sea of other games that will run just fine on a dual core.

What is better: avoiding that one game that needs a quad core, or spending $100-200 more on a quad core CPU? I think the answer is obvious. Why not play that one game a few years from now? It's not the only good game out there!

Besides, by avoiding such games, you are telling the devs that you won't have their BS, and that they should spend more time optimizing said games for dual cores, as opposed to rushing quad core technology that so few games and apps benefit from.

Say what you want, but my 2nd PC is an overclocked E6300 Conroe, and it's just as fast as my overclocked 2500K in most applications. Games are a different story, but about 50% of all games run just as well on the E6300.

When it comes to my 2500K, there are plenty of people out there who say it's time to upgrade. But why?! There is not a single game out there that my 2500K can't handle at at least 40 fps paired with a good GPU. Why in the world do I need a quad core?

One particular game that comes to mind, which everyone was touting a few years back as really benefiting from a quad core CPU, is Supreme Commander. Does it benefit from a quad core? Maybe. But does it run perfectly fine on a dual core? Absolutely!

Do we play games to have them run "fast enough" to be enjoyable, or do we play them to compare the difference in fps with each other?

If I am running a game at 30 fps and you are running it at 60 fps, does it really matter? 60 fps may be better, but 30 fps is plenty "good enough".

I think it is obvious: you get the quad core, as you did.

Now, for a few notes. FPS doesn't tell the whole tale. Sometimes the FPS doesn't change much, at least when it comes to the average. It is the hiccups that get annoying and the minimums that stand out. People focus a lot on the average, but it is not when things are going well that we experience poor gameplay.
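A quick sketch of why averages hide the hiccups, using made-up frame times purely for illustration:

```python
# Made-up frame times (ms) for illustration: mostly smooth 60 fps
# frames with a few spikes - the "hiccups" described above.
frame_times = [16.7] * 95 + [50.0] * 5  # 95 smooth frames, 5 stutters

avg_fps = 1000 / (sum(frame_times) / len(frame_times))
worst_5pct = sorted(frame_times)[-5:]  # the 5 slowest frames
low_fps = 1000 / (sum(worst_5pct) / len(worst_5pct))

print(f"average: {avg_fps:.0f} fps")  # average: 54 fps - looks fine
print(f"5% lows: {low_fps:.0f} fps")  # 5% lows: 20 fps - feels awful
```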

I also don't think most serious gamers are after "good enough". We want to play in optimal conditions. For me, 40 FPS is nausea inducing, so I'm not going to settle for 40 FPS.

If you stick to 2-core systems, then those games which need or benefit from 4 cores will remain out of reach no matter how many years you wait. If you want to play them, and enjoy them, you have to fork out the extra $100 for a 4-core. Not only that, but if you do stick to 2-core CPUs, how often do you have to upgrade? I don't see many people still on Nehalem and Sandy Bridge i3's, because they won't hold out as long. Yet you see plenty of i5's and i7's, because they can stand the test of time. As a result, they actually cost less in the long run.

And that last bit about how devs should optimize games for 2 cores is ridiculous. The reason the CPU is falling way behind GPUs is that devs are not optimizing for 4+ cores. The more cores devs optimize games for, the more FPS we can enjoy. Games like Arma and DayZ are stuck at 30 FPS and under when you play multiplayer because they use only 2 cores.

And the fact that you haven't upgraded your i5 2500K and I haven't upgraded my i7 920 goes to show that more cores bring a great advantage in longevity. When OC'ed to 4GHz, they are still better than any i3 and most stock i5's today. These are great examples of why we recommend 4-core CPUs.
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,414
401
126
I don't see many people still on Nehalem and Sandy Bridge i3's, because they won't hold out as long.
A highly clocked Nehalem is still relevant today. Hell, I have a spare i7 920 @ 4.1GHz rig that I just upgraded to an X5675 @ 4.5GHz, and it is a screamer!
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
A highly clocked Nehalem is still relevant today. Hell, I have a spare i7 920 @ 4.1GHz rig that I just upgraded to an X5675 @ 4.5GHz, and it is a screamer!

That is what I was talking about. I still have an i7 920 too, but do you know anyone with an i3 from those generations?