Nvidia drivers give lower performance than AMD's due to high CPU overhead/inefficient


cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Again, have a look at the results of the Phenom II X2 550 in the AlienBabelTech review.

http://alienbabeltech.com/main/?p=22167

You will see that both the GTX 480 and HD 5870 exhibit the same behavior with the dual-core Phenom II X2 550, and you will not find a trend forming to justify saying AMD cards perform better with dual-core CPUs.

Here's where I see it:

Left 4 Dead
It's minimal, but you can see it if you look at the dual-core speed scaling: the 5870 drops from 108 to 97 fps and the 480 drops from 104 to 91 (at 1920x1200).

X3:TC
This game is obviously hardcore CPU-limited, so it doesn't quite show the same kind of scaling. What it does show is that when the 480 is CPU-constrained it ever so slightly trails the 5870. But if you look at the i7-920 @ 3.8 GHz results you'll indeed see the 480 slightly ahead of the 5870.

Far Cry 2
The results speak for themselves, really.

World in Conflict
The game shows a similar pattern to X3:TC, but it's easier to see here. With the slowest processor the 5870 is clearly faster. With the fastest processor the 480 is clearly faster.

Just Cause 2
The 480 needs a Core i7 to best the 5870, otherwise it falls behind by a significant margin.

Resident Evil 5
Well, there are odd results if you look at the Phenom II X4 numbers, but if you look at everything else, especially at lower resolutions, you can see the pattern. With an i7-920 at 2.6 GHz the 5870 is faster; with the i7-920 at 3.8 GHz the 480 is faster. And if you just focus on the dual-core scaling numbers you'll see a similar pattern as with X3:TC, where both cards are bottlenecked but the 480 is ever so slightly behind.

HAWX
There are some interesting things going on with a single AMD card on the Intel platform at lower resolutions. But if you look at 2560x1600 you can see a similar pattern: the 5870 is ahead when paired with the slowest processor and the 480 pulls ahead when paired with a faster processor. Also, if you just focus on the dual-core results at various clock speeds, you will see the trend as well: both cards are bottlenecked and the AMD card is slightly faster, but the 480 does indeed pull ahead at 3.8 GHz at higher resolutions.

-----------------------------------

So it doesn't happen all the time, but it happens often enough that it can't be dismissed.

More testing needs to be done on midrange cards. We don't have to see every card, but it would be interesting to have the 6850, 460, 450, and 5750 have a go with more reasonable CPU choices: An Athlon II X2, Athlon II X3, Core i3, Phenom II X4, Core i5, and Core i7.
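
Not part of the original post, but a minimal Python sketch of the arithmetic behind the Left 4 Dead comparison above; it uses only the FPS figures already quoted and expresses each card's loss as a percentage when the CPU clock is dropped.

```python
# A quick check of the Left 4 Dead dual-core scaling quoted above:
# how much each card loses, in percent, when the CPU clock is dropped.
fps = {
    "HD 5870": (108, 97),  # (faster CPU clock, slower CPU clock), 1920x1200
    "GTX 480": (104, 91),
}

for card, (fast, slow) in fps.items():
    drop = (fast - slow) / fast * 100
    print(f"{card}: {fast} -> {slow} fps ({drop:.1f}% drop)")

# HD 5870: 108 -> 97 fps (10.2% drop)
# GTX 480: 104 -> 91 fps (12.5% drop)
```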
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Some examples showing some hefty FPS differences with 2 vs 4 cores on Fermi.

I'm not talking about 2 cores vs. 4 cores.
I'm talking about a 2-core CPU with a GTX 480 vs. a 2-core CPU with an HD 5870.



Here's where I see it
-----------------------------------

So it doesn't happen all the time, but it happens often enough that it can't be dismissed.

More testing needs to be done on midrange cards. We don't have to see every card, but it would be interesting to have the 6850, 460, 450, and 5750 have a go with more reasonable CPU choices: An Athlon II X2, Athlon II X3, Core i3, Phenom II X4, Core i5, and Core i7.


Out of 20 games, there are 3 or 4 that exhibit this behavior.

I'm with you that we could use more data from testing midrange graphics cards with dual cores or lower-end quad cores to see whether this behavior is repeated in more games.

The OP's title comes from a single source (3DMark 11); that's why I'm saying it is not valid, cannot be generalized, and is misleading.
 

96Firebird

Diamond Member
Nov 8, 2010
5,735
329
126
So... when this non-issue is fixed, nVidia users will get even better performance? Nice! AMD better be ready to adjust prices if need be.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
HAHA if you read the latest feedback for the XFX 5970 on newegg, from a verified owner, it makes this day even better.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Here's where I see it:

Left 4 Dead
It's minimal, but you can see it if you look at the dual-core speed scaling: the 5870 drops from 108 to 97 fps and the 480 drops from 104 to 91 (at 1920x1200).

X3:TC
This game is obviously hardcore CPU-limited, so it doesn't quite show the same kind of scaling. What it does show is that when the 480 is CPU-constrained it ever so slightly trails the 5870. But if you look at the i7-920 @ 3.8 GHz results you'll indeed see the 480 slightly ahead of the 5870.

Far Cry 2
The results speak for themselves, really.

World in Conflict
The game shows a similar pattern to X3:TC, but it's easier to see here. With the slowest processor the 5870 is clearly faster. With the fastest processor the 480 is clearly faster.

Just Cause 2
The 480 needs a Core i7 to best the 5870, otherwise it falls behind by a significant margin.

Resident Evil 5
Well, there are odd results if you look at the Phenom II X4 numbers, but if you look at everything else, especially at lower resolutions, you can see the pattern. With an i7-920 at 2.6 GHz the 5870 is faster; with the i7-920 at 3.8 GHz the 480 is faster. And if you just focus on the dual-core scaling numbers you'll see a similar pattern as with X3:TC, where both cards are bottlenecked but the 480 is ever so slightly behind.

HAWX
There are some interesting things going on with a single AMD card on the Intel platform at lower resolutions. But if you look at 2560x1600 you can see a similar pattern: the 5870 is ahead when paired with the slowest processor and the 480 pulls ahead when paired with a faster processor. Also, if you just focus on the dual-core results at various clock speeds, you will see the trend as well: both cards are bottlenecked and the AMD card is slightly faster, but the 480 does indeed pull ahead at 3.8 GHz at higher resolutions.

-----------------------------------

So it doesn't happen all the time, but it happens often enough that it can't be dismissed.

More testing needs to be done on midrange cards. We don't have to see every card, but it would be interesting to have the 6850, 460, 450, and 5750 have a go with more reasonable CPU choices: An Athlon II X2, Athlon II X3, Core i3, Phenom II X4, Core i5, and Core i7.

:thumbsup:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
HAHA if you read the latest feedback for the XFX 5970 on newegg, from a verified owner, it makes this day even better.

When nVidia drivers are used on 5970's this might be relevant. Until then though, it's OT and a thread crap.

If your intention is to debunk the premise of the thread, I'd advise you find some pertinent reliable information to back your position.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
When nVidia drivers are used on 5970's this might be relevant. Until then though, it's OT and a thread crap.

If your intention is to debunk the premise of the thread, I'd advise you find some pertinent reliable information to back your position.

[image: marauder.jpg]


It's about to get heavy!
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Some examples showing some hefty FPS differences with 2 vs 4 cores on Fermi.

Based on that comment, I am assuming you are implying that Fermi runs faster with 4 cores vs. 2? You presupposed the very conclusion you were trying to prove without considering other factors, such as the game itself running faster with more cores regardless of what GPU it's paired with. :hmm:

GTA IV and Dragon Age: Origins are actually two games that benefit from having more than 2 CPU cores. Those benchmarks have nothing to do with Fermi cards specifically running faster with more cores. Those two games run MUCH faster with 3/4 cores and any modern NV/AMD graphics card.

GTA IV: E8600 @ 3.33 GHz vs. Q9550 @ 2.83 GHz, both paired with an HD 5870 - the quad core is 41% faster with an AMD card.
http://www.pcgameshardware.com/aid,...ead-of-Core-2-Quad-in-CPU-benchmarks/Reviews/

DA:O: E8400 @ 3.6 GHz vs. Q6600 @ 3.6 GHz, both paired with an HD 5870 - the quad core is 64% faster with an AMD card.
http://www.pcgameshardware.com/aid,...rks-75-percent-boost-for-quad-cores/Practice/

So you can't just generalize that Fermi runs slower with 2 cores and "suddenly" starts running faster if you throw 4 cores at it... it just depends on the actual game, the clock speed of the CPU, the specific CPU architecture, etc. It's pretty clear that you are far better off choosing an Intel Core i5/i7 system as the backbone of your build in the first place and only then worrying about which of two comparable cards to get (such as the $350 HD 6970 or GTX 570). In fact, it is the CPU differences between Intel and AMD that have been majorly downplayed in gaming when comparing similar cards. From the review I linked, a Core i5/i7 smokes a Phenom II X4 system in games like FC2 and Supreme Commander, even when paired with an HD 5870. :D
 
Last edited:

amenx

Diamond Member
Dec 17, 2004
4,300
2,632
136

You've been told this before, but Intel holds more than 50% of the graphics market yet accounted for 8.8 percent of reported crashes. I think you need to go ask a statistician for help there, buddy :colbert:.
Re the Vista crash chart: well, it was a troublesome OS in its early days and many hardware vendors were having driver issues. And bear in mind that simple integrated GPUs are far, far less complicated than discrete GPUs. On that front, Nvidia was in its most successful period then with its 8800 series, which outshone ATI performance-wise and sales-wise in that period. So the crash figures are only meaningful relative to how many cards Nvidia and ATI each sold.
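
To make the "relative to sales" argument concrete, here is a hedged sketch of the normalization being described: a vendor's share of reported crashes only means something next to its share of the installed base. Intel's figures (roughly 50% share, 8.8% of crashes) come from the quote above; the Nvidia and ATI numbers are purely hypothetical placeholders.

```python
# Crash share is only meaningful relative to market share.
# Intel's figures come from the quote above; the other rows are
# hypothetical placeholders used only to illustrate the calculation.
vendors = {
    #          (share of GPU market, share of reported crashes)
    "Intel":   (0.50, 0.088),
    "Nvidia":  (0.30, 0.50),   # hypothetical
    "ATI":     (0.20, 0.40),   # hypothetical
}

for name, (market, crashes) in vendors.items():
    ratio = crashes / market
    print(f"{name}: {crashes:.1%} of crashes on {market:.0%} of the market "
          f"-> {ratio:.2f}x its proportional share")
```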
 
Last edited:

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
It's pretty clear that you are far better off choosing an Intel Core i5/i7 system as the backbone of your build in the first place and only then worrying about which of two comparable cards to get (such as the $350 HD 6970 or GTX 570). In fact, it is the CPU differences between Intel and AMD that have been majorly downplayed in gaming when comparing similar cards.

That doesn't invalidate the point of this thread. You're using a strawman to come up with a limited scenario when the issue we're discussing covers more scenarios than just first-time system builders. There are plenty of users who already have Phenom II, Core 2, or Athlon platforms and are considering a GPU upgrade, and for those people this information is useful.

RussianSensation said:
So you can't just generalize that Fermi runs slower with 2 cores and "suddenly" starts running faster if you throw 4 cores at it... it just depends on the actual game, the clock speed of the CPU, the specific CPU architecture, etc.

You can clearly see CPU scaling having different effects on GPU performance per vendor in a few of the AlienBabelTech tests. The Tom's article does not show this, but he wasn't exactly generalizing with his statement. His intent is debatable, but this time you assumed he was generalizing when he may not have been. He did say "some examples"; he did not say it happens all the time. And you also need to understand the context he was posting in. Someone mentioned a Tom's article that might be relevant to this thread and he tried his best to find it. He said "Could only turn this one up, only using a GTX 460 and various cpus", implying he wasn't as successful as he would have liked. So he posted results with factual commentary, but his commentary and the results don't directly address the topic of this thread. It was basically a fringe comment, and it seems pretty self-contained.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Out of 20 games, there are 3 or 4 that exhibit this behavior.

I'm with you that we could use more data from testing midrange graphics cards with dual cores or lower-end quad cores to see whether this behavior is repeated in more games.

The OP's title comes from a single source (3DMark 11); that's why I'm saying it is not valid, cannot be generalized, and is misleading.

Well, it's 7 games, and almost every game in the X-bit article.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Well, it's 7 games, and almost every game in the X-bit article.

But it's not only NV that benefits from increased CPU clock speed. In fact, a handful of slower AMD cards respond better when paired with a faster CPU:

1920x1080 - i7 975EE vs. i5 750

Average performance increase

GTX 275 = 1.7%
GTX 470 = 6.2%
HD 5770 = 6.3% (the HD 5770 actually benefits just as much as a GTX 470)
HD 5850 = 9.3% (the HD 5850 benefits more than the faster GTX 470)
HD 5970 = 10.4%
GTX 480 = 13.9%

The bottom line is, it just depends on the graphics card and the game. Generalizations such as the one proposed in this thread - that a user is better off going with an HD 6850 over a GTX 460 if he has a slower CPU - are unsubstantiated. The only way to show that is to actually test that specific CPU architecture with those cards - that's what a CPU scaling article addresses.
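
For context, a minimal sketch of how a per-card "average performance increase" figure like the ones above is typically derived: a percentage gain per game when moving from the i5 750 to the i7 975EE, then a plain mean across games. The FPS pairs below are invented placeholders; only the method is the point.

```python
# How an "average performance increase" per card is typically computed:
# percentage gain per game from the slower to the faster CPU, then a mean.
# The FPS pairs below are invented placeholders; only the method matters.
games = {
    "Game A": (62, 70),   # (fps with i5 750, fps with i7 975EE)
    "Game B": (45, 46),
    "Game C": (90, 118),
}

gains = [(fast - slow) / slow * 100 for slow, fast in games.values()]
for name, gain in zip(games, gains):
    print(f"{name}: +{gain:.1f}%")
print(f"average increase: {sum(gains) / len(gains):.1f}%")
```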
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
But it's not only NV that benefits from increased CPU clock speed. In fact, a handful of slower AMD cards respond better when paired with a faster CPU:

1920x1080 - i7 975EE vs. i5 750

Average performance increase

GTX 275 = 1.7%
GTX 470 = 6.2%
HD 5770 = 6.3% (the HD 5770 actually benefits just as much as a GTX 470)
HD 5850 = 9.3% (the HD 5850 benefits more than the faster GTX 470)
HD 5970 = 10.4%
GTX 480 = 13.9%

The bottom line is, it just depends on the graphics card and the game. Generalizations such as the one proposed in this thread - that a user is better off going with an HD 6850 over a GTX 460 if he has a slower CPU - are unsubstantiated. The only way to show that is to actually test that specific CPU architecture with those cards - that's what a CPU scaling article addresses.

Personally, I think RE5 and SF4 skew those percentages badly. I mean, how is a 275 faster with an i5 than with an i7?

MW2 is really odd as well.
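
Skurge's point, that a couple of outlier titles can badly skew an average of per-game gains, is easy to see by comparing the mean with the median on a toy data set; the percentages below are invented purely for illustration.

```python
import statistics

# Invented per-game CPU-scaling gains (%) for one card. Two outlier titles
# (think RE5 / SF4) dominate the plain mean, while the median stays close
# to what the typical game actually shows.
gains = [1.5, 2.0, 0.5, 3.0, 2.5, 35.0, 28.0]

print(f"mean:   {statistics.mean(gains):.1f}%")    # ~10.4%, pulled up by the outliers
print(f"median: {statistics.median(gains):.1f}%")  # 2.5%, the typical game
```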
 
Last edited:

reb0rn

Senior member
Dec 31, 2009
279
90
101
I would like to ask about the huge difference between the 6850 and 6870 in the 3DMark 11 physics test from the lostcircuits.com site.
[image: x61100.jpg]


I would ask whether the test is valid; they blame NV only, yet we also see quite a huge difference between the 6870 and 6850... how is that possible if we blame only NV for this test?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Somehow understanding how to read a simple bar graph is dishonest now... *face palm*

Making a bar graph where the bars do not start at zero is indeed dishonest.
It makes the information gleaned from a cursory glance false, requiring careful examination to notice the true data.
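
Here is a small matplotlib sketch, not from the thread, of the effect taltamir is describing: the same two arbitrary scores plotted once with a truncated y-axis and once with a zero-based one. At a cursory glance the first panel makes a roughly 5% gap look enormous.

```python
import matplotlib.pyplot as plt

# Two arbitrary scores about 5% apart.
cards, scores = ["Card A", "Card B"], [100, 105]

fig, (ax_truncated, ax_zero) = plt.subplots(1, 2, figsize=(8, 3))

# Truncated axis: the bars suggest a huge difference at a glance.
ax_truncated.bar(cards, scores)
ax_truncated.set_ylim(98, 106)
ax_truncated.set_title("y-axis starts at 98")

# Zero-based axis: bar heights reflect the actual proportion.
ax_zero.bar(cards, scores)
ax_zero.set_ylim(0, 120)
ax_zero.set_title("y-axis starts at 0")

plt.tight_layout()
plt.show()
```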
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
Esoteric claims with little or no bearing on gaming performance, and subjective claims about driver quality. There have been way too many "insert GPU company here has the best drivers" claims. I don't think either company is producing sub-standard drivers, but given that they are drivers, occasionally you're going to have a situation where one might be optimized a little better than the other for a given game, benchmark, or task. I know I won't lose sleep over it.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Making a bar graph where the bars do not start at zero is indeed dishonest.
It makes the information gleaned from a cursory glance false, requiring careful examination to notice the true data.
Again, understanding how to read a simple bar graph, which is something kids learn in elementary school, is now somehow dishonest *face palm*
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Again, understanding how to read a simple bar graph, which is something kids learn in elementary school, is now somehow dishonest *face palm*

You know full well that nobody was discussing the ability to read a bar graph.
It has to do with how the data is presented and how much attention is lavished on it. Misleading titles, weasel words, and misleading pictures are dishonest, even if careful examination reveals the truth.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
HAWX
There are some interesting things going on with a single AMD card on the Intel platform at lower resolutions. But if you look at 2560x1600 you can see a similar pattern: the 5870 is ahead when paired with the slowest processor and the 480 pulls ahead when paired with a faster processor. Also, if you just focus on the dual-core results at various clock speeds, you will see the trend as well: both cards are bottlenecked and the AMD card is slightly faster, but the 480 does indeed pull ahead at 3.8 GHz at higher resolutions.


Both cards are bottlenecked?

Not really. They are limited to only around 80 FPS or so at 1920x... that's not limiting jack. Typical monitors only show 60 FPS, and my own testing of myself has me unable to tell differences above about 35-40 FPS. There's tons of headroom before this "bottleneck" actually affects how any real person would perceive the game's performance as anything less than absolutely perfect.

So many people incorrectly apply the term bottleneck. There is no bottleneck when a game is showing 90 FPS; performance will be perfect from the user's point of view. If you want more, and have already maxed out graphical details, the only thing you can do that you'll even notice is to get a bigger monitor... and once you do that... hey, guess what... the CPU is no longer limiting anything at 2560x. Same numbers at 2.6 GHz and 3.8 GHz, except for the dual core.

Is a limitation that caps performance at a value a real person can't actually perceive really a limitation? I say no, it isn't. True CPU bottlenecks that limit performance to a point where people can actually tell the difference are damn near nonexistent in today's games.

In my mind, if a graphics driver can deliver better graphical performance by increasing CPU usage, it should do that, because in 100% of the games I actually play, at the settings I actually use, my CPU is not limiting my perceived performance in any way.
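
A rough sketch, not from the post, of the reasoning above: the delivered frame rate is whichever of the CPU or GPU ceiling is lower, and that cap only matters if it falls below what the display or player can perceive (taken here as 60 FPS). All numbers are illustrative, not benchmark data.

```python
# Delivered frame rate is capped by the slower of the CPU and GPU ceilings;
# the cap only "matters" if it falls below what the player can perceive.
# All numbers are illustrative, not benchmark data.
PERCEIVABLE_FPS = 60  # assume a typical 60 Hz display

scenarios = [
    # (description, CPU-side ceiling, GPU-side ceiling)
    ("1920x1200, fast quad core", 150, 120),
    ("1920x1200, slow dual core",  90, 120),  # CPU-limited, but still > 60 fps
    ("2560x1600, slow dual core",  90,  55),  # GPU-limited and below 60 fps
]

for name, cpu_fps, gpu_fps in scenarios:
    fps = min(cpu_fps, gpu_fps)
    limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
    verdict = "noticeable" if fps < PERCEIVABLE_FPS else "imperceptible to the player"
    print(f"{name}: {fps} fps, {limiter}-limited, {verdict}")
```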
 
Last edited: