
CPU or GPU for games? - Intel E6850 Bottleneck Investigation


toyota

Lifer
Apr 15, 2001
OK, so what did we learn? For many first-person shooters released in the last 4 years, the CPU is not as important as the video card at high resolution with FSAA enabled. I have no trouble believing that.

Now let's mix it up a bit with some RTSes, MMOs, and sims (flight and racing), including some poor console ports, shall we? I suggest:

GTA IV
Anno 1404
Supreme Commander
X3: Reunion, final mission
X3: Terran Conflict, Aldrin system
MS Flight Sim X
Warcraft III, peak time in Dal, running 2 clients
Eve Online, running 4 clients, one of which is in an 800+ man fleet fight
Evony (a Flash web game), running 2 or more browser tabs

It doesn't come as a big shock that most first-person shooters don't need the CPU to do very much other than compute whether you properly shot something in the face. That problem was adequately solved to run on Pentium IIIs with Half-Life. That doesn't mean that other game situations won't choke an overclocked i7.
So what kind of games are Far Cry 2, Red Faction Guerrilla, and Ghostbusters? Metro 2033 is an FPS that also lists a quad-core as the recommended CPU.
 

BD231

Lifer
Feb 26, 2001
This guy always seems like he's defending himself in his articles

Why does my 4850 perform so much better when I overclock my Phenom II X3 from 2.8 GHz to 3.8? He acts as if there's nothing to gain from more computational power; that simply isn't the case.
 

tweakboy

Diamond Member
Jan 3, 2010
Well, a CPU can't do a GPU's job, which is why you get 1 fps in the CPU tests in 3DMark.

The time will come when the CPU will have a GPU on it. Not anytime soon though, so don't hold your breath. I'm talking the 2014-to-2020 Intel era. Eventually the CPU will be able to do ray tracing in real time, etc.
 

zerocool84

Lifer
Nov 11, 2004
So what kind of games are Far Cry 2, Red Faction Guerrilla, and Ghostbusters? Metro 2033 is an FPS that also lists a quad-core as the recommended CPU.

Battlefield: Bad Company 2 also lists a quad-core as recommended. As we all know, RTSes benefit greatly from quads, and certain FPSes will as well. Bad Company 2 uses Havok Physics, which I'm sure is part of the reason it runs better on a quad.
 

jtisgeek

Senior member
Jan 26, 2010
I think what it comes down to is if you have, say, a Core 2 E8400 with a 4850.

If you wanted to upgrade to, say, a 5850 with the same CPU, you would get a lot more bang for your buck than going to, say, an i7 920 with the 4850.

Also, when it comes down to it, like he kind of said, most people don't game on 30-inch screens either.
 

toyota

Lifer
Apr 15, 2001
I think what it comes down to is if you have, say, a Core 2 E8400 with a 4850.

If you wanted to upgrade to, say, a 5850 with the same CPU, you would get a lot more bang for your buck than going to, say, an i7 920 with the 4850.

Also, when it comes down to it, like he kind of said, most people don't game on 30-inch screens either.
But does that ever really come up for debate? Most of us know that if you have a decent CPU, then a GPU upgrade will offer the best benefit in most cases. The problem is that plenty of people with really slow CPUs think about getting top-of-the-line cards, which is silly. Also, sometimes those same people may already have a GPU that is being limited by the CPU. In many cases an upgrade of both is needed if the goal is to play modern games smoothly on high settings.
 

blanketyblank

Golden Member
Jan 23, 2007
This topic is particularly interesting to me since I've got an E7300 at 3.6 and a 5770, so both are potential limits depending on the game being played. Since I'm right at the border of 30 fps in parts of games like Dragon Age, I want to know what to upgrade to keep it that way in the future.
 

zerocool84

Lifer
Nov 11, 2004
This topic is particularly interesting to me since I've got an E7300 at 3.6 and a 5770, so both are potential limits depending on the game being played. Since I'm right at the border of 30 fps in parts of games like Dragon Age, I want to know what to upgrade to keep it that way in the future.

What resolution do you game at? That card/processor combo can max that game out easily.
 

blanketyblank

Golden Member
Jan 23, 2007
1920x1080 with everything maxed in game. The game runs fine, but in places like the Denerim marketplace the frame rate can swing between 28 and 40-something according to FRAPS.
I plan to upgrade when, in the games I play, I can no longer hold 1920x1080 at medium settings with frame rates above 30 at least 95% of the time.
I'm just thinking that day may be pretty close, with all these new games (Dragon Age, Metro 2033, Bad Company 2) starting to make use of quad cores.
 

cbn

Lifer
Mar 27, 2009
the problem is that plenty of people with really slow CPUs think about getting top-of-the-line cards

You make good arguments about balancing GPU with CPU power.

I'd also like to mention that a surprising number of folks also claim to be using older low-resolution 12x10 monitors (which further compounds the value-per-dollar problem).

This makes me question the benefits of even running higher AA with a better card (assuming the CPU is up to the task). Couldn't the monitor itself limit the gains in image quality?

Electronic benchmarks are just that: electronic benchmarks. I want to know at what point the monitor's quality itself needs to be matched to the video card's capabilities. Unfortunately, this is something that needs either a very trained eye or sophisticated equipment capable of measuring final image quality at the level of the LCD.
 

jtisgeek

Senior member
Jan 26, 2010
But does that ever really come up for debate? Most of us know that if you have a decent CPU, then a GPU upgrade will offer the best benefit in most cases. The problem is that plenty of people with really slow CPUs think about getting top-of-the-line cards, which is silly. Also, sometimes those same people may already have a GPU that is being limited by the CPU. In many cases an upgrade of both is needed if the goal is to play modern games smoothly on high settings.

Yeah, it does sometimes. Unless you have a really slow CPU there will always be a gain from going to, say, a 5850. Does it make sense to us hardcore hardware guys? No, not really, but hey, to each their own.
 

BFG10K

Lifer
Aug 14, 2000
This guy always seems like he's defending himself in his articles
A lot of people don't like it when the data I present bucks commonly accepted beliefs. I can't help it if my testing proves me right.

Why does my 4850 perform so much better when I overclock my Phenom II X3 from 2.8 GHz to 3.8?
Can you please list the games, their settings, and hard numbers to back that claim? Thanks.

He acts as if there's nothing to gain from more computational power; that simply isn't the case.
Nope, I'm just saying that the importance of the CPU for gaming is vastly overstated, especially with respect to having more than two cores. A GPU is far more important in most situations, and will almost always yield the biggest benefits to gaming.

I'd rather have a 2 GHz dual-core CPU paired with a Fermi. You can keep your 4850 and pair it with any processor you please. In the end, my gaming performance overall will be far better than yours.
 

toyota

Lifer
Apr 15, 2001
A lot of people don't like it when the data I present bucks commonly accepted beliefs. I can't help it if my testing proves me right.


Can you please list the games, their settings, and hard numbers to back that claim? Thanks.


Nope, I'm just saying that the importance of the CPU for gaming is vastly overstated, especially with respect to having more than two cores. A GPU is far more important in most situations, and will almost always yield the biggest benefits to gaming.

I'd rather have a 2 GHz dual-core CPU paired with a Fermi. You can keep your 4850 and pair it with any processor you please. In the end, my gaming performance overall will be far better than yours.
Having a $500+ GPU running on a 2.0 GHz CPU that stifles over 50% of its performance in many games would be asinine. To me, having a $500+ video card and getting the same minimum framerate in many games that I would with a $200 card is stupid. LOL at getting low teens and even single digits in CPU-intensive games while using a $500+ video card.

You always act like it's either get a CPU and stick with a lower-end GPU, or buy a very high-end GPU no matter how much performance gets wasted. There are options in between, you know.
 

BFG10K

Lifer
Aug 14, 2000
For you to say adding two more cores does not help or hurt gaming performance, you'd need to actually do just that.
Uh, no. I don’t think you understand how bottlenecking works. If two cores fully saturate my GTX285 so that it’s the primary performance limitation, why do you think an extra two cores will remove that bottleneck? Do you think those extra two cores will make the GPU run faster? Of course they won’t.

If I have two lanes merging into a single lane bridge, that bridge is always a single lane, even if I increase the lanes merging into it to four.
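
To put some made-up numbers on that (just a toy sketch on my part, not benchmark data): treat each frame as gated by whichever of the CPU or GPU takes longer, and extra cores buy you nothing once the GPU side is the longer of the two.

```python
# Toy bottleneck sketch (made-up numbers, not benchmark data): a frame can't
# finish faster than the slower of the CPU work and GPU work feeding it.

def fps(cpu_ms, gpu_ms):
    """Approximate frame rate when per-frame CPU and GPU work overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 25.0  # hypothetical GPU-bound load: 25 ms/frame caps things at 40 fps

print(fps(cpu_ms=12.0, gpu_ms=GPU_MS))  # two cores' worth of CPU work: 40.0 fps
print(fps(cpu_ms=6.0,  gpu_ms=GPU_MS))  # twice the cores, half the CPU time: still 40.0 fps
```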

And for the record, my i5 750 also shows practically no performance difference over my E6850 at 3 GHz in exactly the same gaming situations, and that’s with four cores turboing to 3.2 GHz. I just haven’t gotten around to putting up the results yet.

Upgrading to an i5 750 has largely been a waste of time and money because it’s my GTX285 that is holding me back almost 100%.

Well, at least the upgrade will get the “you’re CPU limited with your E6850” people off my back, the sorts of people that refuse to believe the reality I show them.

Second, your methodology is to always increase the graphics detail to the very limit of the GPU, either by increasing the resolution or by adding increasing amounts of AA/SSAA, so of course the GPU will be the limit in such a case. I think the goal of most gamers is to always have playable frame rates without sacrificing image quality or resolution.
I’m pretty sure I repeatedly mentioned in that article that the settings I used are the actual settings I play the games at. As such my goal is exactly the same as the goal of most gamers.

As such, we need to know what FPS you are actually getting in your real-world games, and your video card settings need to be more standardized.
We already know the real-world FPS because the settings I used are the actual settings I game on my GTX285.

“Standardizing” is just another word for “skewing to artificially inflate CPU differences”. I’m not going to run everything at 1680x1050 with no AA, because I don’t play games at those settings. Someone with a 4 GHz i7 won’t be playing games at such settings either.

The fact is, if you always configure your games to run at the highest playable settings for your particular GPU, the GPU will bottleneck you by far. I’m pretty sure I also mentioned this in the article too.

In other words, there is already a flaw in your methodology once you had to use 1680x1050 for STALKER: Clear Sky instead of 1920x1200, since that means the game is unplayable due to either your GPU or your CPU. I assume the former, though.
You’re wrong about the first part, but right about the second. Yes, Clear Sky is too slow to run at 1920x1200 but that’s exactly the point, and it doesn’t make the methodology flawed. Like I said, I presented the highest playable settings for my GTX285, the same settings that I game at.

So when I play Clear Sky, those are exactly the settings I use (1680x1050 with 2xTrSS). The same applies to the rest of the games I tested.

Thus the main difference I'd like to see is not average fps, but minimum fps.
A minimum by definition is a single data point, and as such is useless without a plot putting it into perspective. I can have a lower minimum at a single point, but if the rest of the benchmark run has a higher framerate, the minimum is useless.

That and in my experience there’s usually a strong correlation between average and minimum FPS, anyway.
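
For what it's worth, here's a rough illustration with invented frame times (my own assumption, not logged data) of why a lone minimum is misleading while a sustained dip is what actually drags the average down:

```python
# Two invented 100-frame traces (milliseconds per frame), not real logs:
# one with a single 50 ms hitch, one with a long stretch at roughly 30 fps.
single_hitch = [16.7] * 95 + [50.0] + [16.7] * 4
sustained    = [16.7] * 60 + [33.3] * 40

def avg_and_min_fps(frame_ms):
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)  # frames divided by total time
    min_fps = 1000.0 / max(frame_ms)                  # slowest single frame
    return round(avg_fps, 1), round(min_fps, 1)

print(avg_and_min_fps(single_hitch))  # (58.7, 20.0): scary minimum, average barely moves
print(avg_and_min_fps(sustained))     # (42.8, 30.0): higher minimum, but the average tanks
```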
 

toyota

Lifer
Apr 15, 2001
When I refer to minimum framerate I don't mean just one single dip. I am talking about continuously lower framerates in intensive parts of a game, and low dips that affect playability. Again, there is no way I would buy a very high-end GPU and still put up with that. That Far Cry 2 bench I posted should speak for itself in showing how much performance is wasted even with just a GTX 260 and at pretty GPU-limited settings. Games like Ghostbusters and Red Faction Guerrilla are even worse when it comes to minimum framerate, becoming basically unplayable at times with my CPU at 2.0.
 

BFG10K

Lifer
Aug 14, 2000
He acts like his CPU at 2.0 has very little effect even on games like Far Cry 2, which is completely false.
It isn’t false at all. The benchmark figures are derived from the game’s logs. If you think the game is lying then you need to provide evidence of that.

That's a 40% increase in average and a 58% increase in minimum framerates from running my E8500 at 3.8 instead of 2.0. With a faster card and a faster quad the difference would have been even greater.
You used a different benchmark, a lower resolution, and likely no AF or TrAA either. You also didn’t test the effects of your GPU. Get back to me when you actually do it properly.

Having a $500+ GPU running on a 2.0 GHz CPU that stifles over 50% of its performance in many games would be asinine.
That wouldn’t happen if you were running games at their highest playable settings. We know this because the GTX285 was bottlenecking my E6850 @ 2 GHz by almost 100%, and moving to 3 GHz had little to no benefit.

We also know this because a 33% underclock to the GPU basically caused a linear performance drop in almost all of the 16 games that were tested.

This is really quite elementary.
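
As a rough sanity check of that underclock test (hypothetical numbers again, not the logged results from the article, and assuming frame time scales inversely with clock):

```python
# Hypothetical numbers, not the article's logs: if the GPU is the limiter,
# cutting its clock by a third should cut the frame rate by roughly a third;
# cutting the CPU's clock instead should barely move it.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms, gpu_ms = 12.0, 25.0
print(fps(cpu_ms, gpu_ms))          # stock clocks:         40.0 fps
print(fps(cpu_ms, gpu_ms / 0.67))   # GPU underclocked 33%: ~26.8 fps (near-linear drop)
print(fps(cpu_ms / 0.67, gpu_ms))   # CPU underclocked 33%: 40.0 fps (no change)
```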

To me, having a $500+ video card and getting the same minimum framerate in many games that I would with a $200 card is stupid.
Are you under the mistaken impression that only the CPU can raise a minimum framerate?

There are options in between, you know.
Yep, the options are to buy the biggest monitor possible with the highest possible resolution, and the most powerful graphics card to feed it. For gaming, a CPU is far less important than a good display and a good GPU.

A 2 GHz dual-core CPU + Fermi + 30” LCD will be a far faster and more immersive gaming platform than a 4 GHz i7 + 4850 + 1080p LCD.

Maybe instead of worrying about CPUs so much, you might think about upgrading your display?

When I refer to minimum framerate I don't mean just one single dip. I am talking about continuously lower framerates in intensive parts of a game, and low dips that affect playability.
The more continuous the duration is, the more it impacts the average framerate. That's my point.

Games like Ghostbusters and Red Faction Guerrilla are even worse when it comes to minimum framerate, becoming basically unplayable at times with my CPU at 2.0.
These games are but fringe titles in the grand scheme of things. 95% of games out there will cause the GPU to be the primary bottleneck if you configure them to run at their highest playable settings.
 

jlee

Lifer
Sep 12, 2001
My (stock) i5 750 made a definite difference over an E5200 @ 3.24 GHz (Sapphire Vapor-X 4890). I should've benched a few games before and after... oh well.
 

toyota

Lifer
Apr 15, 2001
Sorry BFG10K, but that's just more nonsense. I ran the Far Cry 2 bench at 1920x1080, very high settings, and 2x AA, which is a very GPU-limited scenario for a 192 SP GTX 260, and I still saw a massive difference in framerates from 3.8 to 2.0 on my CPU.

Yeah, I guess those 5% don't mean much unless you are actually playing them. Gee, 95% of the roads here don't have many potholes, but the road I drive on does, so do you think the other 95% means squat at that point?

There are plenty of games I could show you benchmarks for, either from my own tests or from other sites. All you will do is make up excuses though, so what's the point?
 

toyota

Lifer
Apr 15, 2001
My (stock) i5 750 made a definite difference over an E5200 @ 3.24 GHz (Sapphire Vapor-X 4890). I should've benched a few games before and after... oh well.
It doesn't matter, because even if you had numbers he would say you needed to have at least 8x AA or some other nonsense.
 

BFG10K

Lifer
Aug 14, 2000
Sorry BFG10K, but that's just more nonsense. I ran the Far Cry 2 bench at 1920x1080, very high settings, and 2x AA, which is a very GPU-limited scenario for a 192 SP GTX 260, and I still saw a massive difference in framerates from 3.8 to 2.0 on my CPU.
If you don't want to do it properly then just say so, but don't pretend you’re doing it properly.

There are plenty of games I could show you benchmarks for, either from my own tests or from other sites. All you will do is make up excuses though, so what's the point?
We've already been over this repeatedly in other threads. Every time you show such results, I show you the same games at higher settings where the GPU matters more, but you refuse to acknowledge what I’m showing you.

Again, I never claimed the CPU makes no difference, just that the GPU is overwhelmingly more important in the vast majority of situations when the games are configured to run at their highest playable settings.

It doesn't matter, because even if you had numbers he would say you needed to have at least 8x AA or some other nonsense.
If 8xAA is playable then of course it should be used. That’s one of the main purposes of a graphics card – to render better visuals.

People with 4 GHz i7 systems don’t trot around their games at 1680x1050 with no AA. If they do then they need a better monitor and a better graphics card, or they don’t know how to configure their system properly.
 

toyota

Lifer
Apr 15, 2001
If you don't want to do it properly then just say so, but don't pretend you’re doing it properly.


We've already been over this repeatedly in other threads. Every time you show such results, I show you the same games at higher settings where the GPU matters more, but you refuse to acknowledge what I’m showing you.

Again, I never claimed the CPU makes no difference, just that the GPU is overwhelmingly more important in the vast majority of situations when the games are configured to run at their highest playable settings.


If 8xAA is playable then of course it should be used. That’s one of the main purposes of a graphics card – to render better visuals.

People with 4 GHz i7 systems don’t trot around their games at 1680x1050 with no AA. If they do then they need a better monitor and a better graphics card, or they don’t know how to configure their system properly.
And again you have missed the point. Most people buy high-end GPUs for better framerates first and better visuals second. How many threads are started where people say they want 8x or 16x AA? Yet there are plenty of threads where people are complaining about framerate performance and wanting more. Ideally almost everyone wants great framerates with great visuals, of course.

In addition to a good video card, most modern games require a decent CPU, and some games even require a really good CPU (Prototype, GTA 4, RF Guerrilla, Ghostbusters, ARMA 2, BFBC 2, just to name a few) if you want smooth gameplay. I am playing some of those games that most certainly need a fast CPU. You keep playing those other 95% of games, though.
 

Zebo

Elite Member
Jul 29, 2001
I've shown in the CPU forum that an Athlon X2 5000 with a GTX 280 was way faster in games than an i7 with a 7800 GTX. This is nothing new: video cards are most important for games; processors can wait.
 

toyota

Lifer
Apr 15, 2001
I've shown in the CPU forum that an Athlon X2 5000 with a GTX 280 was way faster in games than an i7 with a 7800 GTX. This is nothing new: video cards are most important for games; processors can wait.
LOL, a 7800 GTX compared to a GTX 280. Has a comparison that silly ever even come up in the forums? There would be many games where that i7 and a GTX 260 would be better than the 5000 X2 and GTX 280, though. There are some games where the 5000 X2 doesn't even deliver very good framerates at all.
 

Zebo

Elite Member
Jul 29, 2001
LOL, a 7800 GTX compared to a GTX 280. Has a comparison that silly ever even come up in the forums? There would be many games where that i7 and a GTX 260 would be better than the 5000 X2 and GTX 280, though. There are some games where the 5000 X2 doesn't even deliver decent minimum framerates at all.

The point was what was out 3 years ago and what one should upgrade given limited choices. The answer was the video card: everything was playable with the new video card.
 

toyota

Lifer
Apr 15, 2001
The point was what was out 3 years ago and what one should upgrade given limited choices.
Oh, I see. The 5000 X2 is now pitiful in many current games, though. Again, a GTX 260 + i7 setup would offer better overall playability in many more modern games than a GTX 280 + 5000 X2 setup.
 