CPU or GPU for games? - Intel E6850 Bottleneck Investigation

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
http://alienbabeltech.com/main/?p=13454

Conclusion

After all of this, it should be obvious why I’m still using my E6850: because there’s no need for anything faster. At reasonably high detail levels the GPU is by far the most important part of the equation in gaming, and hence my GTX285 is the primary bottleneck in every gaming situation I use it in.

[...]

If you have any kind of limited budget, sink as much money as you can into the graphics card, and also buy the biggest monitor with the biggest resolution you can afford.

Guess this will help future CPU vs GPU bottleneck discussions.

Any chance of including other CPU/GPU architectures in a future investigation?

Ty BFG10K

EDIT: I think this other article from Tom's Hardware is pertinent to the debate at hand, as it reinforces and complements BFG10K's article.

Part 1: Building A Balanced Gaming PC
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
Any chance of including other CPU/GPU architectures in a future investigation?
I hope to test a Fermi with an E6850 (at stock) as I have no doubt I'll still get a massive performance gain over my GTX285 with said processor.

I also want to eventually get a Westmere @ 3.46 GHz as I’m bored of my current CPU/platform, so I'll do a CPU scaling test with Fermi.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
I hope to test a Fermi with an E6850 (at stock) as I have no doubt I'll still get a massive performance gain over my GTX285 with said processor.

I also want to eventually get a Westmere @ 3.46 GHz as I’m bored of my current CPU/platform, so I'll do a CPU scaling test with Fermi.

Guess I better sit and wait then... :p
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
If you are doing this then please include the important min fps number. Also, I tend to find games much more demanding when played online with lots of people than with canned runs.

For example, a UT2004 DM canned run would give 200+ average fps on his machine, but play it online in a 32-player ONS match and you'll find you are CPU-bound, and with a 2 GHz CPU you'll have a min fps < 40.
 

WraithETC

Golden Member
May 15, 2005
1,464
1
81
Unless you play GTA4 on PC, this is true. GTA4 jumps up a steady 20-30 frames going from even a C2Q to an i7/i5. Granted, this probably says more about the coding of the game than the merits of the processor.

Another console port, Red Faction: Guerrilla, suffers from the same problem as GTA4, except that it's also very buggy anyway.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
Unless you play GTA4 on PC, this is true. GTA4 jumps up a steady 20-30 frames going from even a C2Q to an i7/i5. Granted, this probably says more about the coding of the game than the merits of the processor.

Another console port, Red Faction: Guerrilla, suffers from the same problem as GTA4, except that it's also very buggy anyway.

Not at high resolutions. Check this:

http://www.tomshardware.com/reviews/build-balanced-platform,2469-11.html

The Q9550 is pretty much on par with the i7 920 at 2560x1600.
 

ShreddedWheat

Senior member
Apr 3, 2006
386
0
0
Thanks for the great article; the Tom's Hardware piece was also great. Would have loved to see some benchmarks using the 5850 or 5870, but the articles still make the point.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I also want to eventually get a Westmere @ 3.46 GHz as I’m bored of my current CPU/platform

When do these chips come out? Next month right?

I can't wait to see the reviews (especially once quiet overclocking on the stock box coolers is taken into consideration).
 

Indus

Lifer
May 11, 2002
16,601
11,405
136
Awesome article, but I do have one question.

If I force AA via the driver in Crysis I get a better framerate than forcing AA in the game. What gives?

Also, by forcing 4x AA in the driver, overclocking gives a 17% increase in framerate. However, if I force it in the game, stock is better.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
If you are doing this then please include the important min fps number. Also, I tend to find games much more demanding when played online with lots of people than with canned runs.
The benchmarks I run are meticulously selected and/or designed so they’re as stressful as possible and don’t produce inflated scores. That’s why my results tend to be much lower than others.

As for minimums, I think they’re only useful if there’s a plot putting them into context. Otherwise they’re just a single data point and could be benchmarking noise.
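For example, a plot of instantaneous fps over the run shows whether a low minimum is a sustained dip or a single stray frame. A rough Python sketch of the idea (assuming a hypothetical frametimes.csv log with one frame time in milliseconds per line):

# Rough sketch: turn a frame-time log into fps-over-time so the minimum
# can be seen in context instead of as a single number.
# Assumes a hypothetical "frametimes.csv" with one frame time (ms) per line.
import matplotlib.pyplot as plt

with open("frametimes.csv") as f:
    frame_ms = [float(line) for line in f if line.strip()]

fps = [1000.0 / ms for ms in frame_ms]        # instantaneous fps per frame
avg = sum(fps) / len(fps)
worst = min(fps)
low_1pct = sorted(fps)[len(fps) // 100]       # rough 1% low, less noisy than the absolute minimum

print(f"avg {avg:.1f} fps, min {worst:.1f} fps, 1% low {low_1pct:.1f} fps")

plt.plot(fps)                                 # the plot that puts the minimum in context
plt.xlabel("frame")
plt.ylabel("fps")
plt.show()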
For example, a UT2004 DM canned run would give 200+ average fps on his machine, but play it online in a 32-player ONS match and you'll find you are CPU-bound, and with a 2 GHz CPU you'll have a min fps < 40.
I didn’t test UT2004 but I play it at 2560x1600 with 8xS and 16xAF. At that setting I can almost guarantee the GPU will be the primary bottleneck, even with 32 players.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
When do these chips come out? Next month right?

I can't wait to see the reviews (especially when quiet overclocking is taken into consideration with stock box coolers).
Q4 09 release and Q1 10 shipping. I can’t wait to finally get some new toys in the CPU/platform department, and I think I’ll get something like this:

http://ixbtlabs.com/articles3/mainboard/gigabyte-p55-ud3-i55p-p1.html
http://www.ixbt.com/mainboard/gigabyte/p55-ud3/board.jpg

I love solid mainstream motherboards that don’t cost the Earth.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
If I force AA via the driver in Crysis I get a better framerate than forcing AA in the game. What gives?

Also, by forcing 4x AA in the driver, overclocking gives a 17% increase in framerate. However, if I force it in the game, stock is better.
Crysis (and likely Aion too) can’t have driver AA forced in it. When you do so you’re actually running the game without AA, so overclocking the CPU helps more.

When you use in-game AA you’re actually getting AA, so the bottleneck is with the GPU.
 

WildW

Senior member
Oct 3, 2008
984
20
81
evilpicard.com
I'm somewhat confused by the Crysis results, which showed that both CPU and GPU underclocking slowed down performance.

The way I figure it, given a CPU and GPU of a given performance, one or the other will be running flat out, and the other will be underutilized to some extent, be it the CPU or the GPU... unless they're in perfect balance, I guess.

You slowed down the GPU and showed that performance dropped. You slowed down the CPU and showed that performance dropped. Fine.

So, if underclocking and overclocking illustrate the same thing, it logically follows that a faster GPU would increase performance. The GPU is running flat out while the CPU still has more to give.

However, since slowing the CPU also slows the game down, a faster CPU would also increase performance. Clearly the CPU is running flat out and not keeping the graphics card busy enough.

Ouch, my brain hurts. Those two statements can't both be true. I'm not saying you're wrong - clearly you've put a lot of thought into all this. But why am _I_ wrong?
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
However, since slowing the CPU also slows the game down, a faster CPU would also increase performance. Clearly the CPU is running flat out and not keeping the graphics card busy enough.

Ouch, my brain hurts. Those two statements can't both be true. I'm not saying you're wrong - clearly you've put a lot of thought into all this. But why am _I_ wrong?

Crysis is weird! :D

Anyway, I guess BFG10K's test rig is his personal rig, so I bet he has all sorts of crap installed in there - maybe that could explain it.

Hey BFG10K, what are your test conditions? Did you use a clean install? If not, did you stop all the other processes while running? (I'm sorry if these questions sound like I'm calling you a noob benchmarker - it's nothing of the sort - I actually believe benchmarking under that kind of sanitized setup isn't representative of real daily-use conditions).

Anyway, a 33% CPU clock decrease (or a 50% clock increase, if you prefer) only changes performance by about 8% (or 9%), compared to the steady ~30% performance decrease that goes with a 30% GPU power decrease.
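Spelled out in a rough Python sketch (using the rounded figures quoted above, so treat the exact numbers loosely):

# Rough sketch of the scaling arithmetic above (figures as quoted, not re-measured).
def pct_change(new, old):
    return (new - old) / old * 100.0

cpu_clock_change = pct_change(2.0, 3.0)   # -33%: 3.0 GHz underclocked to 2.0 GHz
cpu_perf_change = -8.0                    # roughly the performance change that goes with it
gpu_power_change = -30.0                  # GPU underclocked by ~30%
gpu_perf_change = -30.0                   # performance falls almost 1:1 with the GPU

print(f"CPU: {cpu_clock_change:.0f}% clock -> {cpu_perf_change:.0f}% performance "
      f"(sensitivity {cpu_perf_change / cpu_clock_change:.2f})")
print(f"GPU: {gpu_power_change:.0f}% power -> {gpu_perf_change:.0f}% performance "
      f"(sensitivity {gpu_perf_change / gpu_power_change:.2f})")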

Here are 2 more Alien Babel Tech articles which show (or at least point to the fact) that gaming doesn't need to be expensive:

Performance Meets Value – Core i7 vs. Penryn vs. Phenom II


AMD Value Platform Analysis Part 3: Athlon II X2 250 vs Phenom II X2 550


Of course, if you do other heavy activities on your PC, like encoding, the i7 still beats the crap out of Core 2 and Phenom II, and Phenom II still beats the crap out of Athlon II. But for gaming, especially at high resolutions and on single-GPU systems, the differences aren't that big.

Hoping to see a 5870 thrown into the next comparison, apoppin (does he still post in here?).
 

Indus

Lifer
May 11, 2002
16,601
11,405
136
I'm somewhat confused by the Crysis results, which showed that both CPU and GPU underclocking slowed down performance.

The way I figure it, given a CPU and GPU of a given performance, one or the other will be running flat out, and the other will be underutilized to some extent, be it the CPU or the GPU... unless they're in perfect balance, I guess.

You slowed down the GPU and showed that performance dropped. You slowed down the CPU and showed that performance dropped. Fine.

So, if underclocking and overclocking illustrate the same thing, it logically follows that a faster GPU would increase performance. The GPU is running flat out while the CPU still has more to give.

However, since slowing the CPU also slows the game down, a faster CPU would also increase performance. Clearly the CPU is running flat out and not keeping the graphics card busy enough.

Ouch, my brain hurts. Those two statements can't both be true. I'm not saying you're wrong - clearly you've put a lot of thought into all this. But why am _I_ wrong?

Ten years ago I heard something that rings quite true today. It was when I first started playing EverQuest, back when it came out, and I couldn't play it with my old PCI card. Someone said 3D gaming is 20% CPU, 80% video card. This article just proved that.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
Anyway, I guess BFG10K's test rig is his personal rig, so I bet he has all sorts of crap installed in there - maybe that could explain it.

Hey BFG10K, what are your test conditions? Did you use a clean install? If not, did you stop all the other processes while running? (I'm sorry if these questions sound like I'm calling you a noob benchmarker - it's nothing of the sort - I actually believe benchmarking under that kind of sanitized setup isn't representative of real daily-use conditions).
Yes it’s my own rig, but I always keep it clean anyway. The only two tray icons I have running are my GPU one and the generic sound one from Windows. The only significant background process I have running is the iTunes one, but that has absolutely no impact on benchmarking.

The Crysis result is a little strange, but there are a few others where the two figures add up to more than 33%. This probably means that if you were to underclock both by 33%, you’d lose more than 33% performance.
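One way to picture it is a toy model where the CPU and GPU work on a frame only partially overlap: slowing either part lengthens the frame, but the part doing the bulk of the work hurts far more. A rough Python sketch (my own illustration with made-up numbers, not how the article was measured):

# Toy model, purely illustrative: per-frame CPU and GPU work that only partially overlaps.
def fps(cpu_ms, gpu_ms, overlap=0.5):
    # overlap=1.0 -> fully pipelined (frame time = the slower of the two)
    # overlap=0.0 -> fully serial   (frame time = sum of both)
    frame_ms = max(cpu_ms, gpu_ms) + (1.0 - overlap) * min(cpu_ms, gpu_ms)
    return 1000.0 / frame_ms

base = fps(10.0, 30.0)               # hypothetical: 10 ms CPU work, 30 ms GPU work per frame
slow_cpu = fps(10.0 * 1.5, 30.0)     # CPU underclocked by a third (its work takes 1.5x longer)
slow_gpu = fps(10.0, 30.0 * 1.5)     # GPU underclocked by a third

print(f"baseline   {base:.1f} fps")
print(f"slower CPU {slow_cpu:.1f} fps ({(slow_cpu - base) / base * 100:+.0f}%)")
print(f"slower GPU {slow_gpu:.1f} fps ({(slow_gpu - base) / base * 100:+.0f}%)")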
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
I find your methodology a little messed up, especially since your conclusion about quad cores is unsupported by your evidence. For you to say adding two more cores does not help or hurt gaming performance, you'd need to actually test just that. So you'd need something like a quad core where you could disable two cores, re-enable them, and test the difference between the two configurations. At most your evidence shows that overclocking a CPU makes little difference in most games.

Second, your methodology is to always increase the graphics detail to the very limit of the GPU, either by increasing the resolution or by adding increasing amounts of AA/SSAA, so of course the GPU will be the limit in such a case. I think the goal of most gamers is to always have playable frame rates without sacrificing image quality or resolution. As such we need to know what fps you are actually getting in your real-world games, and your video card settings need to be more standardized.
In other words, there is already a flaw in your methodology once you had to use 1680x1050 for Stalker: Clear Sky instead of 1920x1200, since that means the game is unplayable due either to your GPU or your CPU. I assume the former, though.

Thus the main thing I'd like to see is not average fps, but minimum fps. If the GPU settings you play with dip below 30 fps, they are not settings I would use to start with. If lowering the GPU settings does not increase framerates in such cases, it is a sign that the CPU is acting as the limitation.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
His numbers don't add up either. He acts like his CPU at 2.0 GHz has very little effect even in games like Far Cry 2, which is completely false. I put my CPU at 2.0, which is faster than his at 2.0, and the difference was night and day even at 1920 with just a 192sp GTX260. I would have had even better results with a quad too.


Far Cry 2
Settings: Demo(Ranch Long), 1920x1080 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(2x), VSync(No), Overall Quality(Very High), Vegetation(Very High), Shading(Very High), Terrain(Very High), Geometry(Very High), Post FX(High), Texture(Very High), Shadow(Very High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)

E8500 @ 2.0

Total Frames: 11808, Total Time: 284.02s
Average Framerate: 41.57
Max. Framerate: 84.90 (Frame:1851, 26.47s)
Min. Framerate: 23.28 (Frame:5683, 125.34s)


E8500 @ 3.8

Total Frames: 16568, Total Time: 284.01s
Average Framerate: 58.34
Max. Framerate: 114.58 (Frame:4, 0.04s)
Min. Framerate: 36.90 (Frame:7835, 125.13s)


That's a 40% increase in average and a 58% increase in minimum framerates from running my E8500 at 3.8 instead of 2.0. With a faster card and a faster quad the difference would have been even greater. :eek:
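As a quick sanity check, those percentages follow directly from the figures quoted above (rough Python sketch):

# Quick check of the percentages using the numbers quoted in this post.
def gain(new, old):
    return (new - old) / old * 100.0

avg_20, min_20 = 41.57, 23.28   # E8500 @ 2.0 GHz
avg_38, min_38 = 58.34, 36.90   # E8500 @ 3.8 GHz

print(f"average: {gain(avg_38, avg_20):.1f}% faster")   # ~40%
print(f"minimum: {gain(min_38, min_20):.1f}% faster")   # ~58%
print(f"clock:   {gain(3.8, 2.0):.1f}% faster")          # 90%, so fps scaling is well short of linear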
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
His numbers don't add up either. He acts like his CPU at 2.0 GHz has very little effect even in games like Far Cry 2, which is completely false. I put my CPU at 2.0, which is faster than his at 2.0, and the difference was night and day even at 1920 with just a 192sp GTX260. I would have had even better results with a quad too.


Far Cry 2
Settings: Demo(Ranch Long), 1920x1080 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(2x), VSync(No), Overall Quality(Very High), Vegetation(Very High), Shading(Very High), Terrain(Very High), Geometry(Very High), Post FX(High), Texture(Very High), Shadow(Very High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)

E8500 @ 2.0

Total Frames: 11808, Total Time: 284.02s
Average Framerate: 41.57
Max. Framerate: 84.90 (Frame:1851, 26.47s)
Min. Framerate: 23.28 (Frame:5683, 125.34s)


E8500 @ 3.8

Total Frames: 16568, Total Time: 284.01s
Average Framerate: 58.34
Max. Framerate: 114.58 (Frame:4, 0.04s)
Min. Framerate: 36.90 (Frame:7835, 125.13s)


That's a 40% increase in average and a 58% increase in minimum framerates from running my E8500 at 3.8 instead of 2.0. :eek:

Well, to be fair, your CPU is going from 2.0 to 3.8 while his is going from 2.0 to 3.0, which is a 90% increase vs. a 50% increase. However, I agree it does show that the CPU makes a large difference, though not quite linearly. Can you try a few more clock speeds in between those two? I'm thinking perhaps there are certain CPU limits where raising or lowering the clock won't make much of a difference until you get to the next limit point.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Well, to be fair, your CPU is going from 2.0 to 3.8 while his is going from 2.0 to 3.0, which is a 90% increase vs. a 50% increase. However, I agree it does show that the CPU makes a large difference, though not quite linearly. Can you try a few more clock speeds in between those two? I'm thinking perhaps there are certain CPU limits where raising or lowering the clock won't make much of a difference until you get to the next limit point.
Well, he points out in various threads that his CPU at 2.0 would not even be much of a bottleneck for a GTX285 or 5870. Of course that claim is hilarious, because the CPU makes a noticeable difference in several games even with just a 192sp GTX260 at very GPU-limited settings.

I don't really want to run benchmarks all day long, since I pretty much gave up trying to convince people like him. For most games pretty much any modern dual-core CPU will do, but there are several games where a quad is really needed. IMO anyone buying a 5850 or faster should have a quad or at least a very fast dual core.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
OK, so what did we learn? For many first-person shooters released in the last 4 years, the CPU is not as important as the video card at high resolution when FSAA is enabled. I have no trouble believing that.

Now let's mix it up a bit with some RTS, MMOs, sims (flight and racing) including some poor console ports, shall we? I suggest:

GTA IV
Anno 1404
Supreme Commander
X3: Reunion, final mission
X3: Terran Conflict, Aldrin system
MS Flight Sim X
World of Warcraft, peak time in Dalaran, running 2 clients
EVE Online, running 4 clients, one of which is in an 800+ man fleet fight
Evony (a Flash web game), running 2 or more browser tabs

It doesn't come as a big shock that most first-person shooters don't need the CPU to do very much other than compute whether you properly shot something in the face. That problem was adequately solved to run on Pentium IIIs back with Half-Life. That doesn't mean other game situations won't choke an overclocked i7.