CPU or GPU for games? - Intel E6850 Bottleneck Investigation


BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
The reason I say this is that there appear to be differences between a dual core running at 3.6 GHz and a quad core running at 2.4 GHz, according to some benchmarks I've seen. Since the quad is still faster despite its lower clock speed, it stands to reason that adding or removing cores can matter differently depending on how a game is programmed. In single-threaded games, clock speed was all that mattered.
Thus I just wanted to see whether, in your results, changing the number of cores would have mattered in any particular game, since it may change how the game behaves.
You’re still not getting it. In any given situation, you need X amount of CPU power to saturate a graphics card so it becomes the primary bottleneck. X is derived from the CPU’s clock-speed, number of cores, IPC, etc. Once you reach X, increasing the CPU power by increasing the number of cores will do nothing.

For my particular tests, I demonstrated X was often reached with just two cores running at 2 GHz. Throwing in more cores simply pushes the CPU further past X, and will have no impact since the graphics card was already the primary bottleneck.

Or to put it another way, if the number of cores is making a difference, it means the GPU isn’t the primary bottleneck. That wasn’t the case in my tests, because I demonstrated the GPU was the primary bottleneck.
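As a toy illustration of that reasoning (all per-frame costs below are invented, and the assumption that CPU work scales with cores × clock is a deliberate simplification):

# Toy bottleneck model: each frame needs some CPU time and some GPU time,
# and whichever stage is slower sets the frame rate.  "X" from the post is
# the point where the CPU's per-frame cost drops below the GPU's.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when the slower of the two stages dictates pacing."""
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 20.0  # invented GPU cost per frame at high settings (a 50 fps cap)

# Crude assumption for illustration only: CPU work divides perfectly across
# cores and scales linearly with clock speed.
for cores, ghz in [(1, 2.0), (2, 2.0), (2, 3.6), (4, 2.4)]:
    cpu_ms = 60.0 / (cores * ghz)  # arbitrary 60 ms single-core / 1 GHz workload
    print(f"{cores} core(s) @ {ghz} GHz -> {fps(cpu_ms, GPU_MS):.1f} fps")

# Output: only the single-core case falls short of X (33.3 fps); the dual core
# at 2 GHz and everything above it all print the same GPU-capped 50 fps.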

Perhaps I'm not being clear about what I mean by standardizing. When I build a system I'm looking for a certain minimum level of performance, and anything extra is gravy.
That minimum level of performance might be a triple 2560x1600 screen setup at high settings with 2xAA 16xAF at over 60 fps average, or it could be a 1360x768 screen at medium with 0xAA 0xAF that has enough fps to feel smooth. If a game cannot perform at the required minimum, that means some component needs upgrading to play that game. So what I want to know is playability at some fixed standard which I can extrapolate to the parts I use when building or upgrading.
Sure, but this level of standardizing is based on the GPU, not on the CPU. Most performance changes come from reducing or increasing graphics detail. This is my point – it’s the available GPU power, far more than the CPU, that determines how playable a game is and what settings can be used.

Thus the problem I have with your test is that it is too variable: it doesn't tell me whether the equipment I buy can perform at the level I want throughout most games. You are using a subjective test based on what you consider playable. I think this basically involves sacrificing fps to raise the graphics options as high as they can go until the game becomes unplayable to you. You also lower resolution or certain settings to make a game playable.
In other words, playability is your constant when it really should be the thing you are measuring (in the form of fps at some standard chosen for all games).
I agree it’s subjective, but again I’d challenge everyone to always configure their graphics card to the highest playable settings. I’d also encourage everyone to buy the biggest monitors they can afford instead of running crappy 1680x1050 displays with 4 GHz i7 CPUs.

I agree that minimum FPS isn't a perfect indicator of performance. However more data is better than less data, and a lower minimum can show that the game has become unplayable.
No, not always. A run can show a lower split-second minimum purely from benchmarking noise while the benchmark run as a whole has the higher framerate, and therefore the higher average. If you base your decision on that minimum, it’s completely misguided.

In the absence of a benchmark plot putting a minimum into context, an average is the best single number you can use.
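To make the noise point concrete, here is a small sketch with fabricated frametime logs; the numbers exist purely to show the arithmetic:

# Two fabricated benchmark runs, logged as per-frame times in milliseconds.
# Run B is clearly faster overall but contains one split-second hitch.

run_a = [22.0] * 600                # steady ~45 fps throughout
run_b = [16.7] * 599 + [50.0]       # ~60 fps with a single 50 ms spike

def avg_fps(frametimes_ms):
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def min_fps(frametimes_ms):
    return 1000.0 / max(frametimes_ms)

for name, run in (("A", run_a), ("B", run_b)):
    print(f"run {name}: average {avg_fps(run):.1f} fps, minimum {min_fps(run):.1f} fps")

# Run B posts the lower minimum (20 fps vs 45 fps) despite being the better
# run as a whole; judged on the minimum alone, you would pick the wrong setup.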
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
Yeah, that's because the GPU was holding me back, so a faster processor would do nothing in that situation. If I dropped a 5870 or Fermi into my system, all of those games would show very large performance gains. A quad-core OTOH would show practically no benefit, as I will show soon (I'm typing up the article now).

In your new test could you show graphs like HardOCP does, or at least provide a minimum?
At the very least we'd like a subjective affirmation that the game was still just as playable as with the quad/stock clock speeds (not just the average fps). Although I've noted my qualms with your methodology in one of my previous posts, your results could still be helpful in figuring out which games can be played on a system with a weak CPU and a good GPU if you could provide that kind of affirmation. Otherwise a small decrease in average fps could mask a big difference in experience if the action slows to a crawl during busy scenes but otherwise remains normal.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
yes most people want to get the most framerate possible at their native res. most people are not concerned with insane levels of AA though.
Putting AA aside, a faster CPU won’t help with higher game detail settings. A faster CPU won’t deliver faster DX10 or DX11 performance either.
and cranking the AA is certainly no excuse to buy a $400 video card when even a $300 video card would be noticeably bottlenecked by the cpu.
It won’t be bottlenecked by the CPU when AA is being used. But then I actually know this given I tested a GTX260+ against a GTX285 on an E6850 and got about a 30% performance gain on average.

But in the same titles the CPU was making little to no difference to performance. So yeah, the difference between even a GTX260+ and GTX285 in real world gaming is quite large, and easily justifies the extra cost.
overall playability is MUCH more important and whether you accept it or not the cpu plays a huge role in that.
No it doesn’t.

with having your cpu at 2.0 if you mean no change in framerate by "no difference" then you are mistaken or lying. framerate is most certainly affected in most games.
I’ve actually tested a quad-core i5 750 against an E6850 in around 100 games in real-world gaming (a combination of subjective and objective testing). How many quad-core CPUs have you tested, Toyota? How many games have you tested, Toyota?
while you are doing that I will try to enjoy ALL my games. also in most of the threads where people are complaining about performance it usually involves one of those "fringe games". of course your advice would be to go play the other 95% of games or just ignore those low framerates and increase the AA until the gpu is the bottleneck. lol
I’ll be enjoying my games far more because I have a better monitor than you and I run far higher settings than you do.

I also know my i5 750 is making little to no difference to that fact because I've actually tested one, and I actually game on one.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Putting AA aside, a faster CPU won’t help with higher game detail settings. A faster CPU won’t deliver faster DX10 or DX11 performance either.

It won’t be bottlenecked by the CPU when AA is being used. But then I actually know this given I tested a GTX260+ against a GTX285 on an E6850 and got about a 30% performance gain on average.

But in the same titles the CPU was making little to no difference to performance. So yeah, the difference between even a GTX260+ and GTX285 in real world gaming is quite large, and easily justifies the extra cost.

No it doesn’t.


I’ve actually tested a quad-core i5 750 against an E6850 in around 100 games in real-world gaming (a combination of subjective and objective testing). How many quad-core CPUs have you tested, Toyota? How many games have you tested, Toyota?

I’ll be enjoying my games far more because I have a better monitor than you and I run far higher settings than you do.

I also know my i5 750 is making little to no difference to that fact because I've actually tested one, and I actually game on one.
I can put my cpu at 2.0 and see the difference and in those more cpu intensive games that you call "fringe games" playability is noticeably affected. other sites have shown the difference between cpus but all you do is make excuses about their results. its a waste of time with you so why bother running or linking to benchmarks?

keep living in your fantasyland that has a 2.0 Core 2 not being a bottleneck for any game or gpu. perhaps you should notify some of those game developers since a cpu like that is below the recommended requirements to play many newer games. and we all know that recommended requirements are usually the real world minimums for decent gameplay.
 
Last edited:

bryanW1995

Lifer
May 22, 2007
11,144
32
91
holy crap toyota!!! Go prove him wrong already. Get bfg10k to give you a list of games that he tested and his actual results, duplicate his system EXACTLY, and prove him wrong. If you're not willing to spend all the time/money/effort to do that and instead prefer to pull benchmarks from other sites, don't call him a liar. If you believe so strongly that he is wrong get off your ass and PROVE IT. BFG10k didn't say that no games will ever be bottlenecked by his 2.0 core 2, just that, at the settings he played at, the primary bottleneck the vast majority of the time is the gpu.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
holy crap toyota!!! Go prove him wrong already. Get bfg10k to give you a list of games that he tested and his actual results, duplicate his system EXACTLY, and prove him wrong. If you're not willing to spend all the time/money/effort to do that and instead prefer to pull benchmarks from other sites, don't call him a liar. If you believe so strongly that he is wrong get off your ass and PROVE IT. BFG10k didn't say that no games will ever be bottlenecked by his 2.0 core 2, just that, at the settings he played at, the primary bottleneck the vast majority of the time is the gpu.
I have proved it numerous times and all he does is give bullshit excuses. either the game is not important or I dont have enough AA or some other lame shit is all he can come up with. please stay out of the thread if all you want to do is flame because even you know part of what he is saying is crap. you take every opportunity to jump on me even when I am correct and you know it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
I can put my cpu at 2.0 and see the difference and in those more cpu intensive games that you call "fringe games" playability is noticeably affected. other sites have shown the difference between cpus but all you do is make excuses about their results. its a waste of time with you so why bother running or linking to benchmarks?
I’m simply contesting that if higher detail levels were used, a lot of the differences would be reduced or even nullified. I’m also encouraging everyone to buy the biggest monitors possible, and configure their games to run at the highest detail levels.

Additionally, it’s really easy to focus on the handful of fringe titles where the CPU has an impact and ignore the overwhelming swathe of games where the GPU has the biggest impact.

It’s overwhelming alright, like a 95/5 ratio.

Let me put it to you this way: on my E6850 I’ve used an 8800 GTS (640 MB), 8800 Ultra, GTX260+, and a GTX285. After each upgrade I witnessed a large benefit which dramatically enhanced performance and/or image quality in almost every game I own (100+ titles). Had I stayed with that CPU, I would’ve seen exactly the same benefit from a Fermi.

But now I moved to an i5 750 with my GTX285, and I witnessed practically no performance change - subjectively or objectively - in any of my games.
keep living in your fantasyland that has a 2.0 Core 2 not being a bottleneck for any game or gpu.
Again, if you think the framerate logs are lying then the burden of proof is with you.
perhaps you should notify some of those game developers since a cpu like that is below the recommended requirements to play many newer games. we all know that recommended requirements are usually the real world minimums for decent gameplay.
It made no difference to 2009 titles like Fear 2, Wolfenstein and Call of Juarez 2 that I tested. It also made no subjective difference to Cryostasis, a game so GPU demanding that I have to drop back to 1680x1050 with no AA, and it still slideshows in places on my GTX285. I know it’s the GPU because dropping the resolution dramatically improves the performance.

But then the minimum requirement for these games is a Pentium 4, so that’s hardly surprising.
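A rough sketch of that resolution-drop check, with invented framerates standing in for real logs:

# Crude bottleneck triage: run the same scene at two resolutions.  If the
# framerate climbs sharply once pixels are removed, the GPU was the limit;
# if it barely moves, the CPU was.  The figures below are invented.

def classify(fps_high_res: float, fps_low_res: float, threshold: float = 1.25) -> str:
    """Label the run GPU-bound if dropping resolution helps by at least `threshold`."""
    return "GPU-bound" if fps_low_res / fps_high_res >= threshold else "CPU-bound"

print(classify(fps_high_res=24.0, fps_low_res=41.0))  # big jump -> GPU-bound
print(classify(fps_high_res=33.0, fps_low_res=35.0))  # barely moves -> CPU-bound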

As for some of your games:
Dragon Age: Core 2 Duo 1.6 GHz.
Ghostbusters - any Core 2 Duo.
GTA 4 - Core 2 Duo 1.8 GHz.
Arma 2 – Core 2 Duo 2 GHz.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I’m simply contesting that if higher detail levels were used, a lot of the differences would be reduced or even nullified. I’m also encouraging everyone to buy the biggest monitors possible, and configure their games to run at the highest detail levels.

Additionally, it’s really easy to focus on the handful of fringe titles where the CPU has an impact and ignore the overwhelming swathe of games where the GPU has the biggest impact.

It’s overwhelming alright, like a 95/5 ratio.

Let me put it to you this way: on my E6850 I’ve used an 8800 GTS (640 MB), 8800 Ultra, GTX260+, and a GTX285. After each upgrade I witnessed a large benefit which dramatically enhanced performance and/or image quality in almost every game I own (100+ titles). Had I stayed with that CPU, I would’ve seen exactly the same benefit from a Fermi.

But now I moved to an i5 750 with my GTX285, and I witnessed practically no performance change - subjectively or objectively - in any of my games.

Again, if you think the framerate logs are lying then the burden of proof is with you.

It made no difference to 2009 titles like Fear 2, Wolfenstein and Call of Juarez 2 that I tested. It also made no subjective difference to Cryostasis, a game so GPU demanding that I have to drop back to 1680x1050 with no AA, and it still slideshows in places on my GTX285. I know it’s the GPU because dropping the resolution dramatically improves the performance.

But then the minimum requirement for these games is a Pentium 4, so that’s hardly surprising.

As for some of your games:
Dragon Age: Core 2 Duo 1.6 GHz.
Ghostbusters - any Core 2 Duo.
GTA 4 - Core 2 Duo 1.8 GHz.
Arma 2 – Core 2 Duo 2 GHz.

try playing most of those games at decent settings with min requirements whether gpu or cpu and get back to me. you can get by with low end cpus when using low end gpus because many settings also affect the cpu. If I try to run GTA 4 or some other games at the highest settings they are unplayable at times with my cpu at 2.0 but improve greatly when raising my cpu speed, and that's with just a gtx260 at very gpu limited settings to begin with. people that buy high end gpus are doing so to get the most out of them. if you want to crank the settings and get good gameplay then having a Core 2 2.0 is NOT going to give you a good experience in many modern games. and even in games that it does play okay a faster cpu will still give a large boost if using a very high end gpu.
 
Last edited:

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Let me put it to you this way: on my E6850 I’ve used an 8800 GTS (640 MB), 8800 Ultra, GTX260+, and a GTX285. After each upgrade I witnessed a large benefit which dramatically enhanced performance and/or image quality in almost every game I own (100+ titles). Had I stayed with that CPU, I would’ve seen exactly the same benefit from a Fermi.

But now I moved to an i5 750 with my GTX285, and I witnessed practically no performance change - subjectively or objectively - in any of my games.

PCGH benched a dozen CPUs with a GTX285 in World of Warcraft. Going from a Core 2 Duo E6600 to a 3.5 GHz Core i7 raised framerates 74%, from 38 fps to 66 fps. The CPU alone did this on otherwise identical systems with a GTX285.

http://www.pcgameshardware.com/aid,...nchmarks-with-Phenom-II-and-Core-i5/Practice/

Doesn't it make sense to pair the cpu and gpu accordingly? You wouldn't want to bottleneck either of them. WoW is probably the most CPU-intensive video game, with online battles between thousands of individual entities. If you were to bench 1680x1050 0xAA 0xAF with these two CPUs and a GTX285 in Street Fighter 4, Half-Life 2, FEAR 1, Resident Evil 5, or other games that are less taxing on the CPU, you would likely see a <10% gain going to the 3.5 GHz Core i7.
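For reference, the quoted 74% is simply the relative gain between those two averages:

# Relative gain from the quoted WoW numbers (E6600 vs 3.5 GHz Core i7, GTX285).
e6600_fps, i7_fps = 38.0, 66.0
gain = (i7_fps - e6600_fps) / e6600_fps
print(f"{gain:.0%}")  # -> 74%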

What cpu & gpu a guy needs depends on what games and apps the person plans to play.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
actually there are games way more cpu intensive than WoW. Prototype, GTA 4, Ghostbusters, Anno 1404, RF Guerrilla, Bad Company 2 and Arma 2 all come to mind. of course BFG10K doesnt consider those games important.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I can see BFG10K's point about the GPU being bottlenecked at the highest detail settings.

I just question if those detail settings are beyond the point of actually being useful to the human eye.

Heck, even BFG10K himself made a comment that Fermi's new 32xAA was a waste if it was just edge based.

P.S. The only type of anti-aliasing I know about is "edge anti-aliasing". However, I am always willing to learn.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I can see BFG10K's point about the GPU being bottlenecked at the highest detail settings.

I just question if those detail settings are beyond the point of actually being useful to the human eye.

Heck, even BFG10K himself made a comment that Fermi's new 32xAA was a complete waste if it was just edge anti-aliasing.

P.S. The only type of anti-aliasing I know about is "edge anti-aliasing". However, I am always willing to learn.
I only worry about running more AA once I have the game playing like I want. if the cpu is my limiting factor for having smooth gameplay or framerates in general then buying another video card just to crank AA is the least of my concerns. games like BC 2 , RF Guerrilla, Ghostbusters, GTA 4 and some others would play no better if I upgraded my gpu at this point.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I only worry about running more AA once I have the game playing like I want. if the cpu is my limiting factor for having smooth gameplay or framerates in general then buying another video card just to crank AA is the least of my concerns. games like BC 2 , RF Guerrilla, Ghostbusters, GTA 4 and some others would play no better if I upgraded my gpu at this point.

Yep, to me AA is just icing on the cake.

In fact, If I can turn AA up I almost consider it a sign I bought too much video card.

However, I reserve the right to change my opinion on this if I begin to learn more about image quality and find myself wrong. It might be I need to do more gaming comparisons.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Yep, to me AA is just icing on the cake.

In fact, If I can turn AA up I almost consider it a sign I bought too much video card.

However, I reserve the right to change my opinion on this if I begin to learn more about image quality and find myself wrong. It might be I need to do more gaming comparisons.
also him claiming that his Core 2 Duo at 2.0 has no effect is laughable. in several games it almost becomes the absolute limitation with my gtx260 even at 1920. Far Cry 2, GTA 4, Ghostbusters and RF Guerrilla DO NOT budge when overclocking my gtx260 while my cpu is at 2.0. GTA 4, Ghostbusters and RF Guerrilla become basically unplayable at times too with my cpu at 2.0. of course that would apply to games like Prototype also. even in other games the framerates are usually noticeably affected even if gameplay isnt.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
The article is up at the usual spot.
try playing most of those games at decent settings with min requirements whether gpu or cpu and get back to me. you can get by with low end cpus when using low end gpus because many settings also affect the cpu.
I never claimed it would be a good experience; I simply debunked your comment about “telling the publishers I’m running below their specs”, or some such nonsense that you stated.

What I actually claimed was that in the 17 games I tested, 2 GHz had little to no performance impact, something you appear to have a hard time understanding. I’m also telling you I actually use the settings that I play the games at, including finishing them from start to finish.

And all of those titles were released in 2006 or later, and three of them came out in 2009.

Yes, there are a handful of titles that need the fastest CPU possible, but these are a drop in the bucket compared to the massive benefits reaped from a faster GPU and a large high resolution display.
people that buy high end gpus are doing so to get the most out of them.
Yeah, which means they won’t be using 1680x1050 with no AA or AF, like you keep linking to.
if you want to crank the settings and get good gameplay then having a Core 2 2.0 is NOT going to give you a good experience in many modern games. and even in games that it does play okay a faster cpu will still give a large boost if using a very high end gpu.
Not in the 17 games I tested, all released in 2006 or later. Not in any other 2009 title I have and play. In fact not in any of the 100+ titles I have installed right now under active play rotation.

if the cpu is my limiting factor for having smooth gameplay or framerates in general then buying another video card just to crank AA is the least of my concerns.
So upgrade your CPU and show us the results. What are you waiting for? That's what I did.

games like BC 2 , RF Guerrilla, Ghostbusters, GTA 4 and some others would play no better if I upgraded my gpu at this point.
That is simply untrue.

also him claiming that his Core 2 Duo at 2.0 has no effect is laughable.
Not in any of the tests I ran, no. Again, if you think the benchmark logs are lying then put up evidence, otherwise retract your claims.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
The article is up at the usual spot.

I never claimed it would be a good experience; I simply debunked your comment about “telling the publishers I’m running below their specs”, or some such nonsense that you stated.

What I actually claimed was that in the 17 games I tested, 2 GHz had little to no performance impact, something you appear to have a hard time understanding. I’m also telling you I actually use the settings that I play the games at, including finishing them from start to finish.

And all of those titles were released in 2006 or later, and three of them came out in 2009.

Yes, there are a handful of titles that need the fastest CPU possible, but these are a drop in the bucket compared to the massive benefits reaped from a faster GPU and a large high resolution display.

Yeah, which means they won’t be using 1680x1050 with no AA or AF, like you keep linking to.

Not in the 17 games I tested, all released in 2006 or later. Not in any other 2009 title I have and play. In fact not in any of the 100+ titles I have installed right now under active play rotation.
keep on claiming that. most of the rest of the people on here know that its nonsense and are moving on to much faster cpus so they can fully enjoy their high end gpus in games like Bad Company 2, GTA 4 and many others. really if your Core 2 at 2.0 can do everything like you claim then why bother with an i5 like you did? just pop that Core 2 duo back in there at 2.0 and have fun trying to play those "fringe games" as you call them.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
PCGH benched a dozen CPUs with a GTX285 in World of Warcraft. Going from a Core 2 Duo E6600 to a 3.5 GHz Core i7 raised framerates 74%, from 38 fps to 66 fps. The CPU alone did this on otherwise identical systems with a GTX285.

http://www.pcgameshardware.com/aid,...nchmarks-with-Phenom-II-and-Core-i5/Practice/
1680x1050 with no AA or AF? LMFAO.

What cpu & gpu a guy needs depends on what games and apps the person plans to play.
Agreed, but my point is that if you always configure your games to run at the highest playable settings, it’ll be the GPU that has the biggest impact, by far.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
1680x1050 with no AA or AF? LMFAO.


Agreed, but my point is that if you always configure your games to run at the highest playable settings, it’ll be the GPU that has the biggest impact, by far.
so I guess you laugh at these 1680 benches too? I guess in your mind going to 1920 will somehow make enough difference to make up for the fact that high end cpus are getting 2 and 3 times the framerates. not to mention that many of the lower end cpus are not even providing a decent average, much less minimum, framerate to begin with.

http://www.pcgameshardware.com/aid,...arked-in-Anno-1404-Dawn-of-Discovery/Reviews/

http://www.pcgameshardware.com/aid,...System-Requirements-and-Screenshots/Practice/

even if these were at 1920 you would still make excuses like you do with me. something that you cant understand is that if a cpu is only providing 20-30fps while others are getting 50, 60, or 70fps then having a faster gpu with those low end cpus is NOT going to get you a better gaming experience.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
I just went and played a steam HL2 mod I have called "Insurgency".

As you can imagine it is a game based on older technology.

With AA disabled it does seem certain edges are more "jagged" than what I have observed in Fallout 3 GOTY.

This makes me wonder if some image differences are game-specific. In other words, it could be that not all games benefit equally from AA.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I just went and played a steam HL2 mod I have called "Insurgency".

As you can imagine it is a game based on older technology.

With AA disabled it does seem certain edges are more "jagged" than what I have observed in Fallout 3 GOTY.

This makes me wonder if some image differences are game-specific. In other words, it could be that not all games benefit equally from AA.
of course every game is different. FEAR was the first game I ever played where I felt at least some AA was mandatory. that game had edges that just crawled terribly. FEAR was also the first game where I had to turn on vsync because the screen tearing was horrific when the lights were flickering. its best just to take it one game at a time and adjust your settings accordingly.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
keep on claiming that. most of the rest of the people on here know that its nonsense and are moving on to much faster cpus so they can fully enjoy their high end gpus in games like Bad Company 2, GTA 4 and many others.
Many others? You’ve probably listed ten games in total since you began this whole crusade, and many of those were at questionable settings, I might add. 10 games is not “many games”. I can list 100 games where the GPU has a far bigger impact, at playable settings.

really if your Core 2 at 2.0 can do everything like you claim then why bother with an i5 like you did?
Because certain people online didn’t understand what I was showing them, hence instantly thought I was CPU “bottlenecked” whenever I wrote articles. So the upgrade is more about public appearances than anything else, and also because I was bored with my old setup.

Now your turn: you harp on about “many games” being held back by your CPU, so why aren’t you upgrading to a quad-core like you preach?

I guess in your mind going to 1920 will somehow make enough difference to make up for the fact that high end cpus are getting 2 and 3 times the framerates.
That depends on the game and the graphics card.

not to mention that many of the lower end cpus are not even providing a decent average, much less minimum, framerate to begin with.
Again, I never claimed the CPU makes no difference, only that the number of titles it makes a difference in is a drop in the bucket.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Many others? You’ve probably listed ten games in total since you began this whole crusade, and many of those were at questionable settings, I might add. 10 games is not “many games”. I can list 100 games where the GPU has a far bigger impact, at playable settings.


Because certain people online didn’t understand what I was showing them, hence instantly thought I was CPU “bottlenecked” whenever I wrote articles. So the upgrade is more about public appearances than anything else, and also because I was bored with my old setup.

Now your turn: you harp on about “many games” being held back by your CPU, so why aren’t you upgrading to a quad-core like you preach?


That depends on the game and the graphics card.


Again, I never claimed the CPU makes no difference, only that the number of titles it makes a difference in is a drop in the bucket.
I havent upgraded because I am at the point where both a gpu and cpu are needed to satisfy my performance needs. gpus are just way too expensive right now so I am waiting for the first decent deal to pop up.

so using your logic we should just ignore those 10 games and play all the other games that dont need a decent cpu? I guess I should not play Cryostasis, Crysis, Warhead, and STALKER Clear Sky since those are the only games my gpu cant max.

wow now that I think about it, that actually means there are MORE games where my cpu at 2.0 would affect gameplay than my gtx260 would. thanks BFG10K for helping to prove my point even better.
 
Last edited:
Jan 24, 2009
125
0
0
I'm inclined to agree with BFG, for the most part. Of course, this is entirely based on my subjective experiences and no recorded testing on my part (aside from synthetic benchmarks, which will obviously show a difference).

Now, clearly I don't have the most powerful graphics card or processor (although I like to think my processor is fairly decent). But, on moving to my Athlon II X4 620 @ 3.4 GHz from my X2 5200+ @ 3 GHz, I have not noticed much of a performance difference in many of the games I have played on both or play regularly. Those being: Crysis Wars, both Modern Warfares, both Mass Effects, Company of Heroes, Empire Total War, Men of War, Bioshock (while I have the second one, I have not tried it on my X2 5200+), Left 4 Dead, Team Fortress 2 and Borderlands.

I did however notice a VERY perceivable difference in both Dragon Age and Red Faction: Guerrilla. But those are the only two. So obviously there is a difference in some games, but for most, not really. I'm inclined to believe BFG entirely based on his selection of games tested.
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
1680x1050 with no AA or AF? LMFAO.


Agreed, but my point is that if you always configure your games to run at the highest playable settings, it&#8217;ll be the GPU that has the far biggest impact, by far.

I think we all understand your point that the GPU has the "biggest impact" if you want good eye candy with high settings. However I, and I think all gamers, care about the total experience, and that means enough frames per second to have a smooth game experience and not a slideshow. In that regard I think their test is perfectly valid in that it shows the fps you can expect from a CPU no matter what kind of GPU you get. If that's your CPU then that's the maximum average fps you'll get in that game even should you get a faster video card (unless of course there is some kind of programming that can use the GPU instead of the CPU, i.e. DirectCompute or something). Sure, with a 5870 you can now have 3 screens and maxed settings, but if the fps doesn't cut it then that game is still unplayable in my book.

Thus the real value in these benchmarks is not figuring out whether the gpu is more important than the CPU, but figuring out what the absolute limit of each is in a particular game. The resolution however does play a big part, since it's possible, depending on how the game is programmed, for it to affect CPU performance as well as GPU performance (though not necessarily). This is because a program could process just the objects you see, which would mean higher resolution equals more work for the CPU since you see more, or it could just process everything, in which case no matter the resolution the CPU is doing basically the same thing. So resolution should be established ahead of time.
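As a toy sketch of that last point (all entity counts and per-entity costs below are hypothetical), the CPU's share of the work may or may not move with how much of the scene is on screen:

# Whether showing more of the scene costs the CPU anything depends on how the
# engine is written: one that simulates everything does the same work every
# frame, while one that only fully processes visible objects does more work
# as more comes into view.  All coefficients here are hypothetical.

def cpu_ms_simulate_everything(total_entities: int, visible_entities: int) -> float:
    return 0.02 * total_entities                              # flat cost, view-independent

def cpu_ms_process_visible(total_entities: int, visible_entities: int) -> float:
    return 0.005 * total_entities + 0.02 * visible_entities   # culling pass + visible work

total, narrow_view, wide_view = 2000, 300, 900
for model in (cpu_ms_simulate_everything, cpu_ms_process_visible):
    print(f"{model.__name__}: narrow {model(total, narrow_view):.0f} ms, "
          f"wide {model(total, wide_view):.0f} ms")

# The first model prints 40 ms either way; the second grows from 16 ms to
# 28 ms as more entities become visible, so its CPU load is view-dependent.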
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
I think we all understand your point that the GPU has the "biggest impact" if you want good eye candy with high settings. However I, and I think all gamers, care about the total experience, and that means enough frames per second to have a smooth game experience and not a slideshow. In that regard I think their test is perfectly valid in that it shows the fps you can expect from a CPU no matter what kind of GPU you get. If that's your CPU then that's the maximum average fps you'll get in that game even should you get a faster video card (unless of course there is some kind of programming that can use the GPU instead of the CPU, i.e. DirectCompute or something). Sure, with a 5870 you can now have 3 screens and maxed settings, but if the fps doesn't cut it then that game is still unplayable in my book.

Thus the real value in these benchmarks is not figuring out whether the gpu is more important than the CPU, but figuring out what the absolute limit of each is in a particular game. The resolution however does play a big part, since it's possible, depending on how the game is programmed, for it to affect CPU performance as well as GPU performance (though not necessarily). This is because a program could process just the objects you see, which would mean higher resolution equals more work for the CPU since you see more, or it could just process everything, in which case no matter the resolution the CPU is doing basically the same thing. So resolution should be established ahead of time.
yep. if there were only a few fps difference then BFG10K would have a valid point. when you see high end cpus getting 2-3 times the framerates of lower end cpus at 1680, going to 1920 isnt going to change that enough to matter. also if a cpu is only providing 20-30fps then there is no point in exploring a different res because that cpu has already proved that it is not going to provide sufficient performance. really he doesnt get that and never will so I dont think there is any point in linking to or even providing benchmarks of your own.
 
Last edited: