CPU effects on games, I disagree.

xFlankerx

Member
Aug 30, 2005
29
0
0
I just came from reading the AnandTech Oblivion CPU testing. I beg to differ with the conclusion that you need a fast CPU to get the most out of your GPU. I'm more of the opinion that the CPU and GPU handle different tasks. We know that the CPU is much stronger than the GPU in MOST cases. Now, the CPU is what provides the GPU with instructions for what the GPU needs to do. If the CPU is already feeding the GPU more instructions than it can handle, then there is no point in having a faster CPU. See, there was hardly a difference in FPS between a 1.8GHz A64 and a 2.6GHz A64 in areas that are GPU-intensive. The Oblivion gates are by far the most intensive part of the game, GPU-wise, and there every one of the single-card solutions was itself the bottleneck. Only dual X1900XTXs were able to take advantage of a faster CPU.
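
To illustrate what I mean, here's a toy model (my own, not anything from the AnandTech article; all the millisecond numbers are made up). Each frame, the CPU prepares work and the GPU renders it, and whichever stage is slower sets the framerate:

```cpp
#include <algorithm>
#include <cstdio>

// Whichever stage takes longer per frame sets the framerate.
double fps(double cpuMsPerFrame, double gpuMsPerFrame) {
    return 1000.0 / std::max(cpuMsPerFrame, gpuMsPerFrame);
}

int main() {
    // Hypothetical gate-style scene: GPU-bound at ~25 ms/frame.
    std::printf("Gate, slow CPU: %.1f FPS\n", fps(12.0, 25.0)); // 40.0
    std::printf("Gate, fast CPU: %.1f FPS\n", fps(8.0, 25.0));  // 40.0 -- no gain
    // Hypothetical town scene: CPU-bound on NPC AI.
    std::printf("Town, slow CPU: %.1f FPS\n", fps(30.0, 15.0)); // 33.3
    std::printf("Town, fast CPU: %.1f FPS\n", fps(21.0, 15.0)); // 47.6 -- big gain
    return 0;
}
```

Real engines overlap CPU and GPU work, so this is a simplification, but it's the gist of the bottleneck argument.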

On the other hand, the Town benchmarks are influenced quite a bit by the CPU. This is easily explained when you notice that in a town there is a lot for the CPU to calculate that is not present at the Oblivion gates. There are hordes of NPCs in a city, and the CPU has to control every single one of them (Oblivion NPCs lead lives of their own; it's by far the most complicated AI in a game yet). Thus, the stronger the CPU, the better your frames per second in a crowded area will be. The way all the video cards' results clump together here somewhat reinforces the point.

The Dungeon benchmarks did surprise me. There is a decent amount for the GPU to render, though not as much as in the other areas. However, there is very little for the CPU to simulate, and yet we see quite a bit of improvement with a faster CPU. I'm not entirely sure how to explain that.

My point is, basically, that "You don't need a fast CPU to get the most out of your GPU." If anything, it should be the other way around, since the GPU is what can't keep up with the CPU. I think that the conclusion should be more like, "You need a fast GPU to handle the graphics of the game, but you will suffer a drop in FPS if your CPU cannot keep up with the AI in the game in crowded areas."
 

n19htmare

Senior member
Jan 12, 2005
275
0
0
You do need a faster CPU if the CPU is what's bottlenecking the performance. Take this for example.

My brother is a die-hard WoW player. His rig was my old P4 2.8 @ 3.2 with a 6600GT video card; he netted about 28-30 FPS. Then I bought the Pentium M and overclocked it to 2.5. THAT was all I changed, and his FPS jumped to 60 INSTANTLY.
A faster CPU sure did him wonders.
 

996GT2

Diamond Member
Jun 23, 2005
5,212
0
76
With a low to mid-range graphics solution, CPU speed really doesn't matter much, as the graphics card isn't fast enough to make the CPU the bottleneck. But with high-end setups like X1900XT Crossfire, the GPU is outpacing slower CPUs, making the CPU a bottleneck and slowing down performance. So yes, I guess the OP is right in that most people will never be in a situation where the CPU is their bottleneck. But those lucky ones with dual X1900XTs just might see that if their CPU isn't top of the line or overclocked.
 

xFlankerx

Member
Aug 30, 2005
29
0
0
Indeed, it wasn't until the X1900XT Crossfire setup that there was a significant improvement in FPS from a faster CPU.
 

Markbnj

Elite Member
Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
You need to understand that CPU performance and GPU performance affect different parts of the frame rendering pipeline. There is a little overlap, because the CPU is involved in pumping data to the GPU some of the time.

The GPU renders vertex lists and applies textures, lighting, etc., then renders the frame to the device. The CPU assembles the vertex lists before they are sent to the GPU, and is also responsible for everything else in the game that isn't rendering-related, i.e. user input, the game model, sound, etc. In general, each of these areas has to be updated once per frame:

gather input
update model
> collision detection
> scoring
> object interaction
> etc.
calculate view
> rotate
> translate
> scale
> calculate effects
> build vertex list
render frame
render sounds

The GPU can only offload one specific step of this pipeline, even though that step is very complex and breaks down into many smaller steps. Regardless of how fast the GPU is, the CPU can limit performance if it can't get the rest of the work done fast enough to allow the GPU to render at its maximum framerate. This is one of the reasons for the PhysX add-on board. Physics of motion and "artificial intelligence" are two of the most CPU-intensive areas of a game engine.
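
For the curious, here's a rough single-threaded sketch of that loop in C++. Every function here is a placeholder stub of my own, not any real engine's API; the point is just that renderFrame() is the only step the GPU can take over:

```cpp
#include <cstdio>
#include <vector>

struct Vertex { float x, y, z; };

void gatherInput()  { /* poll keyboard, mouse, etc. */ }
void updateModel()  { /* collision detection, scoring, object interaction, AI */ }

std::vector<Vertex> calculateView() {
    // Rotate, translate, and scale objects, calculate effects, and build
    // the vertex list the GPU will consume. Stubbed out here.
    return {};
}

void renderFrame(const std::vector<Vertex>& verts) {
    // The only step that can be offloaded: hand the vertex list to the GPU.
    (void)verts;
}

void renderSounds() { /* mix and queue audio */ }

int main() {
    // One iteration per frame: everything except renderFrame() stays on
    // the CPU and has to finish before the GPU gets its next batch.
    for (int frame = 0; frame < 3; ++frame) {
        gatherInput();
        updateModel();
        auto verts = calculateView();
        renderFrame(verts);
        renderSounds();
    }
    std::printf("ran 3 frames\n");
    return 0;
}
```

If updateModel() takes too long (hordes of NPCs, physics), renderFrame() gets called later and the GPU sits idle, no matter how fast it is.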
 

xFlankerx

Member
Aug 30, 2005
29
0
0
And the only time even a 1.8GHz CPU was not enough was in the Town, with an abnormal number of NPCs.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: xFlankerx
My point is, basically, that "You don't need a fast CPU to get the most out of your GPU." If anything, it should be the other way around, since the GPU is what can't keep up with the CPU. I think that the conclusion should be more like, "You need a fast GPU to handle the graphics of the game, but you will suffer a drop in FPS if your CPU cannot keep up with the AI in the game in crowded areas."

This paragraph completely contradicts itself.

You say, "you will suffer a drop in FPS if your CPU cannot keep up with the AI in the game in crowded areas". Please explain exactly how you square this with "you don't need a fast CPU to get the most out of your GPU"? If the CPU is limiting the possible FPS, then you're not getting the most out of your GPU, are you?
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
The oblivion write up was crap in places.

More FPS is always nice, but it's only really annoying when the framerate drops too low to be playable. You set your graphics settings so the game is playable in worst-case scenarios. As such, the only test of any use out of the three they ran was the Oblivion gate one, the worst case. There it was seen that even the highest-end graphics cards were the bottleneck. In the town you benefit from a faster CPU, but even with the slowest CPU they tested the town is playable, while the same hardware is barely playable at the Oblivion gate.

Nice write up, utterly wrong conclusion.
 

PingSpike

Lifer
Feb 25, 2004
21,756
600
126
I'd like to see the test done with single cards myself. I still don't consider multicard solutions mainstream in any shape or form.
 

BrownTown

Diamond Member
Dec 1, 2005
5,314
1
0
The tradeoff between CPU and GPU is simply a problem of bottlenecking. What you have to do is look at what your budget is and get a CPU and GPU that are equally powerful in your price range. If you are upgrading, then you have to realize that only certain games, or certain areas, will be drastically affected. For example, if you have a moderate CPU and upgrade to a very powerful GPU solution, then you will do a lot better in big open areas with no physics or AI, but in enclosed corridors, or other areas with little eye candy, you will be CPU-bound. However, nowadays I'd say that most people are gonna be more GPU-bound than CPU-bound in most games. Especially when you consider that the CPU sorta sets a maximum cap on your FPS, and if that cap is high enough (like 60), then it doesn't matter that a better CPU would set an even higher cap (at say 80), because you won't be able to tell the difference. It may seem great to have an average of 95 FPS over 70, but if the minimum FPS is below 60 only like 2% of the time, the two will look exactly the same.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
The CPU does a lot more in a game than just feed instructions to a GPU. Besides physics and AI calculations, it also feeds the geometry data, for example, so combining a high-end graphics card with a mediocre CPU is a sure way to waste performance.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
Originally posted by: munky
The CPU does a lot more in a game than just feed instructions to a GPU. Besides physics and AI calculations, it also feeds the geometry data, for example, so combining a high-end graphics card with a mediocre CPU is a sure way to waste performance.

Look at the framerates in the Oblivion tests: only in the town is the CPU the bottleneck, and even with a 1.8GHz A64 it's not a serious bottleneck; the FPS are more than adequate compared to the Oblivion gate, which is the scene that will decide what graphics settings you're using.

Oblivion is so bloody graphics hungry that the CPU is almost irrelevant.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Originally posted by: n19htmare
You do need a faster CPU if the CPU is what's bottlenecking the performance. Take this for example.

THAT was all I changed, and his FPS jumped to 60 INSTANTLY.
A faster CPU sure did him wonders.

What resolution does he play @? If it's anything above 1024x768, then the CPU ain't gonna make much of a difference in most pixel-shader-intensive games. If it's an older game wherein said pixel shaders aren't used as much, then sure, the CPU might have an impact, but hardly a 100% increase, mate. Most modern games show barely a 1 FPS difference @ 1280x1024 resolutions or above; the GPU is taking care of things @ that point.

 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Meh, I'm working on a thread right now that should explain exactly what a CPU limitation in a game is ;)

To the guy above me, did you see the AT Conroe FEAR benchmarks? 'Nuff said ;)

 

xFlankerx

Member
Aug 30, 2005
29
0
0
Thank you Bobthelost, you've basically been saying what I thought.

Originally posted by: Barkotron
This paragraph completely contradicts itself.

You say, "you will suffer a drop in FPS if your CPU cannot keep up with the AI in the game in crowded areas". Please explain exactly how you square this with "you don't need a fast CPU to get the most out of your GPU"? If the CPU is limiting the possible FPS, then you're not getting the most out of your GPU, are you?

I'm saying that the increase in FPS that you see in the Town benchmark IS NOT FROM THE X1900XT STRETCHING ITS LEGS. It's from the CPU. Thus, you are getting no more out of your GPU than you would otherwise, but your CPU is handling things better and providing better performance.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Look at it from this point: except for the Oblivion gate benchmark, the X1900 CF was getting the same FPS as a single X1900XTX in the town and dungeon benches. That's a severe CPU limitation already. And even at the Oblivion gate, the single X1900XTX got a slight increase with a faster CPU. What that means, basically, is that for any game that's less GPU-intensive than the Oblivion gate scene (which goes for about 99% of current games), the 1.8GHz A64 will be a limitation for a single X1900XTX. And a mediocre P4 will be even more of a limitation.
 

n19htmare

Senior member
Jan 12, 2005
275
0
0
Originally posted by: poohbear
Originally posted by: n19htmare
You do need a faster CPU if the CPU is what's bottlenecking the performance. Take this for example.

THAT was all I changed, and his FPS jumped to 60 INSTANTLY.
A faster CPU sure did him wonders.

What resolution does he play @? If it's anything above 1024x768, then the CPU ain't gonna make much of a difference in most pixel-shader-intensive games. If it's an older game wherein said pixel shaders aren't used as much, then sure, the CPU might have an impact, but hardly a 100% increase, mate. Most modern games show barely a 1 FPS difference @ 1280x1024 resolutions or above; the GPU is taking care of things @ that point.


He plays at 1600x1200. I dunno, but his frame rate did indeed double. Can't say what and how, but it did... he got a new HDD too, but I don't know how a new HDD would affect the FPS.
 

DJExoddus

Member
Jan 28, 2005
28
0
0
As far as measurable FPS goes, a faster CPU can have a big effect. But as far as playability goes, I think it makes no difference unless you go from the extremely low end to the extremely high end. I have a Socket A Sempron at 1.75GHz or 2.1GHz, depending on whether I am gaming or not. When I put my X800XL in my friend's computer, an X2 3800+ at stock, Oblivion was just as playable as in my computer.

I guess it might make a difference depending on who you are, but for me as long as it is a playable frame rate I will play regardless of what is in the computer.
 

xFlankerx

Member
Aug 30, 2005
29
0
0
Originally posted by: munky
Look at it from this point: except for the Oblivion gate benchmark, the X1900 CF was getting the same FPS as a single X1900XTX in the town and dungeon benches. That's a severe CPU limitation already. And even at the Oblivion gate, the single X1900XTX got a slight increase with a faster CPU. What that means, basically, is that for any game that's less GPU-intensive than the Oblivion gate scene (which goes for about 99% of current games), the 1.8GHz A64 will be a limitation for a single X1900XTX. And a mediocre P4 will be even more of a limitation.

The X1800XL was getting the same FPS as the X1900XT CF setup, lol, even when paired with an FX-60. Are you going to tell me that an FX-60 is not enough to keep up with these video cards? "A slight increase" isn't worthy of note, and is well within the margin of error.

And I've already told you that even though the Town benchmark wasn't as demanding as the Oblivion Gate benchmark on the GPU, it was the most pressure ANY GAME ON THE MARKET could have put on those processors. Your statement should be restated to say, "What that means, basically, is that for any game that's less CPU-intensive than the Oblivion Town scene (which goes for about 99% of current games), the 1.8GHz A64 will be just fine with an X1900XTX." And a 3.4GHz P4, while not quite up to par with its AMD counterparts, is hardly a "mediocre" CPU.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
In most of my games, I can't really tell any difference between having my processor at its stock 2GHz and the overclocked 3GHz speed during normal gameplay. The general tendency seems to be for the maximum framerates to shoot up while the minimums stay about the same, so the benchmark scores increase but the game feels no different. This varies a lot between games, though. In UT2004 and the original Deus Ex, I saw very noticeable improvements with the overclock (even with graphics settings at which the overall framerates are low), and Oblivion seems to be similar in that respect.

I think for most games it's safe to say that the CPU speed doesn't matter much, even though there are some glaring exceptions like UT2004 and Oblivion.

I'll see if a dual core makes any difference once I get my 165 rig overclocked and tested, but based on my experience I'm not expecting much of an improvement in games.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I gave up on my thread (too busy atm :p), but essentially I've discovered I'm totally CPU-bound in Far Cry, even at 1024/everything maxed/water at ultra/4xAA/4xAF, with my 6600GT and Tbred-B at 2.1GHz. I can drop the 6600GT from 600/1150 down to 500/900 and see no FPS difference, I can switch between 4xAA and 2xAA and see no FPS difference at either clock speed, and I even dropped my 6600GT to 300/500 and my FPS didn't change (although the max fell when staring at the ground)...

Similarly, when I enable bots in UT2004 I see a massive FPS drop (usually over 20 FPS) between the flyover at the start and pressing 'play', when the CPU has to start controlling, say, 15 other players :)...

I only see a limited CPU bottleneck in FEAR (my minimum FPS is slightly but noticeably lower than an equivalent card on a 3000+ A64) at 1024/all the effects on/0xAA/8xAF/no soft shadows, but if I start turning stuff down from there (i.e. dropping the res) I quickly fall well behind an A64 with the same card.

:)
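
In other words, my underclocking test boils down to this (a hypothetical sketch; the FPS numbers below are made up, not my actual readings):

```cpp
#include <cmath>
#include <cstdio>

// Classify the bottleneck from two FPS readings: one with the GPU at stock
// clocks, one underclocked. If FPS barely moves when the GPU is slowed
// down, the GPU wasn't the limiting factor -- the CPU was.
const char* bottleneck(double fpsStock, double fpsUnderclocked) {
    return std::fabs(fpsStock - fpsUnderclocked) < 2.0 ? "CPU-bound"
                                                       : "GPU-bound";
}

int main() {
    std::printf("%s\n", bottleneck(48.0, 47.5)); // CPU-bound, like my Far Cry case
    std::printf("%s\n", bottleneck(48.0, 36.0)); // GPU-bound
    return 0;
}
```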
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: xFlankerx
How about we ignore on-the-fly benchmarks and focus more on the actual benchmarks?

:confused:

The FEAR stuff I mentioned was based on the FEAR benchmark. Far Cry is a PITA to benchmark, but I fail to see how using the in-game FPS counter at the exact same spot is less relevant for the purposes of this thread. As far as UT2k4 is concerned, I again used the in-game FPS log, and I can't see why that isn't relevant... I merely look at the score when the game has loaded but before I press 'play', then the score after I press play. I'm in the exact same spot, so what the GPU has to render hasn't changed; therefore the ~20 FPS drop MUST be due to the increased load on the CPU from controlling the bots...

 

SlitheryDee

Originally posted by: xFlankerx
If the CPU is already feeding the GPU more instructions than it can handle, then there is no point in having a faster CPU.

You're right, but every CPU isn't ALWAYS able to feed the GPU more instructions than it can handle. You could be strolling down a hallway and be nicely GPU bottlenecked and then emerge in a room full of NPCs and have the tables turned quite nastily. These lapses into slideshowness are very annoying at best and totally ruin the immersive quality of the game at worst. I agree that you can have a fine time with a less powerful CPU 90% of the time. At the same time that remaining 10%, which almost necessarily occurs during the important moments in the game, is a very compelling reason to upgrade your CPU.