
GPU or CPU

Sentry11

Member
I am not saying it is a problem specific to COD5; it's just a good example to illustrate my point here.

What I find difficult to understand is that both COD4 and COD5 use the same graphics engine, yet the frame rates I get from the two games are different at the same graphics settings. What I have noticed is that in COD4 I can manage around 60 fps running my Q6600 at its stock speed of 2.4GHz, without overclocking. But the fps drops bitterly in COD5, which gives me somewhere around 30~32.

Having said all that, I am actually amazed at how my HD 3850 performs. After migrating from my previous system, which was only a Pentium D 945 with an ATi X1950 Pro (AGP version), to my current rig, a C2Q Q6600 with a PCI-Express HD 3850, the following games run pretty well with frame rates capped at 100 fps (no AA, no AF):

Battlefield 2
Battlefield 2142

Even with AA and AF applied, I can still manage 77~80 fps in both games, with BF 2142 slightly lower.

My question is: does the quad CPU at stock speed already suffice for good performance, or is the HD 3850 GPU what's contributing to the high fps?

Are leaves hurting the fps? COD5 is more outdoors with lots of jungle scenes, whereas COD4 is pretty much indoors with fewer leaves to deal with, which would make the HD 3850 stand out.
 
COD 4.5, aka COD 5, had some goodies updated to make it look better, hence it is more taxing. I would overclock that Q6600 you have there. They are usually great overclockers. Free performance upgrade. See how it runs with the overclock, then start thinking about a video card upgrade.
 
I'd imagine the Q6600 isn't the problem. What's most likely the problem is the 3850. I'd recommend looking into OCing your 3850 first.

Few games are really CPU limited. That said, cranking your Q6600 to 3GHz should be easy.
 
Originally posted by: katank
I'd imagine the Q6600 isn't the problem. What's most likely the problem is the 3850. I'd recommend looking into OCing your 3850 first.

Few games are really CPU limited. That said, cranking your Q6600 to 3GHz should be easy.

Same engine doesn't tell you anything about the number of polygons on screen, which is one of the main performance hits. Outdoor scenes tend to contain more polygons, so it does make sense that CoD5 runs slower. They could have done any number of tweaks to the engine as well.

Lower your settings or upgrade to a 4870 🙂 (even a 4850 is twice as fast as the 3850)

Also, generally speaking, quads provide less performance in most games than equivalent dual cores do, given that the duals are usually clocked higher. Not that the Q6600 is a slow CPU. In UT3 the Q6600 is 12% slower than an E8500; in Crysis, 7%. Small but potentially noticeable. And yes, increasing the clock speed should eliminate the gap.
 
Talking about the Q6600 and E8500 and the difference in clock speed, and thus their performance: is it because the Q6600's FSB is 1066MHz whereas the E8500's is 1333MHz?
 
Originally posted by: Sentry11
Talking about the Q6600 and E8500 and the difference in clock speed, and thus their performance: is it because the Q6600's FSB is 1066MHz whereas the E8500's is 1333MHz?

The FSB would make a difference too, yes, though not as great a one in most cases.
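For anyone curious about the arithmetic: on these chips the core clock is the FSB base clock times the multiplier (the 1066/1333 figures are the quad-pumped transfer rates; the base clocks are 266 and 333MHz). A quick sketch, assuming the stock multipliers of 9x for the Q6600 and 9.5x for the E8500:

```python
def core_clock_ghz(fsb_base_mhz: float, multiplier: float) -> float:
    """Core clock = FSB base clock x multiplier."""
    return fsb_base_mhz * multiplier / 1000.0

# Q6600: 1066 MT/s quad-pumped FSB -> 266 MHz base clock, 9x multiplier
print(core_clock_ghz(266, 9))    # ~2.4 GHz
# E8500: 1333 MT/s quad-pumped FSB -> 333 MHz base clock, 9.5x multiplier
print(core_clock_ghz(333, 9.5))  # ~3.16 GHz
```

So most of the E8500's edge comes simply from its higher core clock; the faster FSB contributes on top of that.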
 
Don't touch your CPU. Look at your graphics card. That's where you're going to get the biggest performance boost in your situation.
 
As a general rule, GPUs are the workhorse for most games. RTS games are the type that benefit the most from better CPUs, given all the calculations required for all the units.

For FPS games, skimp on the CPU and put the extra money towards a better GPU.
 
Originally posted by: ZzZGuy
As a general rule, GPUs are the workhorse for most games. RTS games are the type that benefit the most from better CPUs, given all the calculations required for all the units.

For FPS games, skimp on the CPU and put the extra money towards a better GPU.

Just for discussion's sake, isn't it true that the CPU is the first component to do all the calculations before passing things down to the GPU for rendering?
 
Originally posted by: Sentry11
Originally posted by: ZzZGuy
As a general rule, GPUs are the workhorse for most games. RTS games are the type that benefit the most from better CPUs, given all the calculations required for all the units.

For FPS games, skimp on the CPU and put the extra money towards a better GPU.

Just for discussion's sake, isn't it true that the CPU is the first component to do all the calculations before passing things down to the GPU for rendering?

The CPU does some 'framework' calculations before passing things onto the GPU for final rendering. It certainly doesn't do ALL the calculations, or there'd be no point in passing things on to the GPU.

Think of it more like this: the 'framework' calculations are like framing a house. Only the supporting beams are put in place. The GPU then comes along and does everything else: it puts up the walls, the insulation, the roof, the siding, the windows and doors, and all the fixtures, plumbing, electrical, flooring, painting, and trimwork required to actually make it a house.

So, you certainly need a CPU that's capable of handling the framing, but if the GPU isn't up to taking care of everything else, a more powerful CPU won't make a difference.
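The framing analogy can be put in toy numbers. In a pipelined renderer the CPU and GPU work on different frames at the same time, so whichever stage takes longer sets the frame time. A minimal sketch (the millisecond figures below are invented for illustration, not measurements from any of these games):

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    # When CPU and GPU work overlaps, the slower stage dictates frame time.
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound: halving CPU time per frame changes nothing.
print(fps(cpu_ms=10.0, gpu_ms=33.3))  # ~30 fps
print(fps(cpu_ms=5.0,  gpu_ms=33.3))  # still ~30 fps

# CPU-bound: the same CPU speedup doubles the frame rate.
print(fps(cpu_ms=20.0, gpu_ms=10.0))  # 50 fps
print(fps(cpu_ms=10.0, gpu_ms=10.0))  # 100 fps
```

This is why the advice in the thread splits: overclocking the Q6600 only helps if COD5 is on the CPU-bound side of this curve.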
 
Originally posted by: bearxor
Don't touch your CPU. Look at your graphics card. That's where you're going to get the biggest performance boost in your situation.

This is the truth. Upgrade your GPU and your frame rate will increase.

Changing the CPU will not improve your performance, since that 3850 is a very slow card.
 
Originally posted by: jRaskell
Originally posted by: Sentry11
Originally posted by: ZzZGuy
As a general rule, GPUs are the workhorse for most games. RTS games are the type that benefit the most from better CPUs, given all the calculations required for all the units.

For FPS games, skimp on the CPU and put the extra money towards a better GPU.

Just for discussion's sake, isn't it true that the CPU is the first component to do all the calculations before passing things down to the GPU for rendering?

The CPU does some 'framework' calculations before passing things onto the GPU for final rendering. It certainly doesn't do ALL the calculations, or there'd be no point in passing things on to the GPU.

Think of it more like this: the 'framework' calculations are like framing a house. Only the supporting beams are put in place. The GPU then comes along and does everything else: it puts up the walls, the insulation, the roof, the siding, the windows and doors, and all the fixtures, plumbing, electrical, flooring, painting, and trimwork required to actually make it a house.

So, you certainly need a CPU that's capable of handling the framing, but if the GPU isn't up to taking care of everything else, a more powerful CPU won't make a difference.

It's a very good analogy. And by this argument, we need a CPU which can handle 1023 tasks/sec and a GPU which can handle 4078 tasks/sec.
 
Yikes, it all depends on the game.

People..

Look at the Source engine! My processor is a HUGE bottleneck even at 2.7GHz, which is equal to a Q6600 at 2.8GHz. Even if I jack up my AA/AF to 4x I still get 251fps in CS:S's visual stress test. When I enable mat_queue_mode 2 in the console while playing TF2 or DOD:S my fps nearly doubles! The 3850 is certainly not a slow card. It might seem that way if you are playing a super shader-intensive game, but otherwise it's not the bottleneck. Even if I overclock my GPU I still get the same fps in DiRT. Crysis: Warhead might benefit, but in none of my other games is it the bottleneck.
 
I recently upgraded from an X2 4400+ @ 2.6GHz to an E8400 (stock for now). The difference is night and day. I would OC that Q6600 to 3GHz just to be sure it's not bottlenecking at default speeds.
 
Originally posted by: Scholzpdx
Yikes, it all depends on the game.

People..

Look at the Source engine! My processor is a HUGE bottleneck even at 2.7GHz, which is equal to a Q6600 at 2.8GHz. Even if I jack up my AA/AF to 4x I still get 251fps in CS:S's visual stress test. When I enable mat_queue_mode 2 in the console while playing TF2 or DOD:S my fps nearly doubles! The 3850 is certainly not a slow card. It might seem that way if you are playing a super shader-intensive game, but otherwise it's not the bottleneck. Even if I overclock my GPU I still get the same fps in DiRT. Crysis: Warhead might benefit, but in none of my other games is it the bottleneck.

Wait, are you telling him it's NOT his GPU? It's not a slow GPU by any means, but he'd see a much bigger framerate increase upgrading that compared to the CPU for MOST games, shader intensive or not. If he wants more fps, he'll need a better GPU... I don't think anyone could reasonably argue otherwise.

He's also talking about an engine that is much newer and more taxing than the Source engine, which is pretty outdated, IMO.
 