How Important Is A CPU For Gaming, Really?

1337n00b

Member
Oct 11, 2008
38
0
0
If you have components such as the Radeon HD 4800 series and 4 GB of DDR2-800, how much of a difference will a high-end Core 2 Duo make for gaming? For example, is it really worth it to get an E8000-series chip instead of an E4000-series one?

 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Depends on the game and the resolution. Generally the lower the resolution the more important the CPU becomes. And for certain games (WoW, Supreme Commander) the CPU & amount of RAM available are just as important as the GPU.
 

brencat

Platinum Member
Feb 26, 2007
2,170
3
76
We need to know whether or not you plan to OC. If not, the CPU choice could make a huge difference.

Also, some games make use of the larger cache of a chip like an E8400 -- MMORPGs, SupCom, World in Conflict, and many RTSs, for example. IMO, for shooters, the extra cache isn't as meaningful.

Given that you've chosen a decent card in the Radeon 4850/70 series, you should really try to get your chip's speed to 3GHz+ if you go with a budget chip, to avoid any bottlenecks.
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
They make a difference, but not as much as a good graphics card does. If the choice is between a high-end graphics card and a high-end CPU (for gaming), then the graphics card should generally win out. Most modern CPUs can handle pretty much any game you throw at them.

I would recommend spending at least $100 on your CPU. Below that, you are going to start to see some performance issues. Intel's Q6600 for $179 is a steal IMO.
 

vj8usa

Senior member
Dec 19, 2005
975
0
0
There are a handful of games out there that can be extremely CPU-limited at times. For instance, in 32-player TF2 games, my FPS will dip into the teens every now and then, regardless of graphics settings. Even OCed to 3GHz, my CPU with its 1MB of L2 cache holds me back.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
And for another point, go for the E5xxx series, not the old E4xxx series. The E5200 is 45nm, so it will run cooler and consume less energy, and from comments around the net it overclocks pretty consistently to around 3.2GHz (higher if you push the voltage a bit or get lucky). With its 2MB cache, that should give you quite a bit of performance for a low cost.
 

Liet

Golden Member
Jun 9, 2001
1,529
0
0
CPUs and graphics cards go hand-in-hand in that you can't expect adequate performance with only one good component out of the two. I remember upgrading to a fancy new 6600GT and being disappointed at the meager 5 fps increase. My old Athlon was really holding me back.
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
Originally posted by: Liet
CPUs and graphics cards go hand-in-hand in that you can't expect adequate performance with only one good component out of the two. I remember upgrading to a fancy new 6600GT and being disappointed at the meager 5 fps increase. My old Athlon was really holding me back.

Yeah, I know the feeling, lol. My A64 was super awesome when I got it, but that was before I started ripping all my music to FLAC (Free Lossless Audio Codec). Now whenever I'm playing games, even older games like Fable, the audio will pause for up to 15s between tracks, and if I try to use the in-game audio it just doesn't work: horrible stutter. But if I pause the game (which drops CPU usage), it's fine. I'm betting it's just bad coding, though; it didn't used to do this in the past, but I honestly can't remember if that was even on the same system, rofl.

My HD 2900 Pro could probably perform a lot better if I paired it with even an FX-60, just for the dual-core awesomeness. Too bad I can't find one anywhere. I'll just build a new system in Q1 next year, after Deneb launches, the kinks are out of X58/Nehalem, and prices on 775 hardware are dirt cheap with the new stuff starting to drop in price. Not sure what I'm going to build yet; I keep leaning back and forth and back again. I don't want to get crucified to a platform without an upgrade path for the CPU again, the way I did with my 939 board. I know Intel keeps production up for a while longer than AMD does, since they have higher fab capacity and more resources to support older hardware, but it's still fresh in my mind, with my hunt for an FX-60 lasting up until mid-August, when I just up and gave up.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I honestly believe that most people here put way too much importance on the CPU in a gaming rig. A middle-of-the-road CPU with a higher-end gaming card will give you a very, very good gaming experience in real-world situations.
 

Psynaut

Senior member
Jan 6, 2008
653
1
0
Originally posted by: Denithor
Depends on the game and the resolution. Generally the lower the resolution the more important the CPU becomes. And for certain games (WoW, Supreme Commander) the CPU & amount of RAM available are just as important as the GPU.

Does having more than 4 GB of RAM provide any benefits? What about in MMOs like EQ2 or WoW?
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: Markfw900

Stop using those links, they don't work. "Stop deeplinking" is all the page says.

Might be your browser's cache, anyone else get this problem?

Anyways,
GPU: http://www.anandtech.com/video/showdoc.aspx?i=3341&p=20
CPU: http://www.pcgameshardware.com..._CPUs_reviewed/?page=4
OC'd CPUs: http://www.pcgameshardware.com...rs_overclocked/?page=3
 

LS8

Golden Member
Jul 24, 2008
1,285
0
0
The processor definitely doesn't play the enormous role it once did in gaming performance, but it is still important.

I am still running a S939 Opteron @ 3.0GHz and I can play any game on the market. I typically play at 1680x1050.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
If you look at the graphs I linked above, you can see COD4 at 1920x1200, 4xAA, max settings shows a 24% increase moving from a stock HD 4850 to a stock HD 4870. In the CPU graph, COD4 at 1920x1200, 4xAA, max settings shows a 27% increase moving from a 2.0GHz E2180 to a 2.0GHz E8400.

If that tells you anything.

COD4 at 1920x1200, 4xAA, max settings:
HD 4850: 66.4 FPS
HD 4870: 82.4 FPS
% difference: 24.09%


COD4 at 1920x1200, 4xAA, max settings:
2.0GHz E2180: 47.4 FPS
2.0GHz E8400: 60.0 FPS
% difference: 26.58%

Edit:
just for kicks -

Race Driver: GRID at 1024x768, 0xAA, medium:
3.2GHz E2180: 65 FPS
4.1GHz E8400: 91 FPS
% difference: 40%
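For anyone double-checking the math: those percentages are just relative increases, (new - old) / old. A quick Python sketch of that calculation (the function name is just for illustration):

def relative_increase(old_fps, new_fps):
    # relative increase as a percentage: (new - old) / old * 100
    return (new_fps - old_fps) / old_fps * 100

print(relative_increase(66.4, 82.4))  # HD 4850 -> HD 4870: ~24.1%
print(relative_increase(47.4, 60.0))  # E2180 -> E8400, both at 2.0GHz: ~26.6%
print(relative_increase(65, 91))      # E2180 @ 3.2GHz -> E8400 @ 4.1GHz: 40.0%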
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I'm pretty sure you need a CPU to game. I'll confirm this by taking mine out later and trying to run Crysis.
 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
Originally posted by: Psynaut
Originally posted by: Denithor
Depends on the game and the resolution. Generally the lower the resolution the more important the CPU becomes. And for certain games (WoW, Supreme Commander) the CPU & amount of RAM available are just as important as the GPU.

Does having more than 4 GB of RAM provide any benefits? What about in MMOs like EQ2 or WoW?

In Vista it sure does.

Oblivion was tons faster after I got more RAM (using Vista x64, of course). Loading everywhere was near-instant, probably because it could keep most of the game textures in RAM. (I'm using the big texture mods...)
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: jaredpace
Might be your browser's cache, anyone else get this problem?

No, it's because you were deep linking on the last two original links, and that site doesn't let you deep link. It gave me the same message for them, yet worked fine for the first link.


This is the one that is most pertinent. A 3.4GHz E6750 is nearly identical in performance to a 3.6GHz E4700. Note, though, 1337n00b, that this benchmark was done @ 1024x768 (as it should be for a CPU benchmark); the higher you raise the resolution, the less important the CPU becomes. Well, not in all games, but nearly all.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Originally posted by: jaredpace
If you look at the graphs I linked above, you can see COD4 at 1920x1200, 4xAA, max settings shows a 24% increase moving from a stock HD 4850 to a stock HD 4870. In the CPU graph, COD4 at 1920x1200, 4xAA, max settings shows a 27% increase moving from a 2.0GHz E2180 to a 2.0GHz E8400.

If that tells you anything.

COD4 at 1920x1200, 4xAA, max settings:
HD 4850: 66.4 FPS
HD 4870: 82.4 FPS
% difference: 24.09%


COD4 at 1920x1200, 4xAA, max settings:
2.0GHz E2180: 47.4 FPS
2.0GHz E8400: 60.0 FPS
% difference: 26.58%

Edit:
just for kicks -

Race Driver: GRID at 1024x768, 0xAA, medium:
3.2GHz E2180: 65 FPS
4.1GHz E8400: 91 FPS
% difference: 40%

Let's see what graphics card they're using on that CPU test... there it is, a GTX 280.

Can we rerun that test with an HD 3850?

http://www.guru3d.com/article/...s-performance-review/5

OMG, the HD 4870 1GB has a 150% increase compared to the 4870 512MB at 2560x1600.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
Originally posted by: GaiaHunter
Let's see what graphics card they're using on that CPU test... there it is, a GTX 280.

Can we rerun that test with an HD 3850?

Sure, would you like the Sapphire HD3850, ASUS HD3850 Pro TOP, or PowerColor HD3850 PCS?

Of course, it wouldn't be much of a CPU test with such a low-to-mid-range card, now would it?
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: GaiaHunter
Originally posted by: jaredpace
If you look at the graphs I linked above, you can see COD4 at 1920x1200, 4xAA, max settings shows a 24% increase moving from a stock HD 4850 to a stock HD 4870. In the CPU graph, COD4 at 1920x1200, 4xAA, max settings shows a 27% increase moving from a 2.0GHz E2180 to a 2.0GHz E8400.

If that tells you anything.

COD4 at 1920x1200, 4xAA, max settings:
HD 4850: 66.4 FPS
HD 4870: 82.4 FPS
% difference: 24.09%


COD4 at 1920x1200, 4xAA, max settings:
2.0GHz E2180: 47.4 FPS
2.0GHz E8400: 60.0 FPS
% difference: 26.58%

Edit:
just for kicks -

Race Driver: GRID at 1024x768, 0xAA, medium:
3.2GHz E2180: 65 FPS
4.1GHz E8400: 91 FPS
% difference: 40%

Let's see what graphics card they're using on that CPU test... there it is, a GTX 280.

Can we rerun that test with an HD 3850?

http://www.guru3d.com/article/...s-performance-review/5

OMG, the HD 4870 1GB has a 150% increase compared to the 4870 512MB at 2560x1600.

You know, maybe I shouldn't bother to post these benchmarks, with stupid-ass responses like this.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Originally posted by: jaredpace

You know maybe i shouldn't bother to post these benchmarks with stupid-ass responses like this.

Well, it's what the numbers say, right?

The CPU matters up to a certain point. Had you decided to compare quad-core Extremes against the E8600, you wouldn't see a big difference, even though the price difference would suggest otherwise.

On the other hand, you compared two very similar graphics cards; it's the exact same chip in both.

That choice of benchmarks gives the impression that the CPU is more important than the GPU. It's not.

A 100+ $/€ CPU is quite decent to pair up with a 100-175 $/€ GPU.

More powerful graphics cards can use more CPU power, so if you're going for the higher end, you need a somewhat more powerful CPU, but there's still no need to get the extreme high-end CPUs.

In the same way, some games can make use of quads over duos, and so on.

And I don't know what miracles a very powerful CPU can do with a low-end graphics card.

At the moment GPUs are cheaper than CPUs, but a mid-range CPU like the E8600 can be paired with extreme GPUs.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Originally posted by: betasub


Of course, it wouldn't be much of a CPU test with such a low-to-mid-range card, now would it?

Probably not, but then you wouldn't see 20% performance gains from moving from a low-end C2D to a higher one. And if you then compared the gains of moving from a 3850 to a 4850, it wouldn't do much for the theory that changing the CPU gives better gains than changing the GPU, now would it?