
Intel Core 2 Quad 9450 Gaming Upgrade Path – CPU or GPU?

AtenRa

Lifer
Feb 2, 2009
13,177
2,007
126
So you still have your venerable Core 2 Quad, mildly overclocked to 3.2GHz and paired with an AMD HD6950.
A simple question: do you upgrade to a new CPU/platform, or do you upgrade your graphics card? Can the Core 2 Quad leverage the performance of today's mid-range graphics cards, or will a new, faster CPU give you more performance in today's games?
This thread will investigate which of the two gives you the better gaming experience.


For the CPU upgrade I will use the Intel Core i7 3770K overclocked to 4.44GHz. This will simulate new CPUs with very high single-core performance.

For the GPU upgrade I will use the AMD HD7950 overclocked to 1GHz; this will simulate graphics cards like the R9 280/285 and R9 380.


System Specs

Socket 775
CPU : Intel Core 2 Quad 9450 @ 3.2GHz
Motherboard : ASUS Rampage Formula
Memory : 2x 2GB Gskill F2-8500 DDR-2 1066MHz
HDD : 1TB Seagate ST1000DM003 7200rpm Sata-6

Socket 1155
CPU : Intel Core i7 3770K @ 4.44GHz
Motherboard : ASUS Maximus V Gene
Memory : 2x 4GB Kingston KHX2133C11D3K4 DDR-3 2133MHz
HDD : 1TB Seagate ST1000DM003 7200rpm Sata-6

Graphics Cards used
ASUS HD6950 DCII/2DI4S/1GD5 (810MHz core, 1250MHz Memory)
ASUS HD7950 DC2T-3GD5-V2 (1000MHz core, 1500MHz memory)


For both Systems
Windows 8.1 Pro 64Bit
Catalyst 15.7
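As an aside on methodology: the average and minimum FPS figures in charts like the ones below can be derived from a frametime log. A minimal sketch of that calculation, assuming a hypothetical one-frame-time-per-line log in milliseconds (not the actual tooling used for these benchmarks):

```python
# Sketch: derive average and minimum FPS from a list of frame times.
# The frame-time values below are illustrative, not measurements.

def fps_stats(frametimes_ms):
    """Return (average_fps, minimum_fps) for a list of frame times in ms."""
    total_s = sum(frametimes_ms) / 1000.0       # total run length in seconds
    avg_fps = len(frametimes_ms) / total_s      # frames rendered per second
    min_fps = 1000.0 / max(frametimes_ms)       # rate during the slowest frame
    return avg_fps, min_fps

if __name__ == "__main__":
    # roughly one second of ~60fps frames with a single 50ms hitch
    times = [16.7] * 59 + [50.0]
    avg, low = fps_stats(times)
    print(f"avg: {avg:.1f} fps, min: {low:.1f} fps")
```

Note how a single slow frame dominates the minimum while barely moving the average, which matters when reading the minimum-FPS bars in the charts.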
 

AtenRa

Alien Isolation

The in-game benchmark was used. The game is not very graphically intensive, and even with the HD6950 you can play at High settings.
It is still GPU limited at High settings, though, since you gain nothing by upgrading to a faster CPU. Upgrading the GPU will increase performance, but even at Ultra settings the HD6950 is more than capable of running the game.
Minimum FPS drops when the HD7950 is used, but only in one part of the benchmark. This may be a driver issue, or it could mean the CPU is not fast enough to drive the GPU.
Since the HD7950 raises the average fps by a large amount, I believe the lower minimum fps is most likely just a driver issue.







BioShock Infinite

The in-game benchmark was used. The game is playable at Very High settings with the Core 2 Quad 9450, but at those settings it is purely GPU limited, so upgrading to a faster CPU will gain you nothing.
Upgrading the GPU will let you raise the image quality to Ultra with DDOF and still average more than 60fps. At those settings the HD6950 struggles to pass the 30fps mark.







Civilization Beyond Earth

The in-game benchmark was used. Since this is a turn-based game, high fps is not required, but staying above 30fps will give you a better gaming experience.
Using Mantle makes it better still, so upgrading the GPU is the best path for this game.



 

AtenRa

Company of Heroes II

The in-game benchmark was used. This game benefits from a faster CPU at lower image quality settings, but it benefits even more from a GPU upgrade if you raise the image quality.
If you are into RTS games, a CPU upgrade may be the better choice for DX-11 titles like this.







Dragon Age: Inquisition

The in-game benchmark was used. Dragon Age is a very graphically intensive game. With the HD6950, or any other GPU at the same performance level, you will need to lower the image quality to Medium if you want to avoid dropping below 30fps.
Upgrading the GPU and using Mantle will let you play at Ultra settings. This game definitely calls for the GPU upgrade.






Formula 1 2014

The in-game benchmark was used. This game can run at Ultra settings with the HD6950, even when EQAA filters are used.
An upgrade is not strictly needed for this game, but the CPU plays a big part here.



 

AtenRa

Hitman Absolution

The in-game benchmark was used. At 1080p with High settings, the CPU upgrade gives you more performance, but the GPU upgrade lets you play at higher image quality settings.
Hitman also busts a myth: the one that says a slower CPU cannot reach higher FPS when paired with a high-performance GPU.
As you can see, at 1080p High settings we are CPU limited: the Core i7 3770K with the HD6950 is faster than the Core 2 Quad paired with the HD7950.
But look what happens when we raise the image quality to Ultra with 4x MSAA: the Core 2 Quad paired with the HD7950 is now faster than the Core 2 Quad paired with the HD6950 at the lower settings.
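This CPU-limited versus GPU-limited behaviour fits the usual toy model of a render loop, where each frame takes roughly as long as the slower of the CPU's and GPU's per-frame work. The millisecond figures below are made up purely to illustrate the mechanism, not measurements from these benchmarks:

```python
# Toy bottleneck model: the frame rate is capped by whichever of the CPU
# or GPU needs more time per frame. All numbers are illustrative only.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# High settings: GPU work is light, so the old CPU is the bottleneck
# and a faster graphics card barely helps.
print(fps(cpu_ms=18.0, gpu_ms=9.0))     # capped by the CPU
# Ultra + 4x MSAA: GPU work dominates, so the faster card pulls ahead
# even behind the same slow CPU.
print(fps(cpu_ms=18.0, gpu_ms=30.0))    # capped by the GPU
```

Raising the image quality only grows the GPU term, which is why the Core 2 Quad stops mattering once the settings are heavy enough.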






Metro Last Light Redux

The in-game benchmark was used. Metro is very graphically intensive, and a GPU upgrade will let you either increase performance or raise the image quality settings.
This game definitely benefits more from the GPU upgrade path. It would take a very high-end GPU before you really feel the need for a high-performance CPU in Metro.





Sleeping Dogs

The in-game benchmark was used. At 1080p High settings the game is more than playable with the Core 2 Quad and the HD6950.
But you will need a faster GPU to raise the image quality, so the GPU upgrade path is what you need for this title.



 

AtenRa

Sniper Elite III

The in-game benchmark was used. Even at 1080p and Medium settings, the game is GPU limited with the HD6950.
Mantle has a tremendous performance effect, and upgrading the GPU will let you play at Ultra settings, perhaps even with supersampling if you use an R9 290/390.





Thief

The in-game benchmark was used. Thief uses a modified Unreal Engine 3 and, although it is very CPU intensive, Mantle lets you pair a slow or older CPU like the Core 2 Quad 9450 with a very fast GPU and still raise the image quality settings.
This is another Mantle title that showcases the tremendous effect the API has on CPU utilization and game performance.







Total War: Rome II

The in-game benchmark was used. Rome II can become very CPU limited or very GPU limited depending on the map you play.
The in-game benchmark is more GPU intensive than other maps, but at higher image quality settings the CPU plays a big part, as shadows lean heavily on it.
For this game I would go for the CPU upgrade, because you will need the higher CPU performance on every map.



 

AtenRa

Conclusion

Well, I have to say I wasn't expecting the Core 2 Quad to perform this well in today's games. In most cases the games are GPU limited, and a faster GPU will let you increase the image quality and still play at acceptable frame rates. There are games that benefit more from a CPU upgrade, like RTS titles or MMORPGs, but for the rest the GPU upgrade seems the better path to take, especially if you are on a low budget.

The big eye-opener is Mantle. It and DX-12 are by far the best things to happen to PC gaming in years: they will transform the gaming experience of a slow or older CPU if you upgrade to a Mantle/DX-12 capable GPU for the 2015-and-onwards PC games.

So to conclude, the best path is to upgrade the GPU, unless you play DX-11 RTS games and/or MMORPGs that really need faster single-core CPU performance. For the rest of the games, the GPU upgrade is by far the best you can do if you are still using an older CPU like the Core 2 Quad 9450. It will give your system a few more years of life.
 

Denithor

Diamond Member
Apr 11, 2004
6,302
23
81
Wow. Lots of data here, excellent work!

Very surprising how well those old C2Q chips hold up even in today's games.
 
Oct 16, 1999
10,497
3
0
Yeah, that's some good info there. This is what I had my fingers crossed for when I overclocked my Q6600 to 3GHz and dropped a 7850 @ 1050/1250 in my den machine.

Is Sleeping Dogs the original release or definitive edition in your test? I've read the DE performs worse and that's the version I've been playing on the above hardware. Judging from my utilization numbers when the fps drop it's usually my CPU holding me back, the GPU only seems to get maxed during cutscenes. Still, it plays well enough to be enjoyable.
 

AtenRa

Yeah, that's some good info there. This is what I had my fingers crossed for when I overclocked my Q6600 to 3GHz and dropped a 7850 @ 1050/1250 in my den machine.

Is Sleeping Dogs the original release or definitive edition in your test? I've read the DE performs worse and that's the version I've been playing on the above hardware. Judging from my utilization numbers when the fps drop it's usually my CPU holding me back, the GPU only seems to get maxed during cutscenes. Still, it plays well enough to be enjoyable.
It's the original release.
 

UglyDuckling

Senior member
May 6, 2015
390
35
61
Cool thread, wish I had a 7950 or I would post up some Phenom II numbers for you, as I have most of the games tested here.
 

VirtualLarry

Lifer
Aug 25, 2001
46,947
4,591
126
AtenRa, thanks for all the data, that proves that the venerable C2Q 45nm CPUs aren't totally obsolete yet! My BIL will be happy to hear that, I think. He's still rocking a Q9550 rig I built him a few (or more) years ago, with CF 6870 cards.

If you have the parts, and the opportunity, would you consider doing the same testing, with a G3258 @ 4.0? Maybe in an H81 mobo (PCI-E limited to 2.0)?

I think that data might be valuable, for people with existing C2Q rigs, considering a G3258 combo as an upgrade.

Or are there enough G3258 OC reviews already on the internet that you could point me to?
 

MajinCry

Platinum Member
Jul 28, 2015
2,486
555
136
I say give the Elder Scrolls and Fallout games a shot. Would be interesting to see the differences in those CPU hogs; Morrowind, Oblivion, Skyrim and Fallout New Vegas.

Particularly with an intensive ENB.
 

AtenRa

AtenRa, thanks for all the data, that proves that the venerable C2Q 45nm CPUs aren't totally obsolete yet! My BIL will be happy to hear that, I think. He's still rocking a Q9550 rig I built him a few (or more) years ago, with CF 6870 cards.

If you have the parts, and the opportunity, would you consider doing the same testing, with a G3258 @ 4.0? Maybe in an H81 mobo (PCI-E limited to 2.0)?

I think that data might be valuable, for people with existing C2Q rigs, considering a G3258 combo as an upgrade.

Or are there enough G3258 OC reviews already on the internet that you could point me to?
You can compare most of the games vs the Kaveri APUs and Core i3 4330 in the link below. The only difference is the driver: 15.5 beta vs 15.7.

http://forums.anandtech.com/showthread.php?p=37524322

I have started a new project and I don't have time to do the Pentium G3258 now. But I have it in mind for the next review ;)
 

cbn

Lifer
Mar 27, 2009
12,968
220
106
Thanks for these results.

With a high-clocked C2Q still being viable for certain games, I would hope AMD and Nvidia would reconsider the bottom end of the dGPU market again. IMO so many capable SFF pre-builts could use a boost in graphics, but for under 40W AMD has only one offering: the R7 240 (which is really slow). Instead of just the R7 240, maybe AMD could also field a down-clocked/lower-voltage version of Bonaire.
 

VirtualLarry

Thanks for these results.
Instead of just the R7 240, maybe AMD could also field a down-clocked/lower-voltage version of Bonaire.
How many GPUs has AMD produced from Bonaire? 7790, R7 260X... was the R7 260 also Bonaire? And isn't there an R7 3xx GPU based on Bonaire too?
 

cbn

How many GPUs has AMD produced from Bonaire? 7790, R7 260X... was the R7 260 also Bonaire? And isn't there an R7 3xx GPU based on Bonaire too?
Yep: HD 7790, R7 260, R7 260X, and R7 360. There will probably be an R7 360X too (when R7 260X supply dries up).
 

Absolute0

Senior member
Nov 9, 2005
714
21
81
Sounds good! I'm having an issue with the images now, they need to be re-linked or something?
 

dark zero

Platinum Member
Jun 2, 2015
2,526
95
91
Thanks for these results.

With a high-clocked C2Q still being viable for certain games, I would hope AMD and Nvidia would reconsider the bottom end of the dGPU market again. IMO so many capable SFF pre-builts could use a boost in graphics, but for under 40W AMD has only one offering: the R7 240 (which is really slow). Instead of just the R7 240, maybe AMD could also field a down-clocked/lower-voltage version of Bonaire.
Sorry, but Intel Skylake won't allow the low-tier GPU to be resurrected anymore... Intel's current iGPU is as strong as a GT 740 and, yeah, costs a lot less.

And also, if the G3220 crushes the Q6600, don't be surprised if the new Celerons are as strong as the Q8000 series...
 

cbn

Sorry, but Intel Skylake won't allow the low-tier GPU to be resurrected anymore... Intel's current iGPU is as strong as a GT 740 and, yeah, costs a lot less.
That is an extreme case though. How many people are going to own a Broadwell C?

And also, if the G3220 crushes the Q6600, don't be surprised if the new Celerons are as strong as the Q8000 series...
Bonaire (896sp @ 500Mhz) would be a good GPU for a Haswell Pentium SFF Pre-built.
 

skipsneeky2

Diamond Member
May 21, 2011
5,037
0
71
Sorry, but Intel Skylake won't allow the low-tier GPU to be resurrected anymore... Intel's current iGPU is as strong as a GT 740 and, yeah, costs a lot less.

And also, if the G3220 crushes the Q6600, don't be surprised if the new Celerons are as strong as the Q8000 series...
Willing to bet games that demand a quad will run as well on chips like the Q6600 as on a G3220, if not a bit better. I wouldn't mind benchmarks to prove otherwise, and I mean big games like BF4 and GTA V. Nvidia's 337.50 driver also gave some life to older chips, so I think they may still be feasible options.
 

cbn

If you have the parts, and the opportunity, would you consider doing the same testing, with a G3258 @ 4.0? Maybe in an H81 mobo (PCI-E limited to 2.0)?

I think that data might be valuable, for people with existing C2Q rigs, considering a G3258 combo as an upgrade.
Yes, that data would be very useful.

P.S. In general I'm pretty happy with my stock-speed Xeon E5440 (same clocks and cache as the Q9550) at a casual gaming level (i.e., 30+ average FPS), but one game that gives it a lot of trouble is Crysis 3.

In the "Welcome to the Jungle" level, right after the missile tower, I get FPS in the teens with the E5440, 4GB and a GT 730 GDDR5 (even at 800x600 low). When I retested that same scene today with my G3258 @ 4.0GHz, 4GB RAM and the GT 730 GDDR5, I get FPS in the 40s (at a higher resolution than used with the E5440).

So for me, the G3258 is a very noticeable upgrade over the Xeon in that game (and, of course, in other games where I want faster FPS).
 

dark zero

In general I'm pretty happy with my stock speed Xeon E5440 (same clocks and cache as Q9550), but one game that just gives it a lot of trouble is Crysis 3.

In "welcome to the Jungle level" right after the missile tower, I'll get FPS in the teens with E5440, 4GB and GT 730 GDDR5 (even at 800 x 600 low). When I retested that same scene today with my G3258 @ 4.0 Ghz, 4GB RAM, GT 730 GDDR5 I get FPS in the 40s (at a higher resolution than used with the E5440).

So for me, the G3258 is a big upgrade over the Xeon in that game.
I feel that it comes from the newer instructions the Pentium has... retest it with a GTX 750 or an R7 260X. It could change...

Also, the difference between PCIe 2.0 and 3.0 is not as dramatic as the one between 1.0 and 2.0.
 

escrow4

Diamond Member
Feb 4, 2013
3,331
112
106
It's obsolete junk. A modern i5 will sustain far more consistent framerates with far fewer dips. Given AtenRa is AMD-biased too, I wouldn't trust those graphs. If you want 60/60 FPS you don't want a chip from 2007.
 

dark zero

It's obsolete junk. A modern i5 will sustain far more consistent framerates with far fewer dips. Given AtenRa is AMD-biased too, I wouldn't trust those graphs. If you want 60/60 FPS you don't want a chip from 2007.
And who said otherwise?
Even an i5 CAN'T hit 60/60, even OC'ed.
Also, most games are GPU/GameWorks limited now, so you need a GTX 970 at minimum to get that at 1080p.

And the games that run well with neither AMD nor NVIDIA are just badly coded.
 
Oct 16, 1999
10,497
3
0
It's obsolete junk. A modern i5 will sustain far more consistent framerates with far fewer dips. Given AtenRa is AMD-biased too, I wouldn't trust those graphs. If you want 60/60 FPS you don't want a chip from 2007.
You can buy me an i5 system to replace my C2D one to really prove your point.
 
