Intel Core 2 Quad 9450 Gaming upgrade path – CPU or GPU ?

Page 2 (AnandTech forums)

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
You can buy me an i5 system to replace my C2D one to really prove your point.

ME TOO, MOTHER!


But really, obsolete junk? Aren't the Core2Q's just a tad slower than the Phenom II's? This 'ere proccy o' mine is trucking along just fine, the limit in my system is the ol' GPU.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I play lots of CS:GO and COD BO2, and my stock Q6600 delivers well over 60fps in both games with the right GPU. Had my buddy's GTX 970 in here with maxed settings pulling those frames. WOT is very enjoyable on low too, and it's another game I play frequently.

Not so great in BF4, which is another FPS I play, but there is still enjoyment to be had if you're a CS:GO or older-COD player. Beats the E5200 I had in here by about 2x the fps, and not bad for the $21 spent.
 

UglyDuckling

Senior member
May 6, 2015
390
35
61
I play lots of CS:GO and COD BO2, and my stock Q6600 delivers well over 60fps in both games with the right GPU. Had my buddy's GTX 970 in here with maxed settings pulling those frames. WOT is very enjoyable on low too, and it's another game I play frequently.

Not so great in BF4, which is another FPS I play, but there is still enjoyment to be had if you're a CS:GO or older-COD player. Beats the E5200 I had in here by about 2x the fps, and not bad for the $21 spent.

My Phenom II pushes well over 160FPS in CSGO and above 100FPS when recording it...

https://www.youtube.com/watch?v=8S6Usjwphm4


In the video above I used a far more intensive workload: 1080p at full quality.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
It isn't too surprising. This data shows that stream processors are just so much more valuable than CPU IPC.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
My Phenom II pushes well over 160FPS in CSGO and above 100FPS when recording it...

https://www.youtube.com/watch?v=8S6Usjwphm4


In the video above I used a far more intensive workload: 1080p at full quality.

Not bad at all actually; my DDR2 1GB 9500GT enables 896x504 with 60+ fps on low, which honestly is pretty good and makes my Q6600 the bottleneck. Plays decently enough at 1280x720 too, but it's often sitting in the 40-50fps range.

Tossing something like a 750 in here soon enough; I know it's more than enough for 1366x768, which is what I run on my TV. :)

Good video btw, very enjoyable to watch.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Thanks for the test. I think it mostly supports your argument well, but it also shows the nice jump from the 6900 to the 7900 series, and that lots of games are "OK" with slower CPUs. Your test kind of leads to the expected result, though, because of the high number of Mantle titles, and because some of the settings are ones a person actually playing on the 6950 would not use. I think the C2Q will struggle a lot more in some other games; as you mentioned, RTS/MMO, but also some BF4? Crysis 3 "Welcome to the Jungle"? GTA 5? Witcher 3? Anyway, the 6950 would also struggle more there.

But yes, I would recommend that someone with a 6950/GTX 570 or lower, but a decent enough CPU, upgrade the GPU first most of the time. Just not to one of the overpriced VGAs (the $300+ class), because at that point you can easily accommodate a CPU upgrade plus a good VGA, so you'd better already have a nice CPU, and not consider a "C2Q with Fury" like someone did in another thread.

Not bad at all actually; my DDR2 1GB 9500GT enables 896x504 with 60+ fps on low, which honestly is pretty good and makes my Q6600 the bottleneck. Plays decently enough at 1280x720 too, but it's often sitting in the 40-50fps range.

Tossing something like a 750 in here soon enough; I know it's more than enough for 1366x768, which is what I run on my TV. :)

Good video btw, very enjoyable to watch.

I played TF2 with an 8600GT DDR2 back in the day (same shader configuration, a little slower overall). I used to play at 1024x768 with no AA and it was also bottlenecked by the CPU; but then, I was running an Athlon 64 X2 :biggrin: Also, I think TF2 is still the worst Source game in terms of CPU performance?

Anyway, it's kind of interesting to see people still using these cards to play relatively new games; I think even the IGP on the Haswell Celeron is faster at this point? Your C2Q would love a new card for sure; a 750 sounds like a good choice.
 
Aug 11, 2008
10,451
642
126
Yeah, he didn't test a lot of current CPU-demanding games. The problem with a slow CPU is that there is no easy way to change settings to compensate for it, while you can lower settings or resolution to compensate for a weak GPU.
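That CPU-vs-GPU asymmetry can be sketched with a toy frame-time model (all numbers below are hypothetical, purely for illustration): the slower of the CPU and GPU sets the frame rate, and dropping resolution shrinks only the GPU's share.

```python
# Toy model: per-frame CPU and GPU costs in milliseconds; the slower of the
# two (roughly) dictates the frame rate when they work in parallel.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 20.0     # hypothetical per-frame CPU cost (game logic, draw calls)
gpu_1080p = 33.0  # hypothetical GPU render time at 1920x1080

# Rough pixel-count scaling for the GPU cost when dropping to 1280x720
gpu_720p = gpu_1080p * (1280 * 720) / (1920 * 1080)

print(round(fps(cpu_ms, gpu_1080p), 1))  # GPU-bound: 30.3 fps
print(round(fps(cpu_ms, gpu_720p), 1))   # CPU-bound: stuck at 50.0 fps
```

Lowering resolution lifts the rig from ~30 fps to the 50 fps CPU ceiling, but no graphics setting gets past that ceiling, which is why a slow CPU is the harder bottleneck to work around.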
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
ME TOO, MOTHER!


But really, obsolete junk? Aren't the Core2Q's just a tad slower than the Phenom II's? This 'ere proccy o' mine is trucking along just fine, the limit in my system is the ol' GPU.

Actually, clock for clock the C2 Quad and Phenom II X4 would be about the same in most applications.

I love tests like this one - well done AtenRa.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I played TF2 with an 8600GT DDR2 back in the day (same shader configuration, a little slower overall). I used to play at 1024x768 with no AA and it was also bottlenecked by the CPU; but then, I was running an Athlon 64 X2 :biggrin: Also, I think TF2 is still the worst Source game in terms of CPU performance?

Anyway, it's kind of interesting to see people still using these cards to play relatively new games; I think even the IGP on the Haswell Celeron is faster at this point? Your C2Q would love a new card for sure; a 750 sounds like a good choice.

Never played much TF2; it was alright, but that was during the time I was heavily into BC2, so you can guess where my time went. Thought about jumping back in.

Performance is about HD 4000 level, from my experience with an i7 3770 non-K rig I sold some time back. Except I much prefer the Nvidia experience over the god-awful Intel one, and unless it's HD 4600 or better I'd rather run this. I play a few other games the card excels in, all the way up to 1080p, but those are older titles.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,326
10,034
126
So, I managed to interest a friend of mine in one of my older (now mothballed) Q9300 rigs. (Maybe two.)

He currently has a Q9400 rig that's somewhat flaking out. (GTX460 2GB driver recovery issues in Win7 64-bit.)

The rigs currently have a Zotac 1GB 64-bit DDR3 384CC Kepler GPU. I was wondering, are those enough for lower-end gaming, or should I tell him to buy one of my R7 260X 2GB cards for $80 off of me (my cost on the card)?

Is a R7 260X 2GB card comparable to the 6950 in your charts, AtenRa?
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
So, I managed to interest a friend of mine in one of my older (now mothballed) Q9300 rigs. (Maybe two.)

He currently has a Q9400 rig that's somewhat flaking out. (GTX460 2GB driver recovery issues in Win7 64-bit.)

The rigs currently have a Zotac 1GB 64-bit DDR3 384CC Kepler GPU. I was wondering, are those enough for lower-end gaming, or should I tell him to buy one of my R7 260X 2GB cards for $80 off of me (my cost on the card)?

Is a R7 260X 2GB card comparable to the 6950 in your charts, AtenRa?

In games that need memory bandwidth the HD6950 will be faster, due to its 256-bit bus vs the 128-bit one on the 260X. But in Mantle games the R7 260X will be faster, and it will also be able to game with DX12.
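For a rough sense of that bandwidth gap, here is a quick calculation assuming reference memory clocks (about 5.0 Gbps effective GDDR5 on the HD 6950 and 6.5 Gbps on the R7 260X; board partners vary, so check exact specs):

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

hd6950 = bandwidth_gbs(256, 5.0)   # 160.0 GB/s
r7_260x = bandwidth_gbs(128, 6.5)  # 104.0 GB/s
print(hd6950, r7_260x)
```

The faster GDDR5 on the 128-bit card claws back some, but not all, of the gap, which matches the point about bandwidth-hungry games favoring the HD 6950.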
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,712
142
106
It still amazes me how well these Q9xxx have held up.
Strong IPC and 12MB (had a 9450 myself) of fast L2 really gave them some teeth.

If it wasn't for the temptation of lower power usage, specifically at idle, i'd likely still be using mine.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
So, I managed to interest a friend of mine in one of my older (now mothballed) Q9300 rigs. (Maybe two.)

He currently has a Q9400 rig that's somewhat flaking out. (GTX460 2GB driver recovery issues in Win7 64-bit.)

The rigs currently have a Zotac 1GB 64-bit DDR3 384CC Kepler GPU. I was wondering, are those enough for lower-end gaming, or should I tell him to buy one of my R7 260X 2GB cards for $80 off of me (my cost on the card)?

Is a R7 260X 2GB card comparable to the 6950 in your charts, AtenRa?

The Q9300 had half the L2 and 2.5GHz, so it's a lot slower; but a 64-bit DDR3 card is too slow, and a 260X would improve things significantly.

The 260X will beat the 6950 in most newer games; pre-GCN AMD cards don't get too much attention from the driver team and game devs...

For example, in Witcher 3 the 260X is up to 2x the 6970:
http://pclab.pl/art66374-5.html



It still amazes me how well these Q9xxx have held up.
Strong IPC and 12MB (had a 9450 myself) of fast L2 really gave them some teeth.

If it wasn't for the temptation of lower power usage, specifically at idle, i'd likely still be using mine.

You can get decent idle power usage with them (around, maybe even lower than, 40W for the entire PC) by using a G41 motherboard with the IGP, for example. Under load, if you keep it at 3GHz with low voltage, it's also not too bad.
 

ibex333

Diamond Member
Mar 26, 2005
4,090
119
106
The OP is exactly why I keep saying that newer CPUs are not required for today's games, or for basic apps such as word processing, email, and the internet.

There is no need to upgrade to one of the newer CPUs. No pressing need at all, unless you are after playing at extremely high resolutions with all settings on. People keep talking about some kind of bottlenecks... That's nonsense. I use an E6300 Conroe @ 3.2GHz. With an SSD, 4GB RAM and Windows 10, it's actually very, very decent. There are hardly any games that I cannot play on medium to high settings with this CPU and an AMD 6950.

I have all the Windows 10 junk disabled: all those apps designed for tablets, Cortana (useless), improved search (not necessary), and all kinds of other services one can live without. I usually end up using no more than 1-2GB RAM unless I am playing a game. If I have nothing except a single game running, I usually stay well under my RAM limit.

Why do people have 16GB+ RAM? So they can have 100 tabs open in Chrome? Why?

Those who say it takes a lot of money to build a gaming system that can handle most games available today are extremely ignorant...
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
The OP is exactly why I keep saying that newer CPUs are not required for today's games, or for basic apps such as word processing, email, and the internet.

There is no need to upgrade to one of the newer CPUs. No pressing need at all, unless you are after playing at extremely high resolutions with all settings on. People keep talking about some kind of bottlenecks... That's nonsense. I use an E6300 Conroe @ 3.2GHz. With an SSD, 4GB RAM and Windows 10, it's actually very, very decent. There are hardly any games that I cannot play on medium to high settings with this CPU and an AMD 6950.

I have all the Windows 10 junk disabled: all those apps designed for tablets, Cortana (useless), improved search (not necessary), and all kinds of other services one can live without. I usually end up using no more than 1-2GB RAM unless I am playing a game. If I have nothing except a single game running, I usually stay well under my RAM limit.

Why do people have 16GB+ RAM? So they can have 100 tabs open in Chrome? Why?

Those who say it takes a lot of money to build a gaming system that can handle most games available today are extremely ignorant...

It's rubbish. A modern i3 would obliterate it. Any 2015 AAA game, even at 1080p, would bring your system to its knees.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,326
10,034
126
The OP is exactly why I keep saying that newer CPUs are not required for today's games, or for basic apps such as word processing, email, and the internet.

There is no need to upgrade to one of the newer CPUs. No pressing need at all, unless you are after playing at extremely high resolutions with all settings on. People keep talking about some kind of bottlenecks... That's nonsense. I use an E6300 Conroe @ 3.2GHz. With an SSD, 4GB RAM and Windows 10, it's actually very, very decent. There are hardly any games that I cannot play on medium to high settings with this CPU and an AMD 6950.

I have all the Windows 10 junk disabled: all those apps designed for tablets, Cortana (useless), improved search (not necessary), and all kinds of other services one can live without. I usually end up using no more than 1-2GB RAM unless I am playing a game. If I have nothing except a single game running, I usually stay well under my RAM limit.

Why do people have 16GB+ RAM? So they can have 100 tabs open in Chrome? Why?

Those who say it takes a lot of money to build a gaming system that can handle most games available today are extremely ignorant...

It's rubbish. A modern i3 would obliterate it. Any 2015 AAA game, even at 1080p, would bring your system to its knees.

But is PC gaming meant to be an exclusive club, where only people who can afford a high-end rig to play games at 8K DSR resolution on a 4K monitor, locked at 60fps or better, are allowed to play?

Or is it for people who own PCs with a (semi-)modern CPU and a (semi-)modern GPU, who can play whatever game they want, because PC games were designed to be scalable, unlike consoles, and are playable on a range of machines using a range of settings?

I mean, are PCs just meant to be 4K-only console rigs, or are they meant to be flexible machines, capable of playing games and doing other useful things too?
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
But is PC gaming meant to be an exclusive club, where only people who can afford a high-end rig to play games at 8K DSR resolution on a 4K monitor, locked at 60fps or better, are allowed to play?

Or is it for people who own PCs with a (semi-)modern CPU and a (semi-)modern GPU, who can play whatever game they want, because PC games were designed to be scalable, unlike consoles, and are playable on a range of machines using a range of settings?

I mean, are PCs just meant to be 4K-only console rigs, or are they meant to be flexible machines, capable of playing games and doing other useful things too?
You can build a system with a Pentium (not even the 3258, any of them) and have a faster gaming rig on a current platform. S775 is just ancient; yes, it plays some games quite well, but so does any two-bit dual-core out there today.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Very timely thread! Just last night I was finishing my secondary Core 2 gaming rig. I did the 771-to-775 conversion and swapped my Q9550 (which could NEVER overclock worth a damn) for an X5460 @ 3.8GHz. I paired it with my 7850 and discovered it could play most of my library well, and would tear through the main game I want to play on it (CS:GO).

This thread shows me I need to use Mantle when I can. Honestly, my 7850 is a 1GB model, so I am probably more limited by VRAM than by the CPU.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Interesting thread and nice to see the results across a wide variety of games.

C2D/Q is hit-and-miss depending on the title. TechSpot did a good investigation. While some games can be perfectly playable with an E6600/Q9550:

[TechSpot benchmark charts: Gaming_01.png, Gaming_04.png]

Other games are more or less unplayable/console-level fps:

[TechSpot benchmark charts: Gaming_02.png, Gaming_03.png, Gaming_05.png]

Source

Even an i5 760/i7 860 @ 3.8-3.9GHz would provide a HUGE increase in performance and actually deliver well beyond PS4-level graphics/performance with an HD7950/GTX760-class GPU. An overclocked Q9550 cannot claim that in a wide variety of games.

For example, in Witcher 3 the 260X is up to 2x the 6970
http://pclab.pl/art66374-5.html

This is why I cannot trust anything that comes out of that site. Ever since the first person linked reviews from there, their data has constantly contradicted all the other professional sites.

Computerbase - average 1080P performance

260X = 100%
HD6970 = 100%
http://www.computerbase.de/2015-08/nvidia-geforce-gtx-950-test/3/#abschnitt_tests_in_1920__1080

Specifically TW3:

260X = 18.1 fps (+24%)
6970 = 14.6 fps
http://www.computerbase.de/2015-08/nvidia-geforce-gtx-950-test/3/#diagramm-the-witcher-3-1920-1080

In any event, TW3 is an extremely punishing game for lower end/slower GPUs. More or less to have a great gaming experience in TW3 requires a far more powerful/modern GPU than 260X/6970/580.

[gamegpu.ru benchmark charts: The_Witcher_3_Wild_Hunt_v._1.06, w_1920_h.png and w_1920_u.png]


Time and time again I see comments about how AMD abandoned pre-GCN graphics cards, but the truth is the GPUs of that era are simply too slow to keep up with modern games at decent settings. The GTX580 is only 9% faster on average than the HD6970 at 1080P in modern games. So in fact, AMD didn't abandon pre-GCN GPU optimizations, because the 580 bombs just as much -- the simple truth is those GPUs/architectures are just too slow for modern titles.

In comparison, from my links above, the 280X is now 84% faster than the 6970 and 68% faster than the GTX580 in modern titles. Modern games basically wipe the floor with older GPUs, which is to be expected since the 6970/480/580 are all almost 5 years old.

At least most people on this forum were smart enough to buy a $230-300 6950 and unlock it instead of spending extra on the 6970. Compared to that, the $450-500 580 looks horrendous in relative standing, since it's just as unplayable but cost way more than the unlocked 6950 2GB. :D
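As a quick sanity check of the relative-performance arithmetic in this post (the fps figures and percentages come from the Computerbase links above; the helper function itself is just illustration):

```python
def pct_faster(a, b):
    """How much faster a is than b, in percent."""
    return (a / b - 1) * 100

# Computerbase TW3 1080p: 260X at 18.1 fps vs 6970 at 14.6 fps
print(round(pct_faster(18.1, 14.6)))  # 24

# If the 280X sits at 184% of the 6970 and the 580 at 109% of it,
# the implied 280X-over-580 gap is about 69%, close to the 68% quoted.
print(round(pct_faster(1.84, 1.09)))  # 69
```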

The 260X will beat the 6950 in most newer games; pre-GCN AMD cards don't get too much attention from the driver team and game devs....

The first part of your statement is basically false, as Computerbase has the 260X and 6970 exactly tied on average across 18 modern games. That's as conclusive and scientific as it possibly gets. The second part of your statement is completely misleading because it applies to both NV's Fermi and AMD's pre-GCN cards; it has been shown conclusively in this video that beyond the first year, driver improvements brought very little performance increase for Fermi as well. This is actually completely in line with the 580 beating the 6970 by only 9% on average at 1080P in modern games.

Why do people have 16GB+ RAM? So they can have 100 tabs open in Chrome? Why?

Actually, having an SSD helps way more with 100 Chrome tabs than having 16GB of RAM.

The only scientific tests of a game I've seen that shows a benefit over 8GB of system memory is SW:BF Beta.

[PCGH frametime charts: SW:BF Hoth, 8 GiByte RAM vs 16 GiByte RAM]

http://www.pcgameshardware.de/Star-...950/Specials/Beta-Technik-Benchmarks-1173656/

Up until that point I've never seen any game that actually benefits from more than 8GB of system memory as far as performance went. It's possible there are such games but I haven't seen the data and if anyone has, please add to the knowledge base.

I think most people buy 16GB of memory because DDR3 has been very cheap in recent years and 16GB sounds nicer than 8, hehe. Up until SW:BF, that money would have been better spent moving from an i5 to an i7, getting a faster GPU, SSD, better monitor, etc. That's why I've stuck with 8GB for as long as I could; why waste money for no benefit?
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
This is why I cannot trust anything that comes out of that site. Ever since the first person linked reviews from there, their data has constantly contradicted all the other professional sites.



Specifically TW3:

260X = 18.1 fps (+24%)
6970 = 14.6 fps


Why would you test these cards at a sub-20FPS framerate, at max (or close to max) settings?

Why would you use the highest settings for low-end hardware?! That's why it's extremely punishing and they got irrelevant results, while the link I posted shows relevant results with minimum and medium settings, with the 260X easily running the game at over 30FPS. Too punishing for low-end hardware? Adjust the settings!
https://www.youtube.com/watch?v=Gqzd2uiTIlA


Also... the links you posted are difficult to understand, and I'm not seeing a lot of 6970 over there!?

So in fact, AMD didn't abandon pre-GCN GPU optimizations because 580 bombs just as much
No it doesn't; check the Witcher 3 test again.

Or check Battlefront; oh wait, the pre-GCN AMD cards are too glitched to test it (same thing I noticed with my 5850):
http://pclab.pl/art66213-4.html

Or let's try a random game, like Mad Max:
http://pclab.pl/art65638-10.html

Nope: the 260X beats the 6970, and the 580 is way faster than both.

Maybe Project Cars?
http://pclab.pl/art63572-7.html

Nope, the 6970 is much slower.

But it's not always too bad:

http://pclab.pl/art65645-10.html

In the latest MGS the 6970 is doing decently, but as you can see there is too much variation with the 6970, and overall it's a lesser card than the 260X for most current games.

I suspect the power is there; the optimizations, not quite.

Oh, and perhaps if other websites included a 6970, or had tests with decent settings and newer games, I would use them; but for now this one will have to do.

gamegpu is useless for lower-end hardware; they use 980 Ti settings for a 260X and are happy with just that.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I personally find gaming to be pretty crappy on C2Qs these days. I kept my old Q6600 around for a while after I built my i7 3770K, and just came to the conclusion that gaming on it wasn't enjoyable, so I sold it off. A friend of mine upgraded his OC'd Q6600 to a 4790, and even with his paltry HD 6870 he noticed a significant improvement in his gaming experience.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Wow, I just tested my rig and it barely gets over half the physics score in 3DMark that my OCed 2600K does. That is much better than the GPU though; the poor thing gets a score that is a third of my 970's.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,326
10,034
126
Wow, I just tested my rig and it barely gets over half the physics score in 3DMark that my OCed 2600K does. That is much better than the GPU though; the poor thing gets a score that is a third of my 970's.

Food for thought: the 9600GSO and 8800GT (same GPU chip) were contemporaries of the Core2Quad CPUs. That's how old they really are. (The 8800GT is pretty much ancient in GPU terms.)