Minimum CPU necessary to drive new 28nm high-end GPUs?

Page 2

TakeNoPrisoners

Platinum Member
Jun 3, 2011
I would get the fastest CPU within reason. I think the CPU is just as important in a system as the GPU is. Why spend only $200 on a CPU when you're spending $500 on a GPU? For games it doesn't make a whole lot of sense. To me you'd be better off buying a $350 CPU setup (i5 2500K + a quality water-cooling system) and spending $350 on the GPU, like a couple of 6850s. That way you can run the i5 2500K at stock for a while and then overclock it higher and higher as games demand it.
 

GaiaHunter

Diamond Member
Jul 13, 2008
Yup, I did glance at the graph too fast. But look at what happens to the Q6600 when it runs into cache limitations on top of its slower-IPC cores. I'd actually predict it would do just as poorly in Dirt 3 as those Athlon II X4s did.

On one hand, we talk about how most games are console ports that don't really need high-end GPUs, and on the other hand we are saying that there is no CPU bottleneck with the Q6600 since most of the load goes to the GPU. Something has to give, no?

People need to understand that sometimes there is no absolute rule and exceptions do happen.

For example, how does one explain those Dirt 3 benches? On one hand the game seems completely GPU-limited for the i5-750/i7-920, Phenom II, and i7-2600K/2500K. On the other hand it seems completely CPU-limited by core count and cache.

About those SC2 benches I keep seeing - they look like absolute worst-case scenarios. I played a fair share of SC2 on a Phenom II X2/X4 at 4 GHz/3.6 GHz with a 6850 @ 1680x1050, and if I just looked at those graphs I would believe the game was unplayable, which is far from the truth (as in rare sub-30 FPS slowdowns in both 1v1 and the campaign). And then you see pros playing on i7 systems still getting slowdowns when a single mothership pops out. One thing I can say, though, is that a Phenom II @ 3 GHz is about twice as fast as an Athlon X2 @ 3 GHz in SC2.
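
Average-fps charts hide exactly these rare dips. If you want to sanity-check it on your own machine, here is a minimal sketch of judging playability from a per-frame time log instead (it assumes something like a FRAPS frametimes dump; the sample values are made up):

```python
# Sketch: judge playability from per-frame times instead of average fps.
# frame_times_ms stands in for a real log (e.g. a FRAPS frametimes dump);
# the values below are made-up sample data.
frame_times_ms = [14.0, 15.2, 16.1, 40.5, 15.8, 33.0, 14.9]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
slow = [t for t in frame_times_ms if t > 1000.0 / 30]  # frames below 30 fps
pct_slow = 100.0 * len(slow) / len(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")
print(f"frames under 30 fps: {pct_slow:.0f}% of the run")
# A healthy average with only occasional spikes matches the "rare sub-30
# slowdowns" experience, even when a worst-case benchmark chart looks dire.
```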
 

RussianSensation

Elite Member
Sep 5, 2003
If he is talking generally, the later C2Qs at 3.5 GHz do fine in modern games with 6970-TriFire and GTX 580 SLI and will also do fine with next gen single-GPU.

The OP has a Q6600 @ 3.5 GHz.

Click the Witcher 2 link I posted in Post #24. In their testing, the Q9550 got about half the frame rate of the i7 at 1920x1080 when paired with GTX 580 SLI: the Q9550 @ 2.8 GHz managed only 45 fps, while the i7 @ 4.2 GHz got 99 fps.

There is just no way you can reasonably expect to pair a Q6600 @ 3.5 GHz with 6990 Tri-Fire. Even you wouldn't do something like that ;) If you know you are going to lose 30-40% of the performance, if not more at 1080p, you'd be better off just getting 6950 CF and getting identical performance.
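
To put the bottleneck logic in concrete terms: a frame can't be delivered faster than the slower side allows, so effective fps is roughly min(CPU limit, GPU limit). The CPU figures below are the Witcher 2 results above; the GPU figures are assumed placeholders, and the min() model itself is a deliberate oversimplification:

```python
# Rough bottleneck model: effective fps ~= min(CPU limit, GPU limit).
# CPU limits are the Witcher 2 numbers quoted above; the GPU limits are
# assumed placeholders, purely for illustration.
cpu_limits = {"Q9550 @ 2.8": 45, "i7 @ 4.2": 99}
gpu_limits = {"6990 Tri-Fire": 120, "6950 CF": 70}  # assumed

for cpu, c in cpu_limits.items():
    for gpu, g in gpu_limits.items():
        print(f"{cpu} + {gpu}: ~{min(c, g)} fps")
# With a 45 fps CPU ceiling, both GPU setups land at ~45 fps --
# past the ceiling, the extra GPU money buys nothing.
```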
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
[Image: CPU2.png - CPU scaling benchmark graph]

The CPU is just as important as the GPU. END OF STORY!
There can be no argument against this graph; it is full of truth.
It boldly flies in the face of people who believe a cheap CPU should be paired with large GPU horsepower.
 

GaiaHunter

Diamond Member
Jul 13, 2008
TakeNoPrisoners said:
[Image: CPU2.png - CPU scaling benchmark graph]
The CPU is just as important as the GPU. END OF STORY!
There can be no argument against this graph; it is full of truth.
It boldly flies in the face of people who believe a cheap CPU should be paired with large GPU horsepower.

[Image: 1920_HighUltra.png - 1920x1080 High/Ultra benchmark chart]


SLI scaling is quite bad there, though.

Shame they don't test single GPUs as well for the CPU comparison.

Multi-GPU configurations are a different kind of beast.
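
When a review does list both single-card and multi-card results, the scaling complaint is easy to quantify. A quick sketch with placeholder fps values (this review doesn't publish the single-GPU figure, which is exactly the gripe):

```python
# SLI scaling efficiency: how close two cards get to 2x one card.
# Placeholder numbers -- substitute real single-card and SLI results.
single_gpu_fps = 60.0  # assumed one-card result
sli_fps = 96.0         # assumed two-card result

scaling = sli_fps / single_gpu_fps   # 1.6x here
efficiency = 100 * scaling / 2.0     # percent of the ideal 2x

print(f"scaling: {scaling:.2f}x ({efficiency:.0f}% of ideal)")
# Efficiency well below ~90% usually points at a CPU limit or a poor
# SLI profile -- one more reason single-GPU baselines matter in CPU tests.
```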
 

BFG10K

Lifer
Aug 14, 2000
TakeNoPrisoners said:
The CPU is just as important as the GPU. END OF STORY!
There can be no argument against this graph; it is full of truth.
It boldly flies in the face of people who believe a cheap CPU should be paired with large GPU horsepower.

LOL. One graph doesn't make it the norm; I can show dozens of other graphs where the GPU makes the difference while the CPUs flatline.

Also, a GTX 590 is an SLI system, and they're only running 1920x1200 with no AA. So yeah, in that case more CPU would help.
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
In your Witcher link they use all different CPUs - way too many variables to relate one result to another except by a stretch. And The Witcher is going to be an Intel-leaning game. :p
Let me also give you Civ V and World in Conflict... there is always going to be a game that is faster on one platform than on another.

However, the Q6600 in your link is just about as fast as a stock Phenom II 965 X4. And a Q6600 @ 3.0 GHz is just a little slower than a Phenom II 965 @ 3.4 GHz. The OP can clock the Q6600 to 3.5 GHz to match the 965.
http://www.techspot.com/review/305-starcraft2-performance/page13.html

My Q9550 was slightly faster clock-for-clock than my Phenom II 955 X4, although the Phenom II did win in some games; look at the price difference.

Now look at your other links:

A GTX 480 with the test system's i7-920 @ 3.7 GHz gets 58 fps.
http://www.techspot.com/review/403-dirt-3-performance/page4.html

The Phenom II 980 BE @ 3.7 GHz stock gets 96 fps with a GTX 590, while the i7-920 @ 3.8 GHz gets 108 fps with the same GPU. Your bottleneck is only 12 fps - less if you clocked the Phenom II to 3.8 GHz like the i7.

http://www.techspot.com/review/403-dirt-3-performance/page7.html
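
Spelling out the arithmetic on those numbers (the same figures as above, just expressed as a relative gap):

```python
# The Dirt 3 figures above, expressed as a relative deficit.
phenom_fps = 96   # Phenom II 980 BE @ 3.7 GHz with a GTX 590
i7_fps = 108      # i7-920 @ 3.8 GHz with the same GPU

deficit = 100 * (i7_fps - phenom_fps) / i7_fps
print(f"Phenom II trails the i7 by {deficit:.0f}%")  # ~11%
# 12 fps sounds like a lot; an ~11% gap at 96+ fps is hard to feel in play.
```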

I'd say a Q6600 at 3.5 GHz might be the minimum CPU to pair with next gen. The OP can certainly try it, and if he feels bottlenecked, upgrade then.

Remember that $1000 of TriFire performance today is still more powerful than $450 of next-gen performance in a single high-end card. It doesn't look so ridiculous as long as it performs well in the games the OP will play then (in the future).

RussianSensation said:
The OP has a Q6600 @ 3.5 GHz.

Click the Witcher 2 link I posted in Post #24. In their testing, the Q9550 got about half the frame rate of the i7 at 1920x1080 when paired with GTX 580 SLI: the Q9550 @ 2.8 GHz managed only 45 fps, while the i7 @ 4.2 GHz got 99 fps.

There is just no way you can reasonably expect to pair a Q6600 @ 3.5 GHz with 6990 Tri-Fire. Even you wouldn't do something like that ;) If you know you are going to lose 30-40% of the performance, if not more at 1080p, you'd be better off just getting 6950 CF and getting identical performance.
 

RussianSensation

Elite Member
Sep 5, 2003
apoppin said:
The OP can certainly try it, and if he feels bottlenecked, upgrade then.

I agree with that. Nothing wrong with trying it out for himself. If he feels the performance is sufficient, he can hold out until Haswell/Bulldozer II. Plus, he can just grab a single 28nm card to start, rather than going all out with 2 from the beginning. :D

I certainly agree that the GPU is the most limiting component in a modern PC system. But I am also hesitant to recommend a $1000 GPU setup with a CPU from 2007. If you can afford a $1000 GPU setup, you should be able to sell your current parts and upgrade on the cheap to at least a 2500K, especially if you have a Microcenter/Fry's near you.
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
RussianSensation said:
I agree with that. Nothing wrong with trying it out for himself. If he feels the performance is sufficient, he can hold out until Haswell/Bulldozer II. Plus, he can just grab a single 28nm card to start, rather than going all out with 2 from the beginning. :D

I certainly agree that the GPU is the most limiting component in a modern PC system. But I am also hesitant to recommend a $1000 GPU setup with a CPU from 2007. If you can afford a $1000 GPU setup, you should be able to sell your current parts and upgrade on the cheap to at least a 2500K, especially if you have a Microcenter/Fry's near you.
OK! We have this come up on our own forum, and people are itching to upgrade without knowing what Bulldozer performance will be like or what the next-gen GPUs will bring. My advice over there is the same as here (except I get to liberally quote my own recent research on the Phenom II 980 BE vs. Core i7-920). If they are getting decent framerates now, just wait a bit - prices will definitely come down on at least one platform.

I am also hesitant to recommend spending $1000 on graphics now - the best bang for the buck seems to be HD 6950 CF or GTX 570 SLI. That will satisfy most people even at 2560x1600, and it will pair nicely with a Penryn or a Phenom II.
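
Bang for the buck is easy to put rough numbers on, too. A quick sketch (every price and fps figure here is an assumed placeholder, not a measured result):

```python
# Crude price/performance comparison with assumed placeholder numbers --
# substitute real prices and benchmark fps before drawing conclusions.
setups = {
    "HD 6950 CF":       {"price": 500,  "fps": 90},
    "GTX 570 SLI":      {"price": 640,  "fps": 95},
    "$1000 triple-GPU": {"price": 1000, "fps": 110},
}

for name, s in setups.items():
    print(f"{name}: {100 * s['fps'] / s['price']:.1f} fps per $100")
# If the $1000 setup adds ~20% fps for twice the money, the mid-range
# CF/SLI options win on value -- which is the point above.
```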
 

MangoX

Senior member
Feb 13, 2001
apoppin said:
I am also hesitant to recommend spending $1000 on graphics now - the best bang for the buck seems to be HD 6950 CF or GTX 570 SLI. That will satisfy most people even at 2560x1600, and it will pair nicely with a Penryn or a Phenom II.

For the majority of games :sneaky:
 

VirtualLarry

No Lifer
Aug 25, 2001
Let's get the OP back in here. Is he specifically talking about his Q6600 and 1920x1080? If so, it is slower than Penryn and Phenom II 980 BE in gaming. If he is talking generally, the later C2Qs at 3.5 GHz do fine in modern games with 6970-TriFire and GTX 580 SLI and will also do fine with next gen single-GPU.

I was thinking of my BIL's machine. It's a Q9550 @ 3.5 GHz (it could possibly be tweaked higher; I stopped at 3.5 when I was building it), currently with CF HD 6870 cards. I was wondering if it would be worth dropping in a pair of whatever AMD 28nm GPUs come out in a few months, or whether I should scrap the whole thing and go for an entire platform upgrade then.
He games at 1920x1080 and has 8GB of RAM on Win7 64-bit.
 

ed29a

Senior member
Mar 15, 2011
If CPU speed didn't matter, why does Intel have > 80% market share? You think most people sit there and count SuperPi times with a stopwatch?

You answered your own question there. For the vast majority of Intel's and AMD's customers who buy desktop PCs, speed means nothing, nada, zilch. They are not going to benchmark, or worry about whether their memory is properly set to dual channel, and whatnot. For the very tiny subset of customers known as enthusiasts, it matters. However, we are a dying breed, and our already insignificant importance shrinks more and more day after day. When mom and pop buy a PC from Dell, they don't care whether it has the best CPU; they want to be able to do their stuff.

Now for the 80% market share, I'll answer with questions. Windows has 90%-ish market share; does that make it the best operating system? Britney Spears sells millions of albums; does that make her better than (insert favorite indie band here)?

Popularity doesn't mean better quality, and it doesn't mean the popularity was achieved through better quality - especially not for Intel, which has had to deal with anti-competitive accusations quite a few times.

And finally, CPU speed doesn't matter that much: even most enthusiasts can get by fine with a 2-3 year old quad core. The difference between 55 fps and the 61 fps that a 10% theoretical gain gets you is exactly zero to your eyes. And the rest of the time your PC is running idle anyway.
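
The 55-vs-61 fps point is even clearer in frame times (the same numbers, just converted):

```python
# Convert the 55 vs 61 fps comparison above into per-frame times.
fps_old, fps_new = 55.0, 61.0   # ~10% theoretical gain

ms_old = 1000.0 / fps_old       # ~18.2 ms per frame
ms_new = 1000.0 / fps_new       # ~16.4 ms per frame

print(f"{ms_old:.1f} ms -> {ms_new:.1f} ms per frame "
      f"({ms_old - ms_new:.1f} ms difference)")
# Under 2 ms per frame -- exactly the "zero for your eyes" argument.
```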