C2Q clockspeed to beat an i7


aigomorla

CPU, Cases & Cooling Mod; PC Gaming Mod; Elite Member
Super Moderator
Sep 28, 2005
21,117
3,641
126
Originally posted by: Idontcare
Originally posted by: apoppin
Who cares about gaming at 640x480 ?
:confused:

(in my best Eric Cartman voice impression)

True...but think of the POWAR!

OMG I KILLED KENNY

Actually EVA did. :p

Haruhi will kill EVA.. well, i got something special coming.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ArchAngel777
Originally posted by: apoppin
Originally posted by: ArchAngel777
Originally posted by: Provenone
Originally posted by: IntelUser2000
Originally posted by: RaptureMe
Originally posted by: aigomorla
Well, to answer your topic thread:

If we look at 100% utilization,

the i7 is 55-75% faster than a C2Q when utilizing all 8 threads.

Since when?? I have both and the Core i7 isn't even close to 50% faster at anything..
If anything, the Q9650 is 50% faster than the Core i7.
The only thing I noticed that the Core i7 does better is multitasking without freezing one program while opening another..
Can you show me some benches or something to prove me wrong??
I can show you screenies where apps I use, say Nero and WinRAR, are up to 7 mins faster using the Q9650 vs the Core i7 920..

What's this?? Why is this BS being spouted here??

Core i7 is equal to or faster than Core 2 Quad. Sure, your "experience" might be different, but the benchmarks show otherwise.

Core i7 without HT is faster in both single-threaded and multi-threaded apps. Multi-threaded even more so, because it has more bandwidth and it scales better across four cores than Core 2 Quad does.

Single thread should be slightly faster on PURE single-threaded apps with HT off, which makes your point even more invalid.

See, until you are comparing two fair systems where the only variables are the CPU/chipset/memory, your point is moot.

Possibly your cheap G.Skill SSD is making it worse.


The OP is asking about gaming, not necessarily heavy multitasking. Remember this, please ^.^

Please remember that he was responding to another user, not the OP, and he is quite correct. If you are throwing out a term like the bolded one, then you had better be prepared to defend it. Lost Planet (incredible engine, BTW) is a good example of what Nehalem is capable of. Lost Planet CPU benchmark: 120 versus 188 for the 3.2GHz parts. That is a 57% performance increase. (HardOCP failed at math, it looks like.) This mirrors the result of 105 versus 165, which again translates into a 57% performance increase clock for clock.
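For anyone who wants to double-check that math, here is a minimal back-of-the-envelope sketch in Python. It is purely illustrative, the helper name is made up, and the only inputs are the FPS figures quoted above:

Code:
# Relative speedup implied by the quoted Lost Planet CPU benchmark numbers.
# Nothing here is newly measured; the FPS values are the ones cited in the post above.

def speedup_pct(baseline_fps: float, faster_fps: float) -> float:
    """Percent performance increase of the faster part over the baseline."""
    return (faster_fps / baseline_fps - 1.0) * 100.0

print(f"{speedup_pct(120, 188):.1f}%")  # 3.2GHz parts: ~56.7%
print(f"{speedup_pct(105, 165):.1f}%")  # second clock-for-clock result: ~57.1%

Both round to roughly 57%, which is the clock-for-clock point being made.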

If an application can utilize 8+ threads, the Core i7 will dominate. Now, that was just one example of a 'game' engine that does it. But run Cinebench, POV-Ray, encoding, etc., and you see the Core i7 just dominate the Core 2.

However, with that said, none of these CPUs appears to bottleneck graphics performance. It is possible that a Tri-SLI GTX 285 setup would start to see performance level off, but that is by no means a 'bottleneck'. Besides, crank up AA, buy a pair of 3D glasses, etc...

The Core 2 Duos and Quads are no slouches. There is very little reason for most people to upgrade from a Core 2 Quad to a Core i7; I think that is a given. But if you are currently on an X2, or something even older, I'd skip the Core 2 family and head straight for Core i7, especially since you *should* be able to drop in a Gulftown (32nm 1366 variant) sometime next year.

Who cares about gaming at 640x480 ?
:confused:

i7 better do better than that to get me to buy one

and don't show me FC2 with Tri-SLI either :p

No one cares about gaming at 640x480. But that isn't the point. The reasoning behind it is quite clear: to show the power of the CPU. Let's face it, with a single-GPU setup, you don't even need much better than a Core 2 Duo @ 3.2GHz. Heck, most games will even play just fine on an X2 and will be very enjoyable - but that isn't the point.

If I were to only buy a CPU to match my GPU's power right now, I'd have to upgrade them BOTH when it came time to pop in my new GT300. If you buy a faster CPU, you can get by with a few GPU upgrades before you need to overhaul the main system.

Example - I purchased a Q6600 and run it at 3.6GHz. Overkill for the 8800GTS 640MB I had, way overkill. Then I plopped in a nice 8800GTS 512MB. Still had plenty of CPU power. I then plopped in a 280GTX, which is still the bottleneck in my system. I have gotten three GPU upgrades out of my system so far and I expect to get a 4th before I need to do a system overhaul. If I had simply adopted the mindset you are proposing, I would have had to overhaul my system at least once by now, or pop in another CPU.
Who cares about "the power of the CPU" if the one you have is good enough?

i have 4870 CrossfireX-3 and i am testing it now with the q9550s at 4GHz. Clearly, where multi-core benefits the game, it has an advantage over my e8600 at 4.25GHz. But those games are really few and far between and i had to go "looking" for some - like FarCry2, WiC, SC and LP. And none of them [so far] really show much practical difference, as i game at 19x12 with DX10 settings and filtering as maxed as i can get away with.

Now, i have tested with the 4870 and GTX280 a bit more extensively, and the q9550 is a bit of a bottleneck at 2.83GHz, especially with the GTX280. However, get it up to 3.4GHz and the frame rates increase, mostly at the minimums. Going up *further* to 3.6 and 4.0GHz makes very little difference, and going from 4 to 4.25GHz on my e8600 is pretty useless for any practical difference

now, for the less than 1% of us that game at 25x16 and with Tri-SLI, an i7 might make a bit of a difference .. but i haven't gone there yet. i am still exploring the Penryn difference between quad and dual core .. and you *can* easily get away with a fast e8600 at 3.6 or 3.8GHz, or even an X2, depending on what game you play

clearly as time goes on, more and more games will take advantage of multi-core .. but not yet; not for normal resolutions and single GPUs


i don't think your GTX280 is a bottleneck for your Q6600 at 3.6Ghz .. it is a good match up

 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: apoppin

i don't think your GTX280 is a bottleneck for your Q6600 at 3.6Ghz .. it is a good match up


Don't agree here. But we will see. I am willing to bet that the next nVidia GPU will grant me a substantial boost in FPS at 1680x1050 with 4X/16XAF and TSAA. I will be sure to run those tests when the time comes. If it is a true bottleneck, I won't see a performance increase.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ArchAngel777
Originally posted by: apoppin

i don't think your GTX280 is a bottleneck for your Q6600 at 3.6Ghz .. it is a good match up


Don't agree here. But we will see. I am willing to bet that the next nVidia GPU will grant me a substantial boost in FPS at 1680x1050 with 4X/16XAF and TSAA. I will be sure to run those tests when the time comes. If it is a true bottleneck, I won't see a performance increase.
well, the "next Nvidia GPU" is here already and it is called the GTX295; i am sure it is at least 50% faster than your old GTX is now; stick it in
. . . and don't think we mean "bottleneck" the same way :p

a true bottleneck will kill performance

However, stick in that GTX295 right now .. you WILL get better FPS, but maybe not so much at the minimum; you will be able to max details and filtering, but you cannot really improve the bottom FPS very much as it will be waiting on your CPU

. . . and the GTX295 certainly will NOT be as fast as it would be if you stuck an i7 in instead of your old q6600

Your GTX is - right now - very slightly held back by your Q6600 .. your CPU is a bottleneck in some situations, in some games, and at some resolutions
- you would do "better" if you went with an i7 or a faster quad - right now

why wait ?

:D


 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: apoppin
Originally posted by: ArchAngel777
Originally posted by: apoppin

i don't think your GTX280 is a bottleneck for your Q6600 at 3.6Ghz .. it is a good match up


Don't agree here. But we will see. I am willing to bet that the next nVidia GPU will grant me a substantial boost in FPS at 1680x1050 with 4X/16XAF and TSAA. I will be sure to run those tests when the time comes. If it is a true bottleneck, I won't see a performance increase.
well, the "next Nvidia GPU" is here already and it is called the GTX295; i am sure it is at least 50% faster than your old GTX is now; stick it in
. . . and don't think we mean "bottleneck" the same way :p

a true bottleneck will kill performance

However, stick in that GTX295 right now .. you WILL get better FPS, but maybe not so much at the minimum; you will be able to max details and filtering, but you cannot really improve the bottom FPS very much as it will be waiting on your CPU

. . . and the GTX295 certainly will NOT be as fast as it would be if you stuck an i7 in instead of your old q6600

Your GTX is - right now - very slightly held back by your Q6600 .. your CPU is a bottleneck in some situations, in some games, and at some resolutions
- you would do "better" if you went with an i7 or a faster quad - right now

why wait ?

:D

The way I see it, if we take my current setup, which is a Q6600 @ 3.6GHz and a 280GTX, we have the following scenarios:

A) Single Threaded Games = CPU Bound.
B) Dual Threaded Games = Perfect Match.
C) Quad Threaded Games = GPU Bound.

Now, keep in mind that very few games 'currently' utilize more than two threads. So right now, my system is a 'perfect' match, or a 'good' match as you had mentioned, but if I were to keep my current setup, eventually I will be seriously GPU bound as game engines start to utilize more threads. So what we have 'right now', 'this minute', is a good match, but in the future I will be GPU bound again. That is what is so great about this multi-core tech - the true power has yet to be tapped into for gaming.

I still don't agree that the Q6600 is holding me back, but I also know you have done a lot of testing. As you noted, we use the word bottleneck differently. My opinion is that if I drop in a 295GTX and my frames per second increase by 40%, there was a bottleneck on my previous GPU. If I am capable of getting a 60% increase, but only get a 40% increase due to my CPU holding me back, I still have a substantial upgrade. Is it worth a system overhaul for that extra 20%? That depends on who you are, I guess.
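To put the 40% vs 60% point (and the CPU-bound/GPU-bound scenarios above) in concrete terms, here is a rough toy model in Python. The frame times are invented for illustration and the fps() helper is hypothetical; the only idea it encodes is that the frame rate is set by whichever of the CPU or GPU takes longer per frame:

Code:
# Toy bottleneck model with invented frame times (illustration only, not benchmarks):
# each frame waits on whichever of the CPU or GPU needs more time.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower of CPU and GPU sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

baseline = fps(cpu_ms=14.0, gpu_ms=20.0)   # current GPU is the limit: 50 FPS
new_gpu  = fps(cpu_ms=14.0, gpu_ms=12.5)   # faster GPU, CPU now caps it: ~71 FPS
uncapped = fps(cpu_ms=10.0, gpu_ms=12.5)   # faster GPU plus a faster CPU: 80 FPS

print(f"observed gain:  {new_gpu / baseline - 1:.0%}")   # ~43%
print(f"potential gain: {uncapped / baseline - 1:.0%}")  # 60%

In that toy version, dropping in the faster card still buys a roughly 40% gain even though the CPU leaves another ~20 points on the table, which is the trade-off described above.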

I also don't agree that it helps much with minimum frame rates; I have tested this myself and found it unsubstantiated in my own personal testing. I believe Keys was one of the first to say that a faster CPU really helped the minimum frame rates, and for a while it made sense, but my testing showed that to be false. It appears that in most situations, the minimum frame rate is the result of a choking GPU: a scene in the game that has abnormally high polygon counts or shader requirements.

As always, I reserve the right to remain unconvinced, but if I can be shown more than a few isolated examples with the same platform and GPU, I'd be willing to look into the situation again.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ArchAngel777
Originally posted by: apoppin
Originally posted by: ArchAngel777
Originally posted by: apoppin

i don't think your GTX280 is a bottleneck for your Q6600 at 3.6Ghz .. it is a good match up


Don't agree here. But we will see. I am willing to bet that the next nVidia GPU will grant me a substantial boost in FPS at 1680x1050 with 4X/16XAF and TSAA. I will be sure to run those tests when the time comes. If it is a true bottleneck, I won't see a performance increase.
well, the "next Nvidia GPU" is here already and it is called the GTX295; i am sure it is at least 50% faster than your old GTX is now; stick it in
. . . and don't think we mean "bottleneck" the same way :p

a true bottleneck will kill performance

However, stick in that GTX295 right now .. you WILL get better FPS, but maybe not so much at the minimum; you will be able to max details and filtering, but you cannot really improve the bottom FPS very much as it will be waiting on your CPU

. . . and the GTX295 certainly will NOT be as fast as it would be if you stuck an i7 in instead of your old q6600

Your GTX is - right now - very slightly held back by your Q6600 .. your CPU is a bottleneck in some situations, in some games, and at some resolutions
- you would do "better" if you went with an i7 or a faster quad - right now

why wait ?

:D

The way I see it, if we take my current setup, which is a Q6600 @ 3.6GHz and a 280GTX, we have the following scenarios:

A) Single Threaded Games = CPU Bound.
B) Dual Threaded Games = Perfect Match.
C) Quad Threaded Games = GPU Bound.

Now, keep in mind that very few games 'currently' utilize more than two threads. So right now, my system is a 'perfect' match, or a 'good' match as you had mentioned, but if I were to keep my current setup, eventually I will be seriously GPU bound as game engines start to utilize more threads. So what we have 'right now', 'this minute', is a good match, but in the future I will be GPU bound again. That is what is so great about this multi-core tech - the true power has yet to be tapped into for gaming.

I still don't agree that the Q6600 is holding me back, but I also know you have done a lot of testing. As you noted, we use the word bottleneck differently. My opinion is that if I drop in a 295GTX and my frames per second increase by 40%, there was a bottleneck on my previous GPU. If I am capable of getting a 60% increase, but only get a 40% increase due to my CPU holding me back, I still have a substantial upgrade. Is it worth a system overhaul for that extra 20%? That depends on who you are, I guess.

I also don't agree that it helps much with minimum frame rates; I have tested this myself and found it unsubstantiated in my own personal testing. I believe Keys was one of the first to say that a faster CPU really helped the minimum frame rates, and for a while it made sense, but my testing showed that to be false. It appears that in most situations, the minimum frame rate is the result of a choking GPU: a scene in the game that has abnormally high polygon counts or shader requirements.

As always, I reserve the right to remain unconvinced, but if I can be shown more than a few isolated examples with the same platform and GPU, I'd be willing to look into the situation again.

Well, AA, i see you changed your stance and are now agreeing with me that a GTX280 IS a good match for a q6600 at 3.6GHz
- it is an excellent match!!

we are also agreeing on our definition of bottlenecking now .. yes, it depends on your definition - i tend to look at the "ultimate" improvement possible vs the compromises one makes

the minimums are often affected by a slow CPU; it will depend on your game, the resolution, and the level of detail .. so it can be shown many ways

RIGHT NOW .. i am extensively testing q9550s vs e8600 .. using 2 GPUs - 4870 and GTX280 - and 8 games
- my e8600 is at 3.99GHz and 4.25GHz
- my q9550 is tested at 2.83GHz, 3.4GHz, 3.6GHz and 3.99GHz

these results will be published this weekend; my NEXT series of benchmarking [going on now] is a heck of a lot more ambitious as it uses 16 games and adds X2 and Crossfire X3 into the mix ... that should be next weekend, perhaps the weekend after if it gets even more detailed
 

ochadd

Senior member
May 27, 2004
408
0
76
Great replies, thanks. My take on it all is that my 3.4GHz e6750 is a good match for my GTX 260. Might as well wait to upgrade the CPU until I'm ready to do the GPU as well?