Is my 2.7 GHz C2D going to starve my new GTX 260?

mazeroth

Golden Member
Jan 31, 2006
1,821
2
81
I've spent hours and hours trying to get my E4300 back to the 3.3 GHz she used to run at, but I can't, for the life of me, bring her back. I've tried running only 2 sticks of RAM, different RAM, etc. No go. The best I've been able to get her to run lately is 2.7 GHz. Anywho...

I've been running an HD3870 for a while now and just snagged a new GTX 260 for under $150 shipped on eBay with the 30% back. I was wondering how much more CPU horsepower I should be throwing at this new card. Is there a point at which enough CPU is enough? Say, 3.5 GHz, for example? Or can you never give your GPU enough?

I play on a 24" Soyo at 1920x1200.

Thanks.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
Nah don't worry about it, esp at 19x12. Crysis will be GPU limited, as will most other games at that resolution. For the ones that aren't, you'll still be getting 60+ fps.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
I would say yes. I experienced significant performance differences going from a 2.8 GHz Core 2 to a 3.4 GHz one with my HD4870. UT3 went from having very noticeable fps drops all the time to almost never going below 60.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Afaik you'll be fine. Sure, you might get a few fps more here and there, but you should be at 30 fps or more pretty much all the time.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: dguy6789
I would say yes. I experienced significant performance differences going from a 2.8 GHz Core 2 to a 3.4 GHz one with my HD4870. UT3 went from having very noticeable fps drops all the time to almost never going below 60.

The part about fps drops is the key reason to have a faster processor for playing games.

Take Crysis / Crysis Warhead for example, since it's more GPU-limited than any other game:
Crysis

And here is UT3:
UT3

The gray bars represent what dguy6789 experienced in UT3. The quoted part about fps drops is where the CPU will make the greatest difference.

Originally posted by: mazeroth
Or can you never give your GPU enough?

:music:
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
How in HEAVEN'S name can avg FPS remain EXACTLY the same when the minimum FPS goes up? I call BS on those graphs lol.

As for the UT3 graph, minimum FPS is 54 with a 2.0 GHz C2D!!! His 2.7 GHz C2D will give roughly 60 fps MINIMUM. Chances are he has a 60 Hz refresh LCD, so he's not even going to NOTICE the difference between a 2.7 GHz and a 4.0 GHz C2D. So I call even more bullshit. One minor difference is his E4300 having less cache than an E8500. I wouldn't upgrade the CPU for a while ...
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Well, my logic suggests that until the game is GPU-limited, it must be CPU-limited. A faster CPU, as shown in the graphs, will feed the GPU more data to process.

Whether or not you actually notice the difference in performance is the question....
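
A rough way to picture that (just an illustrative sketch with made-up frame times, not numbers from any of the benchmarks linked above): treat each frame as taking whichever is longer, the CPU's work or the GPU's work. If the GPU is the slower stage on most frames, a faster CPU barely moves the average, but it shaves down the occasional CPU-bound spike, which is exactly the minimum-FPS effect in those graphs.

```python
# Illustrative only: hypothetical frame times, not measured data.
# Model: each frame is gated by whichever stage (CPU or GPU) is slower.

gpu_ms = [20.0] * 100                          # GPU needs ~20 ms every frame at 19x12
cpu_27_ms = [12.0] * 95 + [28.0] * 5           # 2.7 GHz CPU: occasional heavy frames
cpu_35_ms = [t * 2.7 / 3.5 for t in cpu_27_ms] # same work, hypothetically at 3.5 GHz

def fps_stats(cpu, gpu):
    frame_ms = [max(c, g) for c, g in zip(cpu, gpu)]  # slower stage sets the pace
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
    min_fps = 1000.0 / max(frame_ms)
    return avg_fps, min_fps

print("2.7 GHz: avg %.1f fps, min %.1f fps" % fps_stats(cpu_27_ms, gpu_ms))
print("3.5 GHz: avg %.1f fps, min %.1f fps" % fps_stats(cpu_35_ms, gpu_ms))
# -> average barely changes (~49 fps either way), minimum jumps from ~36 to ~46
```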
 

alkalinetaupehat

Senior member
Mar 3, 2008
839
0
0
When I first got my new rig, it was Christmas in July. It was (still is) awesome, etc. Played around with overclocking a bit (3.2 GHz-ish) and noticed a good 10 FPS jump in my average FPS in Lost Planet, which I used as part of my stability testing. Does this mean I'm CPU-limited? I don't really think so; I'd reason that I have a nice balance, and my framerates would probably go up about the same amount if I went to a GTX 280. The thing with minimum framerates going up while the average stays the same is that the maximum framerates also likely rose, which isn't shown. So, if the amount of data I can process goes up significantly, then I'm less likely to slow down as often from too much work at once.
Corsair has a really neat series of guides with a remarkable lack of bias or sales-pitch tone to them; one in particular shows that jumping from 2 GB to 4 GB of RAM causes the minimum framerates to rise significantly. There we see an increase in the amount of data that can be held in memory at once having a similar effect to OC'ing a CPU to a higher clock; in both cases they act as a kind of buffer against the worst frames. Weird connection, but I believe it proves my point.
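
If you want to see what those bar charts hide, you can pull min / avg / max out of a frame-time log yourself. A rough sketch, assuming a simple two-column dump (frame number, cumulative time in ms) like the frametimes file FRAPS writes; adjust the parsing for whatever your benchmark tool actually logs:

```python
import sys

def fps_summary(path):
    # Expects two comma-separated columns: frame number, cumulative time in ms
    # (FRAPS-style frametimes dump is the assumption here; other tools may
    # log per-frame durations instead, so tweak the parsing accordingly).
    times = []
    for line in open(path):
        parts = line.split(",")
        if len(parts) >= 2 and parts[0].strip().isdigit():
            times.append(float(parts[1]))
    durations = [b - a for a, b in zip(times, times[1:])]  # per-frame ms
    avg_fps = 1000.0 * len(durations) / (times[-1] - times[0])
    min_fps = 1000.0 / max(durations)   # the single slowest frame
    max_fps = 1000.0 / min(durations)   # the single fastest frame
    return min_fps, avg_fps, max_fps

if __name__ == "__main__":
    lo, avg, hi = fps_summary(sys.argv[1])
    print("min %.1f / avg %.1f / max %.1f fps" % (lo, avg, hi))
```

With the full list of per-frame times you can also see whether the average stays flat simply because the handful of slow frames is too small a share of the run to move it.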
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Originally posted by: alkalinetaupehat

Corsair has a really neat series of guides which have a remarkable lack of bias or sales-pitch tone to them


lol well if they're trying to sell you something and show you how good it is and how much you need it, then there's obviously gonna be quite a bit of bias. There's simply too much conflict of interest to present their evidence without any "bias".