With a GeForce 8800GT, what CPU is "fast enough"?

wildside50

Member
Nov 19, 2007
43
0
0
Okay, it always seems like you can only find two separate pieces of information:

1) CPU scaling in gaming at 1024x768
2) GPU scaling at lots of resolutions with one specific CPU

The information I need is the combination of the two. I don't care if the Core 2 E6750 runs a game at 170 fps at 1024x768 and the Athlon X2 4000+ only does 120 fps. I need to know, with a very high-end GPU (a la 8800 GTX, GT, GTS, or Radeon 3870) at high resolutions (1600x1200+), what CPU is "fast enough" that only the GPU is the limiting factor. If anyone has seen some recent reviews or articles like that, please gimme the link.
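For what it's worth, the mental model I'm working from: the frame rate you actually see is roughly the lower of what the CPU can feed and what the GPU can render at a given resolution. A minimal sketch of that idea -- every number below is made up purely for illustration, not a benchmark result:

```python
# Rough model: effective fps is capped by whichever of the CPU or GPU is
# slower. All figures are hypothetical, just to show why a low-res CPU
# chart alone can't answer the question.

# fps the CPU could sustain if the GPU were infinitely fast
# (roughly resolution-independent)
cpu_fps_cap = {"slow dual core": 70, "fast dual core": 140}

# fps the GPU could sustain if the CPU were infinitely fast, per resolution
gpu_fps_cap = {"1024x768": 170, "1600x1200": 75}

for cpu, cpu_cap in cpu_fps_cap.items():
    for res, gpu_cap in gpu_fps_cap.items():
        effective = min(cpu_cap, gpu_cap)
        limiter = "CPU" if cpu_cap < gpu_cap else "GPU"
        print(f"{cpu} at {res}: ~{effective} fps ({limiter}-limited)")
```

If the GPU's ceiling at 1600x1200 is already below what my current CPU can feed, a faster CPU buys me almost nothing at that resolution -- and that crossover point is exactly the number I'm trying to pin down.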

I currently have an Opteron 170 (2.0 GHz dual core), and I have an 8800 GT on the way. I need to know if upgrading my CPU is worth it (since I'll need a new mobo, RAM, and the CPU -- that's a hefty $500+ upgrade if all I get is an extra 50 fps at 1024x768 and 5 fps at 1600x1200...)
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
What games do you play? An X2 at 2.0 GHz is going to limit you in a lot of scenarios, especially at a medium-high resolution like 1600x1200.
 

tshen83

Member
Apr 8, 2001
176
0
0
wildside50: your dual-core Opteron is decent. If you want anything faster for your 8800GT, I recommend a Q6600 overclocked to 3GHz on a Gigabyte GA-P35-DS3L motherboard.
 

wildside50

Member
Nov 19, 2007
43
0
0
Originally posted by: tshen83
wildside50: your dual-core Opteron is decent. If you want anything faster for your 8800GT, I recommend a Q6600 overclocked to 3GHz on a Gigabyte GA-P35-DS3L motherboard.

Right. That all sounds well and good. But you're talking about a $250 CPU, a $100 mobo, and I'll need some RAM.

I'm going from an X1950 Pro to an 8800GT. I feel comfortable saying that upgrade will give me around an 80% performance increase for 250 dollars. I want to know if I can get even a 30-50% performance increase out of 500 dollars' worth of CPU upgrade, or whether it will be more like 10-20%. That just isn't worth it; I'd have been better served getting a 600 dollar 8800 Ultra in that case.
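Put in crude cost-per-benefit terms -- the percentage gains here are my own guesses for the sake of the argument, not measurements:

```python
# Crude dollars-per-percent comparison. The "gain_pct" values are guesses
# used only to frame the decision, not benchmark results.
upgrades = {
    "X1950 Pro -> 8800GT":        {"cost": 250, "gain_pct": 80},
    "CPU/mobo/RAM (optimistic)":  {"cost": 500, "gain_pct": 40},
    "CPU/mobo/RAM (pessimistic)": {"cost": 500, "gain_pct": 15},
}
for name, u in upgrades.items():
    print(f"{name}: ${u['cost'] / u['gain_pct']:.2f} per % of extra performance")
```

Unless the CPU side lands near the optimistic end, the GPU is clearly the better place for the money.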
 

wildside50

Member
Nov 19, 2007
43
0
0
Originally posted by: tshen83
wildside50: your dual-core Opteron is decent. If you want anything faster for your 8800GT, I recommend a Q6600 overclocked to 3GHz on a Gigabyte GA-P35-DS3L motherboard.

I play all kinds of games. I realise the CPU will have more of an impact on games like Company of Heroes and World in Conflict. I think I am more concerned with FPS games as far as frame rate improvement goes. It would be nice for World in Conflict to run nicely at 1600x1200 too, but I'm okay with running that game at 1024x768.

I just remember in the old days of video card reviews, there was always a page reserved for CPU scaling. They just don't seem to do that anymore. I miss it.
 

xColdSteelx

Member
Nov 22, 2007
25
0
0
I have an almost year-old AMD X2 4200+ (2.2GHz) with 2GB of PC6400 RAM and an Nvidia 7950. In COD4 I get about 35-45 fps on average in most maps. During heavy graphics it drops to about 25 fps. This is with texture settings on normal, no AA, and things like "glow" off. It's unreal how sluggish it can be.

Since I'm not about to buy a new mobo/RAM, I'll just upgrade the CPU to a 6000+ or 6400+. By January or February I'll get an 8800GT. I hope that will help.

 
Oct 4, 2004
10,515
6
81
What you are looking for can be found in an article like this one - unfortunately, it's terribly outdated. They compared four Core-architecture chips from four different product lines, all running @ 266x9 = 2.4GHz:

E6xxx = 4MB L2 | E4xxx = 2MB L2 | E2xxx = 1MB L2 | Celeron 440 = 512KB L2

An update to that article, with a Q6600 (also 2.4GHz) thrown in, would be nice, along with some current games like Crysis, CoD4, Unreal Tournament 3, and World in Conflict. Throw in a CPU scaling chart or two and that would be awesome.

This is a CPU scaling chart for Company of Heroes from a different article on the same site, run at three different resolutions (on an 8800GTX) - your chip is represented by the X2 4000+. The fastest CPU in this chart is an X2 6000+. We're talking a 70 vs. 76 fps difference (about 8%) at 1920x1200 for a full-gigahertz (50%) boost in clock speed.

Other games may have different performance curves, but generally speaking, I wouldn't worry too much. Then again, I have been away from PC gaming for a while and don't know how heavily current games tax the CPU.
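To put a rough number on how little of that clock increase shows up at 1920x1200 (just back-of-the-envelope arithmetic on the chart figures above; the stock clocks are approximate):

```python
# Back-of-the-envelope scaling efficiency from the CoH numbers above.
fps_slow, fps_fast = 70, 76    # X2 4000+ vs. X2 6000+ at 1920x1200 (from the chart)
clk_slow, clk_fast = 2.0, 3.0  # GHz, approximate stock clocks

fps_gain = fps_fast / fps_slow - 1   # ~8.6%
clk_gain = clk_fast / clk_slow - 1   # 50%
print(f"fps gain {fps_gain:.0%} / clock gain {clk_gain:.0%} "
      f"= scaling efficiency {fps_gain / clk_gain:.0%}")
```

In other words, at that resolution only a small fraction of the extra clock speed actually turns into frames.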
 

wildside50

Member
Nov 19, 2007
43
0
0
Originally posted by: xColdSteelx
I have an almost year-old AMD X2 4200+ (2.2GHz) with 2GB of PC6400 RAM and an Nvidia 7950. In COD4 I get about 35-45 fps on average in most maps. During heavy graphics it drops to about 25 fps. This is with texture settings on normal, no AA, and things like "glow" off. It's unreal how sluggish it can be.

Since I'm not about to buy a new mobo/RAM, I'll just upgrade the CPU to a 6000+ or 6400+. By January or February I'll get an 8800GT. I hope that will help.

Really?? That kinda surprises me. With my 2.0 GHz Opteron and an X1950 Pro, I run Call of Duty 4 at 1360x1024 with 4x AA, 16x anisotropic filtering, and everything set to on or high, and I get a fine frame rate. I wouldn't play the game if it wasn't around 60 fps. It dips from time to time in heavily smoky areas, but most of the time it is solid. The 7950 GT and X1950 Pro aren't that far from each other in performance... something sounds kind of wrong there. Do you have the most recent drivers?
 

wildside50

Member
Nov 19, 2007
43
0
0
Originally posted by: vanvock
Why not just overclock what you've got & see how it plays?

Yeah, I tried when I first set up this system (a year ago), and I did not have good luck. I have the best overclocking stepping for an Opty 170, a DFI mobo with about 1,672,247 overclocking options, and RAM that had a reputation for overclocking well.

I got to 2.1 GHz before one of my RAM sticks died. I do believe it was faulty the whole time, though, because it would never run at the listed timings (2-3-2-5); I would have to relax the timings to ridiculous values to overclock at all (like 3-5-4-8 2T). I did, however, just realize I never changed the HTT multiplier, which might have been part of my problem, but my Opty runs hot at stock speed (50-52C).

Apropos of your post, I just started trying to overclock again last night. I presently have a table fan blowing on my open case, running the Opty at 2.2 GHz, and the core temp went down to 40C, the HDD went from 57C to 26C (!!!), and everything else in the system went down a good 5-10C. Needless to say, I need better case cooling. Maybe I'll get a new, big, many-fanned case so I don't have to keep a fan pointed at it.

Nevertheless, I'm going to see what speed I can hit WITH the fan on, just to be on the safe side. I'll let ya know. Still, a 3GHz Opteron is no match for a 3GHz E6850. It's closer, but it's still not the same as an all-new rig.
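For my own notes, the arithmetic I should have been doing all along -- this is just how I understand socket 939 overclocking, so treat the numbers as an example rather than a recipe:

```python
# Socket 939 overclocking arithmetic (my understanding; example values only).
# CPU speed = HTT reference clock x CPU multiplier.
# HT link   = HTT reference clock x HT (LDT) multiplier, which should stay
#             at or below its stock 1000 MHz when the reference goes up.

cpu_multiplier = 10    # Opteron 170 stock multiplier
target_ref = 250       # MHz, a hypothetical overclocked reference clock (stock is 200)

print(f"CPU speed: {target_ref * cpu_multiplier / 1000:.1f} GHz")

# Drop the HT multiplier until the link is back at or below 1000 MHz.
for ht_mult in (5, 4, 3):
    link = target_ref * ht_mult
    status = "OK" if link <= 1000 else "too high"
    print(f"HT multiplier {ht_mult}x -> link at {link} MHz ({status})")
```

The memory divider is the other part of the equation on 939, but I'll keep that conservative this time.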
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Quad cores have minimal impact in games today.

Stick with your X2 and overclock. You are not missing much.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: wildside50
Originally posted by: xColdSteelx
I have an almost year-old AMD X2 4200+ (2.2GHz) with 2GB of PC6400 RAM and an Nvidia 7950. In COD4 I get about 35-45 fps on average in most maps. During heavy graphics it drops to about 25 fps. This is with texture settings on normal, no AA, and things like "glow" off. It's unreal how sluggish it can be.

Since I'm not about to buy a new mobo/RAM, I'll just upgrade the CPU to a 6000+ or 6400+. By January or February I'll get an 8800GT. I hope that will help.

Really?? That kinda surprises me. With my 2.0 GHz Opteron and an X1950 Pro, I run Call of Duty 4 at 1360x1024 with 4x AA, 16x anisotropic filtering, and everything set to on or high, and I get a fine frame rate. I wouldn't play the game if it wasn't around 60 fps. It dips from time to time in heavily smoky areas, but most of the time it is solid. The 7950 GT and X1950 Pro aren't that far from each other in performance... something sounds kind of wrong there. Do you have the most recent drivers?

Come again? Your Radeon X1950 Pro spits out 27 fps @ 1280x1024 with 4xAA and 16xAF. Either that, or you did not set it to maximum in-game settings.

http://www.firingsquad.com/har...nce_preview/page12.asp
 

wildside50

Member
Nov 19, 2007
43
0
0
Originally posted by: Azn
Originally posted by: wildside50
Originally posted by: xColdSteelx
I have an almost year-old AMD X2 4200+ (2.2GHz) with 2GB of PC6400 RAM and an Nvidia 7950. In COD4 I get about 35-45 fps on average in most maps. During heavy graphics it drops to about 25 fps. This is with texture settings on normal, no AA, and things like "glow" off. It's unreal how sluggish it can be.

Since I'm not about to buy a new mobo/RAM, I'll just upgrade the CPU to a 6000+ or 6400+. By January or February I'll get an 8800GT. I hope that will help.

Really?? That kinda surprises me. With my 2.0 GHz Opteron and an X1950 Pro, I run Call of Duty 4 at 1360x1024 with 4x AA, 16x anisotropic filtering, and everything set to on or high, and I get a fine frame rate. I wouldn't play the game if it wasn't around 60 fps. It dips from time to time in heavily smoky areas, but most of the time it is solid. The 7950 GT and X1950 Pro aren't that far from each other in performance... something sounds kind of wrong there. Do you have the most recent drivers?

Come again? Your Radeon X1950 Pro spits out 27 fps @ 1280x1024 with 4xAA and 16xAF. Either that, or you did not set it to maximum in-game settings.

http://www.firingsquad.com/har...nce_preview/page12.asp

Well, while I appreciate you letting me know what frame rate I am getting when I play a game on my system -- you're just flat wrong.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: wildside50
Originally posted by: Azn
Originally posted by: wildside50
Originally posted by: xColdSteelx
I have an almost year-old AMD X2 4200+ (2.2GHz) with 2GB of PC6400 RAM and an Nvidia 7950. In COD4 I get about 35-45 fps on average in most maps. During heavy graphics it drops to about 25 fps. This is with texture settings on normal, no AA, and things like "glow" off. It's unreal how sluggish it can be.

Since I'm not about to buy a new mobo/RAM, I'll just upgrade the CPU to a 6000+ or 6400+. By January or February I'll get an 8800GT. I hope that will help.

Really?? That kinda surprises me. With my 2.0 GHz Opteron and an X1950 Pro, I run Call of Duty 4 at 1360x1024 with 4x AA, 16x anisotropic filtering, and everything set to on or high, and I get a fine frame rate. I wouldn't play the game if it wasn't around 60 fps. It dips from time to time in heavily smoky areas, but most of the time it is solid. The 7950 GT and X1950 Pro aren't that far from each other in performance... something sounds kind of wrong there. Do you have the most recent drivers?

Come again? Your Radeon X1950 Pro spits out 27 fps @ 1280x1024 with 4xAA and 16xAF. Either that, or you did not set it to maximum in-game settings.

http://www.firingsquad.com/har...nce_preview/page12.asp

Well, while I appreciate you letting me know what frame rate I am getting when I play a game on my system -- you're just flat wrong.

Your X1950 Pro must be on steroids. Either that, or you are on drugs.



 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
To make a long story short: as you increase the resolution, the load on the video card increases. So if you play games at 1600x1200 or higher, you will see a huge difference by upgrading to an 8800GT, even with your current CPU.
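Raw pixel counts give a feel for how fast that load grows. Frame rate won't track pixels exactly (shaders, memory bandwidth, AA and VRAM all factor in), so take this as a rough illustration only:

```python
# Pixel counts relative to 1024x768, as a rough proxy for per-frame GPU work.
# Real fps scaling depends on shaders, bandwidth, AA and VRAM, so this only
# illustrates why high resolutions shift the bottleneck onto the GPU.
resolutions = [(1024, 768), (1280, 1024), (1600, 1200), (1920, 1200)]
base = 1024 * 768
for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:>9,} pixels ({pixels / base:.2f}x the 1024x768 load)")
```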
 

wildside50

Member
Nov 19, 2007
43
0
0
Originally posted by: Azn
Originally posted by: wildside50
Originally posted by: xColdSteelx
I have an almost year-old AMD X2 4200+ (2.2GHz) with 2GB of PC6400 RAM and an Nvidia 7950. In COD4 I get about 35-45 fps on average in most maps. During heavy graphics it drops to about 25 fps. This is with texture settings on normal, no AA, and things like "glow" off. It's unreal how sluggish it can be.

Since I'm not about to buy a new mobo/RAM, I'll just upgrade the CPU to a 6000+ or 6400+. By January or February I'll get an 8800GT. I hope that will help.

Really?? That kinda surprises me. With my 2.0 GHz Opteron and an X1950 Pro, I run Call of Duty 4 at 1360x1024 with 4x AA, 16x anisotropic filtering, and everything set to on or high, and I get a fine frame rate. I wouldn't play the game if it wasn't around 60 fps. It dips from time to time in heavily smoky areas, but most of the time it is solid. The 7950 GT and X1950 Pro aren't that far from each other in performance... something sounds kind of wrong there. Do you have the most recent drivers?

Come again? Your Radeon X1950 Pro spits out 27 fps @ 1280x1024 with 4xAA and 16xAF. Either that, or you did not set it to maximum in-game settings.

http://www.firingsquad.com/har...nce_preview/page12.asp

Well, I just ran FRAPS during COD4. I stand corrected: 35-45 fps. I guess I have forgotten what 60 fps looks like. In a funny way, that led me to the answer to my question, though. They did their tests with a QX6800, so I can compare what I get with my X1950 to what they got with an X1950 and get an idea of the CPU impact. At least on that card. I will just have to extrapolate for faster cards and hope it holds.
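Roughly, the sanity check I have in mind (these numbers are placeholders until I measure properly at the exact settings the review used):

```python
# Compare my fps against the review's fps with the same GPU (X1950 Pro) but a
# much faster CPU (QX6800), at matching settings. Placeholder numbers only.
review_fps = 27   # FiringSquad's QX6800 result from the linked chart
my_fps = 25       # what I'd measure with FRAPS at those exact settings

cpu_penalty = 1 - my_fps / review_fps
print(f"My Opteron gives up roughly {cpu_penalty:.0%} vs. the QX6800 on this card.")
```

If that gap stays small, the game is basically GPU-limited on my system, and the 8800GT should mostly get to stretch its legs -- though, like I said, I'm extrapolating and hoping it holds.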
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: wildside50
Originally posted by: Azn
Originally posted by: wildside50
Originally posted by: xColdSteelx
I have an almost year-old AMD X2 4200+ (2.2GHz) with 2GB of PC6400 RAM and an Nvidia 7950. In COD4 I get about 35-45 fps on average in most maps. During heavy graphics it drops to about 25 fps. This is with texture settings on normal, no AA, and things like "glow" off. It's unreal how sluggish it can be.

Since I'm not about to buy a new mobo/RAM, I'll just upgrade the CPU to a 6000+ or 6400+. By January or February I'll get an 8800GT. I hope that will help.

Really?? That kinda surprises me. With my 2.0 GHz Opteron and an X1950 Pro, I run Call of Duty 4 at 1360x1024 with 4x AA, 16x anisotropic filtering, and everything set to on or high, and I get a fine frame rate. I wouldn't play the game if it wasn't around 60 fps. It dips from time to time in heavily smoky areas, but most of the time it is solid. The 7950 GT and X1950 Pro aren't that far from each other in performance... something sounds kind of wrong there. Do you have the most recent drivers?

Come again? Your Radeon X1950 Pro spits out 27 fps @ 1280x1024 with 4xAA and 16xAF. Either that, or you did not set it to maximum in-game settings.

http://www.firingsquad.com/har...nce_preview/page12.asp

Well, I just ran FRAPS during COD4. I stand corrected: 35-45 fps. I guess I have forgotten what 60 fps looks like. In a funny way, that led me to the answer to my question, though. They did their tests with a QX6800, so I can compare what I get with my X1950 to what they got with an X1950 and get an idea of the CPU impact. At least on that card. I will just have to extrapolate for faster cards and hope it holds.


The CPU would have virtually no impact on a game like COD4 at 1280x1024 with 4xAA and 16xAF on your X1950 Pro. You are video-card limited, particularly by that 256MB of VRAM. Perhaps you might get 1 or 2 fps less with a lower-end processor.

So you are getting 35-45 fps and not 60 fps. What does that average, 40 fps? So you are saying you get 40 fps at the same resolution and settings when an AMD HD 3850 gets lower than that? Either way, you are not playing at maximum in-game settings if your X1950 Pro is getting more than 30 fps.
 

wildside50

Member
Nov 19, 2007
43
0
0
Okay -- I'll say it again:

Resolution: 1152x864
AA: 4x
AF: 16x
Everything on (except soften smoke edges)
Texture settings at High

FPS: 35-45. I played for about 5 minutes on one level. I'm sure there are levels that will have a worse frame rate, and some that will be better.

In any event, I would call it "smooth", and I think it looks good. The other guy seemed to be disappointed in the way the game was playing, and I couldn't commiserate, despite having similarly performing hardware.
 

cubeless

Diamond Member
Sep 17, 2001
4,295
1
81
Went from a 2.6GHz X2 to a 3.0GHz C2D with an 8800GTS 640 and it smoothed out the lag in a number of games @ 1680... seems that that's 'enough' CPU for most things (Crysis excluded...)...

but it seems that my 7950 doesn't use all of an X2 @ 2.75GHz @ 1440, yet is overwhelmed @ 1600...

what did make a difference was 2GB of memory -- it let the CPU run instead of page...

you will see a much bigger bump from a vid upgrade than a CPU upgrade, though...
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Check out the three charts at the bottom of this page. Note that AT did not label these charts correctly; read the sentence above the first one:

We then cranked up the resolution to 1920 x 1200, and increased the world detail slider up to 5 to give us a more realistic situation for this clock speed comparison.

This is not a comprehensive comparison, but it does show that in a GPU-bound setting (with an 8800GTX) there is very little difference between a 2.33GHz E6550 and a 3GHz E6850, and only a small drop from that level to an X2 6000+ (3GHz). If you can get your processor up into the 2.4-2.6GHz range, you are probably just fine with an 8800GT.
 

wildside50

Member
Nov 19, 2007
43
0
0
Originally posted by: Denithor
Check out the three charts at the bottom of this page. Note that AT did not label these charts correctly; read the sentence above the first one:

We then cranked up the resolution to 1920 x 1200, and increased the world detail slider up to 5 to give us a more realistic situation for this clock speed comparison.

This is not a comprehensive comparison, but it does show that in a GPU-bound setting (with an 8800GTX) there is very little difference between a 2.33GHz E6550 and a 3GHz E6850, and only a small drop from that level to an X2 6000+ (3GHz). If you can get your processor up into the 2.4-2.6GHz range, you are probably just fine with an 8800GT.

Thank you, sir. That was exactly what I was looking for. It's convincing enough that my CPU is fine. Thank you kindly.