How CPU bound would a single GTX 280 be on an E6600?

n7

Elite Member
Jan 4, 2004
21,281
4
81
It wouldn't really be with most games.

For most UE3 games, a quad is ideal though.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
E6600... mmm, I would buy a 4870 / GTX 260 ($270-300) and a Q6600 ($185) for the same price as a single GTX 280 ($450-500) if I were you. Better overall upgrade.

Actually, I see that you already have a 4870. Upgrade your CPU before you upgrade your GPU from a 4870.
 

TC91

Golden Member
Jul 9, 2007
1,164
0
0
I'll assume you're using 1920x1200 or higher resolution if you're considering a GTX 280, and at those higher resolutions I really don't think your E6600 is going to be much of a bottleneck.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: TC91
I'll assume you're using 1920x1200 or higher resolution if you're considering a GTX 280, and at those higher resolutions I really don't think your E6600 is going to be much of a bottleneck.

It's the same bottleneck regardless of resolution. I got choppy play in Mass Effect at 720x480 on an E8400 @ 3.6GHz, and I have the same choppiness at 1920x1200. Reviewers test at super low res because they are trying to eliminate the GPU and test COMPARATIVE CPU performance, and they get hundreds of FPS... but choppy gameplay is not what those tests measure.

I am talking about taking as long as 41ms to render a frame... that's about 24fps... @ 720x480... Flight Sim X and Oblivion are reportedly the same way, as well as Supreme Commander (I didn't personally test them).
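The frame-time-to-FPS conversion here is just a reciprocal; a quick sketch (Python, purely for illustration):

```python
# Frame time (in ms) and FPS are reciprocals of each other.
def frame_time_to_fps(ms):
    return 1000.0 / ms

def fps_to_frame_time(fps):
    return 1000.0 / fps

print(round(frame_time_to_fps(41), 1))  # a 41ms frame is ~24.4fps
print(round(fps_to_frame_time(60), 1))  # 60fps needs ~16.7ms per frame
```

So a single slow frame at 41ms feels like a momentary drop to the mid-20s even if the average FPS is high.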

Sure, you could raise the AA and graphics settings so that you're GPU bound at 15fps and the CPU limit is never hit, but who wants to do that? You need a minimum level of CPU for smooth play, and then the GPU determines how high a resolution and AA you can use. But the first step is getting a CPU that provides smooth gameplay (at any resolution).

I say that if you are a gamer, you should get a quad core at 3+ GHz.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Disclaimer: Taltamir is talking about ONE game. Something only HE experiences, because I get no choppy gameplay, albeit with lower settings, using an X2 3800+ @ 2.6GHz and an 8800GTS 320MB @ 1680x1050.

I'd say you're fine with an E6600. If you're unsure, OC it to 3.0GHz... But then again, I'd stick with your HD4870. And for most UE3 games a 3.0GHz DUAL CORE is just fine. Jesus christ, you guys are spreading FUD if you ask me. I've played BioShock, GoW, Mass Effect, UT3, and god knows what other Unreal Engine based games, Rainbow Six Vegas 2 for example, and my CPU performed just FINE!!! And you're telling people to invest hundreds of dollars into a new CPU, when my crappy CPU is doing just fine? Blegh. I'm on a crusade from now on, and will copy-paste this disclaimer every time you claim something like that, or tell people to get a quad at 3.0GHz for gaming.
 

AmberClad

Diamond Member
Jul 23, 2005
4,914
0
0
Originally posted by: MarcVenice
Disclaimer: Taltamir is talking about ONE game. Something only HE experiences, because I get no choppy gameplay, albeit with lower settings, using an X2 3800+ @ 2.6GHz and an 8800GTS 320MB @ 1680x1050.

I'd say you're fine with an E6600. If you're unsure, OC it to 3.0GHz... But then again, I'd stick with your HD4870. And for most UE3 games a 3.0GHz DUAL CORE is just fine. Jesus christ, you guys are spreading FUD if you ask me. I've played BioShock, GoW, Mass Effect, UT3, and god knows what other Unreal Engine based games, Rainbow Six Vegas 2 for example, and my CPU performed just FINE!!! And you're telling people to invest hundreds of dollars into a new CPU, when my crappy CPU is doing just fine? Blegh. I'm on a crusade from now on, and will copy-paste this disclaimer every time you claim something like that, or tell people to get a quad at 3.0GHz for gaming.
QFT :roll:

Also, if you guys look at the OP's sig, he's already OCed it past 3.0GHz.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Marc, it was not just me. It was me and n7 and several other guys who all got the same result. That thread had two types of posters: thread crappers who said "you are lying" (2 or 3 posters), and people who actually performed their own benchmarks (4 or 5 of them), and all of them got the same result as me, no exceptions.

Our recorded tests with programs like FRAPS, showing dips into the low 20s of FPS in some games, are just lies then, eh?
And while I only TESTED it in one game, other people there reported the same thing in other games, such as Flight Sim X, Supreme Commander, every other UE3 (Unreal Engine 3) game, and of course Oblivion.

I have an E8400 with more cache and a higher OC than he has. If it is not enough for me, it isn't enough for him...
And while I say "upgrade often, buy cheap", I prefer to upgrade once a year. And buying something that is too slow TODAY in some games will definitely not last a whole year.
Not to mention that when you're GPU limited you can always lower the resolution and get better FPS.

The dips into 23fps were there both at 1920x1200 and at 720x480; there is absolutely nothing you can do to reduce them except upgrading your CPU.

This is simply science, folks. Two weeks ago I vehemently argued that a quad core is a waste of money for a gamer and they should just get a faster dual core. Since then I have seen EVIDENCE to the contrary and changed my tune. If you choose not to believe me, do so. I am here to learn and teach, but I am not here to preach. I ask for people's benchmarks; I provide my own. If you have none to provide and work solely on FAITH, then enjoy your dual core religion. I have nothing to add to that.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: MarcVenice
Disclaimer: Taltamir is talking about ONE game. Something only HE experiences, because I get no choppy gameplay, albeit with lower settings, using an X2 3800+ @ 2.6GHz and an 8800GTS 320MB @ 1680x1050.

I'd say you're fine with an E6600. If you're unsure, OC it to 3.0GHz... But then again, I'd stick with your HD4870. And for most UE3 games a 3.0GHz DUAL CORE is just fine. Jesus christ, you guys are spreading FUD if you ask me. I've played BioShock, GoW, Mass Effect, UT3, and god knows what other Unreal Engine based games, Rainbow Six Vegas 2 for example, and my CPU performed just FINE!!! And you're telling people to invest hundreds of dollars into a new CPU, when my crappy CPU is doing just fine? Blegh. I'm on a crusade from now on, and will copy-paste this disclaimer every time you claim something like that, or tell people to get a quad at 3.0GHz for gaming.

Yup, same here. I have absolutely no problems with any UE3 engine games with a Core 2 Duo clocked @ 3.0GHz and an 8800GS. 100fps in UT3, 70fps in BioShock, and 50fps in Mass Effect. That's with 4xAA.
 

Mandin62

Member
Mar 24, 2007
157
0
0
I have an 8800 GT and an E6600 @ 3.4GHz and it does great. I game at 1440x900 most of the time for 100+ frame rates, but in most games I can do 1680x1050 with decent rates. I wouldn't worry about upgrading right now; it would simply be a waste of money. Save your money and buy something that will actually improve your performance in a couple of months or a year. I rarely hit the limits of my CPU unless playing Supreme Commander. Quads are overrated when it comes to gaming. Relax and let your wallet stay fat.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: AzN
Originally posted by: MarcVenice
Disclaimer: Taltamir is talking about ONE game. Something only HE experiences, because I get no choppy gameplay, albeit with lower settings, using an X2 3800+ @ 2.6GHz and an 8800GTS 320MB @ 1680x1050.

I'd say you're fine with an E6600. If you're unsure, OC it to 3.0GHz... But then again, I'd stick with your HD4870. And for most UE3 games a 3.0GHz DUAL CORE is just fine. Jesus christ, you guys are spreading FUD if you ask me. I've played BioShock, GoW, Mass Effect, UT3, and god knows what other Unreal Engine based games, Rainbow Six Vegas 2 for example, and my CPU performed just FINE!!! And you're telling people to invest hundreds of dollars into a new CPU, when my crappy CPU is doing just fine? Blegh. I'm on a crusade from now on, and will copy-paste this disclaimer every time you claim something like that, or tell people to get a quad at 3.0GHz for gaming.

Yup, same here. I have absolutely no problems with any UE3 engine games with a Core 2 Duo clocked @ 3.0GHz and an 8800GS. 100fps in UT3, 70fps in BioShock, and 50fps in Mass Effect. That's with 4xAA.

Yet in that thread you REFUSED to perform any tests, gave only average FPS (which matched mine, btw, same with that 50fps figure), and then finally insisted that an imaginary "overhead", neither related to CPU nor GPU (but you did not specify to what when asked), is causing it. TWO SEPARATE moderators called you out for your misconduct there, and instead of backing anything up you bickered with them.

And then you accuse me of spreading FUD? (Well, agree with an accusation against me.)
Now I am NOT letting this escalate into a flame war; as I said, I am not here to preach.
But to call me a FUD spreader like that is insulting.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
Originally posted by: AzN
Originally posted by: MarcVenice
Disclaimer: Taltamir is talking about ONE game. Something only HE experiences, because I get no choppy gameplay, albeit with lower settings, using an X2 3800+ @ 2.6GHz and an 8800GTS 320MB @ 1680x1050.

I'd say you're fine with an E6600. If you're unsure, OC it to 3.0GHz... But then again, I'd stick with your HD4870. And for most UE3 games a 3.0GHz DUAL CORE is just fine. Jesus christ, you guys are spreading FUD if you ask me. I've played BioShock, GoW, Mass Effect, UT3, and god knows what other Unreal Engine based games, Rainbow Six Vegas 2 for example, and my CPU performed just FINE!!! And you're telling people to invest hundreds of dollars into a new CPU, when my crappy CPU is doing just fine? Blegh. I'm on a crusade from now on, and will copy-paste this disclaimer every time you claim something like that, or tell people to get a quad at 3.0GHz for gaming.

Yup, same here. I have absolutely no problems with any UE3 engine games with a Core 2 Duo clocked @ 3.0GHz and an 8800GS. 100fps in UT3, 70fps in BioShock, and 50fps in Mass Effect. That's with 4xAA.

Yet in that thread you REFUSED to perform any tests, gave only average FPS (which matched mine, btw, same with that 50fps figure), and then finally insisted that an imaginary "overhead", neither related to CPU nor GPU (but you did not specify to what when asked), is causing it. TWO SEPARATE moderators called you out for your misconduct there, and instead of backing anything up you bickered with them.

And then you accuse me of spreading FUD? (Well, agree with an accusation against me.)
Now I am NOT letting this escalate into a flame war; as I said, I am not here to preach.
But to call me a FUD spreader like that is insulting.

You must be a newb PC gamer, because over the last 15-20 years of PC gaming there have been games that were programmed to have your so-called "imaginary" overhead. I could name one modern game right now that shows this behavior, but it's for you to find out on your own.

How did I refuse any tests? Weren't you the one who refused to downclock your CPU to see if your CPU is the bottleneck? Yet you insist on lowering the resolution and pointing to the CPU, which doesn't prove Mass Effect was limited by your CPU with a 4850. It's not science; it's your inability to figure out the problem at hand.

Who were these 2 moderators? I didn't call you a FUD spreader either; I think you have mistaken me for somebody else. I feel you are misinformed, and the CPU isn't the bottleneck in those situations. Perhaps a limit to a certain degree.
 

toslat

Senior member
Jul 26, 2007
216
0
76
There is more to system performance than the CPU and GPU. There are other hardware components, and software, that could cause choppiness. Even if a faster processor improves things, that does not imply that your system was performing efficiently in the first place. For all you know, you might be hiding some other bottleneck under CPU power.

Mass Effect System Requirements (trimmed):

Minimum Spec:
* 2.4+ GHz Intel or 2.0+ GHz AMD
* 1GB RAM (XP)/2GB Ram (Vista)
* NVIDIA GeForce 6 series (6800 GT or better), ATI X1300 XT or better (X1550, X1600 Pro and HD2400 are below minimum system requirements)
* Hard Drive Space - English: 12 GB; French, Italian, German: 14 GB
* 100% DirectX 9.0c compatible sound card and drivers

Recommended Spec:
* 2.6+ GHz Intel or 2.4+ GHz AMD
* 2GB RAM
* ATI X1800 XL series or higher, NVIDIA GeForce 7900 GTX or higher
* Hard Drive Space - English: 12 GB; French, Italian, German: 14 GB
* 100% DirectX 9.0c compatible sound card and drivers; 5.1 sound card recommended

P.S. The recommended spec is recommended for a reason - wanna guess why?
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
I game at 1920 x 1200 and I like to turn everything on. My 4870 is still under 30 days at Fry's, and I called them and they have BFG GTX 280's for $449 in stock. So I figured I might as well get one since it's the best single card out there, and reading about all the AFR microstutter stuff has scared me away from a 4870X2.

But I wanted to be sure my E6600 wouldn't have me cpu bound on the 280, and from the posts so far it seems like it won't?
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
I don't know about your CPU (though I doubt it, to be honest), but I just finished Mass Effect at 1920x1200 all maxed, and the game dipped ONCE to 38FPS during the entire game. I had FRAPS running the whole time, was curious about my new card :p 99% of the time it was capped at 62FPS, no matter where I was and what I did. When it went lower it was like 55-ish. So I would say your GPU is fine and you don't really need to change it. Unless you really need to spend some $$.

Mind you, I'm running a Q9450 at 3.2GHz. That's why I really don't know how your rig will fare. But your GPU is sufficient to run everything at 1920x1200.
 

unr3al

Senior member
Jun 10, 2008
214
1
81
www.link-up.co.za
I don't even have choppy framerates in Oblivion, and I've got an X1250 IGP! My CPU at 2.6GHz is completely adequate for any game that my IGP can handle (in other words, I don't hit CPU limitations).
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
This is why I think you're spreading FUD, taltamir. Yes, you might have established a sudden drop in ONE single frame, but per how many frames exactly? Not sure about this one, but quads at 3.0GHz seem to get rid of this sudden drop in a single frame. Now, I'd compare it to microstutter on multi-GPU setups: it's something that has been around for years, yet no one noticed it, no one ever came to a forum and said, jesus, I've got this weird microstutter but I don't know why it's happening. Then someone proved it was there, and all of a sudden it's a big hype. Now, your 'thing' is more or less the same, but also 10x less significant.

I've NEVER seen anyone complain about this 'stutter'. I haven't noticed it myself, on an inferior CPU (perhaps AMD CPUs are not susceptible to this problem?), and yet here you are, telling people to get a 3.0GHz quad for UE3 games. Do you know what a 3.0GHz quad core costs? It ain't cheap. To me it's like you're talking to someone who drives a Ford, a very decent car (let's just assume it is) whose only problem is a squeaky kind of noise you can't hear when you drive faster than 20mph. It's a very minor inconvenience 99% of Ford drivers might not even know about. Yet you are saying: lose the Ford, buy a much more expensive Lexus or Mercedes, to get rid of something most people won't even notice is there.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@taltamir I did some more testing in Mass Effect and I found some new things.

1) There seem to be some sort of load points for streaming data. I start to walk onto a bridge in the Citadel and feel my fps drop, but it's hard to see. After FRAPS-recording this spot 3 times, I noticed that all 3 times my fps dropped to about 20FPS. From a chart I could see not only that drop, but also how my fps spiked up and down (45-75fps) for about 24 frames.

2) The difference in fps is small to not even noticeable with only 2 cores at 1080p. At 864p I'd be CPU bound with only 2 cores, but was still getting 85% of the FPS (65fps with 4 cores vs 55fps with 2 cores).

Again, this shows that Mass Effect is somewhat CPU bound on dual cores, but not that bad at all. The game does suffer low fps during what I think is data streaming. It's clearly a bad port from the 360, and EA is known for bad ports.
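If anyone wants to reproduce this kind of dip-hunting, here's a minimal sketch (Python; it assumes you've already pulled a list of cumulative frame end times in ms out of a FRAPS frametimes log, so adjust the parsing for your FRAPS version):

```python
# Sketch: flag FPS dips in a list of cumulative frame end times (ms),
# e.g. as extracted from a FRAPS "frametimes" log.
def dips(frame_end_times_ms, fps_floor=30.0):
    """Return (frame_index, instantaneous_fps) for frames below fps_floor."""
    out = []
    for i in range(1, len(frame_end_times_ms)):
        dt = frame_end_times_ms[i] - frame_end_times_ms[i - 1]
        if dt > 0 and 1000.0 / dt < fps_floor:
            out.append((i, 1000.0 / dt))
    return out

# Example with fake data: steady ~60fps (16.7ms frames) with one 50ms hitch.
times, t = [0.0], 0.0
for i in range(1, 10):
    t += 50.0 if i == 5 else 16.7
    times.append(t)
print(dips(times))  # → [(5, 20.0)] — the single 50ms frame is a 20fps dip
```

Averages hide exactly this: nine frames averaging near 60fps still contain one frame that played at 20fps.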

@regarding this topic: yes, the CPU will bottleneck the GTX 280 when it comes to some CPU-heavy games or bad ports. Newer games might make things worse, but for now it's not that bad.
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
Hmm, so many conflicting viewpoints *boggle*

How bad would I bottleneck my GTX280 if I used it on a PCIe 1.0 mobo?
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Why don't you just lower the resolution in your games and find out where they're CPU bound? At least that way you know where you stand.

PCIe 1.0 bottlenecking would depend on how much the video card needs to access system memory. Flight Simulator is a good example of how bad it can be.
Link
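For rough context on the bandwidth at stake, a back-of-the-envelope sketch (per-direction theoretical numbers; real throughput is lower due to protocol overhead):

```python
# Rough per-direction PCIe bandwidth. Gen 1/2 use 8b/10b encoding,
# giving ~250 MB/s per lane for PCIe 1.x and ~500 MB/s for PCIe 2.0.
MB_PER_LANE = {"1.0": 250, "2.0": 500}

def bandwidth_gb_s(gen, lanes):
    return MB_PER_LANE[gen] * lanes / 1000.0

for gen in ("1.0", "2.0"):
    for lanes in (1, 4, 8, 16):
        print(f"PCIe {gen} x{lanes}: {bandwidth_gb_s(gen, lanes):.1f} GB/s")
# PCIe 1.0 x16 → 4.0 GB/s; PCIe 2.0 x16 → 8.0 GB/s
```

So a 1.0 slot at x16 still has half the bandwidth of 2.0 x16, which is why a single card usually loses little, while cut-down x4/x1 slots hurt a lot.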
 

sliderule

Member
May 13, 2007
75
0
0
Originally posted by: MarcVenice
Disclaimer, Taltamir is talking about ONE game. Something only HE experiences, because I get no choppy gameplay, albeit with lower settings, using a x2 3800+ @ 2.6ghz and a 8800gts 320mb, @ 1680*1050.

I'd say your fine with a e6600. If you're unsure, OC it to 3.0ghz ... But then again, I'd stick with your HD4870. And for most UT3 games a 3.0ghz DUALCORE is just fine. Jesus christ, you guys are spreading FUD if you ask me. I've played BioShock, GoW, Mass Effect, UT3, and god knows what other unreal engine based game, rainbow six vegas 2 for example, and my CPU performed just FINE !!! And you're telling people to invest 100's of dollars into a new cpu, when my crappy CPU is doing just fine? Blegh. I'm on a crusade from now on, and copy paste this disclaimer everytime you claim something like that, or tell ppl to get a quad at 3.0ghz for gaming.

I played through Mass Effect twice on the highest settings with no lag at 1680x1050, on a budget E4500 clocked at 3.0GHz (it would go higher, but stock cooler), with 4GB of DDR2 1066 RAM, an HD3870, and Vista.

The game crashed to desktop twice, but other than that I had no complaints. Other newer games that played flawlessly were GRID, GoW, and CoD4. The only game I've tried that is laggy is the Crysis demo... oh, and AoC, but that was an unstable beta at the time so I cut it some slack.
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
Hey OP, I am waiting for eVGA to ship me my GTX 280, and then I can help you by running some benchmarks. I have an E6600 @ 3.3GHz, 4GB RAM, and a P5B Deluxe (P965) mobo, which I believe is PCIe 1.0 (not sure if that makes a difference for a single card). I should have the card sometime next week.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: SSChevy2001
Why don't you just lower the resolution in your games and find out where they're CPU bound? At least that way you know where you stand.

PCIe 1.0 bottlenecking would depend on how much the video card needs to access system memory. Flight Simulator is a good example of how bad it can be.
Link

That's not exactly testing PCIe 1.0 vs 2.0. It's testing PCIe 1.0 @ x1, x4, x8, and x16.