CPU bound on mass effect, E8400 @ 3.6ghz


taltamir

Lifer
Mar 21, 2004
13,576
6
76
I just realized something... Mass Effect uses PhysX!
And I remember things improving when I got the 8800GTS 512 with the hacked drivers (it probably lowered max and average FPS a little to increase min FPS a lot).

I also remember that PhysX on the GPU plus 8x forced AA from the drivers caused a blue screen, and people said they had some issues with early PhysX on CUDA... (I didn't see any without forced AA or with only 2x).

The workaround was to disable PhysX in the ini files... I am going to look into disabling it and seeing what happens.
From what I hear it does improve quality. Supposedly, blowing up a crate, for example, has pieces of it flying off with PhysX, versus it just disappearing without. I will see.

EDIT: Oh wait, I just remembered, it was "the game automatically uses PhysX if it detects that it is available."

And I don't recall EVER seeing an item explode the way they describe it does with PhysX, so most likely I am getting this heavy CPU usage even with PhysX disabled. I will test it out to see if that is so.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Taltamir, why don't you just downclock your Core 2 Duo to see if it is bottlenecking you at that resolution? UT3, which is less stressful than Mass Effect, only gained 1-3 fps with an 8800GTX despite nearly a 700MHz difference in CPU clocks at that resolution. I doubt you are going to gain a whole lot with a faster CPU.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Azn
So you want to blame Anandtech's article on me now. Excuse me for not reading the article but the graph clearly says 1024x768 resolution. :roll:
No, I'm laughing at you for ignorantly linking something as evidence without even understanding what it was you were linking. I'm also laughing at you for refusing to acknowledge the linked article had nothing to do with what you were assuming, even after I had pointed out to you the caption was incorrect. If you're going to link an article as evidence you had better damn well make sure you understand what it says.

UT3 is clearly less stressful on the video card than mass effect. If a 8800gtx can get 90fps in UT3 and 9800gtx only gets 50fps obviously CPU could be a factor with UT3 but lesser of that effect with Mass effect. The graph shows cpu isn't bottlenecking at 1920x1200. Core 2 duo @ 3.0ghz is getting whole 1-3 fps faster than 2.33ghz. :roll:
And none of this is news, we knew this 10 months ago when that article was written. The 4850 is supposedly ~20% faster than those last-gen parts, so if you had a 4850 and only got 90FPS in UT3 at 1920 or 50FPS in Mass Effect with a 3GHz CPU what would you assume?

Because it's already proven your ROP isn't the limiting factor. :laugh:
Really? Where?

Look at 4870 with 16ROP breathing down on 260gtx with 28ROP. :brokenheart: :(

Just admit you were wrong and move on, no need to bring out an old thread just to prove you were wrong. :D
And none of that has to do with GT200's design. The 4870 addressed its two biggest deficiencies, TMUs (more than 2x) and SPs (almost 3x) but fell short of doubling 3870 performance because it didn't increase ROPs. It did improve the existing ROPs however, which is why we see improved AA performance.

This thread is obviously about 4850 first off. The limitations between 4850 and 280gtx is different or multi-gpu configurations. You are going to bring some card twice as fast to prove that it's CPU limited @ 1920x1200 resolution?
Actually the thread title says "CPU bound on mass effect, E8400 @ 3.6ghz" and the OP asks others to "please post any other game that you know that would be CPU bound even on 1920x1200, high settings, and an OCed E8400... " It just so happens he is using a 4850, but that would be ridiculous to limit the discussion to the 4850. Almost as ridiculous as you linking a 10 month old review on July 11, 2008 and limiting the discussion to last-gen parts.

It's relevant because up until recently I doubt anyone would've thought a 3GHz Core 2 CPU would bottleneck you at 1920, a historically GPU-bound resolution. But that is increasingly the case with the high-end and multi-GPU solutions out there today, some of which are priced at $250-300.

Second, you have brought no proof that the 4850 is being bottlenecked by the CPU in Mass Effect @ 1920x1200. :disgust:
And where did I say the 4850 was CPU bottlenecked? I said Mass Effect is a very CPU intensive game, to which you replied with some nonsense about a dual core being enough, with some link that didn't even say what you thought it said. :roll:
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: taltamir
FPS jumps between 20 and 120, usually I see a "6#"... again it is moving far too fast for me to be too clear here, but it definitely dips into the 20s, just like before. (before it jumped between 20 and 60).

If this is not a clear cut case of being CPU bound I don't know what is.. the CPU is obviously the cause of my "jitters", at 1920x1200 and at 720x480.

I am now going to check the wattage impact of speed step since it supposedly lowers performance by 3% or so... And I am gonna try to push my OC to 4ghz (which I thought was pointless before)... or trade up to a quad core and OC it past 3.5...

What are you using to record and monitor FPS? If you're using FRAPS you can just record frames to a log instead of having to constantly eye-ball it. Also, make sure you turn off frame rate smoothing, as that may be artificially limiting your FPS with a 4850.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I turned off frame rate smoothing and it improved min and max FPS. From what I read, frame rate smoothing only helps in a few rare cases, is actually detrimental in general circumstances, and should be off by default instead of on and impossible to disable without modding the ini files.
Basically it is a broken, crude version of vsync, enforced by the game engine instead of the driver.
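If anyone else wants to try it, here is a sketch of what the switch looks like, assuming Mass Effect follows the standard Unreal Engine 3 ini layout (file location, section, and default values here are from memory, so treat them as approximate rather than gospel):

; assumed location: Documents\BioWare\Mass Effect\Config\BIOEngine.ini
[Engine.GameEngine]
bSmoothFrameRate=FALSE      ; TRUE by default; this is the engine-level "vsync-like" cap
MinSmoothedFrameRate=22     ; only used while smoothing is enabled
MaxSmoothedFrameRate=62     ; effectively caps FPS in the low 60s while enabled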
I am using the in-game FPS measurement, because RivaTuner does not yet work with the 48xx series.
Thanks for reminding me about FRAPS, I completely forgot about it. I will do some runs with FRAPS when I get home to get hard numbers, finally!

BTW, interesting thing about the 4870 vs the GTX 280... the 4870 has MORE memory bandwidth and more raw SP power (1.2 TFLOPS vs 900 GFLOPS on the 280). The 280 also increased the texturing units more than the 4870 did... so that its AA performance could increase greatly (and did increase compared to the G92)...
So I am wondering if maybe chizow's claim of ROPs being invaluable has merit, since it is one of the few things in which the GTX 280 is NOT spec'd lower than the 4870, as most people will agree that the 280 is faster than the 4870.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Alright! I tested it with FRAPS; I got hard data!

The FPS barely drops at all going from 720x480 up to 1920x1200...
that is going from 345,600 pixels to 2,304,000 pixels, or 6.67 times the pixels!

Tests done with FRAPS:
1920x1200, windowed mode:
Min 39 / Max 65 / Avg 52.252

720x480, windowed mode:
Min 47 / Max 96 / Avg 67.797


Both of those exhibit the "jitter", something I normally do not see in games until they fall into the low 10s in FPS, but it is seen in Mass Effect even in the 50s. I don't know why FRAPS doesn't show it; it seems to be something other than an FPS drop, as in, something else in the engine is causing that effect rather than low FPS. (That, or I am really, really sensitive to sudden drops.)
Notice that both are on a 3.6GHz C2D E8400... CPU is a constant 100%, sometimes drops to 95%, and I have seen it go as low as 80 once... The GPU is 70-100% at 1920x1200, and at 20% at 720x480... clearly it is not being taxed at that resolution.

These are tested with frame smoothing off, which I think improved performance across the board (but I haven't verified that with FRAPS yet; that was from before, when I was using the inaccurate BioWare tool)...

An interesting thing: the game's built-in frame counter DID show what appeared to be 10s and 20s on the FPS... FRAPS never showed it going that low.

However... I think it is a flaw in how FRAPS calculates it...
Digging through the data for the 720x480 resolution, I found the following (frame number, cumulative time in ms, frame time in ms):
605 8385.076 10.98
606 8426.748 41.672
607 8441.728 14.98

This is not the only example, but it is the worst (there were many which were 40 and 39 ms)...

As you can see, frames 605 and 607 took 11 ms and 15 ms respectively to draw... while frame 606 took a whopping 41.672 ms. That is undoubtedly the source of the "jitters" I am noticing at both the high and low resolutions. Also... 1000/41.672 ≈ 24 fps... so for that instantaneous frame I had an FPS in the low 20s, just like I said I noticed in the built-in counter (which updates EVERY frame!). I will try it again with frame rate smoothing turned back on, but as far as I can tell the jitters decreased with it off...
And just to make things clear, this is at 720x480 resolution!
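If anyone wants to do the same digging without eyeballing it, here is a minimal sketch (Python, assuming a FRAPS-style log of cumulative timestamps in milliseconds like the middle column above; the variable names are just illustrative) that recomputes per-frame times and flags the stutter spikes:

# Recompute frame times and instantaneous FPS from cumulative timestamps (ms),
# flagging frames that take much longer than a 60 Hz frame (~16.7 ms).
timestamps_ms = [8385.076, 8426.748, 8441.728]  # example values from the log above

for prev, cur in zip(timestamps_ms, timestamps_ms[1:]):
    frame_time = cur - prev              # ms spent on this frame
    inst_fps = 1000.0 / frame_time       # instantaneous FPS for this frame
    spike = "  <-- stutter spike" if frame_time > 2 * 16.7 else ""
    print(f"{frame_time:7.3f} ms  ({inst_fps:5.1f} fps){spike}")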

Either way...
Conclusion: Mass Effect is CPU bound as hell on a "mere" 4850 with an E8400 C2D OC'd to 3.6GHz!
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir


Either way...
Conclusion: Mass Effect is CPU bound as hell on a "mere" 4850 with an E8400 C2D OC'd to 3.6GHz!

How can you have that conclusion when you haven't underclocked your CPU to test it?

With the UT3 engine, bigger cache or architectural differences between CPUs make a bigger difference than raw clock speed.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Mass Effect runs quite smoothly on my Opteron 165 @ 2.7GHz with my 8800GTS 320MB, even with all settings maxed out at 1920x1200 resolution. Generally my FPS runs from 30fps to 50fps... in some areas it drops into the 20s but it's rare. I'll run some FRAPS benchmarks to compare.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Azn

How can you have that conclusion when you haven't underclocked your CPU to test it?
He doesn't have to overclock his CPU, all he has to do is change the resolution and demonstrate the change in performance is less than the change in resolution, which he's done.

67.797 FPS to 52.252 FPS is a ~23% drop in performance while increasing the resolution by ~6.67 times. So when rendering over six times the pixels he's still seeing more than 75% of his previous performance.

That's a textbook CPU limitation, assuming no other factors like the game capping the framerate, for example.
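To make the arithmetic explicit, here is a tiny sketch (Python; the numbers are taltamir's figures from above, and as noted it assumes no FPS caps or other external factors) of the comparison being made:

# Compare how much the pixel count changed vs how much the frame rate changed.
# A mostly GPU-bound game would lose FPS roughly in proportion to the extra
# pixels; a much smaller loss points at the CPU as the main limiter.
low_res, high_res = (720, 480), (1920, 1200)
fps_low_res, fps_high_res = 67.797, 52.252   # average FPS at each resolution

pixel_ratio = (high_res[0] * high_res[1]) / (low_res[0] * low_res[1])   # ~6.67x
fps_drop = 1 - fps_high_res / fps_low_res                               # ~23%

print(f"{pixel_ratio:.2f}x the pixels for only a {fps_drop:.0%} FPS drop")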

With the UT3 engine, bigger cache or architectural differences between CPUs make a bigger difference than raw clock speed.
If clock speed differences don't make much difference (according to you) then why ask him to overclock his CPU?
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: chizow
Originally posted by: Azn
So you want to blame Anandtech's article on me now. Excuse me for not reading the article but the graph clearly says 1024x768 resolution. :roll:
No, I'm laughing at you for ignorantly linking something as evidence without even understanding what it was you were linking. I'm also laughing at you for refusing to acknowledge the linked article had nothing to do with what you were assuming, even after I had pointed out to you the caption was incorrect. If you're going to link an article as evidence you had better damn well make sure you understand what it says.

What was I assuming? :roll: I posted to show the OP that the UT3 engine isn't really being bottlenecked by the CPU. It obviously shows 1024x768 benches and the difference between an AMD CPU and an Intel CPU, and if you scroll down it says 1024x768 resolution, where the Anandtech editor made an error labeling the graphs. 1920x1200 is even better. I'm glad you caught it because it shows even less constraint on the CPU. :laugh:


UT3 is clearly less stressful on the video card than mass effect. If a 8800gtx can get 90fps in UT3 and 9800gtx only gets 50fps obviously CPU could be a factor with UT3 but lesser of that effect with Mass effect. The graph shows cpu isn't bottlenecking at 1920x1200. Core 2 duo @ 3.0ghz is getting whole 1-3 fps faster than 2.33ghz. :roll:
And none of this is news, we knew this 10 months ago when that article was written. The 4850 is supposedly ~20% faster than those last-gen parts, so if you had a 4850 and only got 90FPS in UT3 at 1920 or 50FPS in Mass Effect with a 3GHz CPU what would you assume?

20% faster? In raw frame rates it's about the same as an Ultra or 9800GTX. Only with AA is it faster across the board. You'd assume it's being bottlenecked by the CPU. :eek: I assume the game is getting more GPU intensive at that resolution.

Because it's already proven your ROP isn't the limiting factor. :laugh:
Really? Where?

A GTX 260 with 28 ROPs that is barely faster than a 9800GTX with 16. :laugh:


Look at 4870 with 16ROP breathing down on 260gtx with 28ROP. :brokenheart: :(

Just admit you were wrong and move on, no need to bring out an old thread just to prove you were wrong. :D
And none of that has to do with GT200's design. The 4870 addressed its two biggest deficiencies, TMUs (more than 2x) and SPs (almost 3x) but fell short of doubling 3870 performance because it didn't increase ROPs. It did improve the existing ROPs however, which is why we see improved AA performance.

See above post about 9800gtx vs GTX260. :brokenheart:

First off, the 4870 is easily neck and neck with the 3870X2 and at times downright destroys it. Didn't you say something like, when GT200 is released you were going to prove that ROPs make the biggest impact on performance? :laugh:


This thread is obviously about 4850 first off. The limitations between 4850 and 280gtx is different or multi-gpu configurations. You are going to bring some card twice as fast to prove that it's CPU limited @ 1920x1200 resolution?
Actually the thread title says "CPU bound on mass effect, E8400 @ 3.6ghz" and the OP asks others to "please post any other game that you know that would be CPU bound even on 1920x1200, high settings, and an OCed E8400... " It just so happens he is using a 4850, but that would be ridiculous to limit the discussion to the 4850. Almost as ridiculous as you linking a 10 month old review on July 11, 2008 and limiting the discussion to last-gen parts.

It's relevant because up until recently I doubt anyone would've thought a 3GHz Core 2 CPU would bottleneck you at 1920, a historically GPU-bound resolution. But that is increasingly the case with the high-end and multi-GPU solutions out there today, some of which are priced at $250-300.

What do cards that are twice as fast have to do with the limits of the OP's 4850 and it being bottlenecked at 1920x1200? Considering the 4850 has a limit of 1920x1200 in modern games, you are going to compare it to multi-card setups or $300 cards with a higher threshold? :roll:

Considering the 8800GTX is very close to the 4850 in raw performance, the article I linked has relevance to what kind of bottlenecks he is going to be getting. Not to mention it's based on the same UT3 engine, although Mass Effect has more overhead and is more GPU intensive.


Second, you have brought no proof that the 4850 is being bottlenecked by the CPU in Mass Effect @ 1920x1200. :disgust:
And where did I say the 4850 was CPU bottlenecked? I said Mass Effect is a very CPU intensive game, to which you replied with some nonsense about a dual core being enough, with some link that didn't even say what you thought it said. :roll:

In that article with the same UT3 engine, a 2.33GHz Core 2 Duo is getting as much fps as a 3.0GHz Core 2 Duo. :light:

So much for your CPU being bottlenecked with mass effect. :laugh:
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
Originally posted by: Azn

How can you have that conclusion when you haven't underclocked your CPU to test it?
He doesn't have to overclock his CPU, all he has to do is change the resolution and demonstrate the change in performance is less than the change in resolution, which he's done.

67.797 FPS to 52.252 FPS is a ~23% drop in performance while increasing the resolution by ~6.67 times. So when rendering over six times the pixels he's still seeing more than 75% of his previous performance.

That's a textbook CPU limitation, assuming no other factors like the game capping the framerate, for example.

It could also be that mass effect is much more GPU intensive and has an overhead or that Radeon 4850 doesn't scale as well when lowering resolution. Underclocking or overclocking his CPU would be the easiest way to find if he's being bottlenecked by CPU.


With the UT3 engine, bigger cache or architectural differences between CPUs make a bigger difference than raw clock speed.
If clock speed differences don't make much difference (according to you) then why ask him to overclock his CPU?

Not according to me, according to Anandtech's article. And I said underclock his CPU so he can judge it for himself.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Azn

It could also be that mass effect is much more GPU intensive
What are you talking about? It can't possibly be "GPU intensive" if a massive change in resolution (~6.67 times) isn't impacting the framerate much (less than a one-quarter performance loss).

and has an overhead or that Radeon 4850 doesn't scale as well when lowering resolution.
Well I did say assuming no other external factors.

Underclocking or overclocking his CPU would be the easiest way to find if he's being bottlenecked by CPU.
If the external factors interfere with GPU scaling then they'll also likely interfere when it comes to showing CPU clock changes.

The fact is we have no idea how he has the game configured (he's only just turned off frame smoothing), but based on the figures he's provided, the most likely explanation is that he's textbook CPU limited.

Not according to me in Anandtech's article and I said underclock his cpu so he can judge it for himself.
Huh? First you told him to underclock his CPU and then you said clock speed doesn't impact the UT3 engine much. This is a direct contradiction. What are you saying exactly?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I figured out something: I was calling them "jitters"... but actually they are textbook cases of "micro-stutter"... I had heard about single-card micro-stutter before, and now I see it. Obviously the cause of this is not the GPU but the CPU... since the example I found (11, then 41.7, then 15 ms to draw a frame; in FPS that is roughly 91, 24, and 67 instantaneous) was at 720x480, where my GPU shows utilization in the low 20% while the CPU is at 100%. Unless there is a bug or something causing it, who knows. I personally felt that the micro-stutter actually lessened when I turned off frame smoothing, but I will try it again with frame smoothing on, as it is supposed to combat exactly that.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
Originally posted by: Azn

It could also be that mass effect is much more GPU intensive
What are you talking about? It can't possibly be "GPU intensive" if a massive change in resolution (~6.67 times) isn't impacting the framerate much (less than a one-quarter performance loss).

Obviously you are guessing it's not GPU intensive, or you've never seen a game that only spits out a certain fps no matter how fast the video card or CPU is. Not all games scale this way, but some do.

and has an overhead or that Radeon 4850 doesn't scale as well when lowering resolution.
Well I did say assuming no other external factors.

I think there are some problematic issues where Mass Effect is not scaling. My 8800GS was actually faster with AA than without AA at 1440x900 resolution.


Underclocking or overclocking his CPU would be the easiest way to find if he's being bottlenecked by CPU.
If the external factors interfere with GPU scaling then they'll also likely interfere when it comes to showing CPU clock changes.

The fact is we have no idea how he has the game configured (he's only just turned off frame smoothing), but based on the figures he's provided, the most likely explanation is that he's textbook CPU limited.

Text book CPU limited @ 1920x1200 resolution where a 4850 starts being GPU limited? :roll:

http://www.tomshardware.com/re...n-hd-4850,1957-18.html


Not according to me in Anandtech's article and I said underclock his cpu so he can judge it for himself.
Huh? First you told him to underclock his CPU and then you said clock speed doesn't impact the UT3 engine much. This is a direct contradiction. What are you saying exactly?

So he can test it himself. This is the real way to test CPU bottlenecks, not lowering resolution. There's a problem with just lowering the resolution and trying to find CPU bottlenecks. Read above.

When he did lower the resolution he did get faster FPS. Sure, it didn't spit out 100 more fps, but the difference is there. If he was being bottlenecked then he should be getting the same fps at whatever resolution, but he's not.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
So I am wondering if maybe chizow's claim of ROPs being invaluable has merit, since it is one of the few things in which the GTX 280 is NOT spec'd lower than the 4870, as most people will agree that the 280 is faster than the 4870.

But the GTX 260 isn't really faster than the 4870. They are about even, while the 4870 has fewer ROPs.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
What the heck does the 260 have to do with my comparison? The 280 has LESS memory bandwidth and LESS shader power than the 4870, yet it outperforms it... there needs to be a reason.
The 260 has even less of both than the 280, and it gets outperformed, so there is nothing fishy there. Why are you even bringing up something so completely unrelated?

Also, the Tom's Hardware tests seem to match my own... except they:
1. Forgot to turn off vsync (60fps cap).
2. Show how things are with AA/AF forced in the driver, despite that typically giving atrocious performance because the game is not designed for it...

Sure, I can set it to force 8x edge detect aka 24x AA in the driver and watch the frame rate tank due to GPU limitations, but that is completely unrelated to what I was testing or saying here.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: taltamir
What the heck does the 260 have to do with my comparison? The 280 has LESS memory bandwidth and LESS shader power than the 4870, yet it outperforms it... there needs to be a reason.
...

Actually the GTX280 has 140GB/s bandwidth, the 4870 has 115GB/s bandwidth.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Ah, I see then. For some reason I remembered it having 110 GB/s vs the 115 GB/s of the 4870.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Taltamir, I get the same feeling of micro-stutter with this game, but in my case I know it's not CPU related. Personally, I think it's just a bad port, and that this has to do with the way it streams data. I would much rather Microsoft had ported this game than EA. Here are my numbers to prove I'm having the same problem.

13.473
12.961
14.07
25.865
7.146
9.4
13.989
12.548

or

13.884
13.606
28.725
6.962
8.992
13.121
13.598

Edit - this was taken with framerate smoothing off.
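For what it's worth, a quick sketch (Python; this treats the numbers above as per-frame times in milliseconds, which is an assumption on my part) of why an apparently decent average can still feel jerky:

# Average FPS plus a crude micro-stutter indicator: the worst ratio between
# adjacent frame times. A high average means little if one frame takes
# several times as long as its neighbours.
frame_times_ms = [13.884, 13.606, 28.725, 6.962, 8.992, 13.121, 13.598]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_jump = max(b / a for a, b in zip(frame_times_ms, frame_times_ms[1:]))

print(f"average: {avg_fps:.1f} fps, worst adjacent frame-time jump: {worst_jump:.1f}x")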
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Azn
A bunch of gibberish

Rofl, it's obvious you have no clue what you're talking about. It's quite clear at this point you don't own Mass Effect, don't know how performance is impacted in games in general, and have already been proven wrong on so many counts trying to substantiate this. It's amazing how you will argue completely different things to suit whatever you think you're arguing, because in reality you don't really know yourself. But at least I'm not asking you to compare apples to apples this time LMAO.

In any case, the OP and others have already shown CPU bottlenecking with mid-range single-GPU solutions and fast dual-core CPUs, so there's really no reason to continue deciphering your gibberish.

For anyone else looking to upgrade from a single card to multi-GPU or a faster single GPU, CPU bottlenecking is certainly something to be aware of. When looking at reviews with 3GHz CPUs it's quite obvious there's considerable bottlenecking going on. Take a look at FPS scaling between resolutions as mentioned here, but also keep a close eye on maximum FPS averages among all high-end/multi-GPU parts. This is most obvious when you see, say, 4850 CF and 4870 CF capping out at the same FPS. Even a single GTX 280 may approach this limit, along with slower NV solutions like the GX2, even at resolutions up to 1920.

Then compare to some of the 4GHz reviews out there and look for the solutions you expected were most bottlenecked, like 4870 CF or GTX 280, and you should see them continue to scale and differentiate themselves from the slower parts where previously they were CPU bound. Unfortunately most review sites do not use highly OC'd CPUs in their tests, which makes sense since you can't get a retail CPU anywhere close to 4GHz without heavy OC'ing. But at the same time, CPU and GPU bottlenecks can't be exposed without analyzing the results from high-end solutions, which is why it took this last round of GPUs to really show CPU bottlenecking was occurring even at traditionally GPU-bound resolutions like 1920.
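As a toy illustration of that "capping out at the same FPS" pattern (Python; the numbers below are made up for the example, not taken from any review):

# When several GPU configs of very different strength all top out at roughly
# the same FPS, that shared ceiling is likely the CPU rather than the GPUs.
results = {
    "4850":     72,
    "4850 CF":  98,
    "4870 CF":  99,   # barely ahead of 4850 CF despite much more GPU power
    "GTX 280":  97,
}

fastest = max(results.values())
capped = [gpu for gpu, fps in results.items() if fps >= 0.95 * fastest]
if len(capped) > 1:
    print("Likely CPU-bound; these configs hit the same ceiling:", capped)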
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Azn

Obviously you are guessing it's not GPU intensive, or you've never seen a game that only spits out a certain fps no matter how fast the video card or CPU is. Not all games scale this way, but some do.
Well like I said, assuming no external factors his results demonstrate a CPU limitation.

Text book CPU limited @ 1920x1200 resolution where a 4850 starts being GPU limited?

http://www.tomshardware.com/re...n-hd-4850,1957-18.html
All of the figures on that page are clearly influenced by a cap of ~60 FPS or thereabouts.

Now look at taltamir's max and average scores for 720x480: 96 FPS and 67.797 FPS respectively.

Clearly the cap isn't affecting him.

If he was being bottlenecked then he should be getting the same fps at whatever resolution, but he's not.
No, this is false. The scores don't have to be exactly the same, just disproportionate to the change in settings.

Rendering 6.67 times the amount of pixels for a performance loss of less than 25% tells me the GPU is not the primary bottleneck here.

Seldom is a rendering system 100% bottlenecked in only one place but rather there are multiple bottlenecks with one in particular having the most impact.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
This is very confusing for a newbie like me.

On one hand people are saying that the CPU is limiting the 4850, but on the other hand a 4870 will still get better performance on the same CPU.

So if I need to decide between spending more money on a faster CPU or on a faster graphics card (considering my interest is gaming only), it's better to spend my money on 2x 4850 and pair them with a crappier/cheaper/slower CPU, because even if the CPU is bottlenecking them, the overall frame rate is higher with 2x 4850 than with a faster CPU and a 4870 of the same combined price. Right?

Basically the conclusion is that for games, money-wise, you get a much better performance/price ratio from GPUs than from CPUs?

Or is the only conclusion that a faster CPU would get more out of a 4850 in this game? And what is the practical conclusion? That it is still better to upgrade the GPU to get more performance than to spend the same on the CPU?

This reminds me of when I got a K7 that was slower at everything than the P4 but was dirt cheap, so for the same money I would get much better FPS on a K7 platform than on a P4.

Or am I just wrong?

 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@gaiahunter

It's not confusing. There will be times in this game when it is GPU limited and times when it is CPU limited. So if you put in a faster GPU you'll still see a higher avg score, but it still doesn't help when the scenes are CPU limited.

If you go 2x 4850 your max fps will go up, but your min fps will still be about the same. Min FPS is what you need to focus on, not max FPS.

Before going 2x 4850 I would recommend at least a 3.2GHz quad core or higher. If you can't afford both, I would go with one 4850 and the CPU.

Always min FPS before max. Hopefully more review sites will focus on posting the min fps, as it determines how smooth gameplay really is.
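To be concrete about what a "min FPS" number usually means, here is a minimal sketch (Python, with made-up toy data; the function name is just illustrative) that buckets a frame-time log into one-second windows and reports the worst one:

# Count frames rendered in each one-second window of a log of cumulative
# timestamps (ms); the worst window is the "min FPS" that determines how
# smooth gameplay feels, regardless of the average.
def fps_per_second(timestamps_ms):
    buckets = {}
    for t in timestamps_ms:
        sec = int(t // 1000)
        buckets[sec] = buckets.get(sec, 0) + 1
    return buckets

# toy data: 60 frames in the first second, 24 in the second, 55 in the third
timestamps = [i * (1000 / 60) for i in range(60)] \
           + [1000 + i * (1000 / 24) for i in range(24)] \
           + [2000 + i * (1000 / 55) for i in range(55)]

per_sec = fps_per_second(timestamps)
print("min FPS:", min(per_sec.values()), "| avg FPS:", round(sum(per_sec.values()) / len(per_sec), 1))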
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: GaiaHunter
This is very confusing for a newbie like me...

I don't think there's a hard-and-fast answer for you. It comes down to the system you have now (i.e. what socket/platform, psu, etc), what games you like to play, and what resolution and settings you play them at.

As you can imagine, all that adds up to a complex equation.

In the past, articles have shown that, when choosing between a CPU or GPU upgrade, it is often best to go with a GPU upgrade, regardless of the amount of "bottlenecking" that slower CPU puts on the card. The benchmarks in the article showed a faster CPU (depending on the game and settings) adding some performance gains, but a faster GPU adding A LOT of performance gains.

That may not still be the case with these new cards. Hopefully, some review site will test these GPUs with a *wide range* of CPUs across several resolutions in many games and give us a good picture of what we're dealing with.

In the meantime though, tell us what your current setup is, what games you play (or would like to play soon), and what resolution/settings you play at. Also, what budget are you looking to spend? All of that info will help us make some suggestions.