CPU bound in Mass Effect, E8400 @ 3.6GHz


AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K

No, this is false. The scores don't have to be exactly the same, just disproportionate to the change in settings.

Rendering 6.67 times the amount of pixels for a performance loss of less than 25% tells me the GPU is not the primary bottleneck here.

Seldom is a rendering system 100% bottlenecked in only one place but rather there are multiple bottlenecks with one in particular having the most impact.

Not necessarily CPU bound. It could just be overhead in the game.

When AnandTech did their article, CPU clock speed wasn't generating real performance gains. The only things that mattered were architecture and cache.
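For reference, BFG10K's 6.67x figure is just the pixel-count ratio of the two test resolutions. A quick sanity check, assuming the pair was 1920x1200 vs 720x480 (the combination that yields that ratio):

```
# Pixel-count ratio between the two test resolutions; 1920x1200 vs 720x480
# is an assumption, chosen because it is the pair that yields ~6.67x.
hi_res = 1920 * 1200   # 2,304,000 pixels
lo_res = 720 * 480     #   345,600 pixels
print(f"{hi_res / lo_res:.2f}x the pixels")  # 6.67x, for a <25% fps drop
```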

 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
What the heck does the 260 have to do with my comparison? The 280 has LESS memory bandwidth and LESS shader power than the 4870, yet it outperforms it... there needs to be a reason.
The 260 has even less of both than the 280, and it gets outperformed, so there is nothing fishy here. Why are you even bringing up something so completely unrelated?

Also, the Tom's Hardware tests seem to match my own... except they:
1. Forgot to turn off vsync (60fps cap).
2. Show how things are when forcing AA/AF in the driver, despite that typically giving atrocious performance due to the game not being designed for it...

Sure, I can set it to force 8x edge detect (aka 24x AA) in the driver and watch the frame rate tank due to GPU limitations, but that is completely unrelated to what I was testing or saying here.

How does the 280 have less bandwidth than the 4870? :laugh:

GTX280 = 141.7 GB/s
GTX260 = 111.9 GB/s
HD4870 = 115.2 GB/s
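Those figures fall straight out of effective memory clock x bus width. A minimal sketch, with effective clocks and bus widths assumed from the cards' published specs:

```
# Memory bandwidth = effective memory clock (transfers/s) x bus width (bytes).
# Clocks and bus widths below are assumed from the published specs.
cards = {
    "GTX 280": (2214e6, 512),  # GDDR3, 512-bit
    "GTX 260": (1998e6, 448),  # GDDR3, 448-bit
    "HD 4870": (3600e6, 256),  # GDDR5, 256-bit
}
for name, (eff_clock, bus_bits) in cards.items():
    print(f"{name}: {eff_clock * bus_bits / 8 / 1e9:.1f} GB/s")  # 141.7 / 111.9 / 115.2
```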

The 4870 might have powerful shaders, but the majority of the games we have today weren't shader bound to begin with.

I only compared it to the 260 because it really is similar in bandwidth and has more ROPs than the 4870. Yet the 4870 easily keeps pace with the GTX 260.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: chizow
Some NVDA fanboi who has an inferiority complex towards AZN :laugh:

I think you are just another version of Wreckage. You stopped listening to your senses or reason a long time ago.

I have Mass Effect and benchmarked with AA on and off a few times. My theories have always been correct as far as I'm concerned, and they've been proven multiple times when tested, even though I get flamed by uneducated guys like you. That's the life of a person who can see a bit more than others. :)
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: GaiaHunter
This is very confusing for a newbie like me.

On one hand people are saying that the CPU is limiting the 4850, but on the other hand a 4870 will still get a better performance on the same CPU.

So if I need to decide between spending more money on a faster CPU or on a faster graphics card (considering my interest is gaming only), it's better to spend my money on 2x 4850 and pair it with a crappier/cheaper/slower CPU, because even if the CPU is bottlenecking it, the overall frame rate is higher with 2x 4850 than with a faster CPU and a 4870 of the same combined price. Right?

Basically the conclusion is that, for games, money-wise, you get a much better performance/price ratio from GPUs than from CPUs?

Or is the only conclusion that a faster CPU would get more out of a 4850 in this game? And what is the practical conclusion? That it is still better to upgrade the GPU to get more performance than to spend the same on the CPU?

This reminds me of when I got a K7 that was slower on everything than the P4 but was dirt cheap, so for the same money I would get a much better FPS on a K7 platform than on a P4.

Or am I just wrong?

Take it from someone who has been a PC gamer for nearly 20 years.

Get the fastest GPU you can afford. Forget this nonsense about being bottlenecked by the CPU. The only real bottleneck is when you have a P4 or something from 4 years ago. As long as you have a modern dual core you are safe for gaming. Quad cores are slowly being utilized, but the benefit is mostly nonexistent.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
We have to put a few things in perspective here. CPU bound or not (I'm not convinced, tbh), you simply want the best performance. A faster CPU 'might' give 100 fps instead of 80 fps, but do you care? Your LCD is probably limited to 60 fps anyway. A faster GPU might give 80 fps instead of 40 fps though, which is a significant and noticeable improvement. So even if the world screams that your HD4850 is CPU limited with an E8400 at 3.6GHz, it still doesn't mean anything. You have to find the right combination of GPU and CPU. I'd say for ANY single video card, an E7200 is going to give you all the performance you need for a while.

I still wonder how taltamir is going to explain higher FPS with an HD4870 or GTX280 running with the same CPU. How CPU limited are those cards if they show improvements with the same CPU which you say is bottlenecking an HD4850?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: MarcVenice
We have to put a few things in perspective here. CPU bound or not (I'm not convinced, tbh), you simply want the best performance. A faster CPU 'might' give 100 fps instead of 80 fps, but do you care? Your LCD is probably limited to 60 fps anyway. A faster GPU might give 80 fps instead of 40 fps though, which is a significant and noticeable improvement. So even if the world screams that your HD4850 is CPU limited with an E8400 at 3.6GHz, it still doesn't mean anything. You have to find the right combination of GPU and CPU. I'd say for ANY single video card, an E7200 is going to give you all the performance you need for a while.

I still wonder how taltamir is going to explain higher FPS with an HD4870 or GTX280 running with the same CPU. How CPU limited are those cards if they show improvements with the same CPU which you say is bottlenecking an HD4850?

@ 720x480 resolution the video card is at 20% GPU usage... it is absolutely limited by the CPU...
Yet the game does not feel smooth.
80 to 100 fps? My MAX fps went from 65 to 96... my average FPS went from 52 to 67, and my per-second min FPS went from a measured 39 to 47... but the REAL min, the instantaneous one, was a pathetic 41ms per frame, which is about 24 FPS at that instant. (The per-second min FPS averages together all the frames rendered within one second, so it hides single slow frames...)

There is nothing to explain here, really. The GPU was completely out of the equation at that resolution. And yet I still get micro-stutter on a single card.
The key is that we are discussing under 60 fps here... in a monitor-bound game I would have a min, average, and max FPS above 60.
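To make the per-second vs. per-frame distinction concrete, here's a sketch with illustrative numbers (not the actual captured data):

```
# Per-second "min FPS" vs. the instantaneous rate of the slowest frame,
# computed from per-frame render times in ms (illustrative numbers only).
frame_times_ms = [24.4] * 40 + [41.0] + [20.0] * 7  # ~1 s containing a 41 ms spike

elapsed_s = sum(frame_times_ms) / 1000.0
per_second_fps = len(frame_times_ms) / elapsed_s    # what a FRAPS-style log reports
worst_frame_fps = 1000.0 / max(frame_times_ms)      # FPS implied by the slowest frame

print(f"per-second FPS: {per_second_fps:.0f}")       # ~41 -- looks acceptable
print(f"slowest frame:  {worst_frame_fps:.1f} fps")  # ~24.4 -- the hitch you feel
```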


@GaiaHunter

What do I recommend? My CPU recommendation just changed from an E8400 for everyone to a Q6600 OCed to at least 3GHz (SAME PRICE); if you are willing to pay a little extra, get a Yorkfield instead (2x Wolfdale cores), and you should still OC it... OCing has become necessary for performance because CPU improvement has been utterly pathetic from Intel and nonexistent from AMD for the past 2 years.
If you can't OC, then stick with the E8400, since it will give you better performance in most games.

For the GPU, pair it with at least a 4850. Feel free to go higher, since games like Mass Effect are not the norm YET! But they certainly exist...
But don't expect the CPU to be out of the picture.

Actually, the main purpose of this thread is to find out whether these games ARE the norm but nobody noticed... I wish to test other games here and see if they suffer from similar symptoms or not. But AFAIK it is not the norm.

CPU + GPU ideas from cheap to expensive:
1. $100 Intel dual core (E7200, I think) + 8800GT 512 ($130 - $30 MIR in most places now)
2. Same CPU as above + 4850
3. $200 Intel quad core Q6600 OCed to 3+ GHz; if you don't OC (less recommended), dual core (E8400) + same GPU as above
4. Same CPU as above + 4870
5. $300 Intel 45nm quad core, OCed to 3.5+ GHz + same GPU as above
6. Same CPU as above + 4870X2
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Taltamir, some games have overhead that isn't necessarily CPU bound like you think. Frame rates don't increase much when you lower or raise the resolution, as long as you have enough GPU power to run the game. Only when the GPU runs out of steam at a given resolution does it start to drop frames. You did gain when you lowered the resolution. Not a huge gain, but it improved nonetheless. Mass Effect could be one of those games.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Taltamir, I forgot to mention one thing. Try running in full screen, not windowed mode?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Azn
Taltamir, some games have overhead that isn't necessarily CPU bound like you think. Frame rates don't increase much when you lower or raise the resolution, as long as you have enough GPU power to run the game. Only when the GPU runs out of steam at a given resolution does it start to drop frames. You did gain when you lowered the resolution. Not a huge gain, but it improved nonetheless. Mass Effect could be one of those games.

What is the magical "overhead" you speak of? Overhead is either CPU calculations or extra bandwidth... and the CPU is certainly overtaxed...

If it is running the GPU at 20% and getting crap framerates (which are fairly similar to the ones at 6.67x the pixels), then either the CPU, PCIe bandwidth, RAM amount, VRAM amount, or something like that is preventing the GPU from getting enough data to crunch through.
This appears to be the CPU in this case, from my tests and the tests of those with quad cores OCed to 3.6GHz... and thread affinity: going from 2 to 3 cores showed improvement, going from 3 to 4 cores showed further improvement and made the CPU usage fall below 100%, into the 70-80% range... You cannot get any clearer proof than that.
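That affinity experiment is easy to reproduce. A rough sketch using psutil (an assumption; the testing described here was done by hand via Task Manager, and "MassEffect.exe" is a placeholder process name):

```
# Pin the game to N cores, then watch per-core CPU load. psutil is assumed;
# "MassEffect.exe" is a placeholder process name.
import psutil

def affinity_test(proc_name="MassEffect.exe", cores=(0, 1, 2, 3)):
    game = next(p for p in psutil.process_iter(["name"])
                if p.info["name"] == proc_name)
    game.cpu_affinity(list(cores))  # restrict the game to these cores
    for _ in range(10):             # sample load for ~10 seconds
        load = psutil.cpu_percent(interval=1, percpu=True)
        print([load[c] for c in cores])
        # Allowed cores pegged near 100% -> the CPU is the limiter; load
        # falling to 70-80% as cores are added -> it no longer is.

affinity_test(cores=(0, 1, 2))  # e.g. simulate a tri-core run
```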
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
I don't honestly feel like digging up all my "UE3 runs better on quads than duals" info, but I've done lots of testing with UT3, & things are quite similar on other UE3 games.

The engine utilizes quads very well.

And it does run smoother on them than on duals.

I've been trying to say this for a long time in various threads all over, but there are so many people screaming that quads aren't needed and to just get a dual that, honestly, I get tired & prefer to let people wallow in their own ignorance.

Nice to see some people are finally figuring out what I learned as soon as I got my quad last year. :)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yeah... someone here posted some tests, though... Mass Effect is a UE3 game that gets 1/3 the FPS of UT3 itself... I guess it took something drastic to show me that quads are needed... but now that we have it, we can dig and dig and uncover some juicy meat.

So let's hear about some other games where we can get such proof... :)
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: n7
I don't honestly feel like digging up all my "UE3 runs better on quads than duals" info, but I've done lots of testing with UT3, & things are quite similar on other UE3 games.

The engine utilizes quads very well.

And it does run smoother on them than on duals.

I've been trying to say this for a long time in various threads all over, but there are so many people screaming that quads aren't needed and to just get a dual that, honestly, I get tired & prefer to let people wallow in their own ignorance.

Nice to see some people are finally figuring out what I learned as soon as I got my quad last year. :)

I don't think the UT3 engine is optimized for quad cores at all... If it was, we would see more than a measly 5-10% gain from dual core to quad core.

Here's AnandTech's article with quad cores and dual cores in action on the UT3 engine.

The 2.66GHz dual core averages 159.6 fps while the 2.66GHz quad core averages 168.7 fps. The difference here is minimal. We know that quad cores have more cache than a dual core. This could be the real reason why it performs 5-10% faster than dual cores. Not necessarily quad optimized.

http://www.anandtech.com/video/showdoc.aspx?i=3127&p=7
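For what it's worth, the gap in those quoted averages works out to under six percent:

```
# Dual -> quad gain from the quoted AnandTech UT3 averages.
dual_fps, quad_fps = 159.6, 168.7
print(f"{(quad_fps / dual_fps - 1) * 100:.1f}% faster on the quad")  # ~5.7%
```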
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
Originally posted by: Azn
Taltamir, some games have overhead that isn't necessarily CPU bound like you think. Frame rates don't increase much when you lower or raise the resolution, as long as you have enough GPU power to run the game. Only when the GPU runs out of steam at a given resolution does it start to drop frames. You did gain when you lowered the resolution. Not a huge gain, but it improved nonetheless. Mass Effect could be one of those games.

What is the magical "overhead" you speak of? Overhead is either CPU calculations or extra bandwidth... and the CPU is certainly overtaxed...

If it is running the GPU at 20% and getting crap framerates (which are fairly similar to the ones at 6.67x the pixels), then either the CPU, PCIe bandwidth, RAM amount, VRAM amount, or something like that is preventing the GPU from getting enough data to crunch through.
This appears to be the CPU in this case, from my tests and the tests of those with quad cores OCed to 3.6GHz... and thread affinity: going from 2 to 3 cores showed improvement, going from 3 to 4 cores showed further improvement and made the CPU usage fall below 100%, into the 70-80% range... You cannot get any clearer proof than that.

It's not magical. It's just the way the game was programmed. I've seen this happen in some games over the years. It doesn't matter how much GPU power you pump into the game, and raw CPU MHz doesn't change a thing either.
 
frythecpuofbender

Aug 9, 2007
150
0
0
Mass Effect has to be the most poorly coded game of recent years.
And it did NOT utilise my quad at all. Every time I looked at the task manager (either because it hung, crashed, or mysteriously froze) it would only use up to 50%, sometimes only 33%, of the CPU.

Don't get me wrong, the game mechanics are quite nice (though dumbed down to suit console gamers), but the technical side of the game is a friggin mess.
Memory management is a joke too. It pretty much stays at 512MB and just streams the textures in (Xbox 360 DVD remnants), which sometimes destroys the illusion. There is no real scaling for a true high-end machine. It could probably put half the game into memory with 4GB, but it just won't.
 

sticks435

Senior member
Jun 30, 2008
757
0
0
^Agreed. I think the major problem with the "micro stuttering" and sudden drop-outs in framerate is the texture streaming system. Our systems are running too fast for the streaming system to keep up, causing the framerate to skip.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: frythecpuofbender
Mass Effect has to be the most poorly coded game of recent years.
And it did NOT utilise my quad at all. Every time I looked at the task manager (either because it hung, crashed, or mysteriously froze) it would only use up to 50%, sometimes only 33%, of the CPU.

Don't get me wrong, the game mechanics are quite nice (though dumbed down to suit console gamers), but the technical side of the game is a friggin mess.
Memory management is a joke too. It pretty much stays at 512MB and just streams the textures in (Xbox 360 DVD remnants), which sometimes destroys the illusion. There is no real scaling for a true high-end machine. It could probably put half the game into memory with 4GB, but it just won't.

And you see nothing wrong with only testing CPU usage when the game had CRASHED on you?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: chizow

I could link about a dozen reviews done at 3GHz on today's high-end parts and CF/SLI solutions, even at 1920, that show CPU bottlenecking, but frankly, they're quite abundant. It's going to take some time to get used to, but a 3GHz C2D isn't enough to guarantee you're not CPU bottlenecked anymore with the current high-end and multi-GPU solutions out there now.

To me a CPU bottleneck is NOT the same thing as being CPU limited. If I were to switch from an E6600 @ 2.4GHz to a Q6600 @ 3.4GHz, I might gain 10-20% in a game with an 8800GTS. If I were to switch from the 8800GTS to a 4870, I'd increase my framerates by a factor of 2x at least. Now consider taking a Q6600 @ 3.0GHz with a 4870 and putting a 2nd 4870 in CF, and you'd probably gain another 50-70%. Yet overclocking a quad to 4.0GHz might net maybe another 10-20% with a single 4870.

So while I would tend to agree that you are almost always CPU limited, you aren't necessarily CPU bottlenecked. At least that's my interpretation of the subject. A CPU bottleneck would rather imply that the CPU is THE limiting factor in generating FPS, which for the vast majority of games is completely untrue for a C2D at 3.0GHz.

The way I look at it is this:

1) If you swap to the fastest CPU, do you get a % increase in games? (Yes = CPU limited; no effect = not limited)

2) If you swap to the fastest GPU, do you get a % increase in games? (Yes = GPU limited; no effect = not limited)

3) Which of Option 1 or Option 2 provides the greater % increase in games relative to one another? (Option 1 = CPU bottlenecked; Option 2 = GPU bottlenecked)

To imply that a C2D @ 3.0GHz is "the bottleneck" is somewhat unfounded imo.
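That three-step test maps directly onto a tiny decision procedure. A sketch (the FPS inputs are whatever you measure before and after each swap; the sample numbers are made up):

```
# Limited vs. bottlenecked, per the three questions above: feed in measured
# FPS before/after swapping in the fastest CPU and the fastest GPU.
def classify(base_fps, fps_fast_cpu, fps_fast_gpu):
    cpu_gain = fps_fast_cpu / base_fps - 1  # step 1: CPU limited if > 0
    gpu_gain = fps_fast_gpu / base_fps - 1  # step 2: GPU limited if > 0
    bottleneck = "CPU" if cpu_gain > gpu_gain else "GPU"  # step 3
    return cpu_gain, gpu_gain, bottleneck

cpu_gain, gpu_gain, bottleneck = classify(50, 55, 85)  # made-up numbers
print(f"CPU gain {cpu_gain:.0%}, GPU gain {gpu_gain:.0%} -> {bottleneck} bottlenecked")
```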
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: RussianSensation
Originally posted by: chizow

I could link about a dozen reviews done at 3GHz on today's high-end parts and CF/SLI solutions, even at 1920, that show CPU bottlenecking, but frankly, they're quite abundant. It's going to take some time to get used to, but a 3GHz C2D isn't enough to guarantee you're not CPU bottlenecked anymore with the current high-end and multi-GPU solutions out there now.

To me a CPU bottleneck is NOT the same thing as being CPU limited. If I were to switch from an E6600 @ 2.4GHz to a Q6600 @ 3.4GHz, I might gain 10-20% in a game with an 8800GTS. If I were to switch from the 8800GTS to a 4870, I'd increase my framerates by a factor of 2x at least. Now consider taking a Q6600 @ 3.0GHz with a 4870 and putting a 2nd 4870 in CF, and you'd probably gain another 50-70%. Yet overclocking a quad to 4.0GHz might net maybe another 10-20% with a single 4870.

So while I would tend to agree that you are almost always CPU limited, you aren't necessarily CPU bottlenecked. At least that's my interpretation of the subject. A CPU bottleneck would rather imply that the CPU is THE limiting factor in generating FPS, which for the vast majority of games is completely untrue for a C2D at 3.0GHz.

The way I look at it is this:

1) If you swap to the fastest CPU, do you get a % increase in games? (Yes = CPU limited; no effect = not limited)

2) If you swap to the fastest GPU, do you get a % increase in games? (Yes = GPU limited; no effect = not limited)

3) Which of Option 1 or Option 2 provides the greater % increase in games relative to one another? (Option 1 = CPU bottlenecked; Option 2 = GPU bottlenecked)

To imply that a C2D @ 3.0GHz is "the bottleneck" is somewhat unfounded imo.

No one is saying you won't see an increase from a faster GPU even with a slower CPU; if you were GPU bottlenecked with a slower part, then of course you would see an increase up to the point you became CPU limited. But you would eventually hit a limit that all of the higher-end parts meet, and that would become more obvious the lower the resolution you were running. If a 4870 and a 9800GTX are both capping out at, say, 78FPS at 1280x1024, was there any real advantage to the 4870 over the 9800GTX? Of course there is; you'll just need a faster CPU to realize it. I'd certainly agree with upgrading the GPU first, but my point is that adding a 2nd GPU might not result in the gains you were expecting without a faster CPU.

But let's look at your 4870 example to prove this point, as this is a clear case where CPU bottlenecks are holding back the 4870 in CF.

4870 and CF @ 3GHz

4870 and CF @ 4GHz

If you look at the 3GHz results, it's very clear that 4870 CF is not scaling well compared to a single 4870 and delivers the same performance as 4850 CF even at 2560. Now look at the 4GHz results and you'll see the story is entirely different, with a single 4870 @ 4GHz outperforming the 4870 CF @ 3GHz at 1920, and 4870 CF @ 4GHz distancing itself from 4850 CF. There are also huge increases across the board at 1280 with the 4GHz rig compared to 3GHz, showing heavy CPU bottlenecking. That's just scratching the surface of all the hints in those graphs showing CPU bottlenecking, but it was most relevant to your example.

No, it's not perfect, as there are some discrepancies between the 3GHz and 4GHz rigs, like 2 vs 4GB and DDR2 vs DDR3, but the results are still consistent at the same clock speed, and I think CPU speed clearly has the greatest impact of all the differences. There are quite a few other reviews that show CPU bottlenecking at 3GHz, and a few at 4GHz that show GPU scaling with a faster CPU, but few done by the same site.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Clearly UT3 (the actual game, not the engine) is getting hammered by modern GPUs to the point where it becomes CPU limited. This is what happens with old games time and time again; just look at Half-Life 2. I was getting over 60fps in UT3 with my 8600GTS, while I get 40fps average in Mass Effect with my 8800GS, which is 2x faster. Mass Effect is more GPU intensive than UT3.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Azn
We know that quad cores have more cache than a dual core. This could be the real reason why it performs 5-10% faster than dual cores.

The E6750 and Q6700 you were comparing have the same amount of cache per core.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: RussianSensation
Originally posted by: chizow

I could link about a dozen reviews done at 3GHz on today's high-end parts and CF/SLI solutions, even at 1920, that show CPU bottlenecking, but frankly, they're quite abundant. It's going to take some time to get used to, but a 3GHz C2D isn't enough to guarantee you're not CPU bottlenecked anymore with the current high-end and multi-GPU solutions out there now.

To me a CPU bottleneck is NOT the same thing as being CPU limited. If I were to switch from an E6600 @ 2.4GHz to a Q6600 @ 3.4GHz, I might gain 10-20% in a game with an 8800GTS. If I were to switch from the 8800GTS to a 4870, I'd increase my framerates by a factor of 2x at least. Now consider taking a Q6600 @ 3.0GHz with a 4870 and putting a 2nd 4870 in CF, and you'd probably gain another 50-70%. Yet overclocking a quad to 4.0GHz might net maybe another 10-20% with a single 4870.

So while I would tend to agree that you are almost always CPU limited, you aren't necessarily CPU bottlenecked. At least that's my interpretation of the subject. A CPU bottleneck would rather imply that the CPU is THE limiting factor in generating FPS, which for the vast majority of games is completely untrue for a C2D at 3.0GHz.

The way I look at it is this:

1) If you swap to the fastest CPU, do you get a % increase in games? (Yes = CPU limited; no effect = not limited)

2) If you swap to the fastest GPU, do you get a % increase in games? (Yes = GPU limited; no effect = not limited)

3) Which of Option 1 or Option 2 provides the greater % increase in games relative to one another? (Option 1 = CPU bottlenecked; Option 2 = GPU bottlenecked)

To imply that a C2D @ 3.0GHz is "the bottleneck" is somewhat unfounded imo.

Funny, this is EXACTLY what I said until I actually ran some tests...


Another thing is... you could be CPU limited and experience the same min FPS/stutter, but have higher max FPS when upgrading the card. So the artificial tests show the more powerful GPU increasing performance, but it delivers the same level of smoothness due to identical min FPS.

Another thing is that the "min FPS" is not accurate; it averages all the frames that were rendered during that entire second... so if your min FPS was 41, it means 41 frames were averaged together... but how long did it take to render each one? At a constant rate it would be 24.4ms each, but if you spike into 40+ms then you are at a much lower smoothness level.

I saw microstutter in my tests and playthroughs in Mass Effect. And at 720x480 resolution I was getting hiccups where it goes from the low 10s of ms per frame to 40+ms... my worst case was 41ms, which is about 24 FPS at that FRAME (not at that second). The min FPS listed in the chart was 43, though.
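That per-frame check can be automated against a frametime log. A sketch with illustrative data and an illustrative threshold:

```
# Flag hitches in a frametime log: any frame taking well over the average
# render time reads as a visible stutter (data and threshold illustrative).
frame_times_ms = [14, 15, 13, 41, 14, 16, 38, 15]

avg = sum(frame_times_ms) / len(frame_times_ms)
for i, ft in enumerate(frame_times_ms):
    if ft > 1.5 * avg:
        print(f"frame {i}: {ft} ms (~{1000 / ft:.0f} fps at that frame)")
```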

Originally posted by: schneiderguy
Originally posted by: Azn
We know that quad cores have more cache than a dual core. This could be the real reason why it performs 5-10% faster than dual cores.

The E6750 and Q6700 you were comparing have the same amount of cache per core.

Isn't CPU cache similar to VRAM in multi-GPU setups? That is, the effective cache is total / cores rather than just the total, since each core has its own dedicated cache?
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: schneiderguy
Originally posted by: Azn
We know that quad cores have more cache than a dual core. This could be the real reason why it performs 5-10% faster than dual cores.

The E6750 and Q6700 you were comparing have the same amount of cache per core.

No. The Q6700 shares 8MB of cache between its cores, while the E6750 has 4MB.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: Azn
I don't think the UT3 engine is optimized for quad cores at all... If it was, we would see more than a measly 5-10% gain from dual core to quad core.

We know that quad cores have more cache than a dual core. This could be the real reason why it performs 5-10% faster than dual cores.

http://www.anandtech.com/video/showdoc.aspx?i=3127&p=7

Sorry, but you're mistaken.

I don't really have the time or patience to argue with you (I know you love debating; I don't really, especially when I know I'm right).

Please see the UT3 testing here: http://www.anandtech.com/cpuch...owdoc.aspx?i=3344&p=15

Notice how the "crappy" Phenom X4s are beating the higher-cached & higher-clocked C2Ds?

I'd appreciate you not coming up with a nonsensical argument to explain that too.

But here are the facts: UT3 does utilize quads quite well; it spreads the load over four cores somewhat, too.
With a decently clocked quad, you will never see the game hit 100% on a core.
With even a heavily clocked dual, it will hammer both cores @ 100%.

I've done real-world gameplay testing for myself (me running around in maps, actually playing) & measured the difference in Fraps between a quad vs. two cores (two disabled), & while in some maps the difference in fps is minimal to none, in other maps the difference in minimum fps is actually very large.
It's very hard to simulate real-world gameplay & still get precisely accurate results, so obviously results can vary.
But again, overall, there is zero question that the game runs better on quads than duals.
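For anyone wanting to reproduce that comparison, the analysis side is simple once you have the two frametime logs. A sketch, assuming each log has been reduced to one cumulative frame timestamp (in ms) per line; the file names are placeholders:

```
# Compare min/avg FPS between a quad run and a dual (two cores disabled) run.
def fps_stats(path):
    with open(path) as f:
        stamps = [float(line) for line in f if line.strip()]
    deltas = [b - a for a, b in zip(stamps, stamps[1:])]   # per-frame ms
    return 1000 / max(deltas), 1000 * len(deltas) / sum(deltas)  # (min, avg)

for label, log in (("quad", "ut3_quad.txt"), ("dual", "ut3_dual.txt")):
    min_fps, avg_fps = fps_stats(log)
    print(f"{label}: min {min_fps:.1f} fps, avg {avg_fps:.1f} fps")
```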

I am busy with work & also getting semi-prepared to start my 8800 GTX vs. GTX 280 comparison, so I don't have time to start running more tests now.

I cannot speak for certain on other UE3 titles, as I've never tested them to the extent I have UT3, but I know they are at least similar, which would explain taltamir's results.

And Azn, you have a PM.