CPU bound in Mass Effect, E8400 @ 3.6GHz

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Someone has been asking about his unreasonably low FPS and GPU utilization... we got into an argument about whether his X2 @ 2.7GHz OC is simply CPU bound in those games, which were Crysis, Age of Conan, and Call of Duty 4.
http://forums.anandtech.com/me...id=2205023&STARTPAGE=1

Finally, he mentioned that even Mass Effect is slow...

I thought to myself, how could this be... So I ran a little test on my gaming machine... this rig has:

E8400 @ 3.6GHz
2x2GB DDR2-1000 RAM (running it at 800MHz with lower voltage and tighter timings).
WD 640GB HDD (I use a file server for storage)
VisionTek HD4850 512MB GDDR3 card (with the fixed MSI BIOS, further modified for higher fan speeds)

I run Mass Effect at 1920x1200 (native) resolution, particles on High (3/3), textures on High (3/4; there is also Very High, which is unplayable), no blur, no film grain, vsync on.

Firstly, I am definitely not hitting 60 FPS; I can feel some lag.

Secondly, I was shocked to see that even at 3.6GHz and high graphics settings I am still getting 100% CPU usage.

I thought that at such an OC I would certainly be GPU limited in everything I play... especially with a "weaker" card like the 4850 (compared to the 4870 or the GT200 series).

I probably need to bite the bullet and edit RivaTuner's cfg file to get it to work with the 4850, like I did with the 8800GTS 512 when it first came out (rather than wait for the next RivaTuner version). That way I could see video card usage and exact FPS in Mass Effect. I will do so ASAP.

In the meantime, please post any other games you know of that would be CPU bound even at 1920x1200, high settings, and an OCed E8400...


EDIT:
Tests done with FRAPS:
1920x1200 windowed:  Min 39 / Max 65 / Avg 52.252
720x480 windowed:    Min 47 / Max 96 / Avg 67.797

The FPS barely increases at all going from 1920x1200 down to 720x480...
that is dropping from 2,304,000 pixels to 345,600 pixels, i.e. 6.67 times (6.666 repeating) fewer pixels!
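
For anyone who wants to double-check the math, here is a throwaway sanity-check snippet (it only uses the resolutions and the FRAPS averages posted above; nothing Mass Effect specific):

// Sanity check of the pixel counts and FPS averages quoted above.
#include <cstdio>

int main() {
    const double lowPx   = 720.0 * 480.0;    //   345,600 pixels
    const double highPx  = 1920.0 * 1200.0;  // 2,304,000 pixels
    const double fpsLow  = 67.797;           // FRAPS average at 720x480
    const double fpsHigh = 52.252;           // FRAPS average at 1920x1200
    std::printf("pixel ratio: %.2fx\n", highPx / lowPx);    // ~6.67x more pixels
    std::printf("FPS ratio:   %.2fx\n", fpsLow / fpsHigh);  // only ~1.30x more FPS
    return 0;
}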

Both of those exhibit the "jitter", something I normally do not see in games until they fall into the low 10s of FPS, but it shows up in Mass Effect even in the 50s. I don't know why FRAPS doesn't show it; it seems to be something other than a plain FPS drop, as in, something else in the engine is causing that effect rather than low FPS (that, or I am really, really sensitive to sudden drops).
Notice that both runs are on a 3.6GHz C2D E8400... CPU usage is a constant 100%, sometimes dropping to 95%, and I have seen it go as low as 80% once... The GPU is at 70-100% at 1920x1200, and at 20% at 720x480... clearly it is not being taxed at that resolution.

These were tested with frame smoothing off, which I think improved performance across the board (but I haven't re-tested that with FRAPS yet; that was from before, when I was using the inaccurate BioWare tool)...

An interesting thing: the game's built-in frame counter DID show what appeared to be FPS in the 10s and 20s... FRAPS never showed it going that low.

However... I think it is a flaw in how FRAPS calculates its numbers...
digging through the per-frame data (frame number, elapsed time in ms, frame time in ms) for the 720x480 run, I found the following:
605 8385.076 10.98
606 8426.748 41.672
607 8441.728 14.98

This is not the only example, but it is the worst (there were many others at 39-40 ms)...

As you can see, frames 605 and 607 took only 11 and 15 ms respectively to draw... while frame 606 took a whopping 41.672 ms. That is undoubtedly the source of the "jitters" I am noticing at both the high and low resolutions. Also... 1000/41.672 ≈ 24 FPS... so for that one frame I had an instantaneous FPS in the low 20s, just like I said I noticed on the built-in counter (which updates EVERY frame!). I will try it again with frame rate smoothing turned back on, but as far as I can tell the jitters decreased with it off...
And just to make things clear, this is at 720x480 resolution!
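
To make the point concrete, here is a small snippet that redoes that per-frame math on the three rows quoted above (frame number, elapsed time in ms, frame time in ms); the 33 ms threshold and the formatting are just my own choices, not anything from FRAPS:

// Instantaneous FPS from individual frame times; this is roughly what the
// in-game per-frame counter shows and what a min/max/avg summary hides.
#include <cstdio>

int main() {
    struct Row { int frame; double elapsedMs; double frameMs; };
    const Row rows[] = {
        {605, 8385.076, 10.98},
        {606, 8426.748, 41.672},
        {607, 8441.728, 14.98},
    };
    for (const Row& r : rows) {
        double instFps = 1000.0 / r.frameMs;  // per-frame FPS
        std::printf("frame %d: %6.2f ms -> %5.1f FPS%s\n",
                    r.frame, r.frameMs, instFps,
                    r.frameMs > 33.0 ? "   <-- visible hitch" : "");
    }
    return 0;
}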

Either way...
Conclusion: Mass Effect is CPU bound as hell on a "mere" 4850 with an E8400 C2D OCed to 3.6GHz!
---
This thread has run its course; it has mostly devolved into circular posts. If you want to start a new thread with new information, please feel free to.

-ViRGE
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Are you running any AA? If not, no wonder you're CPU bound. 1920x1200 isn't a very high resolution.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
There is no AA in Mass Effect. That is why they have the "blur" and "film grain" effects... it's supposed to be "like AA"... except it is NOT; it looks like crud. I turned both off for BETTER visual quality AND better performance at the same time...
It is an Xbox 360 port with a revamped UI... also please note that I am CPU bound at BELOW 60 FPS! That is the big deal here... on a CPU overclocked beyond the fastest you can buy.

And 1920x1200 is significantly high... being CPU bound is typically a 1280x1024 scenario... MAYBE 1680x1050... I have never heard of it above that; in fact everyone keeps telling me how rare it is even at 1680x1050... Yet this is clearly the case here.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: taltamir

And 1920x1200 is significantly high...
High compared to what? 2560x1600 has ~78% more pixels than 1920x1200.

2560x1600 is a high resolution; 1920x1200 really isn't that high.

I've been running 1920x1440 on my CRT for about six years and even that has ~20% more pixels than 1920x1200.

Again if you're not running any AA it's no surprise you're CPU limited.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I play Mass Effect at Very High, maxed out. I get an average of 40 FPS in that game at 1440x900 resolution. It doesn't dip below 25 FPS, so it doesn't affect my gameplay. I even turned on 4xAA, and it barely does anything to my frame rate, so I have it on.

I doubt you are CPU limited. Here's a 4850 Mass Effect benchmark. You should be hitting about 50 FPS average.

http://www.tomshardware.com/re...n-hd-4850,1957-18.html
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Texture resolution's effect on performance doesn't have anything to do with how fast your CPU is, and neither do high display resolutions. If you were CPU bound, then turning the resolution down wouldn't help, but I'm sure it will, as the game isn't particularly demanding on the CPU.
 

imported_Scoop

Senior member
Dec 10, 2007
773
0
0
I'm playing the game maxed out @ 1600x1200 without problems. And that means motion blur, film grain, and dynamic lighting turned off, since they look like crap.
 

Davegod

Platinum Member
Nov 26, 2001
2,874
0
76
Don't games still push the CPU to 100% even when GPU bound? Usually games would rather make full use of the CPU and throw away any work the card cannot handle. I haven't looked into it since dual core, TBH, but I'd still expect one core to be at 100% all of the time.

The only true way to determine whether you are GPU bound is to switch graphics settings and compare FPS. Try 1920x1200 with high settings and compare to 800x600 with low settings (bear in mind that games often put some settings under 'graphics' which are really done on the CPU). "Stat FPS" is the console command to show FPS. You should disable vsync and the in-game framerate cap while testing; see the rough sketch below.

also http://www.tweakguides.com/ME_1.html
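
Something like this, for example (the example numbers are just taltamir's FRAPS averages from the first post, and the assumption that a purely GPU-bound scene scales linearly with pixel count is a simplification):

// How much of the "ideal" resolution-scaling speedup did you actually get?
// A mostly GPU-bound scene recovers a large share of it; a CPU-bound one
// recovers very little.
#include <cstdio>

void reportScaling(double fpsHigh, double fpsLow, double pxHigh, double pxLow) {
    double actualGain = fpsLow / fpsHigh;  // measured speedup from dropping resolution
    double idealGain  = pxHigh / pxLow;    // speedup if pixel count were the only limit
    std::printf("got %.2fx out of an ideal %.2fx (~%.0f%% of the ideal gain)\n",
                actualGain, idealGain,
                100.0 * (actualGain - 1.0) / (idealGain - 1.0));
}

int main() {
    // 52.252 FPS at 1920x1200 vs 67.797 FPS at 720x480
    reportScaling(52.252, 67.797, 1920.0 * 1200.0, 720.0 * 480.0);
    return 0;
}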
 

DerekWilson

Platinum Member
Feb 10, 2003
2,920
34
81
Yeah, just because the CPU hits 100% doesn't mean you're CPU bound...

You'll want to vary just the resolution, with all your other settings unchanged, and see how much the frame rate changes to determine whether you are CPU bound or not (when CPU bound you'll not see an increase, or much of one, in performance from lowering the resolution).
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Like others have said, 100% CPU usage in games doesn't mean you're CPU bound. This has to do with the way games are typically programmed on Windows, in which the main thread constantly polls the Windows message queue as fast as the CPU can run it. On my AMD dual core this would usually load the CPU at 50% (one of two cores), but with the multithreading "enhancements" in Nvidia drivers, and possibly ATI drivers as well, both cores get fully loaded.
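
As a rough illustration (this is just the generic Win32 pattern, not Mass Effect's actual code, and UpdateGame/RenderFrame are hypothetical stand-ins), a typical game-style message pump spins flat out whether or not the GPU is the real limit:

// Generic busy-polling game loop on Windows (simplified sketch).
// PeekMessage returns immediately even when the queue is empty, so this
// thread never sleeps and one core sits at ~100% regardless of where the
// real bottleneck is.
#include <windows.h>

void UpdateGame()  { /* hypothetical per-frame simulation step */ }
void RenderFrame() { /* hypothetical draw call; may end up waiting on the GPU */ }

void runMessagePump() {
    MSG msg;
    bool running = true;
    while (running) {
        while (PeekMessage(&msg, nullptr, 0, 0, PM_REMOVE)) {
            if (msg.message == WM_QUIT) running = false;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        UpdateGame();
        RenderFrame();
    }
}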

OTOH, I've played Mass Effect with everything cranked up on a single 8800GT at 1920x1200 and I didn't feel like the game was too slow. It's not a run-and-gun twitch shooter, so even less than 60 FPS was completely playable.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Mass Effect, clear and simple, is a badly ported game. Unreal Tournament 3 shows how well the Unreal 3 engine can run on the PC.

Mass Effect: 55-65% CPU usage @ 1080p, 45-95 FPS (maxed, with no motion blur or film grain, and frame smoothing disabled). That is just walking around the Citadel, as I hate to play this game again.

Unreal Tournament 3 with 31 bots: around 55-60% CPU usage @ 1080p, 60-140 FPS (maxed, frame smoothing disabled). Heat Ray map.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Mass Effect is definitely CPU intensive; it runs at 65-70% on my quad @ 3.6GHz. That's nearly a full extra core's worth of utilization compared to a Core 2 Duo. Still, I think having to run textures on High and not Very High points to a GPU/driver bottleneck. I was able to run everything maxed + dynamic shadows, no film grain, on an 8800GTX at ~35-42 FPS. On my GTX 280 everything is absolutely capped at 60 FPS except for pre-rendered cutscenes.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Azn
Considering it's based on the same Unreal 3 engine, it's not as CPU intensive as you think it is.

If you have a Core 2 Duo you are fine as long as you have more L2 cache. Quad cores help some.

There is an absolutely minimal difference between clock speeds, even @ 1024x768 resolution.

http://www.anandtech.com/video/showdoc.aspx?i=3127&p=6

Except that's not done at 1024x768; it's done at 1920 with last-gen parts:

We then cranked up the resolution to 1920 x 1200, and increased the world detail slider up to 5 to give us a more realistic situation for this clock speed comparison. The results were a bit surprising:

This is why it's important to put commentary in graphs, IMO, because no one actually reads the text anymore. In cases where everything is frame capped or CPU limited, just put a big red "FRAME CAPPED" instead of listing 59.1, 60.1, 59.8 and letting people draw the conclusion that all the parts are equal...

I could link about a dozen reviews, done at 3GHz on today's high-end parts and CF/SLI solutions, even at 1920, that show CPU bottlenecking; frankly, they're quite abundant. It's going to take some time to get used to, but a 3GHz C2D isn't enough to guarantee you're not CPU bottlenecked anymore with the current high-end and multi-GPU solutions out there now.

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: chizow
I could link about a dozen reviews, done at 3GHz on today's high-end parts and CF/SLI solutions, even at 1920, that show CPU bottlenecking; frankly, they're quite abundant. It's going to take some time to get used to, but a 3GHz C2D isn't enough to guarantee you're not CPU bottlenecked anymore with the current high-end and multi-GPU solutions out there now.

I would actually like to see those reviews, in particular with a single high-end card at high resolutions. Most reviews only test low resolutions for CPU scaling in games, and those are utterly useless to me. With the release of new high-end video cards I'm now seriously wondering how much of a bottleneck my AMD CPU would be at 1920x1200 for these cards.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Maybe someone on these fine forums with a GTX 280 or 4870 and an overclocked CPU at around 4GHz (as well as a monitor capable of higher resolutions) would do us the honor of benchmarking several games while lowering their overclock a step at a time. Something like 4GHz, 3.5, 3, 2.5, 2 would be sufficient for me. And across several games.

In fact, if anyone wants to start a fund and pool some cash together to build me such a system, I will gladly benchmark any game you like :)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I am posting from work right now, but this is exactly what I intend to do when I get home... granted, I only have a 4850... and a 3.6GHz OC (I had it stable at 4GHz before; I just preferred a more modest OC). I can go back up and down and get some benchies.
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81
There is no AA in Mass Effect

Sure there is. A profile is available in the latest Nvidia drivers (Vista and XP). With ATI, you need to rename the executable to Bioshock.exe (this may not work under Vista).

Trust me, you won't be CPU limited after turning AA on.

Leon
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Leon
There is no AA in Mass Effect

Sure there is. A profile is available in the latest Nvidia drivers (Vista and XP). With ATI, you need to rename the executable to Bioshock.exe (this may not work under Vista).

Trust me, you won't be CPU limited after turning AA on.

Leon

You are kidding, right? About naming the exe bioshock.exe, I mean...

And having my frames drop even lower is not a hot prospect... anyway, I will try it and we will see.
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81
You are kidding, right? About naming the exe bioshock.exe, I mean...

No. An AA profile for masseffect.exe is not yet in the Catalyst drivers; by renaming the executable (to bioshock.exe or GOW.exe), the UE3 AA workaround will be applied to the game. Just don't use the launcher when starting the game and you'll be fine.

Leon
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81
A demonstration:

No AA

AA

Also note the performance hit, a lot of it from adaptive AA. Catalyst AI must be enabled.

Leon
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
OK, here is some extra info...
Firstly, Mass Effect can apparently use 4 cores, if it is getting 80% usage on a quad core @ 3GHz.

RivaTuner does not work with the 4850 as of yet, so measuring is sketchy; you use the console command to display the FPS and try to observe it. Unfortunately it appears to update every single frame (it moves like it updates 60 times a second or so; it shows the ms between each frame and derives an FPS figure from that).

Typically I hover at 17/19 ms (switching back and forth over and over), which shows up as 52 FPS.
This goes along with between 80 and 100% CPU usage and between 70 and 100% GPU usage. I have seen 100% CPU / 70% GPU, 80% CPU / 100% GPU, and 100/100 @ 30 FPS when staring down at a fountain...
All of those are with Very High textures...

Increasing textures to Very High did not change the measured FPS at all; I saw it dip once to 17 FPS, which is lower than what it seems to dip to with High textures.
But standing at the same spots gives the same 52 FPS (which is exactly what was measured by that Tom's Hardware review linked here... that tested with a 3GHz quad core; I am using a 3.6GHz dual core... and we have someone here who confirms ME takes 70-80% of his 3GHz quad core).

However, despite the Very High settings not changing the measured FPS at all, and matching the 52 FPS reported by that review perfectly, it does not feel even remotely smooth; it is really bothersome to play because of the jitter.

I would say that rather than being CPU bound, I am merely balanced, to the point that the CPU and GPU take turns being the limiting factor.

I tried renaming (a copy of) Mass Effect to both bioshock.exe and GOW.exe; neither impacted performance at all, and the game is clearly not anti-aliased (it is so jagged it hurts!).

With results like this, I am now thinking there may be merit in pushing my OC all the way to 4GHz.