Bad Company 2 Beta Benchmark

Page 6

Phil1977

Senior member
Dec 8, 2009
228
0
0
Awesome!

I will wait with patience :)

EDIT: I also preordered it. I will enjoy it no matter what...
 

Toasted

Junior Member
Feb 19, 2010
3
0
0
Hey, did anyone notice a sharp change in CPU usage within the last couple of days?

I was playing BFBC2 on an E8400 at 3.6 GHz while getting an average of 60-70% CPU usage. I noticed that sometimes I would get this "glitch" where the game would constantly use 95-100%, and I would just restart to fix it.

Now it seems that BFBC2 always maxes out my CPU (95-100%). I tried boosting my E8400 to 3.8 GHz and it still maxes out. Oddly enough, turning HBAO on actually reduces CPU usage, as did turning AA filtering up.

Is this likely a problem with my computer, a result of the beta not being optimized, or because the game is now "optimized" and the dual core just isn't powerful enough to run it?
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Hey, did anyone notice a sharp change in CPU usage within the last couple of days?

I was playing BFBC2 on an E8400 at 3.6 GHz while getting an average of 60-70% CPU usage. I noticed that sometimes I would get this "glitch" where the game would constantly use 95-100%, and I would just restart to fix it.

Now it seems that BFBC2 always maxes out my CPU (95-100%). I tried boosting my E8400 to 3.8 GHz and it still maxes out. Oddly enough, turning HBAO on actually reduces CPU usage, as did turning AA filtering up.

Is this likely a problem with my computer, a result of the beta not being optimized, or because the game is now "optimized" and the dual core just isn't powerful enough to run it?
Did your FPS change while the CPU usage went up? I'd guess that turning up image settings lowers FPS enough to take a load off of the CPU so it no longer has to max itself to keep up with the GPU.
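That bottleneck logic can be sketched as a toy model: the frame rate is capped by whichever of the CPU or GPU is slower, so raising image settings pulls the GPU cap below the CPU cap and the CPU stops maxing out. All numbers below are hypothetical, purely for illustration.

```python
# Toy CPU/GPU bottleneck model. The caps (60, 90, 40) are made-up numbers.

def frame_rate(cpu_fps_cap, gpu_fps_cap):
    """The game runs at the rate of the slower component."""
    return min(cpu_fps_cap, gpu_fps_cap)

def cpu_usage(cpu_fps_cap, actual_fps):
    """CPU load scales with how close the game runs to the CPU's own cap."""
    return 100 * actual_fps / cpu_fps_cap

cpu_cap = 60  # hypothetical: what this CPU could drive on its own
low_settings_fps  = frame_rate(cpu_cap, gpu_fps_cap=90)  # CPU-bound
high_settings_fps = frame_rate(cpu_cap, gpu_fps_cap=40)  # GPU-bound

print(cpu_usage(cpu_cap, low_settings_fps))   # 100.0 -> CPU maxed out
print(cpu_usage(cpu_cap, high_settings_fps))  # ~66.7 -> load drops
```

With low settings the GPU could deliver 90 FPS, so the CPU runs flat out at its 60 FPS ceiling; with heavy settings the GPU only manages 40 FPS and the CPU idles part of each frame.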
 

Toasted

Junior Member
Feb 19, 2010
3
0
0
Yeah, that's what I figured, but I still can't explain the random jump in CPU usage. Every comment I've read about dual cores mentions around 60-80% load, not a constant 100%.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
They released an update that was supposed to be more optimized for dual cores.

Also they've said that the settings in the Beta are all on medium. Even if you set it to high it's still medium.
 

Toasted

Junior Member
Feb 19, 2010
3
0
0
When you say more optimized for dual cores, does that mean shooting the 70-80% load (which I had no problems with) up to a constant 100%, which causes my game to stutter every time there is more processing to do? I'm just confused about what optimizing entails.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
They released an update that was supposed to be more optimized for dual cores.

Also they've said that the settings in the Beta are all on medium. Even if you set it to high it's still medium.
I'm looking forward to "high" settings and DX11; I like having eye candy in multiplayer as well.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Doubtful. The Q9550 has the same 6MB of cache shared between each pair of cores as the E8500 (it's essentially two E8500s glued together), and if data has to be shared across the two pairs of cores, the performance penalty is large because it has to travel over the front-side bus. The performance gains come from the additional cores, but the gain is just too small.

I personally went from an E8400 @ 3.6 to a Q9550 @ 3.6, and yes, the quad is faster. The same is true in Far Cry 2, for some odd reason.

With new upcoming games like Metro, the quad is going mainstream.
The days of the fast dual core giving equal performance are numbered.
 
Last edited:

glorylin

Junior Member
Sep 20, 2008
13
0
0
Hi happy medium. I don't post here much, but I saw your reply and wanted to ask you personally how big of an increase in performance you saw going from an E8400 to a Q9550.

I'm on an E8400 @ 3.82 and will be getting my Q9550 Monday. Hoping to be able to hit 4 GHz on it.


Before Patch: DX10 - Everything high except shadows & level of detail @ medium. AA off, AF off.

Min: 14
Max: 164
Avg: 38.418

After Patch: DX10 - Everything high except shadows & level of detail @ medium. AA off, AF off.

Min: 24
Max: 112
Avg: 52.285

After Patch: DX9 - Everything high except shadows & level of detail @ medium. AA off, AF off.

Min: 25
Max: 154
Avg: 54.472

After Patch: DX10 - Everything high, AA off, AF off.

Min: 23
Max: 127
Avg: 48.084

Those are the results I got running at 1680x1050. I'm very disappointed with the way it runs, and I'm hoping the quad will increase my average by a substantial amount (+10-15 avg).
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Whatever they did in the patch fixed the DX10 performance issues I was having. I can use AA now, yay ^_^
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
Same. My dualie is performing much better after the patch. Seriously can't believe how many people I've seen lately writing that a highly clocked dual just won't cut it. Utter BS. I'm not saying some games don't benefit more from quads, but no benchmarks have given me any reason to go to a Core 2 Quad or a Core i-series chip. Maybe Sandy Bridge or Bulldozer will justify a new build for a gamer like myself.
 

glorylin

Junior Member
Sep 20, 2008
13
0
0
Same. My dualie is performing much better after the patch. Seriously can't believe how many people I've seen lately writing that a highly clocked dual just won't cut it. Utter BS. I'm not saying some games don't benefit more from quads, but no benchmarks have given me any reason to go to a Core 2 Quad or a Core i-series chip. Maybe Sandy Bridge or Bulldozer will justify a new build for a gamer like myself.


It performs better outside of action, but during fights my FPS is still in the 30s. Anything under the high 40s is super irritating to me.

Before the patch, quads were getting a 2:1 performance increase over the duals. I'm not sure what it's like now, but I still hear there's a solid 50-70% increase in avg frames.
 

EvilComputer92

Golden Member
Aug 25, 2004
1,316
0
0
It performs better outside of action, but during fights my FPS is still in the 30s. Anything under the high 40s is super irritating to me.

Before the patch, quads were getting a 2:1 performance increase over the duals. I'm not sure what it's like now, but I still hear there's a solid 50-70% increase in avg frames.

The game runs better after the patch on my dual core. However, all this whining about dual cores being phased out is a load of crap. Mostly it's shoddy programming rather than actual performance being extracted from the extra cores.

For example, all audio is done on the processor even if there is dedicated audio hardware available, which is an obvious console-port issue.

When Crysis runs well on dual cores and this doesn't, there's a problem.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
So taking your statement, it shouldn't be difficult to see what's going on here: In a multithreaded optimized application (using more than two threads), a multi-core processor is obviously going to perform better than a dual core processor. Again, this is an extremely simplified view, as not all threads in this title are going to be doing the exact same thing, thereby weighting thread performance differently. But still, more cores (up to whatever the application was designed for) will make a performance difference. In essence, you just disproved your own argument.

Please do explain how an E8400 @ 3 GHz is just as fast as a Q6600 @ 2.4 GHz in GTA 4.

http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Please do explain how an E8400 @ 3 GHz is just as fast as a Q6600 @ 2.4 GHz in GTA 4.

http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2

Because GTA4 only uses 3 threads (as far as I can tell, it never goes over 75% usage on my quad) and the E8400 is faster per clock than the Q6600.

If we have 3 cores at 2.4 GHz (the Q6600 running GTA4), that is around 7.2 GHz worth of power (assuming linear scaling, no cache-sharing issues, etc.). If we have 2 cores at 3.0 GHz, then we have 6.0 GHz worth of power, but if the E8400 is 10-15% faster per clock, that 6 GHz is more like 6.8 GHz in terms of Q6600 power, which would put it right up with the Q6600 in this application.
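The back-of-the-envelope math above works out like this (the 13% per-clock advantage is an assumed midpoint of the quoted 10-15% range):

```python
# Clock-throughput comparison from the reasoning above.
# Assumes linear scaling with cores and clocks, as the post does.

q6600_clock = 2.4      # GHz per core
threads_used = 3       # GTA4 appears to load only 3 threads
q6600_effective = q6600_clock * threads_used          # 7.2 "GHz worth" of work

e8400_clock = 3.0      # GHz per core
e8400_cores = 2
ipc_advantage = 1.13   # assumed midpoint of the 10-15% per-clock advantage
e8400_effective = e8400_clock * e8400_cores * ipc_advantage  # ~6.8 GHz-equivalent

print(q6600_effective, round(e8400_effective, 2))  # 7.2 vs 6.78
```

Under those assumptions the two chips land within about 6% of each other, which is consistent with the benchmark showing them roughly tied.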
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
In Windows 7 all the cores are being used, even in a single-threaded application
Ex falso quodlibet?

Win7 does a lot of things, but it tries not to utilize all cores if there's no reason to: core parking and stuff.
 
Last edited:

Phil1977

Senior member
Dec 8, 2009
228
0
0
Please do explain how E8400@ 3ghz is just as fast Q6600 @ 2.4ghz in GTA 4.

http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2

Different CPUs. 45nm vs. 65nm. Cache size is different. FSB is different. Wolfdale has a massive cache, very good for games. Newer architecture.

Compare the E6600 with the Q6600. That tells the real story. Same 65nm, same cache, same generation, same clock speed, same FSB.

64% improvement in minimum frames by going from 2 to 4 cores. Very good core scaling!

If you want to compare the Wolfdale core, compare the E8400 with the QX6850 (same 45nm generation, same cache, same FSB, same clock). Also great scaling.
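As a rough illustration (my own arithmetic, not figures from the linked review), Amdahl's law can back out what parallel fraction a 64% minimum-frame gain from 2 to 4 cores would imply:

```python
# Amdahl's-law illustration: find the parallel fraction p such that going
# from 2 to 4 cores yields a 1.64x speedup. Assumes frame times scale with
# total runtime, which is a simplification.

def relative_speedup(p, n_from, n_to):
    """Amdahl's law: fraction (1-p) of the work is serial, p is parallel."""
    t = lambda n: (1 - p) + p / n   # normalized runtime on n cores
    return t(n_from) / t(n_to)

# relative_speedup is monotonically increasing in p, so binary-search it
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if relative_speedup(mid, 2, 4) < 1.64:
        lo = mid
    else:
        hi = mid

print(round(lo, 3))  # ~0.877: the workload would need to be ~88% parallel
```

Solving the same equation by hand gives p = 0.64 / 0.73 ≈ 0.877, i.e. the game's frame loop would have to be almost 88% parallelizable to show that kind of scaling.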
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Different CPUs. 45nm vs. 65nm. Cache size is different. FSB is different. Wolfdale has a massive cache, very good for games. Newer architecture.

Compare the E6600 with the Q6600. That tells the real story. Same 65nm, same cache, same generation, same clock speed, same FSB.

64% improvement in minimum frames by going from 2 to 4 cores. Very good core scaling!

If you want to compare the Wolfdale core, compare the E8400 with the QX6850 (same 45nm generation, same cache, same FSB, same clock). Also great scaling.

I thought all QX6xxx CPUs were 65nm?
Did you mean the QX8xxx or QX9xxx series?
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
Looks like my 4850 can barely do 12x10 on this with AA. As long as I get 30 somehow, I can live with it.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I believe 100% CPU usage in games is impossible unless all the graphics were rendered on the CPU, like in old DOS games. At least I haven't seen 100% usage in any game I can remember. I always have the ATI Tray Tools overlay running; it shows CPU usage along with other probe readings.

But the video card does the rendering now, and the CPU has to wait for the GPU at times. Also, some threads need to wait for other threads to finish and sync with each other.

Maybe in a chess game where the graphics are minimal?

STALKER Call of Pripyat uses one of my 4 cores at a full 100% all the time (that's the usage I found monitoring with RivaTuner), so Windows 7 doesn't always spread threads across all the cores. Bioshock uses two cores at an average of 27% to 54% while the other two never exceed 24% and can go as low as 7%, which would mean it's optimized for dual core only.

Look how much better the i7 is in GTA 4. It kills the Core 2 Quad, never mind the Core 2 Duo, on a clock-for-clock basis. http://www.pcgameshardware.com/aid,...ead-of-Core-2-Quad-in-CPU-benchmarks/Reviews/

I find it odd that in every thread you keep hailing the i7; you should buy one. The i7 architecture isn't any wider than the Core 2 Quad architecture, so the IPC isn't that much different, and the C2Q and C2D share the same IPC.

The main advantages of the i7 are server-oriented tweaks that boost performance: an integrated memory controller, more SSE instructions, cache tweaks, Turbo Boost for single-threaded work, and Hyper-Threading, which helps maximize the front-end utilization that went underused a great deal of the time on the Core 2 architecture. It may not be a big deal, but the i7 is also a true quad core; on the C2Q, communication between the first pair of cores and the second pair was abysmally slow. The Core i7 has a higher-latency cache than the C2Q, but the C2Q, besides its underutilized front end, was very limited by the front-side bus and its non-native quad core design.

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3382&p=3

"Conroe was the first Intel processor to introduce this 4-issue front end. The processor could decode, rename and retire up to four micro-ops at the same time. Conroe’s width actually went under utilized a great deal of the time, something that Nehalem did address, but fundamentally there was no reason to go wider. "

GTA4 will never be the best representation of future games. The game has crappy graphics that can't even properly tax an HD 3870 in raw power, it is heavily CPU bound, and it has lots of uncompressed textures that will easily fill a 2GB card, making it a VRAM/texture-limited game. It's simply a crappy game with no optimizations. If the game were properly done, it would run great on a dual core and a 9600GT with no problems; so much hardware for so little in return. Crysis did it much better: it offered far better graphics, and running it on a quad core doesn't give you lots of benefits.

http://www.behardware.com/articles/778-10/giant-roundup-131-intel-and-amd-processors.html

These benchmarks show that Nehalem is barely faster than the C2Q and Phenom II in gaming performance; in everything else, the biggest difference is only 50%. In context, Nehalem is simply a rehashed C2Q with a true quad design, an IMC, minor tweaks for better threading performance, server-oriented performance tweaks, and a redesigned cache. At the execution-engine level, they're not too different. But there's no reason to dump the design, since it's a very good one that traces back through the Pentium M to the P6 design.
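The per-core readings earlier in this post also show why overall CPU usage rarely reads 100%: most monitoring tools report the average across all cores. A minimal sketch with hypothetical per-core numbers echoing those observations:

```python
# Overall CPU % as averaging tools report it. The per-core figures are
# hypothetical, modeled on the STALKER (one core pegged) and Bioshock
# (roughly two busy cores) observations above.

def overall_usage(per_core):
    """Overall CPU % = mean of the per-core percentages."""
    return sum(per_core) / len(per_core)

stalker  = [100, 5, 5, 5]    # one thread pins one of four cores
bioshock = [54, 40, 15, 10]  # roughly two busy cores on a quad

print(overall_usage(stalker))   # 28.75 -> far from "maxed out" overall
print(overall_usage(bioshock))  # 29.75
```

So a game can have one core completely saturated (and be bottlenecked by it) while the headline CPU usage figure sits below 30%.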
 
Last edited:

Phil1977

Senior member
Dec 8, 2009
228
0
0
Well, I had an AMD Athlon II 250. I just upgraded.

Why would I upgrade to a Core 2 system? Prices are about the same between a Core 2 board + RAM + CPU and going with an i5/i7 system.

I got the i7 860, a cheap H55 GB board, and cheap PNY 1333 DDR3. Should last me for the next few years or so.

I heard that the next Intel architecture is only 20% faster clock for clock compared to the i7...