Battlefield 3 Armored Kill GPU & CPU Performance - GameGPU.ru


MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Tahiti scales very well with clock speed in this game. At 1300MHz it crushes it.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
Look at my sig. In Bandar Desert, on the crane in the main flag area, my FPS was just above 50. So far my min FPS in this map expansion has been in the low 50s, so I wonder where they were testing to get a min of 100. Hmm... Perhaps my DDR3 1600 is too slow?
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Look at my sig. In Bandar Desert, on the crane in the main flag area, my FPS was just above 50. So far my min FPS in this map expansion has been in the low 50s, so I wonder where they were testing to get a min of 100. Hmm... Perhaps my DDR3 1600 is too slow?

You're probably looking in the wrong places. How about in a busy intersection with tanks rolling in and a bunch of players firing at each other? Maybe a few explosions.

I'd expect the top of a crane to be GPU intensive but quite light on the CPU.

That's the nature of minimums though - you're not going to see them on average.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
You're probably looking in the wrong places. How about in a busy intersection with tanks rolling in and a bunch of players firing at each other? Maybe a few explosions.

I'd expect the top of a crane to be GPU intensive but quite light on the CPU.

That's the nature of minimums though - you're not going to see them on average.

It's CPU intensive to overlook a lot of action. I've done enough testing to find the bottleneck: GPU usage drops like a rock and FPS tanks. During regular gameplay, when I'm on the ground, FPS is much higher, but nowhere near a 100 FPS minimum regardless of resolution. I don't know what they did for testing, but those CPU numbers are not what you should expect; they're way too optimistic. Sorry, but I'm just telling the truth here.
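
(A minimal sketch of the test I'm describing, assuming you log GPU usage with whatever monitoring tool you use; the 90% cutoff and the sample readings are just a rule of thumb, not measurements:)

```python
# If FPS drops while the GPU is nowhere near fully loaded, the GPU is waiting
# on the CPU (or memory). Threshold and readings are illustrative assumptions.
def classify_bottleneck(gpu_usage_pct, gpu_busy_threshold=90.0):
    if gpu_usage_pct >= gpu_busy_threshold:
        return "GPU-bound"          # GPU is the limit; a faster GPU helps
    return "CPU/other-bound"        # GPU is idling; CPU/RAM is the limit

print(classify_bottleneck(98))      # typical on-the-ground gameplay
print(classify_bottleneck(60))      # overlooking the whole map from the crane
```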
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Tahiti scales very well with clock speed in this game. At 1300MHz it crushes it.

Not at all. Kepler still crushes a 1300MHz 7970 in BF3.

The 7970 @ 1300MHz is still SIGNIFICANTLY slower than the 1286MHz GTX 680. Assume we give the 7970 another 10% OC to bring it to 1300MHz, and assume 100% scaling; we still get the following:

7970 @ 2560x1600 BF3 Ultra Quality + FXAA-High = ~72fps
GTX 680 @ 2560x1600 BF3 Ultra Quality + FXAA-High = 79.1fps

7970 @ 1920x1200 BF3 Ultra Quality + 4xMSAA = ~70fps
GTX 680 @ 1920x1200 BF3 Ultra Quality + 4xMSAA = 79.9fps

SOURCE: http://www.anandtech.com/show/6096/evga-geforce-gtx-680-classified-review/9 and http://www.anandtech.com/show/6025/radeon-hd-7970-ghz-edition-review-catching-up-to-gtx-680/18
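
(The estimate above is just linear scaling with core clock. Here's the arithmetic as a rough sketch; the ~1182MHz starting clock is only implied by the "+10% to 1300MHz" assumption, and the starting fps values are illustrative back-calculations, not numbers taken from the reviews:)

```python
# Rough sketch of the linear-scaling estimate. Base clock ~1182MHz is implied
# by "+10% OC to reach 1300MHz"; base fps values are illustrative only.
def scale_fps(fps_at_base, base_mhz, target_mhz):
    return fps_at_base * target_mhz / base_mhz   # assumes 100% clock scaling

base_mhz, target_mhz = 1182, 1300
print(round(scale_fps(65.5, base_mhz, target_mhz), 1))  # ~72 fps @ 2560x1600
print(round(scale_fps(63.6, base_mhz, target_mhz), 1))  # ~70 fps @ 1920x1200
```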

Don't get me wrong, I love the 7950/7970, but Kepler dominates BF3 without question at the high end. It is also a lot easier for a standard 680 to hit 1200-1250MHz than for a 7970 to hit 1200-1300MHz.
 
Last edited by a moderator:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
It's CPU intensive to overlook a lot of action. I've done enough testing to find the bottleneck: GPU usage drops like a rock and FPS tanks. During regular gameplay, when I'm on the ground, FPS is much higher, but nowhere near a 100 FPS minimum regardless of resolution. I don't know what they did for testing, but those CPU numbers are not what you should expect; they're way too optimistic. Sorry, but I'm just telling the truth here.

Conceded.

Paints an even darker picture for slower chips.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126

Not at all. Kepler still crushes a 1300mhz 7970 in BF3.
The 7970 @ 1300mhz is still SIGNIFICANTLY slower than the 1286mhz GTX 680.

If by "significantly" you mean by 10% or less, sure. Calling that "crushing" is ridiculous (especially when the 680 gets slaughtered in Metro 2033, Dirt Showdown, Sniper Elite, Alan Wake, and Anno 2070 by far more than 10%).

AT's GPU reviews are consistently proving to be way off from everyone else's this generation. Not sure what they are doing, honestly. For starters, when AT's review shows a GTX680 OC being 40% faster than an HD7970 GE in BF3, you gotta ask yourself: does this even make sense? Did I see anything like this anywhere else on the web? The answer is NOT EVEN CLOSE. Sorry, but when 10-20 professional reviews on the Internet cannot corroborate this performance delta, AT's review is a major outlier. If the delta were real, some other major reviewers would show a similarly startling gap, but they don't.

Let's look at the world-wide web to see if AT's numbers actually make sense:

Most professional reviews show that the HD7970 GE at 1050MHz and the GTX680 are about 5-10% apart in BF3. I wouldn't call that crushing by any means:

1. Computerbase doesn't show 680 winning by much as AT does, tied at high-rez:
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-660/28/

2. Xbitlabs - same story, 7970 GE OC and 680 OC are very close:
[Xbitlabs BF3 benchmark chart]


3. HardOCP - 680 OC vs. 7970 OC very close:
[HardOCP BF3 benchmark chart]


4. TechPowerup - same story, barely any difference between 680 and 7970 GE and 7970 GE wins at high rez:
http://www.techpowerup.com/reviews/Sapphire/HD_7970_Toxic_6_GB/8.html

5. HT4U - same story, 680 leads but not by more than 10%:
http://ht4u.net/reviews/2012/nvidia..._gtx660_sc_msi_n660gtx_tf_oc_test/index26.php

6. Legion Hardware - neck and neck at 1080P:
http://www.legionhardware.com/images/review/Gigabyte_GeForce_GTX_660_Ti/BF3_02.png

7. Hexus - 680 OC vs. 7970 OC - very close:
http://hexus.net/tech/reviews/graphics/42489-sapphire-radeon-hd-7970-toxic-6gb/?page=7

8. KitGuru - Very close:
http://www.kitguru.net/components/g...-x-turbo-3gb-iceq-x2-graphics-card-review/16/

9. BitTech - very close:
http://www.bit-tech.net/hardware/graphics/2012/08/13/palit-geforce-gtx-680-2gb-jetstream-review/3

10. PureOverclock - 680 actually loses to HD7970 GE:
http://www.pureoverclock.com/Review-detail/his-7970-x/8/

11. Guru3D - 680 is faster than 7970 but loses to 7970 GE:
http://www.guru3d.com/article/his-radeon-hd-7970-x-turbo-edition-review/21

12. TechReport - very close, 680 AMP! has a slight lead:
http://techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/4

13. Sweclockers - not much to it:
http://www.sweclockers.com/image/diagram/3140?k=2dea8a82033a7e6f242bec8c7181f7a1

14. Tom's Hardware - very close and 7970 actually wins at high rez:
http://www.tomshardware.com/reviews/radeon-hd-7970-ghz-edition-review-benchmark,3232-8.html

15. TechSpot - pretty much within frames apart:
http://www.techspot.com/review/572-nvidia-geforce-gtx-660/page4.html

And then you have review #16, from GameGPU.ru, where the performance delta between the 7970 GE and 680 is about 5-10% at most. And that review actually tested a multiplayer scenario. What do we see:

1080P
HD7970 GE = 54 fps avg / 41 fps min
680 = 57 fps avg / 42 fps min (wow! 6% higher)

1600P
7970 GE = 39 fps avg / 27 fps min
680 = 36 fps avg / 24 fps min (680 loses!)
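
(Spelling out the percentage deltas quoted above:)

```python
# The deltas above are plain relative differences between the quoted averages.
def delta_pct(a, b):
    return (a - b) / b * 100.0

print(round(delta_pct(57, 54), 1))  # 1080P avg: 680 vs 7970 GE -> ~5.6% ahead
print(round(delta_pct(36, 39), 1))  # 1600P avg: 680 vs 7970 GE -> ~-7.7% (behind)
```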

So how do you explain the massive performance difference in AT's review when 15+ reviews worldwide show that the performance delta is actually fairly close?

You are saying the GTX680 OC crushes the HD7970 OC in BF3? You are looking at a 5-10% delta here. Crushing would be one videocard leading by 20-30%+. And once you get to 2560x1600, there is no way the GTX680 is crushing anything, since it performs much worse beyond 1080P/1200P. Since MrK6 has a 30-inch monitor, he was probably referring to an HD7970 at 1300MHz+ beating a 680 at his resolution. That's probably true too.

Not to mention the GTX680 is $500 and HD7970 GE cards are $450. So no, the GTX680 does not crush the 7970 GE in BF3, not even close. Is the 680 a faster card in BF3? Yes, at 1080P, but the difference is very small. At 1600P it's even less conclusive. The outlier here is actually the AT review you linked, and since you based your calculations on that review, they aren't really relevant when we already have real-world 7970 OC vs. GTX680 OC data.

And every single review I linked is very recent, to keep the comparison as fair as possible.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's a real shame they decided to do the CPU tests at such a low resolution. No one runs the game that low unless their GPU is also pretty old. Realistically, a Bulldozer shouldn't bottleneck a 6990 very much based on the results above, but without seeing the actual data it's not clear.

I keep hearing that BF3 is a real CPU killer in multiplayer, and while this data somewhat supports that (frame rate halves from an empty server to 64 players at this very low quality setting), in practice it still stays within the GPU's performance bounds.

On the other hand, it's nice to see the 3930K showing a small improvement over the 2600K; those 2 extra cores give just a little more :)

Is it the extra cores or the extra mem/pcie bandwidth?
 

legcramp

Golden Member
May 31, 2005
1,671
113
116
Not at all. Kepler still crushes a 1300MHz 7970 in BF3.

The 7970 @ 1300MHz is still SIGNIFICANTLY slower than the 1286MHz GTX 680. Assume we give the 7970 another 10% OC to bring it to 1300MHz, and assume 100% scaling; we still get the following:

7970 @ 2560x1600 BF3 Ultra Quality + FXAA-High = ~72fps
GTX 680 @ 2560x1600 BF3 Ultra Quality + FXAA-High = 79.1fps

7970 @ 1920x1200 BF3 Ultra Quality + 4xMSAA = ~70fps
GTX 680 @ 1920x1200 BF3 Ultra Quality + 4xMSAA = 79.9fps

SOURCE: http://www.anandtech.com/show/6096/evga-geforce-gtx-680-classified-review/9 and http://www.anandtech.com/show/6025/radeon-hd-7970-ghz-edition-review-catching-up-to-gtx-680/18

Don't get me wrong, I love the 7950/7970, but Kepler dominates BF3 without question at the high end. It is also a lot easier for a standard 680 to hit 1200-1250MHz than for a 7970 to hit 1200-1300MHz.

Actually, the 7950 doesn't even need to be clock-for-clock with the 670 to "crush" it :D

http://www.hardocp.com/article/2012/08/23/galaxy_gtx_660_ti_gc_oc_vs_670_hd_7950/2
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
LOL @ having to use an OC'd card against a stock 680 and still losing.

Sad.

If offered the two I'd take the 680, but...

Is it really overclocked if it comes that way from AMD? I mean, the 7970GE is an official AMD part, it's not even a "factory OC".

EDIT: I assume you're talking about the GE, but I'm not certain which post you're responding to.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
LOL @ having to use an OC'd card against a stock 680 and still losing.

Sad.

You realize the gtx680 overclocks itself right?

I would honestly say that Anandtech is using a different single player section to benchmark the game than everyone else. Either way, I considered going back to Nvidia for BF3 performance, but I saw the number on the 7970 at launch and bought one right away.

I had a gut feeling that they would pull off some miracles with their new commitment to drivers and developer relations, and so far it's paying off.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
You realize the gtx680 overclocks itself right?

I don't think you understand the word "overclocks".
It doesn't mean what you want it to mean.

GPU Boost != overclocking.

Just like:

Intel Turbo Boost != overclocking.

Or

AMD Turbo Core != overclocking.

Amazing how people on a tech site cannot grasp the most simple concepts.

It's really simple:

http://en.wikipedia.org/wiki/Overclocking
Overclocking is the process of making a computer or component operate faster than the clock frequency specified by the manufacturer by modifying system parameters

Got it now?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
And for the topic:

Oh really?!

When BF3 tries to be more like ARMA 2... performance tanks?
Really?
Trying to do more computations means lower performance?
Really?

In before the first posters come in and start on about "unoptimized game code"... based purely on benchmark numbers, with no understanding of the computations actually going on.

*sigh*
 

Granseth

Senior member
May 6, 2009
258
0
71
@Lonbjerg
I agree that the boost isn't OC, but the same goes for the GHz ed., since that is its manufacturer-specified clock frequency. So maybe you should enlighten OCguy too.
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91
And for the topic:

Oh really?!

When BF3 tries to be more like ARMA 2... performance tanks?
Really?
Trying to do more computations means lower performance?
Really?

In before the first posters come in and start on about "unoptimized game code"... based purely on benchmark numbers, with no understanding of the computations actually going on.

*sigh*

Where was ARMA 2 mentioned in this thread?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
@Lonbjerg
I agree that the boost isn't OC, but the same goes for the GHz ed., since that is its manufacturer-specified clock frequency. So maybe you should enlighten OCguy too.

The post was meant to enlighten everyone who cannot figure out the definition of "overclock"... red, green, or lemonblue, I don't care.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Overclocking is the process of making a computer or component operate faster than the clock frequency specified by the manufacturer by modifying system parameters

I will have to disagree with that.

An OVER-clock is any frequency OVER the base frequency, no matter whether it was set by the manufacturer or by the user.

It is the BASE frequency that the CPU or GPU operates at under 100% load, 24/7, with technologies like Turbo Boost/Turbo Core etc. providing an extra performance boost for a short period of time (or with fewer cores active).
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
You are saying the GTX680 OC crushes the HD7970 OC in BF3? You are looking at a 5-10% delta here. Crushing would be one videocard leading by 20-30%+. And once you get to 2560x1600, there is no way the GTX680 is crushing anything, since it performs much worse beyond 1080P/1200P. Since MrK6 has a 30-inch monitor, he was probably referring to an HD7970 at 1300MHz+ beating a 680 at his resolution. That's probably true too.
That was an excellent meta-analysis of the data that's out there :thumbsup:. I was actually referring to my 7970 crushing the game itself (as in performance is stellar), but that's the problem with using pronouns around here. :D

The post was meant to enlighten everyone who cannot figure out the definition of "overclock"... red, green, or lemonblue, I don't care.
Why are you so upset over semantics? We're still talking about computer hardware, right?
 

Granseth

Senior member
May 6, 2009
258
0
71
The post was meant to enlighten everyone who cannot figure out the definition of "overclock"... red, green, or lemonblue, I don't care.

I would be scared if I found a blue lemon. Is that really a color?

Anyway, I think I have been reading too many mine-is-bigger-than-yours posts, so I've started reading too much into the words :)
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I will have to disagree with that.

An OVER-clock is any frequency OVER the base frequency, no matter whether it was set by the manufacturer or by the user.

It is the BASE frequency that the CPU or GPU operates at under 100% load, 24/7, with technologies like Turbo Boost/Turbo Core etc. providing an extra performance boost for a short period of time (or with fewer cores active).

I think that's a mostly fair definition, but the lines aren't so clear. At stock, my CPU sits at 1.6GHz, and depending on its load it might pick a frequency anywhere between 1.6 and 3.8GHz.

The 3570K can "turbo" to 3.8GHz at stock all day, over its 3.4GHz stock speed, and still stay within its TDP specifications (there are exactly zero circumstances in a desktop computer where it won't be at a "turbo" frequency). So is 3.8 really an overclock from 3.4, when the CPU is as likely to pick 3.4 as any other frequency between 1.6 and 3.8? For all practical purposes, the 3.4 -> 3.8 range is only different because it uses a different voltage table.

You might have a case for that argument with a laptop version of a chip, where the TDP is "artificially" lowered in order to save power.

It's all semantics, really, but my inclination is instead to say that any frequency the CPU (or GPU) picks on its own, without user intervention, is not an overclock.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
The Core i5 3570K's base frequency is 3.4GHz with four cores. At 1.6GHz it's a DOWN-clock from the base frequency, whether it's done automatically or manually by the user. At 3.8GHz it's a single-core OVER-clock from the base frequency, no matter if it's done automatically or manually by the user.
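
(Put as a simple rule, that's all I'm saying; a minimal sketch, using the 3570K's 3.4GHz base clock as the example:)

```python
# Relative to the base clock, any higher frequency is an over-clock and any
# lower one is a down-clock, regardless of whether the chip or the user set it.
def classify(freq_ghz, base_ghz=3.4):          # 3.4GHz = i5-3570K base clock
    if freq_ghz > base_ghz:
        return "OVER-clock"
    if freq_ghz < base_ghz:
        return "DOWN-clock"
    return "base clock"

for f in (1.6, 3.4, 3.8):
    print(f, "GHz ->", classify(f))
```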
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I don't think you understand the word "overclocks".
It doesn't mean what you want it to mean.

GPU Boost != overclocking.

Just like:

Intel Turbo Boost != overclocking.

Or

AMD Turbo Core != overclocking.

Amazing how people on a tech site cannot grasp the most simple concepts.

It's really simple:

http://en.wikipedia.org/wiki/Overclocking


Overclocking is the process of making a computer or component operate faster than the clock frequency specified by the manufacturer by modifying system parameters
Got it now?

Yes, I got it. You bolded the parts for me that proved my point. Thank you.
 

Crap Daddy

Senior member
May 6, 2011
610
0
0
The Core i5 3570K's base frequency is 3.4GHz with four cores.

With turbo it's 3.5GHz on all cores, 3.6GHz on three, 3.7GHz on two, and 3.8GHz on one.
Very rarely will you see just one core working, but with all four it's 3.5GHz.
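
(Or, as a quick lookup by active core count; these are the stock bins listed above:)

```python
# Stock i5-3570K turbo bins by number of active cores
turbo_ghz = {1: 3.8, 2: 3.7, 3: 3.6, 4: 3.5}
for active_cores, ghz in sorted(turbo_ghz.items()):
    print(active_cores, "core(s) active ->", ghz, "GHz")
```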
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Makes me happy I went SLI.

You realize the gtx680 overclocks itself right?

I would honestly say that Anandtech is using a different single player section to benchmark the game than everyone else. Either way, I considered going back to Nvidia for BF3 performance, but I saw the number on the 7970 at launch and bought one right away.

I had a gut feeling that they would pull off some miracles with their new commitment to drivers and developer relations, and so far it's paying off.

The boost clock is set in the BIOS. It boosts based on temps, GPU load, TDP, etc. It's not an overclock at all. An overclock is when you go into Afterburner, set the GPU core to +10, and add 10MHz to the stock frequency.

The BIOS is handling all the calculations for what clock speed to boost the GPU to. Overclocking is when the end user adjusts the clock frequency above what comes out of the box.
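
(Very roughly, the distinction looks something like this. This is only a toy sketch of the idea, not NVIDIA's actual boost algorithm, and the clocks and limits are made-up illustrative numbers:)

```python
# Toy model: the card raises its own clock toward the BIOS boost limit while
# temperature and board power stay under their limits; only a user-applied
# offset (e.g. set in Afterburner) counts as an overclock. Illustrative only.
def effective_clock_mhz(base=1006, boost_limit=1058, temp_c=65, power_w=170,
                        temp_limit_c=80, power_limit_w=195, user_offset=0):
    clock = boost_limit if (temp_c < temp_limit_c and power_w < power_limit_w) else base
    return clock + user_offset

print(effective_clock_mhz())                  # stock card boosting: not an OC
print(effective_clock_mhz(user_offset=100))   # +100MHz user offset: an OC
```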

The Core i5 3570K's base frequency is 3.4GHz with four cores. At 1.6GHz it's a DOWN-clock from the base frequency, whether it's done automatically or manually by the user. At 3.8GHz it's a single-core OVER-clock from the base frequency, no matter if it's done automatically or manually by the user.

It's not considered an overclock because it's stock functionality. If you ask Intel whether your CPU running at 3.8GHz Turbo counts as overclocking (and thus voids your warranty), they would give a resounding no.
 
Last edited: