Battlefield 3 recommended GPU specs out

Page 5 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Karl Agathon

Golden Member
Sep 30, 2010
1,081
0
0
Just ordered a PC with a single 580; glad it will be able to play this at decent settings. I've always been a COD guy (console versions), but I might just give the BF3 PC version a chance this go-around.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Just ordered a PC with a single 580; glad it will be able to play this at decent settings.

Looks like we were all optimistic this time.

"When asked if a single Nvidia GeForce GTX 580 1.5GB would have enough power to run the game at Ultra settings, Matros went on to say that if you were aiming for that level of detail from the Frostbite 2 engine, then you’ll probably need two GeForce GTX 580s in SLI configuration." :eek:

Source: http://www.bit-tech.net/news/gaming/2011/09/23/battlefield-3-recommended-specs-only-good-f/1
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
Looks like we were all optimistic this time.

"When asked if a single Nvidia GeForce GTX 580 1.5GB would have enough power to run the game at Ultra settings, Matros went on to say that if you were aiming for that level of detail from the Frostbite 2 engine, then you’ll probably need two GeForce GTX 580s in SLI configuration." :eek:

Source: http://www.bit-tech.net/news/gaming/2011/09/23/battlefield-3-recommended-specs-only-good-f/1

:eek: Where are these new cards? lol
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
:eek: Where are these new cards? lol

Single cards can play it at high quality; a 6950 or 560 Ti (or better) will run it fine at 1080p. Benchmarks have already been done showing fluid gameplay with single GPUs.

I'm guessing the Ultra setting is somewhat similar to the "uber" setting in The Witcher 2... that will probably require SLI/CrossFire, but that setting is out of reach of most players anyway.
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
Hmmmm, guess this means I'll need that extra 6970 in order to play this game at 5760x1200 with eye candy on...

Now I just need to convince the wife that the authorization to purchase extra 24" monitors somehow also meant she authorized another GPU ;)
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I hope you're talking about something other than the Alpha, because those numbers don't mean much.

So you're anticipating that the final product will have worse performance than the current alpha? Interesting. Performance will only go up between now and release, and it's ridiculous to think that a single high-end GPU (e.g. a 6970 or GTX 570/580) won't be able to play the game fluidly. So I'll repeat: a high-end video card will play BF3 at HQ settings, 1080p, just fine.

If you want Ultra quality with SSAA, then yes: I don't know of any card on the planet that can run a DX11 game with SSAA/16x AF at a good framerate. The GTX 580 can't run supersample AA with 16x AF and ubersampling in The Witcher 2 at a good framerate, and neither can the 6970. If you want Ultra SSAA and ultra AF, you need SLI or CrossFire. That applies to any DX11 game, not just BF3.
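Rough back-of-the-envelope for why SSAA hits so hard (my own illustrative sketch, not anyone's benchmark): supersampling renders the scene at a multiple of the output resolution, so shading work scales roughly with the sample factor.

```python
def ssaa_pixel_cost(width: int, height: int, factor: int) -> int:
    """Approximate pixels shaded per frame with `factor`x supersampling."""
    return width * height * factor

# 1080p with no supersampling vs. 4x SSAA
base = ssaa_pixel_cost(1920, 1080, 1)   # 2,073,600 pixels
ssaa4 = ssaa_pixel_cost(1920, 1080, 4)  # 8,294,400 pixels

print(ssaa4 / base)  # 4.0 -- roughly 4x the shading work per frame
```

That's a simplification (bandwidth and geometry costs don't all scale the same way), but it shows why even a GTX 580 chokes once SSAA is stacked on a DX11 workload.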

I'm in disbelief that anyone would think a game would be designed solely for people with SLI or CrossFire and $700 video card setups. I'm sure that would sell REALLY well :D Basically, the Ultra setting is just DICE throwing a bone to users of multi-GPU systems, but it's by no means required for a good experience. Most everyone else will fall back on the High quality setting.

 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
Hmmmm, guess this means I'll need that extra 6970 in order to play this game at 5760x1200 with eye candy on...

Now I just need to convince the wife that the authorization to purchase extra 24" monitors somehow also meant she authorized another GPU ;)

Well, you could always try to convince her the graphics card comes with the new monitor in some super-duper mega-deal combo :p Scrub that, I'm being devious :p (takes a friendly slap from the gf).
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Hmmmm, guess this means I'll need that extra 6970 in order to play this game at 5760x1200 with eye candy on...

Now I just need to convince the wife that the authorization to purchase extra 24" monitors somehow also meant she authorized another GPU ;)

haha! :D

You will make it stick if you try hard enough! ;) Give us some feedback on how it went; I'll be having the same woes soon.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Hmm, my poor 3.0GHz C2Q and GTX460 1GB OC card, wonder if they will be able to handle this game.

Or should I get a 2600K, Asrock Fatal1ty Z68 Pro board, and stick BOTH of my GTX460 1GB cards in for SLI goodness?

Edit: Monitor is 1920x1200.
 

GotNoRice

Senior member
Aug 14, 2000
329
5
81
So you're anticipating that the final product will have worse performance than the current alpha? Interesting. Performance will only go up between now and release

Considering that the alpha didn't even have all the final textures in place, I'd say that yes, there is a very real possibility that performance could be lower in certain situations once the beta hits. But really what I was saying is not that performance will be higher or lower compared to the alpha, but rather that the alpha numbers are all over the place due to unoptimized drivers and other quirks that won't exist once the game is released or even likely during the beta. The 5870 outperforming the 6950 for example is very likely a fluke.

But the biggest thing of all that makes those results 100% bogus is the fact that the cards weren't even tested in the same computer... Explain to me how a card tested in an i7 rig is somehow comparable to a card tested in a Phenom II rig? I'll give you a hint: It's not.
 

WMD

Senior member
Apr 13, 2011
476
0
0
Considering that the alpha didn't even have all the final textures in place, I'd say that yes, there is a very real possibility that performance could be lower in certain situations once the beta hits. But really what I was saying is not that performance will be higher or lower compared to the alpha, but rather that the alpha numbers are all over the place due to unoptimized drivers and other quirks that won't exist once the game is released or even likely during the beta. The 5870 outperforming the 6950 for example is very likely a fluke.

But the biggest thing of all that makes those results 100% bogus is the fact that the cards weren't even tested in the same computer... Explain to me how a card tested in an i7 rig is somehow comparable to a card tested in a Phenom II rig? I'll give you a hint: It's not.

Not a fluke. The 5870 outperforms the 6950 in lots of benchmarks in other reviews as well. The 6950 is only stronger where tessellation is involved or where 1GB of VRAM isn't enough.

http://www.tweakpc.de/hardware/test...e_gtx560_ti_oc/benchmarks.php?benchmark=bfbc2

http://tpucdn.com/reviews/Powercolor/HD_6870_PCS_Plus_Plus/images/f1_1920_1200.gif
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
haha! :D

You will make it stick if you try hard enough!;) Give us some feedback on how this went, ill be having the same woes soon

I'm not even going to broach the subject until BF3 arrives. Then if I act all frustrated when it won't play smoothly in eyefinity (won't really be an act) she will take pity on me and let me get a new card. At least that's my hope... :biggrin:
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Not a fluke. The 5870 outperforms the 6950 in lots of benchmarks in other reviews as well. The 6950 is only stronger where tessellation is involved or where 1GB of VRAM isn't enough.

http://www.tweakpc.de/hardware/test...e_gtx560_ti_oc/benchmarks.php?benchmark=bfbc2

http://tpucdn.com/reviews/Powercolor/HD_6870_PCS_Plus_Plus/images/f1_1920_1200.gif

5870 also does not support MLAA or MSAA. The latter isn't a big deal since no card can play DX11 games without a gigantic performance hit with full MSAA.
 

GotNoRice

Senior member
Aug 14, 2000
329
5
81
Not a fluke. The 5870 outperforms the 6950 in lots of benchmarks in other reviews as well. The 6950 is only stronger where tessellation is involved or where 1GB of VRAM isn't enough.

Regardless of the performance of those two individual cards, the benchmarks that took place didn't even test the cards in the same system. If the results happened to correspond with reality that would have as much to do with chance as anything else.

Though I'm sure that for the blog pretending to be a news website that produced those results, it was "easier" to do it this way than to adhere to established journalistic standards (i.e. actually testing the cards in the same computer :rolleyes:).
 

96Firebird

Diamond Member
Nov 8, 2010
5,742
340
126
What on earth are you talking about? All the cards WERE tested in the same computer. If you look at the graph bars, the Phenom result is the number on the left of the bar and the i7 result is the number on the right of the bar.

wat?

The numbers on the right are average FPS, the numbers on the left are minimum FPS... It even has the legend at the bottom.

Plus, there are 5 different CPUs listed.

People should really stop posting that graph, there are way too many variables to get a good benchmark out of it. I know it is the only one around, but that doesn't make it right...
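For what it's worth, here's a toy illustration of how the two numbers on a graph like that are typically derived from a frame-time log (the frame times below are made up):

```python
# Hypothetical per-frame render times in milliseconds for a short run.
frame_times_ms = [16.7, 16.9, 33.4, 17.0, 16.8, 25.0, 16.6]

# Average FPS: total frames over total time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Minimum FPS: set by the slowest frame (real tools usually use the
# worst one-second slice, but the idea is the same).
min_fps = 1000 / max(frame_times_ms)

print(round(avg_fps, 1), round(min_fps, 1))  # ~49.2 average, ~29.9 minimum
```

The gap between the two is exactly why a single bar chart with five different CPUs in it is hard to interpret: a stutter on one test rig drags the minimum down without saying anything about the card.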
 

WMD

Senior member
Apr 13, 2011
476
0
0
wat?

The numbers on the right are average FPS, the numbers of the left are minimum FPS... It even has the legend at the bottom.

Plus, there are 5 different CPUs listed.

Only the less powerful cards, like the 4870 and below, were tested on the PII 940 and Q9550; the others were tested on i7 and i5 systems. Plus, all the CPUs are heavily overclocked. There will be some variation, but it won't be significant at that resolution with those overclocked CPUs.

"Less powerful video cards to the level of GeForce GTS 450 and the Radeon HD 4870 tested on the basis of the Core 2 Quad Q9550 and Phenom II X4 940 BE, depending on the resource-games, and more productive on the basis of a more powerful solutions. Overclocked versions of cards represented the sponsors, were equal to conventional counterparts by reducing their clock speeds."
 

GotNoRice

Senior member
Aug 14, 2000
329
5
81
Plus all cpus are heavily overclocked. There will be some variation but it wont be significant at that resolution with those overclocked CPUs.

I disagree. 1080p is not a particularly demanding resolution these days, and the Frostbite engine is among the most multi-threaded and CPU-intensive out there. I upgraded from a Q9650 @ 4.4GHz to an i5-2500K @ 5GHz and the difference in BC2 was significant. I can't imagine that wouldn't also be the case for BF3.
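A crude way to picture that CPU/GPU interplay (my own toy model, with hypothetical numbers): the frame rate you see is roughly capped by whichever side is slower, which is why a CPU upgrade can matter even when the GPU hasn't changed.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is limited by the slower of the two pipelines."""
    return min(cpu_fps, gpu_fps)

# At 1080p a fast GPU can outrun a slower CPU, so the CPU sets the cap...
print(effective_fps(cpu_fps=55, gpu_fps=90))  # 55 -- CPU-bound

# ...and a CPU upgrade raises the cap without touching the GPU.
print(effective_fps(cpu_fps=85, gpu_fps=90))  # 85 -- still CPU-bound, but faster
```

Real games overlap CPU and GPU work rather than strictly serializing it, but the min() picture is a decent first approximation for arguing about bottlenecks.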
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,599
1
81
5870 also does not support MLAA or MSAA. The latter isn't a big deal since no card can play DX11 games without a gigantic performance hit with full MSAA.

The 5-series does support MLAA. I assume MSAA is a typo and you meant to type FXAA; in that case it supports both.