BF3 graphics card


Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
^^ That chart doesn't make a lot of sense (CPU chart) unless BF3 likes cache more than anything in the universe. A 3.0 X4 PhII shouldn't be 2x faster than a 2.6 X4 AII. At most you normally see ~15% boost from AII X4 to PhII X4 at the same clock speed.
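
Rough math, taking that usual ~15% same-clock L3 benefit as a given (it's an assumption, not a measured figure):

```python
# Back-of-the-envelope check; the ~15% L3-cache uplift at equal clocks is an
# assumption, not a measured number.
athlon_ii_clock = 2.6   # GHz, Athlon II X4
phenom_ii_clock = 3.0   # GHz, Phenom II X4
l3_uplift = 1.15        # assumed per-clock benefit from the Phenom II's L3 cache

expected = (phenom_ii_clock / athlon_ii_clock) * l3_uplift
print(f"Expected Phenom II X4 advantage: ~{expected:.2f}x")  # ~1.33x, nowhere near 2x
```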
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
^^ That chart doesn't make a lot of sense (CPU chart) unless BF3 likes cache more than anything in the universe. A 3.0 X4 PhII shouldn't be 2x faster than a 2.6 X4 AII. At most you normally see ~15% boost from AII X4 to PhII X4 at the same clock speed.

Could well be. Definitely likes cores/threads, though.
 

WMD

Senior member
Apr 13, 2011
476
0
0
These don't look bad for either brand's GPUs. SLI/Crossfire aren't working yet, though. :)
[Image: BF3_benches1080.jpg]


The CPU chart is interesting too. Definitely likes more cores, but doesn't seem to prefer Intel or AMD. The performance differences are about where I'd expect them to be. Should be fine on any modern quad-core above the Athlon II. Even Core 2 Quads are fine.
[Image: BF3_CPU_Benches.jpg]

That benchmark is really disappointing for the AMD 6000 series, especially compared to their last-generation products. The 6950 is slower than a 5870, while the 6970 is barely 3fps faster. The 6870 also isn't performing any better than the 5850.

I also thought about going the 6950 unlock upgrade path, but now it doesn't seem really worth it. Definitely better to hold and wait. Too close to the next launch. Prices are going to fall soon.
 

GotNoRice

Senior member
Aug 14, 2000
329
5
81
How would it be possible for the 6990 to be faster than a 6970 if Crossfire isn't working? The individual GPUs on a 6990 are clocked lower than a 6970's, so it should be slower if that were the case.

Unless of course that's an example of one of the cards they tested in a completely different computer :rolleyes:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That benchmark is really disappointing for the AMD 6000 series, especially compared to their last-generation products. The 6950 is slower than a 5870, while the 6970 is barely 3fps faster. The 6870 also isn't performing any better than the 5850.

I also thought about going the 6950 unlock upgrade path, but now it doesn't seem really worth it. Definitely better to hold and wait. Too close to the next launch. Prices are going to fall soon.

Well, we don't know which GPU/CPU combinations they were running. Notice that the benches weren't done on one computer. The 5800 series also has more shader power than the 6900s, generally speaking. Drivers, obviously, aren't anywhere near optimized either. You can't draw any concrete conclusions from an alpha build of a game. This does show, though, that there are no big performance discrepancies between the nVidia and AMD GPU architectures, and that performance generally scales properly. Except that the 5800s possibly seem to have a little advantage.

How would it be possible for the 6990 to be faster than a 6970 if Crossfire isn't working? The individual GPUs on a 6990 are clocked lower than a 6970's, so it should be slower if that were the case.

Unless of course that's an example of one of the cards they tested in a completely different computer :rolleyes:

While technically there is some Crossfire scaling, assuming the 6970 is running at 100%, we are looking at the 6990's GPUs running at ~65%. Or one might be running at 100% and the other at ~29%. That's what I meant by not working. You can call it terribly unoptimized, if you prefer. And, yes, we have no idea what system(s) these benches were run on. They can still be used, though, for general performance observations.
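
Spelled out with made-up round numbers (these aren't the chart's actual figures), that's roughly:

```python
# Hypothetical scores chosen only to illustrate the ~65% / ~29% estimates above;
# the real chart values aren't reproduced here, and 6990 clock differences are ignored.
hd6970_score = 100.0
hd6990_score = 130.0  # assumed to be ~1.3x the single 6970

per_gpu = hd6990_score / (2 * hd6970_score)        # 0.65: both GPUs effectively at ~65%
second_gpu = hd6990_score / hd6970_score - 1.0     # 0.30: or one GPU at full speed and the other adding ~30%
print(f"Each 6990 GPU at ~{per_gpu:.0%}, or one at 100% and the other adding ~{second_gpu:.0%}")
```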
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Maybe we should have a "wait till BF3 comes out for your upgrade" thread stickied?

Nothing worth upgrading to at the moment anyway, until something new and exciting happens. I say, unless your system is malfunctioning somehow, standing pat until we get closer to launch is good advice. You might even get the game for free with a new card at release time?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Nothing worth upgrading to at the moment anyway, until something new and exciting happens. I say, unless your system is malfunctioning somehow, standing pat until we get closer to launch is good advice. You might even get the game for free with a new card at release time?

:thumbsup:
 

WMD

Senior member
Apr 13, 2011
476
0
0
Well, we don't know which GPU/CPU combinations they were running. Notice that the benches weren't done on one computer. The 5800 series also has more shader power than the 6900s, generally speaking. Drivers, obviously, aren't anywhere near optimized either. You can't draw any concrete conclusions from an alpha build of a game. This does show, though, that there are no big performance discrepancies between the nVidia and AMD GPU architectures, and that performance generally scales properly. Except that the 5800s possibly seem to have a little advantage.

The 4870 and GTS 450 are run on the Q9550. The rest are run on the i7 system. This is what the article says:
Less powerful video cards, down to the level of the GeForce GTS 450 and Radeon HD 4870, were tested on the Core 2 Quad Q9550 and Phenom II X4 940 BE systems, depending on how demanding the game is, while the more powerful cards were tested on the more powerful setup. Factory-overclocked cards provided by sponsors were made equivalent to their reference counterparts by reducing their clock speeds.

Hopefully you are right and it's just the game or drivers being unoptimized. Though it seems strange that the harder-to-utilize VLIW5 (4+1) shaders on the 5000 series are performing better. Deus Ex HR and The Witcher 2 also show the same trend, but AMD in general are doing very well compared to Nvidia:

[Image: benchmark chart]


I certainly hope we don't see the meagre 10-20% performance increment again for the next generation.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Thanks for qualifying which GPU/CPU pairings there were. I hadn't read that. My bad, but I get to where I can't be bothered copying and running the link through Google translate every time someone links directly to a non-English site. That will teach me (maybe ;)).

This gen provided a pretty lackluster performance improvement over last. Being stuck on 40nm caused that. 28nm will be different. :crossedfingers:
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Thanks for qualifying which GPU/CPU pairings there were. I hadn't read that. My bad, but I get to where I can't be bothered copying and running the link through Google translate every time someone links directly to a non-English site. That will teach me (maybe ;)).

This gen provided a pretty lackluster performance improvement over last. Being stuck on 40nm caused that. 28nm will be different. :crossedfingers:

I believe Nvidia and AMD are gonna give us a mediocre 28nm launch and a better refresh this time.
40% now, 30% in 10 months. That way they get our money twice. :sneaky:
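
Quick math (both percentages are just my guess):

```python
# Compounding the two guessed jumps; neither figure is an announced number.
launch_gain = 1.40    # guessed 28nm launch uplift
refresh_gain = 1.30   # guessed refresh uplift ~10 months later
total = launch_gain * refresh_gain - 1.0
print(f"Combined uplift over today's cards: ~{total:.0%}")  # ~82%, spread across two purchases
```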

It makes sense...
How many people went from a GTX 470/480 to a GTX 570/580? Same for AMD: a 5850 to a 6950 was not an upgrade, and neither was a 5870 to a 6970.

Not very many, at least not the smart ones. :)
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I believe Nvidia and AMD are gonna give us a mediocre 28nm launch and a better refresh this time.
40% now, 30% in 10 months. That way they get our money twice. :sneaky:

It makes sense...
How many people went from a GTX 470/480 to a GTX 570/580? Same for AMD: a 5850 to a 6950 was not an upgrade, and neither was a 5870 to a 6970.

Not very many, at least not the smart ones. :)

nVidia and AMD both struggled to give us any improvement this gen, I think for two different reasons. I think Cypress was a tough act to follow on the same process, considering AMD's small-chip philosophy. The chip just worked: performance, efficiency, and size. Tessellation and Crossfire performance were its only weaknesses, and that's really all they improved upon with Cayman. Considering the state of gaming in Cypress's time, it wasn't that much of a weakness. In order to increase performance, though, they had to make a bigger and less efficient chip. The lack of 32nm stifled AMD a lot, IMO.

nVidia was finally able to give us a fully functioning Fermi. If they had made that originally, and on time, they would have hit it out of the park. AMD had planned on the 5970 to compete with the equivalent of the 580, and they would have needed it. Being a dual-chip card, though, it wouldn't have competed well. They would have been able to claim the fastest single card, but that would be it, and I don't think it would have been enough to truly compete. The 580 is really what should have been the GTX-385, if all had gone well.

I think 28nm will be huge. I don't think either company can afford to give us half-baked designs. They have to worry that the other is going to go all out. If one company pushes it to the max, say 70%-100% performance improvement, and the other tries to schlock in with 40%, they risk losing too much.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
They have to worry that the other is going to go all out. If one company pushes it to the max, say 70%-100% performance improvement, and the other tries to schlock in with 40%, they risk losing too much.

That's a good point, but I think in reality neither company needs a 70% boost.
I think the 6970/GTX 580 do rather well with the latest games at 1080p, and I think they will be just fine for BF3 also. Most people run dual cards for anything at higher resolutions than that anyway.

I think a 40% boost with less heat and noise would be fine, with another 30% at refresh time, but what do I know? :).

If they do go all out with the 28nm chips, it's gonna be a good time to buy.
A full node could give us a 100%+ increase if things go right.
These new cards could be the next 8800 GTX for both teams.
 
Last edited:

Nged72

Member
Jan 25, 2011
131
0
71
You could always get a 560 Ti now and get Batman: Arkham City for free... just don't open it until you read about how the Open Beta performs on those cards.

If it performs well, open that sexy beast up; if not, return it :)
 
Feb 19, 2009
10,457
10
76
Can you guys stop using GameGPU as valid benchmarks unless they specify exactly which cards are paired with which CPU? Because the X6 1100T is a heck of a lot slower than the i7s in a lot of games, and it's considered high-end.

Look at their F1 2011 results: baffling, and no mention of GPU/CPU combinations when the game is severely CPU-limited even at 1080p.