[fixed] Dragon Age 2 Low Performance on Nvidia Cards


CVSiN

Diamond Member
Jul 19, 2004
9,289
1
0
there is a HUGE difference in that game between a Q6600 @ 3.0 and a 2600K @ 4.5.

I assure you I am running in DX11 at Very High with everything pegged.

And yes, a Q6600 is garbage in comparison to the i7 series.

The 2600K is a beast, especially running at 4.5+.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
For example, when you hit an enemy and the enemy's body moves in a certain direction due to your hit, that happens due to a physics engine.

As you can see just from that example, there are a ton of games that use physics engines.

The difference between a physics engine using the CPU and a physics engine using the GPU is that, in theory, due to the parallel nature of GPUs and the parallel nature of some physics calculations, a physics engine can be more accurate while maintaining a playable frame rate.

That's not really a physics engine. Many of those are simple scripted actions.

A physics engine does a lot more than a simple cause and effect relationship.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Well for us common folk, it is. He is using 580 SLI, so he might be used to free AA?

It's still not "free"; it's just that unless you have a 120Hz monitor you normally can't see the difference, because the FPS is so high in every game even with AA enabled. However, using FRAPS you would still notice a significant slowdown going from 0xAA to 8xAA in almost any situation.
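To make the point concrete, here is a minimal sketch (the FPS numbers are illustrative, not measurements from this thread) of why AA's cost is invisible on a 60Hz monitor even though a counter like FRAPS shows a real drop:

```python
# Hypothetical numbers: frame-time cost of enabling 8xAA on a fast card.
def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

no_aa_fps, aa_fps = 180.0, 120.0  # assumed 0xAA vs 8xAA readings
print(f"0xAA: {frame_time_ms(no_aa_fps):.1f} ms/frame")  # ~5.6 ms
print(f"8xAA: {frame_time_ms(aa_fps):.1f} ms/frame")     # ~8.3 ms
# Both are well under the 16.7 ms budget of a 60 Hz refresh, so the monitor
# shows 60 FPS either way; only the FPS counter reveals the ~33% drop.
```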
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
It's still not "free"; it's just that unless you have a 120Hz monitor you normally can't see the difference, because the FPS is so high in every game even with AA enabled. However, using FRAPS you would still notice a significant slowdown going from 0xAA to 8xAA in almost any situation.

I agree, but he might be CPU limited in a few titles so the hit is much smaller.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
That's not really a physics engine. Many of those are simple scripted actions.

A physics engine does a lot more than a simple cause and effect relationship.

On the other hand, collision detection is exactly one of the things you have physics engines for.

So detecting that my hammer (a body) hits an enemy (another body) is physics-engine work.

What happens next depends on how accurate or simplified the physics engine is - the enemy body might be represented as a simple ball or as a much more complex finite-element-based system.

In one case, a simple fall with a scripted animation might occur; in the other, the enemy body can twist in several ways depending on the variables being considered.
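To illustrate the distinction being argued here, a minimal sketch (all names are hypothetical, not from any real engine): a scripted knockback ignores the body's state entirely, while a physics response applies an impulse whose outcome depends on mass and integration:

```python
from dataclasses import dataclass, field

@dataclass
class Body:
    mass: float
    position: list = field(default_factory=lambda: [0.0, 0.0])
    velocity: list = field(default_factory=lambda: [0.0, 0.0])

def scripted_knockback(enemy: Body) -> None:
    """Scripted action: a canned animation plays; physics state is ignored."""
    enemy.velocity = [0.0, 0.0]

def physics_knockback(enemy: Body, impulse: list) -> None:
    """Physics engine: the hit applies an impulse (change in momentum)."""
    enemy.velocity[0] += impulse[0] / enemy.mass
    enemy.velocity[1] += impulse[1] / enemy.mass

def integrate(body: Body, dt: float, gravity: float = -9.81) -> None:
    """One explicit-Euler step; a real engine would also resolve collisions."""
    body.velocity[1] += gravity * dt
    body.position[0] += body.velocity[0] * dt
    body.position[1] += body.velocity[1] * dt

enemy = Body(mass=80.0)
physics_knockback(enemy, impulse=[400.0, 240.0])  # the hammer hit
for _ in range(10):                               # simulate ~1/6 of a second
    integrate(enemy, dt=1 / 60)
print(enemy.position)  # where the body ends up depends on mass and impulse
```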
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
So I checked and:

CPU:
- core 1 and 2 ~30%
- core 3 and 4 ~60%

GPU:
- pegged at 100%

Q9450 stock (2.67GHz). HD5850 860 / 1125 with Cat 11.4s. Very High, AAx4, AFx16 all max / ON. 1920x1080 gives me ~25FPS. I hate low FPS, but this seems actually very playable. Kinda like Crysis.


Asked a friend to do the same on a GTX570.

CPU: (pretty much identical)
- two cores @ 30%
- two cores @ 60%

GPU: (here's the kicker!)
- 20-85%, jumping all over the place

Q6600 @ 3GHz. GTX570 stock with newest betas. PhysX selected to use CPU in nV CP. Very High, AAx4, AFx16, all max / ON. Gives ~20FPS.
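For anyone who wants to reproduce this check, a minimal sketch using Python with the third-party psutil package (an assumption; Task Manager gives the same CPU-side picture, and GPU utilization needs a vendor tool such as GPU-Z or nvidia-smi):

```python
import psutil  # third-party: pip install psutil

def log_cpu_usage(samples: int = 30, interval: float = 1.0) -> None:
    """Print per-core CPU load once per interval while the game runs."""
    for _ in range(samples):
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        print("  ".join(f"core{i}: {p:5.1f}%" for i, p in enumerate(per_core)))

if __name__ == "__main__":
    log_cpu_usage()  # run alongside the game, e.g. on a second monitor
```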
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
that's a good sign, they'll have it fixed for the 480 asap in that case.

very annoying for 480 owners that they have to wait, though.

The whole problem with that statement he made is that nothing is fixed for 580 users either with that driver; all it fixed were rendering errors, not performance issues. In the thread over at Nvidia's forums, 580 users have piss-poor performance as well.

Apart from the one user in this thread who claims they can max it on a 570 - in contradiction of every other Fermi user's experience, as well as the benchmark results of several hardware sites - the issue has not gone anywhere.

Let's not forget the 580 and 570 are no different from a 480 and 470 apart from shader count, clock speeds, and thermals. The architecture is the same. These cards respond to driver revisions in the same way, whether it be improvements, bugs, or performance issues.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
The whole problem with that statement he made is that nothing is fixed for 580 users either with that driver; all it fixed were rendering errors, not performance issues. In the thread over at Nvidia's forums, 580 users have piss-poor performance as well.

Apart from the one user in this thread who claims they can max it on a 570 - in contradiction of every other Fermi user's experience - the issue has not gone anywhere.

Let's not forget the 580 and 570 are no different from a 480 and 470 apart from shader count, clock speeds, and thermals. The architecture is the same. These cards respond to driver revisions in the same way, whether it be improvements, bugs, or performance issues.

Groove, can you check your system resource usage? See my post above.

EDIT: Nvm, I saw in your OP that you're getting similar usage on a single GPU. And an i7 @ 4.2GHz... anyone claiming you could be CPU bound is delusional.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Groove, can you check your system resource usage? See my post above.

EDIT: Nvm, I saw in your OP that you're getting similar usage on a single GPU. And an i7 @ 4.2GHz... anyone claiming you could be CPU bound is delusional.

Exactly.

My CPU could bottleneck me at a lower resolution, but not at the resolution I am running.

What's more, X58 in general is a superior platform to P67 once you are dealing with 3 GPUs.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Exactly.

My CPU could bottleneck me at a lower resolution, but not at the resolution I am running.

Not to mention, if the CPU were an issue, then Xbitlabs would never get high-80s FPS on their 6990, since it would be held back by the Core i7 @ 3.3GHz. With proper SLI scaling, tri-SLI 480s should be faster than an HD6990 in games. It seems that not only is SLI scaling broken, but this game just doesn't run well on NV cards at the moment for one reason or another. So you actually have two problems, it seems. Say you were getting 25 FPS on a single 480: if SLI worked properly, you should still get ~55-60 FPS with 3 GPUs.
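The ~55-60 FPS estimate follows from a common rule of thumb of roughly 70-80% scaling per added GPU (an assumption; the exact factor varies by game). A quick sketch:

```python
def expected_fps(single_gpu_fps: float, gpus: int, efficiency: float = 0.7) -> float:
    """Each extra GPU contributes `efficiency` of one full card's frames."""
    return single_gpu_fps * (1 + (gpus - 1) * efficiency)

print(expected_fps(25, 3))  # 60.0 -> matches the ~55-60 FPS estimate above
```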
 

MentalIlness

Platinum Member
Nov 22, 2009
2,383
11
76
Totally awful......

[attached screenshot: da2.png]
 

sticks435

Senior member
Jun 30, 2008
757
0
0
The only thing that I know it's used for is whether or not ranged attacks reach their targets or hit the wall/cliff/whatever.
I've never seen a ranged attack miss. Such bullshit that the arrows curve in mid-flight to track you. D:
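That reach-the-target-or-hit-the-wall check is a classic physics-engine raycast. A minimal sketch (hypothetical names, testing a ray against an axis-aligned box standing in for a wall):

```python
def ray_hits_aabb(origin, direction, box_min, box_max) -> bool:
    """Slab test: True if the ray enters the box at some t >= 0."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:          # ray parallel to this slab
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
        if t_near > t_far:         # slabs don't overlap: no hit
            return False
    return True

# Arrow fired from (0, 1, 0) along +x; a wall occupies x = 5..6.
blocked = ray_hits_aabb((0, 1, 0), (1, 0, 0), (5, 0, -1), (6, 3, 1))
print("arrow blocked by wall:", blocked)  # True -> the shot never lands
```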
 

chris2012

Junior Member
Mar 11, 2011
12
0
0
One can only hope the issues lie in the drivers and not in the cards themselves. Pretty funny how ATI (they still get the blame for bad drivers compared to Nvidia, even nowadays when it's no longer the case) have the superior drivers here.

Anyway, the 6990 obliterates 580s in every single title. Not happy about it, since I'm on the green side at the moment. I doubt the 590 will beat it.

ManuelG posted on nvidia forums:

We have performance improvements in an upcoming driver but these aren't in driver 270.32. It is in a newer version.
 

THEfog

Junior Member
Mar 15, 2011
1
0
0
I can pretty much guarantee that it is not the CPU limiting performance here in 90% of cases.

My Specs:

CPU: Q8600 @ 3.33GHz
RAM: 8GB DDR2 @ 900.8MHz
GPU: Gigabyte GTX560Ti SOC @ 1GHz, 1GB
HDD: Seagate 7200.12 1TB
MOBO: Intel DP35DP
PSU: 775W Thermaltake
OS: Win 7 Pro x64

My mate's machine:

CPU: Core i7 970 @ 3.2GHz
RAM: 8GB DDR3 @ 1333MHz
GPU: Gigabyte GTX570 OC (don't know what the core clock is on his)
HDD: 2x Seagate 7200.12 1.5TB
MOBO: Gigabyte X58A-UD3R
PSU: 800W Cooler Master
OS: Win 7 Ultimate x64

Before I installed the latest Nvidia driver I was getting 5-10 FPS with everything maxed out, as well as black squares and triangles all over the place. As soon as I moved it from Very High to High... BAM! A constant 60 FPS.

My mate had a very similar story, except he was getting about 7-15 FPS, and as soon as he set it to High: exact same result.

His screen is slightly larger than mine (his is 24 inch while mine is 22 inch). As soon as I installed the driver I stopped getting the little black triangles and can now play comfortably at 45-60 FPS, whilst my mate was still stuck on single-digit FPS. He installed the driver and now he is almost always sitting at 60 FPS.

Goes to show that pretty much all the performance loss is directly related to the interaction between the game and the GPU.
 

solarissf

Member
Jan 30, 2011
47
0
0
I have one 560 Ti now... how does SLI work? Could I pair it with a 570 or 580, or do you have to pair it with exactly the same card?

I'm assuming this will just help DA2 a bit, but not fix the actual problem everyone is having.
 

MentalIlness

Platinum Member
Nov 22, 2009
2,383
11
76
I have one 560 Ti now... how does SLI work? Could I pair it with a 570 or 580, or do you have to pair it with exactly the same card?

I'm assuming this will just help DA2 a bit, but not fix the actual problem everyone is having.

This
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
So DAII performing better on AMD means NV has bad business practices? Or are you implying that, since Nvidia's TWIMTBP program has tilted some games towards NV architecture, AMD is now doing the same thing for DAII, so it's NV's fault that AMD is running a similar program?

I don't have a problem with either side paying money to get a game optimized for their cards (as long as it's not artificially crippling the other). This may be one of those cases, so that's a nice bonus for AMD owners. PC developers need all the cash they can get, lest our beloved platform not make any financial sense for them to continue with.

However, in this game the performance is so bad compared to what we would expect given the hardware (i.e. the 580 losing to the 6900 series) that it seems to me to be a combination of things: optimized for AMD, but also a driver issue with NV. So a driver fix will probably improve things for NV cards, but they may never beat AMD in this one.

Same case with LP2, Dirt 2, HAWX for NV, and F1, AvP, and OpenGL games for AMD.


Yes, I hate this sort of tactic. Did you know that Batman: AA has a vendor ID check, and that if you change your vendor ID to a GeForce card you can enable AA in the game?
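For readers unfamiliar with that kind of gate, a minimal sketch of what a vendor-ID check looks like (illustrative only, not the game's actual code; the PCI vendor IDs 0x10DE for Nvidia and 0x1002 for AMD/ATI are real):

```python
NVIDIA_VENDOR_ID = 0x10DE  # real PCI vendor ID for Nvidia
AMD_VENDOR_ID = 0x1002     # real PCI vendor ID for AMD/ATI

def aa_allowed(gpu_vendor_id: int) -> bool:
    """Feature gated on the reported vendor ID, not on actual capability."""
    return gpu_vendor_id == NVIDIA_VENDOR_ID

print(aa_allowed(AMD_VENDOR_ID))     # False -> in-game AA option locked out
print(aa_allowed(NVIDIA_VENDOR_ID))  # True  -> spoofing the ID unlocks AA
```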

You derived your conclusion from your assumptions, just like many others here. What is your basis for saying the game is optimized? What exactly do you mean by a truly AMD DX11 game? What are the architectural differences, and how do they affect games written in DX11? What is the problem with Nvidia's DX11 implementation? Is DA2 the first DX11 game? Are there any other DX11 games indicating there is a DX11 feature Nvidia didn't implement correctly? What exactly is the API call causing problems on Nvidia hardware? Or what combination of API calls is causing problems on Nvidia?

I find the whole DA2 incident very interesting. First, the game clearly has problems, but they are not caused by PhysX. Interestingly, PhysX has not been lifted from the game and clearly isn't causing any problems for AMD users, unlike what many believed, that PhysX is ONLY for Nvidia users. We actually have one PhysX game that favors AMD cards. To my surprise, not a single person mentions or complains about it.

Right now, my guess is there exists a feature in the game that cannot be altered by the user through the game or its ini files. This feature is enabled automatically when the game's graphics are set to Very High, and there is no way to turn it off; we don't even know what that feature is. Most Nvidia users simply stick with High graphics and SSAO off and play the game, as the difference is minor. Some say the game uses HDAO anyway, so SSAO is actually redundant, and the tessellation difference can't justify a 75% tax on FPS.

Yes, FPS on AMD video cards is far better compared to Nvidia video cards, but what about image quality? Someone needs to compare the game side by side, and must be able to turn the specific feature on and off, to determine what is actually causing the FPS loss on Nvidia cards. Simply claiming that Nvidia is at fault for everything won't do anyone any good. Yes, you can use the game to compare performance between camps, but that behavior is not mainstream. The mainstream is playing the game.

I also find the following idea amusing. When Nvidia performs better than AMD, it is Nvidia's fault and Nvidia must have paid developers to sabotage AMD. When AMD performs better than Nvidia, it is also Nvidia's fault, because it is Nvidia who started this trend. So unless a game performs identically on equivalent hardware from both camps, Nvidia is at fault; and since there is no equivalent hardware between the camps, Nvidia will always be at fault.

Suppose Nvidia is always at fault, and that makes you unhappy; then you will simply always be unhappy. A patch won't make you happy, a driver fix won't make you happy; even if Nvidia ceased to exist, it still wouldn't make you happy. There is no solution, as the premise itself won't lead to happiness.

DA2 isn't an MMO. People will simply spend around 100 hours finishing the game and probably won't revisit it anytime soon. A driver profile update should give the game extra performance, but the performance of the game should not rely on any driver update. Say there exists a problematic API call that will not run properly on a particular video card; this should have been caught by the game's QA, and a workaround should have been applied by the developers before the game was actually released. Right now people are having problems saving the game, indicating that the QA was done very poorly. I am not surprised the QA missed the game's performance when it missed lots of obvious gameplay bugs.

IMO, BioWare doesn't really care. Should there be bugs, they will simply fix the major gameplay bugs via a patch, but put the rest in DA2's expansions. If you believe that Nvidia should release a driver to fix one game's problems, and is at fault if they fail to do so, then you are not going to find a solution, because you are not seeking one; you are making paranoid statements. In today's parlance, that is FUD.

Yes, they have very different architectures and approaches: while Nvidia uses single powerful CUDA cores with a higher shader frequency, AMD uses a VLIW4 architecture.

And the reason I hate this tactic so much is that, if this continues, we will eventually need two PCs, one with an AMD and one with an Nvidia graphics card, to play all PC games.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I think differentiation is key to competition, and yet you may think it is a tactic. The beauty of the PC is that titles will play fine with either competitor, but each is trying to find ways to improve the experience for their customer base.

If they're both the same, then what's the point of competition and trying to move forward?
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I think part of the reason it runs better is just that the graphics guys who designed the game probably picked a lot of graphics techniques that suit the AMD design better than the Nvidia one.

It might not all be a magical driver fix away from running fine on Nvidia cards.

PS: Nvidia's TWIMTBP titles usually show the same thing - lots of stuff their cards do well - to make sure they shine vs. competitors.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
I think differentiation is key to competition, and yet you may think it is a tactic. The beauty of the PC is that titles will play fine with either competitor, but each is trying to find ways to improve the experience for their customer base.

If they're both the same, then what's the point of competition and trying to move forward?

But I don't want to find out in the future that a game I bought performs like crap. Sure, this time I can play DAII (I have an AMD card). I buy games at full price, so I expect full performance (that's why I didn't buy Batman: AA). And it's not about improving the customer experience; it's just meant to make the other competitor look bad. We don't deserve any of this; we as customers deserve the best, not something tied to a certain brand.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
But I don't want to find out in the future that a game I bought performs like crap. Sure, this time I can play DAII (I have an AMD card). I buy games at full price, so I expect full performance (that's why I didn't buy Batman: AA). And it's not about improving the customer experience; it's just meant to make the other competitor look bad. We don't deserve any of this; we as customers deserve the best, not something tied to a certain brand.

You shouldn't blame AMD if nVidia cards are performing poorly, and you shouldn't blame nVidia if AMD cards are performing poorly, except in special cases where features are locked away from the other vendor. nVidia has done this; I don't remember AMD doing it.

nVidia or AMD should NOT be performing so poorly in any AAA game, no matter who helped the devs more.

Just like AMD in Civ5: they shouldn't be performing so poorly when their high-end cards are basically performing the same as their low-end cards. That did improve with Cat 11.4, though.

Yes, you can say AMD did a great job making sure the game works great on their hardware, but nVidia should also be lambasted for not having the game work right on theirs. The same can be said of AMD in Civ5, although to a lesser extent.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
If they're both the same, then what's the point of competition and trying to move forward?

If they both do the same, which one do you choose?

Clearly the one that is the least expensive, the one that consumes less power, the one that is faster, etc.

Now, this differentiation by having features in certain games means that you aren't comparing graphics cards head-on anymore - you are comparing game catalogs.

For the consumer this isn't ideal - the consumer wants the hardware companies to develop faster parts that consume less power and are cheaper, and the game developers to create fun games that run on the hardware available.

That seems to be a healthy system where everyone wins - but one where the consumer has great power.

Someone makes a crappy game - you don't buy it. Someone makes a crappy card - you don't buy it.

But I'm not even sure this is what's happening with Dragon Age II - my 6850, at 1680x1050, drops from 55 FPS at High settings to 25 FPS at Very High (and this is without the high-res texture package!).