[fixed] Dragon Age 2 Low Performance on Nvidia Cards [fixed]


CVSiN

Diamond Member
Jul 19, 2004
9,289
1
0
I run a single GTX 580 and I can confirm that running the game with all DX11 options set to on results in terrible performance, below 20FPS usually.

I do not have the high res texture pack installed.

Ok this is very strange...
I'm running an i7 2600K at 4.3GHz with 12 gigs of RAM and a single EVGA GTX 580.
I am not having ANY slowdown at all.. it runs great, good temps, and it's smooth.
I am using the high res pack with everything pegged on ultra.

I am however using some beta drivers as without them I had some black ground textures at times.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
I believe one of the two settings (depth of field or ambient occlusion) seems to carry an abnormally large performance hit. Try running with one or both turned off and see how it is.
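The one-setting-at-a-time test suggested above can be sketched as a quick comparison script. All the FPS numbers below are invented placeholders, not measurements from the game; the idea is just to turn one option off per run and see which toggle buys back the most framerate.

```python
# Hypothetical sketch: isolate which setting carries the abnormal performance
# hit by comparing average FPS with each option toggled off one at a time.
# Every number here is a made-up placeholder, not a real measurement.

baseline = 19.0  # avg FPS with both depth of field and ambient occlusion on

runs = {
    "depth_of_field_off": 21.5,    # placeholder measurement
    "ambient_occlusion_off": 34.0, # placeholder measurement
}

for setting, fps in runs.items():
    gain = (fps - baseline) / baseline * 100  # percent gained vs. baseline
    print(f"{setting}: {fps:.1f} FPS ({gain:+.1f}% vs. baseline)")
```

Whichever toggle shows the biggest percentage gain is the costly one.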
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Well according to this review, the 580 is averaging 24fps at 1080p.

http://www.hardwareheaven.com/revie...d-6990-graphics-card-review-dragon-age-2.html


Wow. Yeah, Nvidia is terrible in Dragon Age 2, horrible framerates. This is an AAA title, where is the driver support? :) The 5970 is averaging 90fps with minimums of 55, the 580 is averaging 24 with minimums of 17.

If there was a 6870 in there it would be faster than the 580 as well :eek:

[benchmark charts from the linked review]
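For anyone wondering how review sites arrive at separate "average" and "minimum" FPS figures like the ones above, here is a rough sketch assuming a FRAPS-style per-frame log of frametimes in milliseconds. The frametime list is invented purely for illustration.

```python
# Sketch of how "average" and "minimum" FPS figures are typically derived
# from a per-frame log (e.g. FRAPS-style frametimes in milliseconds).
# The frametime values here are invented for illustration.

frametimes_ms = [41.0, 39.5, 45.2, 58.8, 40.1, 42.3, 55.0, 38.9]

fps_per_frame = [1000.0 / t for t in frametimes_ms]

# Time-weighted average: total frames divided by total elapsed seconds.
avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

# "Minimum FPS" is simply the slowest single frame in the run.
min_fps = min(fps_per_frame)

print(f"avg: {avg_fps:.1f} FPS, min: {min_fps:.1f} FPS")
```

Note that the average is weighted by time, not a plain mean of per-frame FPS, which is why one long frame drags the minimum down much harder than the average.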
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Groover, I could have sworn that I just informed the participants of this thread that driver improvements for this game are on the way.
I was just wondering what the taunting is all about?
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
Groover, I could have sworn that I just informed the participants of this thread that driver improvements for this game are on the way.
I was just wondering what the taunting is all about?

Driver improvements don't guarantee increased performance. I mean, we can hope they do, but without numbers and an actual fix we can't assume they will. Until then, I think whatever evidence can be found showing the poor performance is perfectly valid to post.

And yes, I know you may have sources and proof that we don't, but I believe what I see, I guess.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Wow. Yeah, Nvidia is terrible in Dragon Age 2, horrible framerates. This is an AAA title, where is the driver support? :) The 5970 is averaging 90fps with minimums of 55, the 580 is averaging 24 with minimums of 17.

If there was a 6870 in there it would be faster than the 580 as well :eek:

Hey, at least SLI works, great scaling there. ;)

Groover, I could have sworn that I just informed the participants of this thread that driver improvements for this game are on the way.
I was just wondering what the taunting is all about?

Keys, you should let nV know that SLI flat-out doesn't work in SHOGUN 2, my friend's 580 SLI is slower than a single card.

I'm sure they know though.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Yeah, I'm sure there are quite a few titles out there that aren't supported by multi-gpu configs.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Driver improvements don't guarantee increased performance. I mean, we can hope they do, but without numbers and an actual fix we can't assume they will. Until then, I think whatever evidence can be found showing the poor performance is perfectly valid to post.

And yes, I know you may have sources and proof that we don't, but I believe what I see, I guess.

Allrighty.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Well... no idea which Radeons plow through this game but...

- no AA: 25-30 FPS
- MLAA: 15-20 FPS
- in-game AAx4: 20-25 FPS

That's on my HD5850 @ 860 / 1125, Q9450 stock, card is pegged at 100% usage. Using the 11.4s preview drivers.

Settings: DX11, all maxed / on, high resolution texture pack. Game's playable with in-game AAx4 (best IQ too, MLAA looks weak here and kills performance too), but buttery smooth it is not.

Pretty much what the GeForce cards are getting :p
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
Well... no idea which Radeons plow through this game but...

- no AA: 25-30 FPS
- MLAA: 15-20 FPS
- in-game AAx4: 20-25 FPS

That's on my HD5850 @ 860 / 1125, Q9450 stock, card is pegged at 100% usage. Using the 11.4s preview drivers.

Settings: DX11, all maxed / on, high resolution texture pack. Game's playable with in-game AAx4 (best IQ too, MLAA looks weak here and kills performance too), but buttery smooth it is not.

Pretty much what the GeForce cards are getting :p
Glad the 5850 can keep pace with the GTX 580 xD
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
I am surprised no one has pointed the finger at AMD for crippling performance on the GeForce cards; I know if it was a TWIMTBP game people would be all over nVidia.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I am surprised no one has pointed the finger at AMD for crippling performance on the GeForce cards; I know if it was a TWIMTBP game people would be all over nVidia.

Let's assume AMD has purposely done this. Also, that nVidia is guilty as well. See, this is the only eventual outcome for these types of business practices. In the end we'll all need 2 GPUs, one from nVidia and one from AMD, to get good performance from all games. In the long run it will kill PC gaming. We need a 3rd party to slap both of them down before it gets too far out of hand.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Let's assume AMD has purposely done this. Also, that nVidia is guilty as well. See, this is the only eventual outcome for these types of business practices. In the end we'll all need 2 GPUs, one from nVidia and one from AMD, to get good performance from all games. In the long run it will kill PC gaming. We need a 3rd party to slap both of them down before it gets too far out of hand.
I was just throwing the possibility out there, and if AMD actually did do this I would be disappointed, just like I was disappointed with the whole Batman AA fiasco.
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
Let's assume AMD has purposely done this. Also, that nVidia is guilty as well. See, this is the only eventual outcome for these types of business practices. In the end we'll all need 2 GPUs, one from nVidia and one from AMD, to get good performance from all games. In the long run it will kill PC gaming. We need a 3rd party to slap both of them down before it gets too far out of hand.

****in my mind****

Intel:
["Challenge Accepted" meme image]


**************************************


In reality, however, that isn't going to happen anytime soon, and it is a real shame.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
****in my mind****

Intel:
["Challenge Accepted" meme image]

**************************************

In reality, however, that isn't going to happen anytime soon, and it is a real shame.

Well, Intel might like to piss on both of them after the last 2 years. LOL
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
o_O! This is why I don't buy nVidia! Hey, wait... 9 out of the last 10 cards I bought are from nVidia. Honestly, I am surprised by this. I generally prefer nVidia drivers, but this is an eye opener for me. Goes to show that much of the performance we see is probably down to profiles or the implementation of the technology.

This is a major blow to nVidia, in my opinion. The reason is that while HAWX 2 performs twice as well on nVidia, even a 5770 managed 60+ FPS consistently there. The same can't be said of this situation, so this is actually far worse.

But the reality is that I don't own this game, so it does not affect me. In the grand scheme of things, I consider this an 'oops!' by nVidia.

BTW - has nVidia given us a timetable for the fix? I scanned the thread, but I could not find it.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
Well granted, if you look at 3dmark11 and Vantage, the ATI HD6xxx series pretty much stomps the Nvidia cards outside of the GPU physics tests. The theoretical floating point output is superior on the red team.

But part of the business is, Nvidia has a lot of money and has the head-start to optimize their performance for games by "working with" developers.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
Further proof that DX11 mode is almost pure brute force at this point:

When I set my 6970s to -20% (200W) power mode, the idle framerate in Kirkwall Hightown dropped 15%.
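One rough way to read that observation: if FPS scales as a power law of the board power cap, a single data point pins down the implied exponent. This is only a back-of-the-envelope check; a -20% PowerTune cap does not necessarily cut effective clocks by exactly 20%, so the numbers below are illustrative, not a rigorous fit.

```python
import math

# Back-of-the-envelope check of the "brute force" claim: fit FPS ∝ (power)^k
# to the single reported data point. An exponent k near 1 would mean the
# framerate tracks available GPU power almost linearly (fully GPU-bound).

power_ratio = 0.80  # PowerTune cap set to -20%
fps_ratio = 0.85    # reported ~15% framerate drop in Kirkwall Hightown

k = math.log(fps_ratio) / math.log(power_ratio)
print(f"implied scaling exponent k ~ {k:.2f}")
```

An exponent around 0.7 still suggests the scene is largely GPU-limited, consistent with the brute-force reading, just not perfectly so.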
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Groover, I could have sworn that I just informed the participants of this thread that driver improvements for this game are on the way.
I was just wondering what the taunting is all about?

I don't see him taunting anybody.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Let's assume AMD has purposely done this. Also, that nVidia is guilty as well. See, this is the only eventual outcome for these types of business practices. In the end we'll all need 2 GPUs, one from nVidia and one from AMD, to get good performance from all games. In the long run it will kill PC gaming. We need a 3rd party to slap both of them down before it gets too far out of hand.

I'm surprised these kinds of business arrangements are not viewed as anti-competitive by the FTC given the effective duopoly that exists in discrete video cards for gaming.

Of course it prolly doesn't help that you have an effective monopoly, Microsoft, dictating the terms and conditions of the DX API itself.