Latest Guild Wars 2 GPU Performance Data


ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Anyone get a chance to test the 306.02 drivers in Guild Wars 2? I'll be upgrading the drivers tonight to see if there is any difference. I'm also thinking about overclocking my 670 to improve the minimum frames I'm getting.

Didn't test for speed, but no issues running them in GW2, at least.
 

SomeoneSimple

Member
Aug 15, 2012
63
0
0
Anyone get a chance to test the 306.02 drivers in Guild Wars 2? I'll be upgrading the drivers tonight to see if there is any difference. I'm also thinking about overclocking my 670 to improve the minimum frames I'm getting.

I personally don't see a performance increase on my 670, coming from 304.79.

Also, don't forget: if there were ANY difference AT ALL, Nvidia would have stated it in their patch notes.
We all know how Nvidia loves to inform us of even insignificant numbers:
GeForce GTX 560:
  1. Up to 5% in Battlefield 3 with SLI
  2. Up to 4% in Dragon Age II
  3. Up to 8% in The Witcher 2: Assassins of Kings with SLI
  4. Up to 7% in Lost Planet 2

The support for SSAO is a nice gesture, but unlike in most games, it is barely noticeable in GW2 in its current implementation. The AO is extremely faint and introduces non-AO-related glitches with transparency.

I hope Nvidia comes out with a fully working MSAA implementation soon. Until then, I'm using driver-level 1.2x OGSSAA combined with the SMAA injector; it looks great.
 
Last edited:

Toe

Member
Jul 23, 2001
48
0
66
[Image: CPU clock.png — GW2 fps vs. CPU clock scaling chart]

Those numbers look kinda screwy. I'm guessing the game is capped at around 70fps on Balanced and Best Appearance settings, which would explain those i5 numbers. But the FX numbers? A 1.33x increase in clock speed giving a 1.7x performance increase? Seriously?

I'd give it a few patches before drawing any conclusions.
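
For anyone who wants to sanity-check those ratios, here's a quick sketch; the fps values are hypothetical, not taken from the chart, and only the ratios matter:

```python
# Quick sanity check of the kind of scaling described above.
# The fps numbers are hypothetical; only the ratios matter.
def ratios(clock_low, clock_high, fps_low, fps_high):
    """Return (clock ratio, fps ratio) for two CPU-bound data points."""
    return clock_high / clock_low, fps_high / fps_low

clock_r, fps_r = ratios(3.0, 4.0, 30.0, 51.0)
print(f"clock x{clock_r:.2f}, fps x{fps_r:.2f}")  # clock x1.33, fps x1.70
# fps scaling faster than clock suggests something besides raw CPU
# throughput (cache/memory effects, or simply a measurement artifact).
```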
 
Last edited:

HexiumVII

Senior member
Dec 11, 2005
661
7
81
Been trying all types of AA to get GW2 to look nice; can't seem to quite get it. Tried the FXAA injector, SMAA injector, and OGSSAA, and can't seem to get the overrides right in Nvidia Inspector. Had great luck in D3, though. I still get enough jaggies and shimmering to annoy me, and my character has ribbons that always look bad. Hoping they can get something nice out one day. I'm pretty amazed at how well it runs on a 9800GT with a Core 2 Quad Q8200, though.
 

SomeoneSimple

Member
Aug 15, 2012
63
0
0
Just for anyone else who was disappointed that the Nvidia driver-level SSAO for GW2 is barely noticeable, introduces glitches, and has quite a performance hit: using NV-Inspector, you can change the Ambient Occlusion compatibility bit in GW2's profile to that of:

Darksiders 2 (good performance, good looking soft-AO, this is the one I'm using)
TES5: Skyrim (middle of the road, somewhat darker AO)
Demigod (High performance cost, very nice and very dark)
And probably more!

They work perfectly: GW2 with SSAO looks great, these profiles don't have the transparency-related glitches that GW2's own AO profile has, and all are much more noticeable and better looking. The only minor problem is that the AO is rendered above the UI and does not play nice with fog; however, this is barely noticeable and does not really bother me. Mind that AO only works if you're using Native sampling, since driver-level SSAO cannot be forced when the framebuffer resolution is not equal to the output resolution.
I recommend setting AO-quality to "Quality".
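
To make the Native-sampling condition concrete, a tiny sketch (purely illustrative — this is just the stated rule, not how the driver actually detects it):

```python
# Illustration of the stated rule only -- not how the driver checks it:
# forced SSAO requires the framebuffer to match the output resolution.
def driver_ssao_can_apply(render_res, output_res):
    """True when GW2 renders at the display resolution (Native sampling)."""
    return render_res == output_res

native = (1920, 1080)
print(driver_ssao_can_apply((1920, 1080), native))  # True  (Native)
print(driver_ssao_can_apply((2880, 1620), native))  # False (Supersample)
print(driver_ssao_can_apply((1280, 720), native))   # False (Subsample)
```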

HexiumVII, I'm personally using 1.2x OGSSAA (using Nvidia's resolution-downscale method) and SMAA Injector. It looks fabulous. Aliasing and pixel crawling are completely gone.
Just remember to disable any AA flags you put in NV-I, and disable in-game FXAA; ArenaNet's implementation of it is a terrible, blurry mess.
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Gah, this game's controls are so counterintuitive. Poorly set up, or maybe I'm just not used to this genre of game. Yes, it appears that the jaggies do not completely go away when I select FXAA (which is the only AA option). They're lessened, but still there.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
Thanks, SomeoneSimple. I've been testing SSAO quite a bit and never got it to work properly in GW2. The fps hit is quite noticeable, but I've hardly seen any visual changes in the game. I saw some AO shadows under some plants, but not under most plants, and I never saw any AO on surfaces (buildings, rocks, etc.). In Skyrim the effect was quite clear, if you knew what to look for.

I think I've seen during past tests that my SSAO started working only when I used 4xMSAA via the control panel or via nVidia Inspector. I've always used Native sampling in GW2, but I'm not sure how to set my AA. Should I just use the FXAA option in the game and set the control panel to "application controlled"? Or should I set AA in-game to off and then set 4xMSAA in the driver?
And what about transparency AA? Does transparency MSAA work with in-game FXAA? Does it work with SSAO?

Any clue is helpful. Testing this takes a lot of time, and when it never does what you want/hope, it's hard to keep a clear head about your results. Can be very frustrating.
 

SomeoneSimple

Member
Aug 15, 2012
63
0
0
I think I've seen during past tests that my SSAO started working only when I used 4xMSAA via the control panel or via nVidia Inspector. I've always used Native sampling in GW2, but I'm not sure how to set my AA. Should I just use the FXAA option in the game and set the control panel to "application controlled"? Or should I set AA in-game to off and then set 4xMSAA in the driver?
And what about transparency AA? Does transparency MSAA work with in-game FXAA? Does it work with SSAO?

For now, I highly recommend disabling any form of MSAA (and AA bits) in GW2, since the only working bit is flawed and introduces glitches.
Transparency MSAA/SSAA don't work either.

If you can't get SSAO working (you can easily notice it in snowy areas), open Nvidia Inspector, reset your GW2 profile to Nvidia's default, change the AO bit to one of the above, set AO to Quality, and save.

And like I posted before, I personally think you shouldn't use in-game FXAA; it is horrible. For far better results, use the SMAA injector instead.

[Image: NIi0A.png]
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
FXAA sucks, that's no news. If you don't want jaggies, just turn on Supersampling in-game. It's a fantastic implementation within the engine, both in IQ and performance.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
I played a bit with the SSAO settings (and FXAA and MSAA). To be honest, I hardly see differences in-game. I haven't tried it in snowy areas yet, but I just don't see the effects that I see in Skyrim (under plants, in doorways, under rocks, etc.).

Without SSAO, my fps is ~60 in most areas (GTX 680 + 3570K @ 4 GHz, 1920x1200). With Quality SSAO it drops 10-20 frames, resulting in ~30 fps in some areas. It's just not worth it. A shame, because I really like SSAO in Skyrim (and Fallout 3).
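
In frame-time terms the hit is clearer; the fps pairs in this sketch are illustrative, not my exact measurements:

```python
# Frame-time view of an SSAO hit. A fixed per-frame cost hurts more the
# lower your starting fps, which is why "10-20 fps" understates it.
def ssao_cost_ms(fps_off, fps_on):
    """Per-frame cost of the effect, in milliseconds."""
    return 1000.0 / fps_on - 1000.0 / fps_off

print(f"{ssao_cost_ms(60, 45):.1f} ms/frame")  # 60 -> 45 fps: ~5.6 ms
print(f"{ssao_cost_ms(45, 30):.1f} ms/frame")  # 45 -> 30 fps: ~11.1 ms
```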

Enabling Supersampling in-game also has a huge impact on framerates. I can't remember the numbers, but when I tried it out (twice or so), the dip in fps was enough to make me immediately switch back to Native sampling. And Subsampling makes everything really blurry.

What is the best tool for applying SMAA? I am now trying the stuff from http://mrhaandi.blogspot.nl/p/injectsmaa.html. TIA.
 
Last edited:

Meekers

Member
Aug 4, 2012
156
1
76
I am in game right now with every setting maxed. I usually get around 60 fps out in the world, but I am suddenly getting 80 fps. I have not changed any of my settings or changed the oc on my card.

Edit: I just opened up afterburner and fought a couple mobs to check gpu usage. I was at 99% while playing. I guess it is time to bump up the OC a little more.
 
Last edited:

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
The only minor problem is that the AO is rendered above the UI and does not play nice with fog; however, this is barely noticeable and does not really bother me.
Enabling the SMAA injector was really easy. The resulting AA effect (with preset High) is indeed rather nice. No idea what the performance impact is, but it doesn't seem to hit harder than FXAA.

I also enabled SSAO, and for the first time it seems noticeable. I do see the SSAO "shadows" on top of some of the UI elements, like chat bubbles, quest/chat windows, etc. I had SSAO enabled in Tera (I played that for a few weeks) and had the same problem there. No big deal, I can live with it.

HexiumVII, I'm personally using 1.2x OGSSAA (using Nvidia's resolution-downscale method) and SMAA Injector. It looks fabulous.
How/where do I enable OGSSAA? I don't see any options for OGSSAA in the injector.ini or SMAA.h files, and I don't see any option like that in the Nvidia Inspector tool either.
 

SomeoneSimple

Member
Aug 15, 2012
63
0
0
How/where do I enable OGSSAA? I don't see any options for OGSSAA in the injector.ini or SMAA.h files, and I don't see any option like that in the Nvidia Inspector tool either.

Well, like I said: I'm personally using 1.2x OGSSAA (using Nvidia's resolution-downscale method)

By enabling a custom resolution, the game renders at a higher resolution than your monitor supports; the driver downsamples this, basically providing OGSSAA.
If you are not familiar with this, here's a link that can help you out.
I'm using 1.2x my resolution, so forcing 2016x1260 on my 1680x1050 display.
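
If you want the arithmetic for your own display, here's a trivial sketch (the 1920x1080 line is just an extra example, not from this thread):

```python
# The arithmetic behind the custom resolution: scale each axis by the
# chosen factor. 1680x1050 at 1.2x gives exactly the numbers above.
def downsample_resolution(width, height, factor):
    """Render resolution for driver downsampling at a per-axis factor."""
    return round(width * factor), round(height * factor)

print(downsample_resolution(1680, 1050, 1.2))  # (2016, 1260)
print(downsample_resolution(1920, 1080, 1.2))  # (2304, 1296)
# Note the factor is per axis: 1.2x per axis means 1.44x the pixels,
# and roughly that much extra GPU load.
```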

If you don't want jaggies, just turn on Supersampling in-game. It's a fantastic implementation within the engine, both in IQ and performance.

No, it's not. Like ArenaNet's FXAA implementation, in-game supersampling is similarly horrible. It uses OGSSAA, which on its own has little effect on diagonals; that's why many of us prefer the rotated grid of SGSSAA. Also, there is a ton of pixel crawling going on, since from what I can tell it's running at 1.5x SSAA, not 2x like one would imagine.

Add to that that enabling supersampling in-game disables driver-level AO and diminishes the usefulness of SMAA, and the only reason to use in-game supersampling is if you were running an AMD GPU.
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Well, like I said: I'm personally using 1.2x OGSSAA (using Nvidia's resolution-downscale method)
No, it's not. Like ArenaNet's FXAA implementation, in-game supersampling is similarly horrible. It uses OGSSAA, which on its own has little effect on diagonals; that's why many of us prefer the rotated grid of SGSSAA.
:confused: But you're clearly not unless you misspoke (you're using OGSSAA too).
Also, there is a ton of pixel crawling going on, since from what I can tell it's running at 1.5x SSAA, not 2x like one would imagine.
And you're only using 1.2x. Again, you're using worse settings than ANet's standard implementation yet you disapprove of it, which without further explanation seems idiotic.
Add to that that enabling supersampling in-game disables driver-level AO and diminishes the usefulness of SMAA, and the only reason to use in-game supersampling is if you were running an AMD GPU.
The driver-level AO I can't comment on, because I haven't seen it (although I've seen people post that it's poorly implemented). I've also noticed reports saying that NVIDIA GPUs are tanking with in-game supersampling enabled, so if your FPS are low with that method, then it's understandable that it's a poorer compromise. I think AMD's muscle at higher resolutions might be beneficial here. I don't even have my card overclocked beyond stock voltage, and I rarely drop below 60 FPS with Supersampling enabled @ 2560x1600.
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I'm still running this on a single card as there is still no SLI working in this game. :thumbsdown:
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
So you need a brand new $300 GPU to get a solid 60 FPS?

This is unacceptable IMO.

Companies should be targeting $150 GPUs for solid 60 FPS.

Never has it been more clear that software companies are working hand in hand with hardware vendors to ensure that consumers need to buy the latest greatest hardware in order to play the game they have been waiting years for.
 

SomeoneSimple

Member
Aug 15, 2012
63
0
0
:confused: But you're clearly not unless you misspoke (you're using OGSSAA too).
And you're only using 1.2x. Again, you're using worse settings than ANet's standard implementation yet you disapprove of it, which without further explanation seems idiotic.
I did not misspeak. I said that OGSSAA on its own has little effect on diagonals; that's why I am using it in conjunction with SMAA.
If SGSSAA were an option, I'd enable that, but it is not.

To make my opinion easily understandable:
1.2x driver OGSSAA + SMAA > 1.5x in-game OGSSAA (+ in-game FXAA)/(+ incorrectly applied SMAA)

Since SMAA can't be injected directly into the game's supersampled framebuffer, in-game 1.5x OGSSAA + SMAA looks worse than 1.2x OGSSAA with correctly applied SMAA.
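
To put rough numbers on the performance side (assuming both factors are per-axis scales, which is my reading of the posts, not a confirmed spec):

```python
# Rough pixel-cost comparison, assuming both factors are per-axis scales.
def relative_pixel_cost(per_axis_factor):
    """Framebuffer pixels relative to native resolution."""
    return per_axis_factor ** 2

print(f"1.2x driver OGSSAA: {relative_pixel_cost(1.2):.2f}x pixels")  # 1.44x
print(f"1.5x in-game SSAA:  {relative_pixel_cost(1.5):.2f}x pixels")  # 2.25x
# The in-game path shades ~56% more pixels and still leaves you with
# ArenaNet's FXAA, while the driver path leaves headroom for SMAA.
```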

But to each their own. I think we all agree here that Nvidia and AMD should just implement proper support for MSAA and SGSSAA in GW2, since I doubt ArenaNet has any plans to implement it themselves.
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
So you need a brand new $300 GPU to get a solid 60 FPS?

This is unacceptable IMO.

Companies should be targeting $150 GPUs for solid 60 FPS.

Never has it been more clear that software companies are working hand in hand with hardware vendors to ensure that consumers need to buy the latest greatest hardware in order to play the game they have been waiting years for.
Drop your settings. There's no conspiracy here: if you want eye candy, get a GPU to run said eye candy. It's not complicated.
I did not misspeak. I said that OGSSAA on its own has little effect on diagonals; that's why I am using it in conjunction with SMAA.
If SGSSAA were an option, I'd enable that, but it is not.

To make my opinion easily understandable:
1.2x driver OGSSAA + SMAA > 1.5x in-game OGSSAA (+ in-game FXAA)/(+ incorrectly applied SMAA)

Since SMAA can't be injected directly into the game's supersampled framebuffer, in-game 1.5x OGSSAA + SMAA looks worse than 1.2x OGSSAA with correctly applied SMAA.

But to each their own. I think we all agree here that Nvidia and AMD should just implement proper support for MSAA and SGSSAA in GW2, since I doubt ArenaNet has any plans to implement it themselves.
Oh, OK, I knew I was missing something. To be honest, the more I've been playing, the more I think that at my resolution I actually prefer not running any AA or supersampling. The loss of crispness/sharpness is more noticeable than any aliasing.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
So you need a brand new $300 GPU to get a solid 60 FPS?
No, you don't. You need a good GPU if you want all settings on Ultra at 60 fps.
This is unacceptable IMO.
No, it isn't. I like it. If you don't like it, go buy and play another game.
Companies should be targeting $150 GPUs for solid 60 FPS.
No, they shouldn't.
Companies should build games with scalable engines. By scalable engines I mean: the more compute power you have, the more features you can enable. Scalable engines and scalable games have been around since the early Unreal and Quake engines. Crysis 1 was the ultimate example of a scalable game.

The big problem with games over the last few years has been that they shipped with only the technology and content that a (below-)average PC could handle, maybe because they were developed for multiple platforms. The result was that customers with fast hardware were looking at a game that looked the same as it did on a mediocre PC. Now imho, *that* is unacceptable.

Never has it been more clear that software companies are working hand in hand with hardware vendors to ensure that consumers need to buy the latest greatest hardware in order to play the game they have been waiting years for.
If you don't like something, don't buy it.
If you want a game that looks mediocre on mediocre hardware, go buy it and play it.
But don't tell other people they can't play games with nicer graphics than yours just because you don't want to buy the hardware for it.

Scalable engines. It's almost like democracy. Everybody gets what they want.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I didn't have time to read this thread, so here's my quick question: is CF working yet? I bought this game the day it launched but played only 20 minutes, because it's just not playable on a single 6970. :( That state of affairs is really getting on my nerves; I play most of my games with just 1/4th of my hardware. The other game I bought is Prototype 2, and it also doesn't support CF, and it's been out a couple of months. Next time I won't bother with CF.
 

VisceralM

Member
Feb 1, 2005
92
0
0
AO, even the forced kind using Nvidia Inspector, requires Reflections to be turned down to Sky + Terrain and Supersampling to be turned off to work at all. The default AO is a joke, not even noticeable; the other profiles mentioned are much more noticeable. Oddly, I don't seem to get much of a performance hit using it on my overclocked 670; it never tanks my FPS the way someone mentioned above.
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
So you need a brand new $300 GPU to get a solid 60 FPS?

This is unacceptable IMO.

Companies should be targeting $150 GPUs for solid 60 FPS.

Never has it been more clear that software companies are working hand in hand with hardware vendors to ensure that consumers need to buy the latest greatest hardware in order to play the game they have been waiting years for.

You need a high-end card to max out all the settings. This is how everything in the world works, and really not a revelation. Almost every car can do freeway speeds, but if you want to go faster than everyone else, you have to spend a lot more. If you want a nice house for a family of 4, you can find an average home, but if you want a pool, a home theater room, and a 6-car garage, you spend a lot more money.
 

RaistlinZ

Diamond Member
Oct 15, 2001
7,470
9
91
I am in game right now with every setting maxed. I usually get around 60 fps out in the world, but I am suddenly getting 80 fps. I have not changed any of my settings or changed the oc on my card.

Edit: I just opened up afterburner and fought a couple mobs to check gpu usage. I was at 99% while playing. I guess it is time to bump up the OC a little more.

It's not an OC issue. My GPU is also taxed at 99-100% during normal gameplay when a lot of action is going on.
 
Feb 4, 2009
35,862
17,402
136
This is going somewhere weird now. Being realistic, it's running fine right now on a Q9650 & a GTX 260 (original edition). The game looks beautiful on medium-high settings. I have no idea what the fps is; regardless, it runs and looks beautiful.
 
Last edited: