[Multiple Sites] Watch Dogs GPU benchmark roundup


blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Seems to me the VRAM scare was/is overstated. Not only does it appear that Ultra can be enabled on 2GB cards, but looking at the TechSpot numbers, a 3GB 7970 is performing a mere 1 fps better than a 2GB 680 and WORSE than a 2GB 770. And that trend is the same at both 1080p and 1600p.

Seems to me there is confusion over Ultra settings vs Ultra textures and that you may want to check out the HARDOCP link people are referring to where they say that even 3GB isn't enough for smooth gameplay w/ Ultra textures. 4GB+ is preferred.

We found out that even on the GeForce GTX 780 Ti with 3GB of VRAM performance can drop as new textures and scenery are loaded into memory. This was most notable moving the camera, or driving in the open world city. With the XFX Radeon R9 290X DD with its 4GB of VRAM we experienced smooth gameplay with no drops in performance using "Ultra" textures.
Is there a difference in the image quality between "High" and "Ultra" textures? There is definitely a difference in image quality, "Ultra" textures are noticeably better. We will have lots of image quality comparisons later today.



Keep in mind that we used the AMD 14.6 Beta driver for our Radeon video cards. We expect this driver to be dropped today and it should address performance issues you have seen written up on the previous current 14.4 WHQL driver. While there has been some Chicken Little proclamations, Team Red or Team Green not getting full looks at new release gaming code when the other team has is nothing new.

HardOCP will have a quick follow up to this article today on Watch Dogs image quality.

http://hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/5
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
From what I've seen in various articles, Ultra textures cannot be enabled on 2GB cards. [H]'s article was clear on this.

Wondering if confusion is due to ultra settings vs ultra textures or if that's not the issue.

Not sure, I haven't read the [H] article. I'm just looking at the techspot charts and the one labeled Ultra has several 2GB cards in there.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Seems to me there is confusion over Ultra settings vs Ultra textures and that you may want to check out the HARDOCP link people are referring to where they say that even 3GB isn't enough for smooth gameplay w/ Ultra textures. 4GB+ is preferred.

More ram is always preferred. But there could certainly be some confusion. I'll wait until the full [H] review is out and give it a thorough read through.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Seems to me the VRAM scare was/is overstated. Not only does it appear that Ultra can be enabled on 2GB cards, but looking at the TechSpot numbers, a 3GB 7970 is performing a mere 1 fps better than a 2GB 680 and WORSE than a 2GB 770. And that trend is the same at both 1080p and 1600p.


Look at the graphs in post eight. The GTX770 2GB scores better average FPS than the 280X, but the minimum frame rate is quite a bit lower. I'd like to see more benches, but it would seem 2GB may limit you in some ways in this game.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Look at the graphs in post eight. The GTX770 2GB scores better average FPS than the 280X, but the minimum frame rate is quite a bit lower. I'd like to see more benches, but it would seem 2GB may limit you in some ways in this game.

Perhaps you linked me the wrong chart, because that one shows the 770 with an advantage in both minimum and average.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
http://hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/3

[HardOCP frame-rate-over-time graphs from the linked page: GTX 780 Ti 3GB vs. XFX R9 290X 4GB with "Ultra" textures]


We tested, and re-tested this to be sure what we are seeing was correct. The NVIDIA GeForce GTX 780 Ti 3GB video card suffers from performance issues that directly relate to texture loading. As we entered new scenes in the game, or were entering or swinging the camera around to new areas that had to load the scene and textures there were at times severe drops in performance as the textures loaded in. This happened indoors and while we were driving in the open world city. It can clearly be seen on the graph.

The Radeon R9 290X GPU based video card has more VRAM on board, 4GB, and this seems to have given a different and better gameplay experience in this game. None of the performance drops experienced with the GTX 780 Ti happen on the Radeon R9 290X GPU based video card. The extra VRAM has made it so that performance is smooth with no discernible lag or stuttering while loading new scenes or new textures as we move about the game. This can really be felt as you drive through the city in the open world, it is much smoother on the Radeon R9 290X GPU based video card.

Due to those performance drops the XFX R9 290X DD video card is faster in minimum and average framerates and produces the best gameplay experience at "Ultra" texture settings. While the "performance" is playable on the GTX 780 Ti, the experience is choppy and inconsistent.

Ultra textures demand a lot of VRAM, and while 3GB may be required to allow it to work, it is certainly not the optimal solution for the smoothest performance.

We haven't even enabled AA yet at this point, and something like MSAA or TXAA is going to eat into VRAM capacity as well. Imagine the GTX 780 Ti with "Ultra" textures enabled and MSAA or TXAA enabled. It will be even less desirable.




Some pretty interesting results if you want to use the Ultra textures included with the game, and this is before even trying to use any anti-aliasing. The GTX 680/770, with their standard 2GB VRAM buffers, aren't even able to run the Ultra settings, and even the 3GB 780/780 Ti are apparently having issues. Just look at those minimums; not looking good.
 
Feb 19, 2009
10,457
10
76
How are some of these review sites lumping all the 2GB cards into their Ultra tests with Ultra textures when you can't enable it with 2GB of VRAM?

My buddies are playing it now on their 2GB cards, and they say they can't turn on Ultra textures.
Review site confusion, or just misleading?
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
How are some of these review sites lumping all the 2GB cards into their Ultra tests with Ultra textures when you can't enable it with 2GB of VRAM?

My buddies are playing it now on their 2GB cards, and they say they can't turn on Ultra textures.
Review site confusion, or just misleading?



Just cross the review sites that can't easily explain and warn readers about the 2GB limitation for Ultra textures in this game off your trusted-sources list. One of the first things that should be readily apparent to reviewers is how the Ultra texture setting plays into a gaming rig's requirements, and they ought to feel compelled to clearly impart that to their readership, particularly given Ultra textures' impact on the look of the game. This is why we go to tech sites. I'm wondering if the game review guides supplied by nVidia may have something to do with the confusion being put out by some review sites.

I've got [H] and Anandtech, and that's basically it for sites whose info I'll count on. PCPer to a lesser extent, and I really like bit-tech and Guru3D (love Wizzard), but I'm a bit hesitant to trust their takes alone.
 


Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Thanks... Now the real question... Is this game worth the $60 asking price?

No.

It's worth $40-50 IMO.

A $60 game better be gd'mnd good and special. Right now a lot of new games are asking $60 and there's nothing special about them.

BUT... in Watch Dogs' case, given the hype (I hate it, but I'm a sucker for it), I might pull the trigger at $60 if I couldn't find it for less.

There should be codes for Green Man Gaming that can get you the game for <$50. Though with GMG you'll be tied to uPlay for launching this game; going through Steam is cleaner but more expensive right now.
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Please post benchmark results for Watch Dogs here, and do not start new threads.
-- stahlhart
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Ultra settings can be enabled on 2GB; it's just jerky during gameplay. I've just upgraded from a 670 to a 780 and it runs nicely so long as Vsync is off.

Monitoring the game, VRAM usage often approaches 3GB.
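If anyone wants to watch it themselves, here's a minimal sketch (assuming an NVIDIA card with nvidia-smi on the PATH, reading GPU 0) that polls VRAM usage once a second while you play and tracks the peak:

```python
# Poll GPU 0 memory usage once per second via nvidia-smi and track the peak.
# Assumes an NVIDIA card with nvidia-smi available on the PATH.
import subprocess
import time

def vram_used_mib() -> int:
    """Return current VRAM usage in MiB as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0].strip())  # first line = GPU 0

if __name__ == "__main__":
    peak = 0
    try:
        while True:
            used = vram_used_mib()
            peak = max(peak, used)
            print(f"VRAM used: {used} MiB (peak {peak} MiB)")
            time.sleep(1)
    except KeyboardInterrupt:
        print(f"Peak VRAM usage observed: {peak} MiB")
```

MSI Afterburner or GPU-Z will show you the same number with an in-game overlay; this is just the scriptable version.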
 

Majcric

Golden Member
May 3, 2011
1,409
65
91
How are some of these review sites lumping all the 2GB cards into their Ultra tests with Ultra textures when you can't enable it with 2GB of VRAM?

My buddies are playing it now on their 2GB cards, and they say they can't turn on Ultra textures.
Review site confusion, or just misleading?

You can enable Ultra textures on 2GB cards.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
No.

It's worth $40-50 IMO.

A $60 game better be gd'mnd good and special. Right now a lot of new games are asking $60 and there's nothing special about them.

BUT... in Watch Dogs' case, given the hype (I hate it, but I'm a sucker for it), I might pull the trigger at $60 if I couldn't find it for less.

There should be codes for Green Man Gaming that can get you the game for <$50. Though with GMG you'll be tied to uPlay for launching this game; going through Steam is cleaner but more expensive right now.

AFAIK, Steam requires uPlay as well.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
You can enable Ultra textures on 2GB cards.

Ultra Graphics Quality differs from Ultra Textures. Either HardOCP is wrong or a lot of 2GB people are whoo-hooing at running Ultra Textures but in fact are running Ultra Graphics Quality with sub-Ultra Textures. Or perhaps a third possibility like enabling Ultra + Ultra with 2GB cards if you are under a certain resolution threshold?

http://www.hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/2

EDIT: Ok apparently the thing is you CAN enable Ultra Textures on 2GB cards, but the game itself warns you not to do so (you can see screenshots of this), and even NVidia warns you not to do so, saying:

The caveat is that you'll require a GPU with 3GB of VRAM to play with maximum-detail textures enabled, without encountering stutters. On cards with less than 3GB of VRAM, previously used textures will be removed frequently from memory to make way for those visible on the path ahead. This can result in hitching, stuttering, and even temporary pauses if other system-related bottlenecks are encountered during the process. To minimize the potential consequences of using less than 3GB of VRAM, we recommend that you load the game onto a SSD and refrain from using MSAA and TXAA hardware anti-aliasing modes, which consume VRAM, reducing the amount available for textures.

If stuttering does occur and detracts from your experience, the 2GB 'High' setting is instead recommended. At first glance, texture fidelity appears unchanged. Looking closer, however, reveals moderately blurred text on surfaces, a reduction of the number of unique textures on surfaces, and finer details like the grooved material of Aiden's hoodie being lost. During gameplay, these quality changes will likely go unnoticed, so if you are struggling with stutters High is definitely recommended.

http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide#textures
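What NVIDIA is describing is basically cache thrashing: once the set of textures the game wants resident exceeds the VRAM budget, the least-recently-used ones get evicted and have to be streamed back in later, and every re-upload is a potential hitch. A toy sketch of the idea (not the engine's actual code, just an illustration; the texture sizes and budgets are made up):

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU model of a VRAM texture pool: exceed the budget and the
    oldest textures get evicted, forcing a slow re-upload next time."""

    def __init__(self, budget_mib: int):
        self.budget_mib = budget_mib
        self.resident = OrderedDict()  # texture name -> size in MiB
        self.used_mib = 0
        self.uploads = 0               # each upload is a potential hitch

    def bind(self, name: str, size_mib: int) -> None:
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        self.uploads += 1                    # texture must be streamed in
        while self.used_mib + size_mib > self.budget_mib and self.resident:
            _, freed = self.resident.popitem(last=False)  # evict LRU texture
            self.used_mib -= freed
        self.resident[name] = size_mib
        self.used_mib += size_mib

# Same stream of texture requests, two VRAM budgets; the smaller one thrashes.
scene = [f"tex{i % 40}" for i in range(400)]  # 40 textures x 64 MiB = ~2.5 GiB working set
for budget in (2048, 4096):
    cache = TextureCache(budget)
    for tex in scene:
        cache.bind(tex, 64)
    print(f"{budget} MiB budget -> {cache.uploads} texture uploads")
```

With the 4096 MiB budget everything fits after the first pass (40 uploads total); with 2048 MiB the same access pattern re-uploads on nearly every bind, which is the hitching NVIDIA is warning about.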

Ultra vs High textures comparison here: http://international.download.nvidi...dogs-textures-comparison-1-ultra-vs-high.html

(The differences will probably be more apparent if you are looking closely at something vs. standing far away)
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Ultra Graphics Quality differs from Ultra Textures. Either HardOCP is wrong or a lot of 2GB people are whoo-hooing at running Ultra Textures but in fact are running Ultra Graphics Quality with sub-Ultra Textures. Or perhaps a third possibility like enabling Ultra + Ultra with 2GB cards if you are under a certain resolution threshold?

http://www.hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/2

EDIT: Ok apparently the thing is you CAN enable Ultra Textures on 2GB cards, but the game itself warns you not to do so (you can see screenshots of this), and even NVidia warns you not to do so, saying:

The caveat is that you’ll require a GPU with 3GB of VRAM to play with maximum-detail textures enabled, without encountering stutters. On cards with less than 3GB of VRAM, previously used textures will be removed frequently from memory to make way for those visible on the path ahead. This can result in hitching, stuttering, and even temporary pauses if other system-related bottlenecks are encountered during the process. To minimize the potential consequences of using less than 3GB of VRAM, we recommend that you load the game onto a SSD and refrain from using MSAA and TXAA hardware anti-aliasing modes, which consume VRAM, reducing the amount available for textures.

If stuttering does occur and detracts from your experience, the 2GB ‘High’ setting is instead recommended. At first glance, texture fidelity appears unchanged. Looking closer, however, reveals moderately blurred text on surfaces, a reduction of the number of unique textures on surfaces, and finer details like the grooved material of Aiden’s hoodie being lost. During gameplay, these quality changes will likely go unnoticed, so if you are struggling with stutters High is definitely recommended.

http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide#textures


[H]ard even found that the 780 & 780ti are stuttering and providing much lower minimum frames than the 290/290X with the ultra textures.

Right now R9 290/290X are offering the best experience on maxed settings in the game unless you are using a Titan or 6GB 780. Pretty sad.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
[H]ard even found that the 780 & 780ti are stuttering and providing much lower minimum frames than the 290/290X with the ultra textures.

Right now R9 290/290X are offering the best experience on maxed settings in the game unless you are using a Titan or 6GB 780. Pretty sad.

I'm gonna refer you to post 22 in this thread :p

We found out that even on the GeForce GTX 780 Ti with 3GB of VRAM performance can drop as new textures and scenery are loaded into memory. This was most notable moving the camera, or driving in the open world city. With the XFX Radeon R9 290X DD with its 4GB of VRAM we experienced smooth gameplay with no drops in performance using "Ultra" textures.
Is there a difference in the image quality between "High" and "Ultra" textures? There is definitely a difference in image quality, "Ultra" textures are noticeably better. We will have lots of image quality comparisons later today.



Keep in mind that we used the AMD 14.6 Beta driver for our Radeon video cards. We expect this driver to be dropped today and it should address performance issues you have seen written up on the previous current 14.4 WHQL driver. While there has been some Chicken Little proclamations, Team Red or Team Green not getting full looks at new release gaming code when the other team has is nothing new.

HardOCP will have a quick follow up to this article today on Watch Dogs image quality.

http://hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/5
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
The game runs really badly on my 290. Yeah, it doesn't stutter with Ultra textures, but the framerate fluctuates quite a lot. It's not all roses.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
The game runs really bad on my 290. Yeah, it doesn't stutter with Ultra textures but the framerate fluctuates quite a lot. It's not all roses.

Use 14.6 beta

http://www.guru3d.com/files_details/amd_catalyst_14_6_beta_download.html

Eurogamer gives WD 7/10 so I'll pass on the game for now; I rarely buy games at launch if they score that low. I'm just interested in WD because it may be a sign of things to come. There's been a lot of speculation about how system intensive next-gen console ports will be. Some say 2GB VRAM is enough, others like myself have been cautioning for a while now that 2GB VRAM may not be enough, not when XBO and PS4 give around 5GB of combined RAM to game devs. (The rest is reserved for OS on those consoles.)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You can enable Ultra textures on 2GB cards.

Ya, and you get an unplayable game.

http://www.pcgameshardware.de/screenshots/original/2014/05/Watch_Dogs-1080p-Full-HD-4xMSAA-pcgh.png

280X 3GB Gaming = 27.7 fps
770 2GB = 5.9 fps

With Ultra Textures @ 1080P with MSAA, you'd be at 2.5-3GB of VRAM. You can try running Ultra textures without any AA.
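For a rough feel of what MSAA alone adds on top of that, a back-of-envelope sketch (my own assumptions: 4 bytes per color sample, 4 bytes depth/stencil per sample, one resolved target; it ignores the game's G-buffer and other render targets, so the real cost is higher):

```python
def msaa_framebuffer_mib(width, height, samples, bytes_color=4, bytes_depth=4):
    """Very rough estimate: multisampled color + depth targets plus one
    resolved color target. Ignores compression and extra render targets."""
    multisampled = width * height * samples * (bytes_color + bytes_depth)
    resolved = width * height * bytes_color
    return (multisampled + resolved) / (1024 ** 2)

for samples in (1, 2, 4, 8):
    print(f"{samples}x @ 1920x1080: ~{msaa_framebuffer_mib(1920, 1080, samples):.0f} MiB")
```

That's only tens to a couple hundred MiB for the framebuffers themselves, but when Ultra textures already have you sitting at 2.5-3GB, it's exactly the headroom a 2-3GB card doesn't have.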

From GameGPU: "VRAM should be about 2 gigabytes using the High textures and three gigabytes for ultra mode."

[GameGPU VRAM usage chart: High vs. Ultra textures]


More ram is always preferred. But there could certainly be some confusion. I'll wait until the full [H] review is out and give it a thorough read through.

Guru3D's testing revealed the issue before anyone else but most people didn't read their review since they only looked at the charts.

"We then lowered texture quality settings from the Ultra setting (recommended for 3GB graphics cards) towards HIGH (recommended for 2GB graphics cards) to give this card a little more Video memory to fiddle and fool around in. ....So any graphics card with 2GB of memory set at HIGH/HIGH @ 2560x1440, would break down as frames start so swap back and forth in the frame-buffer. With such a card you'd need to be at Full HD / 1920x1080 maximum."

If people only looked at the benchmark scores, they wouldn't see the issues of 780Ti SLI vs. 290X CF.

"Radeon R9 295 does roughly 49 FPS on average in this scene sequence
GeForce GTX 780 Ti SLI does roughly 56 FPS on average in this scene sequence."


780Ti SLI should be clearly faster but FCAT shows a stuttering fest.

[Guru3D FCAT frame-time graph: GTX 780 Ti SLI vs. R9 295]


In the conclusion, for 2GB mid-range cards:

"So for the mainstream gamers, above a quick run at High Quality settings with high Quality Textures. My recommendation for the guys with a cheaper mid-range 2GB card, switch to high quality settings and FXAA as maximum. Your playable resolution will be be 1920x1080 (Full HD)."
http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,9.html
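This is also why averages alone hide the problem. If you have a FRAPS/FCAT-style frame-time dump, here's a quick sketch of pulling out a stutter-sensitive number (99th-percentile frame time) alongside the average, assuming a hypothetical frametimes.txt with one value in milliseconds per line:

```python
# Average FPS vs. 99th-percentile frame time from a frame-time log.
# Assumes a hypothetical "frametimes.txt": one frame time per line, in ms.
import statistics

def summarize(frametimes_ms):
    avg_ms = statistics.mean(frametimes_ms)
    ordered = sorted(frametimes_ms)
    p99_ms = ordered[int(0.99 * len(ordered)) - 1]  # approximate index, fine for long captures
    return {
        "average FPS": 1000.0 / avg_ms,
        "99th percentile frame time (ms)": p99_ms,
        "99th percentile FPS": 1000.0 / p99_ms,  # what the worst 1% of frames feel like
    }

if __name__ == "__main__":
    with open("frametimes.txt") as f:
        times = [float(line) for line in f if line.strip()]
    for label, value in summarize(times).items():
        print(f"{label}: {value:.1f}")
```

Two cards can post similar averages while one of them has far worse 99th-percentile frame times, which is the kind of gap the FCAT plot above is showing.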
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
The frame rate hangs waiting on textures to load.

On the plus side, there's no texture pop-in :awe: