Interesting Watch Dogs Benchmarks - GTX 770 4GB Sli vs. R290 4GB XF


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
With all the talk of driver CPU overhead and how AMD vs. NVidia fare in DX11 games, I found this interesting from Termie's post in the other thread.

Termie's results are a bit suspect in my opinion, as they don't list the system specs or even what drivers he used for the NVidia setup.

That's why I don't trust just anyone when it comes to reviews..

We are also aware that NVidia has intentionally hurt competitors' performance in the past, and I don't think this kind of behavior from NVidia's involvement in game development can be discounted when we see a game like Watch Dogs that goes so severely against the norm of GPU performance between AMD and NVidia.
How do you explain AMD's poor results in BF4 in D3D compared to NVidia then?

And how on Earth would NVidia even be able to hurt AMD's performance in Watch Dogs? The game was made by Ubisoft, not NVidia.. The code is optimized for DX11 hardware, and not NVidia hardware.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Termie's results are a bit suspect in my opinion, as they don't list the system specs or even what drivers he used for the NVidia setup.

That's why I don't trust just anyone when it comes to reviews..

How do you explain AMD's poor results in BF4 in D3D compared to NVidia then?

And how on Earth would NVidia even be able to hurt AMD's performance in Watch Dogs? The game was made by Ubisoft, not NVidia.. The code is optimized for DX11 hardware, and not NVidia hardware.

Hey man, can you please freakin stop.

I'm tired of hearing you now. You claim AMD CrossFire is bad, yet you only own GTX 770s in SLI.

And now you accuse Termie of falsifying the results? That is very bad. In my opinion he did an awesome, unbiased review with the data he gathered.

Please stop defending your cards; instead, just go play a game and enjoy them, because you have good gear, dude, just like someone with an R9 290.

(And please stop pretending Watch Dogs is a game that has been on the market for six months. There are some issues and they need to fix them, just like with any game.)
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
How about nv dropping frames in sli?
[Charts: BF4 at 3840x2160 (FRAPS FPS, observed FPS, stutter) and Bioshock at 3840x2160 (stutter)]
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
How about nv dropping frames in sli?

I already admitted that AMD has the edge at 4K resolutions, but for 1600p and below, SLI is much better as seen by the Anandtech results...

NVidia's next gen cards will likely have more architecture optimizations for 4K, as 4K gaming is still a rather new phenomenon and very few gamers are running that setup..
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Hey man, can you please freakin stop.

Dude, stop trolling me. The next time you tell me to stop giving my opinion I'm just going to call a moderator.. You obviously have nothing to contribute to this thread other than telling me I'm a bad guy or disingenuous because my opinion doesn't match up with yours D:
 

RaulF

Senior member
Jan 18, 2008
844
1
81
I already admitted that AMD has the edge at 4K resolutions, but for 1600p and below, SLI is much better as seen by the Anandtech results...

NVidia's next gen cards will likely have more architecture optimizations for 4K, as 4K gaming is still a rather new phenomenon and very few gamers are running that setup..

You keep telling yourself that.

I have experienced both SLI and CrossFire. And while it does depend on the game too, CrossFire is more stable than SLI.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Dude, stop trolling me. The next time you tell me to stop giving my opinion I'm just going to call a moderator.. You obviously have nothing to contribute to this thread other than telling me I'm a bad guy or disingenuous because my opinion doesn't match up with yours D:

I'm not trolling you, bud, but you've kind of been derailing the thread since the beginning. You are now accusing Termie of skewing the results, and you are posting frame latency graphs of Crysis 3 and Grid 2. As a reminder, this thread is about Watch Dogs, and it was known before its release that it was badly optimized for AMD.

Check this article to understand what I'm talking about
Why 'Watch Dogs' Is Bad News For AMD Users -- And Potentially The Entire PC Gaming Ecosystem

I hope you realize how much work is involved into doing what Termie did with his own gear and money.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
I'm not trolling you, bud, but you've kind of been derailing the thread since the beginning. You are now accusing Termie of skewing the results, and you are posting frame latency graphs of Crysis 3 and Grid 2. As a reminder, this thread is about Watch Dogs, and it was known before its release that it was badly optimized for AMD.

Check this article to understand what I'm talking about
Why 'Watch Dogs' Is Bad News For AMD Users -- And Potentially The Entire PC Gaming Ecosystem

I hope you realize how much work is involved into doing what Termie did with his own gear and money.

If Termie's bench had shown a GTX 660 beating a 290X, it would certainly be true in his mind. :biggrin:

Infraction issued for member callout, and one week off.
-- stahlhart
 
Last edited by a moderator:

Hitman928

Diamond Member
Apr 15, 2012
5,244
7,792
136
There's plenty more info out there for those willing to look. Like these BF4 dx11 benches. . .

[Charts: BF4 at 2560x1440 (FRAPS FPS, observed FPS, stutter)]

All the talk of driver overhead, multi-threading, etc., is just supposition from an outside perspective, without anywhere near the tools, access, or proper evidence to prove it one way or the other. If you really wanted to know whether one company has less overhead than the other, you'd need either a proper profiling tool set or lots of consistent testing across a stable environment and many games/situations. Unfortunately, that doesn't exist as far as I've seen.

Even if one did have less overhead or whatever magic in their drivers, is it really that significant an advantage if the other company's cards trade blows with yours anyway? That's where we're at right now. Some games/engines are going to go AMD's way, others are going to go Nvidia's. That's just how it is. Both companies are putting out great products right now that compete very well against each other. Just buy the one you want and enjoy.
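(For what it's worth, the number-crunching behind these FRAPS/observed-FPS/stutter graphs isn't magic. Here's a rough sketch of pulling average FPS and a 99th-percentile frame time out of a frame-time log; the file name and the one-value-per-line format are assumptions for illustration, not any particular tool's output.)

Code:
// Sketch: average FPS and 99th-percentile frame time from a FRAPS-style log.
// "frametimes.csv" and its one-millisecond-value-per-line format are assumed here.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::ifstream in("frametimes.csv");
    std::vector<double> ms;
    for (double t; in >> t;) ms.push_back(t);        // per-frame time in milliseconds
    if (ms.empty()) { std::cerr << "no samples\n"; return 1; }

    const double total = std::accumulate(ms.begin(), ms.end(), 0.0);
    const double avg_fps = 1000.0 * ms.size() / total;

    std::sort(ms.begin(), ms.end());
    const double p99 = ms[static_cast<size_t>(0.99 * (ms.size() - 1))];

    std::cout << "average FPS: " << avg_fps << "\n"
              << "99th-percentile frame time: " << p99 << " ms\n";
}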
 
Last edited:

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Warning to everyone in this thread to get your tempers under control, or I'm going to lose mine even more.
-- stahlhart
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
There's plenty more info out there for those willing to look. Like these BF4 dx11 benches. . .

Did the reviewer specifically state he used DX11? I checked the review and he didn't say that he did, so I assume he used the Mantle path. Also, that's from the single-player campaign if I'm not mistaken, which isn't really CPU bound except in one particular level.

The Guru3D benchmark can be discarded outright as they used Windows 7, which cripples NVidia's performance.

Anyway, it's funny how benchmark results differ so much from review site to review site, with Anandtech getting markedly different results than PCper..

All the talk of driver overhead, multi-threaded, etc., is just supposition from an outside perspective without near the tools, access, or proper evidence to prove it one way or the other
I don't think it's just supposition. It's been fairly well documented by several reviewers, including Tech Report, PClab.pl, PCgameshardware.de..

One thing I've noticed is that the more threaded a game engine is, the more likely NVidia is to have the advantage. Watch Dogs is just the latest example of that, and you can't really blame Gameworks for it, as the only Gameworks feature is HBAO+.
 

Hitman928

Diamond Member
Apr 15, 2012
5,244
7,792
136
Did the reviewer specifically state he used DX11?

Yep, it's at the top of the page: "Battlefield 4 (DirectX 11)." Some people from the AMD fan club were actually upset they didn't use Mantle to further set the AMD cards apart.
http://www.pcper.com/reviews/Graphi...-295X2-8GB-Graphics-Card-Review/Battlefield-4

The Guru3D benchmark can be discarded outright as they used Windows 7, which cripples NVidia's performance.

We've been over this time and time again on this forum. When BF4 first launched, it underperformed in Windows 7 compared to Windows 8.1. However, after some patches, the performance is almost exactly the same. You gain a few fps by going to 8.1, but not much. Both AMD and Nvidia get a bump from Win 8.1; Nvidia gets a slightly bigger bump, but not by much. So yes, Guru3D is legit and does not show "crippled performance" for Nvidia. Even if it were true, that would be a negative for Nvidia's case, as most people are still on Win7.

[Chart: BF4 performance, Windows 7 vs. Windows 8.1]

http://www.hardocp.com/article/2013...ows_7_vs_81_performance_review/4#.U5vb7fldVPM

I don't think it's just supposition. It's been fairly well documented by several reviewers, including Tech Report, PClab.pl, PCgameshardware.de..

Anyway, it's funny how benchmark results differ so much from review site to review site, with Anandtech getting markedly different results than PCper..

You do realize that your second quote above completely undermines the argument in your first quote, right?

One thing I've noticed is that the more threaded a game engine is, the more likely NVidia is to have the advantage.

No, this has been your pattern for quite a while now: you've already drawn this conclusion in your head and have sought out evidence to support your unsubstantiated notions. BF4 is very highly threaded and shows no significant advantage (even in DX11) for Nvidia. Max Payne 3 is older now, but it's very well threaded (improvements on up to 6 cores) and had no advantage for Nvidia (after a couple of patches/driver updates).

[Chart: Max Payne 3 at 1680x1050]

http://www.tomshardware.com/reviews/geforce-gtx-660-ti-benchmark-review,3279-7.html

Metro Last Light scaled up to 6 threads and guess what, no advantage for Nvidia:

[Chart: Metro: Last Light benchmark]


Anyway, this is my last post in here; this has been rehashed so many times that it's not worth it, and every time we end up in the same place. You can't just pick the data that suits you. In the end, pick the card you like from the company you like at the price you can afford and you'll be fine.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
No, this has been your pattern for quite a while now: you've already drawn this conclusion in your head and have sought out evidence to support your unsubstantiated notions. BF4 is very highly threaded and shows no significant advantage (even in DX11) for Nvidia. Max Payne 3 is older now, but it's very well threaded (improvements on up to 6 cores) and had no advantage for Nvidia (after a couple of patches/driver updates).

Metro Last Light scaled up to 6 threads and guess what, no advantage for Nvidia:

I was going to do a full response, but then I thought the better of it. We could continue arguing and debating about this subject forever. Right now, there's just not enough data to come to any firm conclusions.

We'll just have to wait until the inevitable deluge of next gen games come in, particularly the big open world ones like AC Unity, Batman Arkham Knight, Dragon Age Inquisition, Witcher 3 etcetera....

One thing I have to say, though, is that you're confusing multithreaded games with multithreaded rendering, which is what I've been attributing NVidia's strong performance in Watch Dogs to. The two terms are not synonymous.

Max Payne 3 and Metro Last Light are both multithreaded, but they do not use multithreaded rendering as far as I know. Only a few engines do to my knowledge: the Lore engine used in Civ 5, Frostbite 3, CryEngine 3, the Disrupt engine, Unreal Engine 4, plus some others that will be making their debut this year and next.
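To make the distinction concrete, here's a bare-bones sketch of the two patterns. It's not taken from any of these engines, just an illustration of what multithreaded rendering means at the D3D11 level (deferred contexts and command lists) versus plain multithreaded game code:

Code:
// Sketch only -- illustrating manual threading vs. DX11 multithreaded rendering.
#include <d3d11.h>
#include <thread>
#include <vector>

// "Manual threading": game-side work (AI, physics, culling) runs on worker
// threads, but every draw call is still issued from the one immediate context.
void ManualThreadingFrame(ID3D11DeviceContext* immediate) {
    std::thread ai([] { /* update AI */ });
    std::thread physics([] { /* step physics */ });
    ai.join();
    physics.join();
    // Single-threaded submission: the driver sees all draws from this one thread.
    // immediate->DrawIndexed(...); for every object
}

// "Multithreaded rendering": each worker records its own command list on a
// deferred context, and the immediate context just replays them.
void MultithreadedRenderingFrame(ID3D11Device* device, ID3D11DeviceContext* immediate) {
    const int kWorkers = 4;
    std::vector<ID3D11CommandList*> lists(kWorkers, nullptr);
    std::vector<std::thread> workers;
    for (int i = 0; i < kWorkers; ++i) {
        workers.emplace_back([&, i] {
            ID3D11DeviceContext* deferred = nullptr;
            device->CreateDeferredContext(0, &deferred);
            // deferred->DrawIndexed(...); record this worker's slice of the scene
            deferred->FinishCommandList(FALSE, &lists[i]);
            deferred->Release();
        });
    }
    for (auto& w : workers) w.join();
    for (ID3D11CommandList* cl : lists) {
        immediate->ExecuteCommandList(cl, FALSE);   // replay on the main thread
        cl->Release();
    }
}

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* immediate = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &immediate)))
        return 1;
    ManualThreadingFrame(immediate);
    MultithreadedRenderingFrame(device, immediate);
    immediate->Release();
    device->Release();
}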

Anyway, after thinking about it for a bit, I think you may be right that there are many variables that come into play in determining how a video card performs in a given game; certainly more than just the number of threads an engine supports.

There's something about Watch Dogs that's different. No other game I can think of matches Watch Dogs in terms of how much detail is being rendered at any given moment. Maybe that's it. I still believe that NVidia has lower CPU overhead and superior multithreaded optimizations, but it's nowhere near as simple as I thought..

One other thing: that HardOCP benchmark was done shortly after the game was released. Multiple patches and driver updates since then have certainly changed the performance characteristics of the game, so I don't think it's even valid anymore.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Carfax, all you're doing here is derailing this thread. This is about Watch Dogs, not other games where a certain company does well.

Advice: You can create another thread to prove the several points you apparently have.
Cheers
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Carfax, all you're doing here is derailing this thread. This is about Watch Dogs, not other games where a certain company does well.

Advice: You can create another thread to prove the several points you apparently have.
Cheers

OK, one thing I've considered but not yet discussed in this thread is whether Watch Dogs supports DX11 multithreading. We can't know unless a Ubisoft developer or NVidia themselves tell us, and going by history, they will likely keep that information to themselves until they see fit to release it.

But it would certainly go a long way in explaining NVidia's dominance in Watch Dogs in terms of CPU scalability and utilization, as AMD does not support DX11 multithreading in their drivers.
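As an aside, an application can actually ask the D3D11 runtime whether the installed driver natively supports command lists, so in principle this is checkable per vendor. A minimal sketch of the query (not anything Ubisoft ships, obviously):

Code:
// Sketch: query whether the D3D11 driver natively supports command lists
// (the "DX11 multithreading" driver feature being discussed). If
// DriverCommandLists is FALSE, the runtime emulates command lists in software
// on the immediate context instead.
#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device* device = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    D3D11_FEATURE_DATA_THREADING threading = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING, &threading, sizeof(threading));
    std::printf("driver concurrent creates: %d\n", threading.DriverConcurrentCreates);
    std::printf("driver command lists:      %d\n", threading.DriverCommandLists);
    device->Release();
}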

While not as efficient as Mantle, DX11 multithreading is still superior to manual threading and can do much to mitigate CPU bottlenecks. It wouldn't surprise me if, especially in the twilight years of DX11, NVidia is pushing developers to use DX11 multithreading because it's easier to implement and can result in a large performance gain in CPU-limited circumstances. And these latest AAA titles, if anything, seem to be CPU intensive due to their size and scale.

Developers have been reluctant in the past to use DX11 multithreading because of immature drivers and the fact that AMD does not support it. But the drivers have clearly matured, and AMD is busy pushing Mantle, so why shouldn't NVidia push DX11 multithreading?

I wouldn't be surprised if the Witcher 3, Far Cry 4, AC Unity and Batman Arkham Knight all support DX11 multithreading either..

The performance boost afforded by DX11 multithreading would be an excellent way to foil AMD's momentum with Mantle, by exposing their foolish mistake of not supporting the most important performance feature of DX11.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
OK, one thing I've considered but not yet discussed in this thread is whether Watch Dogs supports DX11 multithreading. We can't know unless a Ubisoft developer or NVidia themselves tell us, and going by history, they will likely keep that information to themselves until they see fit to release it.

But it would certainly go a long way in explaining NVidia's dominance in Watch Dogs in terms of CPU scalability and utilization, as AMD does not support DX11 multithreading in their drivers.

While not as efficient as Mantle, DX11 multithreading is still superior to manual threading and can do much to mitigate CPU bottlenecks. It wouldn't surprise me if, especially in the twilight years of DX11, NVidia is pushing developers to use DX11 multithreading because it's easier to implement and can result in a large performance gain in CPU-limited circumstances. And these latest AAA titles, if anything, seem to be CPU intensive due to their size and scale.

Developers have been reluctant in the past to use DX11 multithreading because of immature drivers and the fact that AMD does not support it. But the drivers have clearly matured, and AMD is busy pushing Mantle, so why shouldn't NVidia push DX11 multithreading?

I wouldn't be surprised if the Witcher 3, Far Cry 4, AC Unity and Batman Arkham Knight all support DX11 multithreading either..

The performance boost afforded by DX11 multithreading would be an excellent way to foil AMD's momentum with Mantle, by exposing their foolish mistake of not supporting the most important performance feature of DX11.

Like you said, DX11 multithreading isn't put forth as being a feature of this game. You're simply throwing it out there like it matters.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Like you said, DX11 multithreading isn't put forth as being a feature of this game. You're simply throwing it out there like it matters.

Ubisoft wouldn't put it out there. Nobody knew that AC III used DX11 multithreading until NVidia made it public in a presentation. In fact, when they were questioned about it, they said:

While we do not wish to expose our threading and jobbing architecture, I can confirm that the approach scales with the amount of available cores.

Source

Now whatever else Watch Dogs is, one thing I have to say about it is that it makes very good use of the CPU; at least on NVidia hardware.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Even the BF3 DICE slides say this.

What does DICE have to do with this? Johan already said that he didn't implement driver command lists in BF3 because the drivers were not available at the time. Quite a few developers wanted to use them but weren't able to because of availability problems.

Now of course it's different..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
AMD: [Chart: gamegpu.ru Watch Dogs CPU scaling, AMD GPU]

NVidia: [Chart: gamegpu.ru Watch Dogs CPU scaling, NVidia GPU]

As in the PCgameshardware.de review, there is a pronounced difference between AMD and NVidia when it comes to CPU utilization in the game, especially with hyperthreading.

HT makes a significant difference in performance on the NVidia setup, while on the AMD setup the difference is marginal..
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
HT makes a significant difference in performance on the NVidia setup, while on the AMD setup the difference is marginal..

Easy...
AMD GPU bound: adding more CPU power doesn't give much.
NV CPU bound: adding more CPU power increases FPS.
Quite the opposite of what you suggest, isn't it?
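Toy numbers, made up, just to show the usual frame time ~ max(CPU time, GPU time) picture: a faster CPU only buys you FPS when the CPU side is the longer one.

Code:
// Toy bottleneck model: frame time ~= max(CPU ms, GPU ms). All numbers made up.
#include <algorithm>
#include <cstdio>

double Fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);
}

int main() {
    // GPU bound: GPU needs 20 ms/frame, so cutting CPU time changes nothing.
    std::printf("GPU bound, CPU 12 ms: %.1f fps\n", Fps(12.0, 20.0));  // 50.0
    std::printf("GPU bound, CPU  8 ms: %.1f fps\n", Fps(8.0, 20.0));   // 50.0

    // CPU bound: GPU needs 10 ms/frame, so cutting CPU time raises FPS.
    std::printf("CPU bound, CPU 16 ms: %.1f fps\n", Fps(16.0, 10.0));  // 62.5
    std::printf("CPU bound, CPU 11 ms: %.1f fps\n", Fps(11.0, 10.0));  // 90.9
}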

I wonder why would 295x2 be GPU bound at 50 fps... :sneaky:

Also, that looks much the same as other TWIMTBP titles... Does Borderlands use command lists?
Where there's smoke, there's fire.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Easy...
AMD GPU bound: adding more CPU power doesn't give much.
NV CPU bound: adding more CPU power increases FPS.
Quite the opposite of what you suggest, isn't it?

I wonder why would 295x2 be GPU bound at 50 fps... :sneaky:

Also, that looks much the same as other TWIMTBP titles... Does Borderlands use command lists?
Where there's smoke, there's fire.

It can't be because it's TWIMTBP that it runs like crap on AMD hardware. It's because AMD drivers suck. And any problems it has on nVidia hardware, stutters etc., are because the game sucks. /sarc