Interesting Watch Dogs Benchmarks - GTX 770 4GB Sli vs. R290 4GB XF

Page 5 - AnandTech Forums

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Easy...
AMD GPU bound - adding more CPU power doesn't give much.
NV CPU bound - adding more CPU power increases FPS.
Quite the opposite of what you suggest, isn't it?

How can a R9 295x2 be GPU bound at 1080p? o_O

I wonder why would 295x2 be GPU bound at 50 fps... :sneaky:
Three possibilities:

1) Lack of driver optimization

2) Absence of DX11 multithreading-enabled drivers, assuming the game even uses that.

3) Poor/non-existent Crossfire profile

Could be a combination of all three as well..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It can't be because it's TWIMTBP and that's why it runs like crap on AMD hardware. It's because AMD drivers suck. And any problems it has on nVidia hardware, stutters etc., are because the game sucks. /sarc

What does TWIMTBP have to do with it?

The only advantage that TWIMTBP gives is closer collaboration between NVidia and the game developer when it comes to implementing special features, for PC gamers in general and NVidia users in particular.

From a performance optimization standpoint, TWIMTBP doesn't help NVidia or hinder AMD.

Metro Last Light was TWIMTBP, yet it runs slightly faster on AMD than on NVidia. Similarly, Far Cry 3 was a Gaming Evolved title, yet it runs faster on NVidia hardware than on AMD.

I suppose that Watch Dogs being a TWIMTBP game gave NVidia more access to, and time with, the game code so that they could optimize their drivers for launch day.

But that's about it. PC games don't use architecture-specific optimizations. Watch Dogs is optimized for DX11 hardware, and it's up to the IHVs to optimize their drivers for the game, as it's the drivers that interface between the hardware and the API.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
What does TWIMTBP have to do with it?

The only advantage that TWIMTBP gives is closer collaboration between NVidia and the game developer when it comes to implementing special features, for PC gamers in general and NVidia users in particular.

From a performance optimization standpoint, TWIMTBP doesn't help NVidia or hinder AMD.

Metro Last Light was TWIMTBP, yet it runs slightly faster on AMD than on NVidia. Similarly, Far Cry 3 was a Gaming Evolved title, yet it runs faster on NVidia hardware than on AMD.

I suppose that Watch Dogs being a TWIMTBP game gave NVidia more access to, and time with, the game code so that they could optimize their drivers for launch day.

But that's about it. PC games don't use architecture-specific optimizations. Watch Dogs is optimized for DX11 hardware, and it's up to the IHVs to optimize their drivers for the game, as it's the drivers that interface between the hardware and the API.

So you are saying there are no specific nVidia optimizations in the game? lol

What other games do is irrelevant and will only take the discussion off topic. Suffice it to say that there's a lot more to TWIMTBP vs. Gaming Evolved than your two sentences describe, and here is the wrong place for it.

The thought, though, that aside from early access to drivers (which is actually a pretty big advantage early in a game's life) there are no other optimizations that specifically benefit nVidia really is funny. Maybe you've heard of "GameWorks"?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Maybe you've heard of "GameWorks"?

The only GameWorks-related feature used by both IHVs in the game is HBAO+, and that runs equally well on NVidia and AMD. The other is TXAA, but that is architecture-specific.

Anyway, I was speaking about raw performance optimizations. You guys are making it seem as though Ubisoft is intentionally optimizing for NVidia hardware more than AMD, when it doesn't work that way.

PC games do not use architecture-specific performance optimizations (although I suppose Mantle does, but that's N/A in this scenario), as far as I'm aware...
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
If AMD isn't going to optimize DX11 for this game, then the least they could do is have Mantle enabled by default. I almost expected the game to crash when a crane toppled over (Levolution); the framerate dropped into the 20s D:

No issues with Mantle though, runs just like BF4.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Do you really think the DX optimizations are the same for AMD when there's no Mantle pathway?

nVidia optimized their D3D driver in general. If AMD doesn't do this, then it will hurt them in other D3D games, too.
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It's certainly not true that AMD doesn't optimise their D3D path for games; I bet there is a tonne of optimisation in their code. I actually think what we are seeing is the difference in architectures playing out. AMD's cards seem to be more dependent on software tweaking than Nvidia's; we see it time and time again when a game comes out. Initially they perform poorly, or certainly less well than Nvidia's cards, and then a few weeks after release a beta driver comes out and the performance is better than the Nvidia card's. It's been happening this way ever since the 7970s came out nearly two and a half years ago.

If you think about it, for the amount of raw hardware on offer we would expect AMD's cards to perform better; the raw figures suggest that they offer more hardware. It's just really common, however, to find them performing worse than Nvidia's cards at release. It doesn't stay that way: after a few weeks or months it changes and AMD's cards start performing better, but quite often on release there is this period of poor performance. I have always felt that AMD's hardware underperforms compared to its raw stats, and has done for many years. The 7970 should absolutely kick the 680's butt around the corner and sideways, but it doesn't. The 290X has a lot more raw horsepower than the Nvidia cards, but that 780 Ti seems to hang in there despite its on-paper deficit. It's one of those observations that seems to get people here angry, but it does seem to be the reality that, for a variety of reasons, Nvidia's cards have better performance on release day than AMD's, and that they manage to compete with AMD cards that on paper have more performance.

I believe it's this which makes Nvidia's cards more valuable in the marketplace. People call it driver problems with AMD's cards, but this pattern of initial underperformance then a fix has been going on a long, long time. Back with my 4870X2 I was heavily dependent on a driver release before games would run well. I didn't see any change with my 5970 or the 7970s, and it sounds like people are still seeing the same thing with 290s. This, I suspect, is the less obvious reason people say AMD's driver software is worse than Nvidia's. Ignoring the major scandal this generation around frame pacing and stuttering, we haven't seen too many serious problems (mostly power-saving issues, really). But this little niggle of poor performance, then beta driver, then fixed, and lots and lots of beta drivers from AMD, continues. I just think their hardware is heavily dependent on tweaked parameters to perform well in a game, and that they need to set those or it just doesn't perform. I know in the past I have taken profile details from people and applied them myself to get dramatic increases in performance, then found the next driver performs identically and comes with a very similar profile to the one I used myself.

So I think we'll see Hardline and Watch Dogs performance improve; in fact, I guarantee it. Because I have seen this before, and we are one beta driver away from AMD being awesome again. And frankly it ought to be winning; it's got the better hardware.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
nVidia optimized their D3D driver in general. If AMD doesn't do this, then it will hurt them in other D3D games, too.

Sorry, but this just seems like a rationalization. It makes sense to you, but without any research into what you are claiming. You are saying nVidia does "general" DX optimizations, then stating that if AMD doesn't, it will hurt them. Then you are using this big IF to substantiate your position.

Where are you getting that nVidia does more than AMD outside of specific game optimizations (I assume that's what you are referring to as "general")? It seems like an ambiguous assumption on your part and nothing else. Or do you have some source that AMD doesn't maintain their DX driver as well as nVidia does?
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
HT makes a significant difference in performance on the NVidia setup, while on the AMD setup the difference is marginal..
Or you can say the CPU overhead of NV's drivers is comparatively larger than AMD's, thus requiring more CPU cycles and faster CPUs. It's a different side of the same coin.

If there were a toggle in NV's drivers where this optimization you are talking about could be turned off, and the result mimicked AMD's along with similar CPU usage, then maybe the theory on your side would be more persuasive.

I am thinking rather of proprietary "optimization," knowing NV's track record. Hyperthreading doesn't really help games. (It usually hurts.)
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
...Hyperthreading doesn't really help games. (it usually hurts)

Not true. The data from gamegpu.ru shows on average a 5% improvement in performance, and just 2 of the 55 games I have captured data for show a reduction in performance (up to 5%). The biggest gain was actually 22%. HT does indeed help games, just normally not by very much at all (in 3/4 of games the gain is less than 7% on a high-end card, either a 690 or a 780 Ti).
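As a rough illustration of how a summary like that is computed (the FPS pairs below are invented for the example, not gamegpu.ru's actual numbers), the per-game gain and regression count fall out of a few lines:

```python
# Hypothetical (ht_off, ht_on) average-FPS pairs standing in for a
# capture set like the 55-game gamegpu.ru data discussed above.
samples = {
    "Game A": (60, 63),    # modest gain
    "Game B": (45, 55),    # big gain
    "Game C": (80, 80),    # no change
    "Game D": (100, 97),   # small regression
    "Game E": (70, 72),    # modest gain
}

# Percentage change from enabling HT, per game.
gains = {g: (on - off) / off * 100 for g, (off, on) in samples.items()}

# Games where HT reduced performance, and the average gain overall.
regressions = [g for g, pct in gains.items() if pct < 0]
avg = sum(gains.values()) / len(gains)

print(f"average gain: {avg:.1f}%, regressions: {regressions}")
```

With real per-game data in `samples`, the same loop reproduces the "average gain, N regressions out of M games" style of summary quoted above.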
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Initially they perform poorly, or certainly less well than Nvidia's cards, and then a few weeks after release a beta driver comes out and the performance is better than the Nvidia card's. It's been happening this way ever since the 7970s came out nearly two and a half years ago.

I think you're simplifying things a bit too much. There are also plenty of examples where a game ran faster on AMD at launch, and then later on down the road ran faster on NVidia..

The 7970 should absolutely kick the 680's butt around the corner and sideways, but it doesn't. The 290X has a lot more raw horsepower than the Nvidia cards, but that 780 Ti seems to hang in there despite its on-paper deficit. It's one of those observations that seems to get people here angry, but it does seem to be the reality that, for a variety of reasons, Nvidia's cards have better performance on release day than AMD's, and that they manage to compete with AMD cards that on paper have more performance.
So either NVidia's cards are more efficient, or AMD is exaggerating the raw performance of their GPUs. I remember reading somewhere that AMD calculates their TFLOPS differently than NVidia does..
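For what it's worth, the usual back-of-the-envelope figure both vendors quote is shader count × 2 ops per clock (an FMA counts as two) × clock speed. A minimal sketch using the commonly cited specs for the two flagships under discussion (the clocks, especially NV's boost behaviour, are exactly where the numbers get slippery):

```python
def tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical single-precision throughput: shaders * 2 (FMA) * clock."""
    return shaders * 2 * clock_ghz / 1000.0

# Commonly cited specs, assumed here: R9 290X with 2816 stream processors
# at up to 1.0 GHz; GTX 780 Ti with 2880 CUDA cores at its 875 MHz base
# clock (boost pushes it higher, which is part of the ambiguity).
r9_290x = tflops(2816, 1.0)
gtx_780ti = tflops(2880, 0.875)

print(f"R9 290X:    {r9_290x:.2f} TFLOPS")
print(f"GTX 780 Ti: {gtx_780ti:.2f} TFLOPS")
```

On paper the 290X comes out ahead, which is the "more raw horsepower" claim; whether a vendor quotes base, boost, or "up to" clocks changes the headline TFLOPS number without changing the silicon.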

So I think we'll see Hardline and Watch Dogs performance improve; in fact, I guarantee it. Because I have seen this before, and we are one beta driver away from AMD being awesome again. And frankly it ought to be winning; it's got the better hardware.
Oh I'm certain it will improve as well.. The question is how much..

The difference is huge though, and the fact that the CPU scaling on NVidia is so much better than on AMD leads me to believe that Watch Dogs may indeed support DX11 multithreading. If that's true, then AMD is SOL, because they still haven't implemented that feature after years of having the chance to do so..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Or you can say the CPU overhead of NV's drivers is comparatively larger than AMD's, thus requiring more CPU cycles and faster CPUs. It's a different side of the same coin.

If that were true, you would expect the game to run slower on NVidia than on AMD given the same CPU power....but you see the opposite.

I am thinking rather of proprietary "optimization," knowing NV's track record. Hyperthreading doesn't really help games. (It usually hurts.)
I think people are putting way too much stock into this "optimization" thing. Ubisoft owns the game code, not NVidia. And the game uses D3D11, so they can't implement architecture-specific code in the game anyway. If the game used a lower-level API like Mantle or Glide, that would be different, but it doesn't. The only performance advantage NVidia has gotten from this Ubisoft alliance is that they had access to the code much sooner and thus were able to start optimizing their drivers earlier than the competition.

Regarding hyperthreading, a game doesn't have to be specifically optimized to use it. The OS sees HT enabled CPUs as having extra logical processors.

So if a game can scale to use those extra threads, then it will do so and should receive a performance benefit though not as much as if they were "real" cores.

DX11 multithreading allows developers to scale to as many threads as they want to, and with greater ease and performance gain than with manual threading. It's really a shame that the feature was underused throughout the years because of immature drivers and AMD not bothering with it..

I hope this last round of advanced DX11 games will make use of it, before DX12 takes over.
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
If that were true, you would expect the game to run slower on NVidia than on AMD given the same CPU power....but you see the opposite.

I think people are putting way too much stock into this "optimization" thing. Ubisoft owns the game code, not NVidia. And the game uses D3D11, so they can't implement architecture-specific code in the game anyway. If the game used a lower-level API like Mantle or Glide, that would be different, but it doesn't. The only performance advantage NVidia has gotten from this Ubisoft alliance is that they had access to the code much sooner and thus were able to start optimizing their drivers earlier than the competition.
Doesn't WD use nv game libraries, which is quite the opposite of what you claim as facts?
I wonder how many layers upon layers of tessellated ocean are below the surface of LA.
Should AMD owners thank nv for those ;) optimizations ;) as they change the performance of their machines rather than nv owners'?

DX11 multithreading allows developers to scale to as many threads as they want to, and with greater ease and performance gain than with manual threading.
Funny, it's quite the opposite of what game devs say... but have it your way. Let's hang the universe on the thin string of your beliefs.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
And here are numbers from Battlefield: Hardline:
http://www.pcgameshardware.de/Battl...Battlefield-Hardline-Beta-Benchmarks-1125079/

A GTX 770 is nearly as fast as a 290X with DX11.1 at 1080p with 4xMSAA.

Good luck with your AMD card when games don't support Mantle. :\

Way to draw conclusions from beta testing that isn't the final game, with no final driver optimizations from AMD or NV. Also, way to draw drastic conclusions suggesting the GTX 770 is nearly as fast as a 290X based on one preview, without having consulted 3-5 previews.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_Hardline_Beta/test/bfh_1920.jpg


At 200% draw distance, the 770 is only as fast as a 925MHz 7970, but you don't see people parading that as the final performance before the game is released :sneaky:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_Hardline_Beta/test/bfh_1920_200.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_Hardline_Beta/test/bfh_2560.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_Hardline_Beta/test/bfh_2560_200.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_Hardline_Beta/test/bfh_3840_150.jpg
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The difference is huge though, and the fact that the CPU scaling on NVidia is so much better than on AMD leads me to believe that Watch Dogs may indeed support DX11 multithreading. If that's true, then AMD is SOL, because they still haven't implemented that feature after years of having the chance to do so..

Whether your hypothesis is correct or not makes no difference, since a $580 GTX 780 OC 6GB can't beat a stock R9 290X in Watch Dogs. You continue to claim that NV somehow offers a huge advantage in WD, but it's simply not true:

Review as of June 11, 2014

"The AMD R9 290X based video card was also able to run at "Ultra" settings with "Ultra" textures, but we did have to use SMAA, not Temporal SMAA, and MHBAO. Basically, it matches the gameplay experience and performance eye-to-eye with the ASUS STRIX GTX 780 OC 6GB in this game."
http://www.hardocp.com/image.html?image=MTQwMjQzNjI1NGowQ25oQWIyWjVfM18zX2wuZ2lm

However, an after-market R9 290X can be purchased for $470, and a barely slower (5-6%) R9 290 for just $380. If NV sold the GTX 780 for $380-400 you might have a point, but those prices are wishful thinking at the moment. Instead NV charges $650-700 for the GTX 780 Ti, while 780 3GB cards go for $460+ and the 6GB version for $550+. Any small performance advantage NV may have counts for nothing when one can purchase almost two 290s for the price of a single after-market 780 Ti. :cool:

Also, you tend to ignore the overall performance of GPUs and cherry-pick the games where NV performs best, ignoring all the games where AMD has an advantage. Taking many games into account, a reference 780 Ti at max OC is barely faster than an after-market R9 290.

http://www.computerbase.de/2014-04/sapphire-radeon-vapor-x-r9-290-luefter-aus-lautstaerke-test/3/

You might squeeze 15-20% over a max-overclocked 290 out of the best 780 Ti, but those cards easily cost $700+ vs. $380 for an XFX R9 290, which also happens to have a lifetime warranty!

If the 880 debuts at lower pricing than the 780 Ti and beats it, the resale value of the 780 Ti will drop $200-300 very quickly. If a gamer has two of those cards, it's going to be a "party". Going with AMD for the last 2-3 generations and losing the small 10-15% at the flagship single-GPU level meant saving hundreds of dollars that then carried over to the next generation. Ignoring bitcoin mining: getting an HD 4890 vs. a 285 would have saved a gamer $160; getting a $230 6950 unlocked vs. a $450 580 1.5GB would have saved a whopping $220; getting $280 7950s overclocked vs. $450 680 2GBs OC would have saved $340; and someone who purchased R9 290s vs. 780 Ti SLI saved $540. Over these four generations, that's a savings of $1,260. However, taking bitcoin mining into the equation, and assuming the gamer upgraded each generation (not counting resale value, since those who mined also sold their cards), going with an NV single GPU would have meant:

580 = $500 vs. $0 for any 6900
680 = $500 vs. $0 for any 7900
780Ti = $700 vs. $0 for any R9 290

Ouch.
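The per-generation arithmetic above sums as claimed; a trivial check using only the dollar amounts quoted in this post:

```python
# Claimed per-generation savings from picking the AMD option
# (figures taken from the post above, mining ignored).
savings = {
    "HD 4890 vs. GTX 285": 160,
    "HD 6950 unlocked vs. GTX 580 1.5GB": 220,
    "HD 7950s OC vs. GTX 680 2GBs OC": 340,
    "R9 290s vs. 780 Ti SLI": 540,
}

total = sum(savings.values())
print(f"total savings over four generations: ${total:,}")
```

The total matches the $1,260 figure in the post; the argument then rides on whether the 10-15% flagship deficit is worth that much to a given buyer.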

In the face of real-world facts, NV's 10, 15 or even 20% advantages in certain games will never make up for the fact that all of AMD's cards since the HD 4870 were effectively free due to bitcoin and scrypt mining. Even ignoring that advantage, NV's route still failed miserably once you start adding up the savings from one generation to the next. Of course the more savvy gamers stepped down to the 570, 670 and 780 and overclocked those. However, as can be seen time and time again, buying NV's flagship GPU is one of the worst purchases: if one doesn't count NV's flagship card, with its 10-15% advantage over AMD's best, then NV's second-best card has almost always failed to beat AMD's best anyway. All of your arguments ignore that video cards aren't free. For example, your 770 4GBs cost you $900 US, but R9 280Xs in CF today cost just $560. And obviously R9 290s in CF beat your cards in 90% of games for the same price as today's 770 4GBs. You never think of that... as if money doesn't matter. Gamers who are truly price-inelastic purchase 780 Ti SLI, not 770 4GB SLI.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Doesn't WD use nv game libraries, which is quite the opposite of what you claim as facts?

The only NVidia game features that Watch Dogs uses are TXAA and HBAO+, the latter of which runs very fast on AMD hardware.

Everything else was done in house by Ubisoft.

I wonder how many layers upon layers of tessellated ocean are below the surface of LA.

It's Chicago :rolleyes:
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Whether your hypothesis is correct or not makes no difference, since a $580 GTX 780 OC 6GB can't beat a stock R9 290X in Watch Dogs. You continue to claim that NV somehow offers a huge advantage in WD, but it's simply not true:

In CPU-limited situations it does, according to two reviews. In GPU-limited situations, which is how HardOCP conducted their review, it's likely more even. And of course I shouldn't have to say that DX11 multithreading only helps in CPU-bound situations.

However, an after-market R9 290X can be purchased for $470, and a barely slower (5-6%) R9 290 for just $380. If NV sold the GTX 780 for $380-400 you might have a point, but those prices are wishful thinking at the moment. Instead NV charges $650-700 for the GTX 780 Ti, while 780 3GB cards go for $460+ and the 6GB version for $550+. Any small performance advantage NV may have counts for nothing when one can purchase almost two 290s for the price of a single after-market 780 Ti. :cool:
Yawns. And yet NVidia has no problem selling their, according to you, vastly overpriced GPUs. So either hardware enthusiasts are suckers, or maybe there's more to price and perceived value than just performance.

AMD is just now starting to understand that extra perks and features are just as important as performance when it comes to perceived value.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
AMD is just now starting to understand that extra perks and features are just as important as performance when it comes to perceived value.

This was certainly true on the mobile front in the 7970M vs. 680M days, when the recommendation was to buy the 680M even though it offered the same performance for $300 more, because of 7970M Enduro issues.

For mobile, nvidia without a doubt (I still can't play on the 7730M in my other laptop because graphics switching is broken and there is no way to deactivate it). Mobile drivers are often OEM-locked and you have no way around that. On desktop, AMD is significantly better.

That said, AMD is offering outstanding value currently.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
In CPU-limited situations it does, according to two reviews. In GPU-limited situations, which is how HardOCP conducted their review, it's likely more even. And of course I shouldn't have to say that DX11 multithreading only helps in CPU-bound situations.

And when do these practical situations occur? You keep making statements that because of XYZ NV's cards perform much faster and AMD is SOL, but real-world gaming shows no apparent advantage - just NV's high prices for similar performance, and a ludicrously overpriced 780 Ti with barely more performance but only 3GB of VRAM.

Yawns. And yet NVidia has no problem selling their, according to you, vastly overpriced GPUs. So either hardware enthusiasts are suckers, or maybe there's more to price and perceived value than just performance.

AMD is just now starting to understand that extra perks and features are just as important as performance when it comes to perceived value.

Right, so when you have nothing to disprove my point, you turn to things like sales and market share. In Argentina, Apple sells the iPad for 2x the price, and the iPhone 5S came out at $3,500 US there:
http://qz.com/127763/what-happens-when-a-country-doesnt-allow-apple-to-import-new-iphones/

Just because something sells, and even sells well, doesn't make it a good purchase. Again, on our forum I've seen people say they'd rather pay $900 for a 780 Ti than get an after-market 290X for free. If it makes sense to you to spend hundreds of dollars more on NV, which quickly adds up to thousands over 4 generations, that's your money. However, you tend to ignore the money spent as if it doesn't matter. I've been working in East Asia for about a year, and people here buy NV 90-95% of the time. If you sold them a 290X for the price of a 760, they would still pick the 760. Same with Russia, where people are brand-brainwashed.

In my travels I could name a lot of places where products are overpriced yet sell well. In Kyrgyzstan, Messi's Adidas shoes cost $300 while the average salary in the capital is around $500. In Brazil, Nike running shoes are $400-500 and a PS4 is $1,750. In Australia, a brand-new 911 GT3 is $340,000 Australian, or 2.5x what it costs in Canada.

Of course, at times NV has great options, like the $249 760 or the $499 680, but within a matter of months AMD lowers prices and NV ends up horribly overpriced. It happens consistently almost every generation.

If you travelled a lot, you'd know that in some places Coke dominates 90% of the market. That doesn't suddenly mean Pepsi is crap, but in a lot of 3rd-world countries all they care about is the brand. You say NV's drivers are far superior, but 290X CF beats the 780 Ti in 4K and multi-monitor gaming. Perception doesn't match reality. But if the average consumer believes the perception, it is AMD's loss. It still doesn't change the reality of performance, however. Some posters on our forum have even found CF smoother than SLI.

You keep saying "perceived value", and that's the point. I don't see anything NV offers that makes me want to spend $650 on a 780 Ti over a $380 R9 290. I'd rather buy a slightly slower card and spend the rest on games, beer, food, paragliding, and downhill skiing. If gaming is someone's primary, and maybe only, outlet, then sure, maybe for that person the thousands of dollars extra since the 4870 generation made sense.
 
Last edited: