[Forbes] Why 'Watch Dogs' Is Bad News For AMD Users


f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
According to Oxide, the cost of supporting Mantle is relatively low. It's less buggy than DirectX, it performs better with less optimization, and it makes up a relatively small portion of the overall development cost. Once it's incorporated into an engine, you can pretty much just forget about it.

Back in the day, some games supported DirectX, OpenGL, Glide, and occasionally even a few more vendor-specific APIs. Even today, most games support at least two APIs: OpenGL to run on phones and consoles, and DirectX for Microsoft platforms. Once you've abstracted your code to translate between two APIs, additional ones aren't a lot of work.
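
To picture the kind of abstraction being described there: it usually boils down to a thin rendering-backend interface that the engine talks to, with one implementation per API. A rough, hypothetical C++ sketch (all names made up for illustration):

Code:
#include <memory>

// Hypothetical engine-side abstraction: game code only ever talks to
// RenderBackend; each graphics API gets its own implementation behind it.
struct Mesh;  // placeholder for an engine mesh handle

class RenderBackend {
public:
    virtual ~RenderBackend() = default;
    virtual void BeginFrame() = 0;
    virtual void DrawMesh(const Mesh& mesh) = 0;
    virtual void EndFrame() = 0;
};

class D3D11Backend : public RenderBackend {
public:
    void BeginFrame() override { /* clear render targets via D3D11 */ }
    void DrawMesh(const Mesh&) override { /* translate into D3D11 draw calls */ }
    void EndFrame() override { /* Present() the swap chain */ }
};

class MantleBackend : public RenderBackend {
public:
    void BeginFrame() override { /* begin recording a command buffer */ }
    void DrawMesh(const Mesh&) override { /* record the draw into the buffer */ }
    void EndFrame() override { /* submit the command buffer to the queue */ }
};

// Adding another API is then mostly a matter of writing one more subclass.
std::unique_ptr<RenderBackend> CreateBackend(bool useMantle) {
    return useMantle ? std::unique_ptr<RenderBackend>(new MantleBackend())
                     : std::unique_ptr<RenderBackend>(new D3D11Backend());
}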

eh...
according to Oxide

Let's be honest here. These guys were handpicked by AMD to promote Mantle, and they haven't been doing anything lately other than fixing their BENCHMARK and giving interviews about how wonderful Mantle is.

WHICH IS NOT TO SAY I WOULDN'T WANT TO SEE MANTLE IN AS MANY GAMES AS POSSIBLE

Because Mantle can and does achieve lower CPU/driver overhead, better multithreading, a more responsive GPU, etc.

But it seems to me they are too (emotionally and otherwise) invested in Mantle to be taken at face value when it comes to Mantle's prospects and (developing) issues.
This is basically the same as asking AMD to comment on their newborn baby.
 
Last edited:

The Alias

Senior member
Aug 22, 2012
646
58
91
eh...
according to Oxide

Let's be honest here. These guys were handpicked by AMD to promote Mantle, and they haven't been doing anything lately other than fixing their BENCHMARK and giving interviews about how wonderful Mantle is.

Wait, how do you know what an entire game studio has been doing over the course of 3+ months? Last time I checked, giving a few opinions on developing with a new API didn't make everything you say questionable.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
If the article is right that nVidia requires devs to sign agreements saying they cannot accept suggestions or code from AMD, this is horrible for all gamers, period. nVidia obfuscating the code to prevent AMD from being able to make driver improvements is a terrible thing for gamers as well. Why anybody would consider this acceptable is beyond me. It really is almost borderline illegal. Intel got in trouble for doing something similar many years ago and the FTC nailed them for it.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
GW can do whatever it wants before sending stuff through the driver.

Code:
// Hypothetical illustration: before sending data to the GPU
if (isAmdCard)
{
    Thread.Sleep(2); // artificially stall only on AMD hardware
}
Optimize that.

This article still seems like a load of FUD though.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
If the article is right that nVidia requires devs to sign agreements saying they cannot accept suggestions or code from AMD, this is horrible for all gamers, period. nVidia obfuscating the code to prevent AMD from being able to make driver improvements is a terrible thing for gamers as well. Why anybody would consider this acceptable is beyond me. It really is almost borderline illegal. Intel got in trouble for doing something similar many years ago and the FTC nailed them for it.

And how is AMD any different?
Here's something about TressFX:

Andrew Lauritzen
AMD isn't quite the white knight you make them out to be here. For instance while they seem to unofficially "say" that game devs are allowed to modify the TressFX code (although they have no license to say as much), they will not allow other IHVs to post optimized versions. Mantle is similar... to the press they say that they want it to be portable and would discuss standardizing it but have so far refused to even share specs with other IHVs let alone have a discussion. Really not that different from GW in practice, they've just managed to avoid getting press attention about it so far.
http://beyond3d.com/showpost.php?p=1850372&postcount=190

Nobody outside of AMD is allowed to share their "TressFX" version with other developers. Every game with TressFX uses the AMD version, which will always run better on AMD hardware, and other companies need to optimize it after release.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
And how is AMD any different?
Here's something about TressFX:

Andrew Lauritzen

http://beyond3d.com/showpost.php?p=1850372&postcount=190

Nobody outside of AMD is allowed to share their "TressFX" version with other developers. Every game with TressFX uses the AMD version, which will always run better on AMD hardware, and other companies need to optimize it after release.

That's not true. It's been acknowledged that a dev messed up Tomb Raider and it needed a patch. Nvidia did have game code prior to release. As for Mantle, it's been in alpha for months; I don't think that's a very good stage to judge it by.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Before even using the 14.6 Cats, the 7970 GHz/280X has no issues competing with a 770. The R9 290X and 780 Ti are also very close, except of course for the massive price disparity between them. The 7950 Boost is right on the heels of the 680, which is pretty amazing considering the 7950 V2 cost $280 when the 680 was $450!

http://www.techspot.com/review/827-watch-dogs-benchmarks/page3.html

You're quoting a review that used pre-release drivers from both camps, which performed worse than the release-day drivers :rolleyes:

The only reason AMD is competitive in this game is that their GPUs come with more memory as standard. When the VRAM is equivalent or greater, though, NVidia pulls ahead significantly.

The PCgameshardware.de review shows that perfectly. The GTX 770 4GB is faster than the R9 290, and just a bit slower than the R9 290X, a card in a higher price bracket.

Also, the review shows NVidia getting superior multithreaded performance compared to AMD. It's possible that the Watch Dogs engine uses DX11 multithreading, because the difference in performance between AMD and NVidia in this area is substantial.

PCgameshardware.de review
 

EJSLP

Member
Feb 3, 2014
77
0
0
Also, let's not forget these exclusive Nvidia features don't come free either. When you turn on these Nvidia features, they eat a huge chunk out of your performance. I remember Arkham City and Arkham Origins losing 10-20 FPS when enabling PhysX.

Is this true? How do you disable it?
And what about Nvidia's TXAA, should I use that or is it a major FPS killer?
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
If the article is right that nVidia requires devs to sign agreements saying they cannot accept suggestions or code from AMD, this is horrible for all gamers, period. nVidia obfuscating the code to prevent AMD from being able to make driver improvements is a terrible thing for gamers as well. Why anybody would consider this acceptable is beyond me. It really is almost borderline illegal. Intel got in trouble for doing something similar many years ago and the FTC nailed them for it.

This.

I had previously been a huge fan of nVidia (10 years ago), but after countless episodes of seeing nVidia pull these stunts and sleights of hand to hurt the competition rather than stand on their own, I've decided to stop supporting nVidia by no longer buying their products.

Shame really. There are some cards I really like: the 750 Ti for X11 mining, the 780 Ti for gaming, and I loved my GTX 460 SLI setup (my last nVidia purchase). But the crap nVidia pulls here and elsewhere is a large deterrent to ever again giving nVidia my money. It doesn't keep me from recommending nVidia where appropriate, but I personally won't buy nVidia.

The simple answer for nVidia to win me back is to improve the games, period; don't improve the games while using tactics to harm the competition and putting in artificial limitations like not letting a dev take suggestions from AMD. There is no doubt in my mind that games would be better without GameWorks. GameWorks is a half-assed way to improve games, and its main goal, IMO, is to undermine competition by locking the competition out from having a look or improving things where possible for a given game. I'd rather nVidia not be involved at all in game development if their involvement is contingent on silly lockouts for its competition.

The same goes for AMD as they work with devs to improve games, though from what I've seen, AMD's approach to helping game development aligns with what I want and expect from the two GPU manufacturers.

This kind of differentiation between the two GPU manufacturers matters a lot to some people and not much at all to others, but it does affect our games regardless, IMO. Fortunately, apart from the manufacturers' differing approaches to involvement in game development, both sides make great and exciting hardware.
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
That's not true. It's been acknowledged that a dev messed up Tomb Raider and it needed a patch. Nvidia did have game code prior to release. As for Mantle, it's been in alpha for months; I don't think that's a very good stage to judge it by.

Convenient for AMD: http://community.amd.com/community/...ider-launches-with-revolutionary-tressfx-hair

Oh, look who wrote the blog post - Robert Hallock. I guess it's okay to work with game developers as long as it's AMD.
This technology is the result of months of close collaboration between software engineers at AMD and Tomb Raider's developer, Crystal Dynamics.
^_^

BTW: AMD got a copy of the game prior to the launch, too. Unlike Tomb Raider, it was not an outdated version.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You're quoting a review that used pre-release drivers from both camps, which performed worse than the release-day drivers :rolleyes:

The only reason AMD is competitive in this game is that their GPUs come with more memory as standard. When the VRAM is equivalent or greater, though, NVidia pulls ahead significantly.

OK, and look at other reviews; what conclusion is there? You say that AMD only pulls ahead when VRAM issues come up, and guess what, they do. At 1080p, 680/770 2GB < 7970 GE/280X; at 1440p, 780/780 Ti < R9 290/X, because they get stuttering. Same thing for SLI vs. CF. Users with those cards can enable Ultra textures due to the extra VRAM. So while NV cards may run slightly faster at each price bracket, their actual IQ may be lower.

The PCgameshardware.de review shows that perfectly. The GTX 770 4GB is faster than the R9 290, and just a bit slower than the R9 290X, a card in a higher price bracket.

In what market are the R9 290 and the 770 4GB priced in different brackets?

R9 290 Asus DCUII = $400
The cheapest 770 4GB on Newegg is from PNY for $380.

That's the same price bracket. The problem is the 770 4GB gets beaten by the R9 290 4GB in 90% of all other games. So while the 770 4GB is competitive with the R9 290 in WD, in the majority of games it's far slower.

An after-market R9 290 is just 6-10% slower than a max-power-limit 780 Ti and a solid 18-23% faster than a stock 780. It would utterly destroy a 770 once overclocked.

That's the thing, even with GW, AMD's cards are still very close and still offer superior performance/$.

Also, the review shows NVidia getting superior multithreaded performance compared to AMD. It's possible that the Watch Dogs engine uses DX11 multithreading, because the difference in performance between AMD and NVidia in this area is substantial.

That's nice, but an R9 290 in CF for a bit more money destroys a 780 Ti in most games, and R9 290 CF mops the floor with 770 4GB SLI for nearly the same price! The R9 290X costs about as much as a 780 does and beats it. And in Watch Dogs, NV's 2-3GB cards are all crippled when it comes to their intended performance levels with Ultra textures (2GB 680/770 at 1080p and 3GB 780/780 Ti at 1440/1600p).

NV needs to lower prices on all of its 770/780/780 Ti cards. With the bitcoin/scrypt mining craze out of the way, all of these cards have become overpriced.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
In what market are the R9 290 and the 770 4GB priced in different brackets?

R9 290 Asus DCUII = $400
The cheapest 770 4GB on Newegg is from PNY for $380.

The lowest price for the GTX 770 4GB is $369. But I'm very surprised that the GTX 770 4GB is commanding such a high price after all this time. I guess it's a very popular card, despite you thinking it's overpriced. It will make selling them much easier when it comes time for me to upgrade, though.

That's the same price bracket. The problem is the 770 4GB gets beaten by the R9 290 4GB in 90% of all other games. So while the 770 4GB is competitive with the R9 290 in WD, in the majority of games it's far slower.

Now that the mining craze is over, AMD has apparently had to lower their prices again to compete with NVidia. The perception of AMD GPUs as bargain products still persists in the minds of hardware enthusiasts, it seems.

An after-market R9 290 is just 6-10% slower than a max-power-limit 780 Ti and a solid 18-23% faster than a stock 780. It would utterly destroy a 770 once overclocked.

That's the thing, even with GW, AMD's cards are still very close and still offer superior performance/$.

I agree that the R9 290 is a very fast and capable card, but those reviews are automatically invalid to me because they don't even list the test platform. The 337.50 beta drivers increased performance across a wide variety of titles, and for all I know those reviews may be using an older set.

That's nice, but an R9 290 in CF for a bit more money destroys a 780 Ti in most games, and R9 290 CF mops the floor with 770 4GB SLI for nearly the same price! The R9 290X costs about as much as a 780 does and beats it. And in Watch Dogs, NV's 2-3GB cards are all crippled when it comes to their intended performance levels with Ultra textures (2GB 680/770 at 1080p and 3GB 780/780 Ti at 1440/1600p).

I like how you just gloss over this. NVidia and AMD will always release new products. Some will be faster, and some will be slower than their competition.

However, the fact that NVidia has shown their driver multithreading to be vastly superior to AMD's is a BIG deal, because it means that their GPUs have a greater chance of performing closer to their maximum potential as games become progressively more multithreaded.

Watch Dogs is just a sample of things to come.

NV needs to lower prices on all of its 770/780/780 Ti cards. With the bitcoin/scrypt mining craze out of the way, all of these cards have become overpriced.

I agree that the GPUs are overpriced, but it's a consumer-driven market. If people are willing to pay these prices, why should either company lower them? The GTX 770 4GB is a perfect example of that.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
Convenient for AMD : http://community.amd.com/community/...ider-launches-with-revolutionary-tressfx-hair

Oh, look who wrote the blog post - Robert Hallock. I guess it's okay to work with game developers as long as it's AMD.
^_^

Stop twisting my argument. Nvidia working with devs is not the issue. Our problem lies with Nvidia prohibiting developers from taking optimization suggestions from AMD.

Also, Robert Hallock is a public figure for AMD, so I don't know why you're treating the fact that he speaks to the public as a reason not to believe what is said to be going on.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
However, the fact that NVidia has shown their driver multithreading to be vastly superior to AMD's is a BIG deal, because it means that their GPUs have a greater chance of performing closer to their maximum potential as games become progressively more multithreaded

Only to you, as you mention it in nearly every benchmark thread.

You do realize that a game engine using multiple threads is in NO way related to DirectX multi-threading, right? Just because more games are becoming multi-threaded does not mean the poorly designed DirectX 11 multi-threading is suddenly a bigger issue. I still have not heard of any released game that uses it outside of FC3, which removed it because it caused so many issues.
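
For anyone unsure what "DirectX 11 multi-threading" actually refers to: it's the deferred-context / command-list path, where worker threads record draw calls that the immediate context replays later. Roughly like this (C++ / D3D11 sketch, error handling omitted; whether any given engine uses this path at all is exactly the question):

Code:
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Worker thread: record draw calls into a deferred context, producing a
// command list. This is the DX11 multi-threading feature being discussed.
void RecordOnWorkerThread(ID3D11Device* device, ID3D11CommandList** outList)
{
    ComPtr<ID3D11DeviceContext> deferred;
    device->CreateDeferredContext(0, &deferred);

    // ... set pipeline state and issue Draw() calls on 'deferred' here ...

    deferred->FinishCommandList(FALSE, outList);  // bake the recorded commands
}

// Main thread: replay the worker's command list on the immediate context.
void SubmitOnMainThread(ID3D11DeviceContext* immediate, ID3D11CommandList* list)
{
    immediate->ExecuteCommandList(list, FALSE);
}

A game engine can spread its own work (AI, physics, streaming) across many threads without ever touching this path, which is the distinction being made above.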
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You do realize that a game engine using multiple threads is in NO way related to DirectX multi-threading, right? Just because more games are becoming multi-threaded does not mean the poorly designed DirectX 11 multi-threading is suddenly a bigger issue. I still have not heard of any released game that uses it outside of FC3, which removed it because it caused so many issues.

If you reread my post, you'll see that I never even mentioned DX11 multithreading. You're the one who brought it up, not me.

In any case, whether Watch Dogs supports DX11 multithreading or not, it's obvious that NVidia's drivers are better optimized for multithreaded CPUs and games.

The game doesn't have to explicitly support DX11 multithreading for NVidia to gain the edge. It only needs to be multithreaded, and the more threads it supports, the greater NVidia's advantage.

Watch Dogs is the most multithreaded game I've ever played. I'm seeing almost even loads across all 12 threads on my CPU. The PCgameshardware.de review also showed excellent scaling on an FX-8350.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
If you reread my post, you'll see that I never even mentioned DX11 multithreading. You're the one who brought it up, not me.

In any case, whether Watch Dogs supports DX11 multithreading or not, it's obvious that NVidia's drivers are better optimized for multithreaded CPUs and games.

The game doesn't have to explicitly support DX11 multithreading for NVidia to gain the edge. It only needs to be multithreaded, and the more threads it supports, the greater NVidia's advantage.

Watch Dogs is the most multithreaded game I've ever played. I'm seeing almost even loads across all 12 threads on my CPU. The PCgameshardware.de review also showed excellent scaling on an FX-8350.

I like how you use only one set of benchmarks to prove your point. More benchmarks show the AMD cards being faster, more VRAM or not, High or Ultra textures.

I don't see a difference here.

[Image: wd_cpu_r.png]

[Image: wd_cpu_gf.png]
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
imho,

I think developers understand what they want for their games, and if nVidia and AMD can offer more tools and capabilities, that's fine; no one is forcing developers to use them!

Heck, Crytek sees potential with AMD's Mantle and nVidia's GameWorks!
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
Thankfully, Unreal Engine 4 is making sure that even though they work with NVidia on PhysX and APEX, it is done on the CPU, and they make sure it works well on both NVidia and AMD.

Plus, thankfully, there are people working on adding Bullet physics, which will work on both AMD and NVidia.
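
For what it's worth, Bullet's rigid-body simulation runs entirely on the CPU (the GPU paths are optional add-ons), so it behaves the same no matter whose graphics card is in the box. A minimal setup, sketched with the standard Bullet C++ API:

Code:
#include <btBulletDynamicsCommon.h>

// Minimal Bullet world: all of this executes on the CPU, so it is
// identical on AMD and NVidia systems.
int main()
{
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // Step the simulation at 60 Hz for one simulated second.
    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f);

    return 0;
}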
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I like how you use only one set of benchmarks to prove your point. More benchmarks show the AMD cards being faster, more VRAM or not, High or Ultra textures.

I don't see a difference here.

Are you serious? o_O

Look at the graph, dude. The GTX 780's minimum frame rates are as much as 30% higher than the R9 290X's at times using the same CPU (i.e. the stock 4770K). If the GTX 780 had more VRAM, the difference would be even greater.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Stop twisting my argument. Nvidia working with devs is not the issue.

I didn't twist your arguments. If nVidia isn't allowed to work with Gaming Evolved developers before the game is released, then that is "an issue". Here are nVidia's words about Tomb Raider:
Somewhat contradictorily, Cebenoyan went on to tell me that, in "at least" two instances, AMD's own developer relations efforts impeded Nvidia's work with game developers. "We know of real examples where we have actually explicitly been forbidden from seeing builds—forget source code, even just binary builds—of games that include high-end effects," Cebenoyan said. "The full game with all of the effects, the important PC ultra quality settings, [was] hidden from us until say a few weeks before launch, something like that. These were things that were contractually obligated."
Cebenoyan wouldn't name names, but his description sounded an awful lot like the allegations Nvidia made after Tomb Raider came out last year. Cebenoyan doesn't think the developers set out to disenfranchise Nvidia users willingly. Rather, he blames a hidden clause in their contract with AMD.
http://techreport.com/news/26521/nvidia-responds-to-amd-gameworks-allegations

Our problem lies with Nvidia prohibiting developers from taking optimization suggestions from AMD.
And nVidia denied it.

Also, Robert Hallock is a public figure for AMD, so I don't know why you're treating the fact that he speaks to the public as a reason not to believe what is said to be going on.
Because he has an agenda. And knowing that this guy has no problem comparing cards using a broken game, I can't take him seriously.

But the whole thing is now backfiring on AMD:
http://techreport.com/news/26521/nvidia-responds-to-amd-gameworks-allegations

http://www.forbes.com/sites/jasonev...ut-gameworks-amd-optimization-and-watch-dogs/
 

The Alias

Senior member
Aug 22, 2012
646
58
91
I didn't twist your arguments. If nVidia isn't allowed to work with Gaming Evolved developers before the game is released, then that is "an issue". Here are nVidia's words about Tomb Raider:
http://techreport.com/news/26521/nvidia-responds-to-amd-gameworks-allegations

And nVidia denied it.

Because he has an agenda. And knowing that this guy has no problem comparing cards using a broken game, I can't take him seriously.

But the whole thing is now backfiring on AMD:
http://techreport.com/news/26521/nvidia-responds-to-amd-gameworks-allegations

http://www.forbes.com/sites/jasonev...ut-gameworks-amd-optimization-and-watch-dogs/
Cem works for Nvidia too, does he not? So how come he's not the one with an agenda?

The point I'm trying to make is this: you guys automatically deem AMD's statements lies but take Nvidia's statements as fact. In my opinion, if you have two people saying two different things, you evaluate both claims to see whether or not they are true, and in this case there isn't enough proof on either side to deem the other's false. So why are you guys assuming AMD is lying and Nvidia is telling the truth?
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
You're quoting a review that used pre-release drivers from both camps, which performed worse than the release-day drivers :rolleyes:

The only reason AMD is competitive in this game is that their GPUs come with more memory as standard. When the VRAM is equivalent or greater, though, NVidia pulls ahead significantly.

The PCgameshardware.de review shows that perfectly. The GTX 770 4GB is faster than the R9 290, and just a bit slower than the R9 290X, a card in a higher price bracket.


Also, the review shows NVidia getting superior multithreaded performance compared to AMD. It's possible that the Watch Dogs engine uses DX11 multithreading, because the difference in performance between AMD and NVidia in this area is substantial.

PCgameshardware.de review

What are you basing this on, the 1920x1080 benchmarks? If so, those are basically irrelevant to many people. Nvidia cards tend to be faster than AMD at 1080p, but as soon as you start cranking the resolution up to 1440p and beyond, the lead completely goes away. It doesn't matter whether VRAM is an issue or not. Look at benchmarks of the 290X vs. the Titan at 4K resolution; the 290X pulls ahead in many cases.

AMD's architecture is better suited for high resolutions.

http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review/14#.U4bQHB3Nn-g

Another good data point from HardOCP here: http://hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/3

They tested Watch Dogs at 2560x1600. At the High texture setting, where VRAM limitations aren't coming into play, the 780 Ti got 53.1 FPS versus the 290X's 52 FPS.

I have to say, considering the title is also NV-sponsored, this is a pretty darn good showing from AMD.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
What are you basing this on, the 1920x1080 benchmarks? If so, those are basically irrelevant to many people.

Perhaps you didn't phrase that the way you wanted, but 1080p performance is relevant to the vast majority of people.
 