Does GameWorks influence game performance on AMD cards?

AnandTech Forums - Page 12

GameWorks: does it penalize AMD cards?

  • Yes, it definitely does

  • No it's not a factor

  • AMD's fault due to poor Dev relations

  • Gaming Evolved does just the same


Results are only viewable after voting.
Status
Not open for further replies.

Edgy

Senior member
Sep 21, 2000
366
20
81
I had never heard of ExtremeTech, but that article alone is the epitome of what I expect from a great tech information site.

I hope DX12 can bring some parity to this issue - at least until NV thinks of other ways to implement and hide its unfair competitive practices.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I had never heard of ExtremeTech, but that article alone is the epitome of what I expect from a great tech information site.

I hope DX12 can bring some parity to this issue - at least until NV thinks of other ways to implement and hide its unfair competitive practices.

The unfortunate part is that a lesser-known site wrote about GW years ago, while the most popular North American sites such as AnandTech, TechPowerUp, TechReport, PCPerspective, HardOCP, and Tom's Hardware have turned a blind eye to the issue.

"In Nvidia’s GameWorks program, though, the libraries are effectively black boxes. Nvidia has clarified that developers can see the code under certain licensing restrictions, but they cannot share that code with AMD — which means AMD can’t optimize its own drivers to optimally run the functions or make suggestions to the developer that would improve the library’s performance on GCN hardware. This is fundamentally different from how most optimization is done today, where Nvidia and AMD might both work with a developer to optimize HLSL code for their respective products."

The interesting part is that if NV had an inherent advantage in certain GameWorks features, there would be no way for a developer's patch to improve performance, since NV's architecture would simply be superior, fair and square. Or so one would think. But...

Is GameWorks distorting problems in today’s shipping games?

"To answer this question, I’ve spent several weeks testing Arkham Origins, Assassin’s Creed IV, and Splinter Cell: Blacklist. Blacklist appears to only use GameWorks libraries for its HBAO+ implementation, and early benchmarks of this last game showed a distinct advantage for Nvidia hardware when running in that mode. Later driver updates and a massive set of game patches appear to have resolved these issues; the R9 290X is about 16% faster than the GTX 770 at Ultra detail with FXAA enabled."

http://www.extremetech.com/extreme/...rps-power-from-developers-end-users-and-amd/1

More PC gamers need to be aware of GW, how it works, and why it promotes unfair competition and business practices. It also gives NV full control over how GW's source code runs on various generations of NV cards, because the developer uses NV's in-house SDK source code. Since that source code (i.e., the GW SDK versions) is constantly being updated, NV can easily use GW as a calculated method of planned obsolescence for older generations of cards. After all, NV knows the capabilities of Fermi, Kepler, Maxwell, and soon Pascal in performing PCSS+, HBAO+, tessellation, PhysX, etc., and can alter the intensity or optimization for each of its architectures inside the SDK's own source code.
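To illustrate the mechanism being described (and only the mechanism - every name and number below is made up for illustration and does not come from any real GameWorks SDK), per-architecture tuning inside a closed SDK could look as simple as this:

```python
# Purely hypothetical sketch: because the vendor controls the SDK source,
# it can dial effect cost per GPU family. Names/values are invented.

ARCH_PROFILES = {
    # architecture: (max tessellation factor, hair strand multiplier)
    "fermi":   (8,  0.5),   # an older arch could be left on a cheap path...
    "kepler":  (16, 1.0),
    "maxwell": (64, 2.0),   # ...while the newest arch gets the tuned one
}

def effect_settings(arch: str) -> dict:
    """Return effect parameters for a detected GPU architecture.

    Any architecture the SDK doesn't recognize (e.g. a competitor's)
    silently falls through to the worst-case path.
    """
    tess, hair = ARCH_PROFILES.get(arch, (8, 0.5))
    return {"tessellation_factor": tess, "hair_density": hair}
```

The point is not that the real SDK contains such a table, but that nothing outside NV can audit whether it does.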

NV could update its HairWorks, WaveWorks, PhysX Vehicle SDK, etc. source code in such a way that it instantly makes all Fermi cards obsolete, for example. The developer would have no clue how to fix this easily, since they aren't the experts in that particular version of the GW SDK or in how to best optimize for the Fermi architecture. Oops, guess it's time to get a new NV GPU, folks! All aboard the upgrade train! :p

***It is odd to see the Kepler architecture performing so poorly in so many GW titles, which just creates even more suspicion about how well that GW SDK is optimized for Fermi vs. Kepler vs. Maxwell. We won't know, since NV made it. :cool:
 
Last edited:

casiofx

Senior member
Mar 24, 2015
369
36
61
"To answer this question, I’ve spent several weeks testing Arkham Origins, Assassin’s Creed IV, and Splinter Cell: Blacklist. Blacklist appears to only use GameWorks libraries for its HBAO+ implementation, and early benchmarks of this last game showed a distinct advantage for Nvidia hardware when running in that mode. Later driver updates and a massive set of game patches appear to have resolved these issues; the R9 290X is about 16% faster than the GTX 770 at Ultra detail with FXAA enabled."
Hilariously, Assassin's Creed Rogue runs very smoothly, even though it uses the same game engine. The only difference is the absence of the obvious "Nvidia optimizations".

Another funny thing: Assassin's Creed IV ran horribly on Nvidia's GPUs too.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Hilariously, Assassin's Creed Rogue runs very smoothly, even though it uses the same game engine. The only difference is the absence of the obvious "Nvidia optimizations".

Another funny thing: Assassin's Creed IV ran horribly on Nvidia's GPUs too.

I know, right? I mean, we already have enough proof that GW for sure hurts AMD performance, because after game patches in Shift 2, AC, and Splinter Cell: Blacklist, AMD cards start to "magically" perform WAY faster. Woot?! That means even if NV doesn't purposely create GW source code that directly hurts AMD's cards, the fact that AMD can't optimize for this code without a slew of developer patches shows that NV constantly gains an early driver optimization advantage that AMD simply can't fix with a new driver update, since the GW SDK is hidden from AMD. In turn, it becomes an industry of who throws more marketing dollars, programmers, and perks/gifts at a developer - which means the firm with the most money and the least ethics automatically wins. WTH! :colbert:
 
Last edited:

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
It's very hard to believe a dev would slow AMD down on purpose, as it would really cut off part of your customer base, and that can't sit well with employers/investors. However, you can't ignore all the data suggesting it.
I do know I had the same problem in Shift 2, with GPU usage being low (sub-50% on an SLI setup, causing sub-60 FPS) except in bumper view (still low GPU usage, but at least >60 FPS). It would be funny and sad if the dev/Nvidia did purposely hinder AMD performance in a way that also caused SLI performance to tank.
From my 15 years of gaming experience, and from following hardware news semi-closely the last few years:

believe it.

It sucks for PC gaming, but it is the truth.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
From my 15 years of gaming experience, and from following hardware news semi-closely the last few years:

believe it.

It sucks for PC gaming, but it is the truth.

I believe it based on the facts, but it's still hard to believe. I mean, if I were part of the dev or QA team, I would definitely raise objections to letting a game go out that made my work look bad to any customer. Doing a stupid thing like purposely hurting performance should be a fireable offense.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I believe it based on the facts, but it's still hard to believe. I mean, if I were part of the dev or QA team, I would definitely raise objections to letting a game go out that made my work look bad to any customer. Doing a stupid thing like purposely hurting performance should be a fireable offense.

It's entirely possible they didn't know how it would work. nVidia could have just sneaked this by them.
 
Feb 19, 2009
10,457
10
76
It's entirely possible they didn't know how it would work. nVidia could have just sneaked this by them.

No way. That implies:

1) They never tested on AMD during development, so they had no idea it would be so bad.
2) They never received feedback from backers during alpha testing.
3) They never received feedback from Steam gamers during alpha.

We know #2 and #3 happened, so it cannot be. #1 is possible, but it's clear the devs knew exactly what they were doing, even going so far as to BLAME AMD for the poor performance rather than admitting it's on them. Eventually they admitted it's PhysX (and now we know it's also their HUD, as well as PhysX GPU-accelerated particles for smoke/weather).

I was too hasty in blaming NV for Prj CARS; it's become clear these developers (the same ones responsible for NFS Shift 1/2, which ran terribly on AMD) are just messed up.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Told you guys a long time ago: GW will be the death of AMD more so than anything else, because it's the ONE factor that AMD can never improve. They can update their uarch for more performance and more efficiency and compete there. But that won't mean much when many GW titles arrive broken or crippled on AMD, like what we've just seen with Prj Cars.

On an AMD GPU, do a burnout with lots of smoke: slideshow fps. PhysX particles and weather in the game ruin AMD performance because they cannot be GPU-accelerated.
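The burnout-smoke scenario can be put into a toy cost model (my own invented numbers, purely illustrative; this is not the real PhysX API): when a particle effect cannot be GPU-accelerated, the same workload lands on a far more expensive CPU path.

```python
# Hypothetical sketch of a vendor-gated GPU path with a CPU fallback.
# The per-particle costs and the 20x ratio are assumptions for
# illustration, not measurements of any real game or SDK.

def particle_update_cost_ms(num_particles: int, gpu_accelerated: bool) -> float:
    """Toy per-frame cost of updating a particle system."""
    per_particle_us = 0.01 if gpu_accelerated else 0.2  # assumed costs
    return num_particles * per_particle_us / 1000.0

smoke = 50_000  # a heavy burnout effect
print(particle_update_cost_ms(smoke, gpu_accelerated=True))   # 0.5 ms
print(particle_update_cost_ms(smoke, gpu_accelerated=False))  # 10.0 ms
```

Under these made-up numbers, the fallback alone eats most of a 16.7 ms (60 FPS) frame budget, which is why the effect turns into a slideshow on hardware locked out of the accelerated path.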

GameWorks = "Game works only on NV, lolz AMD".

It's clear to me, just from reading the Project Cars gaming forums, that the typical response from gamers who are clueless (not their fault) is this: "You have a choice, buy NV or suffer," or "That's what you get for saving a few $ going with AMD," or "AMD can't make good drivers, it's their fault."

How does AMD stand a chance against that?!

This whole situation has intensified my disgust for NV's tactics. I used to buy NV when they offered a good GPU deal, but I absolutely will not from now on. I will also boycott any GameWorks game that runs poorly on AMD. Making a stand, one gamer at a time.

That is exactly what I'm doing. I haven't bought a GW game since Watch Dogs, and I have also stopped recommending NVIDIA hardware because of those NVIDIA tactics.

It is in our hands to make things better for PC gaming. I want both AMD and NVIDIA in the PC gaming scene, but not like this. I will not tolerate things like this from any company.
Let all informed PC gamers raise our voices by boycotting GW games and NVIDIA hardware until NVIDIA (and every other company) and game developers get the message.
Also, expose any of these tactics to the public; knowledge is power.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was too hasty in blaming NV for Prj CARS; it's become clear these developers (the same ones responsible for NFS Shift 1/2, which ran terribly on AMD) are just messed up.

So hold on, NV should share none of the blame? Didn't you read how, when an AMD driver fixed performance, the developer admitted that NV had asked them to add more PhysX to the game?

Secondly, NV presented Project CARS as part of GW and NV's partnership as of October 2013. NV knew that PhysX would underpin the game's physics engine, which meant NV knew that performance on AMD and Intel GPUs would be wiped out. Since NV publicly acknowledges the partnership with SMS, and NV's PhysX is used to power the game's physics engine, NV is publicly OK with supporting such business practices, or with the developer favouring their hardware. If NV was not OK with this, and if NV's interest was to promote PC gaming as a whole, they would have immediately pulled out of this partnership. That means NV appears to have accepted that the developer favoured their hardware, with all the consequences, which would ultimately lead to a virtual performance disaster for Intel/AMD users. NV was probably LOVING it that the developer decided to use proprietary PhysX as the game's physics backbone. Are you kidding? This means a GTX 760 would pummel a 290X. Holy crap, there were probably free drinks for all the programmers at NV the day SMS announced PhysX would power the game. ^_^

Believe me, if NV could get every developer to use PhysX as the main physics engine and throw Intel/AMD users under the bus, those GPU users would eventually succumb to the pressure and get an NV card. That's exactly what NV wants. GW is nothing more than NV's way to sell more GPUs and gain more market share at the expense of the competition. NV absolutely does not care about PC gamers as a whole, or it would never have partnered with a developer that made a game that would, with 100% certainty, cripple Intel/AMD GPU performance by virtue of how it was coded/designed.
 
Last edited:

Magee_MC

Senior member
Jan 18, 2010
217
13
81
This is the part that upsets me - I can't just go out, buy a $50 used 650, and add it to my system as a dedicated PhysX GPU, because nV wants me to go out and get a $300+ card like a 970 to be able to use PhysX and game at the same time. I would honestly be OK with PhysX being proprietary if NV just let us combine an AMD GPU for rendering with an NV GPU for PhysX. I think that would actually let PhysX take off faster, since developers would be able to implement physics via PhysX without fear that Intel/AMD owners would have poor performance.

Unfortunately, that will never happen. From NV's perspective, it would negate their ability to leverage PhysX to increase sales of their current GPUs. If a person could buy an AMD card that has a price/performance advantage over an NV card and pair it with a low-end NV card for PhysX, it's quite possible that the combination would be price-comparable to an equivalent NV card. I would also expect that most of the FPS advantage NV gets from GW in the first few months after a GW title is released would disappear for the AMD + PhysX combination.

That would then lead to NV cards losing their ability to command a premium because of perceived AMD driver deficiencies and lead to lower profit margins for NV. NV would in essence be trading sales of their high end cards for lower end cards and less profit. I can't see any way that they would do it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Developers get the message.
Also, expose any of these tactics to the public; knowledge is power.

Reading some of the comments from members of other PC enthusiast forums just highlights how brainwashed by brand marketing some modern PC gamers have become. When I joined the forums to learn and the veteran AT members provided their insight, I tried to always get two or three sides to the story. It's shocking how someone can be so blindly devoted to a corporation; today's PC gamers seem far less critical and less able to think for themselves than when I started building PCs. I seriously don't recall such blind devotion to PC hardware brands 10-15 years ago. I mean, we literally made fun of places like AMDZone when they wouldn't admit that Core 2 Duo/Quad was an epic CPU generation and still defended Phenoms. Today, places like NeoGAF are an open circle-**** for devoted fans. Disgusting.

Back then, if AMD released a better CPU for gaming (Athlon XP, Athlon 64), boom, PC gamers would buy them, overclock them, and share their experience and love for the hardware. And then if Intel beat AMD, we would migrate to Core 2 Duos/Quads, with no hard feelings toward AMD but at the same time no devotion to Intel - just a desire for great hardware. Same thing with GPUs. Today, it oftentimes feels like there is too much blind devotion toward the most popular brands - Intel and NV.

It almost makes me wonder whether, if AMD were to release faster GPUs and CPUs in the future, some insane excuses or metrics would be pulled out of thin air to justify buying Intel CPUs and NV GPUs. The unfortunate part is that the new generation of PC gamers entering the hobby, aged 13-18, might be uninformed, and when they join online forums, they 'learn' from the veteran PC builders of today. If many of these "veteran" PC builders are brand-loyal and lack objectivity, it could create a generation of new PC gamers who take on the qualities of the very PC gamer "role models" they look up to on forums. That's another reason we need more competition and more information awareness, so that companies that gain too much power in the marketplace can't use marketing to insult our intelligence. We also need to be aware of marketing so that no one is defending the PC equivalents of "Apple", "Beats", or "Bose" just because those brands are popular in the mainstream media and "can't do any wrong."

Just read some of the responses in this thread. It's pathetic that some of these people call themselves PC enthusiasts. 10-15 years ago they would have been laughed out of the room (or off our forum), and no one would have taken their bias seriously:
http://hardforum.com/showthread.php?t=1861353

I seriously do remember AT as a place where most people didn't really care about brands but cared about hardware, and it seems to remain one of the few forums that tries to stick to its roots. Today, many PC gamers are becoming zombies, and you see insane brand devotion toward NV at places like TechReport, HardOCP, or even TPU. Try posting something controversial about NV, or positive about AMD, on some of their sub-forums, and it's like you just invaded their personal space. It's pretty mind-blowing.

I have to say I am proud to be part of our AT forums, because here I still feel our forum is fairly objective compared to some of the major tech sites. Even our GPU reviews remain very good as far as objectivity goes. I feel that Ryan Smith has done an awesome job of retaining his objectivity, and that's probably one of the major reasons our forum attracts more objective members. When the editors of other tech sites start showing bias in reviews, they are no longer professional about their work. For that reason, props to the AT staff ;)
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Gikaseixas, maybe put this ExtremeTech article in the OP, so that people who are not aware of how GW works can read an article that more or less summarizes the key info. It's highly likely that a lot of PC gamers do not really understand the difference between GameWorks and TWIMTBP/AMD Gaming Evolved.
 

maddogmcgee

Senior member
Apr 20, 2015
410
421
136
84 percent of people in this thread voted that GameWorks does penalise AMD. That's as close to 100 percent as any survey is going to get. The only way AMD stops this is by increasing market share to the point where it's not worth it for game developers to screw over such a large percentage of players, regardless of the inducements... and that is a long way from happening.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Gikaseixas, maybe put this article by ExtremeTech into the OP so that other people who are not aware of how GW works can read the article that more or less summarizes the key info. It's highly likely that a lot of PC gamers do not really understand the difference between GameWorks and TWIMTBP/AMD Gaming Evolved.

Added the link; if others wish me to add more, just let me know.
 

gamervivek

Senior member
Jan 17, 2011
490
53
91
After reading about the heavily tessellated jersey barriers in Crysis 2, I am quite inclined to believe accusations against Nvidia. Back around 2008-2009 I believed that ATI would overcome these shady practices; sadly, it seems they have lost their way since then.
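For a sense of scale, here is some back-of-the-envelope math of my own (not figures from the Crysis 2 analysis): uniformly tessellating a quad at factor N produces roughly 2·N² triangles, so maxing out the factor on geometry that is already flat multiplies the triangle count thousands of times for zero visual gain.

```python
def quad_triangles(tess_factor: int) -> int:
    """Triangles produced by uniformly tessellating one flat quad.

    Each edge is split into tess_factor segments, giving a grid of
    tess_factor x tess_factor cells, with two triangles per cell.
    """
    return 2 * tess_factor * tess_factor

# A flat barrier needs only 2 triangles...
print(quad_triangles(1))   # 2
# ...but at the D3D11 maximum factor of 64, the same flat surface becomes:
print(quad_triangles(64))  # 8192
```

Every one of those extra triangles has to be set up and rasterized, which is wasted work on a surface whose final image is identical either way.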
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106

Falls flat as a theory when using the Win10 driver on Win8.1.

cars.jpg
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Yes, it's just the driver. AMD's drivers can't handle a huge number of draw calls without running into performance trouble. You see the same behaviour in the 3DMark feature test:
2dm9z78.png

27yr968.png


The nVidia driver is up to 3x faster than the AMD driver with MT. And you see this in Project Cars, too.
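To put rough numbers on the draw-call argument, here's a toy model of my own (illustrative only, not benchmark code, and the costs are invented): if a frame's CPU cost were pure per-draw-call driver overhead, a driver with lower per-call cost that also scales across threads wins by a wide margin.

```python
def frames_per_second(draw_calls: int, overhead_us: float, threads: int = 1) -> float:
    """Toy model: frame rate when the frame's CPU cost is pure driver overhead.

    Assumes submission parallelizes perfectly across `threads`, which real
    drivers only approximate; all numbers here are assumptions, not
    measurements of any actual driver.
    """
    frame_time_us = draw_calls * overhead_us / threads
    return 1_000_000 / frame_time_us

# Same 10,000-draw-call workload:
# driver A: higher per-call cost, single-threaded submission
# driver B: lower per-call cost, scaling across 4 threads
calls = 10_000
print(round(frames_per_second(calls, 2.0, threads=1)))  # 50
print(round(frames_per_second(calls, 1.0, threads=4)))  # 400
```

Real frames also carry game logic, physics, and GPU-bound work, so the gap in practice is smaller than this pure-overhead model suggests.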
 
Feb 19, 2009
10,457
10
76
Yes, it's just the driver. AMD's drivers can't handle a huge number of draw calls without running into performance trouble.

I like how you pretend to know better than the developers and diagnose the problem to be DRAW CALLS and not PhysX as stated by the devs.

Nice.

Edit: Also, what the heck is with that chart? i5, single vs. MT: the R9 290X gets 2.4M draw calls and the GTX 970 gets 1M?
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I like how you pretend to know better than the developers and diagnose the problem to be DRAW CALLS and not PhysX as stated by the devs.

Nice.

Hell, if it wasn't for sontin's post in the other thread, I wouldn't have noticed the two giant red flags. Although he did basically cut that info out of his quotes.

Post #1: AMD PhysX runs on the CPU.
Post #2: We reduced CPU load and got good results.

Conclusion: draw your own.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
Of course GameWorks is an amoral attempt to influence the market. Another dodgy practice is that of the shills who are paid to troll online forums and consistently post negative comments about the competition.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Does anyone really think that NV got involved with the devs to add more PhysX that could not be GPU-accelerated? Because that's never happened before.
 

loccothan

Senior member
Mar 8, 2013
268
2
81
loccothan.blogspot.com
Hi bros,
Here is my optimization guide for pCARS and other games,
also covering CPUs, Win 8.1, SB-Z OpenAL, etc. (gamer essentials).

==========
http://loccothan.blogspot.com/p/tweaks.html

==========
Here are my config and FPS in pCARS :D
Please note that before my mod, the game at these settings (on my rig) was totally unplayable! Now, with 21 cars and heavy rain, everything on Ultra/High, etc., you get what you see in the screenshots. :D

UPD 2: On a Radeon R9 290/290X you can turn Grass/Particles ON (but not too high).

wiTEcbU.jpg


7iZvyEB.jpg

yIdioFw.jpg


GcBSbRv.jpg


AGi3yv1.jpg
 
Last edited: