Does GameWorks influence the performance of AMD cards in games?


GameWorks, does it penalize AMD cards?

  • Yes it definitely does

  • No it's not a factor

  • AMD's fault due to poor Dev relations

  • Gaming Evolved does just the same


Results are only viewable after voting.
Status
Not open for further replies.

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
With all these crazy theories going around...

How is AMD getting such good performance in Win10 compared to Win8.1 and below?

See RS's post
And it would then make sense why AMD is getting a performance boost in Windows 10 with the same driver: there is less CPU overhead, which means the CPU is able to perform more calculations for PhysX.

"compared to Windows 8.1 with the Catalyst 15.4 Windows 10 works by as much as 20 to 25 percent faster! "
http://www.computerbase.de/2015-05/p...r-vergleich/3/
 

96Firebird

Diamond Member
Nov 8, 2010
5,741
340
126
Is Windows 10 using a different API than other Windows versions? I know the game isn't coded for DX12, so what is causing the lower CPU overhead?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
With all these crazy theories going around...

How is AMD getting such good performance in Win10 compared to Win8.1 and below?

Perhaps there is less CPU driver overhead in Windows 10 than 8.1. It frees up more CPU cycles to either tackle PhysX calcs (1 theory) or to help with AMD's greater driver overhead in DX11. I think DX12 overall will help AMD's GPUs a lot more than NV's. The strange thing is this game is DX11 not DX12. You are going to need zlatan's help or another developer/software professional to tackle that one...


Wow, that's insane!

68.5 FPS, 78% GPU usage HUD off



46.4 FPS, 58% GPU usage HUD on



Craaaaaaaaazy that HUD would have a 20+ fps performance hit. Imagine if HUD was using GW/PhysX code? LOL.
 
Last edited:

lilltesaito

Member
Aug 3, 2010
110
0
0
http://hardforum.com/showpost.php?p=1041593372&postcount=58

No, in this case there is an entire thread in the Project Cars graphics subforum where we discussed with the software engineers directly about the problems with the game and AMD video cards. SMS knew for the past 3 years that Nvidia based PhysX effects in their game caused the frame rate to tank into the sub 20 fps region for AMD users. It is not something that occurred overnight or the past few months. It didn't creep in suddenly. It was always there from day one.

Since the game uses GameWorks, the ball is in Nvidia's court to optimize the code so that AMD cards can run it properly, or we wait for AMD to work around GameWorks within their drivers. Nvidia is banking on it taking months to get right because of the code obfuscation in the GameWorks libraries, as this is their new strategy to get more customers.

Break the game for the competition's hardware and hope they migrate to them. If they leave the PC Gaming culture then it's fine; they weren't our customers in the first place.
This guy really has a lot to say it seems.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
"No, in this case there is an entire thread in the Project Cars graphics subforum where we discussed with the software engineers directly about the problems with the game and AMD video cards. SMS knew for the past 3 years that Nvidia based PhysX effects in their game caused the frame rate to tank into the sub 20 fps region for AMD users. It is not something that occurred overnight or the past few months. It didn't creep in suddenly. It was always there from day one."

Oh snap, if that's the future of PC gaming, holy ****. The media needs to take such accusations extremely seriously because if this is true, it has the potential to turn PC gaming on its head, and not necessarily in a good way. PC gaming was always about making games accessible to as many gamers as possible. What game developer would knowingly code a new game with such extreme bias, alienating Intel/AMD GPU users?

Also, some of those comments about how a newer AMD driver improved performance, only for NV to ask for more PhysX code to be incorporated into the game to increase the workload, WTH?! Jaw. hits. floor. I am having a really difficult time believing such craziness. This would literally be THE all-time low for NV, if true.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
I am having a really difficult time believing such craziness. This would literally be THE all-time low for NV, if true.
I will not be even slightly surprised if it's true. What's amazing is how, to some, it has become acceptable or even praiseworthy when Nvidia "works" with devs and as a result the game is a dog on other hardware. AMD gets blamed for this because, well, their dev relations supposedly aren't good enough and their drivers are apparently no good.

Things are unfolding exactly how some predicted years ago: game devs will bow to the highest bidder and PC gaming as a whole will become fractured.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Xbox One version: "There is a glitch in my game where if I go off the track or brake, I lose all sound except backfire.."
http://www.reddit.com/r/pcars/comments/352b4k/glitch_on_xbox_one/

PS4 random blur.



PC Gamer on Gamepad controls:

[screenshot of PC Gamer's comments on the gamepad controls]


LOL. What is happening to AAA game development? For crying out loud, please delay a game 6-12 months and you WILL get gamers to pay $50-60 for it. Release a broken, glitchy and unoptimized mess that favours NV GPUs only and you expect people to pay the full price? Pretty disappointing.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
so basically the dev is caught with his pants down with his lies and now there is irrefutable proof.

talk about incompetent devs. this must be one of the most embarrassing moments of his life. :(

It's very hard to believe a dev would slow AMD down on purpose, as it would really cut off your customer base, and that can't sit well with employers/investors. However, you can't ignore all the data pointing to it.
I do know I had the same problem with Shift 2, with GPU usage being low (sub-50% on an SLI setup, causing sub-60 FPS) except for bumper view (still low GPU usage, but at least >60 FPS). It would be funny and sad if the dev/Nvidia did purposely hinder AMD performance, which caused SLI performance to tank.
 

96Firebird

Diamond Member
Nov 8, 2010
5,741
340
126
RS, perhaps that should all go in the Project CARS thread?

You may have thought you were posting there; I thought this was that thread when I was posting earlier...
 

casiofx

Senior member
Mar 24, 2015
369
36
61
It's very hard to believe a dev would slow AMD down on purpose, as it would really cut off your customer base, and that can't sit well with employers/investors. However, you can't ignore all the data pointing to it.
I do know I had the same problem with Shift 2, with GPU usage being low (sub-50% on an SLI setup, causing sub-60 FPS) except for bumper view (still low GPU usage, but at least >60 FPS). It would be funny and sad if the dev/Nvidia did purposely hinder AMD performance, which caused SLI performance to tank.
Nothing surprising; the last time, the Nvidia-sponsored Crysis 2 got a very controversial DX11 patch.

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2
[TechReport screenshots: the Crysis 2 DX11 concrete barrier, fully rendered and as a tessellation wireframe]


It purposely makes the GPU work a lot harder for no image quality gain, even if it hurts GTX 580 owners too, just because Radeons take a bigger hit from tessellation.

I guess Project Cars is version 2 of this controversy, with PhysX instead of hidden tessellation, since the 290/290X have improved tessellation performance.

The biggest difference is that tessellation can be turned off in Crysis 2 (via settings, or by running in DX9 instead), while in Project Cars AMD users can't switch to a different physics engine.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's very hard to believe a dev would slow AMD down on purpose, as it would really cut off your customer base, and that can't sit well with employers/investors. However, you can't ignore all the data pointing to it.

It gets worse, far worse. This developer has a history of being NV's BFF.

Full credit to railven from the other thread:

"Reading a little history on the dev side, seems some of these issues (PC specific since there were no early builds for consoles) were known and mentioned. But ignored. There is currently some kind of backlash at the main dev for the series (guess he has a sort of bad-track record, his team was behind Shift 1 and Shift 2, and you guys might remember that game's performance)

PC Games Hardware article:

Need for Speed: Shift Patch 2 - Background
"When Need for Speed: Shift was released, we criticized the surprisingly low performance of AMD's Radeon cards. Especially in scenes with many vehicles the framerate was bad - no matter if the resolution was set to 800 x 600, 1920 x 1200 or any other resolution. The first time the racing game was updated, the problem was not solved, but the second patch for Need for Speed: Shift delivers more frames per second - according to the readme because of "Improved ATI graphics card support”. The reason for the up to now poor performance of Radeon cards has not been unveiled although rumors say that certain Shader routines had not been optimized. Honi soit qui mal y pense - Shift is part of Nvidia's TWIMTBP program."

"Need for Speed: Shift Patch 2 accelerates Radeon graphics cards"

Pure magic right here....check this out.

"In order to show the huge performance benefit for Radeon cards delivered by the patch, we race against 15 computer opponents on "Brands Hatch GP” in broad daylight. We record the framerate for 30 seconds. Without the patch a Radeon HD 5850 wasn't able to exceed 45 fps on average at 1280 x 1024 or 1680 x 1050 (each with 4x MSAA and 16:1 AF), but the update to version 1.02 the performance is increased by 44 percent."

[PCGH benchmark chart: Radeon HD 5850 performance before and after the Need for Speed: Shift 1.02 patch]


TWIMTBP title = horrendous performance on AMD cards, but yo yo yo, AMD drivers - they suck! Go magic patch, go!! :rolleyes:

RS, perhaps that should all go in the Project CARS thread?

You may have thought you were posting there; I thought this was that thread when I was posting earlier...

Without going into too much depth on Project CARS itself, the point here is that we have a history of developer bias that favours NV as is, and this has obviously continued with GW in Project CARS, as the game's physics is based on PhysX that is offloaded to the GPU for NV users only. That's damn biased if you ask me. GW is looking worse and worse by the day.
 
Last edited:
Feb 19, 2009
10,457
10
76
That's damn biased if you ask me. GW is looking worse and worse by the day.

Told you guys a long time ago: GW will be the death of AMD more so than anything else, because it's the ONE factor that AMD can never improve. They can update their uarch for more performance and efficiency and compete there. But that won't mean much when many GW titles come out broken or crippled on AMD, like what we've just seen with Prj Cars.

On an AMD GPU, do a burnout with lots of smoke and the frame rate turns into a slideshow. PhysX particles and weather in the game ruin AMD performance because they cannot be GPU-accelerated.

GameWorks = "Game works only on NV, lolz AMD".

It's clear to me, just from reading the Project Cars gaming forums, that the typical response from gamers who are clueless (not their fault) is this: "You have a choice, buy NV or suffer" or "That's what you get for saving a few $ going with AMD" or "AMD can't make good drivers, it's their fault."

How does AMD stand a chance against that?!

This whole situation has intensified my disgust for NV's tactics. I used to buy NV when they offered a good GPU deal, but I absolutely will not from now on. I will also boycott any GameWorks game that runs poorly on AMD. Making a stand, one gamer at a time.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Ya gotta admit though that is one sexy concrete barrier.

I am pretty sure that with an 8K texture and no tessellation the barrier would have looked better.

Batman cape and tessellated snow are hilarious though. :wub:

[Screenshots: the Batman: Arkham Origins ground and cape as rendered in-game vs. their tessellation wireframes from the benchmark]


Notice how in the last 2 years NV has stayed mum on tessellation, but it was hyping up tessellation like crazy during the Fermi and Kepler generations? NV's tessellation performance has not increased much since the 780 Ti, which is surprising actually. I guess NV is cautious about cranking tessellation up too much in GW titles right now because it would cripple the 980 and Titan X along with the 780 Ti. Not a good strategy when you're trying to get your customer base to upgrade. Perhaps if Pascal doubles the Titan X in tessellation, NV will start marketing tessellation more heavily again.

[TessMark benchmark chart]
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
ExtremeTech called out years ago how GW has the potential to indirectly cripple performance on AMD cards.
http://www.extremetech.com/extreme/...rps-power-from-developers-end-users-and-amd/2

Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

"A fundamentally unequal playing field"

Nvidia’s GameWorks program is conceptually similar to what Intel pulled on AMD 8-10 years back. In that situation, Intel’s compilers refused to optimize code for AMD processors, even though AMD had paid Intel for the right to implement SSE, SSE2, and SSE3. The compiler would search for a CPU string rather than just the ability to execute the vectorized code, and if it detected AuthenticAMD instead of GenuineIntel, it refused to use the most advantageous optimizations.

The situation here is different, in that we’re discussing third-party libraries and not the fundamental tools used to build executables, but the end result is similar. AMD is no longer in control of its own performance. While GameWorks doesn’t technically lock vendors into Nvidia solutions, a developer that wanted to support both companies equally would have to work with AMD and Nvidia from the beginning of the development cycle to create a vendor-specific code path. It’s impossible for AMD to provide a quick after-launch fix.

This kind of maneuver ultimately hurts developers in the guise of helping them. Even if the developers at Ubisoft or WB Montreal wanted to help AMD improve its performance, they can’t share the code. If Nvidia decides to stop supporting older GPUs in a future release, game developers won’t be able to implement their own solutions without starting from scratch and building a new version of the library from the ground up. And while we acknowledge that current Gameworks titles implement no overt AMD penalties, developers who rely on that fact in the future may discover that their games run unexplainably poorly on AMD hardware with no insight into why.

When it comes to manipulating performance, a few percent here and there add up to a significant decline overall. In a game with four high-end DirectX 11 functions (ambient occlusion, tessellation, crepuscular rays, and soft shadowing), running the AMD code path just 2% more slowly than necessary in each area would impact total performance by about 8%. At the high end, that can mean the difference between a $400 price point and a $500 price point.

Nvidia has done a great deal for gaming over the past decade. Features like hardware PhysX support and 3D gaming may never have gone truly mainstream, but they were appreciated premium features for the gamers that wanted them. G-Sync, by all accounts, offers real advantages as well. GameWorks, however, doesn’t just offer Nvidia customers an advantage — it curtails developer freedom and sharply limits AMD’s ability to optimize as well. Even if Nvidia never deliberately sabotages GameWorks code to run poorly on AMD or Intel GPUs, the inability to optimize these functions is itself a genuine competitive disadvantage.

In the long run, no one benefits from this kind of bifurcated market — especially not end users."
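
For anyone who hasn't followed the Intel compiler saga the article references: the whole trick boils down to dispatching on the CPU's vendor string instead of its actual feature bits. Here's a rough sketch of the difference on GCC/Clang (illustrative only; the helper names are made up and this is obviously not the actual compiler code):

[code]
/* Minimal sketch of feature-based vs. vendor-based runtime dispatch (x86, GCC/Clang).
 * Hypothetical helper names (has_sse2, vendor_string); not real compiler internals. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

/* Fair check: ask the CPU whether it actually supports SSE2 (leaf 1, EDX bit 26). */
static int has_sse2(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 0;
    return (edx >> 26) & 1;
}

/* Vendor check: read the 12-byte ID string from leaf 0 (EBX, EDX, ECX order). */
static void vendor_string(char out[13])
{
    unsigned int eax, ebx, ecx, edx;
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    memcpy(out + 0, &ebx, 4);
    memcpy(out + 4, &edx, 4);
    memcpy(out + 8, &ecx, 4);
    out[12] = '\0';
}

int main(void)
{
    char vendor[13];
    vendor_string(vendor);

    /* Feature-based dispatch: any CPU that reports SSE2 gets the fast path. */
    int fast_by_feature = has_sse2();

    /* Vendor-based dispatch (the behaviour the article criticizes):
     * only "GenuineIntel" gets the fast path, even if SSE2 is present. */
    int fast_by_vendor = has_sse2() && strcmp(vendor, "GenuineIntel") == 0;

    printf("vendor=%s  feature dispatch: %s  vendor dispatch: %s\n",
           vendor,
           fast_by_feature ? "SSE2 path" : "scalar path",
           fast_by_vendor  ? "SSE2 path" : "scalar path");
    return 0;
}
[/code]

A feature-based dispatcher gives every CPU that reports SSE2 the fast path; a vendor-based one hands AMD the scalar path even when the hardware is perfectly capable. And the article's "2% in four areas ≈ 8%" math checks out: four effects each costing about 2% extra compound to roughly 1.02^4 ≈ 1.08, call it an 8% hit overall.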


Hats off to ExtremeTech - they nailed it!!
 
Last edited:
Feb 19, 2009
10,457
10
76
@RS
Tonga already showed that AMD has boosted tessellation a lot, and their next-gen stuff should improve on that further. So no more cheating is possible from NV via standard DX features; it's all via GameWorks & PhysX moving forward.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
How does AMD stand a chance against that?!

This whole situation has intensified my disgust for NV's tactics. I used to buy NV when they offered a good GPU deal, but I absolutely will not from now on. I will also boycott any GameWorks game that runs poorly on AMD. Making a stand, one gamer at a time.

NV was never a firm with strong ethical standards, though. I guess we were younger and just never noticed, or these things didn't matter as much. Bumpgate, 3DMark03 IQ cheating, bilinear texture filtering optimization cheating with GeForce 5, working with Ubisoft to remove DX10.1 optimizations in AC, the Batman AA cheating fiasco, the Crysis 2 invisible ocean, and now how they treated their customers with the 970.

NV basically threw Sony's PS3 and HP laptops under the bus with bumpgate. It's unthinkable how a firm in the modern age got away with that! Remember how last year they tried to disable mobile dGPU overclocking and MANY Nvidia GPU owners defended NV's decision? The brand attachment is craaaaazy. Pffft.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Currently 125 people have voted and ~84% say "Yes it definitely does", so it's clear that on this forum at least, the overwhelming majority don't trust Nvidia or GameWorks. Unfortunately, I would venture to say that in the casual gaming world (which is most people), they either don't know or don't care and only have a cursory view of things, meaning they will just go on assuming AMD sucks.

I will say this: GW is going to hurt AMD badly, and that's exactly what Nvidia set out to do, so props to Nvidia I guess.
Hats off to ExtremeTech - they nailed it!!
No doubt. Too bad more sites don't talk about this stuff; instead we have the opposite, with people like BrentJ blaming AMD in their reviews. That's something I'd expect from an uneducated gamer, not a review site.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's not a problem, it's a feature.

This is the part that upsets me: I can't just go out and buy a $50 used 650 and add it to my system as a dedicated PhysX GPU, because nV wants me to go out and get a $300+ card like the 970 to be able to use PhysX and game at the same time. I would honestly be OK with PhysX being proprietary if NV just let us combine an AMD GPU for rendering with an NV GPU for PhysX. I think that would actually allow PhysX to take off faster, since developers would be able to build physics via PhysX without fear that Intel/AMD owners would have poor performance.
 