Forbes: 'Gears of War: Ultimate Edition' On PC Is A Disaster For AMD Radeon Gamers


Magee_MC

Senior member
Jan 18, 2010
217
13
81
Dismissing a DirectX 12 title because it runs poorly on AMD seems disingenuous. It is the first major DirectX 12 title released. What remains to be seen is whether AMD continues to struggle in future DirectX 12 titles.

Then again, DX12 won't become the standard for many years, so people may be making much ado about nothing.

I don't think that people are dismissing it because it runs poorly on AMD, but because the game as released is fundamentally broken on AMD cards, and part of that breakage has been definitively shown to be the result of NV GameWorks code inserted into the game.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
Dismissing a DirectX 12 title because it runs poorly on AMD seems disingenuous. It is the first major DirectX 12 title released. What remains to be seen is whether AMD continues to struggle in future DirectX 12 titles.

Then again, DX12 won't become the standard for many years, so people may be making much ado about nothing.
Your opinion is ill-informed, I fear. What seems disingenuous are your statements, because they're empirically false, and this is why...

1. Gears of War: Ultimate Edition uses GameWorks, so the developer was able to work with NVIDIA as the IHV, and not with AMD.

2. NVIDIA received an advance build of Gears of War in order to optimize their drivers in time for a live demonstration of the game during the Xbox One spring showcase: http://news.xbox.com/2016/03/01/xbox-spring-showcase-recap/

3. Not only did AMD not receive an advance build of the game, but they also didn't get to work with The Coalition, due to the nature of the GameWorks IP black box. Confirmed by AMD here: http://wccftech.com/nvidia-gameworks-visual-corruption-gears-war-ultimate-edition/
(UPDATED 03/05/2016 03:35 PM) We've just learned that no one at AMD was informed by Microsoft that review codes were being handed out to journalists for performance testing earlier this week, and that AMD only found out after the fact.


4. The graphical glitches in the game on Radeon hardware stem from the implementation of a GameWorks feature called HBAO+, as confirmed by the developers here: https://gearsofwar.com/en-us/forums...ix/5dc281ac-1975-4fb9-ac4b-3d8824107317/posts
This feature is glitchy because the developers never debugged it with AMD and never even informed AMD that the game was being released.
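
For anyone who hasn't integrated this kind of middleware: here is a hypothetical C++ sketch of how a GameWorks-style effect typically hooks into a renderer (all names are illustrative, not the real GameWorks API). The effect ships as a precompiled library behind an opaque context, which is why only the library's vendor can debug what happens inside the render call.

Code:
// Hypothetical black-box middleware integration -- illustrative names only.
struct AOContext; // opaque handle; the implementation lives inside the vendor's precompiled library

extern "C" AOContext* AOLib_CreateContext(void* d3dDevice);
extern "C" int AOLib_RenderAO(AOContext* ctx, void* depthSRV, void* normalSRV, void* outputRTV);

// The game's side of the integration: a couple of opaque calls per frame.
void RenderAmbientOcclusion(void* device, void* depth, void* normals, void* target) {
    static AOContext* ctx = AOLib_CreateContext(device);
    // If this call glitches on some GPU, only the library's vendor can inspect or
    // patch the shaders inside it; the developer and the other IHV cannot.
    AOLib_RenderAO(ctx, depth, normals, target);
}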


So let's compare this with Oxide's Ashes of the Singularity, shall we?

1. Both NVIDIA and AMD have access to the Ashes of the Singularity source code as confirmed here: http://www.extremetech.com/gaming/2...ashes-of-the-singularity-directx-12-benchmark

2. Both AMD and NVIDIA are able to optimize their drivers for AotS and the game is tested on both architectures.


Conclusion:
While everyone is entitled to an opinion, opinions are like farts... everyone likes the smell of their own brand. Informed opinions are based on sourced information. I have sourced information; have you? No. Therefore, in all likelihood, I am being honest and forthright whilst you appear to be disingenuous. If you have sourced information, please share it with us. If not, then everything you say on this matter is but a mere fart in the wind.
 

Magee_MC

Senior member
Jan 18, 2010
217
13
81
The game is basically a tech showcase for AMD; I don't think it represents what we should expect from future DX12 titles.

I don't see the game as a tech showcase for AMD, but instead as a tech showcase for how DX12 can allow improvements in a new generation RTS game. I agree that it's probably not representative of future DX12 titles, but that is more a function of the style of game than the fact that AMD has a marketing agreement for the game.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Do you have any references or facts to back up that statement?

I haven't found one presentation of this game on nVidia hardware. It is always AMD, and Oxide is always promoting AMD hardware.

As far as I can tell it is flat out misinformation. Oxide has worked to make sure that both NVIDIA and AMD cards can each work as well as possible on Ashes [...].

This is not what they have made public. They don't work with nVidia. They only send them the source code so that nVidia can fix all their problems, without knowing if Oxide would ever put the fixes into the game. There is only one render path in the game, and it is optimized only towards AMD.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Tested by you? Only AMD users can report, not you as an Nvidia user. Period. This patch was for AMD cards.

I tested the scene from the benchmark. :\
And in this spot there is literally no difference between on/off.
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
I haven't found one presentation of this game on nVidia hardware. It is always AMD, and Oxide is always promoting AMD hardware.



This is not what they have made public. They don't work with nVidia. They only send them the source code so that nVidia can fix all their problems, without knowing if Oxide would ever put the fixes into the game. There is only one render path in the game, and it is optimized only towards AMD.

LOL.

They "only send the source code", while nvidia disallows GameWorks developers from even working with AMD, never mind letting AMD suggest fixes. How is that equivalent, or even close to equivalent?

Seriously, an nvidia feature completely breaks on AMD hardware. The feature is not marked in-game as nvidia-only, and AMD were not told that the game was being released. How does a respectable publisher/developer even release a game in such a state? I couldn't care less about nvidia features, but it's obvious that if nVidia were actually interested in improving games rather than gimping AMD, then disallowing any pre-release contact with AMD wouldn't be part of the contract.
 
Feb 19, 2009
10,457
10
76
LOL.

They "only send the source code", while nvidia disallows GameWorks developers from even working with AMD, never mind letting AMD suggest fixes. How is that equivalent, or even close to equivalent?

Indeed, they did not just send source code. They set up a system where NV and AMD are able to submit changes to shaders to better optimize them, and they specifically said this has already occurred: NV's optimized shaders were accepted into the main build. -_-

These are NOT the actions of a biased developer.

A biased developer will do things like NOT give out source code, ever. Not even give the other IHV access to alpha builds to test and optimize. Not even test on AMD GPUs during development (GoW devs with NV HBAO+: oops, we didn't know it would break!). And definitely NOT tell AMD that they plan to release the game to reviewers.
 
Mar 10, 2006
11,715
2,012
126
The game is basically a tech showcase for AMD; I don't think it represents what we should expect from future DX12 titles.

Yep. I am pretty sure AMD paid a pretty penny for this, but of course the "business deals are terrible for PC gaming" crowd will surely turn a blind eye.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Yep. I am pretty sure AMD paid a pretty penny for this, but of course the "business deals are terrible for PC gaming" crowd will surely turn a blind eye.

I'm personally getting tired of the "it's not AMD's fault" meme. AMD are touted as the forefathers of DX12, but as soon as a DX12 game comes out and AMD bombs in it, it's "that isn't a DX12 title" or "NV sabotaged it." While those are valid excuses, you'd figure the forefathers of DX12 would have SOME say in what is essentially the first DX12 game to market ("but it's DX9 code updated to DX12"; yeah, I know, but that isn't what marketing is saying or what mainstreamers are thinking), yet their hardware is dead last in both CPU and GPU.

What a cluster screw. This generation is going to be hideous. MSFT is doing a great job turning everyone away from their consoles and now from the PC front, while AMD gets lambasted with articles such as the one in the OP.
 
Feb 19, 2009
10,457
10
76
Yep. I am pretty sure AMD paid a pretty penny for this, but of course the "business deals are terrible for PC gaming" crowd will surely turn a blind eye.

Here's the thing: AMD features are actually open source. Anyone is able to re-optimize and implement them free of charge. Even YOU (really, you can go download the source and modify it, lol!). So certainly NV can do it if they choose to.

You cannot accuse them of playing dirty when they release all their tools and features freely, without code encryption or black boxes.

Thus, if a game runs poorly on NV GPUs, only NV or the developers are to blame because they are free to optimize it as they see fit.

That is the major difference: GimpWorks libraries can only be optimized by NVIDIA, so how a game runs is determined by NVIDIA.
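
As a concrete contrast with the black-box model: here is a minimal C++ sketch of what an effect shipped as source looks like from the developer's side (the shader is a trivial stand-in, not AMD's actual code; error handling elided). Because the HLSL text lives in the game's build, either IHV's driver team can read it, and anyone can edit and recompile it.

Code:
#include <d3dcompiler.h> // link d3dcompiler.lib
#include <cstdio>
#include <cstring>

// A placeholder AO shader shipped as plain HLSL source. The text is visible,
// so the developer or either IHV can tweak it and rebuild on the spot.
static const char* kAoShader = R"(
Texture2D DepthTex : register(t0);
SamplerState Samp : register(s0);
float4 PSMain(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    float d = DepthTex.Sample(Samp, uv).r; // placeholder occlusion estimate
    return float4(d, d, d, 1.0f);
}
)";

int main() {
    ID3DBlob* code = nullptr;
    ID3DBlob* errors = nullptr;
    HRESULT hr = D3DCompile(kAoShader, strlen(kAoShader), "ao.hlsl",
                            nullptr, nullptr, "PSMain", "ps_5_0", 0, 0,
                            &code, &errors);
    if (FAILED(hr) && errors)
        std::printf("%s\n", (const char*)errors->GetBufferPointer());
    return 0;
}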
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
LOL.
They "only send the source code", while nvidia disallows GameWorks developers from even working with AMD, never mind letting AMD suggest fixes. How is that equivalent, or even close to equivalent?

Nonsense. This is just fan fiction. Developers can work with them. Maybe it is AMD who doesn't care?
And Gears of War is not a GameWorks title. There is no sponsorship.

Seriously, an nvidia feature completely breaks on AMD hardware. The feature is not marked in-game as nvidia-only, and AMD were not told that the game was being released. How does a respectable publisher/developer even release a game in such a state? I couldn't care less about nvidia features, but it's obvious that if nVidia were actually interested in improving games rather than gimping AMD, then disallowing any pre-release contact with AMD wouldn't be part of the contract.

nVidia is doing nothing. Blame Microsoft for pushing their developers.

Why don't you guys accept that not everyone will ignore >80% of the market and optimize only for the <20% of it, like Oxide does?
 
Feb 19, 2009
10,457
10
76
I'm personally getting tired of the "it's not AMD's fault" meme. AMD are touted as the forefathers of DX12, but as soon as a DX12 game comes out and AMD bombs in it, it's "that isn't a DX12 title" or "NV sabotaged it." While those are valid excuses, you'd figure the forefathers of DX12 would have SOME say in what is essentially the first DX12 game to market ("but it's DX9 code updated to DX12"; yeah, I know, but that isn't what marketing is saying or what mainstreamers are thinking), yet their hardware is dead last in both CPU and GPU.

A valid excuse... I am glad at least even you see it that way.

It appears the NV partnership has prevented AMD from having any say: they didn't even get builds to optimize, and they weren't even told that the game would be released and the press given access.

The dirtiest thing is in fact hiding NV's HBAO+ tech behind a generic AO label.

Made worse by the fact that on the Xbone, their AO runs via DX12 async compute.

http://www.eurogamer.net/articles/digitalfoundry-2015-the-making-of-gears-of-war-ultimate-edition

On the GPU side, we've converted SSAO to make use of async compute and are exploring the same for other features, like MSAA.

So they actually had SSAO for Xbone, stripped that for PC, disabled Async Compute, shoved in GimpWorks HBAO+...

Yeah, not shifty at all and people still defend that. Well done guys.
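
For anyone wondering what "converted SSAO to make use of async compute" actually means: here is a minimal D3D12 sketch in C++ (the SSAO pipeline state is hypothetical and left commented out; error handling elided). The AO work is recorded on a separate compute queue so it can run concurrently with the direct (graphics) queue, which is exactly the path the PC port dropped.

Code:
#include <d3d12.h> // link d3d12.lib
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    // Assume the default adapter; a real renderer would pick one explicitly.
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // A second queue of type COMPUTE is what "async compute" means in D3D12:
    // work submitted here may overlap work on the direct (graphics) queue.
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    ComPtr<ID3D12CommandAllocator> alloc;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_COMPUTE, IID_PPV_ARGS(&alloc));
    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COMPUTE, alloc.Get(),
                              nullptr, IID_PPV_ARGS(&list));

    // list->SetPipelineState(ssaoPso.Get());    // hypothetical SSAO compute PSO
    // list->Dispatch(width / 8, height / 8, 1); // one thread group per 8x8 pixel tile
    list->Close();
    ID3D12CommandList* lists[] = { list.Get() };
    computeQueue->ExecuteCommandLists(1, lists);

    // A fence lets the graphics queue wait until the AO results are ready.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1);
    // graphicsQueue->Wait(fence.Get(), 1);      // hypothetical graphics queue
    return 0;
}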
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
A valid excuse... I am glad at least even you see it that way.

It appears the NV partnership has prevented AMD from having any say: they didn't even get builds to optimize, and they weren't even told that the game would be released and the press given access.

The dirtiest thing is in fact hiding NV's HBAO+ tech behind a generic AO label.

Made worse by the fact that on the Xbone, their AO runs via DX12 async compute.

http://www.eurogamer.net/articles/digitalfoundry-2015-the-making-of-gears-of-war-ultimate-edition



So they actually had SSAO for Xbone, stripped that for PC, disabled Async Compute, shoved in GimpWorks HBAO+...

Yeah, not shifty at all and people still defend that. Well done guys.

Maybe someone at AMD should write an angry letter and stuff it full of dollars.

Until AMD can pocket some devs, things aren't going to get better for them.

On the flip side, did you see how terrible their processors did with DX12? Woof. They can't catch a break.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I like how hiding HBAO+ is bad, but hiding that a whole engine is optimized only towards AMD is alright.
 
Feb 19, 2009
10,457
10
76
Maybe someone at AMD should write an angry letter and stuff it full of dollars.

Until AMD can pocket some devs, things aren't going to get better for them.

I don't want to see AMD pocket devs to gimp NV GPUs; that just makes the situation worse. I prefer games like Alien Isolation, where the visuals look great and it runs very well on all hardware.

This is exactly why I despise what PC gaming has become: it's about throwing $ at developers to gimp the other guy. The closed proprietary stuff is pushing an "ecosystem" lock-in like mobiles and consoles. It's the antithesis of PC gaming for a hardware enthusiast: you can't freely select hardware based on its merits, because it's artificially gimped. If they had their way, you'd have to constantly upgrade too, because again, the older gen is artificially gimped.
 
Feb 19, 2009
10,457
10
76
I like how hiding HBAO+ is bad, but hiding that a whole engine is optimized only towards AMD is alright.

They are not hiding it. Releasing source code to all hardware vendors and accepting source changes is not the definition of hiding.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I don't want to see AMD pocket devs to gimp NV GPUs; that just makes the situation worse. I prefer games like Alien Isolation, where the visuals look great and it runs very well on all hardware.

Who said gimp? If tossing a few dollars to a dev prevents a Gears of War situation, do it. Instead, you get Forbes writing an article stating that AMD hardware sucks.

This is exactly why I despise what PC gaming has become: it's about throwing $ at developers to gimp the other guy. The closed proprietary stuff is pushing an "ecosystem" lock-in like mobiles and consoles. It's the antithesis of PC gaming for a hardware enthusiast: you can't freely select hardware based on its merits, because it's artificially gimped. If they had their way, you'd have to constantly upgrade too, because again, the older gen is artificially gimped.

PC gaming? Are you new to digital gaming in general? I own a megaton of consoles so that I can play all the games that come out, because Nintendo/Sony/MSFT are all paying devs to get exclusive content/publishing/etc. for their respective brands. For almost a decade I remember "The Way It's Meant To Be Played" slogans and banners across games I loved, and I would watch as NV blew ATI away in performance.

This isn't new. It never was.
 

Game_dev

Member
Mar 2, 2016
133
0
0
Here's the thing: AMD features are actually open source. Anyone is able to re-optimize and implement them free of charge. Even YOU (really, you can go download the source and modify it, lol!). So certainly NV can do it if they choose to.

You cannot accuse them of playing dirty when they release all their tools and features freely, without code encryption or black boxes.

Thus, if a game runs poorly on NV GPUs, only NV or the developers are to blame because they are free to optimize it as they see fit.

That is the major difference: GimpWorks libraries can only be optimized by NVIDIA, so how a game runs is determined by NVIDIA.

It's hard to take your posts seriously when you use terms like "gimpworks".
 
Feb 19, 2009
10,457
10
76
Who said gimp? If tossing a few dollars to a dev prevents a Gears of War situation, do it. Instead, you get Forbes writing an article stating that AMD hardware sucks.

PC gaming? Are you new to digital gaming in general? I own a megaton of consoles so that I can play all the games that come out, because Nintendo/Sony/MSFT are all paying devs to get exclusive content/publishing/etc. for their respective brands. For almost a decade I remember "The Way It's Meant To Be Played" slogans and banners across games I loved, and I would watch as NV blew ATI away in performance.

This isn't new. It never was.

To you it's about tossing money at devs; but if it comes down to that, NV can toss more $$ at devs and win that game, and the result is a gimped AMD in Gears of War.

As I've said, NV has been playing dirty for a long time, starting with TWIMTBP and now expanding much more aggressively with GimpWorks.

And JHH did this because, if you recall Anandtech's article (http://www.anandtech.com/show/2549/7) from years ago, he feared ATI would do this to NVIDIA: that ATI would bribe devs with their $$ to gimp NVIDIA GPUs.

Anandtech at the time thought this excuse was ridiculous, but JHH is a paranoid man, so he struck first: TWIMTBP!

It's ridiculous when you read their PR statements from back then. When asked why NV lied about feature support (remind you of DX12 async compute?), this is what they said:

If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers.

NVIDIA insists that if it reveals its true feature set, AMD will buy off a bunch of developers with its vast hoards of cash to enable support for DX10.1 code NVIDIA can't run. Oh wait, I'm sorry, NVIDIA is worth twice as much as AMD, who is billions in debt and struggling to keep up with its competitors on the CPU and GPU side. So we ask: who do you think is more likely to start buying off developers to the detriment of the industry?

Never let this quote die. This is historic fact.

But today this happens often; the gimping is real.
 

Magee_MC

Senior member
Jan 18, 2010
217
13
81
I like how hiding HBAO+ is bad, but hiding that a whole engine is optimized only towards AMD is alright.

At least you admit that the NV-coded HBAO+ that corrupted the visuals only on AMD cards was hidden from users' open view. :D

The whole engine is optimized toward AMD? Does that include the code that NV submitted and Oxide accepted into the engine?!?! Do you have any proof that the entire engine, part and parcel, has exactly zero optimization for NV cards, and that Oxide actively tried to hide that fact?

Seriously, it seems that you make a lot of broad, sweeping accusations and statements without any references, proof, or logical chains of evidence. If you don't have any data to support your claims or accusations, you should at least preface your remarks with "In my completely unverified and unsubstantiated opinion,..." That would at least give you some sort of logical cover for your remarks.
 

Magee_MC

Senior member
Jan 18, 2010
217
13
81
If this was truly some diabolical plan, how come it doesn't affect all AMD cards?

Plausible deniability? :D

I don't know that it's a diabolical plan hatched in a black tower at NV headquarters. However, I do know that NV code has once again significantly and negatively affected the performance of AMD cards in a game. I also know that NV contractually prevents a dev from working with AMD to fix the problems that AMD cards have with the NV code. I also know that in this case, from what is being reported, AMD was denied even the chance to keep this game from being fundamentally broken on the majority of their hardware before it was released for review.

It does all come under the heading of circumstantial evidence. There is no single damning piece of evidence that NV is actively working to sabotage the performance of games on AMD cards. However, when the evidence continues to pile up, year after year, at some point the evidence, even if circumstantial, begins to present an overwhelmingly compelling case.
 