Forbes: 'Gears of War: Ultimate Edition' On PC Is A Disaster For AMD Radeon Gamers


Game_dev

Member
Mar 2, 2016
Plausible deniability? :D

I don't know that it's a diabolical plan hatched in a black tower at NV headquarters. However, I do know that NV code has once again significantly hurt the performance of AMD cards in a game. I also know that NV contractually prevents a dev from working with AMD to fix the problems that AMD cards have with the NV code. I also know that in this case, from what is being reported, AMD was denied even a chance to keep this game from shipping fundamentally broken on the majority of their hardware before it was released for review.

It does all come under the heading of circumstantial evidence. There is no single damning piece of evidence that NV is actively working to sabotage the performance of games on AMD cards. However, when the evidence continues to pile up, year after year, at some point the evidence, even if circumstantial, begins to present an overwhelmingly compelling case.

If NVIDIA is in bed with this many developers they must be hung like a horse :)
 

boozzer

Golden Member
Jan 12, 2012
If NVIDIA is in bed with this many developers they must be hung like a horse :)
You forgot the best answer: a big wallet :D You can buy everything with a big wad of cash :sneaky: even if you are George Costanza just coming out of the pool. Hahaha
 

railven

Diamond Member
Mar 25, 2010
To you, it's tossing money at devs. If it comes down to that, NV can toss more $$ at devs than AMD can, they will win that game, and the result is a gimped AMD in Gears of War.

As I've said, NV has been playing dirty for a long time, starting with TWIMTBP and now expanded much more aggressively with GimpWorks.

And JHH did this because, if you recall from Anandtech's article years ago (http://www.anandtech.com/show/2549/7), he feared ATI would do it to NVIDIA first: that ATI would bribe devs with their $$ to gimp NVIDIA GPUs.

Anandtech at the time thought this excuse was ridiculous, but JHH is a paranoid man, so he struck first: TWIMTBP!

It's ridiculous when you read their PR statements from back then. When asked why NV lied about feature support (remind you of DX12 Async Compute?), this is what they said:
[quote not preserved]
Never let this quote die. This is historic fact.

But today this happens often; the gimping is real.

Nvidia made a better business choice. Sucks for AMD. NV is now raking in the cash and spreading their seed.

But I'm now learning this is collusion. So AMD is going to sue Nvidia. Equilibrium will be reached.
 

Bacon1

Diamond Member
Feb 14, 2016
This is not what they have made public. They don't work with nVidia. They only send them the source code so that nVidia can fix all their problems, without knowing if Oxide would ever put the fixes into the game. There is only one renderpath in the game, and it is optimized only towards AMD.

They don't work with Nvidia? Really? Source on that? Because the developers of the game have stated they work closer with Nvidia than with AMD, so going to call bullshit on your comments.

We actually just chatted with Nvidia about Async Compute, indeed the driver hasn't fully implemented it yet, but it appeared like it was. We are working closely with them as they fully implement Async Compute. We'll keep everyone posted as we learn more.

http://www.overclock.net/t/1569897/...ingularity-dx12-benchmarks/2130#post_24379702

Certainly I could see how one might see that we are working closer with one hardware vendor than the other, but the numbers don't really bear that out. Since we've started, I think we've had about 3 site visits from NVidia, 3 from AMD, and 2 from Intel (and 0 from Microsoft, but they never come visit anyone ;(). Nvidia was actually a far more active collaborator over the summer than AMD was. If you judged from email traffic and code check-ins, you'd draw the conclusion we were working closer with Nvidia rather than AMD ;) As you've pointed out, there does exist a marketing agreement between Stardock (our publisher) and AMD for Ashes. But this is typical of almost every major PC game I've ever worked on (Civ 5 had a marketing agreement with NVidia, for example). Without getting into the specifics, I believe the primary goal of AMD is to promote D3D12 titles, as they have also lined up a few other D3D12 games.

Personally, I think one could just as easily make the claim that we were biased toward Nvidia, as the only 'vendor'-specific code is for Nvidia, where we had to shut down async compute. By vendor-specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional, but attempting to use it was an unmitigated disaster in terms of performance and conformance, so we shut it down on their hardware.

http://www.overclock.net/t/1569897/...ingularity-dx12-benchmarks/1200#post_24356995
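For anyone who wants to picture what that "look at the Vendor ID and change the rendering path" part actually looks like, here's a rough C++/DXGI sketch of the idea. This is my own illustration, not Oxide's code; the constants are just the standard PCI vendor IDs.

Code:
#include <dxgi1_4.h>

// Rough sketch (not Oxide's actual code): gate a feature on the GPU
// vendor ID that DXGI reports for the adapter.
bool AllowAsyncCompute(IDXGIAdapter1* adapter)
{
    DXGI_ADAPTER_DESC1 desc = {};
    if (FAILED(adapter->GetDesc1(&desc)))
        return false; // can't identify the adapter, play it safe

    // A driver may *claim* a feature works; if using it performs badly
    // on one vendor, the engine can simply refuse to take that path.
    constexpr UINT kVendorIdNvidia = 0x10DE; // NVIDIA PCI vendor ID
    if (desc.VendorId == kVendorIdNvidia)
        return false; // fall back to a single direct command queue

    return true; // AMD (0x1002), Intel (0x8086), etc. keep async on
}

The point being: a check like that is the entire extent of the "vendor-specific" code they admit to, and it only exists because the driver's own feature reporting couldn't be trusted.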



To this end, we have made our source code available to Microsoft, Nvidia, AMD and Intel for over a year. We have received a huge amount of feedback. For example, when Nvidia noticed that a specific shader was taking a particularly long time on their hardware, they offered an optimized shader that made things faster which we integrated into our code.

http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/

I like how hiding HBAO+ is bad but hiding that a whole engine is only optimized towards AMD is alright.

Optimized only towards AMD, yet they spent more time working with Nvidia than with AMD. Guess that just means Nvidia cards need a crapload more work than AMD ones, eh?

So... you are wrong, you continue to make false statements everywhere, and it's really tiring. Please go troll somewhere else.
 

airfathaaaaa

Senior member
Feb 12, 2016
Nvidia made a better business choice. Sucks for AMD. NV is now raking in the cash and spreading their seed.

But I'm now learning this is collusion. So AMD is going to sue Nvidia. Equilibrium will be reached.
And then you see the cash flow of Nvidia and you realise the truth is far from it ():)
 

sontin

Diamond Member
Sep 12, 2011
They don't work with Nvidia? Really? Source on that? Because the developers of the game have stated they work closer with Nvidia than with AMD, so going to call bullshit on your comments.

"Work closer with nVidia"? They have created this engine on AMD hardware with AMD. Their first presentation to the public was about Mantle and how great it was and how great AMD hardware is. Dont believe every marketing pr.

Optimized only towards AMD, yet they spent more time working with Nvidia than with AMD. Guess that just means Nvidia cards need a crapload more work than AMD ones, eh?

So... you are wrong, you continue to make false statements everywhere, and it's really tiring. Please go troll somewhere else.
So Coalition fixing the AO problem makes Gears of War more optimized towards AMD, because Coalition has to use a different renderpath? :\
 

airfathaaaaa

Senior member
Feb 12, 2016
"Work closer with nVidia"? They have created this engine on AMD hardware with AMD. Their first presentation to the public was about Mantle and how great it was and how great AMD hardware is. Dont believe every marketing pr.

So Coalition fixing the AO problem makes Gears of War more optimized towards AMD, because Coalition has to use a different renderpath? :\
The engine is the same one from Star Swarm, the very engine on which Nvidia was AHEAD of AMD.

It's really fascinating that you keep trying to debunk facts that even Nvidia doesn't try to. I mean, even Nvidia has admitted that they worked hard with Oxide, and yet here you are trying to say otherwise.
 

TheELF

Diamond Member
Dec 22, 2012
At least you admit that the HBAO+ code written by NV, which corrupted the visuals only on AMD cards, was hidden from users' view. :D

The whole engine is optimized toward AMD? Does that include the code that NV submitted and was accepted by Oxide for the engine?!?! Do you have any proof that the entire engine, part and parcel, has exactly zero optimization for NV cards and that Oxide actively tried to hide that fact?

Seriously, it seems that you make a lot of broad, sweeping accusations and statements without any references, proof, or logical chains of evidence. If you don't have any data to support your claims or accusations, you should at least preface your remarks with "In my completely unverified and unsubstantiated opinion,..." That would at least give you some sort of logical cover for your remarks.
Come on, you know he did not mean it that way.
Ashes is totally and completely geared towards running on GCN hardware, AGAINST their original statements.
https://web.archive.org/web/20150306125505/http://www.ashesofthesingularity.com/game/faq
What makes Ashes of the Singularity different from other RTS games?
Until now, terrestrial strategy games have had to substantially limit the number of units on screen. As a result, these RTS's could be described as battles.

Thanks to recent technological improvements such as multi-core processors and 64-bit computing combined with the invention of a new type of 3D engine called Nitrous, Ashes of the Singularity games can be described as a war across an entire world without abstraction. Thousands or even tens of thousands of individual actors can engage in dozens of battles simultaneously.

What are the system requirements for Ashes of the Singularity?
We don’t have a formal list quite yet, but you’ll probably want a quad-core CPU (we care more about cores than clock speed) and a graphics card that isn’t ancient.

Most gamers have the necessary hardware to run Ashes. A DirectX 11, 64-bit system should do it. A DirectX 12-capable system will allow players to turn on the high-fidelity features.

Yup, those "thousands or even tens of thousands of individual actors" were supposed to be the high-fidelity, extra, bonus, not-normal features.
Of course this got thrown out immediately, as soon as they realized that sites would be able to turn "high-fidelity" off for their benchmarks and make AMD's cards look very bad in the process.
Kinda like how sites are able to benchmark GameWorks games with all of the GameWorks features turned off. Funny how Nvidia allows that to happen and AMD doesn't.
 

Hitman928

Diamond Member
Apr 15, 2012
Come on, you know he did not mean it that way.
Ashes is totally and completely geared towards running on GCN hardware, AGAINST their original statements.
https://web.archive.org/web/20150306125505/http://www.ashesofthesingularity.com/game/faq

Yup, those "thousands or even tens of thousands of individual actors" were supposed to be the high-fidelity, extra, bonus, not-normal features.
Of course this got thrown out immediately, as soon as they realized that sites would be able to turn "high-fidelity" off for their benchmarks and make AMD's cards look very bad in the process.
Kinda like how sites are able to benchmark GameWorks games with all of the GameWorks features turned off. Funny how Nvidia allows that to happen and AMD doesn't.

What are you trying to say with this post? I think I know what you're trying to say, but it makes no sense.
 

airfathaaaaa

Senior member
Feb 12, 2016
They are utter crap by now, if you even get them to work at all with a new OS and drivers; probably much slower than even the CPU.
Well, considering that 8 years ago they had the same problems from the same company, it's logical. But who knows; it will just prove once more that the game was rigged.
 

railven

Diamond Member
Mar 25, 2010
And then you see the cash flow of Nvidia and you realise the truth is far from it ():)

Why would AMD pay devs to sabotage their work? I mean, I knew AMD was incompetent, but what you're saying puts them at a completely new level!
 

Tohtori

Member
Aug 27, 2013
Sad to see even AMD fans want AMD to go out of business.

I'd rather see AMD go out of business and Nvidia butchered by an antitrust court than these current anti-consumer practices from Nvidia. Oops, add Intel to the list for butchering too.
 