Does GameWorks influence AMD cards' game performance?


GameWorks: does it penalize AMD cards?

  • Yes, it definitely does

  • No, it's not a factor

  • AMD's fault due to poor dev relations

  • Gaming Evolved does just the same



bowler484

Member
Jan 5, 2014
26
0
0
What do you call a game that allows a 960 to perform 10 to 20% better than a 290X? I call games like that :wub:; you call games like that GameWorks.

That is no different than exclusive turds.

I call it a game that AMD needs an optimization driver for.

Once again, before AMD went complaining to the media, they were getting GameWorks games working just fine. Now they aren't even trying because, quite frankly, they were caught telling fibs about GameWorks repeatedly.

http://www.overclock.net/t/1496631/...e-to-linux-calls-mantle-an-open-source-api/30

A forum user there signed up for GameWorks. What he found directly disputes what AMD says.

I even remember Huddy talking about not being able to optimize. Then AMD came out with a driver a few days later with a 20% fps increase. That stuck in my mind almost as much as this comment from Hilbert at G3D.

I've spoken with Richard Huddy many times in the past, and the one thing that stuck -- always contradict and question what this man says.

Source.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I call it a game that AMD needs an optimization driver for.

Once again, before AMD went complaining to the media, they were getting GameWorks games working just fine. Now they aren't even trying because, quite frankly, they were caught telling fibs about GameWorks repeatedly.

http://www.overclock.net/t/1496631/...e-to-linux-calls-mantle-an-open-source-api/30

A forum user there signed up for GameWorks. What he found directly disputes what AMD says.

I even remember Huddy talking about not being able to optimize. Then AMD came out with a driver a few days later with a 20% fps increase. That stuck in my mind almost as much as this comment from Hilbert at G3D.



Source.

Got a link to where they tell you how to optimize GPU PhysX on AMD? They need it for Project CARS.
 

bowler484

Member
Jan 5, 2014
26
0
0
Dude.... Guy... person... stop. Nvidia is not a deity. Please, if you have a shrine... get help. If AMD starts doing things like that, Nvidia could well die, because AMD's tech is simply better. I don't know how it would make sense though, since right now they are so open with everything.

E.g. TressFX: HairWorks does not look good on human characters, and I would bet AMD's fur tech looks better as well. If Tomb Raider's hair could not be used on Nvidia cards, that's damage to Nvidia's rep (based on the way a lot of people seem to think). Some would realize it's AMD being mean, but A LOT would think "oh, Nvidia sucks." FreeSync is another. If it costs manufacturers next to nothing to put it in, it would overtake G-Sync to no end. And if only AMD could be used with it... over 9000 damage to Nvidia's reputation (based on how a lot of people seem to think).

HBM is another thing.

How many years has Nvidia had PhysX? Who really cares about PhysX? (They didn't even make it, they just bought it.) Nvidia is not as good as some people want to think. Could they even make a good graphics-performing chip that also had good compute without blowing up your house?

I wouldn't mind if Nvidia was replaced by another company like AMD, but I wouldn't want there to be one company. I like being able to choose the best option, like I thought I did when I got a 970.

This is a joke, right? Not automatically throwing someone under the bus qualifies as worship around here somehow? Forgive me for wanting more proof than conjecture and rumor.

All this great stuff about AMD and how they could kill Nvidia? Hell, go to it, AMD, please. You don't have to kill them, you just need to cripple them enough to get market share back.

You do realize that Hynix would have told AMD to get lost if AMD wanted to dictate who HBM was sold to, right? Right?

And I'm not going to bother commenting on FreeSync, because everybody knows that in its current form it cannot touch G-Sync. With better scalers from monitor makers, perhaps someday.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Not really sure you can call that being caught lying. If AMD says they can't see the source code, posting that developers are given instructions on how to call functions does not change the fact that the code is hidden. One would expect

An entire library of nVidia's features, with an explanation of how to call them. It's really that simple.

because that's what an API needs in order to be used properly. It was, or should have been, very clear to anybody that this would be true. A closed box can still accept input and spit out output; you just don't know how it's doing it.
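That closed-box point can be made concrete with a minimal C++ sketch. The names below are hypothetical, modeled on how a GameWorks-style SDK is consumed, not the real API; the stub body at the bottom stands in for the vendor's prebuilt binary so the example compiles and runs:

```cpp
#include <cstdio>

// --- Vendor-supplied header (hypothetical names, not the real SDK) ---
struct HairParams {
    float width;        // strand width in scene units
    int   strandCount;  // number of simulated strands
};

// The header documents how to call the feature...
void SimulateHair(const HairParams& params);

// --- Application code: the developer calls the documented entry point ---
int main() {
    HairParams p{0.02f, 10000};
    SimulateHair(p);  // input goes in, output comes back
    return 0;
}

// --- Stand-in for the vendor's prebuilt .lib/.dll, included here only so
// the sketch links. In a real closed SDK this body ships compiled, and it
// is exactly the part nobody outside the vendor gets to read. ---
void SimulateHair(const HairParams& params) {
    std::printf("simulated %d strands of width %.3f\n",
                params.strandCount, params.width);
}
```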

I call it a game that AMD needs an optimization driver for.

Once again, before AMD went complaining to the media, they were getting GameWorks games working just fine. Now they aren't even trying because, quite frankly, they were caught telling fibs about GameWorks repeatedly.

http://www.overclock.net/t/1496631/...e-to-linux-calls-mantle-an-open-source-api/30

A forum user there signed up for GameWorks. What he found directly disputes what AMD says.

I even remember Huddy talking about not being able to optimize. Then AMD came out with a driver a few days later with a 20% fps increase. That stuck in my mind almost as much as this comment from Hilbert at G3D.



Source.
 

lilltesaito

Member
Aug 3, 2010
110
0
0
I call it a game that AMD needs an optimization driver for.

Once again, before AMD went complaining to the media, they were getting GameWorks games working just fine. Now they aren't even trying because, quite frankly, they were caught telling fibs about GameWorks repeatedly.

http://www.overclock.net/t/1496631/...e-to-linux-calls-mantle-an-open-source-api/30

A forum user there signed up for GameWorks. What he found directly disputes what AMD says.

I even remember Huddy talking about not being able to optimize. Then AMD came out with a driver a few days later with a 20% fps increase. That stuck in my mind almost as much as this comment from Hilbert at G3D.



Source.

Did anyone click the Download button and read the T&C?

4. Restrictions. You will not, and will not permit others to: (a) modify, translate, decompile, bootleg, reverse engineer, disassemble, or extract the inner workings of any portion of PhysXLab, (b) copy the look-and-feel or functionality of any portion of PhysXLab; (c) remove any proprietary notices, marks, labels, or logos from PhysXLab or any portion thereof; (d) rent, transfer or use as a service bureau all or some of PhysXLab without NVIDIA’s prior written consent, subject to the requirements of this Agreement; (e) utilize any computer software or hardware which is designed to defeat any copy protection device, should PhysXLab be equipped with such a protection device; or (f) use PhysXLab in any manner that would cause PhysXLab to become subject to an Open Source License. "Open Source License" includes, without limitation, a software license that requires as a condition of use, modification, and/or distribution of such software that PhysXLab be (i) disclosed or distributed in source code form; (ii) be licensed for the purpose of making derivative works; or (iii) be redistributable at no charge. Unauthorized copying of PhysXLab, or failure to comply with any of the provisions of this Agreement, will result in automatic termination of this license.

Also, I am pretty sure AMD cannot get a license to use it.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Did anyone click the Download button and read the T&C?

4. Restrictions. You will not, and will not permit others to: (a) modify, translate, decompile, bootleg, reverse engineer, disassemble, or extract the inner workings of any portion of PhysXLab, (b) copy the look-and-feel or functionality of any portion of PhysXLab; (c) remove any proprietary notices, marks, labels, or logos from PhysXLab or any portion thereof; (d) rent, transfer or use as a service bureau all or some of PhysXLab without NVIDIA’s prior written consent, subject to the requirements of this Agreement; (e) utilize any computer software or hardware which is designed to defeat any copy protection device, should PhysXLab be equipped with such a protection device; or (f) use PhysXLab in any manner that would cause PhysXLab to become subject to an Open Source License. "Open Source License" includes, without limitation, a software license that requires as a condition of use, modification, and/or distribution of such software that PhysXLab be (i) disclosed or distributed in source code form; (ii) be licensed for the purpose of making derivative works; or (iii) be redistributable at no charge. Unauthorized copying of PhysXLab, or failure to comply with any of the provisions of this Agreement, will result in automatic termination of this license.

Also, I am pretty sure AMD cannot get a license to use it.
I can't wait for his response to this :cool:
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Many professionals addressed many issues of PhysX a long time ago. I am not sure if NV fixed a lot of the underlying issues of PhysX, but here is a good article from 2010 about how broken PhysX was just 5 years ago when it came to running it on x86 CPUs:

"A new investigation by David Kanter at Realworldtech adds to the pile of circumstantial evidence that NVIDIA has apparently crippled the performance of CPUs on its popular, cross-platform physics acceleration library, PhysX. If it's true that PhysX has been hobbled on x86 CPUs, then this move is part of a larger campaign to make the CPU—and Intel in specific—look weak and outdated. The PhysX story is important, because in contrast to the usual sniping over conference papers and marketing claims, the PhysX issue could affect real users.

We talked to NVIDIA today about Kanter's article, and gave the company a chance to air its side of the story. So we'll first take a look at the RWT piece, and then we'll look at NVIDIA's response.

Oh my God, it's full of cruft

When NVIDIA acquired Ageia in 2008, the GPU maker had no intention of getting into the dedicated physics accelerator hardware business. Rather, the game plan was to give the GPU a new, non-graphics, yet still gaming-oriented advantage over the CPU and over ATI's GPUs. NVIDIA did this by ditching Ageia's accelerator add-in board and porting the platform's core physics libraries, called PhysX, to NVIDIA GPUs using CUDA. PhysX is designed to make it easy for developers to add high-quality physics simulation to their games, so that cloth drapes the way it should, balls bounce realistically, and smoke and fragments (mostly from exploding barrels) fly apart in a lifelike manner. In recognition of the fact that game developers, by and large, don't bother to release PC-only titles anymore, NVIDIA also wisely ported PhysX to the leading game consoles, where it runs quite well on console hardware.

If there's no NVIDIA GPU in a gamer's system, PhysX will default to running on the CPU, but it doesn't run very well there. You might think that the CPU's performance deficit is due simply to the fact that GPUs are far superior at physics emulation, and that the CPU's poor showing on PhysX is just more evidence that the GPU is really the component best-equipped to give gamers realism.

Some early investigations into PhysX performance showed that the library uses only a single thread when it runs on a CPU. This is a shocker for two reasons. First, the workload is highly parallelizable, so there's no technical reason for it not to use as many threads as possible; and second, it uses hundreds of threads when it runs on an NVIDIA GPU. So the fact that it runs single-threaded on the CPU is evidence of neglect on NVIDIA's part at the very least, and possibly malign neglect at that.

But the big kicker detailed by Kanter's investigation is that PhysX on a CPU appears to exclusively use x87 floating-point instructions, instead of the newer SSE instructions.


...

Full story

I am pretty sure that later versions of PhysX (3.0+) introduced CPU multi-threading, but the key theme was that NV always intended to use PhysX as a key competitive/marketing advantage vs. running it on Intel/AMD setups. NV never intended to improve PC gaming as a whole for all PC gamers by selling dedicated PhysX cards and pushing physics in PC gaming. It's pretty clear from day 1 that NV just cared about $ first and not the betterment of making PC gaming more realistic for the PC gaming community. That's exactly why to this date we can't buy stand-alone NV cards to use as PhysX cards in systems with Intel/AMD GPUs, and why NV keeps PhysX closed-proprietary on the PC.
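For anyone who hasn't followed the x87-versus-SSE point the article turns on: x87 is the legacy stack-based FPU instruction set that processes one float at a time, while SSE processes four packed single-precision floats per instruction. A rough sketch of the same loop both ways, illustrative only and not PhysX code:

```cpp
#include <xmmintrin.h>  // SSE intrinsics
#include <cstdio>

// Scalar loop: when compiled for x87 (e.g. 32-bit x86 with -mfpmath=387),
// every add goes through the one-value-at-a-time FPU stack.
float sum_scalar(const float* v, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; ++i) s += v[i];
    return s;
}

// SSE loop: four packed single-precision adds per instruction.
// (Assumes n is a multiple of 4 to keep the sketch short.)
float sum_sse(const float* v, int n) {
    __m128 acc = _mm_setzero_ps();
    for (int i = 0; i < n; i += 4)
        acc = _mm_add_ps(acc, _mm_loadu_ps(v + i));
    float lanes[4];
    _mm_storeu_ps(lanes, acc);
    return lanes[0] + lanes[1] + lanes[2] + lanes[3];
}

int main() {
    float v[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    std::printf("scalar: %f  sse: %f\n", sum_scalar(v, 8), sum_sse(v, 8));
    return 0;
}
```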



So much misinformation, a lot of which was caused by AMD -- actually, even with early PhysX, a developer could add multi-threading.

Our PhysX SDK API is designed such that thread control is done explicitly by the application developer, not by the SDK functions themselves. One of the best examples is 3DMarkVantage which can use 12 threads while running in software-only PhysX. This can easily be tested by anyone with a multi-core CPU system and a PhysX-capable GeForce GPU. This level of multi-core support and programming methodology has not changed since day one. And to anticipate another ridiculous claim, it would be nonsense to say we “tuned” PhysX multi-core support for this case.

http://physxinfo.com/news/1757/nvidia-in-responce-to-amd-physx-is-multi-threaded/
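What NVIDIA's statement describes, in API terms, is a library that leaves threading to the caller instead of spawning threads itself. A minimal sketch of that division of labor, with hypothetical function names rather than the actual PhysX SDK:

```cpp
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical stand-in for one SDK simulation call -- not the actual
// PhysX API. The point it illustrates: this function never spawns threads.
void StepRigidBodies(int first, int count, float dt) {
    // ... integrate bodies [first, first + count) forward by dt ...
    std::printf("worker stepped bodies %d..%d\n", first, first + count - 1);
}

int main() {
    const int   bodies  = 1000;
    const float dt      = 1.0f / 60.0f;
    const int   workers = std::max(1u, std::thread::hardware_concurrency());

    // Thread control is explicit and application-side: the engine decides
    // how many threads to spawn and how to partition the scene among them.
    std::vector<std::thread> pool;
    const int chunk = bodies / workers;
    for (int w = 0; w < workers; ++w) {
        int first = w * chunk;
        int count = (w == workers - 1) ? bodies - first : chunk;
        pool.emplace_back(StepRigidBodies, first, count, dt);
    }
    for (auto& t : pool) t.join();
    return 0;
}
```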
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Instead of crying about it, maybe AMD should scrape together some funds to create their own program like GameWorks? And perhaps it wouldn't be a bad idea to also have better dev relations, as their rep is pretty poor throughout the industry. But nope, that won't happen, AMD will just keep crying -- welcome to capitalism, it's not easy and may the top dog win.

I dunno if you missed it, but AMD sponsored a lot of AAA titles in recent years. 2013 was a golden year for gaming thanks to AMD.
[Image: NSFTiers_575px.jpg]

But guess what: those games run without a problem on both sides. Sometimes even better on the opponent's hardware.

Would you be OK with all those games running at 20-30 fps no matter what? Everyone blaming NV drivers and refusing a suggested patch with the fix in it? Because that is what happened to every GameWorks title so far.

I vote with my wallet, and I spent $0 on each of those games.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
AMD could create their own program, but it would not split the market, because AMD doesn't own half the market. They are a minor 20%-share player and are shrinking more and more every day, to the point of being irrelevant. That's why I don't see GameWorks or NVIDIA exclusivity mattering in the long run, because at the end of the day there can only be one, and it will be NVIDIA. Anyone who doesn't have blinders on can see this clear as day. I'm not saying this because I prefer NVIDIA, but rather based on the reality of the market.

1. So you advocate that PC game development truly becomes about who throws more $ at developers, and that PC gaming should be like console exclusives, where I own an AMD card for AMD-favoured/coded titles and an NV card for NV-favoured/coded titles? No thanks. :colbert:

2. I hope you have shorted AMD stock and have tens of thousands of puts, since you can see that AMD will go bankrupt, which means you can become a millionaire based on your prediction. Maybe you can buy all of us $2000 Titan cards every 2 years to make us forget that competition actually drives innovation and keeps prices in check.

Also, interesting how in your world AMD's market share will just continue to decline indefinitely until they are bankrupt. I guess according to you AMD will never go above 20% market share in GPUs? Oh right, you are that guy who had AMD CF issues on laptops and automatically correlated that to AMD's drivers sucking for single GPUs and desktop CF. Genius conclusions and insights, when everyone knew that AMD's laptop CF drivers were completely different in quality from their desktop drivers as far as scaling/Enduro went. :rolleyes:

Sounds like a pretty standard software EULA to me... What are you upset about?

This thread is pointless and will lead nowhere, and your point proves it, because some NV owners believe that putting a proprietary, closed-source GW SDK or PhysX extensions - which can never be altered, modified or optimized by AMD/Intel for their GPUs - into games is perfectly OK.

The moral of the story here is that because NV owners have never been affected by unfair competitive business practices, and no firm ever tried to buy out developers with a program like GW aimed against them, it's not NV gamers' problem. Haha.

Not surprised for a second at all the NV defending and cheerleading over the last decade (bumpgate, the 970 fiasco, AC DX10.1, Batman AA, Kepler drivers, closed PhysX), considering the support everywhere for closed-source G-Sync with its G-Sync module premium. It's becoming pretty obvious that the majority of NV users on our forum will never buy AMD, ever, and either wish for AMD to go bankrupt OR only want AMD to produce competing products so that they can buy NV cards cheaper.

Let's not forget the constant comments about how AMD's Silver and Gold game bundles were viewed as "AMD is so desperate to bundle games because no one wants their cards" versus today's "the 970 and 980 are awesome values because they come with free games like TW3 and Batman AK!!! What an awesome promotion by NV".
 
Last edited:

Spanners

Senior member
Mar 16, 2014
325
1
0
Instead of crying about it, maybe AMD should scrape together some funds to create their own program like GameWorks? And perhaps it wouldn't be a bad idea to also have better dev relations, as their rep is pretty poor throughout the industry. But nope, that won't happen, AMD will just keep crying -- welcome to capitalism, it's not easy and may the top dog win.

Where have AMD been seen crying about GameWorks? You say it twice; are you sure you're not conflating stuff people have said here with stuff AMD has said? I don't recall AMD making any comments, let alone "crying".

Can't disagree with better dev relations being a positive thing. I suppose they have fewer resources and less influence in that regard though; welcome to capitalism and all that.

I've owned around double the number of Nvidia cards vs AMD over the years, but I'd drop PC gaming in a heartbeat if it becomes some GameWorks vs Gaming Evolved battleground. I hope AMD don't go down this road (maybe they couldn't make it work anyway), and I find it baffling that you think this could be anything other than a negative for the gaming industry in general.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Let's get one thing straight: the developer is way more to blame than NV here, since the developer has the power to make game engine decisions that shape the entire game. However, once the developer decided to implement PhysX for the physics engine, it was NV that partnered with SMS and started marketing the game. Whatever a corporation does today sends a signal about what they stand for. Saying "oh well, NV had nothing to do with it, the developer screwed the entire game" dismisses that corporations have a public image, and whatever partnerships NV/AMD engage in, their corporate image stands alongside the end product, does it not?

When AMD openly and publicly states that they have shared all their source code with Nvidia or anyone else, and that any of that code can be optimized, while NV stays mum and pretends not to care, what does that tell you?

Does it look to you like the developer just did all the work on PhysX on his own with 0 help from NV?

[Image: cars.jpg]


Did you not realize what the topic of the thread is? Did AMD ever side with a developer and help them include proprietary closed-source code that could ONLY be run on AMD's GPUs but forced Intel/NV GPUs to run it on the CPU?

Yes or No?

Did AMD ever work with a developer to provide proprietary closed-source GPU code that directly favoured their GPU architectures and could never be optimized, altered or modified by NV/Intel?

Yes or No?

Dude, NV is literally putting its face/brand logo on a game that from the get-go was designed to alienate Intel/AMD PC gamers. Do you know of any game on this planet that was made from day 1 knowing all Intel/AMD GPU setups would be at a distinct disadvantage because of its inherent design? What happens when a person with an Intel APU decides to play this game in 4-5 years and gets 2-3X less performance than an NV GPU of a similar class? Too bad, I guess?

It doesn't get much more obvious than this. I guess you are OK with such business practices because you only buy NV GPUs, so who cares, right? Millions of other PC gamers worldwide with Intel and AMD GPUs - who cares about them - it's their fault for not buying NV, right?! I guess we should all just not care, buy NV, and let the PC gaming industry become a monopoly = $550 mid-range GPUs, $1000 flagships. Go Premiums!

1. That AMD has little to no business sense.

2. If AMD shares everything openly with NV, why should I ever buy AMD? What differentiates them from NV? What exclusive features do they offer?

3. From what I have seen presented here, it doesn't look like that's the case.

4. NV didn't create the game, the devs did. If they don't care about the AMD user base, why should NV?

5. About premiums: AMD used to charge $1000 for their top CPU too; they don't now because they can't.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
AMD did try this - what do you think Mantle was? I would argue that's a lot more extreme than GameWorks. AMD can make GameWorks features work for them - they use standard DX software calls, it's just harder to optimize. Nvidia couldn't use Mantle at all. Also, shouldn't AMD hold all the ace cards, since they power the consoles? That's what the same people commenting here were parroting for a good year.

Really this just comes down to manpower - Nvidia has the manpower to help devs and develop features to promote their cards. AMD barely has enough to do the basic drivers. That is an unfortunate side effect of AMD screwing up for as long as anyone can remember, and today reaching a position where they are starting to lose the ability to compete on equal terms.
 
Feb 19, 2009
10,457
10
76
@RS

You should understand the gaming masses don't care about the details or inner workings. They just care about whether the games they want to play run well on their hardware.

Look at the Steam forum for Project CARS: every time an AMD user complains about poor performance, NV users pile in and make statements like "you get what you pay for, cheapskate" or "solution: buy NV" or "how did saving a few $ work out for you?".

AMD giving everything away for free, open source, is the worst way to get a return on the $ they invested coming up with those features.

Ultimately, one cannot deny (unless one is too ignorant to see reality) that GameWorks is bad news for AMD. Its goal is to give NV an advantage, at the expense of AMD. Is it wrong? NO, it's just business. Is it a good use of marketing $? YES, NV can charge a big premium for their hardware due to the masses equating AMD with trash drivers. Is it going to kill AMD? YES. It's doing a bang-up job of it.
 
Feb 19, 2009
10,457
10
76
AMD did try this - what do you think Mantle was? I would argue that's a lot more extreme than GameWorks...

Really this just comes down to manpower - Nvidia has the manpower to help devs and develop features to promote their cards. AMD barely has enough to do the basic drivers. That is an unfortunate side effect of AMD screwing up for as long as anyone can remember, and today reaching a position where they are starting to lose the ability to compete on equal terms.

Been through this before. Show me an AMD game with Mantle that ran poorly on an NV GPU.

You can't find one, because it doesn't exist.

The existence of Mantle in a game does not stop the developers from optimizing DX11 for NV, because there's no closed-source code in that DX11 path that would hinder NV's ability to optimize. GameWorks via DX11 runs on AMD/NV, but it's closed source and obfuscated, so only one vendor can optimize it easily: NV.

Also, competing on equal terms would be this: all open-source features, or all closed-source features. Not AMD being open while NV is closed source. You could say it's AMD's fault for not going with closed source when they win a GE contract. But you can't say they are competing on equal terms... that's quite a twisted view you have if you believe it.
 
Last edited:

Leadbox

Senior member
Oct 25, 2010
744
63
91
AMD did try this - what do you think Mantle was? I would argue that's a lot more extreme than GameWorks. AMD can make GameWorks features work for them - they use standard DX software calls, it's just harder to optimize. Nvidia couldn't use Mantle at all. Also, shouldn't AMD hold all the ace cards, since they power the consoles? That's what the same people commenting here were parroting for a good year.

Really this just comes down to manpower - Nvidia has the manpower to help devs and develop features to promote their cards. AMD barely has enough to do the basic drivers. That is an unfortunate side effect of AMD screwing up for as long as anyone can remember, and today reaching a position where they are starting to lose the ability to compete on equal terms.

Hardly the same thing; Mantle in NO way impacts the DX render path for Nvidia. So no, they did not try this.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
The moral of the story here is that because NV owners have never been affected by unfair competitive business practices, and no firm ever tried to buy out developers with a program like GW that it's not NV's gamer's problem. Haha.

Well, NVIDIA users had a lot to say about Mantle, but unlike GW games on AMD hardware, Mantle games run flawlessly on NVIDIA hardware in DX11 mode. :whiste:

Let's not forget the constant comments about how AMD's Silver and Gold game bundles were viewed as "AMD is so desperate to bundle games because no one wants their cards" versus today's "the 970 and 980 are awesome values because they come with free games like TW3 and Batman AK!!! What an awesome promotion by NV".

Very true, goalposts changing all the time depending on which company has the bundle.
But I do hope both AMD and NVIDIA will continue with more game bundles in the future; it is good for all PC gamers ;)
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Hardly the same thing; Mantle in NO way impacts the DX render path for Nvidia. So no, they did not try this.

Sure it does. If the devs sabotage the DX path - for example, by increasing the number of draw calls - then it will impact performance on nVidia hardware. Eventually nVidia saw this and optimized their DX driver for less overhead.

It is quite ironic that a proprietary API is praised as the next big thing, but standard DX libraries, which work on every piece of DX-based hardware, are the devil. :rolleyes:
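The draw-call premise itself is sound in the abstract: under DX11 each call carries a fixed CPU-side driver cost, so per-object submission burns far more CPU than batching or instancing. A toy sketch of the arithmetic, with assumed overhead numbers that are purely illustrative:

```cpp
#include <cstdio>

// Illustrative numbers only: assume each draw call costs a fixed slice of
// CPU time in the driver (validation, state binding) regardless of how
// much geometry it submits. Real costs vary by driver, state, and API.
int main() {
    const double usPerDrawCall = 20.0;   // assumed fixed driver overhead
    const int    objects       = 10000;  // identical objects in the scene

    // Naive path: one draw call per object.
    const double naiveMs = objects * usPerDrawCall / 1000.0;

    // Instanced path: one draw call submits all identical objects at once.
    const double instancedMs = 1 * usPerDrawCall / 1000.0;

    std::printf("naive:     %.1f ms of CPU per frame\n", naiveMs);
    std::printf("instanced: %.2f ms of CPU per frame\n", instancedMs);

    // A 60 fps frame budget is ~16.7 ms, so the naive path alone blows it.
    // That is why per-call overhead (and low-overhead APIs/drivers) matters.
    return 0;
}
```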
 
Feb 19, 2009
10,457
10
76
Sure it does. If the devs sabotage the DX path - for example, by increasing the number of draw calls - then it will impact performance on nVidia hardware. Eventually nVidia saw this and optimized their DX driver for less overhead.

*IF* the devs sabotage the DX path.

Reality is, no AMD GE devs did that, and every single game with Mantle has run very well on NV with DX.

The thing is, the reverse actually happened. NV-sponsored devs increased the amount of tessellation to destroy AMD performance, for no visual gain, since it's invisible or applied to flat surfaces. Every single GW game has released with poor performance on AMD, requiring official patches to resolve.

So on one hand, we have a hypothetical "IF" of Mantle devs sabotaging NV... on the other hand, we have ACTUAL GW devs sabotaging AMD.

Seems pretty concrete to me that your view of reality is pretty anti-AMD, without any basis.
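To put rough numbers on the tessellation complaint: with uniform tessellation, the triangle count per patch grows roughly with the square of the edge factor, so raising the factor from 8 to 64 multiplies the geometry by about 64x. A back-of-the-envelope sketch using an approximate formula and an assumed patch count:

```cpp
#include <cstdio>

// Rough approximation: a uniformly tessellated quad patch with edge factor
// f yields on the order of 2*f*f triangles. Exact counts depend on the
// tessellator mode; this only shows how the cost scales.
long long approxTriangles(int factor) {
    return 2LL * factor * factor;
}

int main() {
    const long long patches = 5000;  // assumed scene patch count, illustrative
    for (int f : {8, 16, 64}) {
        std::printf("factor %2d: ~%6lld tris/patch, ~%9lld tris/scene\n",
                    f, approxTriangles(f), approxTriangles(f) * patches);
    }
    // Going from factor 8 to 64 is ~64x the geometry. If the extra
    // triangles are sub-pixel or lie on a flat surface, the image doesn't
    // improve, but every GPU still pays for them -- and architectures with
    // less tessellation throughput pay more.
    return 0;
}
```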
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
AMD paid Eidos to sabotage gaming on nVidia hardware, too. Why do you ignore this?

The reason that nVidia has fewer problems in these Mantle-supported games is that they optimized their DX driver. Something AMD could have done in the past.

There is a huge difference between "care" and "do not care".
 
Feb 19, 2009
10,457
10
76
AMD paid Eidos to sabotage gaming on nVidia hardware, too. Why do you ignore this?

The reason that nVidia has fewer problems in these Mantle-supported games is that they optimized their DX driver. Something AMD could have done in the past.

1. Proof?

2. No, the reason is always down to the developers themselves. The devs have optimized for both. But having open-source features, where developers can easily understand the code and optimize, makes the task much easier for the devs. You wanna see what happens when the devs don't optimize for NV? http://forums.anandtech.com/showthread.php?t=2430931 (there's a WHOLE list of games where NV performance is crap; scroll down).
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Sure it does. If the devs sabotage the DX path - for example increase the amount of draw calls - then it will impact the performance on nVidia hardware. Eventually nVidia saw this and optimized their DX driver for less overhead.

It is quite ironic that a proprietary API is praised as the next big thing but standard DX libaries, which are working on every dx based hardware, is the devil. :rolleyes:

That's a pretty big if. You sure you've got enough tinfoil to cover your head and your imagined scenario there?
 
Feb 19, 2009
10,457
10
76
That's a pretty big if. You sure you've got enough tinfoil to cover your head and your imagined scenario there?

Yeah, I thought that was odd... his imagined AMD evil move is Mantle sabotaging DX for NV, which NEVER HAPPENED.

But the reverse (GameWorks cripples AMD) has happened and continues to happen.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
1. Proof?

The podcast from last year where nVidia accused AMD of a contract under which Eidos wasn't allowed to support nVidia with the release version of the game prior to launch.
And AMD used a broken game to compare their cards with nVidia cards. :eek:


2. No, the reason is always down to the developers themselves. The devs have optimized for both. But having open-source features, where developers can easily understand the code and optimize, makes the task much easier for the devs. You wanna see what happens when the devs don't optimize for NV? http://forums.anandtech.com/showthread.php?t=2430931 (there's a WHOLE list of games where NV performance is crap; scroll down).

So, that is your only argument? Certain games run better on AMD hardware?! :rolleyes:
Call of Duty isn't a GameWorks game and runs better on nVidia hardware. Or Total War: Attila.

What about Evolve and Mordor? Both are GameWorks games, and both run equally well or better on AMD cards.

Maybe it isn't just the developer.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Sure it does. If the devs sabotage the DX path - for example, by increasing the number of draw calls - then it will impact performance on nVidia hardware. Eventually nVidia saw this and optimized their DX driver for less overhead.

It is quite ironic that a proprietary API is praised as the next big thing, but standard DX libraries, which work on every piece of DX-based hardware, are the devil. :rolleyes:

Interestingly enough, Battlefield, the poster child for Mantle, also supported DX11 MT, something that AMD doesn't support. How's that for sabotaging the competition? Also, FreeSync monitors support HDMI 2.0, another feature that no AMD card supports but nVidia does. Those rascals at AMD, using such underhanded tactics. ;)
 