[H] ASUS ROG Poseidon GTX 980 Platinum vs. AMD R9 295X2

Page 3

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
AMD created Mantle so that games run faster on their CPUs/GPUs; it was a black box to NV too, so they needed something to fight Mantle, and GW is the answer.
Could you have picked a worse example? Mantle became DX12 and Vulkan, which will benefit Nvidia just as much as AMD. Even when Mantle was AMD-only, it didn't cripple the DX11 code path on Nvidia hardware in any way.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
The crux of the issue is this: if the developer were using a pure DX code path, we would not be seeing any major issues, and if we did, then clearly the problem would lie with the drivers of the company in question.

Once you start to introduce code paths that stray from the normal DX way, such as a specific library or GW tweaks that can't be shared with anyone outside of GW partners, that is where issues arise.

It really isn't that difficult to comprehend what is going on.

Not that long ago, 3dfx was doing almost the same thing, and when 3dfx wrappers came out showing that non-3dfx cards were capable of doing what 3dfx did just as fast or faster, they had a hissy fit and started suing everyone who made the wrappers.
Wonder what would happen if someone made a wrapper for GW games...
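For what it's worth, the wrapper idea is conceptually simple: export the same entry points the game already calls and service them with a vendor-neutral implementation, much like the old 3dfx wrappers did. A minimal C++ sketch of that interception trick is below; the function name and signature are entirely hypothetical, invented for illustration, and not part of any real GW library.

```cpp
// Minimal sketch of the wrapper idea floated above, in the spirit of the old
// 3dfx wrappers: export the same entry point a game already calls, but route
// it to a generic implementation. The API here is entirely hypothetical, not
// any real GW interface.
#include <cstdio>

// Signature the game was built against (assumed for illustration).
extern "C" int FX_SimulateHair(int strandCount, float timeStepSeconds)
{
    // A real wrapper would re-implement the effect with vendor-neutral
    // shaders; here we just prove the call can be intercepted and serviced.
    std::printf("wrapper: simulating %d strands, dt=%.3fs on a generic path\n",
                strandCount, timeStepSeconds);
    return 0;  // report success to the calling game
}

int main()
{
    // Stand-in for the game invoking the library it shipped against.
    FX_SimulateHair(10000, 0.016f);
}
```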
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
The moment the individual reviewer from [H] started posting here and showed his bias and lack of knowledge, you realize that reviewers nowadays are just regular people with better PR connections to get free hardware to test, no more.

Having normal posters lecture him on what GW entails for the developer and how it affects vendors other than NV only shows the unprofessionalism of reviewers nowadays. Worse yet, users take biased reviews as gospel, which only makes things worse.

We often ridicule clickbait articles based on supposed news that is usually third-hand rumor (wccf, ahem). Well, biased reviews like the one mentioned in the OP should be treated similarly. Such hideous "journalism" is no better than what you see in other areas, such as sports journalism and paparazzi coverage.

Tech sites should get a better idea of their influence on the mindset of potential hardware buyers and how such an unfortunate chain of badly composed, hyperbole-bloated, bias-driven paragraphs can affect the brand value of the hardware discussed. Heck, we can even argue whether the usual posts seen from the OP are just a product of such bad journalism.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Get your facts straight,

1: AMD Mantle doesn't prohibit NVIDIA from optimizing the game in DX11 mode.

2: NVIDIA GameWorks prohibits AMD from optimizing the game in DX11 mode.

3: The problem for all gamers is NVIDIA and GameWorks. If BrentJ or anyone else in the review industry cares about all gamers, it is NVIDIA that he should put more pressure on for its low tactics, not AMD for creating a new API that doesn't prohibit anyone from optimizing games in DX mode.

1. How does GW prevent AMD from optimizing a game, save for the NV-specific features? FC4 and ACU run fine on AMD, btw.

2. Source?

3. Nope, there is nothing wrong with GW; it's a good marketing point for NV, and it is AMD who should work extra hard to optimize GW games.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Could you have picked a worse example? Mantle became DX12 and Vulkan, which will benefit Nvidia just as much as AMD. Even when Mantle was AMD-only, it didn't cripple the DX11 code path on Nvidia hardware in any way.

Erm, NV can't say publicly that Mantle benefited them, so they need some extra bullet points for marketing. Also, this crippling thing is getting boring; WD/ACU/DL ran like crap on NV cards too initially (specifically the Kepler variants).
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
It is massively clear that the majority of posters in this thread have no clue what GameWorks is at all.

They are just adding optional eye-candy features such as HBAO+. Gamers can choose them in the settings, or choose not to. Features such as HBAO+ often carry a similar performance hit on AMD cards as on Nvidia cards.

People are acting like Nvidia reprogrammed these entire games to sabotage performance on AMD HW, when the reality is they add one or two features and that is it. Some of these features aren't even available on AMD cards and therefore have absolutely no chance of affecting performance on AMD cards.

I have no idea how people have twisted the situation into this idea that the games are now programmed to run like crap on AMD HW. HBAO+ is not the reason Unity performed better on NV hardware. TXAA is not even possible to run on AMD HW. Those two added features did not sabotage AMD's GPU performance. You guys don't know what you are talking about.

GameWorks is just a scapegoat.
There is something a lot more significant at play here, and it is completely backwards from what so many in this thread keep insisting. It's not that Nvidia is reprogramming games to sabotage AMD; it is that Nvidia is working closely with these developers early on, and with all this extra time they can maximize the performance on their cards.

Today, Assassin's Creed Unity runs many times better on AMD HW than when it first came out. As AMD spent time optimizing, their performance jumped up many times. This is what happens when you work exclusively on improving performance. It wasn't HBAO+ that held back AMD's performance. That GameWorks feature wasn't the issue in Unity.

The closer AMD works with developers, and the longer they spend improving performance, the better their cards will perform. It is that simple. AMD chooses where to invest their efforts, and this is what Kyle is trying to say.

To confuse this to such a degree, I do think, is by design. AMD can write off games easily now and just blame GameWorks. This can save them a lot of cash over time. It goes along with their "Nvidia is evil" marketing campaign. The reality is, Nvidia can throw a lot of money into improving performance on big games. They have teams working early on titles they choose. This gives them a huge leg up, because they have been working on these games for a very long time before launch.

So now AMD can just use this flawed "blame GameWorks" excuse and their fans eat it up. But just look at Unity and we see that when AMD puts in the time and effort, they can improve performance severalfold. It is only a matter of which games AMD chooses.

What we are seeing is the result of Nvidia trying hard to maximize performance in the games they choose. We also see games they don't put a lot of effort into. There are only so many resources. There are games that launch where Nvidia cards have barely acceptable performance, games where the 290X beat the 980. There is no GameWorks to blame; it's just a result of Nvidia's lack of effort on those titles. They could go back and spend extra resources on these games, but most likely they will be looking at what has yet to launch and making sure they have the best performance in the games they think are the most important.

You can see in games like BF4/BF3 that Nvidia can really get great results when they put the effort in. These were not GameWorks titles, but Nvidia made sure they had the ultimate showing in a game they thought really mattered.

And of course, it did. We saw Nvidia dominate in these games even with weaker cards at the time, such as the 680 beating down the 7970 GHz. I don't know how many threads resulted in people buying the slower yet more expensive 680 just because of the performance in one PC shooter. Nvidia made sure they performed very well in key games; it is their strategy. This has been going on for a good while now, and it has nothing to do with GameWorks. AMD is fully capable of putting in extra effort as well, but Nvidia isn't gonna let AMD run away with performance in games that Nvidia thinks are really important.

And now AMD has a scapegoat: just blame HBAO+.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
1.How does GW prevent AMD from optimizing a game save the NV specifiic features? FC4, ACU runs fine on AMD btw.

2.Source?

3.Nope there is nothing wrong with GW, its a good marketing point for NV and it is AMD who should work extra hard to optimize GW games.


http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd

Update (1/3/2014): According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes.
 
Feb 19, 2009
10,457
10
76

Not only that: AMD has said on the record that developers who sign on to GW cannot cooperate with AMD during game development, and AMD is not given access to the early builds to optimize its drivers. This was also confirmed by several developers themselves.

Signing on to GW by accepting NV's "help" excludes AMD.

It's clear what they are doing with GW is nastier, but it's within their rights to do so.

What unbiased reviewers should do is place the blame accordingly, either on lazy devs or GW practices.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
It is massively clear that the majority of posters in this thread have no clue what GameWorks is at all. [...] And now AMD has a scapegoat: just blame HBAO+.

Then why is it that GW titles are always far more slanted than GE titles? If AMD were truly that bad at optimizing, we'd see the same thing happening with almost every game that releases. Instead, only in GW titles do we see the 760 outperforming the 280X at launch. It's more than obvious that part of Nvidia's involvement is locking AMD out of cooperation until the game is close to completion. AMD has even said as much. We've seen firsthand how far Nvidia will go for better marketing with the 970 fiasco.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It is massively clear that the majority of posters in this thread have no clue what GameWorks is at all.

No, it is clear it is you and [H] who do not understand what GW is.

They are just adding optional eye-candy features such as HBAO+. Gamers can choose them in the settings, or choose not to. Features such as HBAO+ often carry a similar performance hit on AMD cards as on Nvidia cards.

Wrong. Various GW's effects such as God Rays have a larger performance hit on AMD hardware. In fact, in The Crew, the developer blocks access to HBAO+ on AMD hardware despite this feature working in FC4.

Can you name a single GE title where any graphical feature doesn't work on NV hardware?

People are acting like Nvidia reprogrammed these entire games to sabotage performance on AMD HW when the reality is they add one or two features and that is it.

Wrong. That's not what gamers are saying. NV doesn't reprogram the entire game to hurt AMD. Instead, NV provides GW's specific SDK code to be inserted into the game. This code is specifically optimized by NV for NV hardware. This code can never be optimized by AMD or the developer, or altered in any way as no one but NV has direct access to it:

2. License. Subject to the terms and conditions of this Agreement, NVIDIA grants you (“you”) a limited, non-exclusive, non-transferable world-wide, royalty-free license to (a) internally install, use and display the NVIDIA GAMEWORKS SDK, solely for purposes of developing NVIDIA GameWorks asset content for NVIDIA GameWorks Applications; (b) internally use, copy, modify and compile the Sample Code to design, develop and test NVIDIA GameWorks assets; and (c) reproduce and distribute the Redistributable Code only in object code form and only as fully integrated into NVIDIA GameWorks Applications, in each case solely for your commercial and non-commercial purposes provided that your NVIDIA GameWorks Applications run solely on Windows PCs .

In addition, you may not and shall not permit others to:

I. modify, reproduce, de-compile, reverse engineer or translate the NVIDIA GameWorks SDK; or

II. distribute or transfer the NVIDIA GameWorks SDK other than as part of the NVIDIA GameWorks Application.
https://developer.nvidia.com/gameworks-sdk-eula

Therefore, what hurts AMD hardware is not NV reprogramming the game, but NV inserting game code (or sending software engineers who provide game code) that no one but NV can optimize. There is no code there that says "if AMD hardware is detected, run something that slows down AMD cards." But since neither the developer nor AMD can change this code, it's impossible to optimize for something you have no access to.
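To make the black-box point concrete, here is a minimal C++ sketch of that integration model, assuming a purely hypothetical middleware API (the names and parameters are invented for illustration and are not the actual GameWorks SDK): the studio compiles against a header and a prebuilt binary, so it can adjust exposed knobs but cannot touch, or let anyone else tune, the code behind the call.

```cpp
// Minimal sketch of a "black box" middleware model, using a hypothetical
// effects API (names invented for illustration; this is not the GameWorks SDK).
// The studio builds against a header plus a prebuilt binary: it can tweak the
// exposed knobs, but not the code behind the call, which is what makes
// third-party optimization impossible without access to that code.
#include <cstdint>
#include <cstdio>

namespace fx {
// Everything the studio sees: a parameter struct and a declaration.
struct GodRaysParams {
    float    intensity      = 1.0f;
    uint32_t sampleCount    = 64;     // tunable knob exposed by the vendor
    bool     halfResolution = false;  // another exposed knob
};
// In the model described above this definition would live in a closed,
// precompiled library; a stub is provided here only so the sketch builds.
void RenderGodRays(const GodRaysParams& p)
{
    std::printf("god rays: intensity=%.1f samples=%u halfRes=%d\n",
                p.intensity, p.sampleCount, p.halfResolution);
}
}  // namespace fx

int main()
{
    fx::GodRaysParams params;
    params.sampleCount    = 32;    // the studio may lower the sample count...
    params.halfResolution = true;  // ...or render at half resolution,
    fx::RenderGodRays(params);     // ...but cannot alter the code inside.
}
```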

I have no idea how people have twisted the situation to this idea that somehow the games are now programmed to run like crap on AMD HW.

Who is saying this? You are clearly not understanding what GameWorks SDK is. It's not about code that is inserted into the game to directly hurt AMD, but black box code that indirectly hurts AMD because they can't optimize for it.

HBAO+ is not the reason Unity performed better on NV hardware. TXAA is not even possible to run on AMD HW. Those two added features did not sabotage AMD's GPU performance.

So far every single GW game is unoptimized and has average to crappy graphics for the level of hardware requirements. It would be one thing if this were found in 1-2 GW titles, but so far every single GW game is an unoptimized mess. I suppose a stronger argument can be made that the key GW partners are firms that are themselves horrible developers, never known for well-optimized PC games in the last 3-4 years, like Ubisoft, Konami or Techland. We'll have to see what happens with The Witcher 3 and other GW titles in 2015.

You guys dont know what you are talking about.

LOL! Right because CF magically works in games after the developer releases a magic patch months after launch, or a situation where a 770 beats 290X in a game where graphics are not even amazing. :rolleyes:

It's not that Nvidia is reprogramming games to sabotage AMD; it is that Nvidia is working closely with these developers early on, and with all this extra time they can maximize the performance on their cards.

You are confusing GW with TWIMTBP. TWIMTBP was a program similar to AMD's GE, where the hardware and software firms work closely to optimize the game for a particular GPU family/architecture. GW takes it much, much further: NV sends professional engineers/provides in-house, NV-built game code to be inserted into the game that the developer can't recompile/optimize for brand-agnostic hardware, and AMD has no access to it either. This is completely different from how TWIMTBP and GE work.

Today, Assassin's Creed Unity runs many times better on AMD HW than when it first came out. As AMD spent time optimizing, their performance jumped up many times. This is what happens when you work exclusively on improving performance.

Part of that is related to Ubisoft providing more than 5 patches for the game. The other part is AMD optimizing the portions of ACU's code that are open to it. AMD cannot optimize any part of GW's source code whatsoever. Whatever performance improvements came in ACU were related to the other code in the game that AMD's driver has access to.

The closer AMD works with developers, the longer the time they spend on improving performance, the better their cards will perform. It is that simple.

No, it's not that simple at all. AMD can only optimize drivers for a game if it has access to most of the source code, or if the developer is able to re-compile/change that code to cater to different GPU architectures' strengths. Therefore, no matter how much time AMD spends, it can never optimize for GW's source code, since that code is locked/barred from being altered by the developer on behalf of AMD and from AMD directly. The more GW source code is in a game, the more the game will run like crap on AMD hardware. GW is 100% unfair competition. It is nothing like GE/TWIMTBP, which were open-source developer relationship programs.

AMD can write off games easily now and just blame GameWorks. This can save them a lot of cash over time. It goes along with their "Nvidia is evil" marketing campaign. The reality is, Nvidia can throw a lot of money into improving performance on big games. They have teams working early on titles they choose. This gives them a huge leg up, because they have been working on these games for a very long time before launch.

Thus far the reality is the opposite. There were lots of promises of improved tessellation for FC4 and ACU and neither game benefited from this. Had Ubisoft themselves cared to include a lot of tessellation/geometry in those games to make them look next generation, they wouldn't have needed NV's help to do that. The developer should themselves have a vision and budget for art assets. Instead, Ubisoft waited for NV to do that work but NV didn't bother. That's why AC Unity never got its tessellation patch despite the promises. That tells us right there a lot of GW's features are 100% dependent on NV, not the developer.

Also, this idea that NV can throw a lot more money on game development is exactly why programs like GE/TWIMTBP and more so GW, should be banned from the PC industry. We should never have a situation where money influences game development optimizations because it automatically means a company with more financial and engineering resources has an unfair competitive advantage. What if AMD was 100X the size of NV and it bribes 9/10 of all AAA developers to optimize 90% of major game engines to run way better on its hardware, would that be fair? I think not.

So now AMD can just use this flawed blame gameworks excuse and their fans eat it up.

This has nothing to do with NV or AMD fans. It's about the idea of brand-agnostic software development and optimization. When money and marketing dollars influence game optimizations, it's a grey area in terms of business ethics because it alters the otherwise neutral state of game programming. It's no different from marketing dollars and ad revenue influencing professional videogame reviews - it's allowed, but a lot of gamers don't agree with the practice.

But, just look at Unity and we see that when AMD puts in the time and effort, they can improve performance by several multitudes. It is only a matter of which games AMD chooses.

Again, you are missing the point here. AMD can improve performance in nearly every game, but only for the source code it has access to. If AMD has access to 70% of the source code, it will never be able to extract maximum performance out of GCN products for certain graphical effects bounded by GW's code. AMD can never work with the developer to provide an alternative coding path for some GW's graphical feature because participation in GW's bars the developer from doing so. Thus, if AMD had access to the entire source code, it could improve the performance even more.
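For illustration, an "alternative coding path" in this sense would look something like the following C++ sketch, with a hypothetical vendor check and effect API (names invented, not any real engine's code); the point is that a studio free to do this could route each vendor to an implementation tuned for its architecture.

```cpp
// Minimal sketch of per-vendor code-path selection, the kind of "alternative
// coding path" described above. All names are hypothetical, for illustration.
#include <cstdio>
#include <string>

enum class GpuVendor { Nvidia, Amd, Other };

// Stand-in for querying the device at startup; a real engine would ask the
// graphics API for the adapter's vendor ID instead of parsing a name.
GpuVendor DetectVendor(const std::string& adapterName)
{
    if (adapterName.find("NVIDIA") != std::string::npos) return GpuVendor::Nvidia;
    if (adapterName.find("AMD") != std::string::npos)    return GpuVendor::Amd;
    return GpuVendor::Other;
}

void RenderAmbientOcclusion(GpuVendor vendor)
{
    switch (vendor) {
    case GpuVendor::Nvidia:
        std::puts("AO path tuned for Nvidia hardware");  // e.g. vendor library
        break;
    case GpuVendor::Amd:
        std::puts("AO path tuned for GCN");              // the alternative path
        break;
    default:
        std::puts("generic AO path");                    // safe fallback
        break;
    }
}

int main()
{
    RenderAmbientOcclusion(DetectVendor("AMD Radeon R9 290X"));
}
```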

What we are seeing is the result of Nvidia trying hard to maximize performance on games they choose.

That's not what we are seeing, because if that were the case, we wouldn't have Kepler bombing in modern games in the last 6 months compared to 970/980 cards, or the 780 barely beating the 7970 GHz. What we are seeing is NV's GW SDK running way better on Maxwell hardware, even at the expense of the older Fermi and Kepler generations. It's in NV's best interest to entice GPU upgrades from NV customers using older generations of NV cards, and to get AMD users to switch after they ultimately end up with horrendous performance out of the box/in CF. If NV tried its hardest to maximize performance in GW titles, we wouldn't have GW games with such crappy graphics that run so poorly on cards like the 680 and 780, and we definitely wouldn't have a situation where the 970 beats the 780 Ti by 15-20%.

Games where the 290X beat the 980. There is no GameWorks to blame, it's just a result of Nvidia's lack of effort on those titles.

Games like Ryse: Son of Rome ran faster on the 290X at launch because the game was optimized for GCN, courtesy of being made for the XB1 first and taking full advantage of the GCN compute architecture. The difference is, none of the source code is black-boxed from NV. That's why NV can provide updated drivers for the entire game!

And now AMD has a scapegoat, just blame HBAO+.

Ummm....no. Imagine if every single GE title had compute shaders and other graphical effects that took full advantage of superior capabilities of GCN architecture, but the entire source code for every GE graphical feature was blocked from being altered by the developer/optimized for by NV? I am pretty sure NV's customers would sing a different tune or stop buying GE titles if SLI didn't work for 4-5 months before a developer releases a patch or if 290X was 40-50% faster than a 980 for no particular reason...
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
prices. As for Nvidia they won't get a dime from me because I absolutely detest companies like them who intentionally harm the competition through unethical methods.

My sentiments exactly. Completely abandoning any optimizations for older architectures, lying about specs (GTX 970), and now GW. What terrible business ethics. Now I will even buy worse products just to avoid buying anything from NV. I wanted to buy Acer's XB270HU, but I will not, because I would be supporting these terrible ethics by doing it. No money from me for this proprietary money grab when an open alternative just got released in the new DP spec, which AMD already supports with GCN 1.1+ products under the name FreeSync. G-SYNC was all cool and dandy when it was the only such solution available. Now it is detrimental because monitor manufacturers can't even use their own scalers and G-SYNC at the same time. I won't support something that I consider EVIL. It may not be pragmatic, but I don't care anymore. I paid a lot of money for my cards, and they stopped optimizing drivers for them as soon as they released a new, slightly faster card that didn't even warrant an upgrade. Just look at the relative performance of the Titan versus AMD's cards and Maxwell, at Maxwell's launch and now.
Just terrible... I refuse to believe it is about the hardware. The Titan shouldn't be slower than the 290 or GTX 970, not if they cared about optimizing the drivers for Kepler, and now we have cases where Tahiti matches the 780. Hardware-wise it is GCN 1.0 that has more deficiencies, not GK110; just look at how much faster Tonga is at tessellation than the original Tahiti.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
No, it is clear it is you and [H] who do not understand what GW is. [...] I am pretty sure NV's customers would sing a different tune or stop buying GE titles if SLI didn't work for 4-5 months before a developer releases a patch or if 290X was 40-50% faster than a 980 for no particular reason...

RS, I appreciate your effort in rebutting some of the trash reasons offered in support of GameWorks. GameWorks is clearly against fair competition and is a completely unethical business practice.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Can you name a single GE title where any graphical feature doesn't work on NV hardware?
Yes, there is: Lichdom's TressFX. Even a GTX 980 cannot go beyond 20 fps while an R9 290 easily does 50-60 fps, so please don't post false statements or spam just for the sake of argument.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
My sentiments exactly. Completely abandoning any optimizations for older architectures, lying about specs (GTX 970), and now GW. What terrible business ethics. Now I will even buy worse products just to avoid buying anything from NV. I wanted to buy Acer's XB270HU, but I will not, because I would be supporting these terrible ethics by doing it. No money from me for this proprietary money grab when an open alternative just got released in the new DP spec, which AMD already supports with GCN 1.1+ products under the name FreeSync. G-SYNC was all cool and dandy when it was the only such solution available. Now it is detrimental because monitor manufacturers can't even use their own scalers and G-SYNC at the same time. I won't support something that I consider EVIL. It may not be pragmatic, but I don't care anymore. I paid a lot of money for my cards, and they stopped optimizing drivers for them as soon as they released a new, slightly faster card that didn't even warrant an upgrade. Just look at the relative performance of the Titan versus AMD's cards and Maxwell, at Maxwell's launch and now.
Just terrible... I refuse to believe it is about the hardware. The Titan shouldn't be slower than the 290 or GTX 970, not if they cared about optimizing the drivers for Kepler, and now we have cases where Tahiti matches the 780. Hardware-wise it is GCN 1.0 that has more deficiencies, not GK110; just look at how much faster Tonga is at tessellation than the original Tahiti.

I am confident Nvidia is gimping Kepler in its drivers so as to make Maxwell look like a worthwhile upgrade. The 780 Ti at launch was around 8-10% faster than the R9 290X at 1440p. If you look today they are on par, and if you count only games released in the last 6 months, the R9 290X is faster on average. :whistle:
 

SimianR

Senior member
Mar 10, 2011
609
16
81
Besides all the discussion around GameWorks, I wanted to ask Brent about the wording in that review. It does seem a bit odd:

The R9 290X, even when overclocked insanely high, is just barely playable at 4X MSAA in this game averaging right at the 60 FPS mark.

I know it's a review meant for enthusiasts, but does anyone really consider 60 fps just barely playable? Especially at those settings? I guess for me a game in the 30 fps range is "just barely" playable, and half the console titles these days are locked at 30 fps.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,227
126
Yes, there is: Lichdom's TressFX. Even a GTX 980 cannot go beyond 20 fps while an R9 290 easily does 50-60 fps, so please don't post false statements or spam just for the sake of argument.

He said "not supported". Not "supported, but doesn't perform as well". Learn to read.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I am confident Nvidia is gimping Kepler in its drivers so as to make Maxwell look like a worthwhile upgrade. The 780 Ti at launch was around 8-10% faster than the R9 290X at 1440p. If you look today they are on par, and if you count only games released in the last 6 months, the R9 290X is faster on average. :whistle:

This is a terrible practice; GM204 is not a compelling upgrade for GK110 owners. If a card that was 10% faster than the competition starts being slower, then how can someone be sure the same won't happen to GM204 versus Hawaii when Pascal gets released? I wouldn't be surprised if GM204 starts losing to Hawaii in games released after Pascal's launch. Does NV want to suggest to its customers that they should buy every generation no matter how small the improvement is? How does the GTX 580's performance compare to the 7970 at the 7970's launch and now? All of that dissuades me from ever buying NV again.
P.S. Before I bought my first Titan I bought a pair of 7970s, but those cards were horrible: the coil noise could be heard in another room and the smoothness of CF was bad. My purchase of the Titan was an instant reaction to the frustration those terrible 7970s gave me. Even apart from the coil noise, those cards were failures. The blowers were horrible, and the state of CF was sorry...

This is simply terrible.

Titan slower than the 290 and the 780 barely beating the 7970 GHz:
[chart: perfrel_2560.gif]

And how was it before?

[chart: perfrel_2560.gif]

I feel cheated. From about 30% faster than the competition down to just over 15%? WHAT? In newer games it's going to look even worse. But I don't know whether they kept Kepler from throttling in those tests.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
He said "not supported". Not "supported, but doesn't perform as well". Learn to read.
I know what he said, and what I am saying is that it does not work. Sometimes it crashes or stays at 20 fps when you enable TressFX in that game. You need to learn to understand the topic, and btw you don't even know what the topic is about.
 

Hitman928

Diamond Member
Apr 15, 2012
6,754
12,500
136
I know what he said, and what I am saying is that it does not work. Sometimes it crashes or stays at 20 fps when you enable TressFX in that game. You need to learn to understand the topic, and btw you don't even know what the topic is about.

I was interested in Lichdom until I saw the reviews and never bought it. Do you have any links to benchmarks for the bad performance on Nvidia cards? I looked but couldn't find any. All I could find was this:

[benchmark chart: lichdom_perf1.png]

[benchmark chart: lichdom_perf2.png]


http://community.amd.com/community/...e-tressfx-hair-20-today-in-lichdom-battlemage
 

SViscusi

Golden Member
Apr 12, 2000
1,200
8
81
I know what he said, and what I am saying is that it does not work. Sometimes it crashes or stays at 20 fps when you enable TressFX in that game. You need to learn to understand the topic, and btw you don't even know what the topic is about.

You need to understand that the only thing keeping TressFX from running well on Nvidia hardware is Nvidia. They're not locked out of anything; they can work with game devs and optimize their drivers, they just choose not to give it priority, and in those cases users can disable the feature so their overall playing experience isn't adversely affected.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
You need to understand that the only thing keeping TressFX from running well on Nvidia hardware is Nvidia. They're not locked out of anything; they can work with game devs and optimize their drivers, they just choose not to give it priority, and in those cases users can disable the feature so their overall playing experience isn't adversely affected.


We purposely disabled TressFX on NVIDIA cards. We delivered TressFX on AMD hardware as part of our partnership with AMD.

https://steamcommunity.com/app/261760/discussions/2/620700960748580422/
 

casiofx

Senior member
Mar 24, 2015
369
36
61
Wrong. Various GW's effects such as God Rays have a larger performance hit on AMD hardware. In fact, in The Crew, the developer blocks access to HBAO+ on AMD hardware despite this feature working in FC4.
For Far Cry 4, the performance hit incurred by God Rays on AMD cards is because of lackluster tessellation performance on GCN 1.0 and 1.1 GPUs. GCN 1.2 (Tonga) has good tessellation performance, and it did not experience any performance drop at all when turning on God Rays.
The other GW effects in Far Cry 4 have a comparable performance hit on Nvidia and AMD.
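As a rough illustration of why heavy tessellation punishes GPUs with weaker tessellators, here is a back-of-the-envelope C++ sketch; the 2*N*N triangle estimate is a simplified model of a tessellated quad patch, assumed for illustration rather than taken from any benchmark.

```cpp
// Back-of-the-envelope sketch (a rough model, not measured data) of why heavy
// tessellation punishes GPUs with weaker fixed-function tessellators: for a
// quad patch, triangle output grows roughly with the square of the factor.
#include <cstdio>

int main()
{
    const int factors[] = {4, 8, 16, 32, 64};
    for (int factor : factors) {
        // Simplified model: a quad domain tessellated with factor N on every
        // edge emits on the order of 2*N*N triangles.
        long triangles = 2L * factor * factor;
        std::printf("tessellation factor %2d -> ~%ld triangles per patch\n",
                    factor, triangles);
    }
    return 0;
}
```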

I only tried it out months after the game was released, so bear in mind that my arguments aren't about the state at launch, which really did suck on AMD hardware...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106

I love (sic) when people simply repeat a post rather than address what is being said.

We purposely disabled TressFX on NVIDIA cards. We delivered TressFX on AMD hardware as part of our partnership with AMD.
This statement does say they enabled TressFX on AMD because they were required to do so. It does not say they didn't enable it on nVidia because they were required not to.

TressFX runs fine on nVidia hardware, and the source is available for nVidia to optimize their drivers for it. AMD does not do anything to prohibit its use on nVidia hardware or its performance.