No, it is clear that it is you and [H] who do not understand what GW is.
Wrong. Various GW effects, such as God Rays, have a larger performance hit on AMD hardware. In fact, in The Crew, the developer blocks access to HBAO+ on AMD hardware despite this feature working in FC4.
Can you name a single GE title where any graphical feature doesn't work on NV hardware?
Wrong. That's not what gamers are saying. NV doesn't reprogram the entire game to hurt AMD. Instead, NV provides GameWorks-specific SDK code to be inserted into the game. This code is specifically optimized by NV for NV hardware. It can never be optimized by AMD or the developer, or altered in any way, as no one but NV has direct access to it:
2. License. Subject to the terms and conditions of this Agreement, NVIDIA grants you ("you") a limited, non-exclusive, non-transferable world-wide, royalty-free license to (a) internally install, use and display the NVIDIA GAMEWORKS SDK, solely for purposes of developing NVIDIA GameWorks asset content for NVIDIA GameWorks Applications; (b) internally use, copy, modify and compile the Sample Code to design, develop and test NVIDIA GameWorks assets; and (c) reproduce and distribute the Redistributable Code only in object code form and only as fully integrated into NVIDIA GameWorks Applications, in each case solely for your commercial and non-commercial purposes provided that your NVIDIA GameWorks Applications run solely on Windows PCs.
In addition, you may not and shall not permit others to:
I. modify, reproduce, de-compile, reverse engineer or translate the NVIDIA GameWorks SDK; or
II. distribute or transfer the NVIDIA GameWorks SDK other than as part of the NVIDIA GameWorks Application.
https://developer.nvidia.com/gameworks-sdk-eula
Therefore, what hurts AMD hardware is not NV reprogramming the game, but NV inserting game code (or sending software engineers who provide that code) that no one else but NV can optimize. There is no code that says "if AMD hardware is detected, run something that slows down AMD cards." But since neither the developer nor AMD can change this code, it's impossible to optimize for something you have no access to.
Who is saying this? You are clearly not understanding what the GameWorks SDK is. It's not code inserted into the game to directly hurt AMD, but black-box code that indirectly hurts AMD because no one else can optimize for it.
So far every single GW game is unoptimized, with average-to-crappy graphics relative to its hardware requirements. It would be one thing if this were found in 1-2 GW titles, but so far every single GW game is an unoptimized mess. I suppose a stronger argument can be made that the key partners of GW titles are firms that are themselves horrible developers, never known for well-optimized PC games in the last 3-4 years, like Ubisoft, Konami or Techland. We'll have to see what happens with The Witcher 3 and other GW titles in 2015.
LOL! Right, because CF magically works in games after the developer releases a magic patch months after launch, or a situation where a 770 beats a 290X in a game whose graphics are not even amazing.
You are confusing GW with TWIMTBP. TWIMTBP was a program similar to AMD's GE, where the hardware and software firms worked closely to optimize the game for a particular GPU family/architecture. GW takes it much, much further: NV sends professional engineers and provides in-house, NV-built game code to be inserted into the game, which the developer can't recompile/optimize for brand-agnostic hardware and which AMD has no access to either. This is completely different from how TWIMTBP and GE work.
Part of that is related to Ubisoft providing more than 5 patches for the game. The other part is AMD optimizing the open parts of ACU's code. AMD cannot optimize any part of GW source code whatsoever. Whatever performance improvements came in ACU were related to the other code in the game that AMD's driver has access to.
No, it's not that simple at all. AMD can only optimize drivers for a game if it has access to most of the source code, or if the developer can re-compile/change that code to cater to different GPU architectures' strengths. Therefore, no matter how much time AMD spends, it can never optimize for GW source code, since that code is locked: barred from being altered by the developer on AMD's behalf, and from AMD directly. The more GW source code is in a game, the worse the game will run on AMD hardware. GW is 100% unfair competition. It is nothing like GE/TWIMTBP, which were open-source developer-relations programs.
Thus far the reality is the opposite. There were lots of promises of improved tessellation for FC4 and ACU, and neither game benefited from this. Had Ubisoft themselves cared to include a lot of tessellation/geometry in those games to make them look next-generation, they wouldn't have needed NV's help to do it. The developer should have its own vision and budget for art assets. Instead, Ubisoft waited for NV to do that work, but NV didn't bother. That's why AC Unity never got its tessellation patch despite the promises. That tells us right there that a lot of GW features are 100% dependent on NV, not the developer.
Also, this idea that NV can throw a lot more money at game development is exactly why programs like GE/TWIMTBP, and even more so GW, should be banned from the PC industry. We should never have a situation where money influences game optimization, because it automatically means a company with more financial and engineering resources has an unfair competitive advantage. What if AMD were 100X the size of NV and bribed 9/10 of all AAA developers to optimize 90% of major game engines to run way better on its hardware — would that be fair? I think not.
This has nothing to do with NV or AMD fans. It's about the idea of brand-agnostic software development and optimization. When money and marketing dollars influence game optimization, it's a grey area in terms of business ethics, because it alters the otherwise neutral state of game programming. It's no different from when marketing dollars and ad revenue influence professional videogame reviews — it's allowed, but a lot of gamers don't agree with the practice.
Again, you are missing the point here. AMD can improve performance in nearly every game, but only for the source code it has access to. If AMD has access to 70% of the source code, it will never be able to extract maximum performance out of GCN products for the graphical effects bound up in GW code. AMD can never work with the developer to provide an alternative coding path for a GW graphical feature, because participation in GW bars the developer from doing so. Thus, if AMD had access to the entire source code, it could improve the performance even more.
That's not what we are seeing, because if that were the case, we wouldn't have Kepler bombing in modern games over the last 6 months compared to the 970/980, or a 780 barely beating a 7970 GHz. What we are seeing is NV's GW SDK running way better on Maxwell hardware, even at the expense of the older Fermi and Kepler generations. It's in NV's best interest to entice GPU upgrades from NV customers on older cards, and to get AMD users to switch after they end up with horrendous out-of-the-box/CF performance. If NV tried its hardest to maximize performance in these titles, we wouldn't have GW games with such crappy graphics running so poorly on cards like the 680 and 780, and we definitely wouldn't have a situation where a 970 beats a 780 Ti by 15-20%.
Games like Ryse: Son of Rome ran faster on the 290X at launch because the game was optimized for GCN, courtesy of being made for the XB1 first and taking full advantage of GCN's compute architecture. The difference is, none of the source code is black-boxed from NV. That's why NV can provide updated drivers for the entire game!
Ummm....no. Imagine if every single GE title had compute shaders and other graphical effects that took full advantage of the superior capabilities of the GCN architecture, but the entire source code for every GE graphical feature was blocked from being altered by the developer or optimized for by NV. I am pretty sure NV's customers would sing a different tune, or stop buying GE titles, if SLI didn't work for 4-5 months before the developer released a patch, or if a 290X were 40-50% faster than a 980 for no particular reason...