
Just how playable are Gameworks titles on AMD cards?

Sure you can. Nvidia gives developers access to the source and works closely with them.

But even then, having access to the source code doesn't mean the developer cares about other companies. DICE is a good example: as a console-focused developer, they don't bother optimizing for Nvidia's hardware.

What exactly are you smoking? Every single DICE game in its history has run excellently on NV hardware.

A 780 Ti is very competitive vs. the R9 290X and even the 980.

Go look back at BF3, BF4, Hardline, DAI (AMD sponsored, ran excellently on all hardware).

Battlefront just repeats the pattern. Even a 960 puts out amazing performance at 1080p high/ultra settings.

Regarding GW source code, devs can pay to get access. They can also pay extra to modify it, but with the clause that it must not reduce performance on NV.
 
They rejected AMD's "image quality reduction" advice. You know, in the same way Oxide denied Nvidia's "wish" to disable async shaders in Ashes.

What are you saying? They did disable it for Nvidia. Besides, async compute actually improves performance, so there's no comparison.
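
For anyone wondering why async compute helps at all, here's a toy model (my own sketch with made-up numbers, not from either vendor): if compute work overlaps with graphics instead of running back-to-back on one queue, frame time trends toward max(g, c) instead of g + c.

Code:
# Toy model of async compute; the timings and overlap fraction are made up.
def frame_time_serial(gfx_ms: float, compute_ms: float) -> float:
    # One queue: graphics and compute run back-to-back.
    return gfx_ms + compute_ms

def frame_time_async(gfx_ms: float, compute_ms: float, overlap: float) -> float:
    # 'overlap' is the fraction of compute hidden under graphics (0..1).
    return gfx_ms + compute_ms * (1.0 - overlap)

print(frame_time_serial(12.0, 4.0))       # 16.0 ms -> ~62 fps
print(frame_time_async(12.0, 4.0, 0.75))  # 13.0 ms -> ~77 fps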
 
All these games are based on a version of the engine from before Nvidia released Maxwell.

DICE knows exactly how to "cripple" Maxwell. In Battlefront, AMD hardware performs much better than it did in Battlefield 4 or Dragon Age: Inquisition: http://www.guru3d.com/articles_page...ta_vga_graphics_performance_benchmarks,5.html

An R9 285 is 26% faster than a GTX 960.

The 285/380 is heaps faster than the 960 in the Fable Legends DX12 bench.

Does that mean UE4 isn't NV optimized?

https://developer.nvidia.com/conten...creases-pace-innovation-unreal-engine-4-games

Btw, Battlefront isn't crippled for NV. You want crippling? Compare the latest Anno where Maxwell is ~50% faster than Kepler and GCN. Or Ark (UE4), for the same ~50% lead over Kepler/GCN again. Or Project Cars.

10-15% isn't crippling.

https://developer.nvidia.com/unrealengine
 
Sure, Gameworks is optimized for NV hardware.
Why do you think that? I looked at the HairWorks source, and I saw a lot of opportunity for optimization. There are some easy targets that could give a 30 percent boost on Maxwell with no quality impact. That's what you call optimization?
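
To make that concrete, the "easy target" people usually point at is the tessellation factor. A purely hypothetical sketch of the arithmetic (my assumption that strand geometry scales roughly linearly with the factor; this is not the actual HairWorks code):

Code:
# Hypothetical: cap a requested tessellation factor. Not HairWorks code;
# linear scaling of strand geometry with the factor is an assumption.
def workload_fraction(requested_factor: float, cap: float = 16.0) -> float:
    effective = min(requested_factor, cap)
    return effective / requested_factor  # fraction of geometry kept

print(workload_fraction(64.0))  # 0.25 -> roughly 4x less strand geometry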

No different from the Frostbite engine, which is optimized for AMD hardware.
Frostbite runs very well on Nvidia hardware too.

I don't know the exact terms for getting access to the source code, but Nvidia allows developers to modify it:
Then why don't they change the HairWorks code to get a 30 percent boost without any negative impact?
 
All these games are based on a version of the engine from before Nvidia released Maxwell.

DICE knows exactly how to "cripple" Maxwell. In Battlefront, AMD hardware performs much better than it did in Battlefield 4 or Dragon Age: Inquisition: http://www.guru3d.com/articles_page...ta_vga_graphics_performance_benchmarks,5.html

An R9 285 is 26% faster than a GTX 960.

The GTX 960 suffers from limited memory bandwidth. The new Frostbite engine moves more data per fragment to get good PBR results, and 128-bit/7 GHz GDDR5 on a GM206 is just not enough for this scenario. It might run better on D3D12/Vulkan with async copy.
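
To put numbers on that (my own back-of-envelope, not from any review): peak bandwidth is just the bus width in bytes times the effective per-pin data rate.

Code:
# Back-of-envelope peak memory bandwidth in GB/s.
def peak_bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    return (bus_width_bits / 8) * effective_gbps

print(peak_bandwidth_gbs(128, 7.0))  # GM206 / GTX 960: 112 GB/s
print(peak_bandwidth_gbs(256, 5.5))  # Tonga / R9 285:  176 GB/s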
 
Why do you think that? I looked at the HairWorks source, and I saw a lot of opportunity for optimization. There are some easy targets that could give a 30 percent boost on Maxwell with no quality impact. That's what you call optimization?

Sure. Why don't you develop your own hair simulation and post it? After seeing the source code, it should be easy. You can even implement it into UE4 like Nvidia did with HairWorks.

... :\

Frostbite runs very well on Nvidia hardware too.
Project Cars, too. A GTX 960 is only 25% faster in Project Cars: http://www.techpowerup.com/reviews/MSI/GTX_950_Gaming/20.html


The GTX 960 suffers from limited memory bandwidth. The new Frostbite engine moves more data per fragment to get good PBR results, and 128-bit/7 GHz GDDR5 on a GM206 is just not enough for this scenario. It might run better on D3D12/Vulkan with async copy.

So they chose a bandwidth-heavy implementation that benefits AMD cards more. Hm, sounds like a biased implementation.
 
So they chose a bandwidth-heavy implementation that benefits AMD cards more. Hm, sounds like a biased implementation.

Almost everything in rendering benefits from higher bandwidth ...

Transparency, shadows, texture filtering, vertex attribute emitting from the stream-output stage, volumetric rendering, tree traversal for physics and ray tracing ...

You might as well call the implementation biased toward better image quality and the future, and that approach is very scalable, so it benefits all three main IHVs, including Nvidia. The HairWorks nonsense is only meant to hurt everything that's not Nvidia, and it's not AMD that gets hurt the most but rather Intel and mobile GPUs!

You can't always eliminate bias from code but at least you can try to be as impartial as possible ...
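
A quick way to see why (again my own arithmetic, illustrative only): divide peak bandwidth by the target frame rate and you get the bytes one frame can move; every extra render target, shadow cascade, and transparency layer eats into that budget.

Code:
# GB of memory traffic available per frame at a given fps.
def per_frame_budget_gb(bandwidth_gbs: float, fps: float) -> float:
    return bandwidth_gbs / fps

print(per_frame_budget_gb(112.0, 60.0))  # GTX 960-class: ~1.87 GB/frame
print(per_frame_budget_gb(176.0, 60.0))  # R9 285-class:  ~2.93 GB/frame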
 
We treat betas like mature releases around here. No logic as to why; it's just what we do!

Well actually, all GW game releases in the last 1-2 years have been betas, because it takes 1-2 months or more of patches and drivers to bring them to acceptable gaming levels. 🙄
 
Well actually, all GW game releases in the last 1-2 years have been betas, because it takes 1-2 months or more of patches and drivers to bring them to acceptable gaming levels. 🙄

GameWorks games don't have a corner on the buggy-release market. Pretty much ALL games have stupid bugs when they come out, and the consumer has to wait for the developer to get off their butt and fix what should never have been released in that state.
 
It's probably because too many devs release betas as mature games. It gets a bit confusing. 😀

It goes back to the days of MMO gaming, where you could release a game half-finished and still collect active subscription payments while you finished it.

Today the business case is slightly different: it's mainly used as a failsafe trigger now. In short, if it doesn't sell, don't fix/finish it. But it's bound to backfire now and then.

Gameworks/Evolved/None, same crap.
 
You can add Black Ops 3 to the crapworks list; it runs like dogshi* :thumbsup:

Only in singleplayer. It's a stutter fest, maxing out CPU usage thanks to last-minute additions of GameWorks/PhysX, plagued with memory leaks in the SP levels, and with PhysX destruction cranked to the uber-degree, maxing out an i7 at 100% on all threads.

It runs beautifully in MP, free from that taint.

Compare...

These are the MP results:

[benchmark chart: Black Ops 3 MP results]


These are the SP results (similar numbers at Guru3d with SP testing):

[gamegpu.ru benchmark chart: Call of Duty: Black Ops III SP, 1920×1080]


http://www.gamestar.de/spiele/call-...s/call_of_duty_black_ops_3,52184,3234626.html

http://www.pcgameshardware.de/Call-...el-55478/News/Gameworks-Ankuendigung-1166977/

So, who's still in doubt about the purpose of GimpWorks?
 
Only in singleplayer. It's a stutter fest, maxing out CPU usage thanks to last-minute additions of GameWorks/PhysX, plagued with memory leaks in the SP levels, and with PhysX destruction cranked to the uber-degree, maxing out an i7 at 100% on all threads.

So, who's still in doubt about the purpose of GimpWorks?

You do know there have been patches out, right? I know how desperately you push the anti-GameWorks campaign, so it's easy to lose touch with reality.
 
It should not have been released in that state at AAA pricing.

Crippling AMD performance in SP with GameWorks while adding major bugs for everyone else highlights my point about GimpWorks to a tee.
 
Glad I got a 980 Ti. You can all wait for patches! 😀

EDIT: Oh god, just saw that other SP benchmark, RIP AMD performance.
 
EDIT: Oh god, just saw that other SP benchmark, RIP AMD performance.

I'm expecting something like that from Fallout 4.

It's the biggest title this year next to GTA V and The Witcher 3. AMD luckily did OK in those titles, but those had nowhere near as many GimpWorks features as Fallout 4 is showcased to have.
 
I'm expecting something like that from Fallout 4.

It's the biggest title this year next to GTA V and The Witcher 3. AMD luckily did OK in those titles, but those had nowhere near as many GimpWorks features as Fallout 4 is showcased to have.

I told you about that rabbit, didn't I? We're gonna have a lot of console gamers here soon. 😉

(Being from Massachusetts, I really wanna play Fallout 4; I'm just not a big fan of the series.)
 
I'm expecting something like that from Fallout 4.

It's the biggest title this year next to GTA V and The Witcher 3. AMD luckily did OK in those titles, but those had nowhere near as many GimpWorks features as Fallout 4 is showcased to have.

So what happens if Fallout 4 runs fine on AMD? Will you find some excuse to keep your crusade against Nvidia going? What happens when a Gaming Evolved game runs like crap? Free pass? Others? Not to mention you happily forget all the games with GameWorks features that run absolutely fine. But that would ruin the purpose of the posts. Games can only be bad because of GameWorks, never because of other issues, correct?
 
Black Ops 3 BETA:

[gamegpu.ru benchmark chart: Call of Duty: Black Ops III beta (MP), 1920×1080]


Before NV's announcement of collaboration to "improve" the visuals with GimpWorks.

After they touched it:

[gamegpu.ru benchmark chart: Call of Duty: Black Ops III SP, 1920×1080]

Yeah, but the beta was MP; that second bench is SP, which is where the problem seems to be.

Not saying you aren't right, but that doesn't support your claim very well.
 