Is gameworks to blame for Fury's lackluster debut?

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

SimianR

Senior member
Mar 10, 2011
I played FC4 and Dying Light just fine at launch...

Dying Light had a memory leak that caused RAM usage to climb until it crashed to the desktop. I had that issue for a few weeks until they patched it. I'm surprised you had no issues; I guess you were one of the lucky ones.

I think the real issue with gameworks is that it's hard enough to get an optimized, bug-free PC port of a console game these days, let alone one that also introduces a bunch of PC-specific visual effects like hairworks, hbao+, and physx. Devs have pretty strict deadlines, and throwing this on the pile surely makes matters worse (especially when porting duties are handed off to other developers). Hours will get cut somewhere to add these "features", and I think it's starting to show.
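The bug described above is the classic unbounded-growth pattern: something cached per frame or per streamed area is never evicted, so resident memory only climbs until the process dies. A minimal hypothetical sketch of the pattern (not Techland's actual code; the names and sizes are made up):

```python
# Hypothetical sketch of the "RAM climbs until it crashes" pattern:
# a streaming cache that inserts but never evicts, so resident
# memory grows monotonically with play time.

class StreamingCache:
    def __init__(self):
        self._chunks = {}  # chunk_id -> loaded asset bytes

    def load_chunk(self, chunk_id: int) -> bytes:
        # Bug: chunks for areas the player has already left are never freed.
        if chunk_id not in self._chunks:
            self._chunks[chunk_id] = bytes(1024)  # stand-in for a real asset
        return self._chunks[chunk_id]

    def resident_bytes(self) -> int:
        return sum(len(b) for b in self._chunks.values())

cache = StreamingCache()
for frame in range(10_000):
    cache.load_chunk(frame // 10)  # a new chunk every 10 frames, old ones kept
# resident_bytes() only ever grows; a real game eventually exhausts RAM
```

The fix is the obvious one: evict chunks outside the player's radius, which is presumably what the patch did.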
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
Far Cry 4 : https://www.youtube.com/watch?v=M25Le3JdHH8
AC Unity : https://www.youtube.com/watch?v=SgpzT5V5Mgs
Arkham Knight : https://www.youtube.com/watch?v=bFBd5GgGkMs

Forum anecdotes do not reality make. People were claiming Arkham Knight ran great for them as well with no issues. Meanwhile the game is pulled from shelves because it's just that broken.

Cool, I don't care what some person with a camera who posts to YouTube says. Did he complain about BF4 at launch too?

Dying Light had a memory leak that caused RAM usage to climb until it crashed to the desktop. I had that issue for a few weeks until they patched it. I'm surprised you had no issues; I guess you were one of the lucky ones.

I did have issues with hitching, but I just turned down the draw distance (or whatever they called it) and things were fine. I actually enjoyed that game, although I think it was kinda short and after the story ended it didn't have much replay value.
 

xthetenth

Golden Member
Oct 14, 2014
Exactly my point. Initially the attempt was to lay the blame on Ubisoft as a crap developer - partially true :awe: - because Unity, Watch Dogs and Far Cry 4 all shipped broken. Now, though, we've seen games from other developers with gameworks all ship broken as well.

The universal is gameworks: Dying Light, Watch Dogs, Far Cry 4, Arkham Knight, AC: Unity. All shipped seriously broken and most continue to have issues. All suffered from terrible optimization, buggy graphics, etc. Arkham Knight is so bad they pulled it from shelves. Even Witcher 3, while not a full gameworks title, does implement hairworks, and that costs you 15-20fps at all times to mildly alter the hair on Geralt. This has become all the inefficiency of gpu physx, amplified to ruin game optimization.

Gameworks really is a disaster for gaming. I would expect game developers to turn away from it until it's fixed or axed. What developer wants to follow in the footsteps of Rocksteady and have to pull a game off the shelves? The game runs fine on consoles, within those 30fps limitations. What's different on PC? Gameworks. In fact it's even worse on PC, as half the features are bugged and don't work - see ambient occlusion - broken on PC and doesn't even activate...

There may be a bit of reversed cause and effect there. The devs who are lazy enough to make awful games may well be the ones who are lazy enough to scab GW onto their stuff.

Either way, I personally find GW beneficial because it puts a nice logo on all the games I may as well ignore.
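A side note on the "15-20fps" hairworks figure quoted above: an fps delta means a different amount of lost headroom depending on the baseline, so converting to frame time makes the cost concrete. A small worked example (the 60fps and 100fps baselines are illustrative, not figures from the thread):

```python
# Convert an fps drop into the extra milliseconds the GPU spends per frame.

def frametime_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

def fps_drop_cost_ms(base_fps: float, drop: float) -> float:
    """Extra ms per frame when fps falls from base_fps to base_fps - drop."""
    return frametime_ms(base_fps - drop) - frametime_ms(base_fps)

# The same 20fps drop costs very different amounts of frame time:
print(round(fps_drop_cost_ms(60, 20), 2))   # from 60 to 40 fps: 8.33 ms/frame
print(round(fps_drop_cost_ms(100, 20), 2))  # from 100 to 80 fps: 2.5 ms/frame
```

Which is why the same effect feels far heavier on a mid-range card already near 60fps than on a top-end one running well above it.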
 

Grooveriding

Diamond Member
Dec 25, 2008
There may be a bit of reversed cause and effect there. The devs who are lazy enough to make awful games may well be the ones who are lazy enough to scab GW onto their stuff.

Either way, I personally find GW beneficial because it puts a nice logo on all the games I may as well ignore.


:biggrin:
 

tg2708

Senior member
May 23, 2013
The logo was the first thing I noticed, and I thought it was very cool too. Took me for a sucker though, because I thought the game was actually going to be good from a graphical standpoint.
 

Fox5

Diamond Member
Jan 31, 2005
Exactly my point. Initially the attempt was to lay the blame on Ubisoft as a crap developer - partially true :awe: - because Unity, Watch Dogs and Far Cry 4 all shipped broken. Now, though, we've seen games from other developers with gameworks all ship broken as well.

The universal is gameworks: Dying Light, Watch Dogs, Far Cry 4, Arkham Knight, AC: Unity. All shipped seriously broken and most continue to have issues. All suffered from terrible optimization, buggy graphics, etc. Arkham Knight is so bad they pulled it from shelves. Even Witcher 3, while not a full gameworks title, does implement hairworks, and that costs you 15-20fps at all times to mildly alter the hair on Geralt. This has become all the inefficiency of gpu physx, amplified to ruin game optimization.

Gameworks really is a disaster for gaming. I would expect game developers to turn away from it until it's fixed or axed. What developer wants to follow in the footsteps of Rocksteady and have to pull a game off the shelves? The game runs fine on consoles, within those 30fps limitations. What's different on PC? Gameworks. In fact it's even worse on PC, as half the features are bugged and don't work - see ambient occlusion - broken on PC and doesn't even activate...

In concept, I think Gameworks is cool. Nvidia is adding features to make PC versions of games superior to consoles, cool.

In practice, Gameworks features are generally very minimal graphical enhancements (sometimes arguably graphical downgrades) at very severe performance cost, and the games that feature them tend to have tons of problems. Unless you're running a top-end nvidia setup, you generally have to turn off gameworks effects, and you may even want to avoid those games entirely when they use something like Physx, which falls back to an unoptimized cpu path if you don't have nvidia; even with nvidia it adds tremendous frame lag unless you have a dedicated physx card.

Also, the canned effects Nvidia adds tend to look bad. Physx looks the same in every game that uses it, and their new smoke effects also look pretty cookie-cutter and out of place.
Games that use gameworks suffer from a problem similar to games that used the Unreal 3 engine, where every game ended up looking and feeling like Gears of War.
 

Grooveriding

Diamond Member
Dec 25, 2008
Arkham Knight has a 30fps lock and elements of the physics engine and texture streaming break when you mod it to run at 60fps. Nvidia doubled the speed of their promo video to put up a 1080p60fps video.

https://youtu.be/zsjmLNZtvxk?t=25

In the background you can hear the voices of the NPCs sounding like chipmunks from the video being sped up from 30fps to 60. :D
 

Fox5

Diamond Member
Jan 31, 2005
Arkham Knight has a 30fps lock and elements of the physics engine and texture streaming break when you mod it to run at 60fps. Nvidia doubled the speed of their promo video to put up a 1080p60fps video.

https://youtu.be/zsjmLNZtvxk?t=25

In the background you can hear the voices of the NPCs sounding like chipmunks from the video being sped up from 30fps to 60. :D

I don't hear chipmunks, but that video has graphical effects (like rain on batman's cape) that aren't in the final version.

The game seems to run fine for me at 60fps. Texture streaming does seem to need a fast enough SSD to keep up, though; otherwise the game will crash. Physics seems fine too.
 
Feb 19, 2009
Read this if you don't think GameWorks has a negative effect for AMD hardware:

http://www.pcgamer.com/hardware-report-card-nvidia-vs-amd/#page-2

Notice all the complaints, stem directly from GameWorks titles.

There has been zero noise about AMD's drivers in neutral titles, because devs who aren't bought know how to optimize their game for all hardware.

What's worse are the shill tech sites.

http://www.pcgamer.com/hardware-report-card-nvidia-vs-amd/#page-4

AMD gets a B for being open with their features. NV gets a B- for being exclusive and closed source with GameWorks because it "can be harmful to the platform as a whole". Those shills don't even understand the difference: being open source means any AMD feature is freely available in raw source code, without obfuscation, for NV to optimize. What AMD does with their GE program benefits all gamers, because they don't cripple NV's ability to perform or optimize.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
There is one way in which this is worryingly similar to the Conroe situation: AMD failed to win the performance crown in spite of having a new technology that the competitor lacked. In this case, it's HBM. In the case of the K10, it was the on-die memory controller (Conroe still used a classic front-side bus). When Intel integrated the memory controller in Nehalem (which could basically be described as Conroe+IMC), they took another big step ahead of AMD, and AMD was never really able to catch up. What will happen when Nvidia starts using HBM2? In that case, AMD has to compete on a raw architecture basis, but die-shrunk GCN 1.2 doesn't seem like it is going to be competitive enough with Pascal (Maxwell+HBM2).

That said, AMD's CPU business still managed to do OK for a while even after Conroe's release. It was a serious blow, but not an insta-kill. Thuban was a decent budget alternative to Nehalem, especially if you did lots of multi-threaded work, and it didn't do too badly on single-threaded tasks either. It was Bulldozer that really torpedoed AMD's CPU division. Billions of dollars of R&D spent on a product that in many ways was objectively worse than its own predecessor. AMD would have been far better off with a straight die-shrink of Thuban followed by incremental tweaks.

Everybody wants to compare nVidia to Intel. nVidia is no Intel. Not even in JHH's fantasies. If it makes people feel better to compare AMD CPUs to Intel CPUs and then say it's the same as nVidia vs AMD GPUs, you are just talking foolishness.
 

3DVagabond

Lifer
Aug 10, 2009
Disagree. Gameworks is the universal constant in all these broken titles that use it. It's worth looking at the gameworks information. It goes beyond just turning on gpu-compute things like smoke or debris effects. It does shadowing, animations, fire, etc. Games using it have a lot of that functionality throughout the game, without the ability to disable any of it.

My comment had nothing to do with Fury X reviewing badly. The card is just bad, positioned poorly in terms of its performance, price etc. Gameworks is hurting users of either AMD or nvidia cards though, with broken games, terrible performance, abysmal optimization etc. It needs to go or nvidia needs to fix it, because currently it's a disaster for gaming.

It wins in benchmarks though. So, it's accomplishing what it was designed to do. nVidia doesn't sell games. They sell video cards and Gameworks is selling video cards.
 
Feb 19, 2009
Yet another GameWorks title that Kepler tanks in... well done nVidia, the "bug" is truly a "feature".

Look at how crap the 780, Titan and 780Ti are in relation to GCN and Maxwell.
[attached benchmark chart: 2560.png]
 

Keysplayr

Elite Member
Jan 16, 2003
Yet another GameWorks title that Kepler tanks in... well done nVidia, the "bug" is truly a "feature".

Look at how crap the 780, Titan and 780Ti are in relation to GCN and Maxwell.
[attached benchmark chart: 2560.png]

Maxwell is a generation ahead of the 780Ti. I don't understand the exaggeration going on here. The 780Ti has a min of 39 and avg. of 48, compared to the GTX970's min of 41 and avg. of 55. The 960 is showing its architectural advantage over last gen's 770 by a good margin, but is barely playable at this res and settings. The 770 will need reduced settings to be playable.

Kepler seems to be right where it's supposed to be in relation to a newer generation of GPUs. I don't remember Fermi getting slammed when Kepler outshone it and Fermi fell behind in performance improvements.
I'm not saying there won't be any improvements over time for Kepler. A performance improvement for Kepler just shipped in a recent driver.

At any rate. Overblown is the word of the week around here.
Yes. Kepler is behind Maxwell. As it should be. Nowhere near the "tank" name you use as your descriptor.
 

iiiankiii

Senior member
Apr 4, 2008
Maxwell is a generation ahead of the 780Ti. I don't understand the exaggeration going on here. The 780Ti has a min of 39 and avg. of 48, compared to the GTX970's min of 41 and avg. of 55. The 960 is showing its architectural advantage over last gen's 770 by a good margin, but is barely playable at this res and settings. The 770 will need reduced settings to be playable.

Kepler seems to be right where it's supposed to be in relation to a newer generation of GPUs. I don't remember Fermi getting slammed when Kepler outshone it and Fermi fell behind in performance improvements.
I'm not saying there won't be any improvements over time for Kepler. A performance improvement for Kepler just shipped in a recent driver.

At any rate. Overblown is the word of the week around here.
Yes. Kepler is behind Maxwell. As it should be. Nowhere near the "tank" name you use as your descriptor.

I think what's making people turn heads isn't the fact that it's slower than Maxwell. It's the fact that it's regressing against Hawaii. Kepler was extremely competitive with Hawaii. However, the moment Maxwell dropped, Kepler ceased to keep up with Hawaii. Of course, you can look at it from different perspectives.

AMD's GCN is more forward-thinking than Kepler, and as a result Kepler's weakness is showing in newer games. This makes sense because the consoles are GCN-based. As a consequence, console ports will be slightly more optimized to take advantage of GCN.

Or, Nvidia decided to put more effort toward optimizing Maxwell at the expense of Kepler. This also makes sense, because resources are limited and Nvidia wants to focus its efforts on newer products, to make sure Maxwell is as competitive and well optimized against AMD as possible.

Or a combination of both of the above. IMO, it's a combination: Kepler's weaknesses being exposed, and Nvidia prioritizing Maxwell's optimization over Kepler's.
 

Dribble

Platinum Member
Aug 9, 2005
Trying to use Batman as an example of anything seems a bit silly, seeing as there are few more broken PC ports out there...

As for gameworks working better with Maxwell: well, yes, they code the effects to play to Maxwell's strengths, not Kepler's. Previously they coded the effects to work best for Kepler, and Maxwell arrived and was still a bit faster, but obviously when you code for Maxwell the gap will grow.

The same thing would probably happen for AMD - it's nothing to do with forward thinking, it's just that their cards' architecture has barely changed in 3.5 years. If they actually released a next-gen architecture, the current gen would fall away rapidly. In the past AMD were notorious for that - the drivers only supporting the current gen well.
 
Last edited:

iiiankiii

Senior member
Apr 4, 2008
Trying to use Batman as an example of anything seems a bit silly, seeing as there are few more broken PC ports out there...

As for gameworks working better with Maxwell: well, yes, they code the effects to play to Maxwell's strengths, not Kepler's. Previously they coded the effects to work best for Kepler, and Maxwell arrived and was still a bit faster, but obviously when you code for Maxwell the gap will grow.

The same thing would probably happen for AMD - it's nothing to do with forward thinking, it's just that their cards' architecture has barely changed in 3.5 years. If they actually released a next-gen architecture, the current gen would fall away rapidly. In the past AMD were notorious for that - the drivers only supporting the current gen well.

Yes, I agree. You're basically saying Nvidia is optimizing for Maxwell more than for Kepler. That's what most people are concluding. The forward-thinking part has more to do with GCN on consoles. Let's be honest: it doesn't take a rocket scientist to recognize that having GCN in the consoles helped AMD a bit. Having consoles take advantage of the strengths of GCN is a good thing for AMD (pending competent porting, of course). It explains the drop in Kepler's performance, too.

Of course, Nvidia's counter to having GCN-optimized code on consoles is GameWorks. I must say, it's working out nicely for Nvidia. It makes Maxwell more attractive compared to other GPUs. Then again, you can make the argument that GameWorks isn't optimized much at all. GameWorks pretty much tanks performance with it on, regardless of whether you're on Maxwell, Kepler or GCN, for what are, IMO, very minimal visual gains. It just takes the smallest performance hit on Maxwell.
 

Headfoot

Diamond Member
Feb 28, 2008
IMO Kepler has a slightly weird architecture which is more dependent than others on good drivers. As soon as the driver work is de-prioritized (I'm sure they still work on it, just with a lot of the bodies assigned to Maxwell now instead), it would suffer disproportionately. The large number of shaders per SMX requires driver finagling to get them performing efficiently.

This was the case on AMD's VLIW 5xxx and 6xxx architectures.

I'm sure that the GCN in consoles helps, and that GCN seemed to be a very forward looking architecture too.
 

Dribble

Platinum Member
Aug 9, 2005
GameWorks pretty much tanks performance with it on, regardless of whether you're on Maxwell, Kepler or GCN, for what are, IMO, very minimal visual gains. It just takes the smallest performance hit on Maxwell.

I think making it push the video card hard is quite intentional on nvidia's part - if you want people to spend big $$$ on gpus, you've got to give them a good reason. Look at [H]; they are always looking for games that push cards to their limits. If the game is just a console port (i.e. most games, minus gameworks), then all you have to do to max it is run at a higher resolution, with better textures and more AA. That's too easy, so gameworks adds a selection of high-powered effects that bring lesser gpus to their knees and make you want to buy a GTX 980Ti.

There must be a chart at nvidia somewhere; they know your average gamer won't spend too much, so they sell them a mid-range card that runs all games fine. To upsell, they then push:
super high res (e.g. 4K): needs a lot of gpu power
3D/VR: needs both to draw everything twice and do that at very high fps
gamesworks effects: needs a lot of gpu power

The benefits of all the above are limited; most people will enjoy the game just as much on their 1080p screen and GTX960. But if you can convince them they must have a system that's 4K-ready, or VR-ready, or can run the game with all the extras, you can drain their wallet much faster :)
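For what it's worth, the upsell list above is easy to put rough numbers on: raw pixel throughput scales with resolution, refresh rate, and (for VR) the number of views. A quick sketch (the 1080x1200-per-eye, 90Hz figures are the first-gen Rift/Vive panels; everything else is illustrative):

```python
# Rough pixel-throughput comparison behind the "4K ready / VR ready" upsell.

def pixels_per_second(width: int, height: int, fps: int, views: int = 1) -> int:
    """Pixels the GPU must shade per second (views=2 for stereo VR)."""
    return width * height * fps * views

p_1080p60 = pixels_per_second(1920, 1080, 60)           # baseline monitor
p_4k60    = pixels_per_second(3840, 2160, 60)           # 4K at the same refresh
p_vr90    = pixels_per_second(1080, 1200, 90, views=2)  # first-gen VR headset

print(p_4k60 / p_1080p60)  # 4.0: 4K needs 4x the raw pixel rate of 1080p60
print(p_vr90 / p_1080p60)  # 1.875x, plus VR's hard 90fps frame-time deadline
```

Shading cost isn't perfectly linear in pixel count, but the multipliers show why each of those three checkboxes conveniently demands a bigger GPU.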
 
Last edited: