Why Gameworks is detrimental to PC Gaming

Page 4 - AnandTech Forums
Status
Not open for further replies.

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Witcher 3 Hairworks demonstrated that some Gameworks features are implemented in an anti-consumer fashion. Unless you own shares of Nvidia, I don't know why a regular PC gamer would be happy that there was no Low/Medium/High option for Hairworks in The Witcher 3. It is obviously technically possible, and most likely very cheap in programmer hours to provide, given that AMD's driver-level tessellation adjustment worked with no custom code required and the AA level could be adjusted by editing an .ini file.
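For context, the .ini tweak being referenced looked roughly like this, per community tweak guides for The Witcher 3 (file path, key name, and values are as circulated in those guides; treat them as illustrative rather than official):

```ini
; bin/config/base/rendering.ini -- The Witcher 3 (per community tweak guides, not official docs)
[Rendering]
HairWorksAALevel=4   ; reportedly ships at 8; lowering the MSAA applied to hair recovers fps
```

The point being made is that a one-key config change already exposed a quality trade-off the in-game menu never offered.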

Then with the latest Batman game, part of what pushed the complaints to the point that WB pulled the PC version was people noting that not even current Maxwell 2 cards could run Gameworks features reasonably well. Nvidia's current-gen card owners wondering why their cards were struggling to run some of the features shared with the PS4 version (the rain effects were especially comparable, iirc) caused quite a bit of bad publicity for WB.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Witcher 3 Hairworks demonstrated that some Gameworks features are implemented in an anti-consumer fashion. Unless you own shares of Nvidia, I don't know why a regular PC gamer would be happy that there was no Low/Medium/High option for Hairworks in The Witcher 3. It is obviously technically possible, and most likely very cheap in programmer hours to provide, given that AMD's driver-level tessellation adjustment worked with no custom code required and the AA level could be adjusted by editing an .ini file.

I personally hope this debacle does lead to a slider/option in future games. Let the users decide.

Then with the latest Batman game, part of what pushed the complaints to the point that WB pulled the PC version was people noting that not even current Maxwell 2 cards could run Gameworks features reasonably well. Nvidia's current-gen card owners wondering why their cards were struggling to run some of the features shared with the PS4 version (the rain effects were especially comparable, iirc) caused quite a bit of bad publicity for WB.

Rocksteady threw their port guys under the bus. Not cool.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
36
51
Hairworks could easily be fixed simply by adding an intensity slider, and hopefully we'll start getting one if enough of a ruckus is made.

Regarding Batman: Rocksteady and the company that did the porting both screwed up. Rocksteady should have checked behind the porting house's work more carefully, and the porting company should have known they were creating such a pile of dog poo.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Tomb Raider has only an "on/off" switch for TressFX, too. Never heard anyone complaining about it.

In Dirt: Showdown the "advanced lighting" setting was "on/off" only. Never heard anyone complaining about it.

At some point developers don't want to spend time optimizing their assets for too many settings.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Tomb Raider has only an "on/off" switch for TressFX, too. Never heard anyone complaining about it.

In Dirt: Showdown the "advanced lighting" setting was "on/off" only. Never heard anyone complaining about it.

At some point developers don't want to spend time optimizing their assets for too many settings.

Tomb Raider runs perfectly fine on both nVidia and AMD hardware. And it doesn't use tessellation set to 64x like The Witcher 3 does. It uses a much more sane 16x, which has no perceivable visual difference from 64x.

Not sure how the advanced lighting could be anything but off or on?

Adding a high/low setting to Hairworks is not some huge project; it's literally somebody spending 10 minutes on the UI side, and then somebody passing in a parameter for the level of tessellation to use. It's extremely basic. But it's pretty clear that nVidia wanted AMD cards and their own older cards to look bad, thereby making people think they need to upgrade.

But in the end AMD users actually get better performance, since we can set a max tessellation level and nVidia users cannot. So the AMD user can set 16x and take only about a 5 fps hit, compared to the nVidia user who is stuck with the 20 fps hit.
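The two mechanisms described above can be sketched together. This is a hypothetical C++ sketch (all names invented for illustration; neither Nvidia's nor AMD's actual APIs) of what "passing in a parameter for the level of tessellation" amounts to, combined with the clamping idea that AMD's driver-level override applies globally:

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical quality setting a game menu could expose for hair rendering.
enum class HairQuality { Low, Medium, High };

// Map the UI choice to a maximum tessellation factor for the hull shader.
// The specific numbers mirror the levels discussed in this thread:
// 16x as the "sane" level, 64x as the fixed value The Witcher 3 shipped with.
float MaxHairTessFactor(HairQuality q) {
    switch (q) {
        case HairQuality::Low:    return 8.0f;
        case HairQuality::Medium: return 16.0f;
        case HairQuality::High:   return 64.0f;
    }
    return 16.0f; // unreachable; keeps compilers happy
}

// Clamp whatever the effect requests against the user's cap -- the same
// idea as the driver-level tessellation limit, but done in-game per effect.
float ClampTessFactor(float requested, HairQuality q) {
    return std::min(requested, MaxHairTessFactor(q));
}
```

The game-side change really is this small: a menu entry plus one clamp on the value already being fed to the tessellator.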
 

showb1z

Senior member
Dec 30, 2010
462
53
91
Never heard anyone complaining about it.

I won't kid myself by thinking you're not fully aware, but I'll drop you some hints anyway.

(Attached benchmark charts: 55130.png, 55129.png, tombraider-fr.png)


Now stop acting like TressFX and GI are the same as GW.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
What is detrimental to PC gaming is waiting for others to do the work and being reactionary.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71

There is the attitude on this forum that "hurr derp, MAXED OUT" is the only way to play.

This is true. I've mentioned a few times in random threads how more benchmarks need to include 1366x768 numbers for those on laptops and those with televisions or lower-end monitors. Steam statistics show this resolution is still one of the most popular, yet on more than one occasion I nearly got my head bitten off for even suggesting it, lol.

The benchmark settings for some games are insane. Here is Crysis 3 with a 750 at 1080p getting 20 fps on max settings. Enjoy.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
In Dirt: Showdown the GTX 770 goes from being as fast as the 7970 GHz Edition to 20% slower. So why is this "okay"?
BTW: Hardware.fr has numbers, too: http://www.hardware.fr/articles/869-14/benchmark-dirt-showdown.html

TressFX is the same. It costs a huge amount of performance and was only usable for high-end users back in 2013. No difference from Hairworks in The Witcher 3.

A GTX 770 had a $399 price tag in 2013, and yet it only achieves 40 FPS at 1440p. The GTX 980 was reduced to $499 two weeks after the release of The Witcher 3, and it gets 32 FPS at ultra quality with Gameworks enabled in The Witcher 3:
http://hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review/3#.VaPNg2OU-PU

And The Witcher 3 looks much better than Tomb Raider.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
In Dirt: Showdown the GTX 770 goes from being as fast as the 7970 GHz Edition to 20% slower. So why is this "okay"?
BTW: Hardware.fr has numbers, too: http://www.hardware.fr/articles/869-14/benchmark-dirt-showdown.html

TressFX is the same. It costs a huge amount of performance and was only usable for high-end users back in 2013. No difference from Hairworks in The Witcher 3.

A GTX 770 had a $399 price tag in 2013, and yet it only achieves 40 FPS at 1440p. The GTX 980 was reduced to $499 two weeks after the release of The Witcher 3, and it gets 32 FPS at ultra quality with Gameworks enabled in The Witcher 3:
http://hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review/3#.VaPNg2OU-PU

And The Witcher 3 looks much better than Tomb Raider.
Do note the 770 had driver issues with TressFX that were later fixed. TW3 is a game issue that's there by design: Hairworks had tessellation set to 64x when they could have set it much, much lower.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
I'm sorry, but this whole thread seems to be based on a poorly thought-out argument.

You can't argue that your aged card should be able to run modern games on high or ultra; you simply can't. Graphics technology has come so far, and PC gaming at the highest settings has ALWAYS been pay-to-play. Most gamers do not have high-end hardware, as the most popular games (CS:GO, DOTA2, StarCraft II) have pretty low requirements. Not to mention most gamers play at resolutions at or below 1080p.

You are basing your argument on the flawed logic that a purchase you made based on then-available 2013 GPU technology should have guaranteed effectiveness in today's GPU climate. Market prices also fluctuate; you simply can't expect that $200 in 2012 will net you an equivalent product in 2015, which I'm sure you understand.

That is a silly expectation and a moderately selfish one.

If you want to blame companies for trying to make money off proprietary technology you might as well be yelling at clouds.

Sure, hardware diversity is shrinking, but as another member mentioned, software sales are still pretty strong. Gameworks is not killing PC gaming.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Seems like a very poor idea to intentionally cripple your older cards and alienate some portion of your customer base.

Just by chance, you will lose many customers who have to upgrade.

Then if the secret gets out, you will certainly lose many more customers.

It's very difficult to believe this is any sane company's business plan.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Honestly, I'll say that I do not like the kind of software feature fragmentation that Gameworks can create. However, I've yet to see a real detrimental effect for AMD users, since most Gameworks features are generally just effects that you can live without. Devs seem to be smart enough not to go too far beyond a certain level of parity. Even then, settings can always be reduced when you run into framerate issues. I lived with a Radeon 5850 for five years and was extremely happy with its longevity. As new games came out, I just reduced the settings, which honestly happened quite quickly as new DX11 games arrived in 2010 that tended to lean towards Nvidia GTX 400 series levels of tessellation, not AMD 5000 series. It would still be a perfectly usable card today, but moving up to the R9 270 was quite nice for me :awe:
 
Last edited:

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
Seems like a very poor idea to intentionally cripple your older cards and alienate some portion of your customer base.

Just by chance, you will lose many customers who have to upgrade.

Then if the secret gets out, you will certainly lose many more customers.

It's very difficult to believe this is any sane company's business plan.

I'm sorry but what is it that you guys expect of companies? Do you expect that game requirements should remain stagnant forever? That we've reached a level of visual fidelity that represents the pinnacle of gaming?

Hell no. Why would we as consumers ever want technology to stagnate?

Put red and green preferences aside: are people going to sit here and suggest that games released today do not LOOK and FEEL worlds better than games released in 2012/2013? You don't think new technology is the reason for this? Software optimizations can only do so much; modern graphics require some serious hardware, plain and simple.

High and Ultra settings don't make a difference? Get outta here. They have and they always will; people just have unrealistic expectations as to what their money will net them in terms of performance, because we have come so far in recent years in terms of IQ.

Come on, you simply HAVE to pay to play at high settings, sorry. A GTX 680 released in 2012 had 3.5 billion transistors, the GTX 980 Ti has 8 BILLION!

Old cards are just that, old cards.
 

amenx

Diamond Member
Dec 17, 2004
4,707
2,999
136
Seems like a very poor idea to intentionally cripple your older cards and alienate some portion of your customer base.

Just by chance, you will lose many customers who have to upgrade.

Then if the secret gets out, you will certainly lose many more customers.

It's very difficult to believe this is any sane company's business plan.
Exactly. That's why all indications are it's not intentional. Kepler is just an older architecture that has not kept up with the times. Have you or anyone else been following Nvidia's forums on the Kepler issue? Lots of people were preparing to jump ship over that. The last thing Nvidia needs is to alienate owners of last-gen cards, since if it becomes a pattern, Nvidia will be abandoned by large portions of its customer base. I think Nvidia is keenly aware of this and will probably strive to make sure this doesn't happen to Maxwell. They better, or it's bye-bye Nvidia from me.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
I'm sorry but what is it that you guys expect of companies? Do you expect that game requirements should remain stagnant forever? That we've reached a level of visual fidelity that represents the pinnacle of gaming?

Hell no. Why would we as consumers ever want technology to stagnate?

Put red and green preferences aside: are people going to sit here and suggest that games released today do not LOOK and FEEL worlds better than games released in 2012/2013? You don't think new technology is the reason for this? Software optimizations can only do so much; modern graphics require some serious hardware, plain and simple.

High and Ultra settings don't make a difference? Get outta here. They have and they always will; people just have unrealistic expectations as to what their money will net them in terms of performance, because we have come so far in recent years in terms of IQ.

Come on, you simply HAVE to pay to play at high settings, sorry. A GTX 680 released in 2012 had 3.5 billion transistors, the GTX 980 Ti has 8 BILLION!

Old cards are just that, old cards.
Under normal circumstances I'd agree with you: higher IQ comes with performance hits. However, Gameworks hasn't added anything to justify the large performance penalties it incurs. It just uselessly taxes hardware. Case in point: Hairworks. They did not have to make it 64x tessellation, as it had no effect on IQ but took a major chunk of performance. That's not moving visual fidelity forward. That's just taxing hardware needlessly.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
Under normal circumstances I'd agree with you: higher IQ comes with performance hits. However, Gameworks hasn't added anything to justify the large performance penalties it incurs. It just uselessly taxes hardware. Case in point: Hairworks. They did not have to make it 64x tessellation, as it had no effect on IQ but took a major chunk of performance. That's not moving visual fidelity forward. That's just taxing hardware needlessly.

That is one example, in one game. Surely you are not trying to dismiss all of Gameworks based on an insignificant setting in one game?

If it doesn't work well, turn it off? The Witcher 3 is not even that great of a game; it's not the end-all, be-all of Gameworks implementations going forward. Just one instance with mixed results.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Could be worse. It could be an entirely new API trying to compete with DirectX in an attempt to bring back the fragmentation that existed in the old 3dfx days to create major lock-in. *That* would be bad. Extra features that you can turn off if they won't work well enough are nothing in comparison.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
That is one example, in one game. Surely you are not trying to dismiss all of Gameworks based on an insignificant setting in one game?

If it doesn't work well, turn it off? The Witcher 3 is not even that great of a game; it's not the end-all, be-all of Gameworks implementations going forward. Just one instance with mixed results.
Arkham Knight is one too, along with Unity. There have been more bad implementations of Gameworks than good ones, by a long, long shot.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
Could be worse. It could be an entirely new API trying to compete with DirectX in an attempt to bring back the fragmentation that existed in the old 3dfx days to create major lock-in. *That* would be bad. Extra features that you can turn off if they won't work well enough are nothing in comparison.

Like g-sync?
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Can anyone here make a list of GW-infested games that didn't have major issues at launch? Well, other than The Witcher 3, surprisingly. I'm curious.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
Arkham Knight is one too, along with Unity. There have been more bad implementations of Gameworks than good ones, by a long, long shot.

Unity is just a poor port and a poor game all around. Ubisoft desperately needs new IPs.

To me, it sounds like you're just not a fan of Nvidia in general, which makes it hard for you to form an unbiased opinion on the matter.

I guess my point is this:

If you think Gameworks is bad, don't use it. If enough people share your opinion, things will change.

If you don't like Nvidia, don't buy Nvidia.

If you want to play new games on high/ultra with AA, you have to upgrade.

Don't know what else there is to say; AMD has not been competitive enough to challenge Nvidia in the dGPU market. Not saying that's a good thing for the market (I think it is not), but that is how things are.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
As a current 970 owner, I don't like the Gameworks way of doing things. The hit isn't worth the eye candy to me. Guess adding sliders to control the effects would help tremendously at times. 1080p 144Hz currently; I prefer raw fps over eye candy.
 