[Forbes] AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks


Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Why is AMD entitled to optimize HairWorks? It's not their code. It's not their money that was spent sending engineers to help CD Projekt RED implement HairWorks. It's not their money that was spent creating and developing HairWorks. Why should they reap any rewards? More importantly, why should Nvidia let AMD view the source code of their own IP?

IMO, Nvidia is very graciously allowing AMD's cards to run features that they (Nvidia) have spent millions of dollars developing and implementing. HairWorks could have been locked to Nvidia only. People are given an inch and expect a mile.


This is pathetic.

But it gets to the point: GameWorks is the problem, because it allows nVidia (and is being used by nVidia) to take too much control over a game's presentation and performance. Tough luck for those who have a problem with that.

These aspects of a game should not be dictated by nVidia... despite your argument that nVidia should be able to dictate things in the game by planting GameWorks code in games, and that AMD should just take whatever it gets here... no.

If that's the idea of GameWorks, then please remove GameWorks from all games, let the devs put in what they want, and allow both AMD and nVidia to optimize. Avoid all black-box code that is obviously creating issues and frustrations for gamers.


Are we really saying it's good to have nVidia put black-box code that AMD cannot optimize for into our games? And saying so knowing that nVidia purposefully uses that black-box code to present AMD performance in the worst possible light?

Yikes. Obviously AMD and nVidia both need to be able to optimize for the code their cards are running in games. This is why it was important that nVidia could optimize for TressFX; it allowed them to run it as well as they could. That is not the case for AMD and GameWorks, and that is a big problem.


Sounds like we just need to get rid of GameWorks if nVidia can't handle this responsibility properly.

The question is how to make game devs aware of this growing sentiment from gamers.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The question is how to make game devs aware of this growing sentiment from gamers.

I think you'll find it's a growing sentiment among the same old few dozen forum die-hards, and the rest of the world doesn't give a ... Mostly because it's not really a problem. The code is standard DX11 that everything can run, the devs can get the source code, it works fine on AMD cards with a few tweaks (capping the tessellation level in the driver, for instance), and it's optional anyway. Take out GameWorks and we just get straight console ports again (like we have now if you turn off all the GameWorks features).
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
Take out GameWorks and we just get straight console ports again (like we have now if you turn off all the GameWorks features).

More lies.

If the game had an "AMD feature", they could have just scaled it up on the PC and everyone would benefit.

GameWorks is not "running" on the consoles; it is literally extra code added to screw everything up.

Obviously a paid feature whose sole purpose is to make the 900 series look better.
 

SPBHM

Diamond Member
Sep 12, 2012
5,076
440
126
Why is AMD entitled to optimize HairWorks? It's not their code. It's not their money that was spent sending engineers to help CD Projekt RED implement HairWorks. It's not their money that was spent creating and developing HairWorks. Why should they reap any rewards? More importantly, why should Nvidia let AMD view the source code of their own IP?

IMO, Nvidia is very graciously allowing AMD's cards to run features that they (Nvidia) have spent millions of dollars developing and implementing. HairWorks could have been locked to Nvidia only. People are given an inch and expect a mile.

Nvidia is getting a return on it anyway, with their brand exposure and the optimization advantage; even without "locking" the competition out, they get a return.

The game is also targeting AMD GPU owners. It's a PC game, a Windows DX11 game, not an Nvidia-only game, so AMD should be allowed to run it and optimize for it.

Even the Mantle games worked nicely on Nvidia GPUs when using DX11 to produce the same result.
The TressFX situation was not ideal, but it seems it was easy enough for Nvidia to get it working well on their hardware.

By not buying that awful game The Witcher 3 :biggrin:

good luck.

Well, most people don't care about this whole thing; with HairWorks off the game still looks great and is very playable on AMD cards.

And The Witcher 3 is a fantastic game, easily in my "top 5" from over 20 years of gaming, so yes, good luck!
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Can we stop with the TressFX example?

Tomb Raider was broken on nVidia hardware at release. Neither Eidos nor AMD cared about nVidia users before they released the game. AMD even used this broken game to show how superior their hardware supposedly was:
<It seems AMD deleted all of their blog posts from before 2015...>

Eidos even integrated an in-game benchmark which doesn't reflect actual gameplay:
http://pclab.pl/art52447-4.html

And the TressFX code was only published a few days after launch, so nVidia never had any real chance to optimize for the game beforehand.

So stop using TressFX and Tomb Raider as a good example.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
1. You said Mantle was not intended to be cross-platform. That was a lie. I just corrected it.

2. I doubt a highly engineered SDK doesn't have a simple option like that. But it does, it has already been shown! Even on AMD it can be tuned... and yes, I do think they take their time to make it seem difficult.

3. This game was developed with 700-series hardware in mind; people keep "missing" that. Not only that, the RECOMMENDED GPU is a GTX 770. You cannot play the game, full-featured, with HairWorks, at 1080p, with that card unless you LOVE hitting 20 FPS every now and then.

1. I said Mantle didn't run on NV cards, and you couldn't disprove that at all.
2. No, I don't think so at all, unless you have seen the code base firsthand.
3. Recommended doesn't mean play with everything maxed.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
This is pathetic.

But it gets to the point: GameWorks is the problem, because it allows nVidia (and is being used by nVidia) to take too much control over a game's presentation and performance. Tough luck for those who have a problem with that.

These aspects of a game should not be dictated by nVidia... despite your argument that nVidia should be able to dictate things in the game by planting GameWorks code in games, and that AMD should just take whatever it gets here... no.

If that's the idea of GameWorks, then please remove GameWorks from all games, let the devs put in what they want, and allow both AMD and nVidia to optimize. Avoid all black-box code that is obviously creating issues and frustrations for gamers.

Are we really saying it's good to have nVidia put black-box code that AMD cannot optimize for into our games? And saying so knowing that nVidia purposefully uses that black-box code to present AMD performance in the worst possible light?

Yikes. Obviously AMD and nVidia both need to be able to optimize for the code their cards are running in games. This is why it was important that nVidia could optimize for TressFX; it allowed them to run it as well as they could. That is not the case for AMD and GameWorks, and that is a big problem.

Sounds like we just need to get rid of GameWorks if nVidia can't handle this responsibility properly.

The question is how to make game devs aware of this growing sentiment from gamers.

Witcher 3 runs fine on AMD cards without HairWorks.

Nothing seems to be a problem except HairWorks. Of course it is dictated by Nvidia... they designed it. It's their IP and they have every right to do whatever they want with it.

Nvidia didn't put anything in; they may have pushed for it, but ultimately CD Projekt RED did. And obviously it was worth it for them, since they put it in.

You seem to be confusing what you want, and what you think is good, with what someone is permitted to do.

The bolded part is NOT Nvidia's responsibility. It was not and is not AMD's responsibility to make sure Mantle worked on Nvidia's or Intel's cards.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
3. This game was developed with 700-series hardware in mind; people keep "missing" that. Not only that, the RECOMMENDED GPU is a GTX 770. You cannot play the game, full-featured, with HairWorks, at 1080p, with that card unless you LOVE hitting 20 FPS every now and then.

You're getting disingenuous now with your rants --- recommended doesn't mean full-featured, with HairWorks, at 1080p.

For crying out loud, Dirt Showdown, which created awareness for the HD 79xx family with the advanced lighting abilities of Forward+ and global illumination -- its recommended GPU was the 5870.

http://store.steampowered.com/app/201700/
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Of course it is dictated by Nvidia... they designed it. It's their IP and they have every right to do whatever they want with it.

Indeed! However, considering it uses industry standards and is brand-agnostic to a large degree, I would like to see nVidia open up more of its libraries and middleware, primarily for broader cooperative adoption -- which is what nVidia actually did for PhysX.

It's going to take a push from developers, I feel, for nVidia to make a change.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Can we stop with the TressFX example?

Tomb Raider was broken on nVidia hardware at release. Neither Eidos nor AMD cared about nVidia users before they released the game. AMD even used this broken game to show how superior their hardware supposedly was:
<It seems AMD deleted all of their blog posts from before 2015...>

Eidos even integrated an in-game benchmark which doesn't reflect actual gameplay:
http://pclab.pl/art52447-4.html

And the TressFX code was only published a few days after launch, so nVidia never had any real chance to optimize for the game beforehand.

So stop using TressFX and Tomb Raider as a good example.

It was the first game using TressFX. Did anyone even know about TressFX before that? IIRC it was a surprise that it had a new way of doing hair. In that case it seems a good thing the code was available just days later.

The other thing is that since Nvidia is supposedly good at developer relations, they should have seen it. It's not closed; they could look at the code alongside the Tomb Raider code if they cared.

1. I said Mantle didn't run on NV cards, and you couldn't disprove that at all.
2. No, I don't think so at all, unless you have seen the code base firsthand.
3. Recommended doesn't mean play with everything maxed.

Nvidia doesn't support FreeSync either. It's their choice whether they adopt these things in their current gen or the next.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
It was the first game using TressFX. Did anyone even know about TressFX before that? IIRC it was a surprise that it had a new way of doing hair. In that case it seems a good thing the code was available just days later.

So "don't share your code" is okay as long as AMD is the one doing it? :|

The other thing is that since Nvidia is supposedly good at developer relations, they should have seen it. It's not closed; they could look at the code alongside the Tomb Raider code if they cared.
They got the release version only a few days prior to launch. It was just not possible to optimize the driver for it in time. And the game was so broken that Eidos needed to release a patch in combination with the new driver to fix it.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
1. I said Mantle didn't run on NV cards, and you couldn't disprove that at all.
2. No, I don't think so at all, unless you have seen the code base firsthand.
3. Recommended doesn't mean play with everything maxed.

1. Why would I disprove it? I said it didn't. You also said it wasn't intended to run on them. But it was.

2. Then you think they are stupid. And that's fine. Why support people you think are stupid, though? Seriously, the tessellation level is like a volume knob; quick sketch below.

3. It's 1080p (the "standard") and with a promoted feature. So "recommended" is nowhere near "The way it's meant to be played"? *cough* Alright.
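
To put rough numbers on that knob: a back-of-the-envelope sketch in C++ (the strand count and the linear isoline scaling are my assumptions, not HairWorks' published numbers) of how much geometry work a tessellation cap saves:

```cpp
// Hypothetical sketch: treats each hair strand as an isoline tessellated into
// `factor` segments, so vertex work scales roughly linearly with the factor.
// The strand count is made up for illustration.
#include <cstdio>

int main() {
    const long strands = 20000;               // hypothetical on-screen strands
    for (int factor : {8, 16, 32, 64}) {      // e.g. a driver cap vs. a 64x default
        long verts = strands * (factor + 1);  // factor segments -> factor + 1 vertices
        std::printf("%2dx -> ~%ld vertices (%4.1f%% of the 64x work)\n",
                    factor, verts, 100.0 * (factor + 1) / 65.0);
    }
    return 0;
}
```

Which is roughly why capping tessellation at 8x or 16x in the driver recovers so much performance at little visible cost.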


You're getting disingenuous now with your rants --- recommended doesn't mean full-featured, with HairWorks, at 1080p.

For crying out loud, Dirt Showdown, which created awareness for the HD 79xx family with the advanced lighting abilities of Forward+ and global illumination -- its recommended GPU was the 5870.

IMHO, "recommended" should mean a great experience with all the features of the game running at a "standard resolution", at least 30 FPS... but that's just my opinion.

And I would have pointed out Dirt Showdown too, if I had known about it. =/
 

96Firebird

Diamond Member
Nov 8, 2010
5,749
345
126
You really think the "recommended specs" should be able to max out a game at 1080p? Has it ever been that way?

A few examples...

Rockstar recommends a 660/7870 for GTA V, when in reality the 660 gets a 23 FPS average and the 7870 a 27 FPS average at 1080p, very high settings, no MSAA.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Why is AMD entitled to optimize HairWorks?
Why is nVidia entitled to the full performance of PCIe/x86 on Intel/AMD chipsets and processors?

It's not their code. It's not their money that was spent sending engineers to help CD Projekt RED implement HairWorks. It's not their money that was spent creating and developing HairWorks. Why should they reap any rewards? More importantly, why should Nvidia let AMD view the source code of their own IP?
Exactly the same applies to nVidia enjoying free access to x86 and chipset IP from Intel/AMD.

It's not their money that was spent creating and developing x86/PCIe. Why should they reap any rewards?

IMO, Nvidia is very graciously allowing AMD's cards to run features that they (Nvidia) have spent millions of dollars developing and implementing.
IMO, AMD/Intel are very graciously allowing nVidia to enjoy free access to x86 and PCIe that they (AMD/Intel) have spent millions of dollars developing and implementing.

HairWorks could have been locked to Nvidia only. People are given an inch and expect a mile.
Would you defend AMD/Intel as zealously as you defend nVidia if they decided to lock nVidia out from freely enjoying the full performance of their technologies?

Are you honestly arguing such a lockout would benefit PC gaming consumers?
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
So "don't share your code" is okay as long as AMD is the one doing it? :|

"Don't share your code" is not cool; "share your code" is cool; "don't, but then do, share your code" is less cool, but in the long run still a lot better than never sharing your code.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You really think the "recommended specs" should be able to max out a game at 1080p? Has it ever been that way?

A few examples...

Rockstar recommends a 660/7870 for GTA V, when in reality the 660 gets a 23 FPS average and the 7870 a 27 FPS average at 1080p, very high settings, no MSAA.

Never, but recent gamer sentiment for some reason has started to assume that "recommended" means maxed out, and that if you spend $2k+ on GPUs you should be able to run every single game at max settings and max AA.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Can we stop with the TressFX example?
TressFX can be optimized for performance at the driver level by nVidia, and it was.

The same doesn't apply to HairWorks for AMD, which can never be optimized at the driver level. This has been confirmed by both the developer and AMD.

This difference is elementary to understand.
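
For anyone wondering what "optimized at the driver level" can even mean in practice: one classic mechanism is shader replacement, where the driver recognizes a known shader blob and compiles a hand-tuned variant instead. A minimal sketch, with every name invented for illustration (real drivers do this internally, not through any public API), and only workable if the vendor can study the shader in the first place:

```cpp
// Minimal sketch of driver-side shader replacement. All names are invented.
#include <cstdint>
#include <unordered_map>
#include <vector>

using Bytecode = std::vector<uint8_t>;

// FNV-1a over the blob, used as a fingerprint of a known shader.
static uint64_t HashBytecode(const Bytecode& bc) {
    uint64_t h = 1469598103934665603ull;                      // FNV offset basis
    for (uint8_t b : bc) { h ^= b; h *= 1099511628211ull; }   // FNV prime
    return h;
}

// Fingerprint -> hand-tuned replacement, shipped inside the driver
// and updated per game ("Game Ready" profile style).
static std::unordered_map<uint64_t, Bytecode> g_replacements;

// Called when the game creates a shader (think ID3D11Device::CreatePixelShader):
// a recognized blob is swapped for the tuned variant, anything else passes through.
static const Bytecode& PickShader(const Bytecode& fromGame) {
    auto it = g_replacements.find(HashBytecode(fromGame));
    return (it != g_replacements.end()) ? it->second : fromGame;
}

int main() {
    Bytecode original = {0xDE, 0xAD, 0xBE, 0xEF};             // stand-in shader blob
    g_replacements[HashBytecode(original)] = {0xCA, 0xFE};    // pretend tuned variant
    return PickShader(original).size() == 2 ? 0 : 1;          // replacement was picked
}
```

Whether that is feasible for the HairWorks blobs is exactly what is being disputed in this thread.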
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Why is nVidia entitled to the full performance of PCIe/x86 on Intel/AMD chipsets and processors?


Exactly the same applies to nVidia enjoying free access to x86 and chipset IP from Intel/AMD.

One is truly anti-competitive and one isn't, though! nVidia can't create PCIe/x86 on Intel/AMD chipsets and processors, while nothing is stopping AMD or Intel from creating their own middleware or libraries -- heck, Intel has Havok.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
You really think the "recommended specs" should be able to max out a game at 1080p?

Yes, I think a quality standard should be established if there is none.

As you have seen and shown, "recommended" literally means nothing. It is only a means to fool customers into thinking they will have a great experience.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
TressFX can be optimized for performance at the driver level by nVidia, and it was.

The same doesn't apply to HairWorks for AMD, which can never be optimized at the driver level. This has been confirmed by both the developer and AMD.

This difference is elementary to understand.

This doesn't make sense. If AMD couldn't optimize for HairWorks at the driver level, HairWorks wouldn't run on their hardware at all.

The developer only said that they can't optimize HairWorks for AMD, not that AMD couldn't optimize for it through the driver. The HairWorks technique simply isn't suited to AMD hardware.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
1. Why would I disprove it? I said it didn't. You also said it wasn't intended to run on them. But it was.

2. Then you think they are stupid. And that's fine. Why support people you think are stupid, though? Seriously, the tessellation level is like a volume knob.

3. It's 1080p (the "standard") and with a promoted feature. So "recommended" is nowhere near "The way it's meant to be played"? *cough* Alright.

IMHO, "recommended" should mean a great experience with all the features of the game running at a "standard resolution", at least 30 FPS... but that's just my opinion.

And I would have pointed out Dirt Showdown too, if I had known about it. =/

1. My point was that Mantle never ran on NV cards, and AMD used Mantle to give themselves a competitive advantage.
2. Nope, I don't think so at all; it is just their decision not to.
3. They played on a 980 when they enabled HairWorks, so I am really not sure why a 770 would be expected to run this game with max settings at 1080p.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
If you're going to cry about Mantle, cite some examples where Nvidia performance is extremely hampered in Mantle games, or stop posting about it.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
So "don't share your code" is okay as long as AMD is the one doing it? :|

Who said that? You can't always have it out right away. A few days after the first game using it shipped is not late, IMO.

They got the release version only a few days prior to launch. It was just not possible to optimize the driver for it in time. And the game was so broken that Eidos needed to release a patch in combination with the new driver to fix it.

Then they should improve their developer relations. ^_^

Tomb Raider was most definitely not broken. I was running it well on a 5770 and a Q9300 @ 3 GHz, IIRC, possibly with TressFX on.

At the end of the day, there were no artificial barriers to Nvidia optimizing for it; that's the point.
 

ph2000

Member
May 23, 2012
77
0
61
Why is AMD entitled to optimize HairWorks? It's not their code. It's not their money that was spent sending engineers to help CD Projekt RED implement HairWorks. It's not their money that was spent creating and developing HairWorks. Why should they reap any rewards? More importantly, why should Nvidia let AMD view the source code of their own IP?

IMO, Nvidia is very graciously allowing AMD's cards to run features that they (Nvidia) have spent millions of dollars developing and implementing. HairWorks could have been locked to Nvidia only. People are given an inch and expect a mile.

Locked to Nvidia only would be better, à la GPU PhysX :whistle:
 