[Forbes] AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks


Vesku

Diamond Member
Aug 25, 2005
Hairworks is garbage and nobody said the performance hit was a myth.

The point is AMD just optimized two more games that they claimed were crippled and that they were unable to optimize for.

They didn't say they couldn't improve performance, but that the output they were seeing from GameWorks code made them think there is deliberate intent to make it difficult to address via driver tweaks. There is a difference between coding to maximize your own hardware and choosing to do things that, while not ideally optimized for your hardware, run even worse on your competitor's.
 

ocre

Golden Member
Dec 26, 2008
I think you're seeing things; the x64 tessellation is only for HairWorks.



No, you're ignoring the real question here. This is not about the difference in performance penalty on Maxwell, Kepler and GCN.
Why is it a good thing to cut the average fps on Kepler/Maxwell by 26% and the minimums by 55-60% for no IQ gain? Why do people defend Nvidia flushing so much performance down the drain on all hardware for absolutely no reason?

You are going to tell me what I am ignoring in this post? You've got to be kidding.

You guys are all over the place. There are at least 39 different tactics, and the target shifts wherever it has to in order to repeat a million times how terrible Nvidia is.

It is cool that you like to play games, but I couldn't have been clearer in what I was saying. Not only was my post clear, the post I responded to was completely clear in its subject, something you ignored yourself.

There are arguments that can never be won, arguments that get nowhere. They are always met with an ever-changing and slippery opponent who will completely swap the subject from one sentence to the next. In their minds, they are winning. They don't really want to get anywhere; they like the fight and chaos too much.

Of course, I have no interest in pointless arguing. If you really want to have a conversation with a conclusion, what I would call meaningful, then it is best to talk about one thing at a time rather than 39 things at once. See, the only path to resolution is to talk through each card or claim, rather than have someone constantly throwing one of their 39 jokers down and yelling, "I win, I win."

I didn't ignore anything. The one thing I addressed is one I thought was important to point out; it just seemed like a good time to chime in.
See, this one is pure misinformation and it keeps getting repeated over and over. But you are cool with that, apparently.
I kind of like rational and real conversation.

I have no idea why you would quote me and try to twist my point into something else. Don't you think it would be better if we addressed some of these wild claims one at a time?

Just in your post, you suggest that I am defending Nvidia for flushing performance down the drain. Defend? I was just pointing out misinformation, and then you turned it into something it wasn't. Why?

I never said I liked HairWorks or that it is worth it. I never said it isn't. That is a completely different conversation, not one I ignored, just not one I was talking about at the moment. I couldn't have been clearer with my post. Even the post I replied to was clearly a matter all to itself. If I get around to it, I will pick another falsehood to address at a later time.
But I have no interest in pointless back-and-forth with slipping and swapping subjects.

So let's be real, can we?

Don't manipulate and keep the chaos going. Unless that is your whole reason for being in this thread.
 

3DVagabond

Lifer
Aug 10, 2009
Hairworks is garbage and nobody said the performance hit was a myth.

The point is AMD just optimized two more games that they claimed were crippled and that they were unable to optimize for.

You are either misunderstanding or misrepresenting AMD's position.

They can't optimize GameWorks. The dev even posted saying so. They never said they couldn't optimize anything in the entire game. The dev gave them 20 game keys a few days before release to use for optimizing (Project CARS). Is the picture any clearer for you?
 

ShintaiDK

Lifer
Apr 22, 2012
I think you're seeing things; the x64 tessellation is only for HairWorks.



No, you're ignoring the real question here. This is not about the difference in performance penalty on Maxwell, Kepler and GCN.
Why is it a good thing to cut the average fps on Kepler/Maxwell by 26% and the minimums by 55-60% for no IQ gain? Why do people defend Nvidia flushing so much performance down the drain on all hardware for absolutely no reason?

Who says it's not a developer issue? Has anyone tried benchmarking after patches 1.03 and 1.04, which both improve HairWorks performance?

There's also AMD's new driver with a ~10% increase.
http://forums.anandtech.com/showthread.php?t=2432978
 

MisterLilBig

Senior member
Apr 15, 2014
So let me try again: which NV card did run Mantle?

How does this matter? I said, no. Everyone knows it didn't. I responded to the lies given. Mantle was intended to be cross vendor and cross platform, and it did as intended when AMD went to Khronos and gave them the spec to use for Vulkan.

A Kepler fix is coming soon; we will see.

Didn't know it would take more than a few minutes to change the default tessellation value on a specific card architecture.


Did AMD forsake their 6xxx series and its customers with Dirt Showdown, or was the HD 7970's architecture simply more forward-thinking and robust?

I think the answer is "obvious": the GCN architecture has been incredible. Just look at some of the current benchmarks most people have posted; they avoid the 290X and only show the 290. And it beat a Titan X in DX12 API overhead draw calls. The 290X still has more "processing power" in SP and DP than the 980.
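
For what it's worth, the raw-throughput part of that claim is easy to sanity-check. Here is a rough sketch using commonly cited reference specs (2816 shaders at ~1.0 GHz with 1/8-rate DP for the 290X; 2048 cores at ~1.13 GHz with 1/32-rate DP for the 980); clocks on partner cards vary, so treat the exact figures as approximate:

```python
# Back-of-envelope theoretical throughput from reference specs.
# Peak FLOPS = shaders * clock * 2 (one fused multiply-add = 2 FLOPs per clock).
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000.0

r9_290x_sp = tflops(2816, 1.000)   # ~5.6 TFLOPS single precision
r9_290x_dp = r9_290x_sp / 8        # Hawaii runs DP at 1/8 rate  -> ~0.70 TFLOPS
gtx_980_sp = tflops(2048, 1.126)   # ~4.6 TFLOPS single precision
gtx_980_dp = gtx_980_sp / 32       # GM204 runs DP at 1/32 rate  -> ~0.14 TFLOPS

print(f"290X: SP {r9_290x_sp:.1f} TFLOPS, DP {r9_290x_dp:.2f} TFLOPS")
print(f"980:  SP {gtx_980_sp:.1f} TFLOPS, DP {gtx_980_dp:.2f} TFLOPS")
```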


So I believe it only affects NV users then, as AMD owners can already do that, so what is the big deal? For a long time we have been trying to achieve higher image fidelity, and that is the reason we have spent so many resources on it; it would be foolish to start compromising on that factor now.

Choice and good decisions aren't a compromise.

The point is AMD just optimized two more games that they claimed were crippled and that they were unable to optimize for.

Lies. They can't optimize GameWorks, they aren't allowed.
 

SirPauly

Diamond Member
Apr 28, 2009
I think the answer is "obvious": the GCN architecture has been incredible. ... The 290X still has more "processing power" in SP and DP than the 980.

It has proven to be a robust and forward-thinking architecture. However, it needed time for drivers to mature and for applications/games to take advantage of its strengths, and one can make the same points about Maxwell when compared to Kepler.
 

showb1z

Senior member
Dec 30, 2010

I think you're taking this a little too personally. My point wasn't aimed directly at you, but that might've been unclear, so sorry about that.
I merely quoted you because frankly this whole AMD vs Nvidia discussion is beside the point, and the graph in your post illustrated it perfectly.
It doesn't matter whether you have a Maxwell, Kepler or GCN card; everyone gets shafted by GameWorks. Performance on every single card out there takes a nosedive for zero IQ improvement. GameWorks benefits nobody, whichever "team" you are on.
The only one who has anything to gain out of this is the Nvidia PR department.

Now, all the pro-Nvidia people in this thread are saying GameWorks is no big deal. I've asked the following question 4 or 5 times and have yet to receive an answer:
Why is it ok to severely neuter performance on all hardware when it offers no IQ improvement? Why are any of you ok with this, when it's so easily fixed?

Who says it's not a developer issue?

If CDPR was allowed to lower tessellation to sane levels they would've done so.
Maybe Nvidia will finally allow it if the outrage gets big enough.
 

SirPauly

Diamond Member
Apr 28, 2009
How does this matter? I said, no. Everyone knows it didn't. I responded to the lies given. Mantle was intended to be cross vendor and cross platform, and it did as intended when AMD went to Khronos and gave them the spec to use for Vulkan.

But AMD themselves used fear mongering:

AMD said:
There will be no DirectX 12. That was it. As far as we know there are no plans for DirectX 12

http://www.hardwarecanucks.com/news/amd-roy-taylor-directx12/

Selling a "no DirectX 12 -- AMD's Mantle and our Radeon brands are superior" theme.


Choice and good decisions aren't a compromise.

There are always compromises with fidelity settings and choices to consider.

Lies. They can't optimize GameWorks, they aren't allowed.

They can't optimize using source code, but Nvidia claimed they mostly optimize from binary builds - has anyone asked if AMD can optimize at all with this method?
 

ShintaiDK

Lifer
Apr 22, 2012
If CDPR was allowed to lower tessellation to sane levels they would've done so.
Maybe Nvidia will finally allow it if the outrage gets big enough.

Got any evidence? Got any benches of patches 1.03 and 1.04? Else it's just FUD.
 

SirPauly

Diamond Member
Apr 28, 2009
If CDPR was allowed to lower tessellation to sane levels they would've done so.
Maybe Nvidia will finally allow it if the outrage gets big enough.


The hit with HairWorks fur in Far Cry 4 wasn't nearly as dramatic -- why didn't Nvidia purposely cripple performance then?
 

SirPauly

Diamond Member
Apr 28, 2009
Now, all the pro-Nvidia people in this thread are saying GameWorks is no big deal. I've asked the following question 4 or 5 times and have yet to receive an answer:
Why is it ok to severely neuter performance on all hardware when it offers no IQ improvement?

IMHO,

In Crysis 2 I understood why x64 may have been needed -- lowering the tessellation factor, one lost detail and the effect exhibited more aliasing and odd artifacts.

Adding something for an artificial performance advantage is wrong because it may do harm to many, but it's not clear yet that this is what happened, and there is too much of a rush to judgement blanketing all of GameWorks. However, it is fair game to question it and to be vocal, asking really tough questions, considering it isn't ideal and is proprietary in nature.
 

showb1z

Senior member
Dec 30, 2010
Got any evidence? Got any benches of patches 1.03 and 1.04? Else it's just FUD.

Do you? Got any benches or posts somewhere showing how those patches boosted everyone's performance by 20%?
Maybe those patches enabled 8-bit mode and everyone's framerate went over 9000. How could we possibly know?
 

MisterLilBig

Senior member
Apr 15, 2014
It has proven to be a robust and forward-thinking architecture. However, it needed time for drivers to mature and for applications/games to take advantage of its strengths, and one can make the same points about Maxwell when compared to Kepler.

Yes and no. In this case, a "setting" was set to a high default. That is all there is to it. It doesn't take much "tuning" here. You can even run it at half that value, still see almost no difference in a static image, and it would run faster. And one thing that I have mentioned for a while and people have ignored: this game lists a GTX 770 as the recommended card, and a GTX 780 has a hard time with it. Shouldn't this outrage a few people? The game runs below 30 FPS at 1080p on a GTX 770!!!
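
To put some numbers on that, here is a back-of-envelope sketch. The strand and segment counts are made-up placeholders, not the game's real figures, and the exact way HairWorks spends its tessellation budget isn't spelled out here; the only point is that the generated geometry scales linearly with the factor, while segments shorter than a pixel at 1080p add nothing you can actually see:

```python
# Illustrative only: hypothetical strand/segment counts, real HairWorks scenes differ.
def hair_vertices(guide_strands, segments_per_strand, tess_factor):
    # Each guide segment is subdivided tess_factor times along the strand.
    return guide_strands * segments_per_strand * tess_factor

strands, segments = 10_000, 4   # assumed numbers for the sketch
for factor in (64, 32, 16, 8):
    verts = hair_vertices(strands, segments, factor)
    print(f"x{factor:<2} -> {verts / 1e6:.2f} M hair vertices per frame")
# Going from x64 to x16 cuts the tessellated geometry by 4x; the visual
# difference in a still frame is close to nil because the extra segments
# are sub-pixel at 1080p.
```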

Selling a "no DirectX 12 -- AMD's Mantle and our Radeon brands are superior" theme.

My answer to this would be: what did Microsoft say about it? I don't remember any backlash from MS. And he, Roy, did say... “There will be no DirectX 12. That was it. As far as we know there are no plans for DirectX 12,” he said to Heise. “If this should not be, and someone wants to correct me – wonderful.”

Sounds more like he was pushing Microsoft to leak some info.

There are always compromises with fidelity settings and choices to consider.

Agreed, but you see the impact of it on various products. There was no "consideration"; you compromise when there is something worth gaining, and in this case the only thing gained was a better showing in benchmarks for the 900 series.

They can't optimize using source code, but Nvidia claimed they mostly optimize from binary builds - has anyone asked if AMD can optimize at all with this method?

I don't know the answer to that, but why should anyone have any trust in a company that screws its own users...?


The hit with HairWorks fur in Far Cry 4 wasn't nearly as dramatic -- why didn't Nvidia purposely cripple performance then?

You quoted me with something I didn't write. >.>"
 

MeldarthX

Golden Member
May 8, 2010
IMHO,

In Crysis 2 I understood why x64 may have been needed -- lowering the tessellation factor, one lost detail and the effect exhibited more aliasing and odd artifacts.

Adding something for an artificial performance advantage is wrong because it may do harm to many, but it's not clear yet that this is what happened, and there is too much of a rush to judgement blanketing all of GameWorks. However, it is fair game to question it and to be vocal, asking really tough questions, considering it isn't ideal and is proprietary in nature.

That's not true for Crysis 2 and you know it, Pauly - they used tessellation on everything, even when it wasn't needed: on oceans that weren't even onscreen at the time, and on bricks which showed literally no difference between tessellated and non-tessellated. 64x wasn't needed then and certainly isn't needed now.

It's used to cripple not only AMD's side but Nvidia's side... just not as much.
 

MeldarthX

Golden Member
May 8, 2010
The hit with HairWorks fur in Far Cry 4 wasn't nearly as dramatic -- why didn't Nvidia purposely cripple performance then?


The Witcher 3 >>>>> Far Cry 4 by leaps and bounds - besides, CD Projekt Red doesn't actually flat-out hate PC games, unlike Ubisoft ;) While CDPR's releases aren't perfect, they aren't the completely broken messes that Ubisoft's releases are...
 

SirPauly

Diamond Member
Apr 28, 2009
That's not true for Crysis 2 and you know it, Pauly - they used tessellation on everything, even when it wasn't needed: on oceans that weren't even onscreen at the time, and on bricks which showed literally no difference between tessellated and non-tessellated. 64x wasn't needed then and certainly isn't needed now.

It's used to cripple not only AMD's side but Nvidia's side... just not as much.

That's just a myth --- using just wireframe shots was disingenuous, and this was clearly explained by the developers; tessellation by itself wasn't this huge performance hit --- the bigger hit came from the combination of tessellation, POM and advanced shaders. This was another example of a rush to judgement.
 

MeldarthX

Golden Member
May 8, 2010
Besides - just adding: Nvidia's word is pretty much crap. They've proven time after time that they have no problem lying to their customers to cover their asses. We saw it after they got caught cheating in benches (hey, ATI was just as guilty, but they came clean-ish after getting caught). Bumpgate - they not only royally screwed millions of users by willfully dumping faulty chips on the market, they also screwed all their partners at HP/Dell/Apple, and were found guilty in court for this.

Again, they lied about the 970 memory setup; when caught, it was PR spin, with false advertising also involved. They were forced to backtrack with GameWorks - originally no one could see the black-box code until they added the *ability to license it*, and no price to license a look at this code has ever been stated...

Nvidia is about as anti-consumer as it gets, yet people still want to defend them. AMD is not perfect by any means; they still need to get their drivers out, get their new cards out, and actually start fighting back with some decent marketing. I for one don't want to see Nvidia ever rule by itself, or we would all be screwed badly.
 

MeldarthX

Golden Member
May 8, 2010
That's just a myth --- using just wireframe shots was disingenuous, and this was clearly explained by the developers; tessellation by itself wasn't this huge performance hit --- the bigger hit came from the combination of tessellation, POM and advanced shaders. This was another example of a rush to judgement.

It's not a myth, Pauly, it was plainly SHOWN; not by just one website but by almost all of them -

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

http://hothardware.com/news/Indepth...-Shows-Highly-Questionable-Tessellation-Usage

I could continue - once you applied the patch, tessellation was applied to things that weren't even visible, like water, which still carried massive tessellation.

BTW, I can post more websites of evidence if you keep going on about a myth.

Nvidia has done this time and time again; they don't care if they make their customers suffer as long as it hurts AMD more -
 

Jaydip

Diamond Member
Mar 29, 2010
How does this matter? I said, no. Everyone knows it didn't. I responded to the lies given. Mantle was intended to be cross vendor and cross platform, and it did as intended when AMD went to Khronos and gave them the spec to use for Vulkan.



Didn't know it would take more than a few minutes to change the default tessellation value on a specific card architecture.




I think the answer is "obvious": the GCN architecture has been incredible. Just look at some of the current benchmarks most people have posted; they avoid the 290X and only show the 290. And it beat a Titan X in DX12 API overhead draw calls. The 290X still has more "processing power" in SP and DP than the 980.




Choice and good decisions aren't a compromise.



Lies. They can't optimize GameWorks, they aren't allowed.

1. The point is AMD made Mantle to give their cards a competitive advantage against NV.

2. Just because it seems easy to do doesn't necessarily mean it is easy to implement; it may also be that NV is reluctant to add that and call it a "fix".

3. You already have a choice: don't run HairWorks if your card is inadequate.
 

SirPauly

Diamond Member
Apr 28, 2009
It's not a myth, Pauly, it was plainly SHOWN; not by just one website but by almost all of them -

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

http://hothardware.com/news/Indepth...-Shows-Highly-Questionable-Tessellation-Usage

I could continue - once you applied the patch, tessellation was applied to things that weren't even visible, like water, which still carried massive tessellation.

BTW, I can post more websites of evidence if you keep going on about a myth.

Nvidia has done this time and time again; they don't care if they make their customers suffer as long as it hurts AMD more -

crytech said:
1) Tessellation shows up heavier in wireframe mode than it actually is, as explained by Cry-Styves.
2) Tessellation LODs as I mentioned in my post which is why comments about it being over-tessellated and taking a screenshot of an object at point blank range are moot.
3) The performance difference is nil, thus negating any comments about wasted resources, as mentioned in my post.
4) Don't take everything you read as gospel. One incorrect statement made in the article you're referencing is about the ocean being rendered under the terrain, which is wrong; it only renders in wireframe mode, as mentioned by Cry-Styves.

Wireframes by themselves are disingenuous.


Performance at those mythical jersey barriers:


http://maldotex.blogspot.com/2011/09/tesselation-myth-in-crysis-2-el-mito-de.html


You quoted me with something I didn't write. >.>"

Sorry, I did correct it, thanks! :)
 

Enigmoid

Platinum Member
Sep 27, 2012
It's not a myth, Pauly, it was plainly SHOWN; not by just one website but by almost all of them -

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

http://hothardware.com/news/Indepth...-Shows-Highly-Questionable-Tessellation-Usage

I could continue - once you applied the patch, tessellation was applied to things that weren't even visible, like water, which still carried massive tessellation.

BTW, I can post more websites of evidence if you keep going on about a myth.

Nvidia has done this time and time again; they don't care if they make their customers suffer as long as it hurts AMD more -

I already posted this.

http://www.cryengine.com/community/viewtopic.php?f=355&t=80565&hilit=tessellation+too+much

Wireframe mode disables all culling and tessellation is treated as if you were right in front of it (maximum detail).

If you bothered to educate yourself it's pretty obvious. In all the images displayed you can see the polygons on the back of the object (there is no culling).
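
On the LOD point, here is a tiny generic sketch of the usual distance-based falloff. The near/far numbers are assumptions for illustration, and this is not CryEngine's actual code; it only shows why a point-blank screenshot (or a wireframe dump that skips the LOD path) displays far more triangles than a normal frame pays for:

```python
# Generic distance-adaptive tessellation factor, illustrative values only.
def tess_factor(distance_m, near=2.0, far=30.0, max_factor=64, min_factor=1):
    if distance_m <= near:
        return max_factor            # right in front of the camera: full detail
    if distance_m >= far:
        return min_factor            # far away: barely tessellated
    t = (distance_m - near) / (far - near)   # 0 at near, 1 at far
    return round(max_factor + t * (min_factor - max_factor))

for d in (1, 5, 15, 40):
    print(f"{d:>3} m -> tessellation factor {tess_factor(d)}")
```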

Linking to TechReport and other sites that lack the technical knowledge is like asking my family doctor about specialized cancer treatments. If I want to know about cancer treatments, I go see an oncologist who knows far more of the specific details involved.
 

bryanW1995

Lifer
May 22, 2007
I have no idea what forums have to do with anything. This is honestly getting tiring.

What's tiring to me is seeing all of the new accounts with extremely strong views that always seem to be really interested in speculation threads when another big release is coming up for one camp or the other.

I don't see why there's even a debate about this.

Nvidia is more than welcome to innovate and add features for their products like PhysX, HairWorks, etc.

If they want to make those features proprietary and pay game devs to use them, that's their right too.

The above is not the problem - the PROBLEM is that they (Nvidia & game devs) implement those features either with gross negligence or purposefully in a way that negatively affects competitors' products (and even Nvidia's older-generation products).

Good thoughts here.
 

BFG10K

Lifer
Aug 14, 2000

Enigmoid

Platinum Member
Sep 27, 2012
AMD can optimize the game in general but they can't improve Hairworks performance through the driver. That's what this whole thread is about.

Why is AMD entitled to optimize HairWorks? It's not their code. It's not their money that was spent sending engineers to help CD Projekt RED implement HairWorks. It's not their money that was spent creating and developing HairWorks. Why should they reap any rewards? More importantly, why should Nvidia allow AMD to view the source code of their own IP?

IMO, Nvidia is very graciously allowing AMD's cards to run features that they (Nvidia) have spent millions of dollars developing and implementing. HairWorks could have been locked to Nvidia only. People are given an inch and expect a mile.
 

MisterLilBig

Senior member
Apr 15, 2014
1. The point is AMD made Mantle to give their cards a competitive advantage against NV.

2. Just because it seems easy to do doesn't necessarily mean it is easy to implement; it may also be that NV is reluctant to add that and call it a "fix".

3. You already have a choice: don't run HairWorks if your card is inadequate.

1. You said Mantle was not intended to be cross-platform. That was a lie. I just corrected it.

2. I doubt a highly engineered SDK doesn't have a simple option like that. But it does, it has already been shown! Even on AMD it can be tuned... and yes, I certainly think they take their time to make it seem difficult.

3. This game was developed with 700-series hardware in mind; people keep "missing" that. Not only that, the RECOMMENDED GPU is a GTX 770. You cannot play the game, fully featured, with HairWorks, at 1080p, on that card unless you LOVE hitting 20 FPS every now and then.



Why is AMD entitled to optimize HairWorks? It's not their code. It's not their money that was spent sending engineers to help CD Projekt RED implement HairWorks. It's not their money that was spent creating and developing HairWorks. Why should they reap any rewards? More importantly, why should Nvidia allow AMD to view the source code of their own IP?

IMO, Nvidia is very graciously allowing AMD's cards to run features that they (Nvidia) have spent millions of dollars developing and implementing. HairWorks could have been locked to Nvidia only. People are given an inch and expect a mile.

Totally agree. Though everyone complained when Mantle was shown... hmm. Anyway...

The issue was AMD getting blamed for something they can't control.

The other issue seems to be the support NV gets even though they and CDPR lied about the specs needed to play the game, and the default HairWorks settings cripple performance for little to no gain other than better Maxwell benchmark scores.
 