[Forbes] AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks


SirPauly

Diamond Member
Apr 28, 2009
You expect most PC gamers to do this? It has to be integrated into the Advanced Graphics options menu to make it easy. Also, what you're describing only became available after CDPR released a patch in response to complaints about downgraded graphics and poor performance with GameWorks. They're giving gamers these choices only now, after all the complaints. That doesn't strengthen your argument at all, since the ini tweaks came post-launch due to gamer demand.


You basically offered that there was nothing the developer could do about the broken MSAA tessellation -- just pointing out the inaccuracy. You also offered that it was basically an on/off setting -- again, just pointing out the inaccuracy.
 

maddie

Diamond Member
Jul 18, 2010
And gamers who purchased the GTX 580 6-8 months before the 6XX series probably felt ripped off, especially seeing a GTX 660 -- a $199 SKU -- almost reach a fully enabled GF110 core.

So it's a "feature" that the 780Ti still destroys a gtx960 & matches or faster than 970 as long as NV isn't actively involved/sponsoring a game's development. Kepler owners rejoice, NV hasn't forsaken ye! Wait.. no.

I remember some posters several months ago mentioning that GCN appeared to be aging more gracefully than Kepler, and I didn't really take much notice. Now, however, the performance delta appears to be accelerating. It's not like a new range of cards appears and demolishes the old, as you seem to be claiming. Taken together with what Silverforce says, someone might say, "hey, what's happening here, this is starting to look contrived."
If a new driver fixes Kepler, then that is proof in my book. Nvidia have often boasted that they employ more staff on the software side of the company to ensure the best experience possible. If true, they could not have missed the drop in Kepler performance. Impossible; therefore it had to be something that was allowed to happen.
 

bryanW1995

Lifer
May 22, 2007
PhysX, GameWorks, G-Sync, and all the other exclusive features are selling points. They're not simply successful because they're attached to the market leader. AMD's market share falling to ~20% is a recent development, with a perfect storm of GameWorks, G-Sync, and Maxwell.

What makes you think that worthless things like Gameworks or G-sync sell Nvidia cards? What sells their cards is great brand recognition and weakness in their competition (who also have worthless features that make for cool sound bites).
 

Jaydip

Diamond Member
Mar 29, 2010
*Value add for their customers on the latest GPUs.

Unless you believe it's a coincidence that in many GW titles so far, Kepler has run worse than its normal performance delta versus Maxwell?

How do you even believe what NV is doing is good for their customers when, in one of the most recent GW titles, Project Cars, a GTX 960 is spanking the Titan and 780 and matching the 780 Ti? Then in Witcher 3, all of Kepler is tanking hardcore. Poor performance at 1080p for a game that's graphically similar to console quality.

"Oh, we found some issues with Kepler performance (that we already knew about during the game's development since we sponsored it!!)... only after everyone went berserk over our planned obsoletion tactic and it made it to the Reddit front page."

Indefensible to throw Kepler owners under the bus so soon.

Demand better for your hard-earned $.
I already said Kepler owners should storm the NV forums :biggrin: and ask them to cater to Kepler owners. I absolutely don't support what they did with Kepler, but it has nothing to do with AMD.
 

Goatsecks

Senior member
May 7, 2012
I can't believe that the tin-foil-hat-wearing pitchfork mob still believes that Nvidia tried to implement a one-year product life cycle for their cards.
 

3DVagabond

Lifer
Aug 10, 2009
I found it quite strange that when the developers claimed Nvidia did not pay them, many took it at face value and didn't realize that not all 'payments' are cash transfers.

I think the quote was, "nVidia didn't pay them a single penny."
 

3DVagabond

Lifer
Aug 10, 2009
I think Intel should drop PCI-E because they don't need it, and AMD should lock their PCI-E slots to run only with AMD GPUs, because "why should they care about nVidia customers". I'm sure that all of these people defending nVidia's proprietary practices will be just fine with that.
 

SirPauly

Diamond Member
Apr 28, 2009
Sounds like you really don't care about a fair, competitive playing field. There really is nothing stopping AMD from competing with GameWorks.
 

arandomguy

Senior member
Sep 3, 2013
Some gamers are naive. They think that unless JHH hands over a briefcase full of $100 bills, NV isn't playing favorites with / bribing its partners.

1. Ryan Shrout is given preferential seating arrangements to get better photographs for his website.

2. Tim Sweeney getting a free Titan X, with NV probably working out back-door marketing/promotion promises for many UE4 games, is a two-way relationship. Tim Sweeney isn't just nice to JHH because they play golf together or something. He wants to promote Epic Games, which means partnering with a hardware manufacturer gets him more exposure for next-gen PC games, while NV gets the benefit of selling more GPUs when gamers are wowed by the next-gen effects of UE4 games (and naturally more GW features should make their way into UE4 games: PhysX, TXAA, etc.).

The review sites get exclusive first looks at newly announced products, which drives readership and their Google search/ad revenue.

http://www.slashgear.com/nvidia-geforce-gtx-titan-x-revealed-handed-to-tim-sweeney-04372104/

Really a $1000 hardware gift "With love" from the CEO of a hardware manufacturer? How can we not expect UE4 to run better on NV cards? :hmm:

So I'm curious what your feelings are on the following, and whether you'd bring up these points with equal vigor -

1) AMD had their 30 Years of Gaming event, where Kyle from HardOCP was invited as a special guest. HardOCP is one of the most vocal parties saying the current CrossFire solution is now the better experience, which I believe you have cited before. Do you question the objectivity there as well, along with AMD's marketing tactics? Do you feel that AMD does not play to the press for favorable coverage?

2) Johan Andersson, lead for DICE's Frostbite engine, has just been gifted AMD's upcoming flagship GPU. Do you now question the objectivity of Frostbite, as well as DICE's recent partnership with AMD? This has always been spun as a natural partnership without benefits changing hands.

Do you honestly feel that AMD does not employ the exact same marketing tactics, including guerrilla marketing (for instance, disguising planned PR as spontaneous community reactions), that Nvidia or really any modern major corporation uses?
 

DA CPU WIZARD

Member
Aug 26, 2013
Of course both companies give incentives for developers to support their products, or at least to support them more than they normally would versus the competition's offerings... But you have to remember these incentives do not prove that the products they support would not have been supported without the incentives in the first place.
 

showb1z

Senior member
Dec 30, 2010
Sounds like you really don't care about a fair, competitive playing field. There really is nothing stopping AMD from competing with GameWorks.

AMD competing against Nvidia with their own suite of proprietary effects is about the worst possible thing that could happen. Why would anyone want this? Next step is that they start selling their own consoles.
Stuff like GameWorks is only detrimental to the PC ecosystem as a whole, AMD and Nvidia owners alike. In the end they're hardware companies; they should compete by engineering superior hardware.
 

sontin

Diamond Member
Sep 12, 2011
"Superior Hardware"? nVidia is much faster with Geometry processing but yet the consoles are using hardware which has a problem with Geometry.

The PC and console world is different. Which mean there is room for more and other effects than you wouldnt get on the console plattform.
 

sontin

Diamond Member
Sep 12, 2011
Sure.

Bringing back the point of superior hardware:
Ubisoft used terrain tessellation for HAWX2 five years ago. Result was that AMD complained about it.

Crytek used tessellation for nearly every asset in Crysis 2 three years ago. Result was that AMD complained about it.
Fun fact: They scrapped nearly all of these assets in Crysis 3.

WB used tessellation for the cape of Batman and the snow 18 months ago. Result was that AMD complained about it.

nVidia has the superior hardware for geometry processing, but for whatever reason they are not allowed to use it.
 

showb1z

Senior member
Dec 30, 2010
nVidia has the superior hardware for geometry processing, but for whatever reason they are not allowed to use it.

64x tessellation with no IQ improvement is not just using their hardware. It's abusing it to make Maxwell look better compared to GCN/Kepler cards.
The point is, Nvidia shouldn't be calling the shots on software development. No way CDPR would have used 64x tessellation if they'd had a choice; it's utterly pointless.
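
For what it's worth, this is also why the tessellation override AMD exposes in their driver control panel works at all: the driver just clamps whatever factor the application asks for. A rough sketch of the idea in Python (illustrative only, not actual driver code):

Code:
# Rough model of a driver-side tessellation cap (illustrative, not real driver code).
# The application requests a tessellation factor per patch; the driver clamps it
# to the user-selected maximum before the hardware ever sees it.

def effective_tess_factor(requested: float, user_cap: float = 16.0) -> float:
    """Clamp the application's requested factor to the user's cap."""
    return min(requested, user_cap)

# HairWorks asks for 64x; with a 16x cap the GPU only ever tessellates at 16x.
print(effective_tess_factor(64.0, 16.0))  # -> 16.0

Which is exactly why capping tessellation in the driver recovers most of the HairWorks performance with barely any visible difference in the hair.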
 

SirPauly

Diamond Member
Apr 28, 2009
AMD competing against Nvidia with their own suite of proprietary effects is about the worst possible thing that could happen. Why would anyone want this?

imho,

So developers can save resources and time adding fidelity options to their games, and gamers may choose to enable these features for added immersion. The biggest complaint, to me, has been the lack of focus from developers on adding content for the PC platform when the major focus has been consoles. So what do you do? Nothing? Reactionary complaints? nVidia attempts to improve the PC experience for their customers by creating GameWorks: investing, spending resources, and actively trying to get features into titles.

I believe there is valid criticism of GameWorks, considering it uses industry standards at times and is brand-agnostic to some degree, and it would be nice if nVidia were more open with their source code and allowed others to fully optimize. The idealist in me would like nVidia to rethink this strategy. However, they spent the millions, innovated with their talent, did all the work, and want to protect their IP, considering the goal is to serve their customers and shareholders.
 

MisterLilBig

Senior member
Apr 15, 2014
Seriously?

Clearly, you have no idea how any of this really works.

Do you?

This is what I have learned about it:
Fur and hair is usually done as a layer or "extra" on top of the base 3D model -- probably attached to the highest-detail layer of the base model, so that when LOD kicks in, it degrades gracefully. That is why the surrounding, farthest tessellated "things" go away sooner than the player's model and other "important" models.
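
In other words, something like this (a toy sketch of distance-based LOD selection; the distances and level names are made up for illustration):

Code:
# Toy sketch of distance-based LOD selection for an add-on layer like fur/hair.
# Thresholds are made up; real engines tune these per asset.

def lod_level(distance_m: float) -> str:
    if distance_m < 5.0:
        return "full base model + full hair/fur layer"
    if distance_m < 20.0:
        return "reduced tessellation, simplified hair layer"
    return "base model only, hair/fur layer dropped"

# Distant tessellated extras fall off first; the player model keeps its
# detail the longest.
for d in (2.0, 10.0, 50.0):
    print(f"{d:>5} m -> {lod_level(d)}")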


See, AMD created Mantle to give their cards a boost, and it was never intended to run on NV, just like GW.

Mantle was intended to run everywhere. Why do people keep repeating this false information? Did Mantle run everywhere? No! But it was intended to. And it became a reality.

Mantle became Vulkan. Khronos (which includes people from pretty much every company, including NV) gave a huge "thank you" to AMD a COUPLE OF TIMES at the Game Developers Conference 2015.

For all intents and purposes, Mantle should not be mentioned anymore; it did what it was supposed to do and became what it was intended to become. Mantle was a success.




And, on a different topic: I'm actually quite surprised that no one has pointed out specifically how horribly The Witcher 3 performs on the recommended GPU, the GTX 770. Jeez, Nvidia fanboys, you guys have problems.
 

Erenhardt

Diamond Member
Dec 1, 2012
Crytek used tessellation for nearly every asset in Crysis 2 three years ago. Result was that AMD complained about it.

WB used tessellation for the cape of Batman and the snow 18 months ago. Result was that AMD complained about it.

nVidia has the superior hardware for geometry processing, but for whatever reason they are not allowed to use it.

They are allowed to use it, but come on.
A tessellated underground ocean? A concrete block tessellated to hell and back that looks exactly the same as a regular game concrete block?
Overtessellating geometry to the point where one pixel is covered by multiple triangles? Of course people don't like their GPU performance being wasted. It would be better to run bitcoin mining in the background - more productive....
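
And the waste is structural: GPUs shade pixels in 2x2 quads, so once triangles shrink below a few pixels, most of the shading work gets thrown away. A crude model of that (rough numbers; it ignores quad sharing between adjacent triangles):

Code:
# Crude model of shading efficiency on a quad-based rasterizer.
# Pixels are shaded in 2x2 quads, so a triangle covering ~1 pixel
# still pays for 4 pixel-shader invocations.

def shading_efficiency(pixels_covered: float) -> float:
    quads_shaded = max(1.0, pixels_covered / 4.0)  # at least one full quad
    return pixels_covered / (quads_shaded * 4.0)

for cov in (1, 2, 4, 16):
    print(f"{cov:>2} px/triangle -> {shading_efficiency(cov):.0%} of shader work useful")

At one triangle per pixel you're burning roughly 75% of the pixel shading for nothing.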
 

SirCanealot

Member
Jan 12, 2013
Sure.

Crytek used tessellation for nearly every asset in Crysis 2 three years ago. Result was that AMD complained about it.
Fun fact: They scrapped nearly all of these assets in Crysis 3.

Fun fact: Crytek used useless levels of tessellation on lots of assets in Crysis 2, which didn't actually improve image quality but tanked performance. They also had a massive tessellated ocean of water running under the levels. Read this: http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

Put yourself in AMD's position and tell me: what would you do?

The ability of humans to defend practices like this is really terrifying -- I'm comparing it to other areas of discussion in humanity, and it makes me seriously scared for the future (not mentioning anything specific so as not to derail the thread).
 

Erenhardt

Diamond Member
Dec 1, 2012
What would you do if AMD partnered with DICE and made Frostbite 3 use excessive amounts of draw calls for nothing? Just draw calls for under-the-map gravel particles you can't see. And I'm talking about hundreds of thousands of draw calls per frame.
[Images: API overhead test results on a 4790K, Titan X vs. 290X]


That would be a very dirty trick, leveraging the competitive advantage they acquired by developing Mantle.

I would be against that as well.

If you are going to use your advantage, do so in a way that actually benefits users, not one that punishes everyone, and punishes those without your hardware most of all.
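
Just to put a number on why that would be so nasty (toy math; the per-call cost is an assumed ballpark, real numbers vary by API and driver):

Code:
# Toy math: why hundreds of thousands of draw calls per frame is pure CPU overhead.
# The 10-microsecond per-call cost is an assumed ballpark for a DX11-era driver.

COST_PER_DRAW_S = 10e-6

def frame_cpu_seconds(draw_calls: int) -> float:
    return draw_calls * COST_PER_DRAW_S

print(frame_cpu_seconds(300_000))  # ~3 s of CPU per frame: a slideshow
print(frame_cpu_seconds(2_000))    # ~0.02 s: a sane DX11-era budget at ~50 fps

Shrinking that per-call cost is the whole point of Mantle/DX12/Vulkan, and it's exactly what those API overhead charts measure.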
 

sontin

Diamond Member
Sep 12, 2011
Fun fact: Crytek used useless levels of tessellation on lots of assets in Crysis 2, which didn't actually improve image quality but tanked performance. They also had a massive tessellated ocean of water running under the levels. Read this: http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

All of this gets culled from the pipeline.

Put yourself in AMD's position and tell me: what would you do?

Build better hardware. Hardware which is better suited for geometry processing.

Here is a bit from the HAWX2 complaint:
In fact, Nvidia estimates HAWX 2 with tessellation averages about 18 pixels per polygon. Interestingly, that's just above the 16 pixel/polygon limit that AMD Graphics CTO Eric Demers argued, at the Radeon HD 6800 series press day, is the smallest optimal polygon size on any conventional, quad-based GPU architecture.
http://techreport.com/review/19934/nvidia-geforce-gtx-580-graphics-processor/8

AMD complained about something that wasn't even there. nVidia's hardware simply was, and is, "superior" when it comes to geometry processing.
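
To put that 18 pixels/polygon figure in context (simple arithmetic; 1080p assumed):

Code:
# Quick arithmetic on the HAWX2 figure quoted above (resolution assumed).
pixels = 1920 * 1080                 # ~2.07M pixels at 1080p
px_per_poly = 18                     # Nvidia's estimate for tessellated HAWX2
print(round(pixels / px_per_poly))   # ~115,200 polygons on screen per frame

# AMD's own stated optimum was >= 16 px/polygon on any quad-based GPU,
# so 18 px/polygon sits just above that threshold, not below it.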

The same goes for Crysis 2. The performance impact of tessellation was around 18% on Fermi: http://www.hardware.fr/articles/838-7/influence-tessellation-son-niveau.html

On the other hand, the performance impact of Forward+ in Dirt: Showdown was 58% on Kepler:
http://www.hardware.fr/articles/869-14/benchmark-dirt-showdown.html

How is this okay? This is an even bigger performance impact than people are seeing with HairWorks in The Witcher 3. :|
 

MisterLilBig

Senior member
Apr 15, 2014
I would be against that as well.

Different case. DirectX 12 and Vulkan run on "all" GPUs. They can be leveraged by all companies. AMD has the better hardware in such scenarios. What's sad is that a Titan X can't yet compete with a 290X in such scenarios.
 

SirPauly

Diamond Member
Apr 28, 2009
crytech said:
1) Tessellation shows up heavier in wireframe mode than it actually is, as explained by Cry-Styves.
2) Tessellation LODs as I mentioned in my post which is why comments about it being over-tessellated and taking a screenshot of an object at point blank range are moot.
3) The performance difference is nil, thus negating any comments about wasted resources, as mentioned in my post.
4) Don't take everything you read as gospel. One incorrect statement made in the article you're referencing is about the ocean being rendered under the terrain, which is wrong; it only renders in wireframe mode, as mentioned by Cry-Styves.

So many times I've read views based on wireframes, especially the blanket view of the wireframe ocean.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
What do you expect? The wireframe shots from Crysis 2 and Batman: AO came directly from AMD. They are using them to spread FUD.

It is just easier to complain than to build better hardware. However, when they can leverage their strengths, they blame nVidia for the bad performance. :D
 