[Forbes] AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks

Status
Not open for further replies.

MisterLilBig

Senior member
Apr 15, 2014
1. So you mean when AMD cards ran Mantle and NV couldn't? lol, it's funny considering MS started working on DX12 years ago.
2. They don't want to compromise on image quality, probably.
3. Yes, recommended doesn't mean playing with all the features turned on.

1. Erm, yes? I have been saying this from the start. I simply corrected your lie and you just kept going with this other obvious thing. Stupid cycle.
2. Yeah, for static images.
3. Already pointed that out. Was actually surprised that no one wished for a standard on that.
 

sontin

Diamond Member
Sep 12, 2011
The dev never said they disabled it at AMD's request. They said they were only obligated to enable it for AMD. That's not at all what you're trying to make it sound like. More like the dev didn't want to deal with supporting the feature for nVidia when they weren't getting anything for it. Maybe nVidia should have paid the dev so they would have offered it to nVidia's customers?

So when an AMD partner doesn't care about nVidia users, the developer is responsible and it has nothing to do with the partnership.
But when an nVidia partner doesn't care about AMD users, nVidia is responsible.

I can really see where you stand. :(

There are two games with TressFX:
The first was broken on nVidia hardware at launch and needed a game patch to fix it.
The second doesn't let nVidia users use this feature at all.

Both games come under the Gaming Evolved banner and were advertised by AMD.
 

MisterLilBig

Senior member
Apr 15, 2014
So when an AMD partner doesn't care about nVidia users, the developer is responsible and it has nothing to do with the partnership.
But when an nVidia partner doesn't care about AMD users, nVidia is responsible.

I can really see where you stand.

There are two games with TressFX:
The first was broken on nVidia hardware at launch and needed a game patch to fix it.
The second doesn't let nVidia users use this feature at all.

Both games come under the Gaming Evolved banner and were advertised by AMD.

The developer's fault. Completely.

In the case of this thread, though, it's the developer's and Nvidia's fault. (Because of how it performs on non-Maxwell cards and because of the default settings used.)
 
Feb 19, 2009
There are two games with TressFX:
The first was broken on nVidia hardware at launch and needed a game patch to fix it.

Wrong. Tomb Raider TressFX performance was fixed by NV with a DRIVER update, ~2 weeks after the game was released. Get your facts straight first.

The game was released in early March 2013. This article was published on 20 March 2013.

http://www.hardocp.com/article/2013...deo_card_performance_iq_review/6#.VWmLDs-qpBc



"Before the latest driver release, NVIDIA GPUs were struggling with TressFX, and became unplayable with framerates between 5 and 10 FPS."

AMD's features are all open source, without code obfuscation. How developers implement them is ENTIRELY up to them; they have full code, SDK, and documentation access.

This is my 3rd time debunking the lies about TressFX from the usual suspects. Get it into your head: NV has full access to the TressFX source code, which allows them, IF they want to, to optimize quickly instead of stabbing in the dark the way AMD has to for GameWorks' obfuscated, closed-source code.
 

sontin

Diamond Member
Sep 12, 2011
Facts? This game was released in a broken state because AMD paid the developer not to send a release copy to nVidia in time for them to verify the game on their hardware. It needed a new driver from nVidia and a game patch from Eidos to fix it.

Nothing will change the fact that AMD didn't care about nVidia users.
 

3DVagabond

Lifer
Aug 10, 2009
So when an AMD partner doesn't care about nVidia users, the developer is responsible and it has nothing to do with the partnership.
But when an nVidia partner doesn't care about AMD users, nVidia is responsible.

I can really see where you stand. :(

There are two games with TressFX:
The first was broken on nVidia hardware at launch and needed a game patch to fix it.
The second doesn't let nVidia users use this feature at all.

Both games come under the Gaming Evolved banner and were advertised by AMD.

Don't need the dev to care. Just for nVidia to make the source available so AMD can take care of their customers. As for my last statement, I was only turning the argument around so you could see the fallacy. It must have gone right over your head.
 
Feb 19, 2009
Per [H] Tomb Raider benchmark review:

"NVIDIA - We are using the 314.21 driver released on 3/15/2013. This driver contains up to 45-60% performance improvement over the previous 314.14 driver in this game."

If AMD didn't care about gamers, they would have taken the GameWorks closed-source, code-obfuscation approach like NV. Then the developers themselves would not have been able to quickly release an optimization patch, and NV certainly wouldn't have had easy access to the source code to release an optimized driver.

Didn't you read the AnandTech article from 2008? NV was worried AMD would bribe developers with their "huge cash stash" to gimp NV GPU performance... so NV chose to do it first, a preemptive strike. Talk about a bunch of unethical people projecting their poor thinking onto others.

Damning insight to their twisted thinking, directly from the source:

"NVIDIA insists that if it reveals its true feature set, AMD will buy off a bunch of developers with its vast hoards of cash to enable support for DX10.1 code NVIDIA can't run. Oh wait, I'm sorry, NVIDIA is worth twice as much as AMD, who is billions in debt and struggling to keep up with its competitors on the CPU and GPU side. So we ask: who do you think is more likely to start buying off developers to the detriment of the industry?"
 

sontin

Diamond Member
Sep 12, 2011
Don't need the dev to care. Just for nVidia to make the source available so AMD can take care of their customers. As for my last statement, I was only turning the argument around so you could see the fallacy. It must have gone right over your head.

Ha, it didn't.
The source code argument doesn't make sense, because HairWorks runs on their hardware; they can optimize it through the driver. But we've already dealt with that discussion.

However, how is nVidia responsible for developers who don't want to work with them? Eidos didn't send them a release copy, and the Lichdom developer disables TressFX for them:
We purposely disabled TressFX on NVIDIA cards. We delivered TressFX on AMD hardware as part of our partnership with AMD.
https://steamcommunity.com/app/261760/discussions/2/620700960748580422/#c620700960792940362


Hardly nVidia's fault.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Ha, it didn't.
The source code argument doesn't make sense, because HairWorks runs on their hardware; they can optimize it through the driver. But we've already dealt with that discussion.

However, how is nVidia responsible for developers who don't want to work with them? Eidos didn't send them a release copy, and the Lichdom developer disables TressFX for them.

Hardly nVidia's fault.

First, just because someone states that AMD doesn't need the source to optimize its drivers doesn't mean anything. Anyone can say anything they want; it doesn't make it true. Someone says "they don't need the source code", and, well, that just settles it then? They don't need it? They're just making it up.

When trying to optimize for a "black box", you are just taking a shot in the dark, using your best guess and intuition. Hardly an effective process.

For the second time: even though you said you got it, you just went and repeated yourself again. I wasn't actually saying it was nVidia's fault. I was using the same logic others use when they say AMD didn't pay for it, so why should AMD's customers benefit? Or when nVidia says, why should we allow GPU PhysX on AMD cards, which they claim they would have to support? So I said: why should the dev support TressFX on nVidia cards? Maybe nVidia should pay for that support. Just so you could see how stupid that argument really is when it's used against nVidia. Well, you obviously see the ignorance in the argument, but you didn't get where I was coming from.

So please, don't explain to me again how it's not nVidia's fault the dev turned off TressFX. I know it.

As far as the TressFX source release goes, please stop repeating yourself. The code has been released since a couple of weeks after the feature first shipped. If you want to use that comparison, please ask nVidia to do the same. Then there will actually be something to compare.
 

Erenhardt

Diamond Member
Dec 1, 2012
Per [H] Tomb Raider benchmark review:

"NVIDIA - We are using the 314.21 driver released on 3/15/2013. This driver contains up to 45-60% performance improvement over the previous 314.14 driver in this game."

Totally a driver problem.

On top of that, the devs optimized their implementation with nVidia, something AMD can't do because the GameWorks code can't be altered.

Optimizing a driver for a black box yielded a 10% performance increase in Project CARS... woohoo. How does that compare to the gains in Tomb Raider?


WOW WOW WOW I just found something truly mind blowing. This is /thread worthy.

From the nVidia GTX 280 review here: http://www.anandtech.com/show/2549/7

We know that G80 and R600 both supported some of the DX10.1 feature set. Our goal at the least has been to determine which, if any, features were added to GT200. We would ideally like to know what DX10.1 specific features GT200 does and does not support, but we'll take what we can get. After asking our question, this is the response we got from NVIDIA Technical Marketing:

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

The policy decision that has led us to run into this type of response at every turn is reprehensible. Aside from being blatantly untrue at any level, it leaves us to wonder why we find ourselves even having to respond to this sort of a statement. Let's start with why NVIDIA's official position holds no water and then we'll get on to the bit about what it could mean.

Next, the idea that developers in collusion with ATI would actively try to harm PC gaming and frustrate gamers is false (and reeks of paranoia). Developers are interested in doing the fastest, most efficient thing to get their desired result with as little trouble to themselves as possible. If a technique makes sense, they will either take it or leave it. The goal of a developer is to make the game as enjoyable as possible for as many gamers as possible, and enabling the same experience on both AMD and NVIDIA hardware is vital. Games won't come out with either one of the two major GPU vendors unable to run the game properly, because it is bad for the game and bad for the developer.

Just like NVIDIA made an engineering decision about support for DX10.1 features, every game developer must weigh the ROI of implementing a specific feature or using a certain technique. With NVIDIA not supporting DX10.1, doing anything DX10.1 becomes less attractive to a developer because they need to write a DX10 code path anyway. Unless a DX10.1 code path is trivial to implement, produces the same result as DX10, and provides some benefit on hardware supporting DX10.1, there is no way it will ever make it into games. Unless there is some sort of marketing deal in place with a publisher to unbalance things, which is a fundamental problem with going beyond developer relations and tech support and designing marketing campaigns based on how many games display a particular hardware vendor's logo. (WOW)

So who really suffers from NVIDIA's flawed policy of silence and deception? The first to feel it are the hardware enthusiasts who love learning about hardware. Next in line are the developers because they don't even know what features NVIDIA is capable of offering. Of course, there is AMD who won't be able to sell developers on support for features that could make their hardware perform better because NVIDIA hardware doesn't support it (even if it does). Finally there are the gamers who can and will never know what could have been if a developer had easy access to just one more tool.

So why would NVIDIA take this less than honorable path? The possibilities are endless, but we're happy to help with a few suggestions. It could just be as simple as preventing AMD from getting code into games that runs well on their hardware (as may have happened with Assassin's Creed). It could be that the features NVIDIA does support are incredibly subpar in performance: just because you can do something doesn't mean you can do it well and admitting support might make them look worse than denying it. It could be that the fundamental architecture is incapable of performing certain basic functions and that reengineering from the ground up would be required for DX10.1 support.

NVIDIA insists that if it reveals its true feature set, AMD will buy off a bunch of developers with its vast hoards of cash to enable support for DX10.1 code NVIDIA can't run. Oh wait, I'm sorry, NVIDIA is worth twice as much as AMD, who is billions in debt and struggling to keep up with its competitors on the CPU and GPU side. So we ask: who do you think is more likely to start buying off developers to the detriment of the industry?


I seriously can't believe what I'm reading. Crazy. Inb4 all the Investopedia "capitalists" claiming it's ok to maximize profits lol.


Nvidia is not cool with AMD having the upper hand thanks to support for standard industry features that dramatically increase performance for their users.

But they are cool with using shady libraries that no one but Nvidia can see or modify, and that no one but Nvidia can optimize their hardware and software for.

How this approach has any backers is beyond me.
 

Jaydip

Diamond Member
Mar 29, 2010
1. Erm, yes? I have been saying this from the start. I simply corrected your lie and you just kept going with this other obvious thing. Stupid cycle.
2. Yeah, for static images.
3. Already pointed that out. Was actually surprised that no one wished for a standard on that.

1. Lol, what lie? That Mantle never ever ran on NV? No, that's correct, it didn't.
2. Lol wot?
 

Enigmoid

Platinum Member
Sep 27, 2012
Wrong. Tomb Raider TressFX performance was fixed by NV with a DRIVER update, ~2 weeks after the game was released. Get your facts straight first.

Game was released early March 2013. This article was published on the 20th March 2013.

http://www.hardocp.com/article/2013...deo_card_performance_iq_review/6#.VWmLDs-qpBc

Well, get your facts straight: Tomb Raider had multiple patches for TressFX. The game was released March 5. There were large TressFX patches on the 11th and the 19th.

http://www.pcgamer.com/tomb-raider-patch-provides-hair-care-for-nvidia-users/

http://www.ausgamers.com/news/read/...ressfx-memory-costing-plus-new-nvidia-drivers

If AMD doesn't care about gamers, they would have done the GameWorks close source, code obfuscation approach like NV. Then the developers themselves would not be able to quickly release an optimization patch and certainly NV wouldn't be able to have easy access to the source code to release an optimized driver.

Or they could just have TressFX run on AMD hardware only. Problem solved.

(On Lichdom: Battlemage there is an easy workaround to get TressFX working on Nvidia cards.)

Optimizing a driver for a black box yelded 10% perfomance increase in project cars... woohoo.. how is that comparing to the gains in tomb raider?

How much time and effort did they put in?
 
Feb 19, 2009
@Enigmoid

Per [H] Tomb Raider benchmark review:

"NVIDIA - We are using the 314.21 driver released on 3/15/2013. This driver contains up to 45-60% performance improvement over the previous 314.14 driver in this game."

Did you read the links you put up? The 1st patch was a stability fix plus improved TressFX hair rendering.

* Addressed some stability and startup issues on machines that have both Intel and NVIDIA graphics hardware.

* Some small improvements to TressFX hair rendering.

The second patch was again to improve stability, with the devs stating that the latest NV drivers would bring the performance improvements.

Here's the notes (notice this is global improvements, not NV specific):

- Cost of TressFX reduced, especially in combination with SSAA.
- TressFX memory usage reduced for AMD Eyefinity and NVIDIA Surround setups.
- TressFX simulation and graphical quality improvements.
- New Ultra-quality shadow mode for contact hardening shadows. This is not enabled by default in any quality profile, but can be enabled from the advanced settings.

"We’ve been working closely with NVIDIA to address the issues experienced by some Tomb Raider players. In conjunction with this patch, NVIDIA will be releasing updated drivers that help to improve stability and performance of Tomb Raider on NVIDIA GeForce GPUs. We are continuing to work together to resolve any remaining outstanding issues. We recommend that GeForce users update to the latest GeForce 314.21 drivers (posting today) for the best experience in Tomb Raider."

Gamers who got the updated drivers on the 15th already got 45-60% performance gains. That is 12 days after the game's release.

All NV GPU users who enjoyed Tomb Raider should be grateful that AMD's features are open source: it enables developers to optimize easily thanks to full, unobfuscated access, and it allows NV to fast-track their driver optimizations. That is the point of difference between black-box GameWorks and open-source GE.

AMD cares about all gamers, which is why they did not set their features to run only on AMD hardware. They want all gamers to enjoy them, and they do this by making it easy for devs & NV to optimize.

Have a look at recent AMD GE titles: Alien Isolation, Civ: BE, Dirt Rally. All run excellently on all hardware, including KEPLER, where a 780 Ti is still up there matching or beating a 970, not all the way down at 960 performance. Open source gives devs full access to optimize, and all good devs want their games highly optimized for as much hardware as possible.
 
Feb 19, 2009
There has been no AMD GE game that required official patches to massively improve a performance deficit on NV GPUs. Thus AMD's GE program, and the devs who sign on to it, are not actively gimping NV hardware during development.

The few examples in which NV ran poorly on release, Dirt 3 (Global Illumination) and Tomb Raider, were resolved by NV themselves with optimized drivers that massively boosted performance.

Anyone remember Deus Ex: HR? It was one of the first showcase titles for AMD GE, and it ran excellently on NV hardware on release. That has been the norm; games that ran poorly were the exception. Every single NV GameWorks title has run poorly on AMD, and indeed they all have official game patches that improve AMD performance a few months later (months, not days).
 

Atreidin

Senior member
Mar 31, 2011
The debate gets circular and will allow the market place to decide.

The only circular thing here is the nonsense argument the Nvidia apologists keep spouting:

1.) Nvidia's not doing that! (Shown evidence they are in fact doing that)
2.) Maybe they are, but AMD is just as bad! (Is shown that AMD is not "as bad")
3.) Well that's just because there's nothing to complain about in the first place, because (go to #1)
 

SirPauly

Diamond Member
Apr 28, 2009
There are two sides of the coin, not one, and both companies have valid points.

Imho,

One side of the coin and valid:

A simple point: nVidia is spending immense resources, innovating, and working hard to create libraries as value incentives for their customers and shareholders. Nothing is actually stopping AMD from doing the same thing.

The other side of the coin and just as valid:

Proprietary is divisive and may bring chaos and fragmentation to the marketplace. GameWorks, even though it uses industry standards and is brand-agnostic to some degree, doesn't allow competitors to fully optimize; one may define this as an anti-competitive action that harms a cooperative point of view.

You have two opposing points of view -- they're both right -- and that's why it becomes circular.
 
Feb 19, 2009
@SirPauly

There are definitely two business approaches. As a gamer, I feel more confident when code features are open, allowing game developers to make the best use of them and optimize them as they see fit.

If AMD follows the route NV has taken and makes its own closed-source proprietary features, it's not a good outcome for gamers, who will have to put up with a situation like console-exclusive titles: whichever company sponsors a game's development will have a major advantage in that game, to the point where it's no longer worthwhile to play it if you're not on their hardware.

How can that situation be a good result for PC gamers?
 

SirPauly

Diamond Member
Apr 28, 2009
I don't think nVidia has a major advantage in The Witcher 3 overall -- the core gameplay runs really well overall on Radeons, which can deliver a very good experience -- and ironically, Radeon owners can probably enjoy better performance with HairWorks than most GeForce SKUs.
 
Aug 11, 2008
These assertions that nVidia has an obligation to share their GameWorks code with AMD are simply bull. If you follow this reasoning, there should be no licensed software. Adobe should make full Photoshop free. SigmaPlot, a graphing program we paid several hundred dollars for in our lab? Oh, that should be free and open as well. So I have no problem at all with nVidia not giving AMD access to the GameWorks code. AMD can either use it with reduced performance, bypass it, or make their own alternative and compete with nVidia to get developers to implement it. It may sound harsh or unfair, but that is just business. In fact, if all software met this utopian dream of AMD fans where everything is free and open, I think it would ultimately harm the consumer, because it would eliminate a huge motivation for developing new software.
 

Erenhardt

Diamond Member
Dec 1, 2012
There are two sides of the coin, not one, and both companies have valid points.

Imho,

One side of the coin and valid:

A simple point: nVidia is spending immense resources, innovating, and working hard to create libraries as value incentives for their customers and shareholders.

Shareholders, I agree... but consumers? What value is there in gimping performance for no visual benefit?
 

maddie

Diamond Member
Jul 18, 2010
These assertions that nVidia has an obligation to share their code for gameworks with AMD are simply bull. If you follow this reasoning, there should be no licensed software. Adobe should make full photoshop free. Sigmaplot graphing program for which we paid several hundred dollars for our lab, oh, that should be free and open as well. So I have no problem at all with nVidia not giving access to AMD for the code of gameworks features. AMD can either use it with reduced performance, bypass it, or make their own alternative and compete with nVidia to get developers to implement it. It may sound harsh or unfair, but that is just business. In fact, if all software met this utopian dream of AMD fans that everything was free and open, ultimately I think this would harm the consumer, because it would eliminate a huge motivation for development of new software.

Just asking. Is showing your code the same as sharing your code?
 

gamervivek

Senior member
Jan 17, 2011
It doesn't work. But yeah, AMD should've replied in kind instead of taking the high road; it only works when you're winning.

The part that I totally hold in contempt is the appalling way they added MSAA support that uses standard DirectX calls - absolutely nothing which is proprietary in any useful sense. They just did ordinary stuff, a completely standard recommendation that they make and that we make to developers for how to do MSAA, and they put it in and locked it to their hardware knowing it would run just fine on our hardware. And indeed, if you simply spoof the vendor ID in the driver - which we and other people have documented - it runs absolutely fine on AMD hardware. There's nothing proprietary about it in that sense, nothing new. I think that's exceptionally poor.

What I could have done as the Developer Relations guy at AMD is say "actually, what they're doing is a reasonable business investment and I'll do exactly the same thing for all the DirectX 11 code we are adding. We'll just go in and add it, and since I can't QA it on Fermi because all they've got still is a faked up board that they showed off recently, what I'll do is I'll lock it to our hardware." Morally I think that would be reprehensible, but from a business point of view I could argue in favour of it, but we think it's really the wrong thing to do and we've not locked a single line of DirectX 11 code. That's the difference in the way [AMD] works - we work through enablement and open standards. [Nvidia] works through closed standards and disablement, which, to me is inexcusable; it's as bad as that.

http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1
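The vendor lock Huddy describes in that interview is mechanically trivial, which is his point. A minimal sketch of the idea (the function name is hypothetical; the PCI vendor IDs are the real well-known values a game would read from the adapter descriptor, e.g. DXGI's `DXGI_ADAPTER_DESC.VendorId`):

```python
# Well-known PCI vendor IDs, as reported by graphics adapters.
VENDOR_NVIDIA = 0x10DE
VENDOR_AMD = 0x1002  # ATI/AMD

def msaa_allowed(adapter_vendor_id: int) -> bool:
    """Hypothetical vendor lock: the MSAA path uses only standard
    DirectX calls, but is gated on the reported vendor ID anyway."""
    return adapter_vendor_id == VENDOR_NVIDIA

# An AMD adapter is refused the feature, even though the hardware
# runs the same standard code path fine.
assert msaa_allowed(VENDOR_AMD) is False
# A driver that spoofs the reported vendor ID defeats the check,
# and the "proprietary" feature then just works.
assert msaa_allowed(VENDOR_NVIDIA) is True
```

Nothing in the gated code path itself is vendor-specific, only the check is, which is why spoofing the reported ID was enough to make it run fine on Radeons.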
 

digitaldurandal

Golden Member
Dec 3, 2009
The only thing circular are nonsense arguments the Nvidia apologists are spouting.

1.) Nvidia's not doing that! (Shown evidence they are in fact doing that)
2.) Maybe they are, but AMD is just as bad! (Is shown that AMD is not "as bad")
3.) Well that's just because there's nothing to complain about in the first place, because (go to #1)

I think it is sad to say that this is not far off.

I really do not understand it. No matter who you are, you should want to prevent a schism in the PC gaming world. I will seriously just buy a console if it comes to that.
 