[Forbes] AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks


coercitiv

Diamond Member
Jan 24, 2014
Hairworks could have been locked to Nvidia only. People are given an inch and expect a mile.
This has been said before: had Nvidia locked out AMD from running HairWorks, none of this outrage would have happened. (Then again, you might have seen another solution being used instead, but that's another scenario to explore.)

The key to this conundrum is being able to benchmark the effect and showcase an advantage over the competition, so having it run on both brands' GPUs is imperative: take that away and reviewers will simply ignore HairWorks in their reviews. No review comparison, no showcase. It's not as if the feature transforms the game to such a degree as to make it a must-have.

And make no mistake, this in turn applies to any initiative AMD might have: when they bring something novel to the market (on the software side), they need it to work on Nvidia GPUs too. But they'll bring something that plays well to their strengths, not their competition's.

What Nvidia did was very smart from a competitive point of view, but they overdid it in the sense that the performance hit was too high even on their own hardware. It's easy to accept such a sacrifice when there's no way around it, but considerably harder when people show significant improvements with little or (subjectively) insignificant IQ loss.

It's one thing when people are taken an inch, and a completely different story when they're taken a mile.
 

Azix

Golden Member
Apr 18, 2014
I don't really agree with locking it to Nvidia. These are elements of games that would be there without either of these companies being involved, so they should not be withheld from one party just because a particular method is being used. That's not how PC gaming has been, or how it should be.

If there is an alternative that works on the competition's hardware, then locking it down unnecessarily deprives owners of the other GPU of what they would otherwise have without Nvidia's meddling.
 

SirPauly

Diamond Member
Apr 28, 2009
If you're going to cry about Mantle, cite some examples where Nvidia performance is extremely hampered in Mantle games, or stop posting about it.
imho,


It was hampered to some degree because Mantle was proprietary and closed to Intel and nVidia, while AMD was showcasing their innovation and leveraging it for Radeon. I had no problem with that whatsoever; it was AMD's risk and resources, and they were trying to offer innovation and improved gaming experiences for their customers. It was a great idea and raised enough awareness that it probably made the industry take notice -- proprietary tech, at times, improves the competitive landscape and forces others to improve or compete harder.

nVidia didn't complain much in reaction; they questioned the spending of resources, downplayed it a bit, and competed just fine through some innovation with DirectX 11 while concentrating their focus on DirectX 12.
 

Azix

Golden Member
Apr 18, 2014
imho,


It was hampered to some degree because Mantle was proprietary and closed to Intel and nVidia, while AMD was showcasing their innovation and leveraging it for Radeon. I had no problem with that whatsoever; it was AMD's risk and resources, and they were trying to offer innovation and improved gaming experiences for their customers. It was a great idea and raised enough awareness that it probably made the industry take notice -- proprietary tech, at times, improves the competitive landscape and forces others to improve or compete harder.

nVidia didn't complain much in reaction; they questioned the spending of resources, downplayed it a bit, and competed just fine through some innovation with DirectX 11 while concentrating their focus on DirectX 12.

It is more like an optimization for a specific hardware vendor, which is not something to frown on -- as long as Nvidia can still do their own thing (which they did, with a driver).

There's nothing Nvidia could complain about. It's not like Mantle was running on their hardware and they couldn't do anything about it. That's like complaining that AMD is better at OpenCL, or that AMD has TrueAudio, or crying about FreeSync.

I doubt Nvidia ever tried to use Mantle, because they're like that. If they had asked and wanted to invest in a driver for it, who knows what AMD would have done.
 

swilli89

Golden Member
Mar 23, 2010
The fun never ends. GameWorks for Android, everybody!!!!!!!!!

http://techreport.com/news/28354/nvidia-gameworks-program-goes-mobile

Stepping outside of the AMD/nVidia debate for one moment, this could actually be good. I don't care WHAT it takes, but Android needs better games, period. The super awesome nVidia Shield console that was just featured on AT is a powerhouse of a device, but what games would even stress it? Shield has 256 Maxwell shaders, which is roughly equivalent to 400 GCN shaders. This thing has reached literally half of an Xbox One GPU. The next Tegra platform will be equal to current-gen console GPU power. Crazy stuff.
 

Stuka87

Diamond Member
Dec 10, 2010
Stepping outside of the AMD/nVidia debate for one moment, this could actually be good. I don't care WHAT it takes, but Android needs better games, period. The super awesome nVidia Shield console that was just featured on AT is a powerhouse of a device, but what games would even stress it? Shield has 256 Maxwell shaders, which is roughly equivalent to 400 GCN shaders. This thing has reached literally half of an Xbox One GPU. The next Tegra platform will be equal to current-gen console GPU power. Crazy stuff.

First, there is way more that goes into performance than raw shader count. You cannot really say X shaders from company A is equal to X shaders from company B.

The other thing you have to take into account is thermals. A console has a MUCH higher thermal limit than a tablet. Tablets (ESPECIALLY android ones) are known for having great performance for about a minute, then dropping off a cliff as they drop their clocks down to prevent overheating.

Maybe when tablets hit 14nm they will get closer to consoles, but even then I don't see them being able to run the same games at the same quality levels. Tablets also have to deal with a lot less CPU power. The Jaguar cores in the consoles are not super fast, but they still greatly outperform tablets. Not only because they have 8 cores, but because they can sustain clocks indefinitely, unlike tablets.
 

monstercameron

Diamond Member
Feb 12, 2013
First, there is way more that goes into performance than raw shader count. You cannot really say X shaders from company A is equal to X shaders from company B.

The other thing you have to take into account is thermals. A console has a MUCH higher thermal limit than a tablet. Tablets (ESPECIALLY android ones) are known for having great performance for about a minute, then dropping off a cliff as they drop their clocks down to prevent overheating.

Maybe when tablets hit 14nm they will get closer to consoles, but even then I don't see them being able to run the same games at the same quality levels. Tablets also have to deal with a lot less CPU power. The Jaguar cores in the consoles are not super fast, but they still greatly outperform tablets. Not only because they have 8 cores, but because they can sustain clocks indefinitely, unlike tablets.
The Shield console is a set-top box.
 

swilli89

Golden Member
Mar 23, 2010
First, there is way more that goes into performance than raw shader count. You cannot really say X shaders from company A is equal to X shaders from company B.

The other thing you have to take into account is thermals. A console has a MUCH higher thermal limit than a tablet. Tablets (ESPECIALLY android ones) are known for having great performance for about a minute, then dropping off a cliff as they drop their clocks down to prevent overheating.

Maybe when tablets hit 14nm they will get closer to consoles, but even then I don't see them being able to run the same games at the same quality levels. Tablets also have to deal with a lot less CPU power. The Jaguar cores in the consoles are not super fast, but they still greatly outperform tablets. Not only because they have 8 cores, but because they can sustain clocks indefinitely, unlike tablets.

Lol what? Did you read my post? I literally said "Shield Console".

From anandtech's own review.... http://www.anandtech.com/show/9289/the-nvidia-shield-android-tv-review/4


And with the Tegra X1 packing a 256 CUDA core implementation of the very potent Maxwell GPU architecture, an estimated 1GHz clockspeed, and all the power and cooling it needs to keep from throttling....

And while there may be differing factors... yes, you can definitely compare shader counts within a reasonable margin of error in the case of the GeForce 750 Ti and the Radeon 260X. They more or less perform exactly the same.

Both have a 128-bit memory bus and both are clocked exactly the same.

The 750 Ti has 640 shaders and the 260X has 896 GCN 1.1 shaders (same as in the consoles, ta-da!).

[benchmark chart: GeForce 750 Ti vs Radeon 260X performance comparison]


So using this math: 896/640 = 1.4, meaning each Maxwell shader is good for around 1.4 GCN 1.1 shaders.

This new Shield console has 256 shaders with 16 ROPs, probably sitting at close to 400 GCN shaders' worth, and I think the Xbox One has about 900. So like I said, I could definitely see next year's Shield being close to, if not equal to, the current-gen consoles.
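For anyone who wants to sanity-check that back-of-the-envelope scaling, here is a minimal Python sketch; the 1.4 ratio and the ~900-shader console figure are the rough estimates from this post, not measured numbers:

# Rough shader-equivalence estimate using the ratio derived above.
r7_260x_shaders = 896      # GCN 1.1 shaders in the Radeon R7 260X
gtx_750ti_shaders = 640    # Maxwell shaders in the GeForce GTX 750 Ti
ratio = r7_260x_shaders / gtx_750ti_shaders    # ~1.4 GCN shaders per Maxwell shader

shield_shaders = 256       # Maxwell shaders in the Tegra X1 (Shield console)
gcn_equivalent = shield_shaders * ratio        # ~358, i.e. "close to 400"
xbox_one_shaders = 900     # this post's rough figure for the Xbox One GPU

print(f"GCN-equivalent shaders: {gcn_equivalent:.0f}")                     # -> 358
print(f"Fraction of Xbox One:   {gcn_equivalent / xbox_one_shaders:.0%}")  # -> 40%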

Apologies for the thread jack! Just excited that we may finally be seeing a true third platform emerge: Nvidia powering Android, AMD powering consoles, and both going at it in PCs. Great for competition :)
 

el etro

Golden Member
Jul 21, 2013
Better forget about NvidiaCars.

TW3's HairWorks implementation is fair, IMO. They would only have to remove the ability to enable HairWorks on AMD hardware.
 

itsmydamnation

Diamond Member
Feb 6, 2011
Lol what? Did you read my post? I literally said "Shield Console".

From anandtech's own review.... http://www.anandtech.com/show/9289/the-nvidia-shield-android-tv-review/4




And while there may be differing factors... yes, you can definitely compare shader counts within a reasonable margin of error in the case of the GeForce 750 Ti and the Radeon 260X. They more or less perform exactly the same.

Both have a 128-bit memory bus and both are clocked exactly the same.

The 750 Ti has 640 shaders and the 260X has 896 GCN 1.1 shaders (same as in the consoles, ta-da!).

[benchmark chart: GeForce 750 Ti vs Radeon 260X performance comparison]


So using this math: 896/640 = 1.4, meaning each Maxwell shader is good for around 1.4 GCN 1.1 shaders.

This new Shield console has 256 shaders with 16 ROPs, probably sitting at close to 400 GCN shaders' worth, and I think the Xbox One has about 900. So like I said, I could definitely see next year's Shield being close to, if not equal to, the current-gen consoles.

Apologies for the thread jack! Just excited that we may finally be seeing a true third platform emerge: Nvidia powering Android, AMD powering consoles, and both going at it in PCs. Great for competition :)

So how are you covering the order-of-magnitude memory throughput difference? :colbert:
 

bowler484

Member
Jan 5, 2014
What benefit do you get from this blind, asinine, uneducated support of GameWorks? I guess if you pull the wool far enough over your eyes, you'd miss the crap that GameWorks does to any game it touches.

I'm not supporting GameWorks, I'm supporting the truth.

There are aspects of GameWorks that suck the performance out of any card when run, which is why I called HairWorks garbage earlier.

One thing I want to touch on here is how driver releases play into the GameWorks optimization question. When AMD has said it can’t optimize for GameWorks, what that means is that AMD can’t optimize the specific GameWorks function. In other words, in The Witcher 3, AMD can’t really do much to improve HairWorks performance.

AMD can still optimize other aspects of the game that aren’t covered by GameWorks, which is why you’ll still see performance improve in GW-enabled titles.

Source.

So once again, there are aspects of GameWorks that can truly hurt AMD performance, and there are others, such as HBAO+, that don't.

But we're being conditioned by AMD to hate GameWorks regardless of which features a game uses. Your comment to me proves it, because now you think GameWorks = bad regardless of the truth. Watch Dogs led to a huge GameWorks story about how AMD performance was crippled, yet the reality was that Watch Dogs had two GameWorks features: HBAO+, which was proven by multiple sites to have the same effect on either camp's cards, and TXAA, which the AMD cards weren't even allowed to run. The truth is AMD had access to the whole game to optimize, with the exception of those two features, and one of them they couldn't use anyway. But somehow this was turned into a huge media story to bring negative attention to Nvidia.

I hate companies lying to us. I hope Nvidia loses all the market share the 970 gained because of the memory lies. But it's not fair to sit back and call one out for lies and not the other. AMD is not telling the whole truth about GameWorks, plain and simple.
 

swilli89

Golden Member
Mar 23, 2010
so how you covering the in order of magnitude memory throughput difference :colbert:....
Yeah, that's a consideration to take into account for sure. I think next year is the year we get a purpose-built ARM system intended for home gaming. I think all it would take is a dual-channel DDR4 interface and memory wouldn't be a bottleneck.
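For rough context on that bandwidth question, here is a minimal sketch of peak theoretical throughput; the bus widths and transfer rates below are approximate published specs assumed for illustration, not numbers taken from this thread:

# Peak theoretical memory bandwidth = bus width (bytes) * transfer rate.
def bandwidth_gb_s(bus_bits: int, mt_per_s: float) -> float:
    return bus_bits / 8 * mt_per_s * 1e6 / 1e9

# Approximate specs, assumed for illustration:
tegra_x1  = bandwidth_gb_s(64, 3200)    # Shield console, 64-bit LPDDR4-3200   -> ~25.6 GB/s
ddr4_dual = bandwidth_gb_s(128, 3200)   # hypothetical dual-channel DDR4-3200  -> ~51.2 GB/s
xbox_one  = bandwidth_gb_s(256, 2133)   # Xbox One, 256-bit DDR3-2133          -> ~68.3 GB/s (plus ESRAM on top)

print(f"Tegra X1:        {tegra_x1:.1f} GB/s")
print(f"Dual-ch DDR4:    {ddr4_dual:.1f} GB/s")
print(f"Xbox One (DDR3): {xbox_one:.1f} GB/s")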

WOW WOW WOW I just found something truly mind blowing. This is /thread worthy.

From the nVidia 280 review here http://www.anandtech.com/show/2549/7

We know that both G80 and R600 supported some of the DX10.1 feature set. Our goal at the least has been to determine which, if any, features were added to GT200. We would ideally like to know what DX10.1-specific features GT200 does and does not support, but we'll take what we can get. After asking our question, this is the response we got from NVIDIA Technical Marketing:

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

The policy decision that has led us to run into this type of response at every turn is reprehensible. Aside from being blatantly untrue at any level, it leaves us to wonder why we find ourselves even having to respond to this sort of a statement. Let's start with why NVIDIA's official position holds no water and then we'll get on to the bit about what it could mean.

Next, the idea that developers in collusion with ATI would actively try to harm PC gaming and frustrate gamers is false (and reeks of paranoia). Developers are interested in doing the fastest, most efficient thing to get their desired result with as little trouble to themselves as possible. If a technique makes sense, they will either take it or leave it. The goal of a developer is to make the game as enjoyable as possible for as many gamers as possible, and enabling the same experience on both AMD and NVIDIA hardware is vital. Games won't come out with either one of the two major GPU vendors unable to run the game properly, because it is bad for the game and bad for the developer.

Just like NVIDIA made an engineering decision about support for DX10.1 features, every game developer must weigh the ROI of implementing a specific feature or using a certain technique. With NVIDIA not supporting DX10.1, doing anything DX10.1 becomes less attractive to a developer because they need to write a DX10 code path anyway. Unless a DX10.1 code path is trivial to implement, produces the same result as DX10, and provides some benefit on hardware supporting DX10.1, there is no way it will ever make it into games. Unless there is some sort of marketing deal in place with a publisher to unbalance things, which is a fundamental problem with going beyond developer relations and tech support and designing marketing campaigns based on how many games display a particular hardware vendor's logo. (WOW)

So who really suffers from NVIDIA's flawed policy of silence and deception? The first to feel it are the hardware enthusiasts who love learning about hardware. Next in line are the developers because they don't even know what features NVIDIA is capable of offering. Of course, there is AMD who won't be able to sell developers on support for features that could make their hardware perform better because NVIDIA hardware doesn't support it (even if it does). Finally there are the gamers who can and will never know what could have been if a developer had easy access to just one more tool.

So why would NVIDIA take this less than honorable path? The possibilities are endless, but we're happy to help with a few suggestions. It could just be as simple as preventing AMD from getting code into games that runs well on their hardware (as may have happened with Assassin's Creed). It could be that the features NVIDIA does support are incredibly subpar in performance: just because you can do something doesn't mean you can do it well and admitting support might make them look worse than denying it. It could be that the fundamental architecture is incapable of performing certain basic functions and that reengineering from the ground up would be required for DX10.1 support.

NVIDIA insists that if it reveals its true feature set, AMD will buy off a bunch of developers with its vast hoards of cash to enable support for DX10.1 code NVIDIA can't run. Oh wait, I'm sorry, NVIDIA is worth twice as much as AMD, which is billions in debt and struggling to keep up with its competitors on the CPU and GPU side. So we ask: who do you think is more likely to start buying off developers to the detriment of the industry?

I seriously can't believe what I'm reading. Crazy. Inb4 all the Investopedia "capitalists" claiming it's OK to maximize profits lol.
 

swilli89

Golden Member
Mar 23, 2010
Also wanted to add that the above was written by none other than Anand Lal Shimpi himself. Seems like what was once considered unethical and dirty is now suddenly "smart and fair business". When did we stop defending ourselves as consumers?
 

maddie

Diamond Member
Jul 18, 2010
Yeah, that's a consideration to take into account for sure. I think next year is the year we get a purpose-built ARM system intended for home gaming. I think all it would take is a dual-channel DDR4 interface and memory wouldn't be a bottleneck.

WOW WOW WOW I just found something truly mind blowing. This is /thread worthy.

From the nVidia 280 review here http://www.anandtech.com/show/2549/7



I seriously can't believe what I'm reading. Crazy. Inb4 all the Investopedia "capitalists" claiming it's OK to maximize profits lol.

Great find. Now I sit back and read the responses.
 
Aug 11, 2008
Just wow. People can't find enough to whine about in GameWorks now, so they have to go back to 2008 to find something else to complain about? In any case, it is totally off topic in relation to Witcher 3.
 

swilli89

Golden Member
Mar 23, 2010
Just wow. People can't find enough to whine about in GameWorks now, so they have to go back to 2008 to find something else to complain about? In any case, it is totally off topic in relation to Witcher 3.

I was going back looking at historical graphics card release prices in order to provide perspective on today's prices. I (an enthusiast) just started reading that review to see how technology has changed over the years. My research actually defends nVidia a bit, at least in terms of pricing!

I stumbled upon that passage and, when I realized what it was saying, had a moment of pure awe. Only a fool disregards history, and truth doesn't cease to be truth just because time passes! That article is Anand himself condemning the practice of vendor-specific features and optimizations; it's super relevant to this discussion.
 

HurleyBird

Platinum Member
Apr 22, 2003
That article is Anand himself condemning the practice of vendor-specific features and optimizations; it's super relevant to this discussion.

Not to mention the fact that Nvidia's prior statements and current behaviour are extraordinarily hypocritical.
 

crashtech

Lifer
Jan 4, 2013
Just wow. People can't find enough to whine about in GameWorks now, so they have to go back to 2008 to find something else to complain about? In any case, it is totally off topic in relation to Witcher 3.
As far as whether a leopard can change its spots goes, it could be admissible, depending on which justice system is in play.
 

swilli89

Golden Member
Mar 23, 2010
Not to mention the fact that Nvidia's prior statements and current behaviour are extraordinarily hypocritical.
It's this hypocrisy in action that stunned me. I don't think it's OK for anyone to go back on their word. Not Nvidia, not AMD, not Intel, not anyone for that matter. They condemned this practice and now aggressively pursue it.
 
Feb 19, 2009
WOW WOW WOW I just found something truly mind blowing. This is /thread worthy.

From the nVidia 280 review here http://www.anandtech.com/show/2549/7

I seriously can't believe what I'm reading. Crazy. Inb4 all the Investopedia "capitalists" claiming it's OK to maximize profits lol.

"NVIDIA insists that if it reveals it's true feature set, AMD will buy off a bunch of developers with its vast hoards of cash to enable support for DX10.1 code NVIDIA can't run. Oh wait, I'm sorry, NVIDIA is worth twice as much as AMD who is billions in debt and struggling to keep up with its competitors on the CPU and GPU side. So we ask: who do you think is more likely to start buying off developers to the detriment of the industry?"

So NV was afraid that AMD would play dirty, so they did it first, pre-empting them in the race to the bottom on ethics. Nice play.

http://www.anandtech.com/show/2536/6

Ubisoft (these jerks again) intentionally crippled AMD performance with a PATCH after they accidentally made an NV-sponsored game run faster on AMD hardware, with the 3870 destroying the 8800 GT:

"So why did Ubisoft remove DirectX 10.1 support? The official statement reads as follows: "The performance gains seen by players who are currently playing AC with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly." An additional render pass is certainly a costly function; what the above statement doesn't clearly state is that DirectX 10.1 allows one fewer rendering pass when running anti-aliasing, and this is a good thing. We contacted AMD/ATI, NVIDIA, and Ubisoft to see if we could get some more clarification on what's going on. Not surprisingly, ATI was the only company willing to talk with us, and even they wouldn't come right out and say exactly what occurred.

Reading between the lines, it seems clear that NVIDIA and Ubisoft reached some sort of agreement where DirectX 10.1 support was pulled with the patch. ATI obviously can't come out and rip on Ubisoft for this decision, because they need to maintain their business relationship. We on the other hand have no such qualms. Money might not have changed hands directly, but as part of NVIDIA's "The Way It's Meant to Be Played" program, it's a safe bet that NVIDIA wasn't happy about seeing DirectX 10.1 support in the game -- particularly when that support caused ATI's hardware to significantly outperform NVIDIA's hardware in certain situations."

So back then, what NV was doing with its TWIMTBP program was considered DIRTY, and AnandTech was brave enough to call them out about it. Which US tech sites are still independent enough to call out GameWorks, which takes TWIMTBP to another level? Certainly not Forbes, which blames AMD for poor performance in NV-sponsored games.

Let this sink in again; it cannot be emphasized enough:

"So we ask: who do you think is more likely to start buying off developers to the detriment of the industry?"

NV, that's who. They don't care about the gaming scene; they just want people to constantly upgrade to their latest & greatest, and without competition from AMD they will actively neuter "obsolete" GPUs in their sponsored titles to enforce planned obsolescence. Keep on supporting them, guys!
 

Kenmitch

Diamond Member
Oct 10, 1999
Yeah, that's a consideration to take into account for sure. I think next year is the year we get a purpose-built ARM system intended for home gaming. I think all it would take is a dual-channel DDR4 interface and memory wouldn't be a bottleneck.

WOW WOW WOW I just found something truly mind blowing. This is /thread worthy.

From the nVidia 280 review here http://www.anandtech.com/show/2549/7



I seriously can't believe what I'm reading. Crazy. Inb4 all the Investopedia "capitalists" claiming it's OK to maximize profits lol.

That's crazy!

Kool-Aid detox I hear is painful.
 