[Forbes] AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks


SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
1) The 970 fiasco. Gimping the last 1/2GB of RAM is a feature. lol.

Using wrong specs, intentionally or unintentionally, is wrong, and I was very vocal about it. It was more so an engineering feature so the SKU could offer more performance and more memory for cut-down cores, improving margins for nVidia while still being offered at an attractive price point. Laugh all you want.

If you think AMD is doing enough and nVidia is to blame -- that's fine. I agree with the author of the Forbes article.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
It could be a lot of things. I can see how lack of resources comes to mind, as AMD doesn't have the cash to send its people to work with as many devs as Nvidia can. Nor do they have the same size driver team.

But as you said, the pattern is there for anybody willing to look. All the GameWorks games that AMD said they could not optimize for have gotten optimizations after the fact. But the fact that we seem to always get tech articles ripping Nvidia each time before the game gets fixed is rather convenient in its timing.

For the most part, Gameworks is crap. It would not hurt my feelings one bit if Hairworks and Physx disappeared. But HBAO+, TXAA and PCSS have benefits that are worth saving.

I find it humorously conspiratorial to believe AMD is actively complicit in such things as being locked out of HBAO+ and AA for no apparent technical reason, or in having fixes they have worked on actively refused by developers *cough* Batman AA *cough*. On the face of it, the effort to actively sabotage the competitor and, through collateral damage, the people using the competitor's products is squarely on Nvidia in this gaming GPU duopoly.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Using wrong specs, intentionally or unintentionally, is wrong, and I was very vocal about it. It was more so an engineering feature....

That benefits Nvidia by allowing them to save a few dollars per die. This is not a positive engineering feature; in fact, it should not be called a feature at all. BTW, excellent post 3DVagabond. :thumbsup:
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
At least they are innovative at it: the first cut-down die whose cut-down logic hampered its effective memory beyond what the cut-down specs would indicate. Hopefully nVidia learned their lesson and will start making more robust crossbars.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Many people on this thread are gonna blow off what you are saying, but there is a clear pattern. I really don't think many people have the guts to say what you have, for fear of the pushback.

I have seen enough lately that makes me wonder. It seems that part of AMD's plan could be to twist Nvidia advantages into something bad, evil, or awful. I don't want to get too far off topic and I will only touch on this. As an example, Nvidia comes out with Gsync and AMD's response was first to attack and downplay, then manipulate. Just look at the name of their alternative technology and clearly you can see the psychological intent.

Back to your theory on GameWorks: I was confused when people blamed GameWorks for poor performance in Watch Dogs, AC Unity, etc. Yet the smear campaign seemed to take hold very well, at least on tech forums.
I just couldn't see how, though. For most of the big games, HBAO+ was the only GameWorks feature that even ran on AMD HW... yet the performance hit was not out of proportion on AMD cards. The hit was similar on both Nvidia and AMD cards.
Clearly, there was/is something else going on. It was not GameWorks at all.

This is proven when, all of a sudden, several weeks after the game has launched, AMD GPUs magically get a huge boost in performance. I think it was a 40% increase for Unity. That is massive. But how? If GameWorks was the issue, if HBAO+ was the issue, how did AMD get a 40% increase in performance? I thought there was nothing they could do?

Obviously there was something that could be done.
When I brought up the fact that HBAO+ doesn't penalize performance a crazy amount on AMD HW, or that it doesn't even have to be used at all, that you can turn it off completely... that the GameWorks feature HBAO+ just cannot be responsible for the performance issues in those games, then you get met with a list of other mixed-up reasoning. Like, AMD can't work with developers on GameWorks titles, that Nvidia prevents them in their contracts.

These are interesting claims. A mixture of facts with fallacy, like all good conspiracy theories are. Nvidia does prevent developers from sharing GameWorks code. This is stretched to become... the entire game. Nvidia doesn't allow developers to share HBAO+ or TXAA code, and this is twisted to become something far, far more sinister. If AMD's goal is to downplay or twist Nvidia advantages, this sure would be a great way to try to do it.

Ultimately, there is a huge problem with those claims. The fact that AMD somehow manages to get up to 40% more performance down the road is completely at odds with them. Obviously AMD could optimize and gain performance on a GameWorks game. They have done this time and time again. Why do they wait until after the fact? It could be by design, as you suggest.

I think AMD does have a strategy and does try to downplay/reduce the image of Nvidia advantages. I believe there is this very real matter of resources that all people can accept, but people might be blind to the smear campaign that I think is in full force. I believe both are guilty of smearing from time to time.
As for AMD purposely not optimizing just because it is a GameWorks title, it might be a little more complicated. Perhaps resources are at play here as well. AMD may be taking a back seat and responding once they see how popular these games become. In the meantime, to save face, it is GameWorks and Nvidia who get the blame. Can we expect them to say, "well, we only have so much money"?

It is more than obvious that GameWorks is a scapegoat. HBAO+ and TXAA had no hand in harming AMD performance in those games. Even without running GameWorks features, the performance wasn't too strong. When Nvidia spends time on games they think are gonna be big, they maximize performance and optimize as much as they can. They do a lot of work before the game even comes out. For a game like Unity, AMD did this kind of work after the game launched. All the while, GameWorks was blamed.

So basically I can see what you are getting at. There is little doubt in my mind about the pattern that plays out. I just don't know if I would say that AMD is waiting until after a game launches to optimize as part of a plan to attack GameWorks/Nvidia. I propose a different reason.

I so agree with this!:thumbsup:
 

crashtech

Lifer
Jan 4, 2013
10,695
2,294
146
...As an example, Nvidia comes out with Gsync and AMD's response was first to attack and downplay, then manipulate. Just look at the name of their alternative technology and clearly you can see the psychological intent...
The name you seem reluctant to mention is FreeSync. It's called that because it is royalty free and now part of the DisplayPort 1.2a spec, unlike the proprietary and arguably inferior Nvidia G-Sync. To use this as an example really undermines your argument, imo, because it clearly shows that regardless of the execution, the end result is that AMD produced a standard that can benefit all consumers for free, and Nvidia as usual clings to their proprietary and exclusionary tactics.
 
Silverforce11

Feb 19, 2009
10,457
10
76
The name you seem reluctant to mention is FreeSync. It's called that because it is royalty free and now part of the DisplayPort 1.2a spec, unlike the proprietary and arguably inferior Nvidia G-Sync. To use this as an example really undermines your argument, imo, because it clearly shows that regardless of the execution, the end result is that AMD produced a standard that can benefit all consumers for free, and Nvidia as usual clings to their proprietary and exclusionary tactics.

To be fair, the current Freesync implementation is unusable due to the 40Hz lower limit (I don't even know what they were thinking releasing such useless FS models). It needs to go down to 30Hz before you can say it's on par with GSync.
 

crashtech

Lifer
Jan 4, 2013
10,695
2,294
146
To be fair, the current Freesync implementation is unusable due to the 40Hz lower limit (I don't even know what they were thinking releasing such useless FS models). It needs to go down to 30Hz before you can say it's on par with GSync.
Some might find the performance hit imposed by G-Sync more significant than the 10Hz delta in lower limit. I've not heard FreeSync characterized as unusable by anyone other than you; gamers who can afford such advanced tech will usually have the means to avoid such low minimums.
 

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,502
136
Some might find the performance hit imposed by G-Sync more significant than the 10Hz delta in lower limit. I've not heard FreeSync characterized as unusable by anyone other than you; gamers who can afford such advanced tech will usually have the means to avoid such low minimums.

There is no performance hit with Gsync; at least nothing significant enough that you'd ever notice. Both technologies have advantages and disadvantages, but a performance penalty is not one of them.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Some might find the performance hit imposed by G-Sync more significant than the 10Hz delta in lower limit. I've not heard FreeSync characterized as unusable by anyone other than you; gamers who can afford such advanced tech will usually have the means to avoid such low minimums.
Hehe, exactly. Instead of spending extra just to buy an FS or GS monitor, why not put the extra money towards a better GPU and have no FPS problems in the first place?

The only way for GS or FS to make sense is for them to be standard, not an extra cost, especially not $100+.

I thought this point was obvious :thumbsup:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
To be fair, the current Freesync implementation is unusable due to the 40Hz lower limit (I don't even know what they were thinking releasing such useless FS models). It needs to go down to 30Hz before you can say it's on par with GSync.

Buy a 144Hz monitor and run it at less than 40Hz? That's just the nVidia spin machine at work. Remember, we aren't just talking about playing @ less than 40fps. We are talking about having the refresh rate below 40Hz.

The ROG Swift flickers @ 40Hz according to Anandtech's review. It was also reported by either PCPer or TechReport; I honestly can't remember which. Bottom line, it's a fringe situation. It's no big deal, which is pretty much how it was reported. But since Freesync doesn't handle less than 40FPS/Hz as gracefully as Gsync does, it's turned into a major issue. I don't know how many graphs HWC had to create for their Freesync monitor review. I honestly got tired of reading them. The bottom line I got out of it is to turn off Vsync with Freesync on if you are going to game at such a low refresh rate, to stop the stuttering when Vsync starts chopping the FPS to 20fps to match up with the 40Hz refresh rate. You'll get screen tearing, but it's the lesser of the two evils if you are going to game there. Again, that's if you want to game on your 144Hz monitor refreshing @ 40Hz.

Honestly Silverforce11, would you game @ below a 40Hz refresh rate? The motion blur would be horrendous. It would defeat the entire reason people buy 120Hz-144Hz monitors in the first place. Typical mountain-out-of-a-molehill nVidia marketing spin.
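
To put rough numbers on the behaviour being described, here is a minimal sketch of the frame-pacing arithmetic (assuming a simple 40Hz-144Hz variable refresh window and plain double-buffered Vsync; actual driver behaviour is more involved):

```python
import math

# Rough illustration of what happens to a game's frame rate on a
# 40-144Hz variable refresh panel, under the assumptions above
# (not any vendor's actual algorithm).

VRR_MIN_HZ = 40.0
VRR_MAX_HZ = 144.0

def displayed_rate(fps, vsync_on):
    """Approximate effective on-screen update rate for a given render fps."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        return fps  # inside the window: the refresh simply follows the frame rate
    if fps > VRR_MAX_HZ:
        return VRR_MAX_HZ
    # Below the window the panel falls back to its 40Hz floor.
    if vsync_on:
        # Double-buffered Vsync holds each frame for a whole number of
        # refresh intervals, so the rate snaps down to 40/n: 20, 13.3, ...
        intervals = math.ceil(VRR_MIN_HZ / fps)
        return VRR_MIN_HZ / intervals
    # Vsync off: frames are shown as they arrive, at the cost of tearing.
    return fps

for fps in (90, 45, 38, 25):
    print(fps, displayed_rate(fps, True), displayed_rate(fps, False))
# 38 fps with Vsync on collapses to 20 updates/s, which is the stutter
# described above; with Vsync off you keep ~38 fps but get tearing instead.
```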

Hehe, exactly. Instead of spending extra just to buy an FS or GS monitor, why not put the extra money towards a better GPU and have no FPS problems in the first place?

The only way for GS or FS to make sense is for them to be standard, not an extra cost, especially not $100+.

I thought this point was obvious :thumbsup:

We need more models. Right now there isn't enough competition between the vendors for Freesync to just become a standard feature. It's cheaper than Gsync, but it still has a premium.
 
Last edited:
Silverforce11

Feb 19, 2009
10,457
10
76
You guys don't pay attention to recent benchmarks of modern games? It's very difficult to sustain >40 fps at 1440p; it would require SLI Titan X class performance or a major IQ downgrade.

The entire point of Adaptive Vsync is you want it to work when the FPS tanks. The lower the limit, the better it is. As it is, GSYNC will look smooth down to 30 fps but FS only to 40 fps. I would say 30 fps is a better target, as fps below that aren't going to be good regardless.

AMD & DP1.2a monitor makers need further improvements before they would entice me to upgrade my monitor; that's the honest truth.
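
As a back-of-the-envelope illustration of why the window's floor matters, here is a small sketch with purely hypothetical frame-rate samples (not benchmark data):

```python
# Hypothetical frame-rate samples for a demanding 1440p game that
# averages in the mid-40s; not real benchmark numbers.
samples = [52, 48, 44, 41, 38, 35, 33, 31, 36, 42, 47, 55]

def coverage(samples, floor_hz, ceiling_hz=144):
    """Fraction of frames that land inside the variable refresh window."""
    inside = [s for s in samples if floor_hz <= s <= ceiling_hz]
    return len(inside) / len(samples)

print(f"40Hz floor: {coverage(samples, 40):.0%} of frames adaptive-synced")
print(f"30Hz floor: {coverage(samples, 30):.0%} of frames adaptive-synced")
# With this trace a 30Hz floor covers every dip, while a 40Hz floor drops
# out exactly when the smoothing is needed most.
```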
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
I'm at 44 fps in Witcher 3 running with no HairWorks at 1440p, with some settings turned down to High that gave me the most fps increase, so yeah, it's on the limit with a 780 Ti. Just a good example :)

Edit: This was pre-1.04 patch; now I'm at 35 with dips to 28. The game is unplayable for me and I'm waiting for an nVidia driver; if that still doesn't help, I'll continue playing it when I upgrade my GPU.
 
Last edited:

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
You guys don't pay attention to recent benchmarks of modern games? It's very difficult to sustain >40 fps at 1440p; it would require SLI Titan X class performance or a major IQ downgrade.

The entire point of Adaptive Vsync is you want it to work when the FPS tanks.
Indeed.
The ability to perfectly display any framerate is a huge advantage (not just framerates that divide evenly into the display refresh).
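
For anyone wondering what "divides evenly into the display refresh" means in practice, here is a minimal sketch comparing a fixed 60Hz Vsync'd display with a variable refresh one (assuming classic double-buffered Vsync; real drivers and compositors differ in the details):

```python
import math

FIXED_REFRESH_HZ = 60.0

def vsync_displayed_fps(render_fps):
    """Steady-state rate seen on a fixed 60Hz panel with double-buffered Vsync.

    Each frame occupies a whole number of ~16.7ms refresh intervals, so only
    60, 30, 20, 15, ... fps are reachable; anything in between judders down.
    """
    intervals = math.ceil(FIXED_REFRESH_HZ / render_fps)
    return FIXED_REFRESH_HZ / intervals

def vrr_displayed_fps(render_fps, floor_hz=30.0, ceiling_hz=144.0):
    """With variable refresh, anything inside the window is shown as-is.

    Below the floor, behaviour differs by implementation (frame re-sends on
    G-Sync, Vsync or tearing on early FreeSync panels), so this sketch only
    covers the in-window case.
    """
    if floor_hz <= render_fps <= ceiling_hz:
        return render_fps
    return None  # outside the window; handled differently per vendor

for fps in (59, 45, 29, 19):
    print(fps, vsync_displayed_fps(fps), vrr_displayed_fps(fps))
# 59 fps renders as a steady 30 on fixed 60Hz Vsync (every frame just misses
# a refresh), but as 59 with VRR; that gap is the advantage described above.
```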
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Buy a 144Hz monitor and run it at less than 40Hz? That's just the nVidia spin machine at work. Remember, we aren't just talking about playing @ less than 40fps. We are talking about having the refresh rate below 40Hz.

The ROG Swift flickers @ 40Hz according to Anandtech's review. It was also reported by either PCPer or TechReport; I honestly can't remember which. Bottom line, it's a fringe situation. It's no big deal, which is pretty much how it was reported. But since Freesync doesn't handle less than 40FPS/Hz as gracefully as Gsync does, it's turned into a major issue. I don't know how many graphs HWC had to create for their Freesync monitor review. I honestly got tired of reading them. The bottom line I got out of it is to turn off Vsync with Freesync on if you are going to game at such a low refresh rate, to stop the stuttering when Vsync starts chopping the FPS to 20fps to match up with the 40Hz refresh rate. You'll get screen tearing, but it's the lesser of the two evils if you are going to game there. Again, that's if you want to game on your 144Hz monitor refreshing @ 40Hz.

Honestly Silverforce11, would you game @ below a 40Hz refresh rate? The motion blur would be horrendous. It would defeat the entire reason people buy 120Hz-144Hz monitors in the first place. Typical mountain-out-of-a-molehill nVidia marketing spin.



We need more models. Right now there isn't enough competition between the vendors for Freesync to just become a standard feature. It's cheaper than Gsync, but it still has a premium.

Listen to yourself man.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You guys don't pay attention to recent benchmarks of modern games? It's very difficult to sustain >40 fps at 1440p; it would require SLI Titan X class performance or a major IQ downgrade.

The entire point of Adaptive Vsync is you want it to work when the FPS tanks. The lower the limit, the better it is. As it is, GSYNC will look smooth down to 30 fps but FS only to 40 fps. I would say 30 fps is a better target, as fps below that aren't going to be good regardless.

AMD & DP1.2a monitor makers need further improvements before they would entice me to upgrade my monitor; that's the honest truth.

It's not just <40fps, it's a <40Hz refresh rate. That in and of itself will destroy IQ.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm at 44 fps in Witcher 3 running with no HairWorks at 1440p, with some settings turned down to High that gave me the most fps increase, so yeah, it's on the limit with a 780 Ti. Just a good example :)

Edit: This was pre-1.04 patch; now I'm at 35 with dips to 28. The game is unplayable for me and I'm waiting for an nVidia driver; if that still doesn't help, I'll continue playing it when I upgrade my GPU.

A lower refresh rate isn't going to fix that, unfortunately. Hopefully nVidia has heard all of the complaints and will fix that for you.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
This I agree with over-all:

Forbes said:
If you want a huge AAA game release to look great on your hardware, you take the initiative to ensure that it does. What you don’t do is expect your competitor to make it easier for you by opening up the technology they’ve invested millions of dollars into. You innovate using your own technologies. Or you increase your resources. Or you bolster your relationships and face time with developers.

In short, you just find a way to get it done.

Even though I am vocal about a more open GameWorks from a gamer's, idealistic point of view -- I just don't see nVidia opening GameWorks on GPU-related features. There is going to be a lot of talking and blaming in the near future with GameWorks titles.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Is causing enough bad press that NV makes gameworks play nicer not a valid way of dealing with any roadblocks it places? Part of the whole idea of the free market is that actions that hurt the consumer are punished. If there's no fear of backlash, then any mechanism by which the market prevents damaging actions is unraveled, and raw money will always find a way to win.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I suppose if nVidia lost share dramatically because of GameWorks, it might force them to rethink their strategy, but they have gained share so far.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Listen to yourself man.

What a convincing argument -- you really logically defeated each and every point he raised. Bravo

More cogently: the lower limits on both Gsync and Freesync are likely reflections of the limitations of LCD screens at low refresh rates. I must agree that a 30Hz minimum would be a lot more useful than 40Hz. I find 45 fps playable, if not optimal, for most types of games even without any sync tech. The range between 30 and 40 is unplayable without the tech, imo, and so with a sync tech it could become playable.

Though the economics on all of them don't really work out. Since (x)sync monitors cost a lot more than others, it's cheaper to get a faster graphics card and peg it at 60.
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
It's simple.

You can't have nVidia putting locked-down code into games that arbitrarily and maliciously hampers the performance of their competitor's cards. It's incredibly obtuse to then blame said competitor for being hampered by this, when, after all, that is nVidia's goal.

It ends badly for gamers. We as gamers and enthusiasts are now accustomed to this outcome, as this has been the norm with nVidia, not the exception.

Time to start looking at what nVidia does, and stop listening to what they say about their shameworks program.

Of course this is not up to nVidia; it's up to devs and gamers to put a stop to this nonsense from nVidia.

The Forbes article is rubbish.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
What? GameWorks is a feature for nVidia users and not for AMD customers.

It's up to AMD to provide their own customer base an equal experience. It is not the job of nVidia to do so.

Project Cars is the best example that AMD just doesn't care about PC gaming anymore. So instead of pushing graphics on the PC platform and helping developers, they expect developers to do all the work on their own.
 
Last edited:

Leadbox

Senior member
Oct 25, 2010
744
63
91
Is there a Crapworks title out there that isn't crap or broken even on the hardware crapworks is intended to be a feature for?

If you want to discuss the merits of a graphics technology, please do it in a civilized way.
-Moderator Subyman
 
Last edited by a moderator:

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,502
136
Project Cars is the best example that AMD just doesn't care about PC gaming anymore. So instead of pushing graphics on the PC platform and helping developers, they expect developers to do all the work on their own.

What an absurd statement. I guess Wolfenstein: The Old Blood is the best example that Nvidia just doesn't care about PC gaming anymore. How else do you explain a 280X beating a 970 and 780 Ti?

[Image: gamegpu.ru benchmark chart for Wolfenstein: The Old Blood at 1920 resolution (w_1920_u.jpg)]


I guess your example of Project Cars is also the best example that Nvidia doesn't care about their Kepler customers anymore, since Maxwell inexplicably dominated Kepler in that game.
 