
[Forbes] AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks

Page 7 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Status
Not open for further replies.
I think people are entitled to their opinions. I said those that think. Some people don't think a lot. Doesn't make them morons though. That was mean spirited of you to say so.

Then you are completely not self-aware of what you write. That crap was offensive.
 
Comparing optional graphical features that can be disabled to completely disabling hardware doesn't seem close to the same level, but what do I know...
There are a number of problems with this statement, but the most obvious is: who decides what's an optional graphic feature?

Is HDR optional? How about large textures? High resolutions? Shadows? Polygons? Water effects? Filtering?

Who decides: nVidia, the customer, or someone else?
If nVidia does, do they also make the decision on behalf of AMD's customers?

What if AMD disables SSE2 and CPU caches when nVidia driver code is preparing shadow volumes? "That's ok" says AMD, "just disable shadows because they're optional!"

How about throttling PCIe back to 1x when the AMD chipset detects a large texture being streamed in? "That's ok" says AMD, "just disable large textures because they're optional!"

Then after all this, AMD can say, "it's nVidia's problem they didn't provide engineering resources" and "it's not our job to share IP to the benefit of a competitor".

Do you feel those promoting the current PhysX and GameWorks lockout would champion AMD's actions just as strongly?
 
I'm typing from my phone, but I'll try to answer all your questions...

Who decides what features are optional? Well that's an easy one, the game developer does! They set all the menu options, and sometimes settings can be changed via ini files and the like.
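The point above, that the developer wires up the menu and ini options, can be sketched with a hypothetical settings file. The section and key names below are invented for illustration; they are not taken from any real game's configuration.

```python
# Hypothetical example of the kind of user-editable settings file the post
# describes: the developer decides which effects are exposed as options,
# and the player can toggle them in a menu or by editing the file directly.
from configparser import ConfigParser

ini_text = """
[Rendering]
HairWorks = off
HairWorksAALevel = 4
TessellationCap = 16
ShadowQuality = high
"""

cfg = ConfigParser()
cfg.read_string(ini_text)

# The game (or the user, by editing the file) toggles the optional feature:
hairworks_enabled = cfg.getboolean("Rendering", "HairWorks")
print(hairworks_enabled)  # False
```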

That also answers the next set of questions, oddly enough you didn't list the devs as a choice...

AMD can disable anything they want when their chipsets are used, but good luck selling many CPUs when this happens... They are in no position to do such a thing, so arguing that point is, well, pointless.

As for your last question, I don't speak on other's behalf, nor do I have any way to know how they would react. It's a silly question, to be honest.
 
Bottom Line:

I'd rather have TressFX in The Witcher 3 than nVidia's Hairworks

TressFX:
Runs Faster than nVidia Hairworks
Looks better than nVidia Hairworks
TressFX also only simulates the hair of one woman, not a huge range of creatures. Hair on a wolf is different from hair on a woman. Sure, having it done for Lara is a start, but who's going to develop it so it works for wolves: the devs? Open-source fairies? It's AMD's job to develop it, and they haven't.

Not sure if this has been mentioned, but it seems AMD users can get decent HairWorks performance with a little tweaking:

http://www.guru3d.com/news_story/the_witcher_3_hairworks_on_amd_gpus_with_normal_performance.html

So on one hand we have the fanboys raging that AMD can do nothing about HairWorks: it's all Nvidia's fault, and AMD is blameless for doing nothing and complaining about the unfairness of it all.

On the other, someone bothers to spend 10 minutes looking into this instead of whining and works out that if you turn tessellation down to 16x and AA down to 4x, it runs fine with barely any loss of quality. Why couldn't AMD have done this before release?
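The arithmetic behind that tweak can be sketched in a few lines. This is a back-of-envelope model, not HairWorks' actual pipeline: it assumes hair is tessellated as isolines, so the number of generated segments per guide strand grows roughly linearly with the tessellation factor, and the strand count below is a made-up number.

```python
# Rough sketch of why capping the tessellation factor helps performance.
# Assumption (hypothetical numbers): hair rendered via isoline tessellation,
# where segments emitted per guide strand scale ~linearly with the factor.

def segments_generated(num_strands: int, tess_factor: int) -> int:
    """Approximate line segments the tessellator emits across all strands."""
    return num_strands * tess_factor

strands = 10_000  # hypothetical guide-strand count

full = segments_generated(strands, 64)    # the game's 64x default
capped = segments_generated(strands, 16)  # user-side 16x override

# The 16x cap emits a quarter of the geometry of the 64x default,
# which is why the visual difference is small but the speedup is large.
print(full, capped, full / capped)
```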
 
TressFX also only simulates the hair of one woman, not a huge range of creatures. Hair on a wolf is different from hair on a woman. Sure, having it done for Lara is a start, but who's going to develop it so it works for wolves: the devs? Open-source fairies? It's AMD's job to develop it, and they haven't.

[Image: AMD-TressFX-3.01.jpg]
 
The icing on the cake is Nvidia is sabotaging their own hardware!

Why would they do that?

Because it hurts the other guy worse, and it makes their new stuff look better to people with the old stuff.

Ever heard of planned obsolescence?

Of course. So please correct me if I am wrong: you guys are suggesting they are intentionally giving their products a life span of one year and that they are doing this, presumably, to increase sales?


Nvidia are working on a driver update for Kepler.

Do any of you want to explain how this fits into your theory?
 
Nvidia are working on a driver update for Kepler.

Do any of you want to explain how this fits into your theory?

I think I can.

They're worried that their own fan base [Kepler owners] is very angry about being sacrificed in order to degrade the opposition.


ManuelG @ Nvidia Customer Care:
We discovered a couple of issues in regards to Kepler GPUs and are working on driver updates.

REALLY? With all the pre-release testing done, you only now discovered some issues? People actually believe these excuses?
 
I think I can.

They're worried that their own fan base [Kepler owners] is very angry about being sacrificed in order to degrade the opposition.


ManuelG @ Nvidia Customer Care:
We discovered a couple of issues in regards to Kepler GPUs and are working on driver updates.

REALLY? With all the pre-release testing done, you only now discovered some issues? People actually believe these excuses?

I could have sworn that since the dawn of time Nvidia had wonderful, amazing, fantastic drivers. That Nvidia's drivers were always top notch. I'm pretty sure just a couple pages back people were telling me how inept AMD's driver team was, and how the mighty Nvidia driver team and software engineers, with their amazing work on the GameWorks libraries, were the tops in the industry. And yet here we are with those same driver teams and engineers going "Oops, I completely forgot about that whole massive Kepler line, the line that probably constitutes the majority of our 'gaming' install base. Yea, we forgot about it. Yea, we forgot to do some testing to see if there were any issues for our Kepler line." Yea.....

So, is Nvidia's driver team THAT terrible? This isn't something new, it has been going on since last year. I don't want to see anything from Nvidia except a "fixed" driver. What I do want to see though is the response from the Nvidia camp here who have been telling me for years how magnificent Nvidia is and how they can do no wrong.
 
I could have sworn that since the dawn of time Nvidia had wonderful, amazing, fantastic drivers. That Nvidia's drivers were always top notch. I'm pretty sure just a couple pages back people were telling me how inept AMD's driver team was, and how the mighty Nvidia driver team and software engineers, with their amazing work on the GameWorks libraries, were the tops in the industry. And yet here we are with those same driver teams and engineers going "Oops, I completely forgot about that whole massive Kepler line, the line that probably constitutes the majority of our 'gaming' install base. Yea, we forgot about it. Yea, we forgot to do some testing to see if there were any issues for our Kepler line." Yea.....

So, is Nvidia's driver team THAT terrible? This isn't something new, it has been going on since last year. I don't want to see anything from Nvidia except a "fixed" driver. What I do want to see though is the response from the Nvidia camp here who have been telling me for years how magnificent Nvidia is and how they can do no wrong.

Nvidia is between a rock and a hard place with this situation.
This is not a false dichotomy; it comes down to one of these two choices.

1) The driver team missed Kepler tanking with Hairworks enabled. [Note: this is one of the hyped premium features]
2) It was a deliberate attempt to make Maxwell look better than every other line, previous Nvidia included.

Remember everyone, corporations have ONE intention: to move what's in your account into their own. In the process they will offer either products or services, and you have to decide if the exchange is worth it. Don't be a sheep.
 
One can be a victim once or twice. When it becomes a pattern, then you want to be the victim. AMD constantly not being on top of major game releases tells me they want to be the victim. Blindsided by a major project 2 months before release? An online user was able to accomplish in 24 hours what AMD couldn't in months or years? Poor AMD, always a victim.
 
Nvidia is between a rock and a hard place with this situation.
This is not a false dichotomy; it comes down to one of these two choices.

1) The driver team missed Kepler tanking with Hairworks enabled. [Note: this is one of the hyped premium features]
2) It was a deliberate attempt to make Maxwell look better than every other line, previous Nvidia included.

Remember everyone, corporations have ONE intention: to move what's in your account into their own. In the process they will offer either products or services, and you have to decide if the exchange is worth it. Don't be a sheep.

Kepler performance is bad even with HairWorks off.

It's just complete disregard for Kepler. Because The Witcher 3 is a very popular (and great) game, they're now being spammed on their forums by pissed-off Kepler owners.

If they don't fix it reasonably quickly, maybe AMD can profit by selling some of those HBM GPUs to current 780 Ti owners. They should still get rid of their marketing goons, Huddy and whatshisname, though.
 
I could have sworn that since the dawn of time Nvidia had wonderful, amazing, fantastic drivers. That Nvidia's drivers were always top notch. I'm pretty sure just a couple pages back people were telling me how inept AMD's driver team was, and how the mighty Nvidia driver team and software engineers, with their amazing work on the GameWorks libraries, were the tops in the industry. And yet here we are with those same driver teams and engineers going "Oops, I completely forgot about that whole massive Kepler line, the line that probably constitutes the majority of our 'gaming' install base. Yea, we forgot about it. Yea, we forgot to do some testing to see if there were any issues for our Kepler line." Yea.....

So, is Nvidia's driver team THAT terrible? This isn't something new, it has been going on since last year. I don't want to see anything from Nvidia except a "fixed" driver. What I do want to see though is the response from the Nvidia camp here who have been telling me for years how magnificent Nvidia is and how they can do no wrong.

Good luck 🙂

As a 970 owner I'd like more control when it comes to GameWorks features. A slider for tessellation factors would be nice. If AMD can do it, why wouldn't Nvidia's driver gods be able to?

On another note: looking at the images, 64x does look the best to me, but it's not worth the performance impact over 16x.

Guessing allowing the option to control the feature set of my card would be counter productive to sales goals.
 
Of course. So please correct me if I am wrong: you guys are suggesting they are intentionally giving their products a life span of one year and that they are doing this, presumably, to increase sales?

I don't know if NV is doing that, but I do know that my 7970 is more competitive vs "comparable" NV cards today than it was 18 mos ago when I bought it. Would that keep me from buying NV in the future? Ofc not, but neither will "hairworks", "physx", or some other bs convince me to buy NV. Rather, I will act rationally in my own self-interest and buy the card that gives me the best bang/buck (with noise and power draw factored in). Maybe that's why I've run about 60/40 NV over the years, they generally do seem to have better hardware and don't need to resort to dirty tactics to maintain/improve their market share.
 
Nvidia is between a rock and a hard place with this situation.
This is not a false dichotomy; it comes down to one of these two choices.

1) The driver team missed Kepler tanking with Hairworks enabled. [Note: this is one of the hyped premium features]
2) It was a deliberate attempt to make Maxwell look better than every other line, previous Nvidia included.

Remember everyone, corporations have ONE intention: to move what's in your account into their own. In the process they will offer either products or services, and you have to decide if the exchange is worth it. Don't be a sheep.

Agreed. Nvidia is in the business of making money. There is nothing to fault in that statement. My statement was aimed squarely at the people who for whatever reason put Nvidia on a pedestal.

But speaking of Nvidia as a business: if it is true that they wanted to force everyone to upgrade to Maxwell via this possible driver issue, then it is a case of short-term gain over long-term profit. As a 780 Ti owner this issue does not sit well with me as I look towards my next purchase. This is only compounded by how AMD's GCN is looking as it ages.

Let's say that a 280X has sat still as far as performance goes, and let's say that a 680 has tanked due to Nvidia drivers. What the end user sees isn't just the 680's performance dropping as time goes by; they also see the 280X increasing. It doesn't matter that it technically hasn't. Granted, that is all hypothetical.

It just doesn't look good and in the long term will most likely have an effect on market share.

Gameworks
Kepler performance neglect
GCN performance aging gracefully
970 3.5 vram issue
Nvidia driver issues

These things will have a tangible effect on market share. It just won't be seen immediately.
 
One can be a victim once or twice. When it becomes a pattern, then you want to be the victim. AMD constantly not being on top of major game releases tells me they want to be the victim. Blindsided by a major project 2 months before release? An online user was able to accomplish in 24 hours what AMD couldn't in months or years? Poor AMD, always a victim.

When AMD does bring things to light, they are then confronted with this.

A professional company acts according to the situation instead of complaining; it provides information rather than trolling like AMD Matt or AMD Roy do, and they are reps of AMD, for god's sake.

Have you ever seen Manuel G come out and scold or troll AMD over TressFX, Mantle, or AMD copying GeForce Experience, DSR, or ShadowPlay? No, and that is the main thing.

Nvidia not only say they are the best, they act like it; AMD does neither and only complains.

It is a lose lose for AMD, because more people would rather defend a lying company.
 
For AMD, the devs just have to go to AMD's tech papers on the uarch and its functions, and use the debuggers, SDKs, or other complementary troubleshooting software. This is no problem, since AMD, although sometimes late, is totally open about exposing their uarch to the devs and makes all of those things available. OK, now Intel and Nvidia: here is where the devs are boned, as neither will supply tech papers, debuggers, or SDKs regarding their uarchs.

Hmm, let's compare Nvidia's and AMD's developer tools. It looks to me like Nvidia's developer tools are more comprehensive than AMD's.
 
So it looks like there are two GPUs below $1000 in NVIDIA's lineup that can provide playable performance at 1080p ultra with HairWorks. Impressive 😉

http://www.twitch.tv/amd/v/5334646

Makes some good points around 17:00.

64x tessellation factor on by default, 8x MSAA on by default when HairWorks is enabled: cripple performance on Kepler/AMD GPUs and create an artificial performance gap between Maxwell and everything else. Sounds great.
Good summary, but I'm not sure why anyone is surprised. Nvidia has been doing these gimmicks for years to try to sell hardware: "fluff" PhysX effects in the Batman and Borderlands games, crippling PhysX on CPUs, tessellating unseen water under the maps in Crysis 2; the list goes on. You can clearly see how little they think of the average consumer.
 
Blindsided by a major project 2 months before release?

If the developer decided not to update AMD about issues, that is, again, a developer decision.

IMHO, I still don't understand the conflict over AMD and NV (and Intel, if that ever happens) features used by developers. A developer chooses to use a "feature/SDK/tool/engine/whatever" for free and/or gets paid to use it.

A developer is responsible for such actions. The developer decides. The developer picks. The developer chooses. The developer is the one responsible.




And I still can't believe how "The Glorious PC Gaming Master Race" is totally for and promoting closed systems and companies. IMHO, the popularity of AMD should have been expanding YoY.
 
Forbes is garbage now. They recently released an article that said NASA data shows that polar ice is not melting at the rate predicted.

The article goes on and on about NASA and posts 7 links. Only one link has actual data; the other 6 are only about how polar ice melting is measured and reported. The one link with data is not NASA-related at all. The author, who is not a scientist, draws a conclusion that he puts forth as fact.

In the end he only looked at sea ice, not at Arctic and Antarctic land ice. He did mention that there is more ice in the Antarctic, but did not mention the correlation between melting Arctic ice and increasing Antarctic ice, and how this is an expected result.

Forbes is a bunch of shills. Every article I read feels like an article designed to make some CEO feel smug.
 
And I still can't believe how "The Glorious PC Gaming Master Race" is totally for and promoting closed systems and companies. IMHO, the popularity of AMD should have been expanding YoY.

Because fanboys. They think it's a good thing and think AMD is just cheap and not investing. It's the same reason people buy expensive things with no real benefit, or that are worse than cheaper products. Higher cost warps minds. Apple's wealth relies a lot on this kind of money-influenced perception.
 
So, I actually looked at the guy's other articles, and the one he posted right before this is:
"To Combat Nvidia's GameWorks, AMD Must Focus On Developing Own Proprietary Graphics Libraries"
http://www.forbes.com/sites/jasonev...eveloping-own-proprietary-graphics-libraries/

Really? So the solution to this problem is to make more proprietary libraries, so that eventually we'll need both an AMD and an nVidia card in our systems to run games at the desired framerates/settings?

Gawd damn.
 
If the developer decided not to update AMD about issues, that is, again, a developer decision.

IMHO, I still don't understand the conflict over AMD and NV (and Intel, if that ever happens) features used by developers. A developer chooses to use a "feature/SDK/tool/engine/whatever" for free and/or gets paid to use it.

A developer is responsible for such actions. The developer decides. The developer picks. The developer chooses. The developer is the one responsible.

Devs are grunts too in a large company. I wouldn't put the blame squarely on the devs. It's the publishers and stockholders, whose main interest is money. If your employer says to implement GameWorks, well, you've no say if you want to get paid.
 