[PcGameshardware] The Witcher 3 Benchmark

Page 16 - AnandTech community forums
I think they should simply add different quality modes for HairWorks in this game: make 16x "High" (perhaps combined with lower MSAA?) and 64x "Extreme" or something, and people would be happier.

It does look a bit silly for it to be using such high levels of tessellation for such a small benefit in visual quality compared to more reasonable levels, or nothing at all...

And it feels like the Batman cape or the invisible stuff in Crysis 2... something that, at a certain point, exists more to create a performance problem for competing products than to increase quality for anyone.
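For a rough sense of why the factor matters: under the simplifying assumption that a uniformly tessellated triangle patch at factor N produces about N^2 sub-triangles (real pipelines differ, and HairWorks actually tessellates isolines for hair strands, so this is an illustrative sketch of the growth rate, not the game's math):

```python
# Illustrative only: approximate sub-triangle count for a uniformly
# tessellated triangle patch, assuming ~N^2 triangles at factor N.
# The point sketched here is the quadratic growth in generated geometry.

def triangles_per_patch(factor: int) -> int:
    """Rough sub-triangle count for one patch at a given tessellation factor."""
    return factor * factor

for f in (8, 16, 64):
    print(f"{f:>2}x -> ~{triangles_per_patch(f)} triangles per patch")

# Going from 16x to 64x multiplies the generated geometry by 16
# for a visual difference most people cannot spot.
print(triangles_per_patch(64) // triangles_per_patch(16))  # prints 16
```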
 
It is! Just like that, unethical Nvidia and their hackjob closed-source code are mitigated. It really is too bad that Nvidia users can't also benefit by changing the tessellation factor in their control panel. See what this crap does, even to you all?

OTOH, if someone can come out with a hack in a few days, why can't the devs and Nvidia/AMD do the same with months to work on the game?
 
I'm not a fan of optimizations that go behind the developer's back and alter IQ, even very slightly; that's a slippery slope to me. However, if it gains performance and makes a feature usable, I can understand the trade-off, and the flexibility.
 
If AMD were a professional company, they would never just talk about GameWorks; instead of talking, they would have taken some initial steps to deal with it.

You need professionals to handle these matters, not fanboys; that is the difference between Nvidia and AMD.
 
What do you think drivers do? AMD and NV have been "optimizing" devs' poor format selections for decades (R11G11B10 instead of R16G16B16, for example). Even back in the 69XX days I could never spot the difference between 16x and greater in tessellation benchmarks (8x to 16x is very visible).
 
It was AMD that initially went behind the developers' and their customers' backs with that optimization, and I'm not a fan of that; it's a slippery slope, no matter how slight the IQ change is.
 
Why wouldn't they? Who better to offer a professional opinion? Also, if a professional company never complains, should they also shut up about IP theft, etc.? "Just deal with it."

The Twitch video shows exactly what their problem is: they mock the idea of doing more.

No matter how much one already does, one can always do more. Well... unless one can't. The point is, they already do tons. The things he mentioned aren't even their biggest contributions for consumers.
 
Forced obsolescence is the better way?

It seems like, without a discernible difference in visual quality, one would look at it differently: viewed as product longevity, it's free performance.
 
It wasn't AMD, and this is a quote from Nvidia's documentation:

DirectX 10 introduces two new HDR formats that offer the same dynamic range as FP16 but at half the storage. The first format, R11G11B10, is optimized to be used as a floating-point render target. It uses 11 bits for red and green, and 10 bits for blue.

ATi did in DX9, in the back end (driver), what was introduced to devs in DX10. But I don't want to get caught up in specifics. If a dev has chosen a poor format that hurts performance for no reason, and the driver team can easily optimise it, I as a consumer would want the vendor to do it. You have always been able to turn those kinds of optimisations off in CCC if you care enough.
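The "half the storage" claim in the quoted documentation checks out with simple bit arithmetic. A quick sketch (the 1080p resolution and the R16G16B16A16_FLOAT comparison baseline are my assumptions for illustration, not anything from the game):

```python
# Bits per pixel: R11G11B10_FLOAT vs. a conventional FP16 RGBA target.
r11g11b10_bits = 11 + 11 + 10   # packed float RGB, no alpha -> 32 bits
fp16_rgba_bits = 16 * 4         # R16G16B16A16_FLOAT -> 64 bits

# Render-target size at 1920x1080 (MiB), to see the storage impact.
w, h = 1920, 1080
def to_mib(bits_pp):
    return w * h * bits_pp / 8 / 2**20

print(f"R11G11B10: {to_mib(r11g11b10_bits):.2f} MiB per target")
print(f"FP16 RGBA: {to_mib(fp16_rgba_bits):.2f} MiB per target")
# Exactly half the storage, at the cost of precision and the alpha channel.
```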
 
Another thing: AMD could have been more proactive about getting TressFX in there and had years to try -- offering TressFX only after they saw the performance of HairWorks is silly, considering nVidia likes the 64x tessellation factor and their hardware has an advantage there. It's like AMD was surprised -- or are they going to use this as another example to attack nVidia's competitive advantages, developer relations and GameWorks?

If AMD had placed more focus on getting TressFX in there and shown how efficient it can be compared to GameWorks/HairWorks, and how much more open it is, that would have garnered some very positive attention and awareness, instead of whining about how unfair it all is.

AMD had an opportunity to showcase their talents and vision if they had been more proactive with TressFX; instead they were too late, and now complain and attack.
 
AMD was going behind the developers' backs, and worse, their customers' backs, until it was exposed; only then did AMD add an option in CCC.
 
Perhaps I've missed this, but since TW3 is also on the PS4 and Xbox One, both of which have GCN GPUs, why didn't CDPR resort to using TressFX?
 
A professional company acts according to the situation: it provides information rather than complaining, and it doesn't troll like AMD Matt or AMD Roy -- and they are reps of AMD, for god's sake.

Did you ever see Manuel G come out and scold or troll AMD over TressFX or Mantle, or claim AMD copied GeForce Experience, DSR, or ShadowPlay? The answer is no; that is the main thing.

Nvidia not only show they are the best, they act like it; AMD does neither and only complains.
 
They are providing information. If they are asked about these things in interviews, what should they say? Lie about the situation?

DSR is not something new; at a minimum it copied old tricks. ShadowPlay is online streaming; I'd say it copied Twitch and YouTube and every other such service. GeForce Experience... I don't know. I think Raptr does more, IMO. GE was useless to me.

What does Nvidia have to complain about? AMD's tech is open; if they mess up, it's on them. They do take cheap shots at AMD, though.
 
ShadowPlay is totally new because it is based on GPU recording.

Nvidia was in the same situation in 2013 that AMD is in now.

Hitman ran worse on Nvidia; even the AMD 7870 was on par with the GTX 680.

Sleeping Dogs, same story: the AMD 7870 beat the GTX 680 when it released.

Tomb Raider 2013 was even more panic for Nvidia, because GTX 680 performance was below par against the AMD 7870 and the Fixer ads. So did you see any trolling or complaining from Nvidia, or scolding of developers or AMD over this? No, because they accepted it and moved on to improve.
 
A lot of people said that if GameWorks was hurting AMD then they should speak up about it. Now that they have said something, you are calling it unprofessional, because they brought up the points everyone has been talking about and laid them out clearly.

I think people might be upset because all of the fighting over whether GameWorks does hurt performance has finally been settled. They even talk about how it is hurting Nvidia's own cards.

How can anyone freely defend this?

Also, how do you know that they are not pushing their open-source stuff? They could be, but because of how Nvidia works, maybe companies are taking the free hardware/support and/or money over AMD.

All I ever see from Nvidia is lies about what they are doing, which to me seems far more harmful and unprofessional.
 
That all depends on the reason for those performance problems. The double standard is: if AMD can't optimize because of a lack of source access, it's AMD's fault; if Nvidia fails to get a game working even without similar barriers, it's still AMD's fault.

ShadowPlay is a nice feature, but saying someone is copying someone in this industry is asking for trouble. Features spread.
 
Or one could look at the other side as a commitment to their customers not to go behind the developer's or the customer's back, no matter how subtle the change. I was strongly against questionable optimizations that altered IQ, even slightly, and voiced my displeasure mostly at nVidia -- too aggressive -- and they seem to have learned from past questionable mistakes. It's a slippery slope.
 
No, I am just saying that rather than any BS from AMD about developers being lazy or bad, they need to spend money, simple and straight. AMD wants everything for free, so it is never going to happen.
 
Not the same. nVidia was able to optimize TressFX so that it runs nearly the same on nVidia and AMD cards. Why? Because TressFX is an open library that any developer has access to.

HairWorks using a 64x tessellation factor shows nVidia purposely wants to make both AMD and older nVidia cards look bad next to Maxwell cards -- even though THERE IS NO VISUAL IMPROVEMENT OVER 16x. AMD users are lucky they can adjust that setting down so that it actually runs better on AMD hardware.
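The driver-side override being referred to (AMD's Catalyst/CCC tessellation control) conceptually just clamps the factor the application asks for. A toy sketch of that idea -- not actual driver code, and the function name is made up:

```python
# Toy model of a control-panel tessellation override: the game requests a
# factor, the user sets a cap, and the driver clamps the request.

def effective_factor(app_requested, user_cap=None):
    """Return the tessellation factor actually used, honoring a user cap."""
    if user_cap is None:          # no override configured
        return app_requested
    return min(app_requested, user_cap)

print(effective_factor(64, 16))   # HairWorks asks for 64x, user caps at 16x -> 16
print(effective_factor(8, 16))    # requests below the cap pass through -> 8
print(effective_factor(64))       # no cap: the app gets what it asked for -> 64
```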
 
If you watched that Twitch video and kept up with the development process of The Witcher 3, you would know that CDProjektRed were already showing off HairWorks on wolves a couple of years ago, which shows they had committed long since. GameWorks compensation and integration is now probably a contractual obligation.

Also from the Twitch video, and from how this game was developed: the final HairWorks code was delivered by Nvidia and integrated into the main game branch a couple of months before shipping. It doesn't sound like even CDProjektRed had source code access to it.

Before this code, AMD was running just fine, and probably Kepler too. After this code, and with the gold master press copy a month away, the performance problem reared its ugly head. AMD offered TressFX at this point, but CDProjektRed knew it was too late and said no.

From the downgrades, to the console-catering development, to the lying right up until release, CDProjektRed were in over their heads and are as much to blame as either AMD or NVidia.
 