[Forbes] AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks


AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Listen to yourself man.
I listened to his post and it is bang on. What don't you agree with about it?
I suppose if nVidia lost share dramatically because of GameWorks it may force them to rethink their strategy, but they have gained share so far.
I'm personally not obsessed with market share numbers, nor do I even care much about them, but if Nvidia did indeed gain and they see GW as one of the reasons, then we will see GW pushed even harder. This will be very bad for PC gaming, IMO.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It's simple.

You can't have nVidia putting locked-down code into games that arbitrarily and maliciously hampers the performance of their competitor's cards. It's incredibly obtuse to then blame said competitor for being hampered by this when, after all, that is nVidia's goal.

It ends badly for gamers. We as gamers and enthusiasts are now accustomed to this outcome as this has been the norm with nVidia, not the exception.

Time to start looking at what nVidia does, and stop listening to what they say about their shameworks program.

Of course this is not up to nVidia; it's up to devs and gamers to put a stop to this nonsense from nVidia.

The Forbes article is rubbish.

Maybe nVidia should simply lock their features to their platform. AMD gamers can enjoy HBAO+, PhysX Destruction and Cloth, and even a tweaked HairWorks.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
I saw a video where they showed both consoles and the PC version of Witcher 3. I must conclude that you gotta be nuts to spend all that money on a PC when both consoles look just as good. You gotta seriously nitpick at the scenes to find any differences, and certainly none of those differences justify spending an extra $400 minimum above and beyond the cost of a console. The game runs fine on both consoles. The fps might be low, but the gameplay is smooth.
 

SPBHM

Diamond Member
Sep 12, 2012
5,076
440
126
I saw a video where they showed both consoles and the PC version of Witcher 3. I must conclude that you gotta be nuts to spend all that money on a PC when both consoles look just as good. You gotta seriously nitpick at the scenes to find any differences, and certainly none of those differences justify spending an extra $400 minimum above and beyond the cost of a console. The game runs fine on both consoles. The fps might be low, but the gameplay is smooth.

You can use a lower-end PC with a cost closer to the consoles and still have a full PC with all the other advantages.

https://www.youtube.com/watch?v=gGf4SVWEw2g

Also, I've seen PS4 owners reporting more significant performance problems in some other parts of the game, affecting gameplay...

Still, even with all the downgrades and performance issues, I'm amazed by how good this game actually is. I haven't enjoyed a game like this in a very long time, and it's such a huge improvement over Witcher 2 :thumbsup:
Not just in terms of visuals, but everything else too.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
It's not a snide remark. What Nvidia platform are we talking about here? Last time I checked, my computers are not Nvidia or AMD or Intel platforms; they're PC platforms. Now if Nvidia wants to sell a closed-off, Nvidia-branded gaming box, then go for it.

HairWorks runs on a PC using an Intel x86/AMD64 processor with industry-standard interconnects. I don't even need an Nvidia card to run HW, so the "Nvidia platform" comment is confusing.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
They're not the same -- for example: AMD had Mantle for their Radeon platform or ecosystem. Companies have to try to differentiate and innovate and offer reasons to buy their products. I can easily see someone offering an AMD gaming platform or an nVidia gaming platform.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
They're not the same -- for example: AMD had Mantle for their Radeon platform or ecosystem.
Mantle is now DX12 and at least two other APIs, which support any GPU with the appropriate driver/hardware capability, so I'm not sure what your point is here.
I can easily see someone offering AMD gaming or platform or nVidia gaming or platform.
You mean a piece of hardware (I'm assuming a PC) that only runs for example Nvidia games?
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
I saw a video where they showed both consoles and the PC version of Witcher 3. I must conclude that you gotta be nuts to spend all that money on a PC when both consoles look just as good. You gotta seriously nitpick at the scenes to find any differences, and certainly none of those differences justify spending an extra $400 minimum above and beyond the cost of a console. The game runs fine on both consoles. The fps might be low, but the gameplay is smooth.

If you're just going for parity it's maybe $200 more, but it's always tempting to spend a little more for that much more powerful hardware.

Anyway, controllers are terrible for shooters, and strategy doesn't even exist on console, so it isn't even an option for me. And if I upgrade my current system for the price of a PS4, it'll be a powerhouse.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
The developer basically said it was too late to add TressFX.

That's what they said, yes. But they have been patching the game since release... that's the funny part of the excuse. As if they were making a non-patchable game!

It seems that part of AMD's plan could be to twist Nvidia's advantages into something bad, evil, or awful.

It's not "bad", "evil" or "awful". It is closed and locked source and AMD can't touch it.

As an example, Nvidia comes out with G-Sync and AMD's response was first to attack and downplay, then manipulate.

Read this -> "Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."

NV didn't come out with anything, it took an industry standard, branded it and made it proprietary. Feel manipulated yet? lol

Like, AMD can't work with developers on GameWorks titles because Nvidia prevents them in their contracts.

I expect something like this would not be in a contract, at least for the non-GW stuff, but people do usually "motivate" others.

Nvidia does prevent developers from sharing GameWorks code. This is stretched to become... the entire game.

All the bling is compute-intensive; most of the game will already be optimized on the CPU side of things. I am seriously surprised AMD can improve the performance by that much, and everyone should be! It does make me wonder how they do it.

It needs to go down to 30Hz before you can say it's on par with G-Sync.

Now this was hilarious. Monitors that can go past 60Hz are "forward-looking technologies" and expensive; PC low-level APIs are incredibly good at keeping minimum framerates high, as Mantle showed; and we are near the next-gen graphics memory era, and VR.

Games running below 60Hz will not be the norm "soon".
 

Stormflux

Member
Jul 21, 2010
140
26
91
People are still blindly defending nVidia here. No, no one should expect nVidia to hand their code over to AMD for optimization. But...

It's been a week since this fiasco. For HairWorks, the simplest solution to the entire thing is already handled for AMD users by a driver-side fix: controlling the tessellation factor.

Why can't CD Projekt RED provide this as an option IN THEIR OWN GAME? Why did no one think that 64x from any camp was overkill? Sounds to me like even CD Projekt RED doesn't have access to the source code of HairWorks here. Four patches in, and the simplest solution is still absent. This solution would help performance across the board, for Kepler and Maxwell users too.
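For a rough sense of why the cap matters, here is a back-of-the-envelope sketch (purely illustrative, with a made-up strand count; HairWorks' actual amplification scheme is not public): if the factor simply controlled how many segments each guide strand is split into, dropping from 64x to 16x would cut the tessellator's output roughly 4x.

[CODE]
// Back-of-the-envelope only: this assumes the tessellation factor maps directly
// to segments emitted per guide strand, which is a simplification; HairWorks'
// real amplification scheme is not public. The strand count is made up.
#include <cstdio>

int main() {
    const long long guideStrands = 20000;   // hypothetical guide-hair count
    const int factors[] = {64, 32, 16, 8};

    for (int factor : factors) {
        long long segments = guideStrands * factor;          // segments emitted
        long long vertices = guideStrands * (factor + 1LL);  // line-strip vertices
        std::printf("%2dx -> ~%lld segments, ~%lld vertices\n",
                    factor, segments, vertices);
    }
    // 64x -> 16x is roughly a 4x cut in generated geometry, which is why the
    // driver-side cap recovers so much performance with little visible change.
    return 0;
}
[/CODE]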

Kepler performance still goes unanswered.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Why can't CD Projekt RED provide this as an option IN THEIR OWN GAME? Why did no one think that 64x from any camp was overkill? Sounds to me like even CD Projekt RED doesn't have access to the source code of HairWorks here. Four patches in, and the simplest solution is still absent. This solution would help performance across the board, for Kepler and Maxwell users too.

On AMD cards the fix was as simple as reducing tessellation in the driver. If that setting was already there (and it has been for years), then CD Projekt RED could have easily done this either in the game engine (telling the driver to use 16x) or by consulting with AMD to fix the problem.
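A minimal sketch of what "tell the driver to use 16x" could look like from the engine side (purely hypothetical: HairWorks is closed source, so the names and the integration point here are invented for illustration). The idea is just that the engine clamps the requested factor to a user quality setting before it ever reaches the tessellation stage.

[CODE]
// Hypothetical engine-side clamp. HairQuality and HairTessConstants are
// illustrative names, not actual HairWorks or REDengine API.
#include <algorithm>
#include <cstdio>

enum class HairQuality : int { Low = 8, Medium = 16, High = 32, Ultra = 64 };

struct HairTessConstants {
    float tessFactor;   // value a hull shader would read from a constant buffer
};

HairTessConstants BuildHairConstants(float requestedFactor, HairQuality userCap) {
    HairTessConstants c{};
    // Never exceed the user's cap, never go below 1.
    c.tessFactor = std::clamp(requestedFactor, 1.0f,
                              static_cast<float>(static_cast<int>(userCap)));
    return c;
}

int main() {
    HairTessConstants c = BuildHairConstants(64.0f, HairQuality::Medium);
    std::printf("effective tessellation factor: %.0f\n", c.tessFactor);  // 16
    return 0;
}
[/CODE]

Whether the real integration even exposes such a value is exactly the open question; the point is only that a clamp like this is a few lines, not a rewrite.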
 

Stormflux

Member
Jul 21, 2010
140
26
91
On AMD cards the fix was as simple as reducing tessellation in the driver. If that setting was already there (and it has been for years), then CD Projekt RED could have easily done this either in the game engine (telling the driver to use 16x) or by consulting with AMD to fix the problem.

Yeah great for AMD users. You're missing the point. As a Kepler owner, why isn't this even an option within the game? On or Off are the only options.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I saw a video where they showed both consoles and the PC version of Witcher 3. I must conclude that you gotta be nuts to spend all that money on a PC when both consoles look just as good. You gotta seriously nitpick at the scenes to find any differences, and certainly none of those differences justify spending an extra $400 minimum above and beyond the cost of a console. The game runs fine on both consoles. The fps might be low, but the gameplay is smooth.


Depends on what your tastes are. No console can run a game above 1080p, so I don't really care, and I don't even have a display that goes above 1080p. Consoles are closer to PCs now than ever, IMO, but trying to compare a PC to a console is crazy when PCs do so many things. At most I compare a console to the cost of my graphics card, since I would have a PC anyway. Considering I have three Xbox 360s, I have no issue upgrading my card multiple times during a console life cycle. Most people comparing consoles and PCs never really do so effectively.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Is causing enough bad press that NV makes GameWorks play nicer not a valid way of dealing with any roadblocks it places? Part of the whole idea of the free market is that actions that hurt the consumer are punished. If there's no fear of backlash, then any mechanism by which the market prevents damaging actions is unraveled, and raw money will always find a way to win.
Right on. There is near-zero backlash, absolutely minimal. The backlash is so small you would need a microscope to see it. Some posters here even defend the practice :thumbsdown: Unless they have some skin in the game, I honestly have no idea why anyone would defend it.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Yeah great for AMD users. You're missing the point. As a Kepler owner, why isn't this even an option within the game? On or Off are the only options.

I don't think tessellation is the problem with Kepler. All too often we are seeing Nvidia produce "Game Ready" drivers optimised for Maxwell so it looks good in the benchmark charts; then a month or two down the line they bring Kepler's performance back up to something playable, though I expect not quite to its full potential.

Kepler users are now 2nd tier customers.
 

Stormflux

Member
Jul 21, 2010
140
26
91
I don't think tessellation is the problem with Kepler. All too often we are seeing Nvidia produce "Game Ready" drivers optimised for Maxwell so it looks good in the benchmark charts; then a month or two down the line they bring Kepler's performance back up to something playable, though I expect not quite to its full potential.

Kepler users are now 2nd tier customers.

This would be a boon to Maxwell users too. Instead of losing an average of 10fps because of an On/Off toggle, they could lose, say, 5fps by running at 16x tessellation.

As a PC gamer, I'm used to tweaking variable settings, not binary switches.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
As a Kepler owner, why isn't this even an option within the game?

Because fanboys (which includes the majority of review sites) don't want to acknowledge that NV makes their newest hardware look the best by making their older hardware look worse.


Like someone pointed out many posts ago, the value of an AMD GPU increases with time.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
I listened to his post and it is bang on. What don't you agree with about it?

I'm personally not obsessed with market share numbers, nor do I even care much about them, but if Nvidia did indeed gain and they see GW as one of the reasons, then we will see GW pushed even harder. This will be very bad for PC gaming, IMO.
Bang on to what? Complete ridiculousness? We all know, and have known forever now, that 30fps is the point where playable meets the brink of unplayable. Now FreeSync doesn't work under 40fps and suddenly anything below 40 is unplayable. Ya know, it's not really believable, but you ate it up and backed him up. However, I am grateful for the transparency.
 
Last edited:

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Read this -> "Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."

NV didn't come out with anything, it took an industry standard, branded it and made it proprietary. Feel manipulated yet? lol
What was manipulative about that? They made the hardware that makes it work: 2560x1440, 144Hz, with variable refresh and response time compensation. The RTC on the FreeSync screens still isn't fixed.

Meanwhile, Huddy said G-Sync would have a full frame of latency. An easily verifiable lie. It makes me second-guess everything AMD says. I really think AMD is too aggressive in trying to spin everything into some kind of evil plot by Nvidia.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Yeah great for AMD users. You're missing the point. As a Kepler owner, why isn't this even an option within the game? On or Off are the only options.

My guess is NVIDIA can't figure out how to implement the option in the control panel. Probably too embarrassed to ask AMD for help.

Most likely the developer is locked in once it's implemented. The option to alter it is a no-go?

They're probably all sporting Titan Xs now anyway... if not in SLI.

Joking aside (maybe), as a 970 owner I'd also like the option.

Seriously... it looks like if you choose an NVIDIA product, the only way to be somewhat future-proof is to buy their future cards. I doubt they want to keep past generations competitive, as it would hurt their bottom line.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Bang on to what? Complete ridiculousness? We all know, and have known forever now, that 30fps is the point where playable meets the brink of unplayable. Now FreeSync doesn't work under 40fps and suddenly anything below 40 is unplayable. Ya know, it's not really believable, but you ate it up and backed him up. However, I am grateful for the transparency.

To be fair, 30 FPS may be playable if maintained 100% of the time, but 30 Hz? Not so much.

That is the real issue: playing under 40Hz is definitely not recommended.

G-Sync has a trick where, once you drop below 35FPS/Hz or so, it begins to drive the display at double the frame rate, so if you are at 33 FPS, those frames are being doubled and the monitor is running at 66Hz.
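Roughly, the doubling amounts to something like this (a sketch of the idea only; the actual G-Sync module logic and thresholds are not public, and the panel window below is made up): repeat each frame enough times that the effective refresh rate stays inside the panel's variable-refresh range.

[CODE]
// Sketch of low-framerate compensation: repeat each frame so the panel's
// refresh rate stays inside its variable-refresh window. The window values
// are illustrative, not a real monitor's specs.
#include <cstdio>

int RefreshMultiplier(double fps, double panelMinHz, double panelMaxHz) {
    int n = 1;
    while (fps * n < panelMinHz && fps * (n + 1) <= panelMaxHz) {
        ++n;  // show each rendered frame one more time
    }
    return n;
}

int main() {
    const double minHz = 36.0, maxHz = 144.0;
    const double samples[] = {60.0, 45.0, 33.0, 20.0};
    for (double fps : samples) {
        int n = RefreshMultiplier(fps, minHz, maxHz);
        std::printf("%.0f FPS -> each frame shown %dx, panel refreshes at %.0f Hz\n",
                    fps, n, fps * n);
    }
    return 0;
}
[/CODE]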

I don't think I have seen anyone, ever, say 30Hz is playable.

And the whole "30FPS is playable" thing is only true if you are capping at 30FPS while the system is capable of producing 40+ FPS consistently. Once you drop below 30FPS it is noticeable, and to some, even 30FPS on a 60Hz display is not entirely smooth.

Tricks like G-Sync and FreeSync are not solutions to framerates in the 30s; anyone who has been sold on that premise has been fooled. They are ideal for systems that may not fully maintain 60FPS but generally stay close more often than not, say no lower than 50FPS or perhaps 40. As for suddenly having doubled frames when you are close to 30, I'm not sure whether that has an impact on input lag (which is really what all these adaptive refresh methods are about: maintaining minimal input lag while staying smooth, unlike the v-sync approach to smoothness). I'm not at all arguing that FreeSync is superior because it doesn't resort to automatic frame-doubling to refresh the panel, but I feel the superior solution is somewhere in between the two currently available.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Because fanboys (which includes the majority of review sites) don't want to acknowledge that NV makes their newest hardware look the best by making their older hardware look worse.


Like someone pointed out many posts ago, the value of an AMD GPU increases with time.

That is a nice conspiracy.

Now where does AMD hardware get better at tessellation over time? The tessellation slider in AMD's drivers was forced by AMD's lack of tessellation hardware in their products years ago. After proclaiming to the world that tessellation was a great thing, AMD, as usual, half-assed their way into it while Nvidia went full bore. The end result was AMD's horrible tessellation performance compared to Nvidia in a benchmark. With egg on their face, they came up with a fix by overriding what the developer wants, capping the tessellation factors, and spinning it as something nobody would notice anyway.

But I agree that it wouldn't hurt Nvidia to include the ability to scale tessellation in the driver for those who are OK with less precise tessellation in exchange for better performance.
 
Status
Not open for further replies.