
[PcGameshardware] The Witcher 3 Benchmark

No, I am just saying that rather than any BS from AMD about developers being lazy or bad, they need to spend money, simple and straight. AMD wants everything for free, so it is never going to happen.


I would hope the lead devs can make a game engine work well; otherwise why not just pay AMD to make your game for you... or even better, AMD should pay devs to build their game and keep none of the profit!
 
If you watched that Twitch video and were following the development process of The Witcher 3, you would know that CD Projekt Red were showing off HairWorks on wolves a couple of years ago, showing they had already committed. GameWorks compensation and integration is by now probably a contractual obligation.

Then, from the Twitch video and how this game was developed, final code for HairWorks was delivered by Nvidia and integrated into the main game branch a couple of months before shipping. It doesn't sound like even CD Projekt Red had source code access to it.

Before this code, AMD was running just fine, probably Kepler too. After this code, with the gold press copy a month away, the performance problem reared its ugly head. AMD offered TressFX at that point, but CD Projekt Red knew it was too late and said no.

From the downgrades, to the console-catering development, to lying right up till release, CD Projekt Red were in over their heads and are as much to blame as either AMD or Nvidia.

I'm not saying they did... but CDPR is a Polish company. A dollar here is worth a lot of man-hours... I'm just saying... :whistle:
 

I don't think there is any point in arguing with him. If you make a point he cannot counter he will either just repeat himself or ignore your post. If he cannot understand what open source means then your argument is wasted.
 

No doubt. I'm not saying they're 100% to blame for taking what I would believe to be the top end of a possible GameWorks deal. It's all business, and Nvidia is the one with a commanding lead over AMD.

I put the onus on the entirety of CD Projekt Red, not just the devs. Publishers and upper management, whose main goal is to make money, have a lot more sway when they're handing out the paychecks.

As an aside, CD Projekt Red were heralded as a bastion of PC gaming. Oh, how the mighty have fallen. I'm going to be extremely cautious about Cyberpunk.
 
I think they should simply add different quality modes for HairWorks in this game: make 16x "high" (combined with perhaps lower MSAA?) and 64x "extreme" or something, and people would be happier.

It does look a bit silly for it to be using such high levels of tessellation for a small benefit in visual quality compared to more reasonable levels, or nothing at all...

And it feels like the Batman cape or the invisible stuff in Crysis 2... something that at a certain point exists more to generate a performance problem for other products than to increase quality for anyone.
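To put rough numbers on why 64x tessellation costs so much more than 16x for a small visual gain, here is a back-of-envelope sketch. It assumes a triangle domain where tessellation level n produces on the order of n² sub-triangles per patch; the exact Direct3D 11 tessellator counts differ, so treat this as an order-of-magnitude illustration, not a precise model.

```python
# Approximate cost model: a triangle patch with all tessellation
# levels set to n emits on the order of n^2 sub-triangles.
# (Simplified assumption; the real D3D11 tessellator rules differ.)
def approx_triangles(tess_factor: int) -> int:
    return tess_factor ** 2

for factor in (8, 16, 64):
    print(f"{factor}x -> ~{approx_triangles(factor)} triangles per patch")

# Relative geometry load of 64x versus a hypothetical "high" 16x mode:
print(approx_triangles(64) / approx_triangles(16))  # 16x the triangles
```

Under this assumption, 64x pushes roughly sixteen times the geometry of 16x per hair patch, which is why capping the factor recovers so much performance.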

I think that is the heart of the matter, especially given this is what was achieved initially. I don't think the achievement here with GameWorks/HairWorks, namely generating a performance problem for other products, was a mistake. It's a "watch what they do, don't listen to what they say" kind of matter.

The irony is that AMD users get to work around this nonsense by lowering the tessellation factor. On a 280X I'm having great performance with HairWorks and a mix of high/ultra settings in the game now.
 
The poor Kepler performance in TW3 isn't due to GameWorks, though. The HairWorks performance hit is similar on a 780 Ti and a 980. So even if you could lower the tessellation factor on NV, I doubt it'd help. That's not the issue here.

Kepler has bad performance even with all the GW features turned off.
 
Just bought Witcher 3 for my rig below. I have the AMD tessellation workaround set at 8x for the game. I'm using Ultra settings and so far it's running well.
 
AMD was going behind the developers' backs, and worse, their customers' backs, till it was exposed; then AMD added an option in CCC.

That's just inventing things. I've had ATi cards since the 9500 Pro and there has always been the option to disable these optimisations (Catalyst A.I.). But nice way to completely ignore my actual point. Makes your position look real strong......
 
I'm not a fan of optimizations that go behind the developer's back and alter IQ, even if only very slightly; that's a very slippery slope to me. However, if it gains more performance and makes a feature usable, I can understand the trade-off, and the flexibility.

This^^^

If AMD had done that, they would have been accused of cheating. For example, if Nvidia's Kepler fix is accomplished by reducing IQ, watch the crap hit the proverbial fan.
 
I don't like these types of optimizations, but I understand why AMD offered them, given how much Nvidia uses 64x tessellation.
 
People just like to make up their own facts from unicorn land because it makes them feel better about using certain "brands".

Want to know what actually happens? Ask AMD, Nvidia, and the developers:
[Screenshots: statements from AMD, Nvidia, and a developer, taken from the ExtremeTech GameWorks FAQ linked below]


Contrary to what some forumers here believe, AMD is actually the one who usually helps developers optimize their engines. Where are those forumers claiming AMD is always lazy with developers?

Source: http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy

Developers also say hurting AMD probably wasn’t the primary motive of Gameworks.
 
Hey guys.

I did a run of my own benchmarks and I am posting them here for anyone that may be interested.

The benchmark consists of the whole Woodland Beast mission. It starts with a quick real-time cutscene, then a fight with two ghouls, a ride with Roach, a fight with two alghouls, a fight with five Scoia'tael, and a fight with six drowners. All in all, around 8 minutes of gameplay. Same path on all systems, and the same actions as humanly possible.

No HairWorks on any of the tests, hence the presets' (-) deviation in some of them.

For the Ultra settings I tested my 970 on three processors, the 2500K, i7-860, and Q9550, as well as the 7950 with the 860. (Spicy wallpapers on all video links.)

Witcher 3 1920x1080 Ultra(-) GTX 970 @1.5GHz Core i5-2500K @4.8GHz - 68fps

Witcher 3 1920x1080 Ultra(-) GTX 970 @1.5GHz Core i7-860 @4GHz - 62fps

Witcher 3 1920x1080 Ultra(-) GTX 970 @1.5GHz Q9550 @4GHz - 61fps

Witcher 3 1920x1080 Ultra(-) 7950 @1.1GHz Core i7-860 @4GHz - 33fps


For the High settings, I tested the 7950 and the 570:

Witcher 3 1920x1080 High(-) 7950 @1.1GHz Core i7-860 @4GHz - 47fps

Witcher 3 1920x1080 High(-) GTX 570 @850MHz Q9550 @4GHz - 31fps


And for the Medium settings I tested the 570 and the 5850:

Witcher 3 1920x1080 Medium GTX 570 @850MHz Q9550 @4GHz - 36fps

Witcher 3 1920x1080 Medium 5850 @950MHz Q9550 @4GHz - 29fps

These are with the 1.02 patch. I did a rerun with 1.03 on the 970+2500K and found equal results. The 7950+860 did not seem to produce anything considerably better with 1.03 and Catalyst 15.5, although I did not do a complete benchmark.

Take care.
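A quick sketch (my own summary, not part of the original post) normalizing the Ultra(-) GTX 970 results above against the fastest CPU, to show how little the processor changes the outcome:

```python
# GTX 970 @ 1.5 GHz, Ultra(-) preset, numbers as posted above.
ultra_results = {  # CPU -> average fps
    "i5-2500K @ 4.8 GHz": 68,
    "i7-860 @ 4.0 GHz": 62,
    "Q9550 @ 4.0 GHz": 61,
}

best = max(ultra_results.values())
for cpu, fps in ultra_results.items():
    # Express each run as a percentage of the fastest CPU's result.
    print(f"{cpu}: {fps} fps ({100 * fps / best:.0f}% of the fastest)")
```

Even the old Q9550 lands within about 10% of the heavily overclocked 2500K, which is why the thread reads these numbers as the game being GPU-bound at these settings.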
 
Thanks, man.
 
Not only that, it implies a secondary motivation. And I find that credible: Nvidia would first want to push their proprietary tech, and if they can at the same time gimp or even disable visuals on Radeon hardware, even better from their perspective.
 

Thanks for the benches. They do confirm how light Witcher 3 is on the CPU. Amazingly so, given the recent trend of games being CPU-bottlenecked, particularly open-world games.
 
@pslord

What kind of GPU usage are you seeing in Novigrad? I get 99% everywhere except when I enter cities; the CPU pegs to the max and GPU usage drops. Of course, my i5-2500K is only at 4.0 vs your 4.8.
 