[Nvidia] The Witcher 3: Is your system ready - Nvidia official system requirements


Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I was very much looking forward to The Witcher 3, but the realization that they butchered what could have been a visually stunning game on PC, because the awful consoles are lagging behind, makes me not want to buy the game.

I've seen videos of the final build, and the colours they now use are way too cartoonish for my liking, and the grass, trees and textures are just horrible compared to what we were shown earlier.

For a company that got good PR because they said DLC would be free and they wouldn't charge money for it like other game companies, it seems they forgot to care about PC gamers.
Screw consoles. Especially the Xbox One. They are dragging us down.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
So where is the link? So I can post it to an AMD rep and ask them to explain themselves.

I posted it for you.

The developers deleted countless topics about TressFX 2.0's bad performance on Nvidia.

I finished the game last month, and if I enable TressFX 2.0 in Lichdom: Battlemage it becomes a slideshow, but that's OK because it's AMD tech and no one can complain about it.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
But Lichdom: Battlemage still runs like total crap if I enable TressFX.

Quite possible; just because the source code is open doesn't automatically mean it will be optimized for NV. The devs/NV don't seem to care for the game, and truth be told I haven't heard of that game either. When was it released?
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Quite possible; just because the source code is open doesn't automatically mean it will be optimized for NV. The devs/NV don't seem to care for the game, and truth be told I haven't heard of that game either. When was it released?
August 2014.
 

arandomguy

Senior member
Sep 3, 2013
556
183
116
I think it is open source now, so no need to ask the devs or anything. There you go:
http://developer.amd.com/tools-and-sdks/graphics-development/amd-radeon-sdk/

And the actual licensing terms when you go to download it?

1. LICENSE:
a. Subject to the terms and conditions of this Agreement, AMD grants You the following non-exclusive, non-transferable, royalty-free, limited copyright license to (i) download, copy, use, modify, and create derivative works of the source code version of the Software and materials associated with this Agreement, including without limitation printed documentation, (collectively, “Materials”) for internal evaluation only with AMD processors or graphics products; and (ii) make and distribute copies of the Materials and derivative works thereof created by You in binary code form only for use only with Your products that support AMD processors and in computer systems including AMD processors or graphics products, provided that You agree to include all copyright legends and other legal notices that may appear in the Software. Additionally, You agree that any distribution of the Materials to a third party, must include a software license agreement with terms and conditions that are at least as restrictive and protective of AMD’s intellectual property rights in the Materials as the terms and conditions set forth herein, including but not limited to the terms and conditions set forth in Sections 4 through 7 herein. Except for the limited license granted herein, You shall have no other rights in the Materials, whether express, implied, arising by estoppel or otherwise.
b. Except as expressly licensed herein, You do not have the right to (i) distribute, rent, lease, sell, sublicense, assign, or otherwise transfer the Materials, in whole or in part, to third parties for commercial or for non-commercial use; or (ii) modify, disassemble, reverse engineer, or decompile the Software, or otherwise reduce any part of the Software to any human readable form.
c. AMD is under no obligation to support or provide maintenance for the Materials or to provide any updates or enhancements to You.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
IIRC the devs stated that the recommended specs are for 30 fps only, so you'd need to double them for 60 fps. So SLI Titan X or 4-way 980s for 60 fps at 4K.

Honestly these requirements aren't all that different from The Witcher 2's, but of course, since it's DX11 instead of DX9, we all expect better code at this point.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
It seems like they were relying on inflated console specs to brute-force their way past the fact that they're still working on the Witcher 2 engine, which was designed for DX9. I'm sure a ground-up, native six-thread, next-gen-only engine would have been estimated to take a lot longer than an upgraded DX9 engine. Except it looks like they had to put in a great deal of time to strategically and incrementally downgrade the graphics so it runs acceptably, so that time was wasted backtracking.

One wonders whether writing a next-gen DX11 engine from the ground up would have taken any longer than this backtracking-delayed upgrade of the Witcher 2 DX9 engine.

From the 2013 demos it's clear they know how to program extremely good-looking DX11 effects. I imagine that because the base engine wasn't designed with them in mind (DX9 was the latest API at the original architecture design stage), they are getting much worse performance on consoles for each effect than if the engine had been written from the ground up to incorporate those techniques and the consoles' quirks and limitations. That is probably most true from a CPU perspective.

Basically it looks like they jumped the gun by designing before they knew the specs, gambled, and made the wrong bet.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The driver sees every call. They can optimize every line of code without having access to it. Otherwise the game wouldn't start...

No, they cannot. It looks like you still don't understand how GW's EULA works. By virtue of the GameWorks EULA, AMD or Intel can never optimize any part of the GW source code in any game. The developer can also never alter, manage, optimize, de-compile, or otherwise modify any of the GW source code without NV's permission. Would NV give the developer permission to optimize GW's source code so that it runs better on AMD, or to share that code with AMD? No, because that would defeat the entire point of the GW program. AMD can optimize all other parts of the game, excluding GW. I would still expect an AMD driver post launch, though, and probably a 20-40% performance hit at launch against similarly performing NV cards in other games.

AMD can optimize it through the driver.

People often confuse being able to optimize for the game with being able to optimize for a library that the game calls into. The game might call functions from a library, that library communicates with the GPU in whatever fashion it wants, and then you "integrate" that result into the game code, to put it in layman's terms. The problem AMD has with GameWorks libraries is that they either have to reverse engineer them (and at this point I doubt they can even get their hands on them, and I'm sure there are revisions), or they literally have to record every single call the game can make and then reverse engineer those calls for maximum performance, which is basically prohibitive in time and manpower costs.

When NVIDIA says that they give "source access", they mean that they give it to the developers. Meanwhile, the whole point of ready-made libraries is that you don't have to write the thing yourself, so giving the source code to the developers means nothing.

If they really believed what they say, they could have open-sourced the GW libraries and enabled their competition to optimize for them (like AMD did with TressFX, or with all the DirectCompute source code in Dirt Showdown, Sniper Elite 3, Hitman Absolution, etc.).
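To make that layering concrete, here's a minimal C++ sketch of a game talking to a closed middleware module. Every name in it (MiddlewareHair, SimulateAndDraw, the parameters) is made up for illustration; it is not the real HairWorks API. The point is simply that the developer only ever sees a header, the implementation normally ships as a prebuilt binary, and the only thing a GPU driver ever observes is whatever draw/dispatch calls that binary chooses to issue at runtime:

// middleware_sketch.cpp - hypothetical illustration, NOT the real HairWorks API.
// Build: g++ -std=c++17 middleware_sketch.cpp -o middleware_sketch
#include <cstdio>

// ---- What the developer receives: a header only. -------------------------
struct HairParams {
    int   strandCount;
    float tessellationFactor;   // e.g. 64.0f in the shipped game
};

class MiddlewareHair {          // hypothetical stand-in for a GW module
public:
    void Init(const HairParams& p);
    void SimulateAndDraw();     // issues its own GPU work internally
};

// ---- What the game code looks like. ---------------------------------------
int main() {
    MiddlewareHair hair;
    hair.Init({20000, 64.0f});  // the game just forwards parameters...
    hair.SimulateAndDraw();     // ...and calls into the black box each frame.
    return 0;
}

// ---- Stub implementation so this sketch actually runs. --------------------
// In reality this part lives inside the closed binary; neither the developer
// nor AMD gets to read or modify it.
void MiddlewareHair::Init(const HairParams& p) {
    std::printf("init: %d strands, tess factor %.0f\n",
                p.strandCount, p.tessellationFactor);
}
void MiddlewareHair::SimulateAndDraw() {
    std::printf("internal compute + tessellated draw calls go straight to the driver\n");
}

A driver team can still pattern-match on that call stream, but without the source they are guessing at intent, which is exactly the reverse-engineering cost described above.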

The water is still geometry and uses tessellation...

Doesn't mean anything. The physics simulation behind interactive waves/ripples has been neutered/removed.

The original plan, per the video I link below, was to have the water tessellated at 16x and to have advanced wave physics, etc. That has been scrapped, and gamers who dug into the game noticed that the tessellation factor has been increased to 64x. It makes sense why, as part of GW, they would want to increase tessellation 4x for hardly any visual benefit:

[chart: tessellation performance scaling across tessellation factors]


AMD cards have decent tessellation up to 16x but it falls off a cliff at 32x and 64x.
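For a rough sense of why the jump from 16x to 64x matters, here's a back-of-the-envelope sketch in C++. The assumptions are mine, not the game's real numbers: uniform integer partitioning where a patch at edge factor N produces roughly N*N triangles, and an arbitrary made-up patch count. Even so, the quadratic growth means 64x generates about 16 times the triangles of 16x for a marginal visual change, right in the range where older AMD tessellators fall behind:

// tess_cost.cpp - back-of-the-envelope only; assumes ~N*N triangles per patch
// at edge factor N and a hypothetical patch count, not the game's real geometry.
#include <cstdio>

int main() {
    const int factors[] = {8, 16, 32, 64};
    const long long patches = 10000;              // hypothetical water patches
    for (int n : factors) {
        long long trisPerPatch = 1LL * n * n;     // quadratic growth with factor
        std::printf("factor %2dx: ~%6lld tris/patch, ~%lld total\n",
                    n, trisPerPatch, trisPerPatch * patches);
    }
    // 64x vs 16x: (64/16)^2 = 16x the triangles for hardly any visual gain.
    return 0;
}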

It has been downgraded, but do you really believe that?

100%. For the last 2.5 years they fed us screenshots and trailer videos showing off graphics we aren't getting. Don't get me wrong, the game still looks great, but not amazing, and not the next Crysis 1 of RPGs that we were promised.

Even during E3, when they were promoting all the next-generation effects like advanced wave physics, real-time reflections of water/blood splatter on water, tessellated ground terrain, and advanced fire and smoke particles, roughly 95% of that has since been removed. All real-time global illumination has been removed from every aspect of the game. It's only pre-baked lighting now, with random light sources.

Original Effects advertised for the Witcher 3 -> next gen PC graphics

I understand why CDPR couldn't realistically make a game with 100+ hours of gameplay that advanced, though. But then they should never have hyped up the technical graphics with all those pre-rendered screenshots and marketing trailers. I guess that's how this industry works. Far Cry 4, Watch Dogs, The Witcher 3, AC Unity, etc. were all misrepresented to generate more hype and thus pre-orders. Most of us didn't even know how badly the game was downgraded until the last 2 months, which shows how good a job CDPR did of hiding that they coded for the lowest common denominator - the XB1 - and threw in higher resolution and GW features as the standout PC settings. Even the draw distance appears nearly identical in videos between the console and PC Ultra versions.

This is still going to be an amazing game from a story and gameplay perspective, but the fact that a single 980 can max it out at 1080p Uber settings shows it's no powerhouse. A true Crysis 1 / Crysis 3 moment in PC gaming would need Titan X SLI to run the game at 30 fps with everything maxed out at 1080p, and it would look better than ANY game, not just those in its genre. :p
 
Last edited:

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Even Nvidia users won't be running HairWorks unless they have dual 980s for 1080p or dual Titan Xs for 2560x1440. The devs turned it on at times in the recent gameplay reveals, and at 1080p with a single 980 the game was running like total garbage. Forget about it at 4K altogether.

It's a shame how hard this game was hyped for its visuals and then massively downgraded as the years went by. It looks no better than Dragon Age: Inquisition at this point.

2013

[gifs: 2013 pre-release gameplay footage]

2015

[screenshots: 2015 final-build footage]



At least the game itself looks to be amazing. But the promised incredible graphics are non-existent.
damn, the downgrade is real. damn consoles. well, another PC game dev just bit the dust :(
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I was very much looking forward to The Witcher 3, but the realization that they butchered what could have been a visually stunning game on PC, because the awful consoles are lagging behind, makes me not want to buy the game.

I've seen videos of the final build, and the colours they now use are way too cartoonish for my liking, and the grass, trees and textures are just horrible compared to what we were shown earlier.

For a company that got good PR because they said DLC would be free and they wouldn't charge money for it like other game companies, it seems they forgot to care about PC gamers.
Screw consoles. Especially the Xbox One. They are dragging us down.

Completely agreed.

The only true visually next gen game still coming along seems to be Star Citizen. I'm sure The Division is going to suffer the same fate as the Witcher, AC Unity and Watch Dogs did.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Stop with the off-topic crap about TressFX and GameWorks unless it specifically has something to do with The Witcher 3; it's cluttering this thread up with the same old same old.

Mods, please delete this message if you clean the rest of the thread. Thanks.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
damn, the downgrade is real. damn consoles. well, another PC game dev just bit the dust :(

Wow....

This looks pretty much just like TW2, which looked great, but could have been so much more based on the original pics.

Oh well. :/
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
They can optimise through the driver; you don't need the source for that.

They can't propose modified code that runs much better on their GPUs, which means HairWorks will always run badly on current AMD GPUs, as it uses quite a bit of tessellation, and the R9 285 is the only AMD GPU that handles that well.

That's not how it works. You cannot optimize code without knowing what you are optimizing. People saying "they can optimize through the driver" obviously have zero development experience. Yes, they can optimize code on the driver side, but it's extremely difficult without knowing what user space is doing.

If you take TressFX 1.0, for example: nVidia had their driver optimized for it within two weeks because they had the source code. They didn't put any code changes into TressFX; they only changed their side. AMD cannot do this with HairWorks because they are not allowed to see the source code.
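For anyone wondering what "optimizing on the driver side" can look like in practice, here's a simplified, hypothetical C++ sketch of shader replacement: the driver fingerprints incoming shader bytecode and, if it recognizes it, swaps in a hand-tuned variant. Real driver internals are far more involved, but the asymmetry is the same: the replacement table is cheap to build when you have the source (as NV did with TressFX) and expensive when you have to capture and reverse-engineer a closed library that can change with every revision:

// shader_swap_sketch.cpp - conceptual, hypothetical sketch of driver-side
// shader replacement; real drivers work at the bytecode/compiler level.
#include <cstdint>
#include <cstdio>
#include <string>
#include <unordered_map>
#include <vector>

// Toy fingerprint standing in for however a driver identifies shader blobs
// (FNV-1a here just so the sketch runs).
uint64_t Fingerprint(const std::vector<uint8_t>& bytecode) {
    uint64_t h = 1469598103934665603ULL;
    for (uint8_t b : bytecode) { h ^= b; h *= 1099511628211ULL; }
    return h;
}

int main() {
    // Hand-tuned replacements the vendor prepared after studying the effect.
    // With source access this table is straightforward to build; with a closed
    // library each entry first has to be captured and reverse-engineered.
    std::unordered_map<uint64_t, std::string> optimized;
    std::vector<uint8_t> knownHairShader = {0x44, 0x58, 0x42, 0x43};   // fake blob
    optimized[Fingerprint(knownHairShader)] = "hand-tuned hair shader";

    // At shader-creation time the driver checks whether it recognizes the blob.
    auto createShader = [&](const std::vector<uint8_t>& blob) {
        auto it = optimized.find(Fingerprint(blob));
        if (it != optimized.end())
            std::printf("substituting %s\n", it->second.c_str());
        else
            std::printf("compiling application shader as-is\n");
    };

    createShader(knownHairShader);           // recognized -> swapped
    createShader({0xAA, 0xBB, 0xCC, 0xDD});  // unknown blob -> left alone
    return 0;
}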
 
Last edited:

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
That's not how it works. You cannot optimize code without knowing what you are optimizing. People saying "they can optimize through the driver" obviously have zero development experience. Yes, they can optimize code on the driver side, but it's extremely difficult without knowing what user space is doing.

If you take TressFX, for example: nVidia had their driver optimized for it within two weeks because they had the source code. They didn't put any code changes into TressFX; they only changed their side. AMD cannot do this with HairWorks because they are not allowed to see the source code.

This kind of logic obviously flies over the heads of some people here. "But you can just do it in the driver!!1! Blame AMD!" Please. I considered switching to Nvidia just for this game, but there's seriously no point. That, and I won't support this artificial gimping BS that Nvidia loyalists devour.

I'm mad at the bait and switch, but I'll still love this game.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
As long as people continue to buy games when there is strong evidence of graphics gimping, developers will keep on not giving a damn about PC gamers.

There should have been protests and a worldwide refusal to buy the game until CD Projekt issued a patch with far better textures. I bet they had already made many of them, but discarded them and toned down other in-game graphics settings to fit the crappy consoles. With a proper protest they could probably have released a patch with much better graphics for PC gamers 1-2 months after launch.

But that isn't happening with the success The Witcher 3 has become. People are used to settling for less.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Looks like a similar EULA to the one people complained about for PhysX...

What's your point? That we have gotten to the point where all NV features should be disabled on Intel/AMD GPUs because they are made to run best on NV only?

Let's say Intel goes and works with EA on Star Wars Battlefront and over the course of 2-3 years makes a game that scales extremely well across 8 CPU cores, but whenever the game detects any non-Intel CPU, it blocks that hardware from running the game across 8 threads. Would that advance PC gaming and be great for it? I guess you would say YES, but it runs counter to the PC gaming ethos, which is never to code a game that purposely limits performance on another vendor's hardware. That's what GW does, and what forcing PhysX onto the CPU for AMD users with NV secondary cards does. That's why a lot of people have such a negative association with GW, far more than with TWIMTBP or AMD GE.

All NV needs to do is share all of the source code with Intel and AMD. If NV cards still run faster, all is fair. Right now we are moving towards NV spending more marketing dollars and inserting more and more of its closed source code into games, which means NV is becoming a co-developer of the software, and a closed-source one at that. That would be akin to console exclusives, or to a developer coding a game that purposely favours the XB1 or PS4 over the other.

It seems people cheering for GW would rather have only NV in the marketplace, since if NV has 100% of the GPU market, developers can make ALL games with GW features without any fear of alienating Intel or AMD GPU users. I guess that would be your perfect scenario then. :cool:

As long as people continue to buy games when there is strong evidence of graphics gimping, developers will keep on not giving a damn about PC gamers.

In this case, I still think the game is worth every penny of $60 USD. It's an open-world RPG the likes of which we haven't seen, and it has 100+ hours of things to do, plus another 25-30 hours in the DLC. Even if you take out all the side-quests, I bet there is a solid 35-40 hours of real single-player storytelling. I personally don't care much for 3rd- or 1st-person multiplayer games, which is why to me it's not about supporting a gimped game but about supporting the idea of making single-player games with campaigns that last longer than 5-7 hours. The developer is even giving away 16 DLCs for free, even to those who pirated the game. Contrast this with many other games with worthless DLC, like Evolve, that push the price to $120. Look at Batman: Arkham Knight; it's $90-100 with the DLC and the game will likely be 1/2 to 1/3 the size.

Even if this game literally had graphics only as good as The Witcher 2's, I would still get it. My problem was more with the marketing of the game, but I guess it's partially my own fault for believing they would make 2 separate versions of the game - one for the neutered consoles, and one for the PC with Ultra/Uber settings matching what the trailers showed.

Do game devs have access to Nvidia's and AMD's driver source code?

This question misses the point, since it doesn't touch on the issue of GW. There is no AMD source code in the game's engine. CDPR confirmed in writing that HairWorks cannot be optimized for AMD cards on their behalf (and we know why: the GW EULA prohibits it). That's why they recommended that AMD owners turn this feature off in the game.

I'll re-iterate:

"People many times confuse being able to optimize for the game, from being able to optimize for a library that the game calls from. The game might call functions from a library, that library communicates with the GPU in any fashion it wants, and then you "integrate" that result into the game code, to put it in layman's terms. The problem that AMD has with GameWorks libraries is that they either have to reverse engineer them (and at this point I doubt if they even can put their hands on them, and I'm sure there are revisions), or they literally have to record every single call possible from the game and then reverse engineer the calls for maximum performance, which is basically prohibitive in time and manpower costs.

When NVIDIA says that they give "source access", they mean that they give it to the developers. Meanwhile, the whole point of the ready libraries is that you don't have to write the damn thing yourself, so giving access of the source code [they aren't allowed to ever share with any 3rd party vendor other than NV] to the developers means nothing."

GW = NV's proprietary SDK source code libraries, designed to run best on NV GPUs since they are optimized specifically for NV's GPU architectures.

VisualFX

Provides solutions for rendering and effects, including:

- HBAO+ - Enhanced Horizon Based Ambient Occlusion
- TXAA - Temporal anti-aliasing
- Soft Shadows - Improves on PCSS to reach new levels of quality and performance, with the ability to render cascaded shadow maps
- Depth of Field - Combination of diffusion-based DOF and a fixed-cost, constant-size bokeh effect
- FaceWorks - Library for implementing high-quality skin and eye shading
- WaveWorks - Cinematic-quality ocean simulation for interactive applications
- HairWorks - Simulation and rendering of fur, hair and anything with fibers
- GI Works - Global illumination that greatly improves the realism of the rendered image
- Turbulence - High-definition smoke and fog with physical interaction, as well as supernatural effects

At any point, NV could decide to block all GW source code from even running on Intel/AMD GPUs if it wanted to. We already have an example in The Crew, where HBAO+ is blocked on AMD cards even though HBAO+ works in FC4. Even if it's the developer's fault rather than NV's that HBAO+ is blocked in The Crew, it's only because of the existence of GW that we are even discussing HBAO+. If GW didn't exist, the developer would be forced to design its own advanced in-house equivalent of HBAO+. Sure, it might take the PC industry longer to adopt some next-generation graphical features, but it's better that they all be adopted as open source.

If GW had existed in its current form in the past, things like SM 3.0, SM 4.0, tessellation, and all the other graphical advancements we have seen over the last 20 years would have been proprietary NV SDK libraries. That would have forever altered the course of PC gaming as we know it.

There is no AMD Gaming Evolved graphical feature that AMD inserts into any game while prohibiting NV from tapping into the source code, which means NV can optimize for every single GE graphical setting/feature. NV does the complete opposite. If we imagine all of a game's shaders and effects being replaced entirely by NV GW features in the year 2025, AMD would be able to optimize for 0% of the game. I guess that's the future some PC gamers want, since they'd rather have a monopoly and get AMD out of the way so all games can have GW. That way a PC gamer never has to worry about which GPU to buy - the choice is clearly NV!

// Anyway, this is getting off-topic. Let's get back specifically to the Witcher 3 please.

Completely agreed.

The only true visually next gen game still coming along seems to be Star Citizen. I'm sure The Division is going to suffer the same fate as the Witcher, AC Unity and Watch Dogs did.

I think it's clear that basically all cross-platform AAA games will be console versions with slightly enhanced visuals on PC, courtesy of GameWorks (or, in rare cases, AMD GE). The key benefits of buying those games on PC will be mods (if applicable), cheaper prices (if applicable), superior control options (mouse and keyboard), higher resolutions (1080p -> 5K), flexibility (Eyefinity, 3D Vision Surround gaming), superior performance and smoothness (the ability to game at up to 144 Hz), and better solutions to tearing/stutter (FreeSync/G-Sync).

Overall, PC games will still look better than their console counterparts, but the difference won't be as large as it was with Crysis 1 or Crysis 3 vs. Xbox 360/PS3 games. You are probably not going to be able to fire up any PC game and make a PS4 owner with Uncharted 4 or The Order: 1886 have his jaw hit the floor at your PC gaming graphics.

That's why I hope this console generation gets replaced by fall 2019 at the latest, and for the next round I hope MS's and Sony's consoles are equally powerful so that the XB2 or PS5 doesn't gimp game development too much. By the look of it, Nintendo's NX might be some hybrid of the New 3DS and Wii U, which means I don't foresee Nintendo competing on graphics with the PS5/XB2. Games made specifically for the PC, like Star Citizen, are the only ones that have a chance to truly WOW us against the console AAA ports, imo.
 
Last edited: