Looks like a similar EULA to the one people complained about for PhysX...
What's your point? That we have gotten to the point where all NV features should be disabled on Intel/AMD GPUs because they are made to run best on NV only?
Let's say Intel goes and works with EA on Star Wars Battlefront and over the course of 2-3 years makes a game that scales extremely well across 8 CPU cores. But whenever the game detects any CPU other than a genuine Intel one, it blocks that hardware from running the game across all 8 threads (see the sketch below). Would that advance PC gaming and be great for it? I guess you would say YES, but it runs counter to the PC gaming ethos: never code a game that purposely limits performance on another vendor's hardware. That's exactly what GW does, and what forcing PhysX onto the CPU for AMD users with NV secondary cards did. That's why a lot of people have such negative associations with GW, far more than with TWIMTBP or AMD GE.
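To see how trivial that kind of gating would be to implement (and why it's so objectionable), here is a minimal C++ sketch. It is purely hypothetical, not from any real game, and the function names are mine: it reads the x86 CPUID vendor string and caps the worker-thread count when the CPU doesn't report "GenuineIntel":

```cpp
// Hypothetical vendor gate -- illustration only, not real Battlefront code.
// Compiles with GCC/Clang on x86: g++ -std=c++11 vendor_gate.cpp
#include <cpuid.h>    // __get_cpuid (GCC/Clang; MSVC would use __cpuid)
#include <cstring>
#include <string>
#include <thread>

static std::string CpuVendor() {
    unsigned eax = 0, ebx = 0, ecx = 0, edx = 0;
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);   // leaf 0 returns the vendor string
    char vendor[13] = {};
    std::memcpy(vendor + 0, &ebx, 4);         // the 12 bytes are laid out
    std::memcpy(vendor + 4, &edx, 4);         // in EBX, EDX, ECX order
    std::memcpy(vendor + 8, &ecx, 4);
    return vendor;                            // "GenuineIntel", "AuthenticAMD", ...
}

unsigned WorkerThreadCount() {
    unsigned cores = std::thread::hardware_concurrency();
    if (CpuVendor() != "GenuineIntel")
        return cores < 4 ? cores : 4;         // artificial cap on non-Intel CPUs
    return cores;                             // full 8-thread scaling on Intel
}
```

One string comparison is all it would take; nothing about the non-Intel hardware actually warrants the cap.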
All NV needs to do is share all of the source code with Intel and AMD. If NV cards still run faster after that, all is fair. Right now we are moving towards NV spending more marketing dollars and inserting more and more of its closed-source code into games, which means NV is becoming a co-developer of the software, and a closed-source one at that. That would be akin to console exclusives, or to a developer coding a game that purposely favours XB1 or PS4 over the other.
It seems people cheering for GW would rather have only NV in the marketplace, since if NV had 100% of the GPU market, developers could make ALL games with GW features without any fear of alienating Intel or AMD GPU users. I guess that would be your perfect scenario then.
As long as people continue to buy games even when there is strong evidence of graphics gimping, developers will keep on not giving a damn about PC gamers.
In this case, I still think the game is worth every penny of $60 USD. It's an open-world RPG the likes of which we haven't seen, with 100+ hours of things to do, plus another 25-30 hours in the DLC. Even if you take out all the side-quests, I bet there is a solid 35-40 hours of real single-player storytelling. I personally don't care much for 3rd- or 1st-person multiplayer games, which is why for me it's not about supporting a gimped game but about supporting the idea of making single-player games with campaigns that last longer than 5-7 hours. The developer is even giving away 16 DLCs for free, even to those who pirated the game. Contrast that with games like Evolve, whose near-worthless DLC pushes the total to $120. Or look at Batman: Arkham Knight: it's $90-100 with the DLC, and the game will likely be 1/2 to 1/3 the size.
Even if this game literally had graphics only as good as Witcher 2's, I would still get it. My problem had more to do with the marketing of the game, but I guess it's partially my own fault for believing they would make 2 separate versions - one for the neutered consoles, and one for the PC with Ultra/Uber settings matching what the trailers showed.
Do Game devs have access to Nvidia and AMD's driver source code?
This question misses the actual issue with GW. There is no AMD source code shipped in the game's engine as part of the game. CDPR confirmed in writing that HairWorks cannot be optimized for AMD cards on their end (and we know why: the GW EULA prohibits it). That's why they recommended AMD owners turn this feature off in the game.
I'll reiterate:
"People many times confuse being able to optimize for the game, from being able to optimize for a library that the game calls from.
The game might call functions from a library, that library communicates with the GPU in any fashion it wants, and then you "integrate" that result into the game code, to put it in layman's terms. The problem that AMD has with GameWorks libraries is that they either have to reverse engineer them (and at this point I doubt if they even can put their hands on them, and I'm sure there are revisions), or they literally have to record every single call possible from the game and then reverse engineer the calls for maximum performance, which is basically prohibitive in time and manpower costs.
When NVIDIA says that they give "source access", they mean that they give it to the developers.
Meanwhile, the whole point of ready-made libraries is that you don't have to write the thing yourself, so giving developers access to source code [which they are never allowed to share with any 3rd-party vendor other than NV] means nothing."
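To put that in code: below is a minimal C++ sketch of the "opaque library" model being described. The library and symbol names are invented for illustration (real GameWorks modules and entry points differ); the mechanism, loading a prebuilt binary and calling its exported functions, is the general pattern:

```cpp
// Sketch of a game calling into a closed, prebuilt library. The .so name
// and the "SimulateHair" symbol are hypothetical; real GameWorks modules
// differ. Build on Linux with: g++ opaque_call.cpp -ldl
#include <dlfcn.h>    // POSIX dynamic loading (Windows uses LoadLibrary/GetProcAddress)
#include <cstdio>

// All the game ever sees is a function signature from the SDK header:
using SimulateHairFn = void (*)(float dt);

int main() {
    // The binary ships with no source attached.
    void* lib = dlopen("./libhairworks_like.so", RTLD_NOW);
    if (!lib) { std::fprintf(stderr, "%s\n", dlerror()); return 1; }

    auto simulate = reinterpret_cast<SimulateHairFn>(dlsym(lib, "SimulateHair"));
    if (simulate)
        simulate(1.0f / 60.0f);  // a black box: it drives the GPU however it wants

    dlclose(lib);
    return 0;
}
```

The developer integrates whatever comes back; anyone without the source (i.e. AMD) can only see the exported symbols and the calls crossing that boundary, which is exactly the reverse-engineering problem described above.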
GW = NV's proprietary SDK libraries, closed-source and designed to run best on NV GPUs, since they are optimized specifically for NV's GPU architectures.
VisualFX provides solutions for rendering and effects, including:
- HBAO+: Enhanced Horizon Based Ambient Occlusion
- TXAA: Temporal Anti-aliasing
- Soft Shadows: improves on PCSS to reach new levels of quality and performance, with the ability to render cascaded shadow maps
- Depth of Field: combination of diffusion-based DOF and a fixed-cost, constant-size bokeh effect
- FaceWorks: library for implementing high-quality skin and eye shading
- WaveWorks: cinematic-quality ocean simulation for interactive applications
- HairWorks: simulation and rendering of fur, hair and anything with fibers
- GI Works: global illumination that greatly improves the realism of the rendered image
- Turbulence: high-definition smoke and fog with physical interaction, as well as supernatural effects
At any point, NV could decide to block GW code from even running on Intel/AMD GPUs if it wanted to. We already have an example in The Crew, where HBAO+ is blocked on AMD cards even though HBAO+ works fine in FC4. Even if that block is the developer's fault rather than NV's, it's only because GW exists that we're discussing a blocked HBAO+ at all. If GW didn't exist, the developer would be forced to design its own in-house equivalent of HBAO+. Sure, it might take the PC industry longer to adopt some next-generation graphical features that way, but it's better that they're adopted as open source.
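For the curious, gating an effect by GPU vendor is a one-line check once you have the adapter description. A C++/DXGI sketch follows; it is illustrative only, not The Crew's actual code:

```cpp
// Illustrative GPU vendor gate -- not The Crew's actual code.
// Windows; link with dxgi.lib.
#include <dxgi.h>
#include <wrl/client.h>

bool AllowHbaoPlus() {
    using Microsoft::WRL::ComPtr;

    ComPtr<IDXGIFactory> factory;
    if (FAILED(CreateDXGIFactory(IID_PPV_ARGS(&factory)))) return false;

    ComPtr<IDXGIAdapter> adapter;
    if (FAILED(factory->EnumAdapters(0, &adapter))) return false;

    DXGI_ADAPTER_DESC desc = {};
    adapter->GetDesc(&desc);

    // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = AMD, 0x8086 = Intel.
    // One comparison locks the feature to a single vendor, even though
    // the same effect runs fine on AMD elsewhere (e.g. HBAO+ in FC4).
    return desc.VendorId == 0x10DE;
}
```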
If GW had existed in its current form all along, things like SM 3.0, SM 4.0, tessellation, and all the other graphical advancements of the last 20 years would have been proprietary NV SDK libraries. That would have forever altered the course of PC gaming as we know it.
There is no AMD Gaming Evolved graphical feature inserted into any game where AMD prohibits NV from tapping into the source code, which means NV can optimize for every single GE graphical setting/feature. NV does the complete opposite. If we imagine a game in 2025 whose shaders and effects have been replaced entirely by NV's GW features, AMD would be able to optimize for 0% of that game. I guess that's the future some PC gamers want, since they'd rather have a monopoly and get AMD out of the way so that all games have GW. That way a PC gamer never has to worry about which GPU to buy - the choice is clear: NV!
// Anyway, this is getting off-topic. Let's get back specifically to the Witcher 3 please.
Completely agreed.
The only truly visually next-gen game still on the way seems to be Star Citizen. I'm sure The Division is going to suffer the same fate as The Witcher 3, AC Unity and Watch Dogs did.
I think it's clear that basically all cross-platform AAA games will be console versions with slightly enhanced visuals on the PC, courtesy of GameWorks (or in rare cases AMD GE). The key benefits of buying those games on the PC will be mods (if applicable), cheaper prices (if applicable), superior controls (mouse and keyboard), higher resolutions (1080p -> 5K), flexibility (Eyefinity, 3D Vision Surround gaming), superior performance and smoothness (gaming at up to 144Hz), and better solutions to tearing/stutter (FreeSync/G-Sync).
Overall, PC games will still look better than their console counterparts, but the difference won't be as large as it was with Crysis 1 or Crysis 3 vs. Xbox 360/PS3 games. You are probably not going to be able to fire up a PC game and make a PS4 owner playing Uncharted 4 or The Order: 1886 have his jaw hit the floor at your PC's graphics.
That's why I hope this console generation gets replaced by fall 2019 at the latest, and for the next round I hope MS's and Sony's consoles are equally powerful, so that neither XB2 nor PS5 gimps game development too much. By the look of it, Nintendo's NX might be some hybrid of the New 3DS and Wii U, which means I don't foresee Nintendo competing on graphics with the PS5/XB2. Games made specifically for the PC, like Star Citizen, are the only ones with a chance to truly WOW us against the console AAA ports, imo.