Or they are just using Nvidia's GameWorks libraries, which are unoptimized for ALL GPU hardware.
Since others haven't implemented it, there must be something stopping them. Maybe the reason is pure semantics, but it's there. Just like we use Windows and OS X on our desktops instead of the (still free) Linux alternatives: because it's not as simple as we sometimes think.
Stop spreading misinformation and flat-out fabricated data. The main reason Intel hasn't supported FreeSync is that companies that size aren't flexible enough to adopt the latest cutting-edge standard in projects/product designs that have been in the pipeline for 2-3 years.
"Now, Intel has thrown its own hat into the ring and announced that it intends to support the VESA Adaptive-Sync standard over the long term."
http://www.extremetech.com/gaming/212642-intel-will-support-freesync-standard-with-future-gpus
The fact of the matter is that nothing stops Intel or NV from incorporating FreeSync support into their products. NV will hold out, milking its loyal consumers for all it's worth, until market conditions force them to start supporting it. The only way NV won't support it is if AMD goes bankrupt or if NV maintains 80% market share indefinitely.
AMD needs to block GameWorks; otherwise Nvidia is gonna keep getting AAA titles under its belt.
Ya, and what does that mean? You think AMD can just go to Ubisoft and tell them to drop NV as a GameWorks partner when NV sends them engineers and pays for co-marketing of their games? What, do you think it's as easy as picking up the phone, scheduling a meeting with Ubisoft executives, and telling them to use open source?
Dying Light -- horribly optimized at launch, with a single CPU core pegged at 100% and massive performance drops as draw distance increased, despite almost no IQ differences.
Fallout 4 -- what needs to be said about this one? Outdated graphics, unoptimized on the CPU, GPU, and memory bandwidth side. Glitchy out the wazoo. God rays that are unnecessarily demanding, yet the game looks worse than Crysis Warhead, a 2008 game.
MGS V -- major issues with multi-GPU support, but about the only game on this list that was decent.
Witcher 3 -- works great without HairWorks, meaning the main GameWorks feature in this game is a FAIL on all GPUs. The performance degradation is massive due to excessive and pointless over-tessellation <-- proven wasteful via screenshots and by reducing the tessellation factor to lower levels on AMD cards.
Project Cars -- one of the worst-optimized games of 2015. When this game launched, an overclocked GTX 960 was almost as fast as a GTX 780 Ti. UBER FAIL.
Just Cause 3 -- a game aimed at consoles, with outdated graphics and GameWorks effects that hardly matter. Essentially, GameWorks did nothing to make this game look better than the console versions. The main differences, per Digital Foundry, are the inclusion of heat haze and slightly more detail. Basically a console game on the PC.
Assassin's Creed Syndicate -- oh wow, the epitome of GamesDon'tWork. It looks worse than Unity too, which means they had to downgrade graphical detail and the number of NPCs on screen, and that's what they call "better optimized than Unity." How is SLI working in this game? :sneaky:
Rainbow Six Siege -- outdated graphics, and the GameWorks features look terrible. The game was clearly made with consoles in mind. Graphics are miles worse than even Black Ops 3's.
Killing Floor 2 -- oh, you found one game that runs well on low-end hardware, but then again its graphics aren't anything special, so it should.
Evolve -- completely broken CPU optimization, but worst of all, anyone who bought this game supported $60 of DLC. I wouldn't buy this game for $1 because I do not support such business practices. This post sums the game up:
"No offense to anyone, but you'd have to be one hell of an absolute idiot to spend that much money on this, or any other game."
One of the most hyped games of 2015, and today no one cares about it.
Batman: Arkham Knight -- worst PC port of 2015 hands down, broken in all areas, and it flat-out bombs on 2GB GPUs. Maybe the worst console-to-PC port of all time.
AMD Gaming Evolved 2015:
Battlefield Hardline -- amazing optimization
Star Wars Battlefront -- best-looking overall and best-optimized FPS game of 2015
DiRT Rally -- best-optimized racing game on the PC in years
Thanks for proving that GameWorks has been a total failure in 2015. Whether GameWorks itself ruined some of these games or the developers are just inept at coding is debatable on a per-game/per-feature basis, but nearly every GameWorks title of 2015 was broken and unoptimized.
You forgot Anno 2205 -- another unoptimized turd where a "magic" patch improved performance 15-20% post-launch; i.e., the developer rushed the game to market in an unoptimized state.
People in the gaming industry (aka Devs) like to share solutions to problems, new rendering techniques and other information.
Your view of software seems a little bit outdated. Even Microsoft is open-sourcing more and more stuff ... D:
Haven't seen a solid response yet as to what developers who don't use open-source next-gen effects are going to do for their next games. Call NV again, cuz they don't want to spend the money, or don't know how to make next-gen graphics with open-source standards?
Newsflash: some posters in this very thread who defend closed-source black-box DLLs/middleware have a track record of crapping on anything AMD, missing out on thousands or tens of thousands of dollars via bitcoin mining (because the thought of owning AMD hardware that makes money is worse than paying $500-550 for mid-range NV cards every 2 years), constantly buying overpriced flagship NV cards and using the halo status of the top card to ignore how NV gets destroyed in all other pricing segments gen after gen, ignoring when older NV cards fall apart (read: Kepler), ignoring price/performance, etc. Basically, if AMD beat their favourite brand in every metric and cost $100, they'd still pay $500 to own NV. Remember, these are the same people who owned NV when it had terrible 2D and 3D IQ (all the eras leading up to Fermi), when NV had Full RGB broken over HDMI, who owned the god-awful GeForce 5, and who owned NV even when ATI smoked them with the 9800/X800/X1900 series. The same hypocrites spout perf/watt today after buying NV throughout the GTX 200/400/500 series. :sneaky:
...Most of them also haven't owned an ATI/AMD card in a decade, or maybe ever, so their opinion on AMD vs. NV GPUs is basically worthless: they were never objective to start with and have no first-hand experience with both brands to compare. I'd say that anyone who has cited perf/watt as a key metric since 2012 yet consecutively owned the GeForce 5, 6, 7, GTX 200, and Fermi should automatically be disqualified from all objective GPU discussions/GPU recommendation threads. I've probably only met 2 PC gamers who fit that profile and remain objective, because they also bought AMD cards for secondary rigs (objective as in, for their next GPU purchase, they actually consider AMD vs. NV as if it's an all-new purchase, not blindly decide that their next card is NV 100%, no matter what). I would bet that almost anyone else who falls into this category would not buy an AMD card even if it were 100% faster and cost half the price. My guess is some people in this thread defending NV's GWs fall exactly into this category of PC users.