Saw a lot of bashing of DX11, GameWorks and everything else people could pull in based on their own opinions about AMD/Nvidia and FO4. Well, it was as simple as bad drivers from AMD.
http://www.overclock3d.net/reviews/...amd_increases_dramatically_with_new_drivers/1
Considering how vast the game is, relying on one review to quickly generalize that AMD's drivers magically fixed the performance ignores that this game is still broken, unoptimized and inconsistent in its performance delivery, even after multiple patches and drivers. I can just as easily find other professional reviewers whose experience is the complete opposite of Overclock3D's to prove this point.
KitGuru performed new testing with the 15.11.1 drivers and compared AMD and NV cards. The "dramatic" performance improvements for AMD cards in their sections of the game are nowhere to be found. You know what they found instead though? I'll quote the most important parts:
"On the AMD side, the latest Catalyst 15.11.1 beta driver was used. For Nvidia, we used the Game Ready driver 358.91.
The graphs above show average and minimum frame rates for each of our three GPUs. No matter what card we were using, we found that frame rates can vary dramatically, seemingly at random. We would often get huge spikes in frame rate too, resulting in a pretty disappointing experience overall; gameplay doesn't stay smooth for very long.
On the GTX 970 and R9 290, frame rates would often dip into the low 30s but would also occasionally rise as high as 80 to 90 frames per second at points in the Wasteland while running at 1440p. At 1080p, both cards are able to keep things above 60 frames per second more often than not, but there are still plenty of huge spikes and dips.
Given that the engine isn't designed to run above 60 frames per second without issues, I also ran the game at 60Hz and still found that the frame rate would cut itself in half at points.
We aren't alone with these frame rate issues on the PC. There have been plenty of reports showing the console version of Fallout 4 also struggling, so it seems Bethesda may have dropped the ball a bit there. There is a beta patch available on Steam right now but it doesn't seem to make much of a difference, and you will still find that performance dips as much as 50% in some cases.
There are reports that lowering the Shadow Distance setting to medium can help, but I did not find this to be the case here. Still, it is worth trying as you may have different results."
http://www.kitguru.net/gaming/matthew-wilson/fallout-4-pc-game-performance-analysis/
If you have followed many reviews, videos and user feedback on this game, you would have recognized the bigger picture here. In some cases, this game runs like butter on one rig, and yet someone else with the same system has major issues. Other gamers experience certain bugs and glitches that are seemingly absent on another rig with similar or different hardware.
The biggest take-away from this title is not that AMD/NV need day-1 drivers for optimal performance, but that well-designed, well-optimized, long-term QC-tested games built on truly next-gen, efficient game engines perform well on a huge variety of gaming GPUs, spanning different AMD and NV architectures.
SW:BF is a perfect example of a game that runs well across various CPU+GPU combinations, and the key thing is that its performance is consistent.
Sure, we can point fingers that AMD should have had day-1 drivers, such as 15.11.1, ready on day 1. However, as of right now, even with more optimized NV/AMD drivers, FO4 is still a poorly optimized game given its graphical fidelity (imo average-looking at best for 2015) and is hampered by inconsistent performance delivery. What other game have you seen where an i3-4360 paired with DDR3-2400 is almost 30% faster than an i7-4770K with DDR3-1600? Can you name any well-optimized PC game from 2007 until now that exhibits such behaviour? The fact is, FO4 is simply being treated by many PC gamers with a double standard -- i.e., many of its technical flaws are forgiven, with the focus instead on AMD vs. NV.
Instead of just looking at NV and AMD, maybe we should all look at the bigger picture: so many PC developers cannot program/code/optimize, which means that as end users we have no choice but to lower settings in a game that already looks nothing special, or upgrade our hardware. All of this is absurd when other AAA games run at nearly 60 fps on a 4-year-old HD 7970 while having graphics, textures and animations that look a full generation better. Not every game needs to look like Crysis 3 or Metro Last Light to be enjoyable, but if a game looks average and doesn't show any technical achievements, then it had better run like butter on low-end hardware. With SW:BF blowing FO4 out of the water on a technical level, there are no excuses for Bethesda. FO4 is a good game, but it would have been MUCH better had it been made on Frostbite 3, CryEngine 3.5, or any other well-optimized modern game engine.
Even with new drivers, it seems the game continues to suffer from frame rate instability. Granted, maybe Bethesda is relying on the modding community to improve the game, and that's a fair point. It's still disappointing to see such a hugely anticipated and popular title of 2015 have such a poorly balanced ratio of graphics to the hardware requirements needed to achieve stable, consistent frame rates.