The problem is defining what's genuinely required to play & enjoy the game vs what's theoretically "required" simply to stuff everything on Ultra for 1440p @ 120fps e-peen contests. That mostly loads the CPU with optional extras (more shadows, etc.) which are often barely perceptible in static side-by-side screenshot comparisons, let alone during gameplay. In fact, some games actually manage to look worse at higher settings. Skyrim's High vs Ultra vs Max comes to mind, with the grass / ground looking sharper on High @ 81fps than on Max @ 44fps (and far worse on Ultra than High):-
http://www.techspot.com/articles-info/467/images/High_02.jpg
http://www.techspot.com/articles-info/467/images/Ultra_02.jpg
http://www.techspot.com/articles-info/467/images/Max_02.jpg
Thief recently "recommended" an i7, yet runs just fine on i3's with barely a 10% difference vs an i7. In this case, "it needs an i7" = an i7-920 @ 2.66GHz.
CoD Ghosts "required" 6GB RAM until it didn't. Some games will inevitably be "heavier" than others (Watch Dogs), but if the core game engine is too heavy (the baseline engine, not just "extra shadows for Ultra", etc.), then how is it even going to work on the PS4 / XB1, whose 8 tablet-class cores (of which only 6 are usable for games) barely match an i3-4340 even in 100% perfectly threaded games? ie, no matter how much optional extra eye-candy you load the CPU down with on Ultra for PCs, the core game still has to run at typical console-equivalent "Med" (sometimes even Low) settings in order to, well, run on consoles!

I can see 8GB RAM + 64-bit (and as always, GPU) making far more of a difference than CPU alone. If you have the extra horsepower, you can turn it up. If not, you don't HAVE to run everything on High / Ultra to enjoy the game. In half the games I own I often can't tell High from Ultra during actual gameplay, and my i5-3570 @ 4.2GHz isn't even an unlocked K chip and isn't close to being maxed out in any game. It's still the GFX card that counts, even for "next gen" games.
^ This. PC gaming needs more of a developer attitude upgrade than anything else.
Edit: And that's not just big-name developers. As much as I love indies, they have their fair share of problems too. "Sir, You Are Being Hunted" somehow manages to get lower fps than Crysis with visuals worse than Morrowind / late-1990s / early-2000s FPSes. And it "recommends" 8GB RAM too - for a game which, if written in an older engine better suited to its GFX like Lithtech 2.0 or Unreal 1.0-2.0, would consume maybe 300-750MB RAM. No amount of "throwing hardware at it" will solve discrepancies like that.