Will the i5-2500K have sufficient CPU power to last through the next generation of console games (Modern Warfare 7, Grand Theft Auto 6, etc.)?
What do you guys think?
In terms of hardware:-
Athlon 5350 (4C Jaguar @ 2GHz):-
7Zip - 5,785 (4T)
Cinebench 11.5 - 0.5 1T / 1.97 (4T)
Handbrake - 69.6fps (4T)
WinRAR - 635 1T / 2076 (4T)
x264 HD 5.0.1 - 4.2fps (4T)
PS4 & XB1-equivalent consoles (8C Jaguar @ 1.75GHz):-
7Zip - 10,123 (8T)
Cinebench 11.5 - 0.44 1T / 3.5 (8T)
Handbrake - 122fps (8T)
WinRAR - 555 1T / 3633 (8T)
x264 HD 5.0.1 - 7.4fps (8T)
Intel Avoton C2750 (8C Atom @ 2.6GHz):-
7Zip - 13,509 (8T)
Cinebench 11.5 - 0.47 1T / 3.77 (8T)
Handbrake - 130.3fps (8T)
WinRAR - 701 1T / 3838 (8T)
x264 HD 5.0.1 - 7.4fps (8T)
i3-4130 (2C/4T @ 3.4GHz):-
7Zip - 10,166 (4T)
Cinebench 11.5 - 1.48 1T / 3.47 (4T)
Handbrake - 154.1fps (4T)
WinRAR - 1178 1T / 3902 (4T)
x264 HD 5.0.1 - 7.6fps (4T)
As you can see, the nearest Intel CPUs to the consoles' Jaguar are the 8-core Atom (per core) and the i3 (overall performance). Now bear in mind only 6 of the 8 console cores can be used for the actual game (so knock 25% off the consoles' 8T scores above), and whilst the i5-2500K can be heavily OC'd, it's pretty much a no-brainer.

Previous-gen consoles also benefited greatly from the rapid process-node march (130nm (2002) -> 90nm (2004) -> 65nm (2006) -> 45nm (2008) -> 32nm (2010) -> 22nm (2013, Intel only), etc). That 2-year cycle has slowed across the board too (especially for AMD & video cards), which hurts platforms that have to be designed down to a TDP (ie, AMD CPU-based consoles, which is why they use the small Jaguar cores in the first place; there simply isn't the available wattage for a big core + a 7850-class GFX card, short of sticking it in a Micro-ATX HTPC case with a 400W ATX PSU and calling it a "console").
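For anyone who wants to sanity-check the "knock 25% off" adjustment, here's a quick back-of-envelope sketch using the console 8T numbers quoted above. It assumes roughly linear scaling with core count, which is a simplifying assumption (real games won't scale perfectly), not something measured in the post:

```python
# Back-of-envelope: if only 6 of the 8 Jaguar cores are free for the
# game, usable multithreaded throughput is ~75% of the quoted 8T scores
# (assumes linear scaling across cores, which is optimistic).
console_8t = {
    "7Zip": 10123,
    "Cinebench 11.5": 3.5,
    "Handbrake (fps)": 122,
    "WinRAR": 3633,
    "x264 HD 5.0.1 (fps)": 7.4,
}

usable_fraction = 6 / 8  # 6 of 8 cores available to the game

for bench, score in console_8t.items():
    print(f"{bench}: ~{score * usable_fraction:.2f} effective")
```

At that effective level (e.g. ~7,592 in 7Zip, ~2.62 in Cinebench 11.5), even the dual-core i3's 4T numbers listed above come out ahead, which is the point of the comparison.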
The real and growing issue for many PC gamers will continue to be not a lack of CPU horsepower, but lazy, consolized bad ports. Devs can screw up *any* PC game on *any* rig: non-performance-related stutter / micro-stutter / messed-up mouse controls, ugly "designed for a TV at 10ft" HUDs left in on "designed for 2ft" PCs, "platform parity" politics (ie, dumbing down GFX from early footage so as not to "offend" lesser platforms), bad netcode / hit registration in MP games, game-breaking bugs, etc. Likewise, obvious resolution / fps differences aside (1080/60 vs 720/30), they can "cheat" by 'tweaking' consoles to render fewer dynamic shadows / shaders, etc, for any given preset (ie, a typical "Medium" on consoles may not even be Medium on PC, let alone High or Ultra), giving consoles the illusion of performing better.
As for the xxx 6 & yyy 7 franchises, personally I think this is half the problem with modern gaming. I'd rather see more decent standalone / "no more than 1-2 sequels" games come out and be judged on their own merit (eg, Dishonored, Portal, The Last of Us, etc) without being "franchised to death" under a mound of 6-30x stale "cash-in" sequels or annual / biennial "refreshes". (But then asking for fresh-feeling creativity from some devs is like asking for a pony for Christmas...)