Will the i5 2500K have sufficient CPU power to last through the next generation of console games (Modern Warfare 7, Grand Theft Auto 6, etc.)?
What do you guys think?
I think people underestimate what next gen consoles could potentially bring to the table.
You should remember the 360 was being developed at a time when PCs were struggling to run Doom 3. No PC back in 2005 could run something like BF3 even on ultra-low settings, but the Xbox 360 and PS3 are still chugging away.
If Microsoft and Sony want a 10-year life cycle from their next consoles, they have to have pretty beefy specs and be incredibly well optimised. I wouldn't be surprised if Google, Apple or Samsung entered the console market one day too, and I wouldn't be surprised if next-gen consoles leave up-to-date PCs in their dust.
The companies seem to have a harder and harder time making a profit on consoles, and since customers get them because they're comparatively cheap and accessible, I do not at all see the Xbox 720 or PS4 being powerhouses. Microsoft seems to want to take their console in a more HTPC direction, and the Wii U appears lackluster in terms of graphics. With Sony's financial woes, I doubt they will want to spend any more on the console than they must.
I guarantee you it won't be anywhere close to maxing out the first generation of next-gen games with minimum FPS at 60+ at 1080p max settings without AA, irrespective of your GPU.
Of course some next-gen games may run that well at the settings mentioned above, but at least a few, maybe even 25-50% of them, won't come anywhere close.
To get acceptable performance you will need a 3770K, and to unleash the true power, either Haswell or its successor will be needed.
Those who think otherwise ought to take a lesson or two in history!
Have a look at BF3 multiplayer benchmarks. The 2500K and 2600K are a league apart: where the 2500K is unplayable, the 2600K is acceptable, and the 3770K is even better. And benchmarks only tell half the story. The real smoothness isn't shown in the numbers.
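To make that smoothness point concrete: two runs can post nearly identical average FPS while one feels far stutterier because of frame-time spikes. A minimal sketch of the idea (the frame times below are invented for illustration, not real benchmark data):

```python
# Toy illustration: similar average FPS, very different smoothness.
# Frame times are in milliseconds and are made up for the example.

def summarize(frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    ordered = sorted(frame_times_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]  # 99th-percentile frame time
    return avg_fps, p99

smooth  = [16.7] * 99 + [20.0]        # consistent ~60 FPS pacing
stutter = [12.0] * 90 + [60.0] * 10   # fast frames plus big spikes

for name, run in [("smooth", smooth), ("stutter", stutter)]:
    avg, p99 = summarize(run)
    print(f"{name}: avg {avg:.0f} FPS, 99th-percentile frame time {p99:.0f} ms")
```

Both runs average roughly 60 FPS, but the second spends 10% of its frames at 60 ms each; that is exactly the lag an average-FPS chart hides.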
The 3770K may not do a great job by then either, but it may be tolerable, while the 2500K is unlikely to be tolerable, or at least has a much lower chance.
This is because of Hyper-Threading, which newer engines and games will have better support for.
Anyway, Haswell will be needed to max things out. Even the 3960X may not do that well by then.
And it doesn't matter if Sandy Bridge gets higher FPS now; it won't be as smooth and will lag compared to an Ivy Bridge chip with slightly lower FPS. Not at the moment, but in the future. You would know if you had tried every CPU and piece of hardware with every game over the last 10 years.
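For what it's worth, the Hyper-Threading argument above boils down to newer engines sizing their worker pools off the logical core count rather than running a couple of heavy threads. A toy sketch of that idea, assuming a job-system-style engine (the job names are made up, and Python's GIL makes this illustrative only; a real engine would use native threads):

```python
import os
from concurrent.futures import ThreadPoolExecutor

# An engine that sizes its pool from logical cores gives a
# 4-core/8-thread 2600K/3770K eight workers, while a
# 4-core/4-thread 2500K gets only four.
logical_cores = os.cpu_count() or 4

# Invented per-frame job names, purely for illustration.
jobs = ["animation", "physics", "AI", "particles",
        "audio", "streaming", "culling", "decompression"]

def run_job(name):
    # Stand-in for a slice of per-frame engine work.
    return f"{name} done"

with ThreadPoolExecutor(max_workers=logical_cores) as pool:
    results = list(pool.map(run_job, jobs))

print(f"{logical_cores} workers ran {len(results)} jobs this frame")
```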
You have to figure that any next-gen console that is yet to be released has been in development for a few years. They had already decided on things like video cards, CPUs, RAM, drives, etc. at least a year and a half ago, if not further back. Development models have to be produced, as do prototypes of production models, a couple of years in advance, and those get forwarded to the industry to give an idea of what they will be developing for, what accessories they will be designing, and what the limitations will be. Launch games will have been in development for at least a couple of years by the time the consoles actually reach the retail market.
So that standard is likely hardware a couple of years old, and mid-range by the standards of that time period. And when I say mid-range hardware, don't expect a mid-range CPU or RAM in a console... those are always going to be second-rate parts that may not even qualify as entry-level on a PC. The GPU inside the console is the primary focus. So for your Xbox 720 or PS4, take whatever was mainstream back in 2010 and that's likely the kind of GPU performance you'll have in those consoles.
It's almost assured a high-end PC from 2009 would still easily outperform the PS4 or Xbox 720. The only difference is API standards: they can add last-minute additions from whatever DirectX or OpenGL have updated since (though even those can still be restricted by the base hardware they choose).
Any enthusiast/gaming/performance PC from the past 2 years, even mid-range, would easily surpass the capabilities of a PS4 or Xbox 720. Why? Because console hardware is already years outdated due to the development and release timeframe. It's already old compared to PC hardware by the time it's released.
All a console can and ever will be is *literally* a yesteryear entry-to-mid-range PC with a controller, one that cuts corners and uses graphics-reduction tweaks (lower poly counts, blur and fog effects, no AA, terrible texture quality, horrid texture sizes, upscaling from a lower native resolution to make you think you're running at something like 1080p, color depths and palettes that might as well be binary, etc.) to juice every possible polygon and frame out of it to keep up with the times.
The developers also get to work with one single hardware platform, instead of a bazillion possible hardware configurations like you have in the PC industry. This lets them heavily optimize the game for the limits of that hardware alone, and it also greatly reduces production time and testing. That sounds like a GREAT thing, but not for the advancement of technology or graphics itself... because you can only develop for a single piece of old hardware, so you're limited to that on the technical front.
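The upscaling trick mentioned above is simple arithmetic; a quick sketch, using 720p only as a common example of a sub-native render resolution:

```python
# Rendering below native resolution and upscaling cuts the
# per-pixel (fill-rate/shading) workload roughly in proportion
# to the pixel count.

def pixels(w, h):
    return w * h

native = pixels(1920, 1080)   # 2,073,600 pixels at 1080p
sub    = pixels(1280, 720)    #   921,600 pixels at 720p

print(f"1080p renders {native / sub:.2f}x the pixels of 720p")
# -> 2.25x: rendering at 720p and scaling up does well under half
#    the per-pixel work while the TV still reports a 1080p signal.
```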
Erm, that's not entirely true. When the PS3 and Xbox 360 were first released, you needed a very high-end rig to compete with them.