RussianSensation
Elite Member
Personally I would be far more excited about this new type of console if it meant they would release new hardware every 3-5 years instead of every 10. If you think about it, their R&D costs are drastically cut by using pre-made hardware, and their unit costs will be far lower as well.
+100. :thumbsup::thumbsup:
In the past, console generations generally lasted 5 years, 6 at most, before a new one was launched. As a result:
1) PC gaming wasn't held back as much;
2) Consoles were much cheaper since they didn't need top-of-the-line hardware to last 7-10 years, and the lower price made them more popular (PS1, PS2);
3) Developers made much better games since they had a shot at making at most one sequel within a generation. They focused on making a good game from the get-go because in a 5-year span there wasn't much room for a mediocre release. Now we have games with 5-10 hour single-player campaigns and never-ending sequels using graphics from 2008 (Assassin's Creed and the millionth edition of Call of Duty) made to cater to 7-year-old hardware...
4) Graphical leaps occurred much more rapidly. If someone had told me in 2007 that BF3 would be the best-looking game of 2012, I would have laughed. Now it's a sad reality that we need 4xMSAA in deferred game engines to bring a GPU to its knees, since every G-buffer render target has to be stored and resolved per sample (see the rough numbers below), while the textures and detail in games are still stuck at 2008 levels, with only the animations and lighting model being bright spots.
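For a rough sense of why 4xMSAA is so punishing in a deferred renderer, here's a minimal back-of-envelope sketch. The G-buffer layout (four 32-bit render targets) is an assumption for illustration; real engines vary:

```python
# Back-of-envelope G-buffer size for a deferred renderer at 1080p.
# The layout below is an assumed example (albedo, normals, material,
# depth as four 32-bit targets); actual engines differ.

WIDTH, HEIGHT = 1920, 1080   # 1080p
RENDER_TARGETS = 4           # assumed G-buffer layout
BYTES_PER_TARGET = 4         # 32 bits per pixel per target

def gbuffer_mib(msaa_samples: int) -> float:
    """Raw G-buffer footprint in MiB at a given MSAA sample count."""
    pixels = WIDTH * HEIGHT
    return pixels * RENDER_TARGETS * BYTES_PER_TARGET * msaa_samples / 2**20

print(f"1xMSAA: {gbuffer_mib(1):6.1f} MiB")  # ~31.6 MiB
print(f"4xMSAA: {gbuffer_mib(4):6.1f} MiB")  # ~126.6 MiB
```

Every byte of that multisampled G-buffer gets written in the geometry pass and read back in the lighting pass, so the bandwidth hit scales with the sample count. That's how 4xMSAA can floor a high-end GPU without the game looking any better than 2008.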
This upcoming generation may just be the worst one of them all. They are looking to run the PS4/Xbox Durango for another 7-10 years, but will probably ship them with low-end to mid-range HD7000 hardware (I am guessing HD7770). So it's going to be far worse imo than what we just experienced with the PS3/360. Considering the HD7850 is roughly an HD6950 + 5%, by 2013 this level of performance will be downright laughable, and it has to survive 7-10 years of innovation to come. This wouldn't be a problem if they replaced the consoles in 2018.
Also, the fact that they are focusing so much more on multimedia aspects such as Twitter, Facebook, online apps, digital content streaming, Kinect integration, etc., means less and less budget can be allocated towards hardware, because all these other expenditures eat into their cost structure. Putting Kinect into the console is like taking $50-60 out of a budget that could have gone towards an SSD, a faster CPU or GPU, etc. (or just making the console cheaper).
I think the next generation of consoles is going to be less about the hardcore gamer than ever. Even now, none of the rumors are hinting at mind-blowing performance, mind-blowing graphics, next-generation physics effects/AI, etc. (you know, the usual marketing they start spewing). Almost all the rumors are negative: no used games (or requiring online passes to unlock the full content of used games), a constant Internet connection, tying your purchases to an Xbox Live/PSN account to combat piracy, motion controls, etc. If true, they are on their way to making consoles closer and closer to the PC, minus the cheap Steam games or the simplicity consoles used to offer (plug and play at a cottage and off you go).
Basically they are focusing less and less on the core aspect of consoles - gaming with next-generation AI, physics, and graphics - and shifting towards making them all-in-one multimedia devices. All that's going to mean is more rehashed sequels with console-ported graphics. Developers are also focusing more and more on the multiplayer aspects of gaming, with single-player games falling by the wayside. You can imagine the future of gaming being a free-to-play model where they try to sell us constant DLC and $1-5 weapons/upgrades to keep the game running (like Blacklight: Retribution).
I don't remember any generation where a GPU could easily crush every game at 1080p. There was always some game that couldn't be maxed out with top-of-the-line hardware of its time: Quake 3, Doom 3, Far Cry 1, Crysis 1, Metro 2033, etc.
The generation after the HD7970 / GTX680 is just going to push performance to absurd levels for anyone not gaming on a 2560x1600 monitor.
When I look at Crysis 1, which launched in Nov 2007, PC graphics have hardly moved beyond it. If Crysis 1 supported DX11 with tessellation and advanced DOF, and had a high-res texture pack, it would probably look better than any 2012 game on the PC today. That's a sad state of innovation, with console hardware serving as the lowest common denominator :|
My other gripe is: how in the world did they choose the HD7000 series over Kepler without waiting to see how Kepler performs, when they intend to use these consoles for 7-10 years? It's almost like they just went for the cheapest option, which AMD likely provided as they are far more desperate than NV. With Kepler's 2x tessellation performance, better performance per die area and per watt, better FP16 texture performance, and class-leading 1080p performance (the resolution next-generation console games will target), it would have been the perfect GPU for the next generation of consoles.
Overall, it seems like the next generation of consoles is all about maximizing profits, cutting costs, and reaching a wider audience, rather than about the core gaming experience.
