I don't follow hardware that much, but I agree 1080p will be the baseline. I really can't see much happening with 4K, though. Essentially everybody has a flat screen now and none are above 1080p. At the earliest it will be the tail end of the PS4's life before 4K is taken seriously by the masses, I would guess, and quite possibly much later than that.
Yep. PS4 should be excellent, but 4K isn't going to be much use outside of 4K Blu-ray (nice!) until the power is really there in a low-TDP form (PS5).
A 400% increase in GPU grunt over PS3 should easily be doable, but taking that number at face value assumes that nothing else gets more intensive in new titles: that we'd see PS3-level AI, level complexity, model/poly counts, texture sizes/resolutions, etc.
Another way to look at it is this:
In 2005, a 6600GT could run most titles of the time at ~60fps at 1280x1024 with moderate details. That same 6600GT can't run anything today (2012 titles) at ~60fps, and usually not even 30fps outside of stuff on very old engines. To get the same results in modern AAA titles at a similarly midrange resolution (1680x1050), you need a dramatically faster card; a 7850 is really the starting point for 60fps play.
Consoles ARE more efficient at using their hardware power, with less overhead, and another inarguable bonus is that Sony and Microsoft will pour tons of money into their AAA titles and dev kits (not as dramatically as during the 05/06 war, due to economic forces, but still a difference), which is notably more than the resources put into PC development.
But we come back to fundamental limitations and the target market:
(1)- Target market: 99.99% of customers have 720p and 1080p screens. Many, many of these were purchased recently, and people don't feel like replacing them. The vast majority of those customers won't buy a 4K set until prices come down to about $3,000, and most won't buy until they hit the $1k mark.
When faced with that choice, game devs can either (A)- build a game world and models/textures/etc. that will run well in 4K, hence far less detailed in order to sustain a good framerate, or (B)- build it with nice AA, effects, high-detail models, textures, etc. for a super smooth 1080p experience. It's possible that some titles will offer both modes. And it's also possible they'll enable some kind of hybrid mode where the game actually renders at 1080p but upscales to look *better* on a 4K set, without being truly native 4K.
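That hybrid idea (render internally at 1080p, then upscale for output) is easy to sketch. This is purely illustrative nearest-neighbour scaling in plain Python, an assumption for demonstration, not how any console's scaler actually works (real hardware uses filtered resampling):

```python
def upscale_nearest(frame, factor):
    """Nearest-neighbour upscale of a 2D framebuffer (a list of rows).

    Illustrative only: real scalers use filtered resampling (bilinear,
    lanczos, etc.) in dedicated hardware, not naive pixel duplication.
    """
    out = []
    for row in frame:
        # Duplicate each pixel `factor` times horizontally...
        wide = [px for px in row for _ in range(factor)]
        # ...then duplicate the widened row `factor` times vertically.
        out.extend([wide] * factor)
    return out

# A tiny 2x2 stand-in for a 1080p frame, doubled in each dimension,
# the same 2x-per-axis ratio that takes 1920x1080 to 3840x2160.
frame = [[1, 2],
         [3, 4]]
print(upscale_nearest(frame, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The nice property for the 1080p-to-4K case is that the ratio is exactly 2x per axis, so even a cheap scaler maps pixels cleanly; the game pays the rendering cost of 1080p while the set receives a 4K signal.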
(2)- Fundamental limitations. Pushing 8 million pixels per frame through a game world with extremely high detail is not feasible with low-TDP hardware. Sony and Microsoft are starting out with dev kits and locked base hardware specs that have already been determined. They have to hit a target MSRP with components that fit their cost limitations on current process tech (28nm, most likely), within a power/heat envelope that works in a reasonably slim casing. What is possible? Very good hardware design, good dev kits, lessons learned (for Sony: an easier development process; for Microsoft: DVD was too small!), and a super smooth, seamless 1080p experience with huge leaps forward in game complexity thanks to vastly increased memory.
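The "8 million pixels" figure is just arithmetic on the standard resolutions, worth spelling out because it shows why 4K is a 4x jump over 1080p before a single extra effect is added:

```python
# Per-frame pixel counts at the common display resolutions.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"])                    # 8294400 -- the "8 million pixels"
print(pixels["4K"] / pixels["1080p"])  # 4.0 -- 4K is 4x the pixels of 1080p
print(pixels["4K"] * 60)               # 497664000 pixels/second at a 60fps target
```

So a console that can just barely sustain a detailed world at 1080p/60 would need roughly four times the raw fill and shading throughput to do the same at native 4K, which is exactly the cost/TDP wall described above.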
PS3/X360 look pretty good on a TV from a distance, but on an equal footing (HDMI to a nice monitor) they look pretty rough. You see a laggy framerate in most things masked with blur effects, low-resolution textures, poor shadows, lots of aliasing, some load-time issues; it goes on and on. It's actually quite impressive that the devs got as much out of them as they did. PS4/X720 should deliver a gigantic leap forward for good 1080p experiences and deeper gameplay possibilities.
But 4K AAA high-detail gaming is a pipe dream for now. It's a bridge too far, and building for it would drive a console's price up drastically, with gigantic PCBs, coolers, power supplies, heat, etc., all for a market that MIGHT reach 1% in 3 years. They simply won't do that; they're not that stupid.