PS4 + 4K = HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA.
I'll give you a hint: the GPU grunt behind CrossFire 7970s (2x 6GB models) is STILL not enough to reasonably play at max settings at 2560x1440. And we're talking about ~$1k worth of GPUs with HUGE coolers and complex PCBs; good luck stuffing those into a console case, along with feeding the thing 500W+ to keep it going.
PS4 specs are already nailed down, and there's no way in helllllll that 4K will be doable in any real form with the specs it will have.
Don't get me wrong: #1, I'm excited about 4K and hope it becomes reasonable to game at that res before too much more time passes, and #2, I think the PS4/X720 or whatever they get called will be pretty sweet. But this next gen is NOT going to be 4K friendly. In fact, it will probably fall further short than the current gen did (the X360/PS3 could technically do 1080p, and while most games rendered below that, there are enough examples of decent 1080p games for them), whereas the PS4/X720 won't be able to do anything convincing at 4K without severely overcompensating by drastically reducing the native rendering resolution.
Remember, 4K (3840x2160) = ~8.3 million pixels.
"Lowly" 2560x1440, which is already enough to keep ~$1k of GPU power from maxing out the latest games at a high framerate, is only ~3.7 million pixels.
In other words, you'd need something in the range of FOUR times the power of a 7970, and probably 6-8GB of video memory, to make 4K gaming decent. Rough numbers below if you want to check the math.
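Here's the back-of-the-envelope version, just as a sketch; the "~4x a 7970" figure is my loose assumption from the pixel ratio, not a benchmark:

    # Back-of-the-envelope pixel counts and a naive GPU scaling guess.
    # The scaling factor is an assumption, not a measured result.
    res_1440p = 2560 * 1440   # ~3.7 million pixels
    res_4k    = 3840 * 2160   # ~8.3 million pixels

    pixel_ratio = res_4k / res_1440p
    print(f"2560x1440: {res_1440p:,} pixels")
    print(f"3840x2160: {res_4k:,} pixels")
    print(f"4K pushes {pixel_ratio:.2f}x the pixels of 1440p")

    # If two 7970s already struggle at 1440p, then naively you'd want
    # roughly 2 cards * 2.25 pixel ratio = ~4.5x a single 7970 for 4K.
    print(f"Naive GPU estimate: {2 * pixel_ratio:.1f}x a single 7970")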
Trust me, I've followed gaming tech since the Atari 400; 4K won't really be here for gaming until the PS5/Xbox4.
They also COULD offer 4K support for microgames or whatever you want to call the lighter games that, at least at one point, comprised the bulk of Live/PSN downloadable titles for under $15.
I don't think it would be a stretch to see the system support dashboard/video output at 4K (is there, or will there be, 4K Blu-ray releases?), and in doing so, it makes it possible to output games at that resolution if the developer chooses to offer the option. For the next generation, I imagine we won't see any developers take that option outside of small, indie-like cheap games with simpler graphics (those will always be popular), but who knows whether the manufacturers will even allow games to output at that resolution, regardless of whether they support it for other video output.
As for the argument that consoles are way ahead because of some supposed better support for 4K: you are kidding, right? How is it that consoles have better 4K support?
How is it PC gamers' fault that PC displays have "lower resolutions"?
Do you understand manufacturing constraints?
Monitors at 24" or lower are the most common size for PC gaming. Some go grand by going with 27" or 30" - at the lower size, 1920x1080/1200 is not low resolution for a 20" monitor. It is for a 30", and it is not a common 30" computer monitor resolution, they have 2560x1440/1600, and I believe you can pay a fair premium to get that in 24" sizes, not sure about any lower.
That's a high pixel density for that size of panel, and it will be extremely expensive to get 4K density at that size. It's not that PC gamers are "settling" for low resolutions; it's an availability and cost factor. I can guarantee that, given the funding, many people will purchase 4K displays when they become available in 27" and 30" formats, and it would be a little while after that before they reach 24" or smaller.
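To put rough numbers on the density point (panel sizes here are just illustrative picks, not claims about specific products):

    # Quick pixel-density (PPI) sketch: PPI = diagonal pixels / diagonal inches.
    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    panels = [
        ('2560x1440 @ 27"', 2560, 1440, 27),
        ('3840x2160 @ 27"', 3840, 2160, 27),
        ('3840x2160 @ 50"', 3840, 2160, 50),
    ]
    for name, w, h, d in panels:
        print(f"{name}: ~{ppi(w, h, d):.0f} PPI")

    # Roughly: 1440p @ 27" ~109 PPI, 4K @ 27" ~163 PPI, 4K @ 50" ~88 PPI.
    # A big 4K TV panel is actually LOWER density than today's 1440p monitors,
    # which is part of why large 4K TVs will show up before small 4K monitors.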
Of course TVs, at what, 46, 50, 60", are going to have 4K resolutions before they hit common computer display sizes.
Most people don't care to have that size of display sitting on their desk, a few feet from their eyes. It sounds cool at first, but a TV as a computer display often winds up well short of the mark. It might just be too oppressive for some, and it will also likely have slower response times that PC gamers, good ones at least, will certainly NOT settle for. That isn't as much of an issue with console gaming, given the inherent input lag, lower framerates, and generally slower controls. It's also not a huge factor when everyone is on the same footing on consoles; on PC, when you have a slow display and the best players have ridiculously fast displays and controls without input lag, there will be a competitive difference.
But quite simply, it will be cheaper, and offer a better shot at higher profits, to focus on lower-density 50" 4K panels before bringing that density to a 30" or smaller panel, in a market where it would be much tougher to demand a grossly higher retail price. I say that because with new TV generations, a vastly superior new technology can command a $3000+ higher price tag than an otherwise comparable set of equal display quality (ignoring the one major difference). Say the mature tech settles at $1000 as the norm (fancy features aside); a new 4K display of the same quality (no other fancy features) could easily command a $4000 MSRP until the market gets saturated with them. You just cannot create that kind of price differential in computer monitors and expect to gain a worthwhile market share.
In short, sure, consoles may get nominal 4K support because some people will, over the course of the next generation, pick up 4K displays. But there just won't be the rendering ability to actually play the AAA titles at that resolution, likely not even at half that resolution upscaled, though perhaps by the end of the generation once the coding tricks have been figured out. By the time 4K monitors are available for PCs, PC GPUs will have outpaced the console tech by enough that PCs will be able, for gamers willing to drop a fair bit of money, to render at 4K with good visual detail at respectable framerates.
The PS5/Xbox4 generation may very well launch with better 4K rendering ability than the PC GPUs available at the time of its launch. But hell no, not this next generation.