
Will next-gen consoles attempt to push 4k or increase fidelity @ 1080p?

No, latency isn't the only issue. We still have general issues with speed (4K at 60 FPS, uncompressed?), and people will have to put up with data caps in the U.S. As long as there is a giant U.S. market for gaming and ISP data caps exist, you can't go with an OnLive-like service for all gaming. People couldn't afford the Internet overages.
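To put rough numbers on the "uncompressed 4K at 60 FPS" point, here's a back-of-the-envelope calculation (a sketch; the 24-bit color and the 300 GB monthly cap are just illustrative assumptions):

```python
# Bandwidth for uncompressed 4K60 video (assumes 24-bit RGB,
# no chroma subsampling, no compression -- illustrative only).
width, height = 3840, 2160
bytes_per_pixel = 3            # 24-bit color
fps = 60

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{bytes_per_second * 8 / 1e9:.1f} Gbit/s")    # ~11.9 Gbit/s

# Against a hypothetical 300 GB monthly data cap:
gb_per_hour = bytes_per_second * 3600 / 1e9
print(f"{gb_per_hour:.0f} GB per hour of play")      # ~5375 GB/hour
print(f"cap gone in {300 / gb_per_hour * 60:.1f} minutes")
```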

That part can probably be overcome. Latency seems a lot harder to solve any time soon.
 
Streaming will always be bad for gaming. You would eliminate all competitive shooter and fighting games overnight.

Nobody wants 10 frames of input lag (at 60 fps, that's roughly 167 ms).
 
Seems like 4K upscaling at least will be an easy checklist feature to add to the next consoles. Actually rendering at anything approaching 4K? I doubt it. Things like higher res textures and advanced effects will offer more bang for the buck. I would expect everything to be rendered at 1080p at least though.
 
I'd actually say that once OLED production gets into full swing, we'll see a lot of people upgrade, since it's a rare tech that is both "sexier" and brings real benefits, and it will likely offer size increases at lower cost (meaning people will be able to go even bigger for less money). Give it 3 years and it will be an easy sell to consumers. By Black Friday next year, I'm betting we'll see near-$1000 50" 4K OLED sets (I want to say they're at about $3000 now, with 1080p ones at $1500, but we'll see what prices do at Black Friday). Give it 3 years and I think we'll see 50" 4K OLED sets for $500.


OLED? Not likely. But you can already find 50" LCDs for that price.
 
Right now real-time ray tracing is out of reach due to hardware and software efficiency limitations. We had some cool demos of what ray tracing could do, put out mainly by Nvidia and Intel, around 2008-ish, but since then there hasn't been a lot of chatter about it, because it's just way too heavy on the CPU side and not possible with something like the XBO or PS4.

There have been improvements in other types of Global Illumination. Recently we heard that "The Tomorrow Children" is using voxel cone tracing, which seems to be pretty effective... but we're seeing some good progress with hybrid solutions as well, using both rasterization and ray tracing. PowerVR, for example, added ray-traced shadows to a rasterized scene in Unity. Stuff like this will be used more and more as this console gen goes on, but I think the next console gen will be the one where good GI is finally a thing in most mainstream games.
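For the curious, here's a minimal sketch of that hybrid idea (my own toy illustration, not PowerVR's actual implementation): the rasterizer gives you a world-space position for every pixel, and a single ray per pixel is traced toward the light to decide shadowing. The scene, a lone sphere occluder and a point light, is made up for the example:

```python
import math

# Made-up scene: one sphere occluder and a point light.
LIGHT = (0.0, 5.0, 0.0)
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 2.0, 0.0), 1.0

def ray_hits_sphere(origin, direction, center, radius):
    """Standard ray/sphere intersection test; direction must be normalized."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c        # quadratic discriminant (a == 1)
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-4               # only count hits in front of the surface

def shadow_factor(world_pos):
    """One shadow ray per rasterized pixel: dim if occluded, lit otherwise."""
    d = [LIGHT[i] - world_pos[i] for i in range(3)]
    length = math.sqrt(sum(x * x for x in d))
    d = [x / length for x in d]
    return 0.2 if ray_hits_sphere(world_pos, d, SPHERE_CENTER, SPHERE_RADIUS) else 1.0

# Pretend these ground-plane points came out of the rasterizer's G-buffer:
for x in range(-2, 3):
    print(f"x={x:+d}  light={shadow_factor((float(x), 0.0, 0.0)):.1f}")
```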

I think that as this sort of new rendering starts to be adopted more, Nvidia, AMD, PowerVR and other graphics companies will start optimizing their hardware for hybrid rendering, and the next consoles should be way better at it than what we have now.

We'll probably see big improvements in some other weak areas, like texture streaming. New memory tech has a LOT more bandwidth than what we have now...

Besides that, I think it'll be a lot of the same. I don't think developers can really pour a whole lot more time and resources into games than they already are... hardware's sorta plateauing on die shrinks and whatnot, so the amount of graphics muscle available won't be drastically better...

So, in short, I don't expect a lot in higher resolutions or hardware power, but I think Global Illumination and the speed of caching to and streaming from memory should improve.
 
Well, these consoles actually blow on the hardware side, so if you pair PC hardware with the low-overhead programming of consoles, there's probably a lot more you COULD get out of it. Of course, Mantle flopped with that, but DX12 looks capable of doing SOME of that stuff on the PC side.
 
No, latency isn't the only issue. We still have general issues with speed (4K at 60 FPS, uncompressed?), and people will have to put up with data caps in the U.S. As long as there is a giant U.S. market for gaming and ISP data caps exist, you can't go with an OnLive-like service for all gaming. People couldn't afford the Internet overages.

You do realize that using centralized servers to play with a friend 2000 miles away is going to cause insanely laggy controller response, right?

Heck, I feel sorry for people that don't know how to set their TVs to "game mode". I accidentally had mine turned on with all the default image processing and it was horrendous.
 
Oh ... the future?

1) Pixel counts will only improve slightly. I expect dual 1080p rendering support to handle 3D. Then it will be upscaled to 4k.

2) I don't even know if next gen will use a TV. I almost expect headsets to take over. That would take a paradigm shift, though, and it's probably not a risk manufacturers want to introduce. Headsets will probably be peripherals for those that can afford them.

3) Expandable memory. Sorry, but something has to be done about this. I would happily pay $100 for my PS4 to have memory to cache level data in. Destiny and Bloodborne come to mind when I think about horrible load times that could improve with better caching of some sort.

4) I hope to see huge improvements in the CPU, not the GPU. At this point, I'd rather see improved AI.
 
I have a 55" 4k HDTV, from where I sit the 4k is not noticeable. I think 4k gaming is something that will require a lot more hardware power but the gain in user experience may not be worth it for some from a living room TV perspective. However, 4k support for VR support on next gen consoles, that's where I can see it making a difference.
 
4k is useful for 3d and not much more.

8k or whatever comes next might bring glasses free 3d to market.

I am really impressed with the passive 3D on a 4k set because it doesn't flicker like active does and still gives 1080p to each eye. Active 3D has always had a dimmer picture as well; I used to prefer it to passive because of the resolution, but not any longer.
 
I am really impressed with the passive 3D on a 4k set because it doesn't flicker like active does and still gives 1080p to each eye. Active 3D has always had a dimmer picture as well; I used to prefer it to passive because of the resolution, but not any longer.

How is text in that setup? One of the issues with passive setups is that every other line goes to a different eye, which made text very blurry. Does that still hold true with a resolution as high as yours? I'd be very interested in it if it didn't make text look blurry.
 
How is text in that setup? One of the issues with passive setups is that every other line goes to a different eye, which made text very blurry. Does that still hold true with a resolution as high as yours? I'd be very interested in it if it didn't make text look blurry.


I don't have 4K but a friend does. Text is as clear in 3D as it is in 2D to me.

See if a local store can demo it to you and you can see.
 
Um, out of curiosity, what is the resolution of most game assets like wireframe models? Reason I ask is that even if you had a 1440p display, there's a good chance the various art assets weren't even created at that resolution.
 
Um, out of curiosity, what is the resolution of most game assets like wireframe models? Reason I ask is that even if you had a 1440p display, there's a good chance the various art assets weren't even created at that resolution.

They will be created for higher resolutions in the future.
 
No, it won't be, not even close. Don't confuse the average household in the U.S./Canada/Germany/U.K. and enthusiast-biased PC forums such as AnandTech, HardOCP, TPU, etc. with the worldwide console gaming market. Also, don't assume that because many people in developed, high-income-per-capita countries could afford a 4K TV between now and 2020, the rest of the console's target market could also afford those TVs.

You are also missing another important point -- to fully benefit from 4K pixels on a 4K TV at the 8-10 feet most people use their consoles at, you need a rather large 4K TV.
<snip>

5-6 years (?) from now 4K will indeed be very common in North America, even if, as you point out, it's a complete waste of time for most of us due to viewing range.

I'd rather sit at 1080p and make games look better with that. I've never seen a 4K game that looks as good as a movie does at 1080p. People get caught up in numbers.

But the next round of consoles will hit the market in about five years and need to stay relevant for 5-7 years past that point, so the odds of them supporting 4k are very high IMO, even if just to shut up whiners about the lack of 4k.

But I can buy a 60" 4k now for $1200, or a 48" for $750. It won't be too long before 4k represents the majority of large-screen purchases.
 
5-6 years (?) from now 4K will indeed be very common in North America, even if, as you point out, it's a complete waste of time for most of us due to viewing range.

I'd rather sit at 1080p and make games look better with that. I've never seen a 4K game that looks as good as a movie does at 1080p. People get caught up in numbers.

But the next round of consoles will hit the market in about five years and need to stay relevant for 5-7 years past that point, so the odds of them supporting 4k are very high IMO, even if just to shut up whiners about the lack of 4k.

But I can buy a 60" 4k now for $1200, or a 48" for $750. It won't be too long before 4k represents the majority of large-screen purchases.

I'd say the odds are that it will say 4K support but not render at higher than 1080p. 4K support will be like the XB1's 1080p support now: an image upscaled from 1080p.
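Worth noting: 1080p to 4K is an exact 2x scale on each axis, so even the crudest pixel-doubling upscale maps cleanly. A toy sketch of the idea (not how a console's actual scaler works, which would interpolate):

```python
def upscale_2x(frame):
    """Nearest-neighbor 2x upscale: 1920x1080 -> 3840x2160.
    `frame` is a list of rows; each row is a list of pixel values."""
    out = []
    for row in frame:
        doubled = [p for p in row for _ in range(2)]  # repeat each pixel
        out.append(doubled)
        out.append(list(doubled))                     # repeat each row
    return out

frame_1080p = [[(x + y) % 256 for x in range(1920)] for y in range(1080)]
frame_4k = upscale_2x(frame_1080p)
assert len(frame_4k) == 2160 and len(frame_4k[0]) == 3840
```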
 
I can't wait for 4k consoles to come out, but the games will be 1600p @ 30fps and people will cry about it not being true 4k.

CANNOT WAIT!!
 
I'm sure they'll have 4k "capability", but it will only be used for "home brew" games that use hardly any processing power. If you're thinking AAA-level graphics in 4k, you can forget it on consoles. You need monstrously powerful PC hardware to do that now. Even in five years, I doubt the $100 GPU Sony plops in their system is going to push anything near 4k with a solid frame rate. More likely the devs will complain that they'd rather use that horsepower for better graphics at 1080p.
 
The only way next gen consoles run at 4K30 is if both Sony and Microsoft wait until ~2021 for the 10nm High Performance node. Your typical console GPU is around 200mm² with a ~75W TDP.

That means either the current console generation will be just as long as the PS3/Xbox 360 generation, or if they elect to jump in during ~2019 while we are still on the 14nm High Performance node, the Xbox Two/PS5 will both be rendering internally at 1440P and upscaling to 4K.
 
The only way next gen consoles run at 4K30 is if both Sony and Microsoft wait until ~2021 for the 10nm High Performance node. Your typical console GPU is around 200mm² with a ~75W TDP.

That means either the current console generation will be just as long as the PS3/Xbox 360 generation, or if they elect to jump in during ~2019 while we are still on the 14nm High Performance node, the Xbox Two/PS5 will both be rendering internally at 1440P and upscaling to 4K.

The GPU power budget on the PS4 is much higher than 75W. It has a GPU that lands between the HD 7850 and HD 7870 in performance. In 2013, AMD had no GCN GPU that could hit that level of performance on 28nm in just a 75W power envelope. The original PS4 can easily hit 140W of total power usage, while the PS3 was close to 200W.

[Image: Xbox One vs. PS4 power consumption chart]


A GTX 980 Ti is almost exactly 3X as powerful at 1080P HQ as an R7 370 (roughly similar to the PS4's GPU). For next gen consoles, they are targeting a 5X increase in perf/watt. If we assume the same TDP and a similar breakdown between the CPU/GPU components, that would make the PS5 roughly 5X faster than the PS4, or 5/3 = 67% faster than a GTX 980 Ti.

Can a card that's 67% faster than a GTX 980 Ti play some AAA games at 4K @ 30 fps with a combination of Medium-High settings? Yes. Could it play a 2020-2025 AAA game at 4K well? Probably not, but they might gimp graphics IQ for the sake of 4K marketing! 😉
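Spelled out, that scaling argument looks like this (every input is the post's assumption, not a measured benchmark):

```python
# All inputs are the post's assumptions, not benchmarks.
ps4_gpu   = 1.0               # R7 370-class performance, normalized to 1
gtx_980ti = 3.0 * ps4_gpu     # "~3X an R7 370 at 1080P HQ"
perf_per_watt_gain = 5.0      # assumed next-gen target at the same TDP

ps5_gpu = ps4_gpu * perf_per_watt_gain
print(f"PS5 vs GTX 980 Ti: {ps5_gpu / gtx_980ti:.2f}x")  # 1.67x -> ~67% faster
```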

My biggest issue with 4K console gaming is that in the living room, 4K is mostly a marketing gimmick unless you have a huge TV or sit way closer than normal. I'd much rather see all next gen games target 1080P maxed out @ 60 fps with maximum graphics (lighting, textures, shadows, physics, AI, objects on screen) than chase pointless 4K marketing gimmicks.

There is no doubt in my mind that next gen consoles will be marketed as 4K, though, simply because of 4K TVs, 4K Blu-rays and naturally having an HDMI 2.0 (or newer) or DP 1.3 (or newer) connection. For media alone, I wouldn't even be surprised if they targeted 5K marketing capability.

Intel is already marketing 5K @ 30-60 fps for Kaby Lake for media, and that's only a 2016 GPU.

[Image: Intel graphics roadmap slide showing 5K media support]
 
You are also missing another important point -- to fully benefit from 4K pixels on a 4K TV at the 8-10 feet most people use their consoles at, you need a rather large 4K TV. A puny 49-55" 4K TV will be mostly marketing as far as IQ is concerned compared to the 1080P IQ of a similar-quality panel at those small sizes. Even though 4K TVs themselves will get better, since LED/LCD panels are expected to keep improving with time, the pixel density itself won't be the primary factor that makes graphics look significantly better for consoles on those smaller 4K TVs, unless one is willing to get a 75-84" 4K TV or play console games much closer than 8 feet from the TV.

[Image: chart of recommended screen size vs. viewing distance by resolution]

I think your chart is not as accurate as you think. The ideal display should match the resolution of the human eye, which is approximately 1 arcminute (roughly 0.02 degrees). That would mean a pixel at 8 feet would need to be smaller than 0.023695375 inches in width and height (remember that the longest distance across a pixel is its diagonal, so the pixel actually needs to be smaller than the smallest length your eye can discern at that distance; in other words, you need the largest square pixel that fits in a circle whose diameter in inches is 96*sin(0.02°)). That would also mean the largest 16:9 TV at 4k resolution (3840x2160) whose pixels are small enough to not be discernible at 8 feet would be 85 inches (85.30335068 inches, to be exact). That is a much larger screen than the one in the chart for that distance and resolution.
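As a sanity check, here is that arithmetic in code, using the post's own 0.02-degree figure (strictly, 1 arcminute is 1/60 ≈ 0.0167 degrees; with these exact inputs the diagonal comes out somewhat larger than 85 inches, but either way it is far bigger than the chart's recommendation):

```python
import math

viewing_distance = 8 * 12   # 8 feet, in inches
acuity_deg = 0.02           # the post's figure; 1 arcminute is 1/60 degree

# Smallest length the eye can resolve at that distance:
resolvable = viewing_distance * math.sin(math.radians(acuity_deg))

# Largest square pixel whose diagonal fits inside that circle:
pixel_side = resolvable / math.sqrt(2)
print(f"max pixel side: {pixel_side:.6f} in")   # 0.023695, matching the post

# Largest 16:9 4K panel with that pixel pitch:
width, height = 3840 * pixel_side, 2160 * pixel_side
print(f"max diagonal: {math.hypot(width, height):.1f} in")   # ~104 in
```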
 
I think your chart is not as accurate as you think. The ideal display should match the resolution of the human eye, which is approximately 1 arcminute (roughly 0.02 degrees). That would mean a pixel at 8 feet would need to be smaller than 0.023695375 inches in width and height (remember that the longest distance across a pixel is its diagonal, so the pixel actually needs to be smaller than the smallest length your eye can discern at that distance; in other words, you need the largest square pixel that fits in a circle whose diameter in inches is 96*sin(0.02°)). That would also mean the largest 16:9 TV at 4k resolution (3840x2160) whose pixels are small enough to not be discernible at 8 feet would be 85 inches (85.30335068 inches, to be exact). That is a much larger screen than the one in the chart for that distance and resolution.

Lots of numbers get thrown around as to what is "noticeable", and our eyes tend to adjust to the environment around them a bit. I would bet most would find the chart fairly accurate in the sense that the diminishing returns past that point are HUGE.
 
Lots of numbers get thrown around as to what is "noticeable", and our eyes tend to adjust to the environment around them a bit. I would bet most would find the chart fairly accurate in the sense that the diminishing returns past that point are HUGE.

On this idea, I don't find the sight of visible pixels particularly offensive, but the jaggy, crawly, shimmery ALIASING is pretty easy to see. When you raise the resolution, aliasing is less noticeable to me, and it seems like the lighter AA methods are sufficient, even if it's just a post-process AA filter. At low resolutions, I find myself pumping things up to 8xMSAA because 4xMSAA wasn't good enough, while my frame rate and video card cry for help.
 