Will 20nm-based GPUs be able to handle 4K res for gaming?


tential

Diamond Member
May 13, 2008
7,348
642
121
To me, going 4k is good for people who do work on their computer. Or for movie buffs when we start seeing 4k releases. The sharpness of the screen should be really nice and the resolution should offer great real estate for getting work done.

Just wanted to comment and say that 4K reviews for "Movie Buffs" are already out. We've read them, and this comes down to the viewing distance graphs you may or may not have seen.

[Image: resolution_chart.jpg]


This isn't the chart I'm looking for, but I think it makes my point clear. The viewing distance you have to be at to get the full benefit of 4K is pretty close. Most people sit anywhere from 6 to 12 feet from their TV. If we extrapolate that graph, you'd have to sit around 4-6 feet from a 70-80 inch screen before you notice the benefits of 4K.

This agrees with reviews of Sony's 4K TVs. Sony let some reviewers at CNET get a first-hand look, and they agreed that while the 4K benefits were visible, it was only at viewing distances of around 4-6 feet, I think, and that's really close for the TV they were reviewing (I think 85 inches).

At a normal viewing distance of 8-10 feet, reviewers said they couldn't really tell the difference between 1080p and 4K. 4K will be better for games IMO, because we sit extremely close to our monitors. But for movies, I don't think the difference will be as big.
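Those distance figures fall out of basic geometry. Here's a rough back-of-the-envelope sketch in Python, assuming roughly 20/20 acuity (about one arcminute per pixel) and a 16:9 panel; the helper name and the 70" size are just for illustration:

```python
import math

def max_benefit_distance(diagonal_in, vertical_pixels, aspect=(16, 9),
                         acuity_arcmin=1.0):
    """Farthest distance (inches) at which one pixel still subtends the
    given visual acuity, i.e. beyond this the extra resolution is wasted."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)   # physical panel height
    pixel_pitch = height_in / vertical_pixels        # inches per pixel
    return pixel_pitch / math.tan(math.radians(acuity_arcmin / 60.0))

for rows, label in [(1080, "1080p"), (2160, "4K")]:
    d_ft = max_benefit_distance(70, rows) / 12       # 70" TV, illustrative
    print(f'{label} on a 70" screen: benefit fades beyond ~{d_ft:.1f} ft')
```

That works out to roughly 9 feet for 1080p and 4.5 feet for 4K on a 70" set, which lines up with the 4-6 foot figure above.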
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Did you look at Anand's review of 4K? We need at least a doubling of GPU power (Titans, no less) to make 4K usable, though not completely playable. Some games will be fine with a single Titan, but I would assume that the vast majority of current and future titles with AAA graphics will require much more computing power before 4K is feasible for the average consumer. By much more, we are talking 4x current processing power, and that will not happen with a single die shrink.

First point: You don't have to max every game out, which, as it turns out, is how all benchmarks are done. If every game were required to be "maxed out", nothing would even be playable at 2560x1600. Yet, incredibly enough, you can gain 30-40 fps just by turning one or two settings down in the most demanding games, such as Crysis 3 or Metro: Last Light, and the game generally doesn't even look any different (at least in the case of Crysis 3 between Very High and High). I seem to remember this same discussion about 1080p many years ago. Overkill. Too much graphics horsepower. We don't need more than 1280x1024. Blah blah blah. To be sure, 4K is priced out of range for most users and it will take a couple of years to take hold, but we will get there eventually - I'd rather the technology be available than not. It's not like anyone is forcing you to buy it. ;)

Second point: 4K isn't necessarily aimed at gaming at this time. That being said, 4K will be released this year, and two years or so from now it will begin to take hold. This is no different from when people were reluctant about 1080p... I will tell you this: anytime anyone said that technology or screen resolution was "good enough" and didn't need to change, they were wrong 100% of the time. Technology and screen resolution will march forward; they will not remain static. If technological progress were based on the "good enough" naysayers, we would all still be using 486s with 15 inch CRTs. Again, it will take time for 4K to take hold, but it will happen, I guarantee it - let the GPU manufacturers worry about what's good enough. It will get there in time. The early adopters will be screwed, getting borderline quality for outrageous prices, but in a couple of years prices will normalize and 4K will be the new 1080p. Everyone will have 4K in their living room probably 5-6 years from now.

In the meantime, stop thinking that you are forced to play every game maxed out. If every game required "maxing out" I would have quit PC gaming ages ago when I switched to 2560x1600. Literally even a Titan can't max the latest games out in surround or at 2560x1600, yet you can easily manage by lowering to FXAA or dialing 2-3 settings down.
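For what it's worth, the quoted "4x current processing power" figure tracks the raw pixel count. A minimal sketch, assuming a purely fill-rate-bound workload (real games rarely scale this cleanly) and an invented 60 fps baseline:

```python
# Naive model: frame rate scales inversely with the number of pixels rendered.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1600p (2560x1600)": 2560 * 1600,
    "4K (3840x2160)":    3840 * 2160,
}

base_name, base_fps = "1080p (1920x1080)", 60.0   # hypothetical baseline fps
base_pixels = resolutions[base_name]

for name, pixels in resolutions.items():
    scale = pixels / base_pixels
    print(f"{name}: {scale:.2f}x pixels -> ~{base_fps / scale:.0f} fps")
```

So a setup that just manages 60 fps at 1080p lands around 15 fps at 4K under this crude model, which is exactly why turning a couple of settings down matters so much.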
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
First point: You don't have to max every game out, which, as it turns out, is how all benchmarks are done. If every game were required to be "maxed out", nothing would even be playable at 2560x1600. Yet, incredibly enough, you can gain 30-40 fps just by turning one or two settings down in the most demanding games, such as Crysis 3 or Metro: Last Light, and the game generally doesn't even look any different (at least in the case of Crysis 3 between Very High and High). I seem to remember this same discussion about 1080p many years ago. Overkill. Too much graphics horsepower. We don't need more than 1280x1024. Blah blah blah. To be sure, 4K is priced out of range for most users and it will take a couple of years to take hold, but we will get there eventually - I'd rather the technology be available than not. It's not like anyone is forcing you to buy it. ;)

Second point: 4K isn't necessarily aimed at gaming at this time. That being said, 4K will be released this year, and two years or so from now it will begin to take hold. This is no different from when people were reluctant about 1080p... I will tell you this: anytime anyone said that technology or screen resolution was "good enough" and didn't need to change, they were wrong 100% of the time. Technology and screen resolution will march forward; they will not remain static. If technological progress were based on the "good enough" naysayers, we would all still be using 486s with 15 inch CRTs. Again, it will take time for 4K to take hold, but it will happen, I guarantee it - let the GPU manufacturers worry about what's good enough. It will get there in time. The early adopters will be screwed, getting borderline quality for outrageous prices, but in a couple of years prices will normalize and 4K will be the new 1080p. Everyone will have 4K in their living room probably 5-6 years from now.

In the meantime, stop thinking that you are forced to play every game maxed out. If every game required "maxing out" I would have quit PC gaming ages ago when I switched to 2560x1600. Literally even a Titan can't max the latest games out in surround or at 2560x1600, yet you can easily manage by lowering to FXAA or dialing 2-3 settings down.

The problem is, game developers could do more for graphics by using higher-resolution textures and higher polygon counts. Everyone talks about TV/monitor resolution without addressing the real issue that would make every game look better immediately.

Why then do the icons and fonts get smaller when I increase the resolution? In Windows, at least. The term "real estate" was mentioned earlier, and that implies that the display content doesn't stay the same. Unlike with Apple's Retina display.

On a higher resolution screen, you can place two pages side by side when doing work. Or have two windows open next to each other in a web browser. That's one real benefit of higher resolution monitors.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The problem is, game developers could do more for graphics by using higher-resolution textures and higher polygon counts. Everyone talks about TV/monitor resolution without addressing the real issue that would make every game look better immediately.

I don't disagree. As I said, though, 4K isn't necessarily being designed for gaming. The technology will be released, early adopters will be screwed, PC gamers will pay way too much money to make it manageable - and in 2-3 years' time 4K will become the new standard. Eventually, 4K will be the new 1080p. It will take some time to get there; everyone was reluctant about 1080p as well, so seeing this discussion here didn't really surprise me. PC gamers were reluctant to let go of their 1280x1024 CRTs some years ago, just as they want to stick with 1080p forever now.

Anyway, higher resolution will always make an image appear sharper - but you are also correct that higher poly counts and HD textures will always make graphics appear better as well. No argument here :) Game developers can and should do better in that respect. The only incorrect thing stated in this thread is that 1080p is "good enough", as if 1080p will remain the only resolution used for the next 20-30 years. 4K will happen, but as mentioned it will take a few years for it to become ubiquitous. Right now only early adopters will bite the bullet, and they will pay outrageous prices for borderline quality. But it will be the new standard and the norm for every living room in 5-10 years.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Hmm, well, maybe I'll look at one for my wife's photography studio around BF 2014, hopefully by then they'll be a bit more affordable.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
On a higher resolution screen, you can place two pages side by side when doing work. Or have two windows open next to each other in a web browser. That's one real benefit of higher resolution monitors.

How is that a benefit if the content is too small to read? If you mean larger monitors in general, I fully agree. But if we're talking 24" 1080p vs 26" 4K, then not.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
I'm surprised 4k is so demanding. The jump from 1080p to 1440p wasn't nearly this demanding.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
It's not worth the GPU power just to push all those pixels imo...

They should use the power to make the actual graphics better.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
No surprise there. 4k has 4x as many pixels as 1080p. 1440p has only 1.78x as many as 1080p.
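The arithmetic, for anyone who wants to check the ratios (standard 1920x1080, 2560x1440 and 3840x2160 panels assumed):

```python
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k    = 3840 * 2160   # 8,294,400

print(f"4K vs 1080p:    {pixels_4k / pixels_1080p:.2f}x")     # 4.00x
print(f"1440p vs 1080p: {pixels_1440p / pixels_1080p:.2f}x")  # 1.78x
```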
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
How is that a benefit if the content is too small to read? If you mean larger monitors in general, I fully agree. But if we're talking 24" 1080p vs 26" 4K, then not.

Cause it ISN'T too small to read. Anyone with corrected vision or 20/20 will be able to read the text fine.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Cause it ISN'T too small to read. Anyone with corrected vision or 20/20 will be able to read the text fine.

DPI scaling also helps. Having a really high pixel density makes text easier to read. I'd much rather read text on my laptop than on my 1440p desktop.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
DPI scaling also helps. Having a really high pixel density makes text easier to read. I'd much rather read text on my laptop than on my 1440p desktop.

Yeah, legibility has never been an issue with higher resolutions for me, even at default DPI settings. 2560x1440 is perfectly fine on a 27 inch monitor at default DPI, and 2560x1600 is even better at 30 inches. DPI and legibility are mostly a mobile / ultrabook issue - and in that case you can use higher DPI settings (I *highly* suggest using XP-style scaling). Windows 8.1 is also doing further enhancements for high-DPI settings; it will basically be 100% identical to what portable MacBooks do in OS X. It will scale in 2.0 increments (as compared to the current 1.6).

In short, legibility is never an issue except with ultra-portable products, and in that case you simply increase the DPI setting. If the text is too small, increasing the DPI will make everything larger again. As mentioned, Windows 7/8 are best with Windows XP-style scaling, and significant DPI improvements are coming in Windows 8.1.
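As a tiny illustration of what a scale factor means in practice (the UI element sizes here are arbitrary examples, not anything Windows defines):

```python
# A DPI scale factor simply multiplies logical UI sizes into physical pixels.
logical_px = {"menu text": 15, "title bar": 30, "icon": 32}   # made-up sizes

for scale in (1.0, 1.25, 1.5, 2.0):
    physical = {name: round(px * scale) for name, px in logical_px.items()}
    print(f"scale {scale}: {physical}")

# At 2.0 every logical pixel maps onto a clean 2x2 block of physical pixels,
# which is part of why integer scaling tends to look crisper than 1.25x/1.5x.
```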
 

Sancus

Junior Member
Jun 6, 2013
7
0
0
Well, when I switch from 720p to 1080p, everything gets smaller, so...

On the same screen? Well, of course. The desktop standard is basically 100 PPI - that's 1080p on a 22" screen. But higher PPI is not unreadable; it just depends on your vision and viewing distance. 3840*2160 on 22" would be too small for most people at about 200 PPI, but that's where the Windows DPI scaling settings come in.

But there is a wide range of acceptable PPIs. For example, some people turn scaling off entirely on their 15" Retina MacBooks so they can use the native resolution, and that's workable if you have very good vision - that's about 220 PPI. I use mine set to 1920*1200 scaling, which makes it feel similar to roughly 150 unadjusted PPI. A 4K monitor at 30 inches would only be 147 PPI unscaled - that is very usable for many, if not most, people.

If you have a lot of problems reading text at >100 PPI, then likely your viewing distance is too far or your vision is substantially worse than 20/20.
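The PPI figures in this thread all come from the same one-line formula; here's a quick sketch, with the panel sizes assumed from the posts above:

```python
import math

def ppi(h_pixels, v_pixels, diagonal_in):
    """Pixels per inch along the diagonal of a panel."""
    return math.hypot(h_pixels, v_pixels) / diagonal_in

print(f'1080p at 22":      {ppi(1920, 1080, 22):.0f} PPI')   # ~100, the desktop norm
print(f'4K at 22":         {ppi(3840, 2160, 22):.0f} PPI')   # ~200
print(f'4K at 30":         {ppi(3840, 2160, 30):.0f} PPI')   # ~147
# Effective density of a 15.4" Retina panel scaled to a 1920x1200 workspace:
print(f'15.4" @ 1920x1200: {ppi(1920, 1200, 15.4):.0f} PPI') # ~147
```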
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
In a nutshell, I don't care right now.

I will care when 4K monitors are under 700 dollars and 120Hz. I will care when 4K TVs are 120Hz, under 500 dollars, and there's content available at that resolution.

Until then, it doesn't matter.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Anyone who's able to buy a four thousand dollar monitor can probably afford to fill his computer with top-of-the-line graphics cards. By the time 4K computer monitors are affordable, I imagine that GPUs that can run at 4K will be as well.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Why then do the icons and fonts get smaller when I increase the resolution? In Windows, at least. The term "real estate" was mentioned earlier, and that implies that the display content doesn't stay the same. Unlike with Apple's Retina display.

Because the icons are a fixed size. So many pixels by so many pixels. Not so with games.
 

Plimogz

Senior member
Oct 3, 2009
678
0
71
Every time new tech is on the horizon, "we" are asked to speculate about it... I really don't understand the point. If we are lucky this thread won't turn into how a GTX 780 is a way better buy than the Titan, or how Crossfire frame times suck. Just saying...

Your good-natured humor and obvious ignorance of GFX progress through past, present and future generations are clearly not pertinent to this thread. :p

But, yeah:
We don't really need 4K. It's just something the corps are pushing so there is a need to support it with uber GPU power.
I'm not the type of guy to say "640K is enough for anyone", BUT I think 2560x1600 is the sweet spot. We are fine where we are now. Just my opinion. But I think a line needs to be drawn where improvements move beyond the practical into the realm of the intangible.
When large displays reach "retina"-level resolution, we might find that the cool VGA feature of the time is not so much to AA [relatively pixel-y 2560x1600] images as to increase performance by reducing the level of precision on less important parts of the image by "motion blurring" them.

Think about it. In much the same way as we currently (and at good expense) tweak vertices and textures (with AA) to make up for low resolution and the obvious distinctions between neighboring rows of pixels, wouldn't it follow that in a future with resolutions finer than the average eye can perceive some parts of the image would be "softened" prior to per-pixel shading and all that expensive stuff?

On the one hand this would have the effect, which is apparently desirable in some modern engine writers' minds, of softly blurring the more distant or less prominent parts of the image (which, to be fair, seems to match how real human visual perception works). And on the other, if it were implemented prior to the final shading (or rendering, or rasterizing - I don't know much about these things), it could save some GPU power by "blobbing" less remarkable parts of the ultra-high-resolution 4K frame.

In support of this far-fetched vision of the GPU future, I would mention the already existing move towards 'motion blurring' (which at this point costs performance rather than affording more of it), and also the natural working of the eye and the cerebral structures which interpret its input: moving objects and immediately pertinent environmental features are what the eye and the brain allocate the best resources towards. So a smart way to make better-looking, faster-processed 3D simulations in a sub-retina-resolution future would perhaps be to cull resolution on less prominent parts of the final 4K frame (based on object, distance, priority, etc.), resolving these instead at some lower resolution - say 1/4 or 1/2, the latter of which still amounts to more than the 2560x1600 with pertinent PPI that you're comfortable with now, Mr. 640K! :p

This way, motion blur would be a feature you turn on to gain performance at the expense of ridiculously sharp, sub-retina resolution. And AA would be extinct as an interest to the 4K-level enthusiast.

Though I guess it could easily subsist as a general-hardware-based option for those at 1080P, playing with an APU on a budget.
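A toy sketch of the shading-cost side of that idea, assuming a simple tile-based model where "less important" tiles get shaded at half or quarter resolution and then upscaled; the tile size and the importance mix are invented purely for illustration:

```python
# Hypothetical 4K frame split into tiles; low-priority tiles are shaded at
# reduced resolution. The 50/30/20 importance mix below is entirely made up.
FRAME_W, FRAME_H = 3840, 2160
TILE = 120                                    # tile edge length in pixels

total_tiles = (FRAME_W // TILE) * (FRAME_H // TILE)
mix = {1.0: 0.5, 0.5: 0.3, 0.25: 0.2}         # shading scale -> share of tiles

full_cost = total_tiles * TILE * TILE         # pixels shaded, uniform 4K
mixed_cost = sum(share * total_tiles * (TILE * scale) ** 2
                 for scale, share in mix.items())

print(f"Pixels shaded, uniform 4K: {full_cost:,}")
print(f"Pixels shaded, mixed-rate: {int(mixed_cost):,}")
print(f"Shading work saved:        {1 - mixed_cost / full_cost:.0%}")
```

Under those made-up numbers you shade roughly 40% fewer pixels, which is the kind of headroom the post is gesturing at.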
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
I'll be honest... we need higher-resolution textures and higher-polygon-count models and characters in the games first. Resolution only sharpens up the image; higher poly counts and higher-resolution textures make a game look immensely better even at 1080p.

This is where I'm at. This, plus draw distance and 60fps with Vsync at 1080p on a single 20nm GPU, please.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
In a nutshell, I don't care right now.

I will care when 4K monitors are under 700 dollars and 120Hz. I will care when 4K TVs are 120Hz, under 500 dollars, and there's content available at that resolution.

Until then, it doesn't matter.

I have never bought a TV that was less than $1k myself lol
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
20nm single GPUs won't handle it.

I think this is correct. If it takes multiple Titans/780s to get a game working at playable frame rates now, I doubt you will get a single card with the performance of 2x Titans in the next release.
 

Plimogz

Senior member
Oct 3, 2009
678
0
71
Of course they won't handle it. Even if we hope for a near doubling of top-of-the-line performance with the process shrink (which is wholly unreasonable given the trend over the last couple of die shrinks and four iterations of consumer GPUs from either maker), 4K is a once-in-a-decade kind of upgrade in resolution.

The question should be not so much whether 20nm GPUs will be able to handle 4K res, but rather whether early adopters of first-gen 20nm GPUs will be able to afford decent 4K computer displays to go along with their shiny new GFX cards.