
Will 20nm based gpus be able to handle 4k res for gaming?

Kippa

Senior member
Just curious: when the 20nm Maxwell-based GPUs and the AMD 20nm equivalents come out next year, will they be able to handle current games at 4K resolution? Obviously the specs for the 20nm Nvidia and AMD cards aren't out yet, but as a guesstimate, how do you think they would fare? I'm not technically clued-up enough to make a rough guess myself.
 
I don't know how anyone can give a definitive answer, but I would say no chance. Games will only get more demanding, so even if the cards get stronger we will be right about where we are now, with games being unplayable at that resolution.
 
Did you look at Anand's review of 4K? We need at least a doubling of GPU power (Titans, no less) to make 4K usable, but not completely playable. Some games will be fine with a single Titan, but I would assume that the vast majority of current and future titles with AAA graphics will require much more computing power before 4K is feasible for the average consumer. By much more, we are talking 4x current processing power, and that will not happen with a single die shrink.
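The "4x" figure is just pixel arithmetic; a quick sketch (assuming GPU load scales roughly with pixels rendered, which is only an approximation, since real games are also geometry- and shader-bound):

```python
# Rough pixel math behind the "4x current processing power" claim.
# Real GPU scaling is not perfectly linear in pixel count; ballpark only.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_4k    = 3840 * 2160   # 8,294,400 pixels

print(pixels_4k / pixels_1080p)  # 4.0 -- 4K pushes exactly four times the pixels of 1080p
```

So even under the friendliest assumption, matching today's 1080p experience at 4K needs roughly four times the raster throughput.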
 
A single Titan runs BF3 at like 30fps at 4K, and I think that was at medium details. In 2016 a $550 GPU might be able to run BF3 at 60fps @ 4k.
 
Just curious: when the 20nm Maxwell-based GPUs and the AMD 20nm equivalents come out next year, will they be able to handle current games at 4K resolution?

Since you didn't specify whether said games would need to be run at maximum settings, the answer, obviously, is yes.
 
A single Titan runs BF3 at like 30fps at 4K, and I think that was at medium details. In 2016 a $550 GPU might be able to run BF3 at 60fps @ 4k.
Depends on the settings, but I run BF3 on med/high with low AA on a single 6970 at 2560 x 1600 and get 50 FPS avg. I think sli titans or xfire 7970s handle eyefinity resolutions that are in the same ballpark as 4K just fine.
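If the game were purely pixel-bound, you could extrapolate that 50 fps figure to 4K by pixel ratio; a hypothetical back-of-the-envelope sketch (the 50 fps at 2560x1600 comes from the post above, and real results will vary with settings and bottlenecks):

```python
# Hypothetical estimate: assume frame rate scales inversely with pixel
# count (pure fill-rate bound). This ignores CPU, geometry, and memory
# bottlenecks, so treat the result as an upper bound on expected fps.
pixels_1600p = 2560 * 1600   # 4,096,000 pixels
pixels_4k    = 3840 * 2160   # 8,294,400 pixels

fps_1600p = 50               # measured average from the post above
fps_4k_estimate = fps_1600p * pixels_1600p / pixels_4k
print(round(fps_4k_estimate, 1))  # 24.7 -- in line with the ~30 fps single-Titan figure
```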
 
I have run eyefinity (3x1) on 1080p with a 7970 and haven't had really any trouble. BF3 ran maybe 30fps. But that isn't that bad and is still easily playable.
 
Every time new tech is on the horizon, "we" are asked to speculate about it... I really don't understand the point. If we are lucky, this thread won't turn into how a GTX 780 is a way better buy than the Titan, or how Crossfire frame times suck. Just saying..
 
We don't really need 4K. It's just something the corps are pushing so there is a need to support it with uber GPU power.
I'm not the type of guy to say "640K is enough for anyone," BUT I think 2560x1600 is the sweet spot. We are fine where we are now. Just my opinion. But I think a line needs to be drawn where improvements move beyond the practical into the realm of the intangible.
 
We don't really need 4K. It's just something the corps are pushing so there is a need to support it with uber GPU power.
I'm not the type of guy to say "640K is enough for anyone," BUT I think 2560x1600 is the sweet spot. We are fine where we are now. Just my opinion. But I think a line needs to be drawn where improvements move beyond the practical into the realm of the intangible.

I would personally rather see a push for higher resolutions than a push for new forms of AA (or any kind of AA, for that matter).
 
We don't really need 4K. It's just something the corps are pushing so there is a need to support it with uber GPU power.
I'm not the type of guy to say "640K is enough for anyone," BUT I think 2560x1600 is the sweet spot. We are fine where we are now. Just my opinion. But I think a line needs to be drawn where improvements move beyond the practical into the realm of the intangible.

You're kidding, right? 2560x1600 has been the highest resolution available to the masses for, what, 10 years or so? 3D, at least in TVs, hasn't really taken hold, but 4K will be an extremely easy transition. Will it be challenging for NV/AMD to keep up? Sure it will. However, this will at least provide an impetus for something more than the tiny iterative improvements that we have been getting over the past couple of years. I want to see something cool like the 8800 GTX or 9700 Pro again, only this time from both camps at once.
 
Yeah, I'm sure everybody has their own thoughts on this. I feel that AA is important, and as we go up in resolution, AA requirements go down. So with 4K, we may not need AA at all. I don't know. I'm finding I barely need it at 2560x1600. But there are those who want max AA at any res, cost be damned. So, like anything, it's all relative.
 
You're kidding, right? 2560x1600 has been the highest resolution available to the masses for, what, 10 years or so? 3D, at least in TVs, hasn't really taken hold, but 4K will be an extremely easy transition. Will it be challenging for NV/AMD to keep up? Sure it will. However, this will at least provide an impetus for something more than the tiny iterative improvements that we have been getting over the past couple of years. I want to see something cool like the 8800 GTX or 9700 Pro again, only this time from both camps at once.

Has been available, yes. What's actually been utilized by the masses is 1920x1080 or less. So no, I'm not kidding.
 
I'll be honest... we need higher-resolution textures and higher-polygon-count models and characters in games first. Resolution only sharpens up the image; higher poly counts and higher-resolution textures make a game look immensely better, even at 1080p.
 
I'll be honest... we need higher-resolution textures and higher-polygon-count models and characters in games first. Resolution only sharpens up the image; higher poly counts and higher-resolution textures make a game look immensely better, even at 1080p.

Agreed completely. I game at 2560x1440, and while it looks decent, high-res textures would make it look a whole lot better.

I usually only run 2x MSAA at most when gaming, because it looks almost no different at 4x and there's a huge frame-rate penalty for jumping up to 4x.
 
To me, going 4k is good for people who do work on their computer. Or for movie buffs when we start seeing 4k releases. The sharpness of the screen should be really nice and the resolution should offer great real estate for getting work done.
 
But doesn't a higher resolution on a fixed screen size make everything illegibly small? In 2D, that is.
As for movies, I couldn't say. I think 1080p on 22-24" is already pretty good. I don't see any artifacts or jaggies, but maybe I'm not looking hard enough.
 
4K for gaming starts being useful when the textures used in games are actually high resolution, at least 2048x2048 by default.
 
I'll be honest... we need higher-resolution textures and higher-polygon-count models and characters in games first. Resolution only sharpens up the image; higher poly counts and higher-resolution textures make a game look immensely better, even at 1080p.

I agree with this ^. Nice summation.
 
But doesn't a higher resolution on a fixed screen size make everything illegibly small? In 2D, that is.
As for movies, I couldn't say. I think 1080p on 22-24" is already pretty good. I don't see any artifacts or jaggies, but maybe I'm not looking hard enough.

I don't see why. Imagine a 20" screen with only twenty thousand pixels on it, or twenty billion. Same image; one is just incredibly higher in definition.
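The point here is pixel density: for a fixed diagonal, more pixels means a higher PPI, not a different image. A minimal sketch of the math (the 20" diagonal and resolutions are just illustrative):

```python
import math

# Pixel density (PPI) for a fixed-size screen at different resolutions.
# Same physical size, same image content -- only the definition changes.
def ppi(width_px, height_px, diagonal_in):
    # Diagonal pixel count divided by diagonal length in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 20)))  # 110 -- 1080p on a 20" panel
print(round(ppi(3840, 2160, 20)))  # 220 -- 4K on the same 20" panel: 4x the pixels, 2x the PPI
```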
 
We don't really need 4K. It's just something the corps are pushing so there is a need to support it with uber GPU power.
I'm not the type of guy to say "640K is enough for anyone", BUT, I think 2560x1600 is the sweet spot. We are fine where we are now. Just my opinion. But I think a line needs to be drawn where improvement move beyond practical in the realm of intangible.

Having used plenty of 25x16 setups and having tried Sharp's 4K IGZO display, I can tell you that 25x16 just seems woefully pixelated and old compared to how crystal clear 4K was. It's astonishing how much difference it makes; with monitors I notice the difference because I am much closer to the screen than to, say, a TV.
 
I don't see why. Imagine a 20" screen with only twenty thousand pixels on it, or twenty billion. Same image; one is just incredibly higher in definition.

Why then do the icons and fonts get smaller when I increase resolution? In Windows, at least. The term "real estate" was mentioned earlier, and that implies that the display content doesn't stay the same, unlike with Apple's Retina displays.
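What the poster is running into is UI scaling: without it, a fixed-pixel UI element shrinks physically as PPI rises; with integer scaling (the Retina approach), it keeps its physical size and just gets sharper. A hypothetical sketch with illustrative numbers, not tied to any specific OS implementation:

```python
# Physical height of a 16 px UI font at different pixel densities,
# with and without display scaling. Numbers are illustrative only.
def physical_height_inches(font_px, ppi, scale=1.0):
    # Rendered pixel height (font_px * scale) divided by pixels per inch.
    return font_px * scale / ppi

base   = physical_height_inches(16, 110)       # ~0.145" at 110 PPI, 100% scaling
shrunk = physical_height_inches(16, 220)       # ~0.073" at 220 PPI, no scaling: half the size
scaled = physical_height_inches(16, 220, 2.0)  # ~0.145" at 220 PPI with 200% scaling

print(round(base, 3), round(shrunk, 3), round(scaled, 3))  # 0.145 0.073 0.145
```

With 200% scaling the text occupies the same physical space as before, but is drawn with four times as many pixels, which is why "real estate" stays constant on a Retina-style setup while sharpness improves.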
 
I would guess that once 4K displays become more normal (the cost more affordable for the average person), we will start to see cards that can handle it. I'd also guess that day is about two years away. Right now 4K is a luxury, so the card makers assume that if you can afford a 4K monitor, you can afford a high-end card.
 