Will 20nm-based GPUs be able to handle 4K res for gaming?


brandon888

Senior member
Jun 28, 2012
537
0
0
Even if the GTX 880 ends up 75-80% faster than the 780 (which is a bit unrealistic :))),

you still won't max out many demanding games, so I think only a Volta GPU in 2016 will be strong enough.
 

paul878

Senior member
Jul 31, 2010
874
1
0
Performance jumps between generations of CPUs and GPUs have slowed down dramatically. I highly doubt 20nm will be able to handle 4K with AAA titles.
Maybe further down the road, multi-core GPUs (like current CPUs) will be practical and bring a major performance boost.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Performance jumps between generations of CPUs and GPUs have slowed down dramatically. I highly doubt 20nm will be able to handle 4K with AAA titles.
Maybe further down the road, multi-core GPUs (like current CPUs) will be practical and bring a major performance boost.

GPU performance hasn't slowed that much, and they're already multi-core.
 

wilds

Platinum Member
Oct 26, 2012
2,059
674
136
Anyone who says 4K is "stupid" is not seeing the technological advances. This is not a gimmick like 3D is today. The jump from 1920x1080 to 4K is immediately noticeable, and the clarity improves significantly. Remember the jump from 640x480 to 1280x720? I thought that was a very noticeable jump; 720p to 1080p was another doubling of pixels, and we are still not even close to having a 1080p-saturated market the way 480p was.

Sure, it may be getting harder to see pixels from a given distance, but the human eye can discern a lot. 4K will enable higher PPI where 1080p falls short. Ever see a big 1080p TV? They're very pixel-y.

But these huge resolutions are annihilating our current GPUs.

What I saw from Anand's 4K benchmarks was that 2GB of VRAM is not enough. The Titan's 6GB of VRAM did not bottleneck the card. That was with 4x AA enabled (which can significantly raise VRAM use), which in my opinion is rather unreasonable, especially with today's under-equipped cards.

We need more evidence. If someone has a 4K monitor and would be willing to run a few benchmarks with medium/high settings and no anti-aliasing, we could see how our current cards perform. Once games ship even higher-resolution textures for 4K, even 3GB of VRAM will not be enough. It would be interesting to see how high-end 2GB cards stack up against larger-VRAM versions of themselves (GTX 680 2GB vs. 4GB) when 4K becomes more mainstream.
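To put a very rough number on the render-target side of that, here's a back-of-the-envelope sketch (my own estimate, not from Anand's article; it ignores textures, geometry, and driver overhead, which usually dominate):

```python
# Very rough render-target memory at 3840x2160, assuming 32-bit color and
# 32-bit depth/stencil. Ignores textures, geometry and driver overhead.
width, height = 3840, 2160

def mb(nbytes):
    return nbytes / 1e6  # decimal megabytes

color = width * height * 4          # RGBA8 back buffer
depth = width * height * 4          # D24S8 depth/stencil
msaa_4x = 4 * (color + depth)       # 4x MSAA stores 4 samples per pixel

print(f"color buffer:  {mb(color):6.1f} MB")
print(f"depth/stencil: {mb(depth):6.1f} MB")
print(f"4x MSAA total: {mb(msaa_4x):6.1f} MB")  # ~265 MB before any textures
```

Even with 4x MSAA the render targets alone only come to a few hundred MB, so it really is the higher-resolution texture sets on top of that which would push past 2-3GB.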

Edit: I just realized the vast number of pixels these GPUs will have to drive in the near future:

- 720p = 0.9 megapixels
- 1080p = 2.1 megapixels
- 2160p/4K = 8.3 megapixels
- 4320p/8K = 33.2 megapixels (IMAX at home, anyone?)
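For anyone who wants to check those figures, a trivial sketch:

```python
# Pixel counts for the common 16:9 resolutions, in megapixels.
resolutions = {
    "720p":     (1280, 720),
    "1080p":    (1920, 1080),
    "2160p/4K": (3840, 2160),
    "4320p/8K": (7680, 4320),
}
for name, (w, h) in resolutions.items():
    print(f"{name:9s} {w * h / 1e6:5.1f} MP")
# Each step is roughly 4x the pixels the GPU has to shade.
```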

Sharp unveiled an 8K TV at CES this year. It's coming sooner than we think.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Anyone who says 4K is "stupid" is not seeing the technological advances. This is not a gimmick like 3D is today. The jump from 1920x1080 to 4K is immediately noticeable, and the clarity improves significantly. Remember the jump from 640x480 to 1280x720? I thought that was a very noticeable jump; 720p to 1080p was another doubling of pixels, and we are still not even close to having a 1080p-saturated market the way 480p was.

I completely agree with you here. It's going to take several years for 4K to completely take hold, but I have no doubt that 4K will be the new 1080p some years from now.

Fact of the matter is, every resolution increase (480p, 720p, 1080p) has met some resistance, with various folks stating "what we have is good enough." In every case they were wrong, and they're wrong with 4K as well. 4K will be the new standard; the only question is how many years it will take to become ubiquitous and reasonably priced. Obviously, 4K is not reasonably priced currently, but it will ever so slowly get there; I'm hoping it will become more commonplace within 5 years. Hopefully GPU manufacturers will also be up to the task of bringing graphics performance up to speed in the next 2-3 years.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm not saying 4K is stupid. I'm saying game developers can make games look immediately better on current monitors and resolutions by rendering the game objects and environment at higher quality. You wouldn't need new monitors for that.

Before we go up in resolution, we need game developers to quit giving us sloppy-looking, low-resolution textures and start offering higher polygon counts. Use the hardware and memory for something that benefits everyone immediately.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Even if the GTX 880 ends up 75-80% faster than the 780 (which is a bit unrealistic :))),

you still won't max out many demanding games, so I think only a Volta GPU in 2016 will be strong enough.

I really don't like the idea people have that a card needs to "max" every game to be considered enough.

Many games today look great even at medium settings, with only small improvements beyond medium.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
60Hz is pointless.

120Hz or no sale.

I can get a 120Hz 2560x1440 PLS 27" DL-DVI monitor with very good color reproduction and without OSD input lag for $300 shipped from Korea in 1-2 days.

There just aren't connector standards that will do 120Hz at 3840x2160, even if such a panel existed.
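A rough back-of-the-envelope check of that claim (my own numbers for the 2013-era link standards, ignoring blanking overhead, so the real requirement is even a bit higher):

```python
# Uncompressed payload needed for 3840x2160 @ 120 Hz vs. what 2013-era links
# can actually carry. Ignores blanking, so the real requirement is even higher.
w, h, hz, bpp = 3840, 2160, 120, 24
needed = w * h * hz * bpp / 1e9     # ~23.9 Gbit/s

links = {                           # effective payload after 8b/10b coding
    "Dual-link DVI (2x165 MHz)": 7.92,
    "HDMI 1.4 (340 MHz TMDS)":   8.16,
    "DisplayPort 1.2 HBR2":      17.28,
}
print(f"4K @ 120 Hz needs ~{needed:.1f} Gbit/s")
for name, rate in links.items():
    print(f"  {name:26s} {rate:5.2f} Gbit/s -> {'ok' if rate >= needed else 'not enough'}")
```

Even DisplayPort 1.2 falls well short of the ~24 Gbit/s a 4K 120Hz panel would need.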
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
60Hz is pointless.

120Hz or no sale.

I can get a 120Hz 2560x1440 PLS 27" DL-DVI monitor with very good color reproduction and without OSD input lag for $300 shipped from Korea in 1-2 days.

There just aren't connector standards that will do 120Hz at 3840x2160, even if such a panel existed.

I personally agree with 120Hz or no sale, but I think we are in the minority. I'd also rather have Lightboost or something similar over IPS, but that's another discussion.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
I personally agree with 120Hz or no sale, but I think we are in the minority. I'd also rather have Lightboost or something similar over IPS, but that's another discussion.

Considering the only AAA games that come out for PC these days are shooters, I question people's judgment when they defend running 60Hz (and even 4K 30Hz!) displays, when running any kind of frame limiter or vsync is absolutely detrimental to the actual twitch gameplay of shooters.

In my day, we turned our CRTs up to 90+ Hz for Unreal Tournament, Counter-Strike, etc.

Grumble Grumble Grumble Get off my lawn!

Also, Lightboost monitors have some pretty huge input lag, most likely from the large amount of signal processing necessary to achieve the effect, so I wouldn't choose one of those either.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The input lag will vary from one model to another, but I don't see it being worse than the Korean IPS monitors.

I did notice a comment in a review I just looked up that found an Acer Lightboost monitor with very poor input lag, while the Asus VG278H has very good input lag.

So it probably isn't Lightboost itself that caused the input lag in the reviews you've read.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I really don't like the idea people have that a card needs to "max" every game to be considered enough.

Many games today look great even at medium settings, with only small improvements beyond medium.

I have to have every detail option except AA on the highest setting. Otherwise, why did I buy the game on PC instead of console?
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
I have to have every detail option except AA on the highest setting. Otherwise, why did I buy the game on PC instead of console?

Because most console games look like crap and run like crap. This is coming from someone who has all three consoles and a pretty good gaming PC. Games like Crysis 3 look amazing on medium already.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I have to have every detail option except AA on the highest setting. Otherwise, why did I buy the game on PC instead of console?

I'd have to differ with you a bit here. While I can appreciate maxing games out, more often than not AAA titles have useless, overkill graphical settings that have minimal visual impact yet huge framerate impacts. There are settings that literally lower the framerate by 30-40 fps. I've looked at screenshots and played through Crysis 3 at both high quality and very high quality, and there isn't a difference. Where there is a difference, you'd need super-human eyes to spot it, and I have 20/20 vision. For instance, ADOF in Metro 2033 has no visual impact yet causes a dramatic framerate loss, and the VH shadows setting in Crysis 3 has minimal impact yet causes a framerate hit of 15-20 fps in some areas.

Even at low/medium quality settings, a PC title will always look substantially better than its console counterpart (assuming the developer did their part), as textures and assets will be higher quality. Crysis 3 looks better across the board due to AA, higher-resolution textures, and tessellation, and doesn't need to be "maxed out" to look substantially better.

This is aside from the fact that even a Titan can't technically "max out" most games at 1600p. So at that point, you're looking at another $1,000 for the privilege of maxing some titles out. I don't know, that's an iffy proposition when maxing games out doesn't affect the output that much (or in some cases at all), and you can lower 1-2 settings, use something like FXAA instead of 8x MSAA, and get a completely fluid 60 fps 100% of the time. I'm able to play Crysis 3 on a single 780 with some settings at High instead of Very High, using FXAA, and I can maintain 60 fps. I guess it's completely subjective, but I've been surprised at how well a 780 handles 1600p in a single-card configuration. I'm super pleased with it.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I have to have every detail option except AA on the highest setting. Otherwise, why did I buy the game on PC instead of console?

As those above me mentioned, PC gaming is already hugely ahead of consoles, even at medium settings, and as games have advanced, many of the highest settings offer very little visual improvement.

PC games have these settings for various reasons, but many aren't really meant to be used by most players; otherwise their "Recommended" systems wouldn't be able to play the games. PCs come in a huge variety of configurations, so these settings are offered as options, not requirements.

They give options for those who want to see the cutting edge; don't sweat it if you can't play games with those settings.
 

fuzzymath10

Senior member
Feb 17, 2010
520
2
81
Another thing that needs to happen is normalizing apparent resolution across devices. For example, it's almost a crime to be selling TN 768p laptops in 2013 when you could get non-TN 1920x1200 in a laptop 10 years ago; that resolution is only acceptable in a phone. 4K only becomes necessary at the moderate end of the size spectrum, maybe starting at 22-24".
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I completely agree with you here. It's going to take several years for 4K to completely take hold, but I have no doubt that 4K will be the new 1080p some years from now.

Fact of the matter is, every resolution increase (480p, 720p, 1080p) has met some resistance, with various folks stating "what we have is good enough." In every case they were wrong, and they're wrong with 4K as well. 4K will be the new standard; the only question is how many years it will take to become ubiquitous and reasonably priced. Obviously, 4K is not reasonably priced currently, but it will ever so slowly get there; I'm hoping it will become more commonplace within 5 years. Hopefully GPU manufacturers will also be up to the task of bringing graphics performance up to speed in the next 2-3 years.

I think the other resolution increases actually were worth it. For gaming, given the distance we sit, I think 4K is great; for movie fans who sit close to their 70-80"+ TVs, it's great too.

For the average consumer who gets a 4K HDTV? They probably won't notice it much at all.
This article explains it: http://reviews.cnet.com/8301-33199_7-57566079-221/why-ultra-hd-4k-tvs-are-still-stupid/

I'll quote the main points though:
1. I love 4K
It is my fault for trying to make a nuanced argument on the Internet. I have a 102-inch "TV" and sit 9 feet from it. I would love to have 4K. When I expand out to fill the full 10-foot-wide 2.35:1 screen, I can see pixels with some projectors. I look forward to more 4K projectors. Projectors are not TVs; 4K TVs are a waste. This is because...
2. The eye has a finite resolution
This is basic biology. The accepted "normal" vision is 20/20. In response to my previous articles on the stupidity of 4K TVs, many people argued they had better vision, or some other number should be used. This is like arguing doors should be bigger because there are tall people. Also, just because you have better vision, doesn't mean most people have better vision. If they did, it wouldn't be better, it would be average.
Try this. Go to the beach (or a big sandbox, or a baseball diamond). Sit down. Start counting how many grains of sand you can see next to you. Now do the same with the grains of sand by your feet. Try again with the sand far beyond your feet (like, say, 10 feet away). The fact that you can see individual grains near you, but not farther away is exactly what we're talking about here. The eye is analog. Randomly analog at that. So of course some people are going to see more detail than others, and at different distances, but 20/20 is what everyone knows, and it is by far the most logical place to start any discussion.
Is there some wiggle room thanks to variances in how people see? Yes, of course. Here's an awesome chart:
(Credit: Carlton Bale)
Let's skip ahead a step. Getting bogged down in the specifics misses the big picture. The eye does have a finite resolution, and if you want to argue it's better than 20/20, you're still conceding the point. You're just saying that smaller 4K TVs are viable. How much smaller? Well, not 50 inches. Probably not 60 inches, either. These are the sizes people are buying. Most people are buying even smaller TVs. Which leads to...
3. 84-inch TVs are never going to be mainstream
Never. Ever. Never ever. Like I said earlier, I have a 102-inch screen. I've also reviewed an 80-inch Sharp LCD. And let me tell you, it dominates the room. It's massive. There is a significant difference between a screen (effectively, the wall), and a Device of Unusual Size. Enthusiasts might be OK with this thing in their room, but most people won't. Ask your spouse. Ask your spouse's friends. Screen sizes have been inching upward, but not linearly with price. More specifically, the prices of big-big screens have fallen much faster than their sales have increased. I don't know what the upper limit is for what the average consumer decides is "too big" for their room, but I'm positive there is an upper limit, and this limit is far smaller than screens that need 4K.
I should clarify what I mean by "TV." I'm specifically talking about the televisions we know today. When OLED becomes something you can paint on your wall, or so paper-thin it hangs like a poster, then absolutely people will get bigger screens (presuming they're cheap). However, this is years (decades?) away. This future awesomeness is different than TVs of today. Will we still call them "TVs"? Yeah, probably, but their presence in the room will be radically different, hopefully because these future wafer-thin "TVs" won't have a presence in the room. They'll be part of the wall.
4. Viewing distance hasn't changed with HD, why would it change with UHD?
In the old days of 480i CRT tube TVs, people sat roughly 9 to 10 feet away from their TVs. There were good reasons for this (scan lines). Modern TVs offer significantly better resolution, so people can sit closer. Except...they don't. Most people still sit the same distance from their TVs as they did before.
Could people sit closer? Sure. A lot closer, actually. This ties in exactly with point No. 3. Sitting closer would be like getting a bigger screen, as it takes up more of your field of view. Just as people aren't getting as big a TV as they could, people aren't sitting closer, either.
So they can sit closer now, but don't. Why would anyone assume that because of UHD, people would suddenly sit closer. It doesn't make any sense. And just like with No. 3, I don't think most people would want to sit closer. Some of you might want to sit 5.5 feet from a 84-inch screen, but you are a tiny minority.
And speaking of viewing distance, this is precisely why comparisons to the Retina Display iPad are specious. The viewing distance is rather different between a TV and a tablet. Or, as President of DisplayMate Technologies Corp. Raymond M. Soneira says, your TV is already a Retina Display.
5. Why 4K?
Ah, now this is an interesting question. It's clear many seem to think TV manufacturers are some sort of altruistic entities that only do new things if there's a benefit to the consumer. How adorable, but no. Ultra HD isn't the "new technology" it appears. Modern TVs are made from huge sheets of "motherglass." From this big piece, companies slice up smaller pieces to make televisions. It's easier (read: cheaper) to make a big piece and cut it into smaller TVs.
Originally this was in case there was a problem with part of the glass, the rest could still be sold as TVs. When you read about "yields" as part of TV manufacturing, this is largely what they're talking about.
But manufacturing has gotten really good, so most of these pieces of motherglass are fully used. Instead of slicing up one piece of motherglass into four 42-inch 1080p LCDs, what if you just kept the whole thing as one piece? What would you have? You'd have an 84-inch TV. Use the exact same (or similar) drive elements/electronics and all the various bits, and you've got a 3,840x2,160-pixel, 84-inch UHD TV. Hey, wait.
You see, TV companies are pushing 4K because they can. It's easy, or at least easier than improving the more important aspects of picture quality (like contrast ratio, color accuracy, motion blur, compression artifacts, and so on).

I wish that this was addressed:
9. There are bigger issues
My biggest complaint about Ultra HD is what it doesn't address. Resolution is not the most important aspect of picture quality. Nor, as we've discussed, is it even a problem with current picture quality. How about improving contrast ratio, color, and compression artifacts? These all have a significantly greater effect on picture quality than resolution.
I'll add another problem to the list of things 4K doesn't address: motion resolution. All LCDs suffer from motion resolution problems, in many cases, losing upward of 40 percent of their visible resolution when anything on the screen moves. All announced (and most of the previewed) Ultra HD displays are still just LCDs, with all of that technology's shortcomings. These so-called "next-generation" televisions will still have poor off-axis picture quality and mediocre contrast ratios. They'll likely have poor picture uniformity, too, as many models are edge-lit. True, they all have higher refresh rates, but without motion interpolation, higher refresh rates do little to fix motion blur. If the drop in resolution with current LCDs is any indication (and No. 5 shows it is), these "2160p" TVs will resolve something like 1,296 lines with motion.
Perhaps this is why nearly every demo at CES of 4K and 8K TVs showed slow pans and still images. Check out "What is refresh rate?" for more on motion resolution.

Cliffs:
- Great for enthusiasts: gamers and movie buffs willing to sit close to a mammoth big-screen TV
- Not necessary for the average consumer, who won't see the benefit
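For what it's worth, the 20/20 argument above reduces to simple trigonometry. A rough sketch, assuming the usual ~1 arcminute-per-detail figure for normal acuity and a 16:9 panel:

```python
import math

# Distance beyond which a 20/20 eye (~1 arcminute per detail) can no longer
# resolve individual pixels on a 16:9 screen of a given diagonal and resolution.
def pixel_blend_distance_ft(diagonal_in, horizontal_pixels):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # screen width from diagonal
    pixel_pitch_in = width_in / horizontal_pixels
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

for diag in (50, 65, 84):
    print(f'{diag}" 1080p blends past ~{pixel_blend_distance_ft(diag, 1920):.1f} ft, '
          f'4K past ~{pixel_blend_distance_ft(diag, 3840):.1f} ft')
```

At a typical 9-10 foot couch distance, that's roughly why the article argues only very large screens (or very close seating) benefit from 4K.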
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
I have a 55" 1080p television and it's easy to see pixels. I don't get the hatred for 4K. I'd love to have a 4K TV and monitor.
Edit: That CNET review is ridiculous. He has some good points, but resolution is still very important, as 1080p is so grainy. I was a doubter of retina-like displays, but not being able to distinguish pixels is great.
 
Last edited:

wilds

Platinum Member
Oct 26, 2012
2,059
674
136
Once 4K prices start dropping, 1080p will mysteriously start to vanish. Take a walk into Best Buy and try to buy a standard-definition TV; every TV at big-box stores is HD. Once 4K and 8K break into the market, the price difference from 1080p to Ultra HD resolutions will be pretty marginal.

Also, if you buy a 4K monitor and set it to 1920x1080, it shouldn't look too fuzzy, because it scales perfectly: each 1080p pixel maps to an exact 2x2 block of 4K pixels. Even if your hardware can't run games at 4K, non-fuzzy 1080p will still look good.
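A trivial illustration of why that scaling can be clean, assuming the monitor (or GPU) does straight pixel doubling rather than a blurrier interpolation filter:

```python
# 1080p on a 4K panel is an exact 2x integer scale: every source pixel becomes
# a 2x2 block, so no interpolation (and none of the usual scaling blur) is needed.
def upscale_2x(image):
    out = []
    for row in image:
        doubled = [p for p in row for _ in range(2)]  # repeat each pixel horizontally
        out.append(doubled)
        out.append(list(doubled))                     # repeat the row vertically
    return out

tiny = [[1, 2],
        [3, 4]]
for row in upscale_2x(tiny):
    print(row)  # [1, 1, 2, 2] / [1, 1, 2, 2] / [3, 3, 4, 4] / [3, 3, 4, 4]
```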

Console-like scaling would be pretty cool, especially for budget gamers.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
With 4K being a TV standard (unlike 1200, 1440, and 1600), the price will drop pretty quickly. The graphics cards to drive it at max settings, though, will be very pricey and will stay that way for a long time. If AMD and nVidia play their cards right, everyone will be hawking $2k worth of cards to get "acceptable" performance.
 

brandon888

Senior member
Jun 28, 2012
537
0
0
I plan to get a 46" TV in a few months and will sit about 2-3 meters away. Not sure I will see any pixels at 1080p... or will I?

I mean, if I don't want to sit too close to the TV (like 1-1.5 meters), what is the point of 4K? I can't even see the difference between 720p and 1080p when I sit fairly far away, like 3-4 meters.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Because most console games look like crap and run like crap. This is coming from someone who has all three consoles and a pretty good gaming PC. Games like Crysis 3 look amazing on medium already.

Sorry... Medium is garbage to me
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Medium is not garbage in games like Crysis 3. You would have to stare at a screenshot to notice the difference there.

I think it's garbage and will only play on maximum detail settings. Again, I'm sorry my opinion bothers people.

I want the visual details. Vilify me and tell me I'm wrong all you want. If I buy a PC game, I want all the details and graphics it offers turned on.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
I think it's garbage and will only play on maximum detail settings. Again, I'm sorry my opinion bothers people.

I want the visual details.
So you act like a child, max all the sliders, and pout if it doesn't run smoothly? :rolleyes:

I could post videos all day long and you would not have a clue what the actual settings are if I did not tell you. Those with common sense reduce some settings if they don't see a visual difference. Hell, sometimes just knocking shadows down a notch or two can double or triple framerates, making a game perfectly smooth with no obvious visual difference. Keep on with your brilliant plan, though.