Future 4k gaming


Doppel

Lifer
Feb 5, 2011
13,306
3
0
4k will only have an eye-detectable increase in quality either up close or on a bloody massive screen. We already know, for example, that at say 10 feet on a 50 inch screen, any res better than 720p is essentially wasted. Google carltonbale 1080p 720p viewing distance for more.
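For what it's worth, here's a rough sketch of the viewing-distance maths behind that rule of thumb (Python; assumes a 16:9 screen and the ~1 arcminute visual acuity figure those charts use):

Code:
import math

def pixel_arcminutes(diagonal_in, vertical_pixels, distance_in, aspect=(16, 9)):
    # Angular height of one pixel, in arcminutes.
    w, h = aspect
    screen_height_in = diagonal_in * h / math.hypot(w, h)
    pixel_height_in = screen_height_in / vertical_pixels
    return math.degrees(math.atan2(pixel_height_in, distance_in)) * 60

# 50" screen viewed from 10 feet (120 inches):
for rows in (720, 1080, 2160):
    print(rows, round(pixel_arcminutes(50, rows, 120), 2))
# 720p works out to roughly 1 arcminute per pixel -- right at the resolving
# limit of normal vision -- so 1080p and 4K pixels are already finer than the
# eye can distinguish at that screen size and distance.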

It will be MANY years before anything can render a complex physics scene in real time at 1080p that looks more or less like a Blu-ray movie. Doom is now 19 years old and look how far we have come, but take your best game now and put it next to a movie and it still looks like a cartoonish joke.
 

tommo123

Platinum Member
Sep 25, 2005
2,617
48
91
pre-rendered CGI looks decent (though not for faces etc). how far away are we from GPUs rendering the latest CGI resi evil movie in real time? at 1080p i mean?

that movie had a few scenes that could have been spliced into a live action movie and i doubt people would notice.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Decades at our current pace, assuming at some point we switch to ray tracing and then radiosity lighting.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You can power 4k res now for many games; it's about 8MP, which is similar in size to 2x 2560x1600 monitors. We already have systems powering 3x1 landscape configs with these monitors. It's ultra high end stuff, but in about 2 more generations that will be do-able on a single high end card.
So, 7-10 years, then? Until it filters down to midrange hardware, it's not here, yet; and midrange hardware can still be pushed quite a bit at 1080P.
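For reference, the pixel arithmetic from the quote as a trivial Python check (using 3840x2160 for 4K; the 4096-wide cinema variant is a bit larger):

Code:
def megapixels(width, height, count=1):
    # Total pixels driven, in millions.
    return width * height * count / 1e6

print(megapixels(3840, 2160))     # ~8.3 MP, 4K UHD
print(megapixels(2560, 1600, 2))  # ~8.2 MP, two 30" panels
print(megapixels(2560, 1600, 3))  # ~12.3 MP, 3x1 landscape surround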
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Once stuff with Ray Tracing and other techniques for more realism start trickling in, we will be turning down the resolution to turn these effects on.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
So, 7-10 years, then? Until it filters down to midrange hardware, it's not here, yet; and midrange hardware can still be pushed quite a bit at 1080P.

No, pretty much just 1 additional generation to go from a single high end card to a single mid range card. With each generation being about 18 months, that means we could see mid range cards powering 4k (with today's games) in about 4.5 years' time.

The main problem with that measurement is that games will become more demanding in that time. At the moment we have a VERY big problem in PC gaming: the software is not pushing the hardware, because most of our games are aimed at console platforms running 7 year old hardware. This is why 1080p is so do-able with today's mid range cards; the graphics load hasn't increased over time like it once did, it's stagnating, and when the consoles refresh their hardware it's going to take a massive jump forwards.

The good thing about graphics horsepower is that it grows exponentially, so it doesn't take many years to make huge strides. In the 7 odd years since the console launch (about 4-5 GPU generations) we now have single video cards for PCs which are quite literally 15-20x faster by most metrics (memory bandwidth, RAM, FLOPS etc), and even faster still if you consider multi-GPU solutions.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
The other problem is that games sometimes run poorly, and it's not because the hardware is too weak for that title. They just aren't coded optimally for the PC platform.
 

richprice79

Junior Member
Dec 18, 2012
15
0
66
4k projection would be the real killer app as mentioned earlier. 100" 4K 2000 lumen LED projector. I can dream, right?
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
No, pretty much just 1 additional generation to go from a single high end card to a single mid range card. With each generation being about 18 months, that means we could see mid range cards powering 4k (with today's games) in about 4.5 years' time.
No, at least 2.

Here's a pertinent example: the Geforce GTX 480 came out in March of 2010. Today, you still can't get a card for <$200, at everyday prices, that can match it. Thanks to AMD being aggressive, you can do so around $225-250, but it still looks like it's going to be >=3 years, or about 2 generations (the refreshes coming next year from both companies).

The days of that taking only 1 generation died around the GF2/3 time frame. That was a long time ago.
The main problem with that measurement is that games will become more demanding in that time. At the moment we have a VERY big problem in PC gaming: the software is not pushing the hardware, because most of our games are aimed at console platforms running 7 year old hardware. This is why 1080p is so do-able with today's mid range cards; the graphics load hasn't increased over time like it once did, it's stagnating, and when the consoles refresh their hardware it's going to take a massive jump forwards.
For some games, yes. But it's not hard, today, to stress a card at 1080P. Due to consolitis, you need a game that supports user mods. When the games get more demanding, though, that will only make it harder for higher-res to be fast enough with some given GPU.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
4k projection would be the real killer app as mentioned earlier. 100" 4K 2000 lumen LED projector. I can dream, right?

I've had my eye on this for a while. I'd love to replace my 1080p 3LCD projector with an insane 4k one, but they're going to need to come down in price a lot; right now there's not much competition, but there will be over the next few years.

http://www.projectorpoint.co.uk/projectors/Sony_VPL-VW1000ES.html

Once these come down to the <£5000 range I'll start seriously looking at one. You're right that projectors will be the "killer app" for this; the problem is that there's a limited return on the improvement you can actually see from resolution, so it doesn't make sense to have traditional sized screens of say 24" with a resolution this high. With projectors throwing large screens with ease (mine's about 125" atm) you'll see an amazing difference.


No, at least 2.

Here's a pertinent example: the Geforce GTX 480 came out in March of 2010. Today, you still can't get a card for <$200, at everyday prices, that can match it. Thanks to AMD being aggressive, you can do so around $225-250, but it still looks like it's going to be >=3 years, or about 2 generations (the refreshes coming next year from both companies).

The days of that taking only 1 generation died around the GF2/3 time frame. That was a long time ago.
For some games, yes. But it's not hard, today, to stress a card at 1080P. Due to consolitis, you need a game that supports user mods. When the games get more demanding, though, that will only make it harder for higher-res to be fast enough with some given GPU.

As I've said a few times before, the market in the last few years has been extremely underwhelming. I think there are a few factors contributing to that, such as limits on power draw for PCI-E devices, but the primary reason is simply that we do not need the power: we have maybe 3 or 4 games that can stress modern hardware at 2560x1600, yet most people are running 1080p or less (HALF the number of pixels). I don't think there's any game that struggles to run at 1080p on a modern video card.

Most of the focus has been on how to spend that power. First we saw Eyefinity, where we simply chuck 3x more pixels at the card, which seemed impossible at first, but at that stage we still saw healthy increases in speed in the 4xxx/5xxx/6xxx era from AMD. Gamers needed a reason to buy the new high end cards. Nvidia went the PhysX and 3D route. I've personally shifted to 120Hz gaming as STANDARD; I expect 100+ FPS from my games and I still get that with a single high end card from LAST generation.

The whole market is fucked. Outside of some fringe cases (The Witcher 2, Crysis 2 cranked, maybe Metro 2033), basically 99% of games are console ports designed SPECIFICALLY for 7 year old hardware.

I think the market will pick up again and we'll see an aggressive return to large increases every generation, but it's going to take the release of the new console generation and a major step up hardware-wise. I've had flagship card(s) from every single GPU generation going back to the Geforce 4 days and I've always been an early adopter, yet the latest generation was the first I've skipped. Despite being an enthusiast and having way more disposable income now than I've ever had, I simply cannot justify wasting money when all the games are fucking lame and hardware development is so - meh...
 

Doppel

Lifer
Feb 5, 2011
13,306
3
0
Stumbled across this today on reddit, thought it was kinda relevant here.

http://www.errantsignal.com/blog/?p=399
Very interesting, lots of good points!

He knows more about this than I ever will, for sure, but I don't agree with his cost analysis around the 3:00 mark. Yes, games will get more expensive to serve up quality, but people are throwing more money at them than ever before. Black Ops 2 did a billion dollars in sales inside of two weeks. Games will continue to blow up to bigger budgets. Also, he ignores the possibility of a revolution in how artwork is created for games, maybe making it all easier and quicker.

Still, at 4:50, with that can being thrown at the guy's face, he makes a good point: delivering perfect speech, for example, while also throwing something at a person's face that could mess up their mouth while talking... this is why it will be so many years.

Of course, we all know that at least one idiot journalist while doing a piece on the next gen console will find a quote somewhere that it will be photo realistic, or at the very least "pixar like" graphics. Example: http://www.mtv.com/news/articles/1572569/playstation-3-game-made-rival-pixar-gamefile.jhtml
The hook is the image, the approaching-Pixar graphical quality.

http://www.cinemablend.com/games/Ta...soles-Bring-Development-Costs-Down-48908.html
Long story short: Software is now very capable of photorealistic graphics, as evidenced by the Brigade Engine. Making photorealistic games is not as expensive as many people think

A claim like that is funny considering photorealism in real time currently costs INFINITY, because nobody in history has ever done it. You can find demos on youtube of perhaps an incredibly expensive machine getting close to it on a single model; that's it, certainly nothing approaching an actual environment.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
Tessellate those gun barrels, seems a no brainer instead of say subdividing concrete dividers...
My thoughts exactly.

As for resolutions, they do need to go up on single screens. 2560x1600 is not even close to being enough as there's very visible aliasing without AA applied.
 

kache

Senior member
Nov 10, 2012
486
0
71
Very interesting, lots of good points!

He knows more about this than I ever will, for sure, but I don't agree with his cost analysis around the 3:00 mark. Yes, games will get more expensive to serve up quality, but people are throwing more money at them than ever before. Black Ops 2 did a billion dollars in sales inside of two weeks. Games will continue to blow up to bigger budgets. Also, he ignores the possibility of a revolution in how artwork is created for games, maybe making it all easier and quicker.

Still, at 4:50, with that can being thrown at the guy's face, he makes a good point: delivering perfect speech, for example, while also throwing something at a person's face that could mess up their mouth while talking... this is why it will be so many years.

Of course, we all know that at least one idiot journalist while doing a piece on the next gen console will find a quote somewhere that it will be photo realistic, or at the very least "pixar like" graphics. Example: http://www.mtv.com/news/articles/1572569/playstation-3-game-made-rival-pixar-gamefile.jhtml

http://www.cinemablend.com/games/Ta...soles-Bring-Development-Costs-Down-48908.html


A claim like that is funny considering photorealism in real time currently costs INFINITY, because nobody in history has ever done it. You can find demos on youtube of perhaps an incredibly expensive machine getting close to it on a single model; that's it, certainly nothing approaching an actual environment.

Photorealism is impossible in games (where things are unpredictable) without voxels+raytracing, and it's gonna be a long long time before we can do that in real time.
 

tommo123

Platinum Member
Sep 25, 2005
2,617
48
91
i can't see how games are going to get ever more expensive. surely there comes a point where the cost of a failed game or 2 ruins even a large company (i guess we're there actually).
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
My thoughts exactly.

As for resolutions, they do need to go up on single screens. 2560x1600 is not even close to being enough as there's very visible aliasing without AA applied.

But that is what AA is for. MSAA, and even SGSSAA, are always more performance-efficient than a higher resolution that provides the same visual effect.

Development of new generations will slow down anyway. Afaik, 20nm is in worse shape than 28nm was at that time and there is talk that initial Maxwell and HD9000 parts will be still 28nm. I guess the cycle will increase to 3 years between node jumps.
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,494
4
81
The other problem we are going to run into is the interconnects. Especially if we want 120hz 4k gaming. That is some serious bandwidth.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I think the market will pick up again and we'll see an aggressive return to large increases every generaton it's going to take the release of the new console generation and a major step up hardware wise.
It's also going to take magic pixie dust. The heat has to go somewhere, and the costs have to be made up somewhere. A little more demand for more powerful hardware may drop prices a little, but it's not going to fix the rest of it. We can't do it with a single GPU, right now, and perf/W improvements have only been, and are only likely to be, for some time, in the 20-40%/shrink range.

Going up in resolution efficiently is only going to work if there is a freeze on scene detail. That is something we have not yet seen.

If NV or AMD could make bigger strides for the next generation, they would, and they would take the competitive advantage it offered them straight to the bank (and, sometimes, they do). A demand spike could infuse them with cash, could lower high-end and midrange prices some, and so on, but GPU technology is not a commodity, where higher demand can magically improve the situation. It's on the bleeding edge, and fighting against physics.
 
Feb 4, 2009
34,626
15,821
136
Everyone keeps citing new consoles. I'm willing to bet all of them will be designed to minimize upfront costs (ie: less cutting edge hardware) and they will target 1080P, that is it. Hardly anybody owns a better TV at the moment and it doesn't seem likely that broadcasters will switch resolutions anytime soon. I fail to see why any console maker would push anything more than 1080P.
Also keep in mind tablets/smart phones are truly becoming the gaming kings.
Personally I'd love to see ray tracing.
 

kache

Senior member
Nov 10, 2012
486
0
71
The other problem we are going to run into is the interconnects. Especially if we want 120hz 4k gaming. That is some serious bandwidth.

(4096 × 2160 × 120 × 3 × 12) / 1024^3 ≈ 36 Gbit/s

Need to double check, but I think 2 Displayport can drive this.

EDIT:
1.62, 2.7, or 5.4 Gbit/s datarate per lane; 1, 2, or 4 lanes; (effective total 5.184, 8.64, or 17.28 Gbit/s for 4-lane link); 1 Mbit/s or 720 Mbit/s for the auxiliary channel.

Yep, can.
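For what it's worth, a quick Python version of that back-of-the-envelope calculation (raw pixel data only, ignoring blanking intervals and link overhead):

Code:
def raw_video_gbit_s(width, height, fps, channels=3, bits_per_channel=12):
    # Uncompressed pixel data rate in Gbit/s (1 Gbit = 1024^3 bits, as above).
    return width * height * fps * channels * bits_per_channel / 1024**3

needed = raw_video_gbit_s(4096, 2160, 120)   # ~35.6 Gbit/s
per_dp12_link = 17.28                        # effective Gbit/s, 4-lane link per the spec quoted above
print(round(needed, 1), round(needed / per_dp12_link, 2))
# -> just over two 4-lane links' worth of raw pixel data at 12 bits per channel.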
 

reallyscrued

Platinum Member
Jul 28, 2004
2,617
5
81
(4096 × 2160 × 120 × 3 × 12) / 1024^3 ≈ 36 Gbit/s

Need to double check, but I think 2 Displayport can drive this.

EDIT:


Yep, can.

Damn... that's... a lot of bandwidth.

To put that into perspective, that's the peak memory bandwidth of a GeForce 6800 Ultra (GDDR3, 256-bit @ 1.1 GHz effective).
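For reference, a rough sketch of where a memory-bandwidth figure like that comes from (bus width times effective data rate; note the result is in gigabytes per second, whereas the display-link numbers above are in gigabits):

Code:
def peak_mem_bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    # (bus width in bytes) x (effective data rate in GT/s) = GB/s
    return bus_width_bits / 8 * effective_clock_ghz

print(peak_mem_bandwidth_gb_s(256, 1.1))  # ~35.2 GB/s for a 6800 Ultra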
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
It's also going to take magic pixie dust. The heat has to go somewhere, and the costs have to be made up somewhere. A little more demand for more powerful hardware may drop prices a little, but it's not going to fix the rest of it. We can't do it with a single GPU, right now, and perf/W improvements have only been, and are only likely to be, for some time, in the 20-40%/shrink range.

It's called R&D; die shrinks improve efficiency, render pipelines get better, optimizations in current techniques are implemented or moved to dedicated hardware. We're nowhere near the power envelope here; if anything we're being held back by the 300W TDP that the PCI-E specs have in place, which has gimped ALL modern dual GPU video cards. These are all fixable issues. R&D costs money, money means we need sales, and sales come when people have a reason to buy new hardware year in and year out. That means we need software to push the limits; it hasn't done that for 7 years now, and there is absolutely NO WAY that hasn't had some negative impact on the whole market.

Going up in resolution efficiently is only going to work if there is a freeze on scene detail. That is something we have not yet seen.

First of all, a freeze on scene detail isn't required; we've been going up in resolution ever since the inception of 3D accelerated gaming on the PC. That said, there most definitely IS a freeze on scene complexity: any game that is targeted at the consoles looks fundamentally as bad today as it did 7 years ago. The PRIME example is the CoD MW series; the games all look exactly the same, which is to say that for games in 2012 they all look really shit. There are a few very minor improvements year to year, but because they're aimed at the console platforms there is no yearly progression in detail. They simply cannot do it; the console hardware is static.

There are a handful of niche cases when it comes to added graphics, and I've acknowledged these: Metro 2033, The Witcher 2, Crysis 2. Some of these games have a few extra DX11 effects and some high res settings, which is nice to see, but they do not represent the bulk of PC gaming today; the bulk of PC gaming today is console ports.

If NV or AMD could make bigger strides for the next generation, they would, and they would take the competitive advantage it offered them straight to the bank (and, sometimes, they do). A demand spike could infuse them with cash, could lower high-end and midrange prices some, and so on, but GPU technology is not a commodity, where higher demand can magically improve the situation. It's on the bleeding edge, and fighting against physics.

All markets are powered by money; the fight against physics takes cash, bottom line, and that needs a healthy market with good demand. Demand to upgrade isn't high now: why would the average joe with a 1080p monitor or less upgrade their mid range video card when it can run most games maxed out with all the bells and whistles, and has done for several generations?
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Agreed, would you really need anti aliasing and such at 4k? Isn't that effectively as good as our eyes can see?
I'm still waiting to see truly round cannon/gun barrels. Why is it so damn hard?

it's not about how many pixels, it's about pixel density

right around 300ppi is where we might draw a line as that is where current high end print is at, so for a ~27" personal screen we'd need more than 6K horizontal pixels to get there, let alone 4K

but yes, with ultra high pixel density aliasing would be less of a problem; granted, the sheer number of pixels is going to be a new problem, as we're talking over 9x as many pixels as 1080p to get to such a resolution
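A rough Python version of that arithmetic (assuming a 16:9 panel; taken all the way to 300 ppi, the multiple over 1080p actually comes out well past 9x):

Code:
import math

def pixels_for_ppi(diagonal_in, ppi, aspect=(16, 9)):
    # Horizontal and vertical pixel counts needed to hit a given pixel density.
    w, h = aspect
    diag_units = math.hypot(w, h)
    return (round(diagonal_in * w / diag_units * ppi),
            round(diagonal_in * h / diag_units * ppi))

w, h = pixels_for_ppi(27, 300)
print(w, h)                             # ~7060 x 3970 -- past "6K" horizontally
print(round(w * h / (1920 * 1080), 1))  # ~13.5x the pixels of 1080p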
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
It's called R&D; die shrinks improve efficiency, render pipelines get better, optimizations in current techniques are implemented or moved to dedicated hardware.
That's where the improvements mentioned are coming from.

You seem to think that if every game were more demanding, somehow they would have 10x the R&D budgets for the GPUs, and then the same for the processes they're being built on, and that even if they had that, they'd get 10x the improvements, too.

We're nowhere near the power envelope here; if anything we're being held back by the 300W TDP that the PCI-E specs have in place, which has gimped ALL modern dual GPU video cards. These are all fixable issues.
If we're stuck at 300W/card, it's not gimping, and they aren't fixable issues. Electricity isn't free either, and powerful PSUs are more expensive, along with multiple cards. The idea that we could just decide to ignore power issues is total BS.

We're not being held back in a reasonably fixable way by the 300W/card TDP. Even 300W is insane; 300W should be a whole gaming PC's peak power draw. You're in a fantasy land if you think a paper spec of 300W is holding us back. The number of cards that would be sold at >300W would be in the thousands, vs. millions to tens of millions for cards with decent thermals and noise characteristics. Most people don't want the computer in another room, or to need IEMs, a dedicated A/C unit, or a second job (depending on country), just to use their PCs.

It's holding us back in the same way most cars not having turbo 5L+ V8s is holding us back: sensible people don't want any of it.

First of all a freeze on scene detail isn't required,
To rapidly increase resolution, without rapidly dropping framerates, yes, it is. Otherwise, it won't be equally playable as the lower res. Increasing detail will only serve to increase the needed computational performance beyond that needed just for the higher resolution.

How are you figuring that we'll have some amazing disruptive surge in capability? The last time we went up 4x in pixel count (800x600 to 1080P) took about 10 years, and that rate hasn't been increasing.

That said, there most definitely IS a freeze on scene complexity: any game that is targeted at the consoles looks fundamentally as bad today as it did 7 years ago. The PRIME example is the CoD MW series; the games all look exactly the same, which is to say that for games in 2012 they all look really shit. There are a few very minor improvements year to year, but because they're aimed at the console platforms there is no yearly progression in detail. They simply cannot do it; the console hardware is static.
They look bad because they're Madden for twitchies; they were never made to look good. BF is also more or less Madden for twitchies, but made to look good and to stress modern PC hardware. Skyrim and DE:HR would be good counterexamples, IMO.

All markets are powered by money; the fight against physics takes cash, bottom line, and that needs a healthy market with good demand. Demand to upgrade isn't high now: why would the average joe with a 1080p monitor or less upgrade their mid range video card when it can run most games maxed out with all the bells and whistles, and has done for several generations?
Because that's not the case. There are games right now that take a nice video card to look good at 1080P. You won't get perfect play for several generations. A midrange card today can only barely do decent 1080P.