Today's GPUs and Tomorrow's AAA Games

oleguy

Member
Oct 30, 2013
96
0
16
It seems that a variety of technology changes are converging in the next 12 months that could cause a nightmare for anyone wanting to buy something today that will still be viable tomorrow.

On the game development side, the Xbox One and PS4 are coming out in a couple of weeks. For the last few years, PC gamers have been blaming the limited feature-set of the current-gen consoles for the slow pace of quality improvement in graphics. Some PC-only games have pushed the envelope, but those aren't as common, and the next gen consoles will be using an AMD-derived x86 platform that has the ability to make cross-platform development less expensive.

Displays are finally starting to increase in quality, and since the next gen consoles have 4K in mind (if not immediately), it would seem that games would at least be upscaled to look good on 4K displays, if not actually rendered at 4K at some point. Such content might be more than a year away, but 4K TVs at least seem to be meeting with more consumer demand than the failed 3D TV push (granted, anything larger than a rounding error is still an increase…)

Finally, GPUs are going to be asked to drive 4K displays for people with a lot of money today, and moderate amounts of cash tomorrow. Based on the results of the GTX 780 Ti, 4K seems like a bridge too far. Even in SLI or CF configurations, today's architectures appear to be hitting a wall, not to mention the noise and heat output.

Taken together, does it even make sense to buy high-end today if 4K is a possibility in less than two years? Given how quickly cards turn over, dropping $500 today to drive a 1920x1080/1200 display seems like overkill, with the added bonus that the card is still too impotent to drive a 4K display. And buying one today in hopes of adding a second for SLI/CF tomorrow means you are at the mercy of that product line still existing in a year, rather than just buying leftover stock that won't see a price cut when the next generation of GPUs comes out.

What are your thoughts on all these technology changes for someone who is upgrading in the near term? Does anyone have a feel for the pulse in these areas and can actually speak to the realities of 4K uptake and game development?
 

Spidre

Member
Nov 6, 2013
146
0
0
The resolution of the television won't matter; the consoles will still be outputting 720/900/1080p that's been upscaled.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Don't plan for the future, there is no such thing as future proofing when it comes to computers.

Instead, plan for the now. Look at the best bang/buck for various price points, then pick the price point you want to afford, and do it.
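
If it helps, here is roughly what that bang/buck comparison looks like when you write it down (a quick Python sketch; the card names, prices, and fps figures below are made-up placeholders, not benchmark results, so plug in street prices and averaged review numbers for the games you actually play):

Code:
# Compare value per dollar across candidate cards at the price points you can afford.
candidates = {
    # name: (street price in dollars, average fps in your games) -- placeholder numbers
    "Card A": (250, 45),
    "Card B": (400, 60),
    "Card C": (550, 70),
}
for name, (price, fps) in sorted(candidates.items(),
                                 key=lambda kv: kv[1][1] / kv[1][0],
                                 reverse=True):
    print(f"{name}: {fps / price:.3f} fps per dollar at ${price}")

Whichever card tops that list at a price you're comfortable with is the "best bang/buck for the price point you want to afford" in practice.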
 

Childs

Lifer
Jul 9, 2000
11,450
7
81
Dell's Ultra HD monitor looks to be priced around $5500. If you are going to spend that kind of money on a display, and expect to play games at decent frame rates, I wouldn't think another $2K in multiple GPUs is gonna kill you. 4K won't be affordable for mainstream users for some time, and by then, single-GPU cards will be better equipped to handle the resolution. No point in worrying about 4K now, as it's really only for content creation for the next few years. If you are buying now, buy for what you will be doing until then.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
If you use high resolution as a metric for your decision, then the R9 290/X seems like the way to go. It is hard to know how game engines will evolve over the coming years, but Mantle is definitely a wild card.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Next gen consoles will not run games at 4K, period. The XBone barely plays games at 720p. The PS4 is better off and is running several games at 900p and 1080p. But 4K is out of the question.

4K TV prices are going to drop pretty fast over the next 2 years, and we will certainly see some in the $1500 range in another 2 years. That's when 4K will start to take off from a TV point of view. BUT... there is no 4K content out there. Most cable/sat channels are still 720p or 1080i, and even then they have compression artifacts. It's going to be a LONG time before native 4K content is common.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Nightmare how? Gamers (gamers who play AAA games at least, not Farmville) are on either 1920x1080/1200 or 2560x1440/1600. Cards exist that can handle both well. 4K needs a whole lot more content and interest, otherwise it will flop harder than 3D. The expected bump in game quality/assets/size/immersion won't be a problem for PC gamers either; if you have a grunty card you likely have a modern CPU with 16GB of RAM and enough HDD space. The next gen consoles should have come out with more RAM and more CPU grunt than they have now, methinks. 8GB of RAM is going to be just as limiting in 7 years as 512MB is now.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76

If you have to ask, the short answer is "no." Too many unknowns about console port requirements, 4K adoption rates, support for Mantle vs. G-Sync, etc. I would just wait things out and keep using whatever you have until it starts being unacceptably slow. In a year we'll be on 20 nm GPUs, so it makes even less sense to blow big bucks on a 28 nm card right now in this time of flux.

If you don't have to ask, it means money is not a factor, and thus you are filthy rich and will just buy the latest and greatest anyway, so go ahead and buy the latest and greatest and keep upgrading when new stuff comes out.
 

atticus14

Member
Apr 11, 2010
174
1
81
Yeah, you can't really future-proof a system, and I agree that a lot of new things are coming that will change the landscape a bit, but in all honesty you should only buy what you can afford. 4K isn't going to matter to a lot of people because it's just not going to be affordable. Not only will the TVs/monitors come at a premium for 3-5 more years, but you've also got to have the GPU power to push them, and games are only getting more demanding.

Going with a bang-for-buck system, staying in the know, and aggressively selling your parts when the time comes so you can invest in something new at minimal loss will net you optimal performance at minimal cost.


@escrow4
4K won't flop because there is no reason for it to flop; it's the new industry standard, unlike 3D, which was an optional extra feature that may or may not be supported.

Panel makers are going to make the switch because eventually there will be minimal difference in cost vs 1080p. Adoption rates may be low at first, but there's no doubt that 4K panels will eventually be just as common as 1080p, even if content is sparse.
 

Aithos

Member
Oct 9, 2013
86
0
0

I worked in sales/management before and after 1080p, and what I can tell you is this: game developers, for good or bad, revolve around "average." They want to develop their games for as little money as possible and reach the largest audience. As passionate as PC gamers are, we aren't that audience. PC games that were actually native 1080p and not upscaled were a long time coming.

I've said this in other threads, but the same thing is going to be true of 4K, only to an even more extreme degree. The hardware coming out today will be long dead before 4K is really taken advantage of by any major game engine. There is literally zero chance that native 4K games are out within the next two years. If you don't believe me, here is just one of several reasons, *besides* the fact that market saturation will take at least 5 years (just like it did for 1080p):

1) Games are developed on a multi-year schedule. Most AAA titles are in development for 3-6 years; some (like Diablo III, the WoW successor, FFXIV, etc.) are in development even longer. While 3D artists at major studios are obviously on high-end workstations, they are still limited by what hardware can do. Add that to the fact that engine developers (coders included) are at the mercy of what their hardware can do, and you will always have a lag of at least 4-5 years from when a technology hits the market to when content that utilizes it is readily available.

If you look back at tech improvements in regard to resolution and games, you will see that this holds true. I'm building a system right now to run 1440p @ 120Hz; that is the next step in gaming as far as I'm concerned, and SLI 780 Tis are the first option that, with overclocking, *appears* capable of maxing out settings at those kinds of framerates. I'll worry about 4K when I can get a 120Hz 4K monitor for under $500; until then, the benchmarks at 4K are just that: benchmarks.
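
For a rough sense of what that target demands, here is a quick back-of-the-envelope comparison of raw pixel throughput at 1440p @ 120Hz versus 4K @ 60Hz (just arithmetic on the resolutions and refresh rates mentioned above, nothing measured):

Code:
# Raw pixels-per-second demanded by the two targets discussed above.
modes = {
    "2560x1440 @ 120 Hz": 2560 * 1440 * 120,
    "3840x2160 @  60 Hz": 3840 * 2160 * 60,
}
for name, px_per_sec in modes.items():
    print(f"{name}: {px_per_sec / 1e6:.0f} Mpixels/s")
# 2560x1440 @ 120 Hz: 442 Mpixels/s
# 3840x2160 @  60 Hz: 498 Mpixels/s

In other words, driving 1440p at 120Hz already demands most of the raw pixel throughput of 4K at 60Hz (ignoring per-frame shading differences), which is why overclocked SLI 780 Tis look like the minimum viable option for it.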
 

Seba

Golden Member
Sep 17, 2000
1,485
139
106
Somewhat off-topic: 4K is not needed for games. There is still a huge amount of room for improvement in image quality in games at 1080p. Just compare, for instance, a tree from a game in 1080p with a 1080p video of a real-life tree, or a video game character with a video of a real-life human, both in 1080p.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Some of us are on 3 x 1080p which is 3K. Some are even on 3 x 1440p or 3 x 1600p. Others are on 120Hz monitors. Heck, some are on 3 x 120Hz monitors. Please consider this next time you claim 4K benchmarks aren't useful.
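
To put rough numbers on that (just the raw pixel counts of those setups versus a single 4K panel, computed from the standard resolutions):

Code:
# Raw pixel counts of common multi-monitor setups vs a single 4K panel.
setups = {
    "1x 1920x1080":      1920 * 1080,
    "3x 1920x1080":      3 * 1920 * 1080,
    "3x 2560x1440":      3 * 2560 * 1440,
    "3x 2560x1600":      3 * 2560 * 1600,
    "1x 3840x2160 (4K)": 3840 * 2160,
}
for name, pixels in setups.items():
    print(f"{name}: {pixels / 1e6:.1f} Mpixels")
# 1x 1920x1080:       2.1 Mpixels
# 3x 1920x1080:       6.2 Mpixels  (~75% of 4K)
# 3x 2560x1440:      11.1 Mpixels  (more than 4K)
# 3x 2560x1600:      12.3 Mpixels
# 1x 3840x2160 (4K):  8.3 Mpixels

A triple 1080p surround setup is already around three quarters of 4K's pixel count, and triple 1440p/1600p setups exceed it, so 4K benchmarks are a reasonable proxy for loads people are already running today.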
 

oleguy

Member
Oct 30, 2013
96
0
16
Now, one thing that may or may not be true is that the textures they use for the PS4 and Xbox One could be designed to scale to a higher resolution, even if the next-gen consoles can't actually render at it. I say this only because, well before we had 1080p games or content, Blu-ray players were upscaling DVD content to 1080p since the displays could support it. And some of what we saw was crap. Knowing this and how it was received, it does seem like at least AAA game designers would create textures that look fine upscaled, which raises the question of whether looking good upscaled is the same as actually being able to render those higher-resolution textures on a PC capable of doing so.

As someone pointed out, the main limiting factor would likely be VRAM, so today's and tomorrow's GPUs will probably be memory-limited, requiring either that the bus be expanded to an absurd width or that the memory be clocked well beyond what even the new 780 Ti can do, at least if simply adding more VRAM isn't cost-effective. Possible with the next generation of GPUs... but probably not that likely.
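
For context, GDDR5 bandwidth is just bus width times effective data rate, so the two knobs mentioned above are easy to see (a quick sketch; the 780 Ti and 290X lines use their published specs, while the last line is a hypothetical combination, not an announced product):

Code:
# Memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(384, 7.0))  # GTX 780 Ti: 384-bit @ 7 Gbps -> 336 GB/s
print(bandwidth_gbs(512, 5.0))  # R9 290X:    512-bit @ 5 Gbps -> 320 GB/s
print(bandwidth_gbs(512, 7.0))  # hypothetical wider + faster  -> 448 GB/s

So getting meaningfully past today's roughly 320-336 GB/s means widening the bus, raising the memory clock, or both, on top of whatever VRAM capacity 4K textures end up needing.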

With that in mind, the 780 seems to be overkill for driving a 1920x1200 monitor... With Vsync on and quality panels topping out at 60Hz, going north of 60fps takes more than just a beefy GPU.
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
I always find these arguments a little bit silly. I can assure you that game devs make their games so they can play on even mid-range hardware from the past three years at some acceptable settings. The high-end card market is just too small for a game dev to make a game that is only enjoyable on the latest generation of $500+ GPUs. The current cards are so much more powerful than the consoles (which are roughly 7790-7850 performance) that a relatively low level of performance will be the baseline for the ports.

If the GTX 980 Ti comes out in 2015 or 2016 and is much faster at 4K than the 780 Ti, of course the 780 Ti will look bad in the 4K/Ultra gaming charts. However, nobody is stopping the GTX 780 Ti owner in the future from playing at 1080p/high settings (and those high settings may look as good as or better than ultra settings in today's games). I would anticipate that the GTX 780 Ti will be able to run most games at 1080p/high for quite some time.

Similarly, I am using a GTX 570 right now and it is totally playable at 1080p/high in most games, despite the Anandtech reviews being focused around 1440p/Ultra, which makes this card look slow. I think enthusiast-segment cards will tend to be good for three years, but only on lower settings in the most demanding games during the last year of the card's life.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
It sounds like you are in a spot where buying a good mid-range card makes sense for now. Next-gen games, followed by 4K gaming making a real impact on PC gaming, are years away. A good mid-range card will carry you quite far from where we are now if you aren't overly concerned about cranking detail settings and using high resolutions.
Also, if you have a 1080p TV, then buying a next-gen console would likely be a cost-effective way to go that will bring you years of hassle-free gaming enjoyment.
 

oleguy

Member
Oct 30, 2013
96
0
16
So, I may have gone too far afield by bringing up 4K, as there are other technologies, driven by the next-gen consoles, that are likely to impact gaming. Does anyone foresee games making a quantum leap in graphical quality at 720p or 1080p that would choke a 770 or R9 280 in the next 12 to 18 months? As best as I can tell, those resolutions are still playable with high detail, if not max.