Have we reached "Good Enough" in the GPU space?

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
The pace of progress in visual fidelity has slowed down significantly.

Part of this is due to technical difficulties, as complexity increases.

Another part is, of course, consoles. Most AAA-games are going to be cross-platform by necessity to increase revenue streams.

You can't widen the gap between console and PC versions too much, or console players will feel cheated. Think about a game like Battlefront, which is stunning yet runs quite well on a sub-$250 GPU like the R9 290. Or Aliens vs Predator, which came out a year ago: a stunning game that ran fine on mid-range hardware.

The truth is, the GPUs we're already using in the PC space are quite old and "good enough" for visually appealing titles. With Pascal/Arctic Islands coming out, the danger is that people won't upgrade to the extent they otherwise would, because games are unlikely to get dramatically more demanding going forward.

Yes, enthusiasts will continue to splurge on high-end GPUs, either for higher resolutions or for VR. But keep in mind that the vast majority of people are at 1080p. The 1440p-and-above crowd is minuscule according to the Steam survey (even when controlling for laptops).

And it's the mainstream that drives sales, not the high-end market (which provides the margins/profit).

There's simply no way a new game in 2016/2017 is going to fully stress a mid-range Pascal (which is likely to be close to a 980 Ti in performance). That would mean a PC version that permanently relegates the console version to second-class status, which is never going to happen for any cross-platform AAA game.

We've reached the "good enough" phase for GPUs. I'll buy expensive GPUs for VR and for PC exclusives like SC, but for the vast majority of people, a 970/390 bought today is going to last even longer than a 7970 GHz has over the last 4 years.
 
Feb 19, 2009
10,457
10
76
An Intel HD4000 is good enough for most PC usage.

A 960 is good enough for most PC gamers.

It'll never be "good enough" for PC gaming enthusiasts.

1440p was yesterday's cutting edge, 4K is the target.

As games improve in visual fidelity, scope, and simulation, hardware has to keep up. By the time 4K is playable at high settings on a single GPU, the next leading target will be VR, with a minimum of 2x2K at 90-120 Hz. But that's just the entry point.

VR folks are talking about 16K per eye at 120 Hz+ as the "holy grail" of the VR experience.
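For a sense of scale, here is a rough pixel-throughput comparison of those targets. This is a back-of-the-envelope sketch: "2K per eye" is taken as roughly 2048x1080 and "16K" as 15360x8640 (both assumptions), and real GPU load obviously also depends on per-pixel shading cost, not just pixel count.

```python
# Rough pixel-throughput comparison for the display targets discussed above.
# The VR resolutions are assumptions (2K ~= 2048x1080 per eye, 16K = 15360x8640 per eye).

targets = {
    "1080p @ 60 Hz":                    (1920, 1080, 60, 1),
    "1440p @ 144 Hz":                   (2560, 1440, 144, 1),
    "4K @ 60 Hz":                       (3840, 2160, 60, 1),
    "VR entry: 2x 2K @ 90 Hz":          (2048, 1080, 90, 2),
    "VR 'holy grail': 2x 16K @ 120 Hz": (15360, 8640, 120, 2),
}

baseline = 1920 * 1080 * 60  # 1080p at 60 Hz, in pixels per second

for name, (w, h, hz, eyes) in targets.items():
    pps = w * h * hz * eyes  # pixels per second for this target
    print(f"{name:36s} {pps / 1e9:7.2f} Gpix/s  ({pps / baseline:6.1f}x 1080p60)")
```

The "holy grail" line works out to a few hundred times the raw pixel rate of 1080p60 before any increase in per-pixel work.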
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
There's no reason to believe that the options that we have today are "good enough" for the future ...

Case in point: what if developers start introducing some Monte Carlo-based global illumination? ;)

(It's very easy to implement a brute-force ray tracing solution for today's games when they're based on physically measured material reflectance/transmittance parameters.)
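To make that concrete, here is a toy sketch of what Monte Carlo global illumination means at its core: average many random light samples per shading point, and accept noise that shrinks only as the sample count grows. The "scene" and material below are made-up placeholders, not anything from a real renderer.

```python
# Toy Monte Carlo estimate of indirect diffuse lighting at a single surface point.
import math
import random

def cosine_sample_hemisphere():
    """Cosine-weighted random direction in a local frame whose normal is +Z."""
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

def incoming_radiance(direction):
    """Placeholder environment: bright 'sky' well above the horizon, dim elsewhere.
    A real path tracer would trace this ray against actual scene geometry instead."""
    return 1.0 if direction[2] > 0.2 else 0.1

def estimate_diffuse_bounce(albedo, samples):
    """Monte Carlo estimate of diffusely reflected radiance for a Lambertian surface.
    With cosine-weighted sampling, the cosine and pdf terms cancel, so the estimator
    reduces to albedo times the average incoming radiance over the samples."""
    total = sum(incoming_radiance(cosine_sample_hemisphere()) for _ in range(samples))
    return albedo * total / samples

# More samples -> less noise; paying this per pixel, per frame is what makes it expensive.
for n in (4, 64, 1024):
    print(f"{n:5d} samples: {estimate_diffuse_bounce(albedo=0.8, samples=n):.3f}")
```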
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Devs have proven time and time again that no matter what the consoles have, and no matter how fast the GPUs are, they will find a way to push them. They may have budgets and requirements to meet for sponsors/producers, but once those are met, there are always devs who want to see what is possible and push the PC.

This conversation has occurred many times over the last 15 years, yet they keep pushing further.

The fact that most people are on 1080p just means the devs push IQ further, and those on 1440p and 4K will need faster and faster GPUs to keep up. As long as devs treat 1080p as the target resolution, there will always be games that require multiples of the fastest GPUs to maintain good, playable FPS at higher resolutions.
 

Pinstripe

Member
Jun 17, 2014
197
12
81
For the 1080p gamers, yes.

My R9 290 rocks everything and will do so throughout the consoles' lifetime, provided you're not going for some crazy (and useless) Ultra settings the developers have shoehorned into their games.
Same ISA as the consoles, and specs roughly twice (or more) as powerful as the PS4's (30 fps -> 60 fps). That should serve most modest 1080p gamers.

Except enthusiasts, who will shell out $2,000 every year on a new Titan/Fury SLI setup and then complain about diminishing returns. Rinse and repeat.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
When I can crash a jetliner into a neighborhood in GTA 20 or whatever, and destroy an entire city block with near-perfect physical accuracy, while running 240 fps on my dual 8K OLED VR headset, then I might be somewhat satisfied. But probably not.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
The short answer is yes, the long answer is more complicated.

One of the biggest issues is that we got extremely weak consoles, and that people are stuck at 1080p gaming. That combination means you can run most games at good or acceptable levels with relatively weak hardware.
 

ShadowVVL

Senior member
May 1, 2010
758
0
71
If you are a dinosaur like me and still think N64 graphics look good, then yes.
I will spare you the wall of text and just say that, for me, modern graphics lack imagination and just don't appeal to me.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
The short answer is yes, the long answer is more complicated.

One of the biggest issues is that we got extremely weak consoles, and that people are stuck at 1080p gaming. That combination means you can run most games at good or acceptable levels with relatively weak hardware.

They are weak because the entire gaming industry puts cutting costs first and pushing the tech to its fullest a distant second. We are at the point where dev costs, rather than available hardware compute power, are the primary limiting factor. Even more so on mobile.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
The pace of progress in visual fidelity has slowed down significantly.

Part of this is due to technical difficulties, as complexity increases.

Another part is, of course, consoles. Most AAA-games are going to be cross-platform by necessity to increase revenue streams.

You can't widen the gap between console and PC versions too much, or console players will feel cheated. Think about a game like Battlefront, which is stunning yet runs quite well on a sub-$250 GPU like the R9 290. Or Aliens vs Predator, which came out a year ago: a stunning game that ran fine on mid-range hardware.

The truth is, the GPUs we're already using in the PC space are quite old and "good enough" for visually appealing titles. With Pascal/Arctic Islands coming out, the danger is that people won't upgrade to the extent they otherwise would, because games are unlikely to get dramatically more demanding going forward.

Yes, enthusiasts will continue to splurge on high-end GPUs, either for higher resolutions or for VR. But keep in mind that the vast majority of people are at 1080p. The 1440p-and-above crowd is minuscule according to the Steam survey (even when controlling for laptops).

And it's the mainstream that drives sales, not the high-end market (which provides the margins/profit).

There's simply no way a new game in 2016/2017 is going to fully stress a mid-range Pascal (which is likely to be close to a 980 Ti in performance). That would mean a PC version that permanently relegates the console version to second-class status, which is never going to happen for any cross-platform AAA game.

We've reached the "good enough" phase for GPUs. I'll buy expensive GPUs for VR and for PC exclusives like SC, but for the vast majority of people, a 970/390 bought today is going to last even longer than a 7970 GHz has over the last 4 years.

My other hobby aside from some PC gaming is CGI. Even the latest PC games look like utter crap to me compared to the results from Blender's Cycles path-tracing engine. There isn't a card alive that can do good-quality path tracing at 720p and 60 fps in even simple scenes, let alone at 4K and beyond with complex shaders thrown in.
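A quick back-of-the-envelope on that claim; the samples-per-pixel and bounce counts below are illustrative assumptions, not benchmark numbers.

```python
# Ray budget needed for real-time path tracing at a given resolution and frame rate.
# Samples-per-pixel (spp) and bounce depth are illustrative assumptions.
def rays_per_second(width, height, fps, spp, bounces):
    return width * height * fps * spp * bounces

for label, (w, h) in {"720p60": (1280, 720), "4K60": (3840, 2160)}.items():
    budget = rays_per_second(w, h, fps=60, spp=64, bounces=4)
    print(f"{label}: {budget / 1e9:6.1f} Grays/s at 64 spp, 4 bounces")
```

Even at those fairly modest settings, the budget works out to over ten billion rays per second at 720p60, and roughly nine times that at 4K60, which is far beyond what current GPUs sustain in ray-tracing workloads.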

So no, we are nowhere close to the "good enough" mark.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
They are weak because the entire gaming industry puts cutting costs first and pushing the tech to its fullest a distant second. We are at the point where dev costs, rather than available hardware compute power, are the primary limiting factor. Even more so on mobile.

There's more to it than that, actually. When the PS3 and 360 launched, they weren't even drawing 175 watts under load. Remember that this was with cutting-edge hardware, and that MS and Sony still had cooling issues. On top of that, it was before anyone gave a shit about power consumption. Sony/MS were put in the position of making a console with similar power characteristics.

I mean, we all would have loved to see Sony cram a Bulldozer and a 290X into the PS4, but think about how hard that would have been to do reliably, even without taking power consumption into consideration. You'd have to account for all the people out there cramming their new PS4 into a carpeted corner somewhere, with the intake vents getting caked in dog hair, etc. It would have been a nightmare.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
There is no "good enough" for GPUs. They are different from CPUs in that way. In 2017 there will be games that simply run like crap on anything below mid-to-high-end Pascal or the AMD equivalent, even at 1080p with medium settings. GPUs don't last long.
The GTX 570 came out around the same time as the Core i7-2600K. One of those parts is perfectly good today, and the other is a piece of useless crap for modern games, even at 1080p medium settings.
 

MongGrel

Lifer
Dec 3, 2013
38,466
3,067
121
Not meaning to fan the flames too much, really, but I haven't played a game on a console in almost 30 years.

I haven't considered them cutting edge for a long time; they tend to have short life cycles.

Good for sales, though.

PCs are dead again ...

():)
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
More like Crytek's frail excuses for releasing a turd like Ryse. Whatever that game's graphics looked like, it failed as a game and no one played it after 5 minutes. I guess that means actual gameplay is more important than graphics.

It wasn't that bad a game, just kind of a mindless beat-em-up. I kinda enjoyed it; it's nice sometimes to play something that can be beaten with half my brain cells. Heck, it only took me like 7 hours to get through.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
An Intel HD4000 is good enough for most PC usage.

A 960 is good enough for most PC gamers.

It'll never be "good enough" for PC gaming enthusiasts.

1440p was yesterday's cutting edge, 4K is the target.

As games improve in visual fidelity, scope, and simulation, hardware has to keep up. By the time 4K is playable at high settings on a single GPU, the next leading target will be VR, with a minimum of 2x2K at 90-120 Hz. But that's just the entry point.

VR folks are talking about 16K per eye at 120 Hz+ as the "holy grail" of the VR experience.
So at least 5-10 more years of GPUs?
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
When I can crash a jetliner into a neighborhood in GTA 20 or whatever, and destroy an entire city block with near-perfect physical accuracy, while running 240 fps on my dual 8K OLED VR headset, then I might be somewhat satisfied. But probably not.

Agreed. 8K @ 240 Hz is where I think we'd truly see diminishing returns, but we also need graphics and effects to rise along with it. We have decades to go. I recently upgraded from 1080p 60 Hz to 1440p 144 Hz on the same size display, and as amazing as it is, I want even more PPI and Hz.

So at least 5-10 more years of GPUs?

5-10 years to get dual 16K at 120 Hz+? 10 years ago most people here had 1600x1200 at 85 Hz, 1280x1024 at 60 Hz, 1680x1050 at 60 Hz, or at most 1920x1200 at 60 Hz. Today we are only just getting 2560x1440 at 144 Hz and 3840x2160 at 60 Hz. Dual 16K is 32 times the pixels of 2160p. If you want it at 120 Hz+, and you do, then double that number again for the hardware grunt needed. So we have to go roughly 64x from today. Even if we pretend everyone 10 years ago played at 1280x1024, going from that to 2160p is only a 6.3x pixel increase - and at the same Hz.
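The arithmetic checks out; worked through below, taking "16K" as 15360x8640 and "2160" as 3840x2160.

```python
# Pixel-count ratios behind the estimate above ("16K" taken as 15360x8640 per eye).
uhd      = 3840 * 2160        # 4K / 2160p
dual_16k = 2 * 15360 * 8640   # two 16K panels, one per eye
sxga     = 1280 * 1024        # a common resolution ~10 years ago

print("dual 16K vs 4K, pixels:  ", round(dual_16k / uhd, 1))      # ~32x
print("...and 120 Hz vs 60 Hz:  ", round(dual_16k / uhd * 2, 1))  # ~64x
print("4K vs 1280x1024, pixels: ", round(uhd / sxga, 1))          # ~6.3x
```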

Decades.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
The short answer is yes, the long answer is more complicated.

One of the biggest issues is that we got extremely weak consoles, and that people are stuck at 1080p gaming. That combination means you can run most games at good or acceptable levels with relatively weak hardware.

It is this way because most people with consoles play on their TVs, and 4K TVs are not mainstream yet. I'm not sure they will be for some time either, given how long it will take networks to start broadcasting content in 4K. Not every network does 1080p yet, opting instead for 720p.

So, in that regard, as the OP suggests, 970/390 owners are going to find those cards last a long, long time. Next year's Pascal will last even longer.

What I fully expect to happen to counter this and further drive innovation and sales is that prices of 1440p 60 Hz monitors will start to fall dramatically to increase market penetration, and 1080p will go the way of 720p in the PC space sometime around 2017, as 1440p 144 Hz / 4K 60 Hz becomes the enthusiast standard. It will take a long time for 4K to become the norm. I imagine it eventually will, but how long? No clue. 4K @ 144 Hz... heh. Talk about realism? GPU tech would be scary by the time that is a possibility.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
5-10 years to get dual 16K at 120 Hz+? 10 years ago most people here had 1600x1200 at 85 Hz, 1280x1024 at 60 Hz, 1680x1050 at 60 Hz, or at most 1920x1200 at 60 Hz. Today we are only just getting 2560x1440 at 144 Hz and 3840x2160 at 60 Hz. Dual 16K is 32 times the pixels of 2160p. If you want it at 120 Hz+, and you do, then double that number again for the hardware grunt needed. So we have to go roughly 64x from today.

For today's level of quality, anyway. If we dropped VR quality to Quake 3 levels, current hardware could probably do it.