Is 4K gaming "peak gaming"?

StinkyPinky

Diamond Member
Jul 6, 2002
6,763
783
126
Volta will be 10nm and will probably be around 20% faster than Pascal. So assuming they can shrink down to 5nm with current tech (debatable?), I would suggest that even that wouldn't be able to power 8K gaming on a single GPU (which is what 95% of gamers use).

Would some sort of 4K/8K checkerboard tech work?
 

Atom_101

Junior Member
Apr 14, 2017
12
0
6
GPUs will probably move to 3D dies to increase their computing power once die shrinks are no longer possible. You know, just like SSDs have gone to 3D NAND instead of planar NAND.

This is just speculation. I am no expert.
 

lehtv

Elite Member
Dec 8, 2010
11,900
74
91
4K @ 60Hz is doable with a GTX Titan X Pascal. 8K is four times as many pixels to render, and since a doubling in pixel count requires about 70% more GPU performance to maintain the same fps, you need a card almost 3 times as fast to run 8K @ 60 Hz smoothly. On 4K, Titan X Pascal is almost 3 times as fast as the original Titan (techpowerup, see GTX 970~Titan vs GTX 1080 Ti~Titan XP). These cards are two generations and 3½ years apart, so assuming similar increases in performance for the next two generations, single GPU 8K @ 60 Hz gaming will be possible in about 3 years with the release of the high end card in whatever generation comes after Volta.

Of course, the assumption that we will continue to get +70% per generation in high-resolution performance may be wrong. Perhaps due to diminishing returns it will take three generations before we get a card that runs 8K @ 60 Hz, perhaps four. Either way, it's just a matter of time.
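If you want to poke at the numbers, here's a rough back-of-the-envelope sketch of that estimate. The 1.7x-per-pixel-doubling and 1.7x-per-generation figures are just the assumptions from this post, not measurements:

Code:
import math

# Assumptions from this post (not measurements): each doubling of pixel
# count needs ~1.7x the GPU performance, and each GPU generation delivers
# ~1.7x more high-resolution performance than the previous one.
PIXELS_4K = 3840 * 2160              # ~8.3 MP
PIXELS_8K = 7680 * 4320              # ~33.2 MP, i.e. 4x as many
PERF_PER_DOUBLING = 1.7
PERF_PER_GENERATION = 1.7

doublings = math.log2(PIXELS_8K / PIXELS_4K)              # 2.0
perf_needed = PERF_PER_DOUBLING ** doublings              # ~2.89x a Titan X Pascal
generations = math.log(perf_needed, PERF_PER_GENERATION)  # ~2 generations

print(f"8K @ 60 Hz needs ~{perf_needed:.2f}x the GPU, about {generations:.0f} generations out")

At roughly two generations per 3½ years, that lines up with the ~3 year guess above.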

Would some sort of 4K/8K checkerboard tech work?

No idea what that would even entail or why you would want to do that instead of just running 8K at lower graphics settings or 4K at higher graphics settings.

GPUs will probably move to 3D dies to increase their computing power once die shrinks are no longer possible. You know, just like SSDs have gone to 3D NAND instead of planar NAND.

This is just speculation. I am no expert.

A 3D GPU die would be a nightmare to cool adequately since you'd be increasing the volume of the GPU without increasing its cooler contact area.
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,244
3,833
75
I agree with everything you said except...
A 3D GPU die would be a nightmare to cool adequately since you'd be increasing the volume of the GPU without increasing its cooler contact area.
The nice thing about GPUs is they scale very well, at least up to some fraction of the number of pixels in the display. So if you lower the voltage and frequency enough, I think a 3D GPU die could work.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I agree with everything you said except... The nice thing about GPUs is they scale very well, at least up to some fraction of the number of pixels in the display. So if you lower the voltage and frequency enough, I think a 3D GPU die could work.

I don't think it's anywhere near that simple. Even if it works, the thermal side isn't solved. I guess they could hack together a dual-sided cooling solution where they make a two-die stack and put one heatsink on the bottom die and another on the top die. But what happens if you make 4 stacks? 8 stacks? Would you have to lower the clocks and voltages to 1/8th? Even the dual-die solution would have worse thermals, simply because the dies are closer together.

The transistors don't run below about 0.7V, and going down to that level seriously impacts frequency. People would shudder at the clock speeds you get even at 0.8V. If you halve the clock and use two dies, you get maybe the same performance but worse thermals. From 0.7V to 1.1V on GPUs (and 1.3V on CPUs), clock speeds increase superlinearly: you aren't going from 1GHz @ 0.7V to 2GHz @ 1.3V, you're going from 800MHz @ 0.7V to 4GHz @ 1.3V.

It's true that you can undervolt GPUs a fair bit from manufacturer specs. But the manufacturer specs the voltage so that, across tens of millions of GPUs, it minimizes RMAs from a particularly crappy die getting too little voltage. Let's say they implement fancy circuitry to correct or minimize those errors and lower the voltage to a point not much different from you or me undervolting at the card level. How low can we go? 10%? That's only about a 20% reduction in power. Let's say they get to 0.8x the voltage. That's a 36% reduction, or 0.64x the power. And that's a one-time benefit.
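To make that arithmetic concrete, here's a minimal sketch using the usual dynamic switching power approximation P ≈ C·V²·f. The voltage and clock figures are just the illustrative numbers from this post, not real silicon data:

Code:
# Dynamic (switching) power only, ignoring leakage: P ≈ C * V^2 * f.
# Purely illustrative of the undervolting math above.

def relative_power(v_scale: float, f_scale: float = 1.0) -> float:
    """Power relative to baseline when voltage and frequency are scaled."""
    return (v_scale ** 2) * f_scale

print(relative_power(0.90))          # 0.81 -> ~20% saved from a 10% undervolt
print(relative_power(0.80))          # 0.64 -> ~36% saved at 0.8x voltage

# Two dies at half clock each: roughly the same total throughput, but total
# power is 2 * 0.5 = 1.0x at the same voltage, now dumped into a stack that
# is harder to cool than a single planar die.
print(2 * relative_power(1.0, 0.5))  # 1.0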

That's why the death of Moore's Law isn't just about the loss of density scaling and its economic benefits. Voltage scaling has essentially stopped. We went very quickly from 2.xV down to 1.3V on CPUs, and we've now been stuck at 1.3V for a decade.

The Near-Threshold Voltage (NTV) circuit solutions Intel showed at IDF run at really low frequencies. Yeah, sure, you can run it at 0.7V, but how does that help high-performance CPUs/GPUs when it runs at 400MHz?

This is why people like Ray Kurzweil are off base. He based his Singularity-by-2050 prediction on the rapid scaling of past decades; he couldn't have known that scaling would slow to a crawl. It's the same reason exascale systems have been delayed from the original 2018 target to 2021+.
 

Atom_101

Junior Member
Apr 14, 2017
12
0
6
Well, from what I read, 3D stacks will have holes filled with some sort of special electrolyte (which probably doesn't exist yet). This fluid would passively cool the inner layers while the outermost layer has the heatsink attached to it. The fluid would also maintain electrical contact between layers. At 5nm, efficiency would probably be high enough for this to work.

I cannot find the article where I read this, otherwise I would have mentioned the source :/
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
238
106
So far, no one has answered the basic question: "Is 4K gaming 'peak gaming'?" Is it a yes or a no?
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
238
106
Good show! The answer you quoted did not say yes or no, it gave the rationale for a no, but the word was not there. :)
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I am sorry for not being clear enough in my own posts.
So here you go: No, 4k gaming is not 'peak gaming'.

I don't think advancement will stop, just slow drastically, more like the rate of advancement in cars and other technology areas. We will eventually see 8K being used.

As for AMD's quote about VR needing 16K and photorealistic graphics, with GPUs delivering 1 PFLOPS of performance? It's starting to look like pie in the sky.

Well, from what I read, 3D stacks will have holes filled with some sort of special electrolyte (which probably doesn't exist yet). This fluid would passively cool the inner layers while the outermost layer has the heatsink attached to it. The fluid would also maintain electrical contact between layers. At 5nm, efficiency would probably be high enough for this to work.

It sounds very complicated. Remember, the structures they are working with are at the micrometer and nanometer scale. It would only reach production if they can do it at mass scale, reliably.

5nm won't improve things anywhere near that much. Even if you get a "true" 5nm without the marketing jargon foundries are starting to throw around (remember Samsung's 6/5/4 BS?), the fundamentals don't change. Voltages won't really go down, and the reduction in capacitance is quite limited too. The gains are mostly in low-power chips. We are getting to the point where even smartphone chips will see a slow rate of advancement.
 

lehtv

Elite Member
Dec 8, 2010
11,900
74
91
Good show! The answer you quoted did not say yes or no, it gave the rationale for a no, but the word was not there. :)
Surely you, and anyone else with at least half a brain, can tell from my post that my answer was a no, even though "the word was not there".
 

biostud

Lifer
Feb 27, 2003
18,241
4,755
136
You can always add more pixels, so theoretically there is no peak. Personally, I think newer panel technology, HDR, and more advanced rendering are more important than adding more pixels. I also think I would prefer more fps at 1440p to running 4K.
 

mizzou

Diamond Member
Jan 2, 2008
9,734
54
91
I'll say the answer is yes. At least peak in your lifetime.

(Spam removed)
 

Charlie22911

Senior member
Mar 19, 2005
614
228
116
I would wager that 4K is peak gaming for traditional display technology; anything past 4K only gives diminishing returns and requires huge increases in bandwidth and processing power.
I'd also be willing to bet most people won't enjoy gaming on a panel much past ~30in. As an Acer X34 owner, I can say that having to turn my head and look around the screen in certain game genres is a bit of a chore.

It is my opinion that new display technologies will be the driver for future advancement of GPUs, particularly VR headsets in the immediate future.
 

Yakk

Golden Member
May 28, 2016
1,574
275
81
I consider 1440p@144 the peak of graphical gaming currently. That'll only change when there's 4K@144, 8K@144, etc. I highly value fluidity in gaming.
 

[DHT]Osiris

Lifer
Dec 15, 2015
14,079
12,173
146
I consider 1440p@144 the peak of graphical gaming currently. That'll only change when there's 4K@144, 8K@144, etc. I highly value fluidity in gaming.
I agree with this. I think that at/after 4K, >60Hz will become the 'new thing' to push cards. 120Hz@4K will become the target for the next generation's Ti series/equivalents, and we'll start benchmarking with that in mind. The consoles will catch up eventually, and >4K@60Hz will become more of a weird off-hand benchmarking tool/subset of gaming that most don't strive for. >4K@120Hz will depend on what manufacturers come up with for panels; we might see a stopover at 5K or just bounce right up to 8K.

I won't go back to 60fps though, too spoiled by gsync and 144hz.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
I would say that for most gamers, the answer is yes. Neither 4K displays nor GPUs powerful enough to play at 4K are cheap at all, and most gamers are at 1080p anyway and will be for a while. Hell, I'm at 2560x1600 and I have no plans to move to 4K, much less anything higher.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,414
8,356
126
Point of order: that's only 15K.

i think "8k" on a 32" screen would be an improvement over 32" 4k for the better scaling possibilities. at 4x the pixel you'd eliminate fuzziness when scaling to inexact ratios. at least at arm's length+ viewing distances.

"16k" at that same size, you could scale with visual perfection. that's the same ppi as a 4k 8" tablet.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Given that you can build a fairly decent system for what a 4K display and a capable GPU or two cost, I think it is fairly safe to say it will be a while before most gamers move up from 1080p. Prices would have to come down quite a lot before that happens.

On the other hand, folks doing work on their rigs seem to be more productive with large 4K displays due to the increased screen real estate; productive enough, in some cases, to justify the purchase of said monitors.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
I'd rather game on a 1440p OLED than on a 4K anything else.

Forget 8K, 50K, 500K; how about better actual graphics, not just higher resolution? Sick of it.
 