
Question Gaming performance is a solved problem. Change my mind

Three main pillars I base my theory on:

1) DLSS/FSR:
AI software acceleration is becoming way too good.

Huge FPS boosts with minimal artifacts/distortions.
DLSS 4.5 has even fewer distortions.
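To put rough numbers on why upscaling gives such big FPS boosts, here's a back-of-the-envelope sketch. It assumes a "Quality"-style preset that renders internally at about 2/3 of the target resolution per axis (the exact factor varies by upscaler and preset):

```python
# Back-of-the-envelope: how many pixels actually get shaded when
# upscaling to 1440p, assuming a ~2/3-per-axis internal render scale
# (a commonly cited "Quality" preset factor -- an assumption here).

target_w, target_h = 2560, 1440
scale = 2 / 3  # assumed per-axis render scale

native_pixels = target_w * target_h
internal_pixels = round(target_w * scale) * round(target_h * scale)

savings = 1 - internal_pixels / native_pixels
print(f"native:   {native_pixels} pixels per frame")
print(f"internal: {internal_pixels} pixels per frame")
print(f"~{savings:.0%} fewer pixels shaded before upscaling")
```

Shading cost scales roughly with pixel count, so cutting shaded pixels by more than half is where most of the "free FPS" comes from; the upscaler's job is to win back the lost detail.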


2) Hardware is very fast, very cheap:

Desktop: a $300 GPU runs everything fluidly at 1440p
Laptop: a $1500 machine runs everything fluidly at 1440p (a desktop replacement with a full 7950X + 5070 Ti at just 2.5kg / 16'')
Handheld: Halo runs everything fluidly at 1080p and is a full 9950X-class desktop replacement
(Halo 388 should reduce handheld price to more normal ranges)

So fast and efficient that the desktop itself is already facing an existential crisis.


3) Games/graphics perf requirements are hitting ceilings / plateauing.
Same with resolution:

1080p/1440p is perfect for most. 4K 120FPS is a luxury. 8K won't happen this decade, and the compute-to-benefit ratio is not looking good.

There used to be the reality that next-gen games are increasingly more demanding and realistic.

This was in effect for many cycles but not anymore.

The weird industry push to make graphics "even more realistic" is hitting walls itself. Realistic graphics don't equal fun.

Nothing specific shows that AAA games in the coming years will have increasingly insane requirements.

Nothing radical on the horizon. Only AI...
 
Haha, no

We've still barely moved away from baked lighting and reflections (and all the limitations that entails), and all of the current solutions for dynamic GI and reflections run like crap while still being noisy as hell. We need a hell of a lot more compute to actually run full quality on both of those without all the noise and shimmering we see in current titles.

Transparency in general is still extremely awful. Deferred renderers fundamentally can't handle transparency: a G-buffer can only store a single object per pixel, which means as soon as you have a translucent piece of glass in front of a solid object, you start having to use hacks and workarounds. And God forbid you might actually want multiple translucent things layered in front of one another, and expect them to render consistently.
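The one-surface-per-pixel limitation is easy to see in a few lines. This is a toy model, not any real engine's G-buffer layout:

```python
# Toy model of why a deferred G-buffer loses layered transparency:
# each pixel stores exactly one surface (here: depth, color, alpha).

def deferred_write(gbuffer, pixel, depth, color, alpha=1.0):
    """Depth-tested write: the G-buffer keeps only the nearest surface."""
    stored = gbuffer.get(pixel)
    if stored is None or depth < stored[0]:
        gbuffer[pixel] = (depth, color, alpha)

def forward_blend(layers):
    """Back-to-front alpha blending over every layer covering a pixel."""
    out = (0.0, 0.0, 0.0)
    for depth, color, alpha in sorted(layers, reverse=True):
        out = tuple(o * (1 - alpha) + c * alpha for o, c in zip(out, color))
    return out

# A red wall behind a half-transparent blue pane, covering the same pixel:
layers = [(10.0, (1.0, 0.0, 0.0), 1.0),   # opaque red wall
          (5.0,  (0.0, 0.0, 1.0), 0.5)]   # translucent blue glass

gbuffer = {}
for depth, color, alpha in layers:
    deferred_write(gbuffer, (0, 0), depth, color, alpha)

print(gbuffer[(0, 0)])        # only the glass survives; the wall is gone
print(forward_blend(layers))  # wall tinted through the glass, as expected
```

The deferred path can only keep the glass, losing everything behind it, which is exactly why engines bolt on a separate forward pass (with its own sorting headaches) for translucent geometry.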

There's just a lot of things that we have bad hacks to work around right now, that would be solved when we move to proper path tracing with sufficient compute power to make it shine.
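The "noisy as hell" part has a simple statistical root: path tracing is Monte Carlo integration, so per-pixel noise only falls as 1/sqrt(samples), meaning halving the noise costs 4x the compute. A minimal illustration, using a one-dimensional integral as a stand-in for real light transport:

```python
import random

# Monte Carlo noise shrinks like 1/sqrt(N): estimate a simple integral
# (a stand-in for per-pixel light transport) at two sample counts and
# compare the spread of the estimates across many trials.

def estimate(n, rng):
    """Monte Carlo estimate of the integral of x^2 on [0, 1] with n samples."""
    return sum(rng.random() ** 2 for _ in range(n)) / n

def spread(n, trials, rng):
    """Standard deviation of the estimator across many independent trials."""
    xs = [estimate(n, rng) for _ in range(trials)]
    mean = sum(xs) / trials
    return (sum((x - mean) ** 2 for x in xs) / trials) ** 0.5

rng = random.Random(0)
low = spread(16, 400, rng)    # 16 samples per "pixel"
high = spread(256, 400, rng)  # 16x the samples per "pixel"
print(f"noise at  16 spp: {low:.4f}")
print(f"noise at 256 spp: {high:.4f}")
print(f"ratio: {low / high:.1f}  (theory says sqrt(16) = 4)")
```

That square-root law is why denoisers exist at all, and why "sufficient compute" for clean real-time path tracing is such a tall order: every halving of residual noise quadruples the cost.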
 
A lot of CPU overheads in UE5

7950X / 9950X3D , same for mobile


We've still barely moved away from baked lighting and reflections (and all the limitations that entails), and all of the current solutions for dynamic GI and reflections run like crap while still being noisy as hell. We need a hell of a lot more compute to actually run full quality on both of those without all the noise and shimmering we see in current titles.

Transparency in general is still extremely awful. Deferred renderers fundamentally can't handle transparency: a G-buffer can only store a single object per pixel, which means as soon as you have a translucent piece of glass in front of a solid object, you start having to use hacks and workarounds. And God forbid you might actually want multiple translucent things layered in front of one another, and expect them to render consistently.

There's just a lot of things that we have bad hacks to work around right now, that would be solved when we move to proper path tracing with sufficient compute power to make it shine.

Are all these things already included in some games at Ultra settings? Or are they as yet unimplemented?

The price points quoted can do Ultra settings fluidly
 
Three main pillars I base my theory on:

1) DLSS/FSR:
AI software acceleration is becoming way too good.
I don't feel like running my games at a low resolution and then upscaling them is the best solution.
This has been a solution to weak hardware forever; it's just that the upscaling is better now.
2) Hardware is very fast, very cheap:
Disagree. GPUs are more expensive than ever, RAM is more expensive than ever, CPUs are not cheap.
3) Games/graphics perf requirements are hitting ceilings / plateauing.
I think that's because hardware isn't fast enough (plus games aren't optimised like they used to be)

Same with resolution:

1080p/1400p is perfect for most. 4K 120FPS is luxury. 8K won't happen in this decade and compute:benefit ratio is not looking good.
I run a 34" monitor at 3440x1440. I do that because it's a sweet spot for performance, not because I think a higher resolution wouldn't look better.
There used to be the reality that next-gen games are increasingly more demanding and realistic.

This was in effect for many cycles but not anymore.

The weird industry-push to make "graphics even more realistic" is hitting walls itself. Realistic graphics doesn't equal fun.

Nothing specific that shows AAA games of the next years will have increasingly insane requirements.
I think this is partly because most games need to run well on consoles, plus there are diminishing returns on visuals using the same rendering process. Plus realistic graphics aren't always the optimal thing.
 
AI upscaling and frame generation have been normalized, yes, but ultimately there is still the original core market of PC gamers with working eyes. 8K would have come this decade if it weren't for all the world's money being funneled into cloud AI.

Fact of the matter is, console plebs are more than happy with their 30-60fps, motion-blur-always-on experiences. Microsoft bought Minecraft for 2 billion dollars and it uses 16x16 pixel art for textures. There are dozens of 2D games being made and selling hundreds of thousands of copies. Desirable fidelity is down to the product, but I think there is a general negative trend with DLSS/FSR/XeSS.

Games like Stalker 2 or Cyberpunk are literally unplayable on most GPUs at 1080p 120fps at *native resolution*. Even turning settings down basically does nothing anymore, and tons of games even have 'potato' mods for them. The real downside is that people on cheap hardware have no idea what they're missing out on.

I think this is also part of why the mass market often thinks games like Call of Duty or Battlefield have impressive graphics. Because those games are actually optimized, they can sport native resolution, high fps, smooth visuals, long render distances, and all the HQ textures you could ever want.
 
I agree with the gist of the OP but the games are definitely getting much more demanding. They just look about the same as games from 5-7 years ago but run much worse. It just costs too much to make the games look significantly more detailed and the development costs of AAA are already in the 9 figures. Even if it looked better, a game won't sell just on the basis of graphics anymore because they all look pretty good.

But all the PC hardware makers know this too. They will shift away from selling hardware to pushing cloud gaming on everyone once the AI bubble bursts. Instead you can set up your own gaming cloud with Apollo/Sunshine.

We actually do have 8K, though: Best Buy occasionally has refurb LG Z3s, and they do 8K at 100Hz. But most modern games will need upscaling for it.
 
I agree with the gist of the OP but the games are definitely getting much more demanding. They just look about the same as games from 5-7 years ago but run much worse.

Exactly. What matters is that they look the same; game dev is hitting ceilings.

Also, what's pulling in the other direction is lazy development (resulting in "more demanding" games), which cannot go on exponentially; games won't get more and more demanding just for the sake of it.

Plus it's inevitable that AI will soon automatically write faster/more optimized code.

And yes, 8K screens exist, but nobody is going to run them reasonably for a very long time... except with upscaling, lol.

AI upscaling is also getting better and better, to the point of a negligible amount of artifacts.

free FPS for everyone


GPUs are more expensive than ever, RAM is more expensive than ever, CPUs are not cheap.

$1500 = a full high-end desktop replacement (7950X, 2.5kg) playing everything fluidly @ 1440p

never before in history was this possible

soon its 16'' laptop OLED screen will automatically expand to 24'' with a button too 😉

You can ignore the ludicrous 5080/5090 pricing/scalping etc. and the RAM shenanigans
(32GB mainly went $100 -> $200, not the end of the world; it just gets unaffordable for the big/fast kits,
i.e. $500 -> $1200 etc.)
 
There used to be the reality that next-gen games are increasingly more demanding and realistic.

This was in effect for many cycles but not anymore.
We were happy with 120+ FPS on our GPUs with 8gb of VRAM.
Until the next gen console ports arrived on PC from the PS5.
Lived experience is very much the opposite of anything you're speaking of here.

And just as it looked like consumers were in revolt over crippled GPUs, a VRAM crisis struck, throwing into question any certainty that affordable GPUs will have sufficient VRAM in the future.
 
you doubt the inevitable!
Lolling is the only sane reaction, really. The last few years should have made it obvious even to hobbyists like us that gaming workloads are just elaborate shortcuts through a mountain of compute. Nothing is solved yet in gaming if we want immersion, realism and cost effective projects. The simulation is still a collection of hacks that need to be hand picked and carefully orchestrated to fit the artistic vision of the devs.

To give the illusion of faster progress, graphics cards have doubled in TDP over the past 5+ years. Meanwhile some new games from big studios advertise absurd system requirements for a fluid 30 fps. In the mind of the PC gamer the word "optimization" eclipsed "innovation" as we realized current game engines and their features can easily defeat the fastest GPUs and CPUs on the market.

Gaming is not a solved problem, it's a managed problem.
 
IMO it's quite an irrational viewpoint to be certain that a regurgitation engine is going to have an original thought.

Define "original thought" 😉

You are 100% right, but in the case of graphics: AI can predict graphics output accurately enough that it's indistinguishable from native rendering, and the accuracy and compute efficiency just keep increasing.

Free FPS for everyone
 
I definitely think we are approaching the limit of graphics. Sure, lighting effects etc. are nice, but graphics have been nice enough for a long time that games look good. That wasn't the case in the late 90's or early 2000's. We don't need photorealistic.
 
I definitely think we are approaching the limit of graphics. Sure, lighting effects etc. are nice, but graphics have been nice enough for a long time that games look good. That wasn't the case in the late 90's or early 2000's. We don't need photorealistic.
I would say we do "need" photorealistic, but it will never happen due to the cost of developing such games. No publisher would greenlight games costing billions to make.
 
If my PC cannot pass the "Can it Run X" challenge, I am not a PC gamer, but a console user incognito, trying to fit in with the PC Gamer Master Race Elite crowd.
 
I do agree that games look good enough these days. Game engines and hardware have hit a wall in terms of performance for the most part as well. My main gripe in all this is that graphics card prices (mainly Nvidia's) keep getting higher, while the benefits each new generation are mainly software improvements. Also, after playing a couple of UE5 games, I think we could use a bit more actual raster performance. Finally, even though AI software is getting better at not introducing artifacts, they are still present, and I'd rather have beefier hardware than better software at the end of the day.
 
Red Dead Redemption 2, AC: Odyssey, Gears 5, Shadow of the Tomb Raider, God of War. They could have stopped right there, and I'd have zero complaints. Hell, Middle-earth: Shadow of War with the texture pack too. Even 2077 has been optimized greatly over half a decade. The latest buzzword, "transformative", can suck it. The only thing it transforms is your money into leather jackets.
 
Right, every game with excellent production and graphics in the last ~15 years looks similar. Not sure what UE5 really adds; UE4 games look just great.
 