Admittedly I feel like I'm on the outside looking in, as the last game I bought was XCOM 2: WOTC (and I've largely held back on buying new ones until I can realistically afford to replace my R9 380X, since Cyberpunk 2077 runs at about 30 FPS at 1080p and doesn't feel great), but I'm really puzzled by the apparent lack of difference between graphics detail settings in more modern games.
Here's an example:
Here's another:
In one comparison video I noticed some lighting differences, but nothing like comparing raster to ray tracing, which is a night-and-day difference IMO.
An easier to compare example here:
www.techpowerup.com – Immortals of Aveum Benchmark Test and Performance Analysis Review - Optimization Fail
I'm also wondering just how much of a difference the low/high graphics detail slider made historically, because a quick look at a few Witcher 3 detail-level comparisons had me scratching my head at the lack of differences. I had the impression that historically the low/high slider ought to have been labelled "potato or decent graphics", but come to think of it I've always played at high detail levels, only occasionally tweaking a few extra settings like motion blur (I think that's what it's called in W3).
StarCraft 2 has a pretty big difference between graphics detail levels, such as in this cutscene:
https://www.youtube.com/watch?v=QJc75j82QZs
In-game, I recall the lower settings gave the game less lighting, plainer textures, and a more cartoonish look.