The great misconception about a graphics card being "overkill" for a resolution.


Thinker_145

Senior member
Apr 19, 2016
Why would you not go above the resolution of your display if you have the extra GPU power? I'd prefer to supersample over anything else. Just leaving your resolution static makes zero sense.
I meant downgrading resolution.
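For anyone unsure what that means in practice: supersampling (DSR on NVIDIA, VSR on AMD) renders the game above the panel's native resolution and then downscales the result to fit the display. A minimal sketch of the arithmetic, assuming the scale factor is expressed as a total-pixel multiplier the way DSR presents it:

```python
# Sketch only, not any specific driver API: supersampling renders internally
# above the display resolution, then downscales the result to the panel.
def supersampled_resolution(display_w, display_h, scale=2.0):
    """Internal render resolution for a given total-pixel scale factor."""
    per_axis = scale ** 0.5          # 4.0x total pixels = 2.0x per axis
    return round(display_w * per_axis), round(display_h * per_axis)

print(supersampled_resolution(1920, 1080, scale=4.0))  # (3840, 2160): 1080p panel, 4K render
print(supersampled_resolution(1920, 1080, scale=2.0))  # (2715, 1527): still ~2x the native work
```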

Sent from my HTC One M9
 

Mondozei

Golden Member
Jul 7, 2013
> I've looked at 27" PC monitors with 1440p/4K and you would have to sit incredibly close to notice any difference vs. 1080p.

I have a 1440p 27" monitor and I came from a 24" 1080p monitor. I notice a massive difference in sharpness in games like Rainbow Six: Siege, NS2 and many others.

> Went from 27" 1080p to 32" 1440p. Honestly did not see that much of a difference.

1440p at 32" is the same PPI as 1080p at 24". At that size you don't get nearly the same sharpness benefit as you would going from a 24" 1080p monitor to a 27" 1440p one. 4K would be a better bet for 32" and up.
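The PPI figures above check out. A quick sanity check, assuming standard 2560x1440 and 1920x1080 panels:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from pixel dimensions and diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(2560, 1440, 32))  # ~91.8 PPI: 1440p at 32"
print(ppi(1920, 1080, 24))  # ~91.8 PPI: 1080p at 24" - same pixel density
print(ppi(2560, 1440, 27))  # ~108.8 PPI: 1440p at 27"
print(ppi(3840, 2160, 32))  # ~137.7 PPI: 4K at 32"
```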

> What's your eyesight like? Honest question. Because I sit ~2.5-3 feet from a 23-inch 1080p monitor and it's like looking through a screen door when gaming. Aliasing and a lack of fine detail are extremely obvious and honestly painful to look at.

Indeed. Eyesight plays a major part.
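Viewing distance matters just as much. A rough sketch of the usual angular-resolution arithmetic, assuming the common rule of thumb that 20/20 vision resolves about 1 arcminute (roughly 60 pixels per degree):

```python
import math

def pixels_per_degree(ppi, viewing_distance_in):
    """Approximate pixels per degree of visual angle at a given viewing distance."""
    # One degree of visual angle covers about 2 * d * tan(0.5 deg) inches of screen.
    return ppi * 2 * viewing_distance_in * math.tan(math.radians(0.5))

ppi_23in_1080p = math.hypot(1920, 1080) / 23      # ~95.8 PPI
print(pixels_per_degree(ppi_23in_1080p, 30))      # ~50 px/deg at ~2.5 ft
# ~50 px/deg is below the ~60 px/deg rule of thumb for 20/20 vision,
# which is consistent with aliasing being plainly visible at that distance.
```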

> Overkill doesn't mean there is 0 benefit, just that it's an extremely small benefit.

Bystander, you keep talking as if there is an objective metric, whereas people like me have been trying to drill into your mind that there is none. It all comes down to what the goal is. If someone wants a high refresh rate, there isn't a "small benefit"; there is a huge benefit to getting a very beefy GPU.

This is the thing you don't seem to understand after all these pages. People's needs differ and there is no universal standard for what you are after. I don't get why you struggle so much with such a basic concept.
 

bystander36

Diamond Member
Apr 1, 2013
> Bystander, you keep talking as if there is an objective metric, whereas people like me have been trying to drill into your mind that there is none. It all comes down to what the goal is. If someone wants a high refresh rate, there isn't a "small benefit"; there is a huge benefit to getting a very beefy GPU.
>
> This is the thing you don't seem to understand after all these pages. People's needs differ and there is no universal standard for what you are after. I don't get why you struggle so much with such a basic concept.

You clearly haven't read my posts very carefully, or missed a few. All I've said all along is why people consider it overkill and why I consider it overkill, but I've also said that people who feel they "need" more will know it and buy it anyway. The term overkill applies to the average person. My own choices are also overkill, as I have a higher-than-typical requirement for gaming: I need 80 FPS or better in games because of motion sickness.

What appears to be the issue, from my perspective, is that some people can't accept that others consider their GPU overkill for their resolution. So what if I think it's overkill for most people? If you don't, then go for it and have fun.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
www.frostyhacks.blogspot.com
> This is the same thinking that keeps people around here from using 4K monitors. The sliders and settings presented to us are tuned to push 1080p, so it is up to us to turn settings down to use 4K; but because those maximum settings are there, these same people treat the games as unplayable at 4K. So every new generation of card is super exciting to them, because they will finally be able to play at 4K with a single card, only to find that the newest games ship with sliders that push hardware even further.
>
> If you realize, as you pointed out, that these settings are just a range the devs present to us, and that much higher and lower values still exist, you can start to understand that 4K vs. 1440p vs. 1080p becomes a choice of which settings you want to use, with resolution being one of those settings. If you use a higher resolution, you use lower settings, and vice versa. This will never change. The only thing that changes is whether you find higher resolution or higher settings more meaningful. It's all about finding the right balance, and as resolution and settings increase, diminishing returns kick in.
>
> One thing you have to consider is how much money you are willing to spend for those settings you can't currently use. Is 4x MSAA instead of TAA in a single game worth spending $200? Is 80 FPS instead of 40 FPS worth $200? I don't think the first is worth it, but I will spend $200 for 80 FPS or more in the games I play. Maybe not for one game (if it's just one game I will turn down settings), but 80 FPS in first-person games is the point where motion sickness no longer affects me.

Absolutely spot on, well said.

I run 4K and I can't max out a handful of my ~600 games. In those games it's a sacrifice between certain things, say draw distance or shadow resolution, and screen resolution. Some of them I have no problem keeping at 1080p and living with blocky pixels; others I'll happily run at 4K and drop a few of the top-end settings like HBAO. Quite often I prefer to disable some of these things anyway; I can't stand motion blur, for example, so that gets turned off regardless.
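For context on why 4K forces those sacrifices, the raw pixel counts alone (a rough proxy for shading cost, ignoring everything that is resolution-independent) scale like this:

```python
# Pixel counts relative to 1080p - a crude but useful proxy for GPU load.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP, {w * h / base:.2f}x the pixels of 1080p")
# 1080p: 2.1 MP, 1.00x | 1440p: 3.7 MP, 1.78x | 4K: 8.3 MP, 4.00x
```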

Obsessing over "max" or "ultra" settings is really just compulsive behaviour; it's like seeing a slider at 49% and nudging it to 50% so you feel better that it's exactly half, or a multiple of 10. If you want to use the phrase "this is a 1080p card" or "this is a 4K card" at all, you're forced to abandon edge cases; throughout life, definitions are fuzzy and edge cases cause problems, so picking sensible limits is the key. Feeling good that the bar is at 100% is meaningless once you understand the wider picture.

Open-ended settings are an interesting idea. I've seen this before, I'm sure; I think it was Age of Conan, where several of the draw distance and LOD settings were just integer input boxes. You could type anything in, and you could always tank your frame rate by entering an insane number. The only reason we don't see that more is that, with no sense of perspective, users aren't going to know sensible values for their hardware; sliders with min/max values give people context but limit options.
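A hypothetical sketch of that tradeoff (the names here are illustrative, not from any real game or engine): an in-game slider clamps to a range the developers have tested, while a raw config or ini value accepts whatever you type:

```python
# Hypothetical settings layer - illustrative only, not any real engine's API.
def slider_setting(value, lo=0.5, hi=2.0):
    """In-game slider: clamped to a range the developers consider sensible."""
    return max(lo, min(hi, value))

def raw_setting(value):
    """Open-ended ini/console value: no clamp, no guidance - tank your FPS freely."""
    return float(value)

print(slider_setting(10.0))  # -> 2.0, silently limited to the exposed maximum
print(raw_setting(10.0))     # -> 10.0, accepted exactly as typed
```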

I'm rambling now, but I do remember when Oblivion came out: not long after, I got a new video card, maxed everything, and it still ran great, so I went into the options and set the view distance/LOD/grass density values to be massive, and was in awe at the almost endless sea of grass stretching out in front of me. I bet if I went back to that game today with a couple of 1080s in SLI, I could set those ini values so high that I'd only get 30 FPS, and I'm betting it would look insane.