This is the same thinking that keeps people around here from using 4K monitors. The sliders and settings presented to us are tuned to push hardware at 1080p, so it's on us to turn settings down for 4K. But because those maxed-out settings are sitting right there in the menu, these same people call the games unplayable at 4K. So every new generation of cards gets them excited, because they'll finally be able to play 4K on a single card, only to learn that the newest releases ship with sliders that push hardware even further.
Once you realize, as you pointed out, that these settings are just a range the devs choose to present to us, and that much higher and lower ones still exist, you can start to see 4K vs 1440p vs 1080p as a choice of which settings you want to use, with resolution being one of those settings. If you run a higher resolution, you run lower settings, and vice versa. That will never change. The only thing that changes is what you find more meaningful: higher resolution or higher settings. It's all about finding the right balance, and as resolution and settings increase, diminishing returns kick in.
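To put rough numbers on that trade-off (my own back-of-the-envelope sketch, not from any benchmark), GPU load scales very roughly with the number of pixels shaded per frame, so you can see how much headroom a lower resolution frees up for heavier settings:

```python
# Rough pixel-count comparison. Assumes render cost scales ~linearly with
# pixel count, which is a simplification (geometry, CPU, and fixed per-frame
# costs don't scale this way).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, ~{pixels / base:.2f}x the work of 1080p")

# 1080p: 2,073,600 pixels, ~1.00x the work of 1080p
# 1440p: 3,686,400 pixels, ~1.78x the work of 1080p
# 4K: 8,294,400 pixels, ~4.00x the work of 1080p
```

So going from 1080p to 4K is roughly a 4x jump in pixel work, which is exactly the budget you'd otherwise be spending on cranked-up settings.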
One thing you have to consider is how much money you're willing to spend for settings you can't currently use. Is 4x MSAA instead of TAA in a single game worth $200? Is 80 FPS instead of 40 FPS worth $200? I don't think the first is, but I will spend $200 for 80 FPS or more in the games I play. Maybe not for one game; if it's just one game I'll turn down settings instead. But 80 FPS in first-person games is the point where motion sickness no longer affects me.
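The mental math I'm doing there is just cost per extra frame, using the hypothetical $200 and 40-to-80 FPS numbers above:

```python
# Hypothetical upgrade: $200 to go from 40 FPS to 80 FPS in a given game.
upgrade_cost = 200          # USD, assumed for illustration
fps_before, fps_after = 40, 80

cost_per_extra_fps = upgrade_cost / (fps_after - fps_before)
frame_time_saved_ms = 1000 / fps_before - 1000 / fps_after  # 25 ms -> 12.5 ms

print(f"${cost_per_extra_fps:.2f} per extra FPS")             # $5.00 per extra FPS
print(f"{frame_time_saved_ms:.1f} ms shaved off each frame")  # 12.5 ms
```

Five dollars per frame sounds steep for one game, but spread over every first-person game I play it's an easy call.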