bystander36,
Your argument, as near as I can tell,
"Until developers stop creating games to be run at 1080p, and assume everyone has a 4K monitor, no video card will be able to play maxed settings at 4K and maintain 60 FPS."
is basically that developers can keep adding bells and whistles until they bring a GPU to its knees, and that if they weren't adding all of those extras, games would run fine at 4K. In other words, if we could magically get rid of 1080p for everyone, developers would be forced to turn off a lot of those bells and whistles (a lower baseline) to get a game running at 4K.
I'm curious what the point of a higher-resolution screen is, where you'd be better able to see all those details, if you're turning them all off just to make the game run at 4K.
Logically, I can see turning off, or at least lowering, MSAA or the like at a higher resolution, which frees up some processing power, but your overall point is lost on me.
"They typically make it so the top end video card can run 1440p at 60 FPS in their games"
There's a realistic reason for that: ownership statistics.
The Steam survey, being easily accessible, says that 1920x1080 is the primary display resolution for some 60.55% of folks. You can't ignore 60% of your gamers. 4K sits at about 1.21%, and 1440p at 3.45%.
1366x768 has 14.47% | silly laptops...
If I were a AAA developer, I wouldn't be writing my games for 4K at 144 FPS as the baseline.
If I were a hardcore gamer, I wouldn't want my AAA developer turning off all those bells and whistles to hit 144 FPS at 4K, when the whole point of 4K is to SEE more detail.
There's a balance to strike, and I'm sure developers realize that. Otherwise they would've lost the game a long time ago.