cmdrdredd
Lifer
It only became standard because devs targeted that resolution. When do you think devs will target 4K?
1080p is only able to handle "Ultra" because devs don't put in higher-end graphics. They could target 640x480 again, and 1080p would have to play at medium.
It's more likely you'll decide higher-end graphics aren't worth the lower resolution before running every game at Ultra in 4K is possible on a single card.
And it's OK that you don't want 4K; I'm mostly just trying to open your eyes to the reality of the gaming market. 4K at Ultra is not possible in the most demanding games, not because of the hardware, but because of the resolution the devs target their Ultra settings at. At some point, you'll have to decide when 4K is worth lower settings.
I think this concept was a lot easier when CRTs were the norm, since we could easily compare resolution against settings on the same hardware. Back then, people were very fluid with their graphical settings and resolutions. Now people are a lot more fixated on settings and have forgotten what resolution brings.
There's no target; you can set any resolution you want. When Crysis 1 released, there were monitors at 1080p and many at 1920x1200, but it was unheard of to be able to play that game at 1080p because the hardware wasn't fast enough, not because the devs wanted you to play at 1280x1024. Widescreen also wasn't really a big thing yet.
It has nothing to do with what developers do with an engine. This isn't a console, where they hard-code it to run at 900p or something to keep performance up. If you want to run a higher resolution, you need more power or you have to turn settings down or off. It's always been that way. For 4K we just need hardware that can do it. Games have been getting better looking over the years, and we've gotten better performance out of faster hardware. A system today can run titles more demanding than Crysis thanks to better APIs and faster hardware. I think DX12 will be a benefit here.
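For anyone who wants the raw numbers behind this back-and-forth, here's a quick sketch (my own arithmetic, not from either post) that assumes per-pixel shading cost stays roughly constant at a given settings level and ignores geometry and CPU-bound work, so GPU load scales with pixel count:

# Rough illustration: relative pixel workload of the resolutions mentioned above,
# using 1080p as the reference. Assumes cost scales with pixel count only.
resolutions = {
    "1280x1024": (1280, 1024),       # common pre-widescreen resolution
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),
    "3840x2160 (4K)": (3840, 2160),
}

base = 1920 * 1080  # 1080p pixel count as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>16}: {pixels:>9,} pixels, {pixels / base:.2f}x the 1080p workload")

Under that simplified assumption, 4K is about four times the 1080p workload, which is why the same card that does Ultra at 1080p ends up at medium-ish settings at 4K, whatever the devs "target".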
