GTX 980 Matrix SLi or GTX 980Ti for 4K monitor?


cmdrdredd

Lifer
Dec 12, 2001
It only became standard because devs targeted that resolution. When do you think devs will target 4K?

1080p is only able to handle "Ultra" because devs don't put in higher-end graphics. They could target 640x480 again, and 1080p would have to play at Medium.

It's more likely you'll decide higher-end graphics aren't worth the lower resolution before a single card can handle every game at Ultra in 4K.

And it's OK that you don't want 4K; I'm mostly just trying to open your eyes to the reality of the gaming market. 4K at Ultra isn't possible in the most demanding games, not because of the hardware, but because of the resolution devs target with their Ultra settings. At some point, you'll have to decide when 4K is worth lower settings.

I think this concept was a LOT easier to grasp when CRTs were the norm, since we could easily compare resolution against settings on the same hardware. Back then, people were very fluid with their graphical settings and resolutions. Now people are far more fixated on settings and have forgotten what resolution brings.

There's no target; you can set any resolution you want. When Crysis 1 released, there were monitors at 1080p and many at 1920x1200, but being able to play that game at 1080p was unheard of because the hardware wasn't fast enough, not because the devs wanted you to play at 1280x1024. Widescreen also wasn't really a big thing yet.

It has nothing to do with what developers do with an engine. This isn't a console, where they hard-code it to run at 900p or something to keep performance up. If you want to run a higher resolution, you need more power or have to turn settings off or down; it has always been that way. For 4K we just need hardware that can do it. Games have been getting better looking over the years, and we've gotten better performance from faster hardware. A system today can run titles more demanding than Crysis thanks to better APIs and faster hardware. I think DX12 will be a benefit here.
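
To put rough numbers on why resolution alone is so expensive: 4K is four times the pixels of 1080p, so anything paid per pixel (shading, post-processing, AA) costs roughly four times as much at identical settings. A quick back-of-the-envelope sketch in plain Python; the linear scaling is a simplification, since real frames are never purely pixel-bound:

Code:
# Rough pixel-count comparison between common resolutions. Assumes cost
# scales linearly with pixel count -- a simplification, since real frames
# are never purely pixel-bound.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.1f}x the work of 1080p)")

# 1080p: 2,073,600 pixels (1.0x the work of 1080p)
# 1440p: 3,686,400 pixels (1.8x the work of 1080p)
# 4K: 8,294,400 pixels (4.0x the work of 1080p)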
 

bystander36

Diamond Member
Apr 1, 2013
cmdrdredd said:
There's no target; you can set any resolution you want. When Crysis 1 released, there were monitors at 1080p and many at 1920x1200, but being able to play that game at 1080p was unheard of because the hardware wasn't fast enough, not because the devs wanted you to play at 1280x1024. Widescreen also wasn't really a big thing yet.

It has nothing to do with what developers do with an engine. This isn't a console, where they hard-code it to run at 900p or something to keep performance up. If you want to run a higher resolution, you need more power or have to turn settings off or down; it has always been that way. For 4K we just need hardware that can do it. Games have been getting better looking over the years, and we've gotten better performance from faster hardware. A system today can run titles more demanding than Crysis thanks to better APIs and faster hardware. I think DX12 will be a benefit here.

What do you think determines the settings devs choose to make available to us? Do you think they just code and release a game and hope the hardware of the day can run it?

I'm pretty certain they dial in the settings we get based on the hardware available at the time, and the results would be vastly different depending on whether they test on a 4K or a 1080p system.
 

cmdrdredd

Lifer
Dec 12, 2001
I'm saying we need better hardware for 4K because I don't think things should be turned off or down to get the performance. If a developer did what you're describing, it would be a worse-looking game, because current hardware cannot process the effects at 4K with the same performance as at 1080p. So in the end we need better hardware anyway, or we settle for fewer or lower-quality effects, which I think is the wrong step.
 

bystander36

Diamond Member
Apr 1, 2013
This is an interesting read. It's not quite on topic except for one section, which is exactly what we're talking about: http://n4g.com/news/1715331/console-players-deserve-pc-style-graphics-settings-project-cars-dev-says

Performance
Performance and its relation to settings also closely relates to the user experience and to how game options are presented. It’s also where I believe that some PC gamers really need to change their thinking.
The problem is that, due primarily to the length of the previous console generation, many new PC gamers are now operating under the assumption that they should be able to “max out” all settings in every game and still achieve good performance—otherwise a lack of optimization is to blame. This is, of course, a counterproductive stance, as it actively discourages the inclusion of high-end features. For example, a game which limits its “max” shadow settings to 2048x2048 shadow maps will be seen as more optimized than one which goes up to 4096x4096 or even 8192x8192. Despite being completely wrong, this thinking still needs to be dealt with in some way.
The most obvious choices are to either hide the highest settings in configuration files, or alternatively be very careful when choosing the wording for such options. For example, have a setting beyond “Ultra” called “Future.” It makes no functional difference, but it can make a large psychological one. Furthermore, the expected performance impact of each option should be noted clearly in its description.
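
To put numbers on those shadow map steps (a rough sketch, assuming a single square map with a 32-bit depth format; real engines use cascades and different formats, so treat the absolute figures as illustrative only):

Code:
# Memory for one square shadow map at the sizes mentioned above,
# assuming 4 bytes (32-bit depth) per texel. Every step up quadruples
# the texel count, which is why "max" shadows get expensive so fast.
BYTES_PER_TEXEL = 4  # assumption: 32-bit depth format

for size in (2048, 4096, 8192):
    texels = size * size
    mib = texels * BYTES_PER_TEXEL / (1024 * 1024)
    print(f"{size}x{size}: {texels:,} texels, ~{mib:.0f} MiB")

# 2048x2048: 4,194,304 texels, ~16 MiB
# 4096x4096: 16,777,216 texels, ~64 MiB
# 8192x8192: 67,108,864 texels, ~256 MiB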

Devs have been hiding or choosing not to implement settings for years. Some can be controlled in .ini files, others require mod tools, and still others get sliders in the game menus that go beyond "Ultra".
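
Purely as a hypothetical illustration of that kind of .ini tweak (the file, section, and key names below are made up; every engine uses its own):

Code:
# Hypothetical example of pushing a quality value past what the in-game
# menu exposes by editing a config file directly. "GameSettings.ini",
# "Shadows", and "ShadowMapResolution" are invented names, and the file
# is assumed to already exist with that section.
import configparser

config = configparser.ConfigParser()
config.read("GameSettings.ini")

# The menu's "Ultra" might stop at 2048; the engine may accept more.
config["Shadows"]["ShadowMapResolution"] = "4096"

with open("GameSettings.ini", "w") as f:
    config.write(f)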

This is all because of the mindset described above. It also plays a direct role in whether 4K can be played at Ultra settings. All devs have to do is change the resolution they "optimize" for, and 4K becomes possible at max settings, but then 1080p doesn't get those flashier effects.

You say it isn't fair for them to give us lower-quality effects, but they've been doing it for years, and they would have to scale back further still to make room for 4K.
 

alcoholbob

Diamond Member
May 24, 2005
cmdrdredd said:
There's no target; you can set any resolution you want. When Crysis 1 released, there were monitors at 1080p and many at 1920x1200, but being able to play that game at 1080p was unheard of because the hardware wasn't fast enough, not because the devs wanted you to play at 1280x1024. Widescreen also wasn't really a big thing yet.

It has nothing to do with what developers do with an engine. This isn't a console, where they hard-code it to run at 900p or something to keep performance up. If you want to run a higher resolution, you need more power or have to turn settings off or down; it has always been that way. For 4K we just need hardware that can do it. Games have been getting better looking over the years, and we've gotten better performance from faster hardware. A system today can run titles more demanding than Crysis thanks to better APIs and faster hardware. I think DX12 will be a benefit here.

Crysis 1 came out in 2007. Retail 2560x1600 monitors had been out since 2004 (the 30-inch Apple Cinema Display, for one).
 