gamegpu Everybody's Gone to the Rapture Benchmarks

Page 3

antihelten

Golden Member
Feb 2, 2012
What? You think GTA5 at 800x480 looks impressive? I would like to disagree. As someone who owns a 110 dpi screen and has seen a 140 dpi screen side by side with it, I'll say that we're far from the point of diminishing returns in dpi on desktop screens.

Or did I just misunderstand you and you meant increasing resolution while maintaining the same AA setting? Because that is indeed silly.
If you go from ~90 dpi (1080p 24") to 110 dpi (1440p 27") or 140 dpi (4k 32") then you'll have a natural reduction in aliasing artifacts. Applying the same amount of AA to a 90 dpi and a 140 dpi screen is just not going to gain anywhere near the same improvement in IQ.
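For reference, those density figures work out roughly as follows; this is just a back-of-the-envelope sketch, and the diagonal sizes are the typical panel sizes I'm assuming for each resolution:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed typical panel size for each resolution
for name, w, h, size in [('1080p 24"', 1920, 1080, 24),
                         ('1440p 27"', 2560, 1440, 27),
                         ('4K 32"',    3840, 2160, 32)]:
    print(f"{name}: {ppi(w, h, size):.0f} ppi")
# 1080p 24": 92 ppi, 1440p 27": 109 ppi, 4K 32": 138 ppi
```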

I meant that increasing resolution and AA is the least efficient way to increase IQ when you compare it to pretty much any other method (better textures, shaders, lighting, higher poly count, etc.).

For instance, what would you prefer: playing Crysis 3 at very high and 1080p, or at low and 4K? (The jump from low to very high has roughly the same performance penalty as the jump from 1080p to 4K, i.e. 60-65%.)
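Roughly, the arithmetic behind that comparison looks like this; the 150 fps baseline and the flat ~62% penalty are purely illustrative assumptions, not measured numbers:

```python
# Illustrative only: assume both the settings jump (low -> very high) and the
# resolution jump (1080p -> 4K) each cost ~62% of the frame rate.
penalty = 0.62
fps_low_1080p = 150  # hypothetical baseline: low settings at 1080p

fps_veryhigh_1080p = fps_low_1080p * (1 - penalty)  # pay the settings penalty
fps_low_4k = fps_low_1080p * (1 - penalty)          # pay the resolution penalty

print(fps_veryhigh_1080p, fps_low_4k)  # both land at ~57 fps
```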
 

Piroko

Senior member
Jan 10, 2013
I meant that increasing resolution and AA is the least efficient way to increase IQ when you compare it to pretty much any other method (better textures, shaders, lighting, higher poly count, etc.).

For instance, what would you prefer: playing Crysis 3 at very high and 1080p, or at low and 4K? (The jump from low to very high has roughly the same performance penalty as the jump from 1080p to 4K, i.e. 60-65%.)
I think that's a flawed argument. If you have enough hardware to run it at 1080p very high, then you'll likely have enough VRAM on your GPU to run it at 4K with at least high textures and a couple more settings that aren't too compute intensive. That goes a long way toward equalizing the IQ at both resolutions.
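As a rough sketch of the render-target side of that VRAM argument (the buffer count and format are just assumptions; real engines keep many more buffers in varying formats):

```python
# Rough render-target footprint at each resolution, assuming four full-screen
# buffers (colour, depth and two intermediates) at 4 bytes per pixel.
def render_targets_mb(width, height, buffers=4, bytes_per_px=4):
    return width * height * buffers * bytes_per_px / 2**20

print(f"1080p: {render_targets_mb(1920, 1080):.0f} MB")  # ~32 MB
print(f"4K:    {render_targets_mb(3840, 2160):.0f} MB")  # ~127 MB
# The extra ~95 MB of render targets is small next to what textures occupy,
# which is why a card with headroom at 1080p can often keep high textures at 4K.
```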

So what I'd prefer in that case would probably be 1440p at high settings, trying to keep a high render distance and shadow and lighting effects as high as performance allows, at the expense of AA, motion blur and similar settings. The sheer gain in sharpness of the picture makes its IQ superior, IMHO.
 

antihelten

Golden Member
Feb 2, 2012
I think that's a flawed argument. If you have enough hardware to run it at 1080p very high, then you'll likely have enough VRAM on your GPU to run it at 4K with at least high textures and a couple more settings that aren't too compute intensive.

If hardware is only just capable of running at 60 fps on very high at 1080p, then it will not be capable of running at anything higher than low at 4K. Seriously, go and look up what the performance hit is for very high over low, and what it is for 4K over 1080p.

THG did a test of the impact of quality settings in Crysis 3 and found roughly a 60% performance hit for going from low with medium textures to very high.

TPU has done tons of reviews for Crysis at 1080p and 4K, and again the hit is roughly 60%.

So if you want 4K plus high textures and a couple of other settings, then you are looking at lower performance than very high at 1080p.
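Chaining those hits together makes the point clearer; the 60% figures are the THG/TPU results above, while the extra ~25% for high textures and a couple of settings is purely an assumed illustrative number:

```python
# Relative frame rate, normalised to low settings @ 1080p = 1.0
low_1080p      = 1.0
veryhigh_1080p = low_1080p * (1 - 0.60)  # ~60% hit for low -> very high (THG)
low_4k         = low_1080p * (1 - 0.60)  # ~60% hit for 1080p -> 4K (TPU)
hightex_4k     = low_4k * (1 - 0.25)     # assumed extra ~25% for high textures etc.

print(veryhigh_1080p, hightex_4k)  # 0.40 vs 0.30: the 4K mix comes out slower
```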
 

Piroko

Senior member
Jan 10, 2013
If hardware is only just capable of running at 60 fps on very high at 1080p, then it will not be capable of running at anything higher than low at 4K. Seriously, go and look up what the performance hit is for very high over low, and what it is for 4K over 1080p.

THG did a test of the impact of quality settings in Crysis 3 and found roughly a 60% performance hit for going from low with medium textures to very high.

TPU has done tons of reviews for Crysis at 1080p and 4K, and again the hit is roughly 60%.

So if you want 4K plus high textures and a couple of other settings, then you are looking at lower performance than very high at 1080p.
Another benchmark that proves your point within the restrictions you've set for yourself:
http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/8

However, you only need a bit of wriggle room beyond your restriction - like a different game - and you can get a very different impact:
http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/11
That's a single setting they turned off in 4k and you already get 60% of the performance that you get in 1080p.

http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/10
Very high quality @ 4k, still only 40% slower than max quality in 1080p.

http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/7
Medium quality @ 4k only 10% slower than ultra @ 1080p (at least for Hawaii, shame on the 970).

And all that is ignoring that 1440p and 4k give you the option to run games at lower resolutions while you'll always be limited to a peasantry 1080p max on a screen with, well, 1080p resolution - even in games that you picked up in a sale and could easily run in higher resolutions.
 
Feb 19, 2009
^ I find 4K with maxed textures and normal/medium settings runs games at good performance, quite comparable to 1080p maxed.

You will be hard pressed to spot the difference, though; most games on normal vs. ultra, especially with maxed texture settings, look very similar but run two or three times as fast.
 

antihelten

Golden Member
Feb 2, 2012
Another benchmark that proves your point within the restrictions you've set for yourself:
http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/8

However, you only need a bit of wriggle room beyond your restriction - like a different game - and you can get a very different impact:
http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/11
That's a single setting they turned off in 4k and you already get 60% of the performance that you get in 1080p.

http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/10
Very high quality @ 4k, still only 40% slower than max quality in 1080p.

http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/7
Medium quality @ 4k only 10% slower than ultra @ 1080p (at least for Hawaii, shame on the 970).

And all that is ignoring that 1440p and 4k give you the option to run games at lower resolutions while you'll always be limited to a peasantry 1080p max on a screen with, well, 1080p resolution - even in games that you picked up in a sale and could easily run in higher resolutions.

AnandTech didn't test very high at 1080p, only high, so that doesn't really have much to do with this argument.

Also, this may come as a surprise to you, but 40% slower is not the same as 0% slower. Unless you can show what it takes to get down to a 0% performance hit, it doesn't really matter. The reason you often see these massive performance hits from a single ultra setting is that a lot of developers like to take their pre-existing shaders/shadows/lighting and simply render them at some stupidly high resolution and call it "ultra", even though the high/very high setting was already at such a high resolution that you can hardly tell the difference. But that's exactly the point I'm making: increasing resolution is generally the least efficient way of improving IQ.
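To put a number on that "render it at a stupidly high resolution" pattern, here is a rough sketch of how a shadow map's cost scales with its resolution; the 4-byte texel format and the specific sizes are assumptions for illustration only:

```python
# A shadow map's fill and memory cost grows with the square of its resolution.
def shadow_map_mb(size_px, bytes_per_texel=4):
    return size_px * size_px * bytes_per_texel / 2**20

for size in (1024, 2048, 4096, 8192):
    print(f"{size}x{size}: {shadow_map_mb(size):6.0f} MB, "
          f"{size * size / 1e6:5.1f} M texels")
# Each doubling quadruples the cost, while the on-screen difference past a
# certain point is hard to spot -- which is the "ultra" trap described above.
```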

And switching to another game is just dodging the question at hand. Yes we all know that there are games with very little performance hit between low/medium and max, but those games also tend to have very little IQ improvement from low/medium to max settings.

Please just answer the question: would you prefer playing Crysis 3 at 60 fps on max settings and 1080p, or at 60 fps on low settings and 4K? None of the data you provided showed that this isn't the performance level you would see, so the question still stands.
 

Piroko

Senior member
Jan 10, 2013
AnandTech didn't test very high at 1080p, only high, so that doesn't really have much to do with this argument.

Also, this may come as a surprise to you, but 40% slower is not the same as 0% slower. Unless you can show what it takes to get down to a 0% performance hit, it doesn't really matter.
No. Your whole argument is based around this arbitrary restriction that you only have exactly enough hardware to run this one game on very high settings at 1080p, and it doesn't run well at 4K, and thus 4K is stupid. You completely ignore resolutions between 1080p and 4K, and you completely ignore that your restriction is based around a game with barely 20 hours of playtime value.

The reason you often see these massive performance hits from a single ultra setting is that a lot of developers like to take their pre-existing shaders/shadows/lighting and simply render them at some stupidly high resolution and call it "ultra", even though the high/very high setting was already at such a high resolution that you can hardly tell the difference. But that's exactly the point I'm making: increasing resolution is generally the least efficient way of improving IQ.
Are you now talking about increasing shadow resolution within the actual game? I can mostly agree with that. Though I do appreciate shadows when they're done right.

And switching to another game is just dodging the question at hand. Yes we all know that there are games with very little performance hit between low/medium and max, but those games also tend to have very little IQ improvement from low/medium to max settings.
Actually, Crysis 3 is an example of that; other games scale better between their min/med/high settings (the Battlefield series, for example).

Please just answer the question: would you prefer playing Crysis 3 at 60 fps on max settings and 1080p, or at 60 fps on low settings and 4K? None of the data you provided showed that this isn't the performance level you would see, so the question still stands.
I already gave you the answer several posts ago; you conveniently forgot to quote and answer it. You're just too narrow-minded to accept that your question is flawed.
 

antihelten

Golden Member
Feb 2, 2012
No. Your whole argument is based around this arbitrary restriction that you only have exactly enough hardware to run this one game on very high settings at 1080p, and it doesn't run well at 4K, and thus 4K is stupid. You completely ignore resolutions between 1080p and 4K, and you completely ignore that your restriction is based around a game with barely 20 hours of playtime value.

I never said anything about having exactly enough hardware to run on very high settings at 1080p. It doesn't matter whether your hardware is capable of 30, 60 or 120 fps at very high at 1080p; the point is simply that whatever your hardware is capable of at very high at 1080p in Crysis 3 is exactly what it will be capable of at low at 4K.

As such, given that the performance is the same between those two setting/resolution combinations, it is simply a matter of picking the one you think looks better.

I already gave you the answer several posts ago; you conveniently forgot to quote and answer it. You're just too narrow-minded to accept that your question is flawed.

Yes, I'm the narrow-minded one, even though you're the one doing an extreme amount of tap dancing around the subject and deflecting, whilst still refusing to answer a straightforward question.

Maybe if I try another version of the question you might feel more inclined to answer: what do you think looks better, GTA 5 maxed out at 1080p or WoW maxed out at 4K? The performance is essentially the same here (1, 2).
 

Piroko

Senior member
Jan 10, 2013
Sorry, in retrospect my last post was needlessly aggressive.
Maybe if I try another version of the question you might feel more inclined to answer: what do you think looks better, GTA 5 maxed out at 1080p or WoW maxed out at 4K? The performance is essentially the same here (1, 2).
GTA 5 maxed out at 1080p, due to its graphics style being more appealing to me and more detailed in general. But both are far and away better looking than WoW at 1080p. By a decade. Literally.

Having the option to play games at 1080p or 1440p (my current native screen resolution) lets me play a game with an image clarity that is not achievable on 1080p screens. 4K gives even more options. And it's a natural fix for aliasing as we know it.
 

antihelten

Golden Member
Feb 2, 2012
Sorry, in retrospect my last post was needlessly aggressive.
GTA 5 maxed out at 1080p, due to its graphics style being more appealing to me and more detailed in general. But both are far and away better looking than WoW at 1080p. By a decade. Literally.

Having the option to play games at 1080p or 1440p (my current native screen resolution) lets me play a game with an image clarity that is not achievable on 1080p screens. 4K gives even more options. And it's a natural fix for aliasing as we know it.

Obviously we don't actually have the option to turn up the settings in WoW to achieve GTA 5 level graphics, which has a lot to do with the fact that as you point out they are a decade apart in development time (and thus the kind of graphical features we have in GTA 5 simply weren't feasible at the time for WoW).

My point was never that going out and buying a 1440P or 4K monitor is a poor investment, nor that it didn't appreciably improve image quality. My point is simply that if Blizzard had somehow had the capability of putting GTA 5 level graphics into WoW (as some sort of ultra mega max settings), then you would have been better off turning those on instead of using a higher resolution, since they have the same performance hit, but look better. Hence my claim that increasing resolution is an inefficient way of improving IQ, when looking at the performance hit.

Unfortunately it just so happens that increasing the resolution is often the only realistic way to increase graphics quality, due to the simple fact that it is easy for developers to include the option to render at a higher resolution, but very hard and time consuming to develop new game engines, shaders and/or lighting/shadow effects.
 

Piroko

Senior member
Jan 10, 2013
My point was never that going out and buying a 1440P or 4K monitor is a poor investment, nor that it didn't appreciably improve image quality. My point is simply that if Blizzard had somehow had the capability of putting GTA 5 level graphics into WoW (as some sort of ultra mega max settings), then you would have been better off turning those on instead of using a higher resolution, since they have the same performance hit, but look better. Hence my claim that increasing resolution is an inefficient way of improving IQ, when looking at the performance hit.
Then I'll reformulate my original point: there's only so much information and so many effects that you can realistically cram into two million pixels' worth of game content. Pumping up the shadow, god ray and AA settings to get that little bit of extra smoothness into a 1080p picture is something I have found to be less efficient than bumping the resolution to 3.6 million pixels' worth of game content.
There's something quite enjoyable about that tree in the distance not being rendered through six and three quarters pixels, but instead beautifully detailed across twelve pixels. Joking aside, so far I prefer to run every game at native 1440p, even if this means that I have to lower the settings substantially.
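The pixel counts behind those figures, with 4K added for comparison (just the raw arithmetic, nothing measured):

```python
base = 1920 * 1080
for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} Mpixels ({px / base:.2f}x 1080p)")
# 1080p: 2.07, 1440p: 3.69 (1.78x), 4K: 8.29 (4.00x)
```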
 

antihelten

Golden Member
Feb 2, 2012
Then I'll reformulate my original point: there's only so much information and so many effects that you can realistically cram into two million pixels' worth of game content. Pumping up the shadow, god ray and AA settings to get that little bit of extra smoothness into a 1080p picture is something I have found to be less efficient than bumping the resolution to 3.6 million pixels' worth of game content.
There's something quite enjoyable about that tree in the distance not being rendered through six and three quarters pixels, but instead beautifully detailed across twelve pixels. Joking aside, so far I prefer to run every game at native 1440p, even if this means that I have to lower the settings substantially.

Obviously you could find 4K to be better than pumping up extra-high-res shadows, AA and those kinds of things. When I say that increasing the rendering resolution is inefficient, it doesn't mean that there aren't other methods that are equally inefficient; it simply means that there are some methods that are significantly more efficient. One method that is more efficient is switching to a modern game engine (e.g. from the WoW engine to the GTA 5 engine). Of course, this isn't an option for developers in reality, since the more modern game engine doesn't exist when they are making their game (unless they are deliberately using an old, subpar engine), and building it themselves is often too costly and time consuming.

As such, higher resolution (and other silly high-res settings) may be inefficient at increasing IQ, but they are often the only options available.