The great misconception about a graphics card being "overkill" for a resolution.


bystander36

Diamond Member
Thinker_145 said:
Sure thing, but does it make up for significantly lower settings? Not in my opinion.

Also higher PPI with no AA is worse than lower PPI with AA.

Sent from my HTC One M9

Does it really require "significantly lower settings"? As mentioned by those you disagree with, they prefer 1440p with some reduced settings, and those reductions are often barely noticeable. For the most part, what we've all been saying revolves around this same point: if you alter a few settings in a select few demanding games, a GTX 1080 becomes overkill at 1080p (for most people, at least).
 

Thinker_145

Senior member
I know for certain that I would rather play Deus Ex at 1080p with a 1080 than at 1440p. Anybody who says otherwise is just being dishonest. I'll dial down a few settings and play with gloriously crisp 4xMSAA. What is the point of uber resolution if you can't even achieve a crisp picture at high settings?

This isn't something new at all. I remember this resolution argument was born with the 8800 GTX, and all the people who had bought high-resolution LCDs to go along with it were left totally crippled in Crysis. Whereas I actually managed to play it at high settings on an 8800 GTS, because I still had a CRT monitor back then.

Sent from my HTC One M9
 

bystander36

Diamond Member
Thinker_145 said:
I know for certain that I would rather play Deus Ex at 1080p with a 1080 than at 1440p. Anybody who says otherwise is just being dishonest. I'll dial down a few settings and play with gloriously crisp 4xMSAA. What is the point of uber resolution if you can't even achieve a crisp picture at high settings?

This isn't something new at all. I remember this resolution argument was born with the 8800 GTX, and all the people who had bought high-resolution LCDs to go along with it were left totally crippled in Crysis. Whereas I actually managed to play it at high settings on an 8800 GTS, because I still had a CRT monitor back then.

Sent from my HTC One M9

Have you tried it at 1440p? I bet you haven't. The only way you can know that is if you test it, or if you're simply so closed-minded about lowering settings that you'll make that judgement without any evidence.

And AA isn't the only thing that can be adjusted to make it very playable. The words you use also don't make much sense. No matter what settings are used, the advantage of 1440p is that it gives a crisper image; the higher pixel density does that automatically. It's the shadowing, lighting and other IQ settings that may need to be downgraded.
 

Thinker_145

Senior member
lol, are you serious? A 24" 1080p screen with 4xMSAA gives a crisper image than a 27" 1440p screen with no AA. Maybe we have very different definitions of what a crisp image is...

What does it matter if I haven't played Deus Ex at 1440p? I know very well how games look at high PPI, and I know for a fact it cannot substitute for AA. Maybe a 24" 4K screen would do without AA? I don't know, as I have never had the experience of using a very small 4K screen for gaming. My dad's laptop has one, actually, so maybe I'll give that a try.

What I do know is that 4K on bigger screens still needs AA, and 1440p needs it at every available size.

Sent from my HTC One M9
 

bystander36

Diamond Member
Thinker_145 said:
lol, are you serious? A 24" 1080p screen with 4xMSAA gives a crisper image than a 27" 1440p screen with no AA. Maybe we have very different definitions of what a crisp image is...

What does it matter if I haven't played Deus Ex at 1440p? I know very well how games look at high PPI, and I know for a fact it cannot substitute for AA. Maybe a 24" 4K screen would do without AA? I don't know, as I have never had the experience of using a very small 4K screen for gaming. My dad's laptop has one, actually, so maybe I'll give that a try.

What I do know is that 4K on bigger screens still needs AA, and 1440p needs it at every available size.

Sent from my HTC One M9

AA does not make anything crisper; AA blurs the edges of lines to make them appear less jagged. You also mentioned a resolution, not a size. Unless you also mention a screen size, we have to assume we are talking about the same size, and 1440p at 27" is higher PPI than 1080p at 27".

Last, you don't have to turn off AA to use 1440p.

Edit: if you are going to make it about 24" 1080p vs 27" 1440p, we add size to the list of improvements.
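
To put numbers on the PPI comparison, here is a quick back-of-the-envelope sketch in Python (the ppi helper is just for illustration, not something from the thread; the formula is the standard diagonal-PPI calculation):

```python
# Pixel density from resolution and diagonal size:
# PPI = sqrt(width_px^2 + height_px^2) / diagonal_inches
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a display."""
    return hypot(width_px, height_px) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')  # ~92
print(f'27" 1080p: {ppi(1920, 1080, 27):.0f} PPI')  # ~82
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109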
 

Thinker_145

Senior member
No current card can use MSAA in Deus Ex at 1440p with respectable settings. I consider a clean, smooth image with no aliasing to be a crisp one, and MSAA does not cause any significant blurring.

Sent from my HTC One M9
 

4K_shmoorK

Senior member
Might as well just start posting pictures of literal strawmen at this point.
[images: a straw man photo, a screenshot, and the Nicolas Cage "not the bees" GIF]
 

bystander36

Diamond Member
Thinker_145 said:
No current card can use MSAA in Deus Ex at 1440p with respectable settings. I consider a clean, smooth image with no aliasing to be a crisp one, and MSAA does not cause any significant blurring.

Sent from my HTC One M9

http://www.bit-tech.net/hardware/graphics/2016/08/23/deus-ex-mankind-divided-benchmarked/3
http://www.bit-tech.net/hardware/graphics/2016/08/23/deus-ex-mankind-divided-benchmarked/4

Both pages use the same settings, and the difference in FPS isn't even that large.
 

Thinker_145

Senior member
The performance impact of MSAA is directly proportional to resolution. For example, there would be a much bigger performance difference between 1080p + 2xMSAA vs 1440p + 2xMSAA than between 1080p with no AA vs 1440p with no AA.

Sent from my HTC One M9
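
As a rough illustration of that proportionality claim, here is a toy calculation in Python. It assumes MSAA's extra cost scales with the total coverage-sample count, which is a simplification; real MSAA cost depends on edge density, bandwidth and the engine:

```python
# Toy model: assume 2xMSAA's extra cost scales with the added sample count.
# Real-world cost varies by scene and engine; this is only arithmetic.
def extra_samples(width: int, height: int, msaa: int) -> int:
    """Additional samples MSAA adds over no-AA rendering."""
    return width * height * (msaa - 1)

print(extra_samples(1920, 1080, 2))  # 2,073,600 extra samples at 1080p
print(extra_samples(2560, 1440, 2))  # 3,686,400 extra samples at 1440p
# The absolute overhead at 1440p is ~78% larger, so the same MSAA level
# costs more frame time at the higher resolution.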
 

bystander36

Diamond Member
Thinker_145 said:
lol, in case you didn't notice, they are not using MSAA. TAA is a blurrier form of AA, and I am sure you would appreciate MSAA on your "crisp" display.

Sent from my HTC One M9

Every game is different. Over the last several years, I've found MSAA to be pretty poor. Not that it got worse, but games have added tons of detail that MSAA doesn't touch. I can't tell you if MSAA is even good in that game, but given how poorly it has fared in recent years, I'm more likely to use VSR, TAA or some other method of AA.

One thing is sure, you can make a lot of claims without any actual experience.
 

Thinker_145

Senior member
bystander36 said:
Every game is different. Over the last several years, I've found MSAA to be pretty poor. Not that it got worse, but games have added tons of detail that MSAA doesn't touch. I can't tell you if MSAA is even good in that game, but given how poorly it has fared in recent years, I'm more likely to use VSR, TAA or some other method of AA.

One thing is sure, you can make a lot of claims without any actual experience.
The best form of AA is SSAA, but it has a downright ludicrous performance impact, to the point of being impossible in the latest games. VSR is also extremely demanding. Any game that has MSAA as an in-game option works pretty well with it, and Deus Ex does.

Sent from my HTC One M9
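
To put numbers on the SSAA point: supersampling renders the whole frame at a multiple of the output resolution and then downsamples, so 4x SSAA at 1080p shades as many pixels as native 4K. A small sketch (simple arithmetic, not a benchmark):

```python
# 4x SSAA shades every pixel of a 2x-by-2x internal frame, then downsamples.
# MSAA, by contrast, only adds samples along polygon edges.
native_1080p = 1920 * 1080       # 2,073,600 shaded pixels
ssaa4_1080p = native_1080p * 4   # 8,294,400 shaded pixels
native_4k = 3840 * 2160          # 8,294,400 shaded pixels

print(ssaa4_1080p == native_4k)  # True: 4x SSAA at 1080p = rendering native 4K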
 

bystander36

Diamond Member
I'm done here. Being able to be certain about something you've never seen makes it pretty futile for anyone to argue with you.
 

Mondozei

Golden Member
I personally disagree with OP regarding 1440p no AA vs 1080p 4x MSAA. I think it's a false choice. I have 1440p and I've found you can often get away with 2x MSAA or, even better, a custom AA solution for Nvidia GPUs which makes the performance hit much smaller than it would be for 4x MSAA. Therefore I don't think the choice is either no AA or 4x MSAA.

That being said, I do agree with his initial post. People underestimate the power it takes to run top games at max settings. Last gen's big GPU, the 980 Ti, which I have, still struggles to reach 60 fps in The Witcher 3 at max settings. Having a 144 Hz 1440p monitor, as I do, means that even with two 1080s I wouldn't max the game at very high framerates.

Most of the "counter-examples" I've seen have involved very basic games like Rocket League, WoW or similar titles, which are either a) old or b) incredibly basic. That's not a good counter-argument. The games one should be looking at and comparing are the newest, most demanding games, since AAA games are popular for a reason. People want to know: can I play Battlefield 1 at high settings with my GPU or not? What about Hitman? What about Star Citizen? Etc.

I think the fundamental dividing line here is basically what games you play. A lot of people play older and/or basic games, and if that is your subconscious bias, then it isn't strange if you recommend underwhelming GPUs for 1440p or 1080p. It also depends, of course, on how willing you are to compromise on settings in newer games.

Therefore it is silly to think that there is a universal standard, and that goes for the OP as well. He clearly plays demanding games at high fidelity, and his advice wouldn't be much better than what he castigates others for if he pushed a GTX 1080 for 1080p on someone with modest demands.
 

Thinker_145

Senior member
Mondozei said:
I personally disagree with OP regarding 1440p no AA vs 1080p 4x MSAA. I think it's a false choice. I have 1440p and I've found you can often get away with 2x MSAA or, even better, a custom AA solution for Nvidia GPUs which makes the performance hit much smaller than it would be for 4x MSAA. Therefore I don't think the choice is either no AA or 4x MSAA.
This is a totally reasonable and fair opinion. Remember, my initial argument was never about those who prefer high resolution, but about those who discourage people from spending on high-end GPUs if they don't have a very high-resolution display.
 

Mondozei

Golden Member
No, I agree with that. I've personally made the point on this forum on several occasions that many games released several years ago, like Crysis 3, still can't reach 60 fps at 1080p with a GTX 980 Ti. I already mentioned The Witcher 3. Star Citizen is looking like a tough game. Hitman is yet another game where a 980 Ti struggles at max settings at 1080p. There are many more examples. An OC'd 980 Ti is basically where a 1070 is today, so it's not like the GPU is underwhelming.

People just have these subconscious biases, depending on what games they play and how much they are willing to lower their standards, and they seem to think those subjective biases are universal. I'm sure my GPU recommendations wouldn't be great for someone with more modest gaming ambitions than mine. The bottom line is that humility is needed. There isn't a universal answer, and that goes both ways!
 

bystander36

Diamond Member
The thing is, for most people, a GTX 1080 is overkill for 1080p. The existence of a single game that can use it, when simply skipping MSAA or using 2x MSAA would do, doesn't really change that. This is why the review sites consider it overkill for 1080p, though that doesn't mean there aren't some people out there who would still find it useful (i.e. you two). Not only do you get good FPS with everything but MSAA maxed, but the odds that any given person will be playing that game are also not very good.

But if you know that you will be looking to play the most demanding games, and are unwilling to turn down the occasional performance-draining setting, even if that setting may not improve IQ much at all, then by all means, go for it.
 

Thinker_145

Senior member
Now, I know GameSpot is not an authority on PC hardware, but that actually makes this significant, because they have a large casual following. Look at the conclusion of their GTX 1070 review.

"With the exception of the BioShock Infinite benchmark, which was razor thin, the GTX 1070 was able to consistently beat the Titan X at 1080p. Regardless, both the Titan X and GTX 1070 are overkill at 1080p, unless you’re looking to take advantage of a super high refresh rate monitor.

While the GTX 1070 is overkill for 1080p gaming, it makes good sense at 1440p. You’ll be able to max out just about every single game here, even the most taxing games like Metro with above 30 average FPS. You’ll also get plenty of mileage if you opt for a high refresh rate UHD monitor."

Basically telling people not to buy the 1070 if they only have a 1080p@60Hz screen, which is just BS.
 

richaron

Golden Member
Thinker_145 said:
...Basically telling people not to buy the 1070 if they only have a 1080p@60Hz screen, which is just BS.

Nope, they seem to know what they're talking about.

You on the other hand appear to be completely lost. You realize someone decided, "let's make the maximum graphics settings barely playable with top-end hardware"? Some person decided to make the game perform like that. Get it? It has nothing to do with optimization or best graphics quality. Get it? Otherwise maximum settings could be 3 fps on top hardware. Or 300 fps. Or 3,000 fps. Do you really think it's a coincidence it's barely playable with top-end hardware?

Now you turn it around as some sort of proof that the 1070 is only an FHD 60 Hz card. Please.
 

Jaskalas

Lifer
As soon as a video card becomes "overkill" for a resolution, that just means next-gen games have room to crank up the graphics.
Whatever slack is achieved will be consumed.
1080p gaming for life!
 

bystander36

Diamond Member
richaron said:
Nope, they seem to know what they're talking about.

You on the other hand appear to be completely lost. You realize someone decided, "let's make the maximum graphics settings barely playable with top-end hardware"? Some person decided to make the game perform like that. Get it? It has nothing to do with optimization or best graphics quality. Get it? Otherwise maximum settings could be 3 fps on top hardware. Or 300 fps. Or 3,000 fps. Do you really think it's a coincidence it's barely playable with top-end hardware?

Now you turn it around as some sort of proof that the 1070 is only an FHD 60 Hz card. Please.

I think you are saying something I tried to say on page 5. PC games are designed for us to self-optimize: they just give us a bunch of settings, and it is up to us to make the game run well on our system. When only a single game exists that can push a particular card at your resolution, and one settings adjustment makes that game run very well, it's pretty hard not to consider that card "overkill". Now think about all the games you play that will never push the card in any way. The money spent on the card could have been spent elsewhere.

Is that one setting, in one game, worth spending $200+ extra? If the answer is yes, just consider yourself an exception.
 

Eric1987

Senior member
This thread doesn't make sense. Near 60 FPS in all titles? That's great FPS for a single card.
 

USER8000

Golden Member
Having to spend £600 on a card to max out games at the resolution of a £100 monitor, i.e. 1080p, would be overkill. I would rather wait another year until better cards come out, save a few hundred pounds, and spend it on things like beer.