The great misconception about a graphic card being "overkill" for a resolution.

Page 5 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Thinker_145

Senior member
Apr 19, 2016
609
58
91
https://www.techpowerup.com/reviews/Palit/GTX_780_Ti_Jet_Stream/14.html

Whatever settings they used, the 780 Ti averaged below 50 FPS at 1080p. So basically, if averaging 50 fps in a single game at TPU's settings makes a card insufficient for a resolution, then not a single GPU was sufficient for 1080p in 2013. The Titan only managed 45 fps, and even the dual-GPU 690 and 7990 averaged under 60 fps.

Heck, AMD's top-of-the-line card when Crysis 3 launched, the 7970 GHz Edition, apparently wasn't sufficient for 900p either, because it only averaged 42.7 fps with TPU's Crysis 3 settings. Same for Nvidia's top-of-the-line card when Crysis 3 released...the GTX 680 only averaged 43 fps on TPU's Crysis 3 test.
I mean, it can happen; no card was sufficient when the original Crysis launched.
 

Squeetard

Senior member
Nov 13, 2004
815
7
76
To me there is no such thing as overkill. Once you experience a game that you can drive at 120 fps on a blur-reduction strobing-backlight monitor, you will not want to play any other way.
 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
Right, but I think his entire point is that there's a difference between explaining, with specific reasons, why the 1080 might not be necessary in a given situation, versus just saying "it's overkill!!" and implying that it would provide no value in any situation.

This thread could have been a lot shorter if he said

Someone with a 1080p monitor who wants to max out every graphics setting, and maintain frame rates above 60 fps could use the power of a GTX 1080.

I don't think anyone would disagree with that.

However, he is arguing that no single GPU is sufficient for 1440p, using cherry-picked numbers.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
This thread could have been a lot shorter if he said

Someone with a 1080p monitor who wants to max out every graphics setting, and maintain frame rates above 60 fps could use the power of a GTX 1080.

I don't think anyone would disagree with that.

However, he is arguing that no single GPU is sufficient for 1440p, using cherry-picked numbers.
How is using popular games like Crysis 3, Assassin's Creed and Hitman "cherry picking"?

Even if I grant you that all those games will run better by letting go of 8x MSAA or some other ridiculous setting, fine. But what does that really tell us about the future of 1440p gaming with a GTX 1080?
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
How is using popular games like Crysis 3, Assassin's Creed and Hitman "cherry picking"?

Again, semantics but...

Crysis 3 is not popular, it is a benchmark title. It's also 2.5 years old.

Ass Creed Syndicate is a terrible port. Unoptimized and filled with Gameworks.

Hitman is DX12 and doesn't perform well on many GPUs. It is the most popular game you listed by far. AC and Crysis are bad choices to base a purchase on, can't argue that.

Didn't you just say a couple posts ago that one shouldn't buy a GPU based on games that are out?
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
You are not helping to end this discussion one bit by refusing to refute his thesis. Stop trying to nitpick tertiary parts of his argument; that's childish.

It's refuting the evidence he's providing. Just relax; you don't have to assert yourself as the moderator of this thread.

Cool, I'm sure it's super productive to resort to the same tactics.
This is stupid and adds nothing to the conversation.
No one who has spent any time on the internet would expect to express an opinion like that, without hard evidence, and get a response akin to "yeah, that's a good point."

Just let the thread die in peace.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Again, semantics but...

Crysis 3 is not popular, it is a benchmark title. It's also 2.5 years old.

Ass Creed Syndicate is a terrible port. Unoptimized and filled with Gameworks.

Hitman is DX12 and doesn't perform well on many GPUs. It is the most popular game you listed by far. AC and Crysis are bad choices to base a purchase on, can't argue that.

Didn't you just say a couple posts ago that one shouldn't buy a GPU based on games that are out?
Why do we buy an expensive GPU? So we can comfortably play the most demanding games. If you are simply going to brush away every demanding game, then what is the point of high-end PC gaming to begin with?

Crysis and AC are both popular franchises, especially the latter; trying to pass them off as generic games nobody cares about is just ridiculous. Yes, AC could be a badly optimized game, but THAT'S WHY WE PAY $$$ for high-end parts. If someone has such a problem with badly optimized games, then they probably shouldn't be gaming on a PC to begin with; it has been a reality of PC gaming for many years now.

Your last point is purposely taking things out of context; surely you can't be that naive? I said you don't simply buy a GPU that takes care of all current games. I'll explain this further just in case.

If a GPU is maxing out all current games at my resolution, that does not mean there is no point in buying a better GPU, since future games will be more and more demanding. If a GPU isn't even good enough to max out current games, then of course that is a wholly different situation. How you managed to mix the two is beyond me.
 
Nov 19, 2011
122
0
76
And doesn't higher fps result in lower frame times, which is really good for competitive games? There is always a reason to upgrade :)
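For intuition, frame time is just the reciprocal of frame rate, so the relationship can be sketched in a couple of lines (hypothetical helper name, not from any real tool):

```python
def frame_time_ms(fps: float) -> float:
    """Time between frames in milliseconds at a given frame rate."""
    return 1000.0 / fps

# Higher fps -> shorter frame times -> less time between updates.
for fps in (30, 60, 120, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")
```

At 120 fps each frame takes about 8.3 ms versus 16.7 ms at 60 fps, which is where the competitive-play benefit comes from.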
 

Nhirlathothep

Senior member
Aug 23, 2014
478
2
46
www.youtube.com
I have been PC gaming for nearly 10 years and I have seen this downright terrible argument time and time again.

Gamer looking for advice: Is this a good GPU to buy for my PC?

Forum expert: What monitor do you have? Don't you dare buy that graphics card if you don't at least have an xyz monitor, as otherwise you will be in massive overkill territory. In theory the argument works for extreme scenarios, but in practice people fail to apply it effectively.

Right now, if someone said they bought a GTX 1080 to play games on their 1080p screen, everyone would go "OMG, such a waste". Some people would even say that if you bought a GTX 1070 for 1080p. But I bet almost everyone would say it about the GTX 1080, hence that is what I am going to concentrate on here.

The next resolution jump after 1080p that maintains the aspect ratio is 1440p, so let's just examine how the "overkill" GTX 1080 performs at 1440p, shall we?

[Chart: Anno 2205, 2560x1440 (anno2205_2560_1440.png)]


Oh no that's not a good start lol.

[Chart: AC Syndicate, 2560x1440 (acsyndicate_2560_1440.png)]


Oops...

[Chart: Crysis 3, 2560x1440 (crysis3_2560_1440.png)]


So overkill? lolol

[Chart: Hitman, 2560x1440 (hitman_2560_1440.png)]


Does not look good for future games...

So the GTX 1080 does not max out every game at 1440p even at launch; what do you think is going to happen going forward? Are you guys really that clueless about how fast GPU requirements increase? A GPU would need 120+ FPS in EVERY game for that argument to have any merit.

Forcing a noob gamer into buying a GPU inferior to what they could actually afford, just because their resolution doesn't meet your lofty standards, is just bad advice and will have a negative impact on PC gaming when, two years on, that gamer is struggling to run most games at decent settings.

Now, your "PC master race" mentality may not be able to digest it, but the fact is 1440p continues to be a dual-card resolution. PC graphics cards have simply not left 1080p in the dust, and that's fine. The consoles still don't do 1080p in every game, last I heard.

Not to mention that insane resolutions are actually detrimental to the progress of graphics. If a game developer wants their game to run very well at very high resolutions on a single GPU, they are clearly sacrificing fidelity to get there.

Now, YOU may be fine playing at a higher resolution at the cost of lower settings and lower performance. But please stop pretending that your monster GPU chews through every game at 1440p, because it really doesn't.

Forum experts?

That is an oxymoron.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Why do we buy an expensive GPU? So we can comfortably play the most demanding games. If you are simply going to brush away every demanding game, then what is the point of high-end PC gaming to begin with?

Crysis and AC are both popular franchises, especially the latter; trying to pass them off as generic games nobody cares about is just ridiculous. Yes, AC could be a badly optimized game, but THAT'S WHY WE PAY $$$ for high-end parts. If someone has such a problem with badly optimized games, then they probably shouldn't be gaming on a PC to begin with; it has been a reality of PC gaming for many years now.

Your last point is purposely taking things out of context; surely you can't be that naive? I said you don't simply buy a GPU that takes care of all current games. I'll explain this further just in case.

If a GPU is maxing out all current games at my resolution, that does not mean there is no point in buying a better GPU, since future games will be more and more demanding. If a GPU isn't even good enough to max out current games, then of course that is a wholly different situation. How you managed to mix the two is beyond me.

This is why you buy high end parts, and you do need to clarify your statement a bit. You buy high end parts so that you can play at 60+ FPS with every setting turned up on the latest demanding games.

Somewhere in the last 10 years, people went from not considering AA part of maxed settings to including it. We also went from being fluid about our resolution to requiring a specific resolution and refusing to turn it down. It also seems that people need to use experimental tech like PhysX/GameWorks and the like, even if it is not performance friendly. And we went from wanting 30+ FPS to 60+ FPS.

I get it to some degree, on some of those items, but when people recommend a GPU, including these professional sites, they do so with the understanding that you are going to tweak a few settings here and there, only use minimal to adequate AA, and only need 40-50 FPS. If you drop the absurd settings on a handful of super demanding games, a GTX 1080 would be overkill for those, and obviously the majority of games out there.

There are enthusiasts who have higher expectations. I, myself, am such a person. When playing first-person games, I get simulator sickness below 80 FPS, and I like to play 3D Vision in many games too. I have a 1080p 120 Hz monitor as a result. When they say X is overkill for 1080p, I can look at the benchmarks and know whether it applies to me. The overkill comment applies to "most" people. If it doesn't apply to you, don't take it personally. If it is a forum expert saying such things, feel free to mention that there are situations where a GTX 1080 can be beneficial.

My rambling comes down to this: to most people, a GTX 1080 is overkill for 1080p, and "future proofing" is generally a terrible idea with the highest-end GPUs, as they are grossly more expensive than the tier just below, which is only slightly slower. Don't get bent out of shape when people call them overkill, even if you disagree. There is room for people who don't fit the normal buying habits of the typical gamer.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
There is room for people who don't fit the normal buying habits of the typical gamer.

Well said :thumbsup:

And I have been getting a bit bent out of shape and contemptuous with my commentary. For that I apologize, Thinker. I think everyone here has a passion for hardware, and sometimes it's hard to accept differing schools of thought. It gets me a bit riled up when people speak so matter-of-factly, is all. Don't take it personally. I was definitely in your shoes, Thinker, thinking that I'd have to max all games to get the most enjoyment out of them. Instead I've found that it's the games I play and the people I play them with that matter most. At the end of the day, max settings don't mean much if you aren't enjoying your time gaming.

I think it's awesome that we still haven't moved away from 1080p. It means there is still so much more to come in terms of graphical fidelity and the performance needed to attain it. Exciting times. With full-die Pascal/Vega looming, I'm on the edge of my seat. :eek:
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
This is why you buy high end parts, and you do need to clarify your statement a bit. You buy high end parts so that you can play at 60+ FPS with every setting turned up on the latest demanding games.

Somewhere in the last 10 years, people went from not considering AA part of maxed settings to including it. We also went from being fluid about our resolution to requiring a specific resolution and refusing to turn it down. It also seems that people need to use experimental tech like PhysX/GameWorks and the like, even if it is not performance friendly. And we went from wanting 30+ FPS to 60+ FPS.

I get it to some degree, on some of those items, but when people recommend a GPU, including these professional sites, they do so with the understanding that you are going to tweak a few settings here and there, only use minimal to adequate AA, and only need 40-50 FPS. If you drop the absurd settings on a handful of super demanding games, a GTX 1080 would be overkill for those, and obviously the majority of games out there.

There are enthusiasts who have higher expectations. I, myself, am such a person. When playing first-person games, I get simulator sickness below 80 FPS, and I like to play 3D Vision in many games too. I have a 1080p 120 Hz monitor as a result. When they say X is overkill for 1080p, I can look at the benchmarks and know whether it applies to me. The overkill comment applies to "most" people. If it doesn't apply to you, don't take it personally. If it is a forum expert saying such things, feel free to mention that there are situations where a GTX 1080 can be beneficial.

My rambling comes down to this: to most people, a GTX 1080 is overkill for 1080p, and "future proofing" is generally a terrible idea with the highest-end GPUs, as they are grossly more expensive than the tier just below, which is only slightly slower. Don't get bent out of shape when people call them overkill, even if you disagree. There is room for people who don't fit the normal buying habits of the typical gamer.
There are a few reasons for that. From the 8800 GTX onwards, the performance penalty for AA declined significantly. Since then I have never played a game without AA unless there was simply no way to enable it.

If my GPU gets terribly outdated for a game, I simply wait until I can play it at better settings. That doesn't mean I play every game at max settings; I can't always afford to, and I know there are certain games where absolute max settings are a waste. I have no issue turning off proprietary settings and the latest DX-version effects. That doesn't mean I wouldn't want those things if I could have them.

And the reason we became inflexible about resolution is the transition from CRT to LCD, since an LCD only looks sharp at its native resolution.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Well said

And I have been getting a bit bent out of shape and contemptuous with my commentary. For that I apologize, Thinker. I think everyone here has a passion for hardware, and sometimes it's hard to accept differing schools of thought. It gets me a bit riled up when people speak so matter-of-factly, is all. Don't take it personally. I was definitely in your shoes, Thinker, thinking that I'd have to max all games to get the most enjoyment out of them. Instead I've found that it's the games I play and the people I play them with that matter most. At the end of the day, max settings don't mean much if you aren't enjoying your time gaming.

I think it's awesome that we still haven't moved away from 1080p. It means there is still so much more to come in terms of graphical fidelity and the performance needed to attain it. Exciting times. With full-die Pascal/Vega looming, I'm on the edge of my seat. :eek:
Nothing personal, mate. I might also sometimes come off as passing my opinions off as facts, but I never mean to sound like that. We are all passionate about our hobby, and hence sometimes we can get a bit worked up. No need for apologies. :)
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
This thread could have been a lot shorter if he said

Someone with a 1080p monitor who wants to max out every graphics setting, and maintain frame rates above 60 fps could use the power of a GTX 1080.

I don't think anyone would disagree with that.

Indeed. Technically speaking, I'm not interested in frame rates above 60 fps but rather in a perfectly synchronized 60 fps. This means, however, that the minimum unsynchronized frame rate needs to be above 60 fps. In conclusion, if I don't want to sacrifice pixel quality, it would be a 1080p screen (with a single GTX 1080).
I could imagine going up to 1440p with GTX 1080 SLI, but that is outside the price range I am willing to pay.
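The "minimum above 60" requirement follows from how classic double-buffered vsync quantizes the frame rate: a frame that misses one 16.7 ms refresh window waits for the next, so the displayed rate snaps down to 60/2 = 30, 60/3 = 20, and so on. A rough sketch of that arithmetic (hypothetical helper; a simplification that ignores triple buffering and adaptive sync):

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh window

def vsynced_fps(render_ms: float) -> float:
    """Effective frame rate under double-buffered vsync on a 60 Hz panel."""
    # Each frame is shown on the first refresh after it finishes rendering,
    # so it occupies a whole number of refresh intervals.
    intervals = max(1, math.ceil(render_ms / INTERVAL_MS))
    return REFRESH_HZ / intervals

print(vsynced_fps(12.0))  # 60.0: renders comfortably inside one interval
print(vsynced_fps(18.0))  # 30.0: just misses the window, halves to 30
```

Which is why a card that merely averages above 60 fps unsynced can still stutter down to 30 with vsync on: the dips, not the average, set the experience.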
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
So I thought this is a good time to bump this thread, as the new Deus Ex game is a validation of it, lol.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'm not sure the existence of a game which dips below 60 FPS at "maxed out" settings is validation that a GTX 1080 is a good fit for 1080p gaming. I won't argue that a GTX 1080 can be occasionally useful at 1080p, but it's likely still a bit wasteful. If you have the money, go for it, but if you could have improved your rig in other ways, you wasted money on the wrong component.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Except you'll need a high end CPU to pair with it, which is what people were talking about.

[Chart: proz_11_nv.png]


No point in buying a 1080 for... 1080p if you have a low end CPU.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
To counter the OP and give some perspective in the opposite direction, I would like to point out something that often bugs me: using the very newest games to benchmark GPUs as an aid to purchasing decisions is not a perfectly effective way to pick hardware.

Let's put this in perspective. Most of the examples given in the OP are new games with demanding graphics, taken straight from a benchmarking website. However, I personally own 589 Steam games, which range in graphical complexity from 2D pixel art all the way up to GTA V.

Judging a GPU on how it runs just the top 10 most demanding games is judging it on its ability to run just 1.7% of my library. Actually it's less, because that assumes I own the 10 demanding games currently used for benchmarking, and I do not.
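The arithmetic behind that share, using the numbers from the post:

```python
library_size = 589       # Steam games owned
benchmark_titles = 10    # top demanding games used in reviews
share = benchmark_titles / library_size
print(f"{share:.1%} of the library")  # 1.7% of the library
```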

When you look at how play time is distributed across games (for example, across the top 100 most played games in Steam's live stats), what you find is that the overwhelming share of players and hours goes into games like DOTA 2, CS:GO and TF2. These are significantly less demanding than the titles used for benchmarking, and my 980 has no problem handling games like CS:GO at 4K, much less 1080p.

All I can really add is that asking what resolution people run on their monitor when recommending video cards is actually a good thing to do, but it doesn't tell the whole story, so we should also ask what games you predominantly play and for how long, and whether it is important that you run everything maxed or you are happy for 1-2% of games to need a few settings lowered.

And saying that there's no way to call a GPU good enough for 1080p or 1440p or whatever, just because there exists an application, game or mod out there which it cannot run at that resolution, is unhelpful nonsense. I could write a mod for the Unreal engine that is just a particle generator spawning a trillion penis models with a million polygons each and flinging them around the room, and it would run at 0.0001 fps at 1080p with four Titans. So what? That doesn't describe what the card can do in the main. It's a balance between edge cases (the unusually demanding games found in modern benchmarks) and the vast body of other games out there, with some consideration of how much each one is played.

Just be wary that benchmarking websites deliberately use the most graphically demanding games to stress the GPU, and that this is absolutely not representative of the typical gamer's library or play time.
 