The great misconception about a graphics card being "overkill" for a resolution.


Thinker_145

Senior member
Apr 19, 2016
609
58
91
To counter the OP and give some perspective in the opposite direction, I would like to point out something that often bugs me: using only the very newest games to benchmark GPUs as an aid to purchasing decisions. This is not a perfectly effective method of picking hardware.

Let's put this in perspective. Most of the examples given in the OP are new games with demanding graphics, taken straight from a benchmarking website. However, I personally own 589 Steam games, which range in graphical complexity from 2D pixel art all the way up to GTA V.

Judging a GPU on how it runs just the top 10 most demanding games is judging it on its ability to run just 1.7% of my library. Actually it's less than that, because it assumes I own the top 10 demanding games currently used for benchmarking, and I do not.

Look at the distribution of time played versus games owned, for example across the top 100 most played games in the live Steam stats. What you find is that the overwhelming number of players and hours is spent in things like Dota 2, CS:GO and TF2. These are significantly less demanding games than those used for benchmarking, and my 980 has no problem handling games like CS:GO at 4K, much less 1080p.
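
As a rough illustration of that weighting (every number below is made up purely for the example, it's not real Steam data), the arithmetic looks something like this:

```python
# Rough sketch of the point above: benchmark coverage of a library,
# and where the play time actually goes. All figures are hypothetical.

library_size = 589            # total games owned
benchmark_titles_owned = 10   # demanding titles that also appear in review suites

# Share of the library that benchmark suites actually exercise
coverage = benchmark_titles_owned / library_size
print(f"Benchmark coverage of library: {coverage:.1%}")   # ~1.7%

# Hypothetical split of yearly play time (hours) between game types
hours = {
    "esports titles (CS:GO, Dota 2, TF2)": 900,
    "older / indie games": 250,
    "new AAA benchmark-style games": 50,
}

total_hours = sum(hours.values())
for category, h in hours.items():
    print(f"{category}: {h / total_hours:.0%} of play time")
# With a split like this, the demanding titles that dominate reviews account
# for only a few percent of the hours a GPU actually spends rendering.
```

The exact numbers will differ for everyone, which is exactly why asking what people actually play matters.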

All I can really add is that asking what resolution people run on their monitor when recommending video cards is actually a good thing to be doing, but it doesn't tell the whole story. We should also be asking what games they predominantly play and for how long, and whether it's important to them that everything runs maxed out, or whether they're happy for 1-2% of games to need a few settings lowered.

And saying that there's no way to call a GPU good enough for 1080p or 1440p or whatever, just because there exists some application, game or mod out there which it cannot run at that resolution, is unhelpful nonsense. I could write a mod for the Unreal engine which is just a particle generator that spawns a trillion model penises, each with a million polygons, and flings them around the room, and it would run at 0.0001 fps at 1080p with four Titans. So what? That doesn't describe what the card can do in the main. It's a balance between edge cases, the unusually demanding games found in modern benchmarks, and the vast body of other games out there, with some consideration of how much each one is played.

Just be wary that benchmarking websites deliberately use the most graphically demanding games to stress the GPU, and this is absolutely not representative of the typical gamer's library or play time.
A big section of the PC gaming crowd only plays certain games which aren't very demanding. However, those gamers don't usually visit these forums and are not representative of the sort of gamer who comes to forums for help.

Your argument about total games is totally invalid. We spend money on high-end hardware for those games which actually need it. The less demanding games have nothing to do with our decision to spend money on expensive parts.

If you had actually paid attention, you would see that one of the games in my OP is from 2013. Benchmarking a GPU with the most demanding titles currently available shows us how long the card will possibly hold up in the future, as future games will be much more demanding than the ones available today.
 

NTMBK

Lifer
Nov 14, 2011
10,239
5,024
136
lol those results in the OP all looked fine to me. They're all way over 30fps, looks good. You guys are too fussy.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I haven't read the full thread, will in a bit, but some of us have more than one "monitor."

Hell, I play Castle Crashers on my GTX 1080 fed to the big screen in the office :D Nothing like some couch co-op to liven up the afternoon!

Also Dark Souls 3 @ 4K when I use the TV, or (now) 3440x1440 on the desk.

With this massive shift of Japanese developer support to PC, there are going to be a lot more games I'll be playing @ 1080p on the TV versus the monitor. And if I can get away with some DSR 4K upscaling - consider that GTX 1080 cost well spent :D

For most games I played before I got my new monitor, if I had excess GPU headroom @ 1440p, I'd up the resolution. Beats some of the other types of AA for sure.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
I grew up with Goldeneye on the N64, chugging away at like 12fps. You kids are spoilt.
No. When I didn't have the money, I played games on PC with quite dismal performance.

The point here is about when you DO have a choice: spend more money to play at a higher resolution with lower performance, or play at a lower resolution with high performance and save money in the process as well.

Sent from my HTC One M9
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
No. When I didn't have the money, I played games on PC with quite dismal performance.

The point here is about when you DO have a choice: spend more money to play at a higher resolution with lower performance, or play at a lower resolution with high performance and save money in the process as well.

Sent from my HTC One M9

With my new G-Sync monitor, and how broken PhysX seems to be on Win10 systems, 30-35 FPS doesn't seem so bad! haha.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
A big section of the PC gaming crowd only plays certain games which aren't very demanding. However, those gamers don't usually visit these forums and are not representative of the sort of gamer who comes to forums for help.

I don't think that has been demonstrated with evidence at all. In my decades on hardware forums, my anecdotal experience has been of speaking to people with wildly different levels of expertise in PC hardware, and those with a lot of experience typically already know, or believe they know, what hardware to get.

Your argument about total games is totally invalid. We spend money on high-end hardware for those games which actually need it. The less demanding games have nothing to do with our decision to spend money on expensive parts.

No, that's what you do; you're assigning your behaviour to others, and again there's simply no evidence presented to suggest this is the case. Some people buy high-end hardware because they want extremely smooth frame rates, in the region of 120-144 fps or higher; some buy it to max out video settings; others do it because they're just enthusiasts who like benchmarking and overclocking. Others do it because they play some games but also dabble in GPGPU work such as crypto mining, hash cracking, folding or whatever else.

Even if we just focus on games and ignore the rest, it still matters, because not everyone approaches the problem with the mindset of "I want all my games to run at 1080p and at least 60 fps", typically because doing so means targeting extremely high-end hardware which can reach that standard in demanding games but is wasted money in less demanding ones. So it becomes a trade-off across an entire library of games: are you going to spend the extra money on 2 Titans vs a single 980 to get a tiny fraction of those games to meet that standard? Outside of enthusiasts and people with a lot of money, the answer is typically no, and evidence like the Steam hardware stats backs this up; only a tiny number of people are on that fringe. Just like benchmarking results using top-end games are fringe compared with the larger scope of all the games that gamers play.
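
To put rough numbers on that trade-off (all the prices and hours below are assumptions invented for the example, not quotes from anywhere), the sort of sum I have in mind looks like this:

```python
# Illustrative trade-off: extra spend vs. how much of your play time benefits.
# Every figure is a made-up assumption, just to show the shape of the decision.

single_card_cost = 500.0      # hypothetical price of the single-GPU option
dual_card_cost = 2000.0       # hypothetical price of the two-Titan option
extra_spend = dual_card_cost - single_card_cost

yearly_play_hours = 600.0     # assumed total hours gamed per year
share_needing_top_end = 0.05  # assumed share of those hours spent in games
                              # that actually need the extra horsepower

hours_that_benefit = yearly_play_hours * share_needing_top_end
print(f"Extra cost per benefiting hour (year one): ${extra_spend / hours_that_benefit:.2f}")
# With these assumptions it works out to ~$50 per hour of play that actually
# benefits in the first year; whether that's worth it is a per-person call.
```

Change the assumed hours or the share of demanding games and the answer flips, which is the whole point.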

If you had actually paid attention, you would see that one of the games in my OP is from 2013. Benchmarking a GPU with the most demanding titles currently available shows us how long the card will possibly hold up in the future, as future games will be much more demanding than the ones available today.

Sure, but the argument here is about a graphics card being overkill for a specific resolution. On aggregate, new titles tend towards greater graphical complexity over time, so nothing is ever going to be completely future-proof. This is a red herring: if your standard is that it has to handle all future games, then of course no video card is 1080p-ready or 1440p-ready.

The simple fact is the situation for each person is different; what they play is different and how they decide to buy hardware is different. Sometimes, factoring these things in, it's perfectly reasonable to say that video card X is overkill for resolution Y. And when people ask for advice, we should endeavour to inquire about these different factors when giving our opinions. In general this forum is pretty good at doing that.
 

NTMBK

Lifer
Nov 14, 2011
10,239
5,024
136
No. When I didn't have the money, I played games on PC with quite dismal performance.

The point here is about when you DO have a choice: spend more money to play at a higher resolution with lower performance, or play at a lower resolution with high performance and save money in the process as well.

Sent from my HTC One M9

Aha, I get the best of both worlds with my HD7770, I get to play at low resolutions and still get low performance!
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Those video cards which only get fully stressed in a couple of games (not actually a realistic thing, mind you) are going to justify themselves by lasting a lot longer while also providing you with a superior experience in those few games. This does not apply to multi-GPU setups.

Sent from my HTC One M9
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
The OP and others are confused and reaching the wrong conclusions because they're evaluating performance at predefined graphics settings.

This is almost a circular argument, since the defined settings are based upon arbitrary performance goals in the first place. But as you consider the graphics settings available, especially at the top end, you'll see this line of thinking is backwards. One must remember that these particular maximum settings were chosen for a reason.

To put it simply: maximum graphics settings in most AAA games are less about graphical fidelity than about keeping the customer happy. The customer who cares most about running games at maximum settings is obviously the same customer who buys a powerful computer, and they want to know they're making the most of that powerful computer. I suppose it's a form of confirmation bias, and this thread is another great example of it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
No. When I didn't have the money, I played games on PC with quite dismal performance.

The point here is about when you DO have a choice: spend more money to play at a higher resolution with lower performance, or play at a lower resolution with high performance and save money in the process as well.

Sent from my HTC One M9

Your entire argument is too black and white, since it fails to account for how most real-world gamers approach gaming:

1) A lot of PC gamers approach PC gaming from a compromise perspective -- i.e., what they want the overall PC gaming experience to be. Is it worth spending thousands of dollars more to max out 2-3 games? Is it worth putting up with the mediocre, outdated real estate of a small 1080p 24" monitor when the PC can also be used for work, media consumption and enjoying the 95% of other games that run well?

2) A good monitor lasts 5, 7 or even more years. In that time, the type of gamer who buys a GTX 1070/1080-level card will have upgraded several times. I am sure many gamers would agree that spending $500-1000 on a monitor is actually a worthwhile investment, while a $600-700 GPU depreciates like a rock and becomes outdated (the $250 RX 480/1060 are now as fast or faster in modern titles than a $700 780 Ti or $550 980). The GTX 1080 will suffer the same fate, even worse since it's even more overpriced than the 980 was. Three years from now, the $500-1000 monitor will still be amazing but the GTX 1080 will be a $250-level GPU at most.

3) Not everyone wants to play games with everything at Ultra. I would easily take 32" 2560x1440 at High/Very High settings over 19-24" 1080p at Ultra in 98% of games. I know there will be X number of games that will never run maxed out or at a locked 60 fps at 1080p on anything but a GTX 1080/Titan XP, but 30-50x more games will run like butter on a 32" 2560x1440 monitor. I will not magically accept gaming on a low-quality 19-24" 1080p 60Hz monitor for those few games at the expense of enjoying all the other games that look and run amazing. Maybe you don't play RTS/strategy games, but there is absolutely no comparison between the experience of a 19-24" 1080p monitor and a 32-40" 2560x1440/3440x1440/4K monitor in that genre. Same for racing games -- the larger the screen, the greater the immersion. Playing racing games on a 24" monitor is a mediocre experience to me, no matter the FPS or resolution. At that point I'd rather play racing games on a PS4/XB1 and a large LCD/plasma TV. You aren't taking into account that different gaming genres require different FPS and benefit differently from larger screens/real estate. Maybe you play competitive FPS like CS:GO and want 200-300 fps and the narrow field of view of a 19-24" monitor? That's great! And someone else wants to play indie games on a 40-60" 4K screen in all their cartoon/devinia glory.

4) You are also arbitrarily assigning higher weighting to certain PC gaming genres that require 60 fps over others. For instance, many gamers can enjoy the RTS/strategy genre or 3rd-person action/adventure games without a locked 60 fps. Those genres don't "require" the same level of smoothness as FPS games or online competitive gaming.

5) You are ignoring new adaptive-sync monitor technologies (FreeSync/G-Sync) that allow a smooth PC gaming experience below 60 fps.

Your line of thinking would have ensured that I was still gaming on my 1998 19" CRT, stuck at 1600x1200, because there have always been games that were too demanding for a graphics card of any generation. There will always be certain settings, like HairWorks, that simply waste performance and aren't worth it.

The odd part about all of this is that you are trying to maintain that "1080p 60 fps locked gaming" is a must for an enjoyable PC gaming experience, and yet you owned an HD 7850 2GB before you got a GTX 1070? How is that even possible when the HD 7850 could not achieve 1080p 60 fps even in the modern games of 2012?!

[Benchmark charts at 1920x1200: Alien vs. Predator, Arkham City, Battlefield 3, Civilization V, Crysis, Dragon Age 2, Hard Reset, Metro 2033, Shogun 2, Skyrim]

That was in 2012: according to your own criteria, the HD7850 that you supposedly owned for almost 4 years was outdated on Day 1, and yet you "suffered" through this sub-60 fps 1080p gaming for that long.

How do you expect your argument that the GTX 1070/1080 are only good enough for 1080p and no more to come across? Not only are you contradicting how you yourself experienced PC gaming for the last 3-4 years, but now that you suddenly have a GTX 1070, it's only good enough for 1080p 60Hz? What an illogical argument. Using that logic, should people with GPUs below that be gaming at 1024x768, 1280x800, 1366x768, 1280x1024 or 1600x900 only? Secondly, your argument rests entirely on the assumption that as long as there are some examples of AAA titles that smash high-end GPUs at 1440p, that's sufficient to prove that the GTX 1070/1080 are not 1440p video cards. The problem with that argument is that it heavily weighs a handful of titles while ignoring the 90-95% of the gaming library that PC gamers have on Steam that would look glorious at 2.5K or even 4K. There will always be titles that can max out a $700 GPU, but that doesn't magically make the GTX 1080 a 1080p card.

The other tendency on forums is that once new GPUs come out, suddenly the old GPUs are no longer fast enough for 1080p. That's not how it works. Just because DE:MD and Anno 2205 smash modern cards at 1080p, it doesn't mean that on average RX 480/1060/R9 390/980 aren't excellent 1080p cards for the vast majority of titles out there. Obviously everyone has their own opinion but I'd rather take a $750-1000 monitor with a $250 GPU than a $700-1000 GPU with a $250 monitor.
 

TidusZ

Golden Member
Nov 13, 2007
1,765
2
81
There are two big problems with the original argument:
1. The fps in those benches are significantly lower than in AnandTech Bench: http://www.anandtech.com/bench/product/1714

2. I don't think anyone would argue that GTX 1080 is overkill for 1440p. It's like arguing an F1 car is overkill for driving in the Grand Prix. When people say it's overkill they'd more likely be talking about a 1080 being overkill for 1080p @ 60hz, or a low resolution laptop.

In some cases overkill might depend on the person's budget, but if a card is really, really overkill, there will literally be no benefit over a cheaper card (e.g., a 1070 will get >60 fps at 1080p@60Hz until the end of time, so a 1080 for that setup is 100% overkill).
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Your entire argument is too black and white, since it fails to account for how most real-world gamers approach gaming:

1) A lot of PC gamers approach PC gaming from a compromise perspective -- i.e., what they want the overall PC gaming experience to be. Is it worth spending thousands of dollars more to max out 2-3 games? Is it worth putting up with the mediocre, outdated real estate of a small 1080p 24" monitor when the PC can also be used for work, media consumption and enjoying the 95% of other games that run well?

2) A good monitor lasts 5, 7 or even more years. In that time, the type of gamer who buys a GTX 1070/1080-level card will have upgraded several times. I am sure many gamers would agree that spending $500-1000 on a monitor is actually a worthwhile investment, while a $600-700 GPU depreciates like a rock and becomes outdated (the $250 RX 480/1060 are now as fast or faster in modern titles than a $700 780 Ti or $550 980). The GTX 1080 will suffer the same fate, even worse since it's even more overpriced than the 980 was. Three years from now, the $500-1000 monitor will still be amazing but the GTX 1080 will be a $250-level GPU at most.

3) Not everyone wants to play games with everything at Ultra. I would easily take 32" 2560x1440 at High/Very High settings over 19-24" 1080p at Ultra in 98% of games. I know there will be X number of games that will never run maxed out or at a locked 60 fps at 1080p on anything but a GTX 1080/Titan XP, but 30-50x more games will run like butter on a 32" 2560x1440 monitor. I will not magically accept gaming on a low-quality 19-24" 1080p 60Hz monitor for those few games at the expense of enjoying all the other games that look and run amazing. Maybe you don't play RTS/strategy games, but there is absolutely no comparison between the experience of a 19-24" 1080p monitor and a 32-40" 2560x1440/3440x1440/4K monitor in that genre. Same for racing games -- the larger the screen, the greater the immersion. Playing racing games on a 24" monitor is a mediocre experience to me, no matter the FPS or resolution. At that point I'd rather play racing games on a PS4/XB1 and a large LCD/plasma TV. You aren't taking into account that different gaming genres require different FPS and benefit differently from larger screens/real estate. Maybe you play competitive FPS like CS:GO and want 200-300 fps and the narrow field of view of a 19-24" monitor? That's great! And someone else wants to play indie games on a 40-60" 4K screen in all their cartoon/devinia glory.

4) You are also arbitrarily assigning higher weighting to certain PC gaming genres that require 60 fps over others. For instance, many gamers can enjoy the RTS/strategy genre or 3rd-person action/adventure games without a locked 60 fps. Those genres don't "require" the same level of smoothness as FPS games or online competitive gaming.

5) You are ignoring new adaptive-sync monitor technologies (FreeSync/G-Sync) that allow a smooth PC gaming experience below 60 fps.

Your line of thinking would have ensured that I was still gaming on my 1998 19" CRT, stuck at 1600x1200, because there have always been games that were too demanding for a graphics card of any generation. There will always be certain settings, like HairWorks, that simply waste performance and aren't worth it.

The odd part about all of this is that you are trying to maintain that "1080p 60 fps locked gaming" is a must for an enjoyable PC gaming experience, and yet you owned an HD 7850 2GB before you got a GTX 1070? How is that even possible when the HD 7850 could not achieve 1080p 60 fps even in the modern games of 2012?!

[Benchmark charts at 1920x1200: Alien vs. Predator, Arkham City, Battlefield 3, Civilization V, Crysis, Dragon Age 2, Hard Reset, Metro 2033, Shogun 2, Skyrim]

That was in 2012: according to your own criteria, the HD7850 that you supposedly owned for almost 4 years was outdated on Day 1, and yet you "suffered" through this sub-60 fps 1080p gaming for that long.

How do you expect your argument that the GTX 1070/1080 are only good enough for 1080p and no more to come across? Not only are you contradicting how you yourself experienced PC gaming for the last 3-4 years, but now that you suddenly have a GTX 1070, it's only good enough for 1080p 60Hz? What an illogical argument. Using that logic, should people with GPUs below that be gaming at 1024x768, 1280x800, 1366x768, 1280x1024 or 1600x900 only? Secondly, your argument rests entirely on the assumption that as long as there are some examples of AAA titles that smash high-end GPUs at 1440p, that's sufficient to prove that the GTX 1070/1080 are not 1440p video cards. The problem with that argument is that it heavily weighs a handful of titles while ignoring the 90-95% of the gaming library that PC gamers have on Steam that would look glorious at 2.5K or even 4K. There will always be titles that can max out a $700 GPU, but that doesn't magically make the GTX 1080 a 1080p card.

The other tendency on forums is that once new GPUs come out, suddenly the old GPUs are no longer fast enough for 1080p. That's not how it works. Just because DE:MD and Anno 2205 smash modern cards at 1080p, it doesn't mean that on average RX 480/1060/R9 390/980 aren't excellent 1080p cards for the vast majority of titles out there. Obviously everyone has their own opinion but I'd rather take a $750-1000 monitor with a $250 GPU than a $700-1000 GPU with a $250 monitor.
1. This forum is dominated by gamers. I do buy hardware that has no benefit for gaming but I don't buy anything which has a negative impact on gaming. If a gamer wants more real estate they can always go for a multi-monitor solution.

2. Actually this isn't as much the case as it may seem. My biggest gripe with the more expensive monitors is that they don't offer any IQ improvements. They just throw more pixels and a higher refresh rate at you, but the fundamental IQ remains the same. All current high-end monitors will be terribly outdated once OLED and HDR screens come into the mainstream.

3. Again, your assumption that 1080p screens are "low quality" is entirely wrong, as you are implying they have worse IQ. I agree with you on the part about racing games, but why would you need a console for that? You can just buy a TV as a secondary display for your PC, which I am actually thinking of doing myself.

4. I don't agree that 3rd-person adventure games don't benefit from high FPS. You are right about the strategy genre.

5. Fair point, but such monitors are still not that common. An average user, unless specifically advised, is unlikely to end up with one. You also have to make some compromises in other areas of the display if you want these technologies, because the selection is still not very good.

I was in fact playing at 1680x1050 a bit over a year ago, but my monitor died and by then all the good monitors were at least 1080p. I also wanted to move to 16:9, as some games had started letterboxing at 16:10.

There are several reasons why I ended up keeping a 7850 until the first half of 2016. I bought it initially due to severe budget constraints, and the situation didn't change for a long time. I was also on a Core 2 Quad platform until 2015, so that was the bottleneck making things difficult for me. I had to make a full platform upgrade before a GPU upgrade, and then I was waiting a long time for the next-gen cards.

I haven't played several games from the last several years. I simply put aside all the games that I couldn't run at better than medium settings. You could say I have a lot of patience in delaying games; I also hardly ever buy games at full price.

Now my financial situation allows me to get back to higher-end gaming. I am not trying to demean mid-range gaming in any way. My argument is for those people who do have the money, not those who don't.

Sent from my HTC One M9
 

amenx

Diamond Member
Dec 17, 2004
3,909
2,130
136
3) Not everyone wants to play games with everything at Ultra. I would easily take 32" 2560x1440 at High/VH settings over 19-24" 1080p at Ultra in 98% of games. I know there will be X number of games that will never run maxed out or at locked 60 fps at 1080p on anything but a GTX1080/Titan XP but 30-50x more games that will run like butter on a 32" 2560x1440 monitor. I will not magically accept gaming on a low quality 19-24" 1080p 60Hz monitor for those few games at the expense of enjoying all the other games that look and run amazing.
Absolutely. I will always prefer 1440p with lowered settings to anything 1080p maxed out. Also, generally, 1440p monitors are of higher-quality build/specs than 1080p ones, so aside from the higher resolution, IQ is typically better. There are often many crippling settings that are more gratuitous in nature than they are noticeable improvements in IQ. If maxing out costs you a 30% penalty in FPS, it doesn't mean you will automatically get a 30% improvement in IQ, yet I think that's the impression many people seem to have. Oftentimes it's barely noticeable. I'm surprised at how many people with very powerful GPUs are still stuck on 1080p. If I were to ever come across an unusually crippling game that was impossible to run at 1440p with lowered settings, I simply would not touch it. Haven't come across one yet.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Absolutely. I will always prefer 1440p with lowered settings to anything 1080p maxed out. Also, generally, 1440p monitors are of higher-quality build/specs than 1080p ones, so aside from the higher resolution, IQ is typically better. There are often many crippling settings that are more gratuitous in nature than they are noticeable improvements in IQ. If maxing out costs you a 30% penalty in FPS, it doesn't mean you will automatically get a 30% improvement in IQ, yet I think that's the impression many people seem to have. Oftentimes it's barely noticeable. I'm surprised at how many people with very powerful GPUs are still stuck on 1080p. If I were to ever come across an unusually crippling game that was impossible to run at 1440p with lowered settings, I simply would not touch it. Haven't come across one yet.
Please show me which of these monitors has superior IQ. Last time I checked, 1440p monitors suffered from the same backlight bleed and screen-uniformity issues, and they don't have any better dead-pixel policy either. My $250 Dell monitor has the same dead-pixel policy as a $1000 one.

As far as IPS panels are concerned, the bigger they get the worse the IPS glow becomes, and most G-Sync/FreeSync monitors are IPS or TN.

Sent from my HTC One M9
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
The fabled ASUS ROG series of monitors have laughably bad IQ for their monstrous prices.

Sent from my HTC One M9
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Even if we just focus on games and ignore the rest, it still matters, because not everyone approaches the problem with the mindset of "I want all my games to run at 1080p and at least 60 fps", typically because doing so means targeting extremely high-end hardware which can reach that standard in demanding games but is wasted money in less demanding ones. So it becomes a trade-off across an entire library of games: are you going to spend the extra money on 2 Titans vs a single 980 to get a tiny fraction of those games to meet that standard?

It depends on what games you spend the most time on. If that 2% of games in your library accounts for 90% of your gaming time going forward, it matters. If you are a hardcore gamer who plays new releases as they come out, that huge library might just be already-played games or what I call "Steam decoration" (i.e., games you buy on sale and never play).

What matters is how you spend your time, and some people want to play new releases at max settings. Other people want to play esports titles, or whatever they call older but popular games like CS:GO and Dota 2 nowadays. That is why the third question after resolution and budget is what games you play or plan to play.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
And IPS "glow" is usually not even noticeable in actual use. I've been using IPS panels since 2007 and would never go back to TN. My Acer XR341CK is amazingly beautiful.
Hate to say this but you must not have very high standards of IQ if you can't notice IPS glow. I never said anything about TN being good.

Sent from my HTC One M9