Do frames per second and resolution have a linear relationship?

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
If a game needs to render twice the number of pixels in a frame, will that take twice the time?

Suppose I run 1920x1080 at (exactly) 60 fps.
Suppose I replace my monitor with a 2560x1440 monitor.
The new monitor has a third more pixels both horizontally and vertically.
The total number of pixels is (4/3) x (4/3) = 16/9 ≈ 1.78 times as many.
That's roughly a 78% increase in pixels.

So at 1920x1080 my machine renders 124,416,000 pixels per second.
That means that at 2560x1440, if it pushes the same number of pixels per second, it can reach only 33.75 frames per second.
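
Here's that back-of-the-envelope calculation as a quick Python sketch. The only inputs are the two resolutions and my 60 fps starting point; the assumption that the card pushes a fixed number of pixels per second is exactly the thing I'm questioning.

Code:
# Naive model: the GPU pushes a fixed number of pixels per second,
# so fps scales inversely with the pixel count of a frame.
old_res = (1920, 1080)
new_res = (2560, 1440)
old_fps = 60.0

pixels_old = old_res[0] * old_res[1]      # 2,073,600
pixels_new = new_res[0] * new_res[1]      # 3,686,400

throughput = pixels_old * old_fps         # 124,416,000 pixels/s at 1080p
ratio = pixels_new / pixels_old           # ~1.78, i.e. ~78% more pixels
predicted_fps = throughput / pixels_new   # 33.75 fps if scaling were linear

print(f"pixel ratio: {ratio:.2f}, predicted fps at 1440p: {predicted_fps:.2f}")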

Ouch. That's painful. Dropping from 60 fps to 34 fps.

I want to replace my old 1920x1080 screen with a G-Sync screen.
The Acer XB270HU is a nice screen, because it's the only IPS screen with G-Sync. But it is a 2560x1440 screen. That means my framerates will drop.

Note: G-Sync works best with fluctuating framerates, and when framerates are below 60 fps. But it works less well when framerates drop under 30 fps (to be precise: when the game sometimes takes longer than 33 milliseconds to render the next frame). Today, in many games I get a nice 60 fps (I have a gtx680). In (older) games that are less demanding, I often enable SGSSAA and SSAO to make the game look better, while trying to maintain 60 fps or close to it. (Examples: The Witcher 1 looks great with SSAO and 4xSGSSAA. I also used 4xSGSSAA in WoW when TXAA was the only option). The point I'm trying to make: today I aim for 60 fps with both new games and older games.

Now if my framerate in all those games drops from ~60 to ~34, I'm getting very close to the minimum fps that is still smooth even with G-Sync. What did I buy for my 750 euros? I had smooth fps at 1080p before, and now I might have less smooth fps at ~34 fps. The only gain is a higher resolution. And tbh, I don't care much about resolution. 1080p at 27" is fine for me. I'd rather have SGSSAA, SSAO and other eye candy than just a higher resolution.

My whole concern depends on one question:
Does framerate really scale linearly with resolution?

I understand that a game can also be CPU-limited. But at the moment, most games are not. Maybe RTSs or MMOs, but not the RPGs, FPSs and adventure games I play. And when DX12 arrives, even fewer games will be CPU-limited.

So what other factors are there? If a game is fill-rate limited, or bus-bandwidth limited, I suspect that the game and videocard can push out only a fixed number of pixels per second, no more than the fill rate or bus allows, and thus fps and resolution would have a linear (inverse) relationship. Anything else? Does anyone know of examples of games where the relationship is not linear?
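
To make that concrete, here's a rough sketch of what a pure throughput cap would look like. The fill-rate, bandwidth and overdraw numbers are invented for illustration; they are not my gtx680's real specs, and real games hit plenty of other limits first.

Code:
# Hypothetical hardware caps -- the numbers are made up for illustration.
FILL_RATE = 30e9          # pixels/s the ROPs could write
BANDWIDTH = 190e9         # bytes/s of memory bandwidth
SHADES_PER_PIXEL = 40     # guess: times each output pixel is touched per frame
BYTES_PER_SHADE = 8       # guess: colour + depth traffic per touch

def max_fps(width, height):
    """Upper bound on fps if only fill rate or bandwidth limits you."""
    work = width * height * SHADES_PER_PIXEL
    fps_fill = FILL_RATE / work
    fps_bw = BANDWIDTH / (work * BYTES_PER_SHADE)
    return min(fps_fill, fps_bw)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: at most {max_fps(w, h):.0f} fps under this model")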
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
No; due to bottlenecking at various places in the graphics pipeline, some cards drop frame rates faster than others as resolution goes up. You can see this with the GTX 970 compared to the 290X: the 970 typically leads at 1080p, leads less or ties at 1440p, and loses at 4k.

Fill rate, bus bandwidth, etc., are all fine in theory, but it really boils down to how the card actually performs in the real world. Maxwell and Tonga have high-quality memory compression, which reduces memory bandwidth requirements on the same size bus. You can't really tell much from those theoretical maximum figures.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
It's not perfectly linear, but pretty close.

Check out a GPU review on TechPowerUp; they test at different resolutions.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
No; due to bottlenecking at various places in the graphics pipeline, some cards drop frame rates faster than others as resolution goes up. You can see this with the GTX 970 compared to the 290X: the 970 typically leads at 1080p, leads less or ties at 1440p, and loses at 4k.
I forgot to mention VRAM.
Of course, the size of the VRAM functions as a hard limit. If you go over it, performance will plummet. So in cases where your card doesn't have enough VRAM for 1440p, performance will be even worse than a linear scale predicts.

it really boils down to how the card actually performs in the real world
Understood. That's why I'm asking my question here, instead of reading and comparing benchmarks at every resolution for a few dozen games on a dozen websites.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
Check out a GPU review on TechPowerUp; they test at different resolutions.

Thanks. Do you mean all of their benchmarks every time a new GPU gets released? Or do they have a specific benchmark on the subject?

It's not perfectly linear, but pretty close.
That was my gut feeling too.
It's weird. It seems many people want a 4k screen for gaming.
A 4k screen has 4x as many pixels as a 1080p screen.
And therefore framerates will drop to a quarter.

Suppose you can (just) maintain 144 fps on your 144Hz screen at 1080p.
That means that at 4k, you can maintain only 36 fps.
If I had to choose between 144fps@1080p or 36fps@4k, then the choice is easy. I'd go for 144fps@1080p.
Yes, I realize that some people will buy two Titan Xs. But even though I have the money to buy overpowered hardware, I'd rather spend it a little smarter. And even then, I'd rather throw more eye candy at my hardware than just a larger resolution.

The dilemma that triggers these questions for me is the Acer XB270HU. I would really like a new monitor with G-Sync and ULMB, and an IPS panel on top of that would be awesome. But the XB270HU is 2560x1440, and thus my framerates would drop significantly. I could replace my gtx680, but I was planning to replace it only when the first new 20nm (or 16nm) GPUs are released. Unfortunately, that's going to take another 12 months or so.

An alternative would be the 1080p TN G-Sync monitor that Acer released last year. I hadn't realized it, but that monitor is also 8-bit (just like the ROG Swift). It is still not IPS, which is the big downside. But the lower resolution might actually be the factor that matters most.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
There are a ton of variables that make it not so simple. I'm going to assume off the bat that you're not CPU limited in the first place. Some of the workload done on the GPU scales with geometry instead of resolution, like vertex shading, tessellation, and triangle setup. Larger polygons may end up more efficient than smaller ones due to the granularity of rasterization and quad shading. If textures are already at their highest level of detail at the lower resolution, an increase in resolution won't increase their texture bandwidth footprint.

On the other hand, the resolution increase can push working sets outside of a sweet spot that fits nicely in cache and buffer structures, resulting in a worse than linear degradation.
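
To see the shape of this, here's a toy frame-time model in Python. The millisecond figures are invented purely for illustration, not measurements of any real card.

Code:
# Toy model: part of the frame cost scales with pixel count,
# part of it (vertex work, triangle setup, submission) does not.
# Both constants below are invented for illustration.
FIXED_MS = 4.0            # per-frame cost that ignores resolution
PER_MPIXEL_MS = 6.0       # pixel-bound cost per million pixels

def fps(width, height):
    mpixels = width * height / 1e6
    frame_ms = FIXED_MS + PER_MPIXEL_MS * mpixels
    return 1000.0 / frame_ms

f1080 = fps(1920, 1080)   # ~61 fps with these numbers
f1440 = fps(2560, 1440)   # ~38 fps, i.e. ~63% of the 1080p figure
print(f"1080p: {f1080:.1f} fps, 1440p: {f1440:.1f} fps "
      f"({f1440 / f1080:.0%} retained vs. 56% for pure pixel scaling)")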

Like others have said, your best bet is to look at GPU reviews running the games you're interested in, or to ask people to do some benchmarks for you.

This also kind of goes without saying, but it's not like you have to use native resolution on the new display. Maybe it's implied that upscaling is unacceptable to you; personally I don't think it's that bad for things like games.
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
Gryz, I'm in a similar situation to you. I've been holding out on going past 1080p for years simply because I am an fps fiend. For me, a smooth frame rate is anything that stays above 75-80 fps constantly. While high-resolution panels have been out for years, I have been waiting for the GPU horsepower to match those increases in resolution.

I believe that we are finally there: the horsepower is there to give me the fps I want at higher res, and ALSO technologies such as G-Sync and other panel tech (like low response times) have finally caught up to allow me to upgrade.

If I were you, I would of course not feel comfortable driving a 2560x1440 resolution with a GTX 680, but then again I am not you, and you may have much less need for those higher frame rates in every title.

My question is: if you are willing to drop $800 on a premier panel, why not drop a little more on a high-clocked GTX 970 or a used 980? Then, in my opinion, you will have the best of both worlds.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
I've done a little number crunching.

I took this data as my source.
http://www.anandtech.com/bench/product/1442
Recent benchmarks of the gtx980.
Most games were benchmarked at 1440p and 1080p.

If you look purely at the number of pixels, then you could expect 60 fps at 1080p to translate to 33.75 fps at 1440p. That means at 1440p you can render only 56.25% of the frames that you can render at 1080p.

Now let's look at the measured numbers.
Format: game - fps at 1080p - fps at 1440p = percentage retained at 1440p

Battlefield 4 - 91.3 - 58.9 = 64.5%
Crysis - 103.6 - 65.7 = 63.4%
Mordor - 93.6 - 65.3 = 69.8%
Civilization - 124.9 - 93.8 = 75.1%
Dragon Age - 88.2 - 59.3 = 67.2%
Talos Principle - 102.2 - 71.8 = 70.3%
Far Cry 4 - 81.9 - 58.8 = 71.8%
Total War - 54.4 - 36.3 = 66.7%
Grid - 121.6 - 92.9 = 76.4%

The framerates at 1440p are between 63.4% and 76.4% of the framerates at 1080p. That's a lot better than the expected 56.25%.


Numbers for the gtx680.
http://www.anandtech.com/bench/product/1348

CoH2 - 37.4 - 23.8 = 63.7%
BioShock - 81 - 51.6 = 63.7%
Battlefield 4 - 53.5 - 34.6 = 64.7%
Crysis 3 - 66.3 - 42.1 = 63.5%
Crysis WH - 53.4 - 33.8 = 63.3%
Total War - 48.1 - 29.3 = 60.9%
Thief - 48 - 31.2 = 65%
Grid 2 - 82.9 - 58.7 = 70.8%
Metro LL - 52.7 - 36.4 = 69% (note: quality was dropped from VHQ to HQ).

With a gtx680, the framerates at 1440p are between 60.9% and 70.8% of the framerates at 1080p, and most of the games sit at 63-65%. Still better than the expected 56.25%, but slightly worse than what we saw with the gtx980.
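
If you want to redo this for other cards, the arithmetic is trivial to script. Something like this, where the fps pairs are simply the Anandtech gtx980 numbers copied from the table above:

Code:
# Percentage of 1080p performance retained at 1440p, per game,
# compared against the 56.25% expected from pure pixel scaling.
EXPECTED = (1920 * 1080) / (2560 * 1440)   # 0.5625

gtx980 = {
    "Battlefield 4": (91.3, 58.9),
    "Crysis": (103.6, 65.7),
    "Mordor": (93.6, 65.3),
    "Civilization": (124.9, 93.8),
    "Dragon Age": (88.2, 59.3),
    "Talos Principle": (102.2, 71.8),
    "Far Cry 4": (81.9, 58.8),
    "Total War": (54.4, 36.3),
    "Grid": (121.6, 92.9),
}

for game, (fps_1080, fps_1440) in gtx980.items():
    retained = fps_1440 / fps_1080
    print(f"{game:16} {retained:.1%} retained (expected {EXPECTED:.2%})")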
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
simply because I am an fps fiend.
I am not. But I enjoy more eye candy: good AA, SSAO, good shadows, lots of foliage, higher polycounts, higher-res textures, etc. I'd rather have 50-60 fps with all of those than 144 fps with less eye candy.

My question is: if you are willing to drop $800 on a premier panel, why not drop a little more on a high-clocked GTX 970 or a used 980?
I might. But one fun part of building your own PC is trying to maximize the return on every euro invested. Just throwing money at a problem is not the most fun solution. I was planning to buy a new videocard every 2-3 years, but with the slowdown in transistor scaling, it's taking even longer than that to go from 28nm to something smaller.

Also, I have a watercooled system, because I want my PC to be (almost totally) quiet: watercooled CPU and GPU, low-rpm fans, a large SSD, a suspended HDD, etc. Silencing the water pump was the hardest part. If I buy a new videocard, I also have to buy a new waterblock (80-100 euros), and my old videocard is harder to sell because of the voided warranty. A gtx980 is 600 euros here, so 700 with a waterblock. And a Titan X is 1200 euros. Crazy numbers. And I don't even have a game I want to play atm. (Gogogo GTAV and Witcher3!)
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
I am not. But I enjoy more eye candy: good AA, SSAO, good shadows, lots of foliage, higher polycounts, higher-res textures, etc. I'd rather have 50-60 fps with all of those than 144 fps with less eye candy.


I might. But one fun part of building your own PC is trying to maximize the return on every euro invested. Just throwing money at a problem is not the most fun solution. I was planning to buy a new videocard every 2-3 years, but with the slowdown in transistor scaling, it's taking even longer than that to go from 28nm to something smaller.

Also, I have a watercooled system, because I want my PC to be (almost totally) quiet: watercooled CPU and GPU, low-rpm fans, a large SSD, a suspended HDD, etc. Silencing the water pump was the hardest part. If I buy a new videocard, I also have to buy a new waterblock (80-100 euros), and my old videocard is harder to sell because of the voided warranty. A gtx980 is 600 euros here, so 700 with a waterblock. And a Titan X is 1200 euros. Crazy numbers. And I don't even have a game I want to play atm. (Gogogo GTAV and Witcher3!)


Yes, I see now why your situation is a bit complicated.

As Termie pointed out, the drop seems to be about 50%, so ask yourself if you are OK with that drop, or would be OK with lowering IQ settings to stay where you want to be.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
This article actually has the answers you're looking for, all summed up on one page:

http://www.techspot.com/news/60148-multi-monitor-ultra-wide-or-4k-what-delivers.html

The resolution jump from 1080p to 1440p is 77%. The performance penalty is around 50%, but depends on the game.
I didn't find that article very informative.
They tested only 3 games. There are no numbers, only a graph. The graph gives you a general feeling of "more pixels means lower fps". I already knew that.

The data taken from Anandtech's benchmarks indicate that the loss of fps is more like 36% (or 30%-40%, if you want to put it that way), not 50%.

I will check a few more benchmarks on other websites. But I've answered my own question: it's not a 1-to-1 relationship between pixels and fps. And my fps loss will be ~36% if I go from 1080p to 1440p.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
1440p may be 36% slower than 1080p, but looked at from the other direction, 1080p is more than 50% faster than 1440p.
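
Spelled out, since the inversion is easy to get wrong (using Gryz's ~36% figure and a 60 fps starting point as an example):

Code:
# Going from 1080p to 1440p costs ~36% of the frame rate...
drop = 0.36
fps_1080 = 60.0
fps_1440 = fps_1080 * (1 - drop)       # 38.4 fps

# ...which, read the other way, makes 1080p ~56% faster than 1440p.
speedup = fps_1080 / fps_1440 - 1
print(f"1440p: {fps_1440:.1f} fps, 1080p is {speedup:.0%} faster")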

Whether any of this matters to you depends entirely on your video card. Dropping from 90 to 60 on a 980 doesn't matter much. Dropping from 60 to 40 on something like a GTX 770 will ruin the experience.

You answered your own question, but don't forget your premise: starting at 60 fps at 1080p and then moving to 1440p, the outcome would be unplayable.