I've always thought that if you increase resolution by x%, fps drops proportionally: the original resolution would give you x% higher fps than the new one. For instance, 2560x1440 has 77% more pixels than 1920x1080, so 1920x1080 would have 77% higher fps, bottlenecks notwithstanding. And by extension, I've thought that a higher resolution requires a proportionally faster graphics card to reach the same fps.
And this is something I never really thought to question because it seemed obvious, but apparently it isn't true. Looking at fps numbers at different resolutions in literally any graphics card or game performance review, a doubling in resolution only results in about 1/3 lower fps. Or going from 1080p to 1440p (which has 77% more pixels) results in only 25-30% lower fps, and not 1 - (1/1.77) ≈ 44% lower as I expected.
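Just to show where my 44% number comes from, here's a quick sketch of the arithmetic I had in mind, assuming fps scales inversely with pixel count (which apparently it doesn't):

```python
# Quick sketch of my assumption: fps scales inversely with pixel count.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels

pixel_ratio = pixels_1440p / pixels_1080p   # ~1.77, i.e. 77% more pixels
expected_fps_drop = 1 - 1 / pixel_ratio     # ~0.44, i.e. ~44% lower fps

print(f"1440p has {pixel_ratio:.2f}x the pixels of 1080p")
print(f"Expected fps drop if scaling were linear: {expected_fps_drop:.0%}")
# Reviews instead show only ~25-30% lower fps at 1440p.
```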
Can someone explain why it works this way? If a GPU has twice as many pixels to render, shouldn't it take twice as long to put out each frame?