If a game needs to render twice the number of pixels per frame, will that take twice the time?
Suppose I run 1920x1080 at (exactly) 60 fps.
Suppose I replace my monitor with a 2560x1440 monitor.
The new monitor has a third more pixels both horizontally and vertically.
Total pixel count is 4/3 x 4/3 = 16/9 ≈ 1.78 times as many.
That's a 78% increase in pixels.
So at 1920x1080 my machine renders 1920 x 1080 x 60 = 124,416,000 pixels per second.
That means at 2560x1440, if it can push the same number of pixels per second, it reaches only 124,416,000 / (2560 x 1440) = 33.75 frames per second.
Ouch. That's painful: dropping from 60 fps to 34 fps.
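For anyone who wants to verify the numbers, here's the same calculation as a few lines of Python. It just restates my fill-rate-limited assumption (a fixed number of pixels per second, regardless of resolution); it isn't a claim about any particular game:

```python
# Sanity check of the arithmetic above, assuming the GPU is purely
# fill-rate limited: it pushes a fixed number of pixels per second.
old_pixels = 1920 * 1080        # 2,073,600 pixels per frame
new_pixels = 2560 * 1440        # 3,686,400 pixels per frame
old_fps = 60.0

pixels_per_second = old_pixels * old_fps    # 124,416,000
new_fps = pixels_per_second / new_pixels

print(f"pixel ratio: {new_pixels / old_pixels:.4f}")   # 1.7778 (16/9)
print(f"new fps:     {new_fps:.2f}")                   # 33.75
```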
I want to replace my old 1920x1080 screen with a G-Sync screen.
The Acer XB270HU is a nice screen, because it's the only IPS screen with G-Sync. But it's a 2560x1440 screen, which means my framerates will drop.
Note: G-Sync works best with fluctuating framerates and when framerates are below 60 fps. But it works less well when framerates drop under 30 fps (to be precise: when the game sometimes takes longer than 33 milliseconds to render the next frame). Today, in many games I get a nice 60 fps (I have a GTX 680). In (older) games that are less demanding, I often enable SGSSAA and SSAO to make the game look better, while trying to maintain 60 fps or close to it. (Examples: The Witcher 1 looks great with SSAO and 4xSGSSAA. I also used 4xSGSSAA in WoW when TXAA was the only option.) The point I'm trying to make: today I aim for 60 fps with both new games and older games.
Now if my framerate in all those games drops from ~60 to ~34, I'm getting very close to the minimum fps that is still smooth even with G-Sync. What did I buy for my 750 euros? I had smooth framerates at 1080p before, and now I maybe have less smooth framerates at ~34 fps. The only gain is a higher resolution, and to be honest, I don't care much about resolution. 1080p at 27" is fine for me. I'd rather have SGSSAA, SSAO and other eye candy than just a higher resolution.
My whole concern depends on one question:
Does framerate really scale linearly, up and down, with resolution?
I understand that a game can also be CPU-limited. But at the moment, most games are not. Maybe RTSs or MMOs, but not the RPGs, FPSs and adventure games I play. And when DX12 arrives, even fewer games will be CPU-limited.
So what other factors are there? If a game is fill-rate limited or bus-bandwidth limited, I suspect the game and video card can push out only a fixed number of pixels per second, no more than the fill rate or bus allows, and thus fps and resolution will have a linear (well, inverse) relationship. Anything else? Does anyone know of games where the relationship is not linear?
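To make the question concrete, here's a toy frame-time model of how I think about it. The `t_fixed` and `t_per_pixel` numbers are made up purely for illustration, not measurements of any real game or card: fps is only inversely linear in pixel count when the fixed per-frame cost is negligible.

```python
# Toy frame-time model (all numbers made up for illustration):
# each frame costs a fixed amount of time (CPU, geometry, draw-call
# overhead) plus a per-pixel amount (shading, fill rate, bandwidth).
def fps(pixels, t_fixed=0.004, t_per_pixel=6e-9):
    """t_fixed in seconds per frame, t_per_pixel in seconds per pixel."""
    return 1.0 / (t_fixed + t_per_pixel * pixels)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {fps(w * h):5.1f} fps")
# 1920x1080:  60.8 fps
# 2560x1440:  38.3 fps   (not 33.75: the fixed cost doesn't grow)
# 3840x2160:  18.6 fps   (below the ~30 fps / 33 ms G-Sync floor)
```

So the more of the frame time that is per-pixel work, the closer the drop gets to my worst case of 33.75 fps; the more fixed overhead there is, the milder the drop.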