4-to-1 Scaling: The Key to Smooth FPS on Ballooning Resolutions

know of fence

Senior member
May 28, 2009
555
2
71
People with a liquid crystal display have to put up with a lot of BS: "native" resolution, aliased text, latency, 60 Hz, 16:10, Twisted Nematic... I'll spare you the full list.
Meanwhile, geometrically increasing display resolutions on the horizon seem to add to a gamer's anxiety. Will I have to run Quad-SLI² just to get a measly 60 FPS on a "4K" display without any scaling artefacts?

The answer to this is a resounding YES! To drive 8 megapixels' worth of screen you need 4 times the power³ of a normal 1080p card. [Edit] After charting a graph with randomly selected performance data from the internet and numerically extrapolating the actual curves, it would appear that about 3.2 times the processing power would be required to drive a screen 4 times as large. Generously assuming video cards improve by 20% every two years, it will take around 13 years to reach that point. It deserves mentioning that multi-card setups scale less than linearly as well. (FYI, the Radeon HD 6970 did scale much better than the GTX 580 in AT tests.) When all is said and done, 4K resolution still requires Quad-SLI².[/edit]
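
For anyone who wants to check that arithmetic, here is a rough sketch of the extrapolation; the 3.2x figure and the 20%-per-two-years growth rate are the assumptions from above, not measured data:

Code:
# Back-of-the-envelope check of the figures above (assumed inputs, not benchmarks).
import math

power_needed = 3.2            # extrapolated GPU power required for 4x the pixels
growth_per_cycle = 1.20       # assumed per-card improvement every two-year cycle

cycles = math.log(power_needed) / math.log(growth_per_cycle)
print(round(cycles * 2, 1))   # ~12.8 -> roughly 13 years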

But before Nvidia shareholders can rub their hands in glee, there is this little, completely trivial thing of 4-to-1 pixel scaling, which can give you smooth framerates and a crisp picture without having to spend a lot of money.

To retain any geometrical shape without scaling artefacts, you simply double both sides and thus quadruple the total area. This simple idea is used, for instance, by many indie games to create a crisp pixelated look on a big screen: 4 pixels instead of 1. The same principle is presumably behind the 4K resolution, to retain the quality and value of full HD, 1080p video.

4K_res.png

But there is more: 4K also happens to be 720p times 9, i.e. 1280x720 tripled in each dimension. This may not be rocket science, but it is exciting. You can render games at 720p or 1080p and scale them up smoothly for 60, or better yet 120, FPS without any scaling artefacts, and still enjoy a high-PPI display for your photos and text - the future looks bright indeed!
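
A quick sanity check of those multiples, assuming "4K" here means the 3840x2160 UHD resolution:

Code:
# 3840x2160 ("4K" UHD) as exact whole-number multiples of common resolutions.
sources = {"1080p": (1920, 1080), "720p": (1280, 720), "540p": (960, 540)}
w4k, h4k = 3840, 2160
for name, (w, h) in sources.items():
    fx, fy = w4k // w, h4k // h
    print(f"{name}: {fx}x{fy} -> {fx * fy} screen pixels per rendered pixel")
# 1080p: 2x2 -> 4, 720p: 3x3 -> 9, 540p: 4x4 -> 16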

But what about the present:
1. Do any of you render games in 720p on a 2560x1440 TFT today?
2. Does your graphics card even allow you to run 960x540 on a full HD screen?
3. Do you still say "give me native or give me death!"?

²Quad-SLI is a loosely used label for a ridiculously overpowered set-up.
³ [EDIT] As AT poster blastingcap pointed out, the relationship between resolution and framerate is not strictly linear; more on page 2. The curves do approach linearity, however, and they get straighter as resolutions increase.
 
Last edited:

Spikesoldier

Diamond Member
Oct 15, 2001
6,766
0
0
But what about the present:
1. Do any of you render games in 720p on a 2560x1440 TFT today?
2. Does your graphics card even allow you to run 960x540 on a full HD screen?
3. Do you still say "give me native or give me death!"?

1. no
2. probably
3. yes
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
People with a liquid crystal display have to put up with a lot of BS: "native" resolution, aliased text, latency, 60 Hz, 16:10, Twisted Nematic... I'll spare you the full list.
Meanwhile, geometrically increasing display resolutions on the horizon seem to add to a gamer's anxiety. Will I have to run Quad-SLI² just to get a measly 60 FPS on a "4K" display without any scaling artefacts?

The answer to this is a resounding YES! To drive 8 megapixels' worth of screen you need 4 times the power of a normal 1080p card, not even counting the SLI penalties.

But before Nvidia shareholders can rub their hands in glee, there is this little, completely trivial thing of 4-to-1 pixel scaling, which can give you smooth frame rates and a crisp picture without having to spend a lot of money.

To retain any geometrical shape without scaling artefacts, you simply double both sides and thus quadruple the total area. This simple idea is used, for instance, by many indie games to create a crisp pixelated look on a big screen: 4 pixels instead of 1. The same principle is presumably behind the 4K resolution, to retain the quality and value of full HD, 1080p.

4K_res.png

But there is more: 4K also happens to be 720p times 9, i.e. 1280x720 tripled in each dimension. This may not be rocket science, but it is exciting. You can render games at 720p or 1080p and scale them up smoothly for 60, or better yet 120, FPS without any scaling artefacts, and still enjoy a high-PPI display for your photos and text - the future looks bright indeed!

But what about the present:
1. Do any of you render games in 720p on a 2560x1440 TFT today?
2. Does your graphics card even allow you to run 960x540 on a full HD screen?
3. Do you still say "give me native or give me death!"?

²Quad-SLI is a loosely used label for a ridiculously overpowered set-up.

I started playing at 5760x1080 at native resolution near the beginning of this year on a single 7970, albeit with settings not maxed out. That's 75% as many pixels as a 4K screen. In the future, I think we'll be fine, especially since 4K probably won't become affordable and commonplace for many years, by which point we'll be gaming on far faster GPUs.

Eyefinity on three 4K screens may be a tough nut to crack, though.

Edit to add: framerate doesn't drop linearly with resolution, believe it or not. Just because you're on triple-screen does not mean you get 33% of the FPS; it's more like 40-60% depending on the game. Furthermore, TPU reported that NV was working on some sort of Surround mode that would devote more GPU power to the middle screen and less to the sides, which makes sense because the side screens are for peripheral vision anyway. So if efforts like that are successful, 5 years from now 4K may finally be affordable and we'll also have enough GPU power to play on three 4K screens at good framerates, with "just enough" image quality on the side screens.
 
Last edited:

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
How is this different from, say, running the screen at a lower-than-native resolution?

On one hand, you could let the screen do the scaling. On the other hand, you can do scaling in the graphics card. On a third hand, I recall some people doing this (driving a lower resolution on a 1440p display) even when the screen lacked its own scaler, without having to turn on scaling in the graphics card. So isn't this already being done now, without any need to do anything special?
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I already have a PS3 I could buy games for if I wanted my games rendered at a low resolution and then upscaled.
 

MarkLuvsCS

Senior member
Jun 13, 2004
740
0
76
I already have a PS3 I could buy games for if I wanted my games rendered at a low resolution and then upscaled.

^ too true.

Sorry, there is no magical replacement for more detail. These tricks allow for the simplest upscaling technique with no performance impact, but they do nothing to improve quality.
 

know of fence

Senior member
May 28, 2009
555
2
71
How is this different from, say, running the screen at a lower-than-native resolution?

Although some scalers are better than others, they all have to blur the picture and distort text. You could read the Wiki article, or you can hold down [Ctrl] and scroll your mouse wheel to see how your browser handles scaling ([Ctrl]+[0] to reset). Suffice it to say that pictures either become pixelated or blurry fast. http://en.wikipedia.org/wiki/Image_scaling
The technical solution to this is vector graphics rather than pixels, which is why some 2D Adobe Flash ads sometimes scale flawlessly. 3D graphics seem to be a combination of nicely scaling vector 3D geometry and textures. However, 3D requires a lot of computational resources, which arguably would be completely wasted rendering pixels that you can't really see on a high-PPI display.

The big difference is that if you scale in whole multiples, meaning 1-to-4 (1-to-9, 1-to-16), no complicated algorithm is required: every pixel simply becomes 4 (9, 16) times as big as the original, with no blur at all and no additional aliasing.
A good scaling algorithm should drop all blurring and AA once it hits those multiples of the original picture resolution. I don't know whether that is the case for browser scaling, graphics-hardware scalers or built-in display scalers. The easiest way to test this is to set your resolution to half your native screen height and width and see for yourself:

- To test built-in monitor scaling, the Windows resolution or the in-game resolution needs to be changed. This is also the most promising option for the future.
- Testing video-card scaling is the hardest, because it requires setting up an internal rendering resolution, though it is said to give the best results. The set-up process differs between games and drivers.

In the past, when people tried to play a game at lower than native resolution, they suffered from an additional blurring filter layer, not to mention that it's hard to even keep the original aspect ratio. Have you ever tried to set 800x600 SVGA on a modern screen? It's aliased, distorted and blurry, which sends people screaming into the "native" camp.

With 4-to-1 scaling you keep the aspect ratio and there should be no blur. The resolution is low, but that massively benefits your framerate, and it also makes it easy to record or stream video footage. Don't expect desktop card manufacturers to jump on that bandwagon, though. But a Trinity APU on a small 2K screen may already offer viable gaming this way (that is, 720p scaled up 4-to-1). Somebody has to open their mind and try; kudos to the first review site that mentions this possibility.
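
A minimal sketch of what "no complicated algorithm" means in practice: every source pixel is simply replicated into a 2x2 (or 3x3, 4x4) block, so nothing is blended and nothing can blur. The pixel values here are made up purely for illustration:

Code:
def upscale_integer(pixels, factor=2):
    # 'pixels' is a list of rows; each source pixel becomes a factor x factor block.
    out = []
    for row in pixels:
        wide = [p for p in row for _ in range(factor)]   # repeat across
        out.extend(list(wide) for _ in range(factor))    # repeat down
    return out

src = [[1, 2],
       [3, 4]]
print(upscale_integer(src))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]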
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It will be a reasonable way for a low-end graphics card to achieve acceptable quality if 4K means we get pixel densities 4x those of today's screens. But if the pixel density drops, it won't look good.
 
Oct 16, 1999
10,490
4
0
This is good info. You're always going to get the best picture quality scaling by even multiples, because your scaler isn't having to approximate fractions of a pixel. I bought a 540p projector years ago based on this (I know, opposite direction, but the principle holds). One day I might even buy a Blu-ray player.
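
Here is a small illustration of the "fractions of a pixel" point: with an even multiple, nearest-neighbour scaling samples every source pixel the same number of times, while a fractional factor samples them unevenly, which is where the distortion comes from:

Code:
# Which source pixel each destination pixel is taken from (nearest neighbour).
for name, scale in [("2.0x even multiple", 2.0), ("1.5x fractional", 1.5)]:
    print(name, [int(d / scale) for d in range(8)])
# 2.0x even multiple [0, 0, 1, 1, 2, 2, 3, 3]   <- every pixel used exactly twice
# 1.5x fractional    [0, 0, 1, 2, 2, 3, 4, 4]   <- pixels 1 and 3 used only once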
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
... every pixel simply becomes 4 (9, 16) times as big as the original, with no blur at all and no additional aliasing.
A good scaling algorithm should drop all blurring and AA once it hits those multiples of the original picture resolution.

Is that even technically possible? If you have a diagonal line at the low res, it looks like this:
_
|_
|_

After you blow it up, it would get "rougher"

__
|
|__
|
|__

So you blow up the actual pixels, but the problem is that the "valleys" grow with them. Your slanted line will grow to show the same defects at a bigger scale.

I guess my point is that you seem to overlook that the scaling will also scale up the artifacts, like the jagged saw edges of slanted lines, which, when magnified, begin to look like giant Lego blocks?
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
But what about the present:
1. Do any of you render games in 720p on a 2560x1440 TFT today?
Maybe, but I doubt it. If you can spend the money for the display, you can probably spare enough for a suitable video card.
2. Does your graphics card even allow you to run 960x540 on a full HD screen?
No idea. I've never hooked up that way, and probably never will.
3. Do you still say "give me native or give me death!"?
No, but flexible scaling options don't exist right now at the driver level, AFAIK, so native is a must for a decent image.

Right now, I'd love to be able to set my 1680x1050 monitor to a crisp-scaled 840x525, for good-looking 640x480. For a 1920x1200 display, I'd like to see 960x600 done the same way, for decent-looking 640x480 and 800x600.

It's not some magical feature just for 4k. Some of us would like to have it now. There are several games I have avoided, FI, just because of the fixed resolution and blurry or irregular monitor scaling.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
When I first got my Dell 30" 2560x1600 panel about 3-4 years ago, it didn't come with an internal scaler, which meant it only accepted its native resolution of 2560x1600. But it turns out that at such a large resolution 1280x800 is exactly 1/4 of the screen, and by simply using a block of 2x2 pixels for every 1 pixel of 1280x800 you can scale it perfectly.

You're right that doubling the resolution (in each axis) quadruples the area; the same goes for pixel count, so very high resolutions grow insanely fast. But it's worth keeping in mind that GPU power actually grows at a similar rate; it tends to approximately double every 18-24 months. It's pretty trivial to run 1080p today, so 4K will be equally trivial in about 4 years.

Realistically, extremely high-end GPU solutions from this generation can already cope with 4K and higher; some people run Tri/Quad SLI to deal with multi-monitor 3x30" configurations at 2560x1600x3, which is approximately 12 Mpix. I'm betting a single GTX 690 could power 4K displays today in a great many games.

If you want to know how well your GPU would do, enable 2xSSAA at 1080p; Super-Sampling Anti-Aliasing effectively renders the scene at 4K and then downsamples it to achieve the AA effect.

I will almost certainly be looking at getting a 4K projector when they become affordable; my 1080p one does really well at about a 120" screen size from a 9-foot viewing distance, but 4K would look beautiful. I'm not completely convinced the investment would be worth it for a desktop monitor.

Currently Sony do a 4K projector which is £17,000 inc. VAT; I'm saving my pennies :)
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Right now, I'd love to be able to set my 1680x1050 monitor to a crisp-scaled 840x525, for good-looking 640x480. For a 1920x1200 display, I'd like to see 960x600 done the same way, for decent-looking 640x480 and 800x600.

It's not some magical feature just for 4k. Some of us would like to have it now. There are several games I have avoided, FI, just because of the fixed resolution and blurry or irregular monitor scaling.
You mean like this?
840x525.png
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
1. Do any of you render games in 720p on a 2560x1440 TFT today?
2. Does your graphics card even allow you to run 960x540 on a full HD screen?
3. Do you still say "give me native or give me death!"?

1. No. 720p sucks; so does 1080p. I have had "1200p" for the better part of a decade.
2. No idea, I use a monitor not a TV.
3. Yes, definitely. See #2
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Pixel doubling (which is a pretty bad misnomer since it's really pixel quadrupling at minimum) is neither a new concept, nor something that most of us aren't fully aware of.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Right now, I'd love to be able to set my 1680x1050 monitor to a crisp-scaled 840x525, for good-looking 640x480. For a 1920x1200 display, I'd like to see 960x600 done the same way, for decent-looking 640x480 and 800x600.

It's not some magical feature just for 4k. Some of us would like to have it now. There are several games I have avoided, FI, just because of the fixed resolution and blurry or irregular monitor scaling.

I'd be mighty tempted to test custom resolutions in games; most games have either command-line access or cfg/ini files where you can manually set the xResolution and yResolution. Try setting them to 1/4 of your screen resolution and make sure image scaling is allowed in the video driver control panel.

I'm betting that gives you perfect 4:1 scaling; however, this might only work well if 1/4 of your screen resolution is higher than the game's minimum accepted screen resolution.

Some monitors come with hardware scalers inside them and can probably be set to 1/4 of their resolution in Windows, so everything runs that low and the panel takes care of the scaling itself; this may require a custom resolution entry in your driver's supported-resolutions config.
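
As a trivial illustration of the "1/4 of your screen resolution" idea (the xResolution/yResolution key names are just the generic placeholders from above; real games name them differently):

Code:
# Quarter resolution = half the native width and height (illustrative only).
def quarter_res(native_w, native_h):
    return native_w // 2, native_h // 2

w, h = quarter_res(2560, 1600)
print(w, h)   # 1280 800
# A hypothetical cfg/ini entry would then look something like:
#   xResolution=1280
#   yResolution=800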
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You mean like this?
840x525.png
Partly. The next step is to center full-screen resolutions smaller than that. The final step is to control the scaling mechanism (whole-number nearest neighbor >1, most of the time).

I.e., the ideal situation is not setting 840x525 in the control panel, but being able to set scaling and centering for lower-resolution requests, such that the monitor gets a native-resolution signal/framebuffer and the driver hides the reality of it from the application.

In other words, a 640x480 framebuffer gets rendered, overlaid onto 840x525 at (100,22), and the 840x525 buffer is then upscaled by a controllable method (again, nearest neighbor for most games) to 1680x1050, then actually displayed. The middle overlay step could probably be skipped in an actual implementation, though it may be too much to ask for a fully customized scaling setup, versus just being able to declare that the 840x525 should be nearest-neighbor upscaled, with full-screen resolutions centered within it.

In other words, what the OP is talking about, but for any resolution where some whole-number >1 scaling multiple can fit inside the native res.
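
Just to spell out the arithmetic of that pipeline (the numbers are the ones from the example above; this is only the bookkeeping, not a driver implementation):

Code:
# 640x480 game framebuffer -> centred on 840x525 -> nearest-neighbour 2x -> 1680x1050 panel.
native = (1680, 1050)
scale = 2                                                   # whole-number factor
intermediate = (native[0] // scale, native[1] // scale)     # (840, 525)
game = (640, 480)

# Centre the game framebuffer inside the intermediate buffer.
offset = ((intermediate[0] - game[0]) // 2, (intermediate[1] - game[1]) // 2)
print(intermediate, offset)   # (840, 525) (100, 22)

# Every intermediate pixel becomes a 2x2 block on the panel, so the 480 game
# lines end up 960 panel lines tall instead of being blurred up to 1050.
print(game[1] * scale)        # 960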

The worst part? I used to have a laptop that did this in hardware, automatically, so I find it hard to believe that it is technically infeasible.

I'd be mighty tempted to test custom resolution in games, most games have either command line access or cfg/ini files where you can manually set the xResolution and yResolution, try setting them to 1/4 of your screen resolution and make sure image scaling is allowed in the video driver control panel.
Those games are not a problem. It's the early non-3D Windows games that are. Games like FO1-2 and Icewind Dale get too tiny if the resolution is made native, yet fuzzy if not; and some games, especially older adventure games, don't have any good way to change res.
 
Last edited:

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Partly. The next step is to center full-screen resolutions smaller than that.

Depending on what your monitor reports as an available input res, most video drivers will pad any resolutions in between if you shut off scaling such that they will send whatever is set, but letter and pillar box it to fit.

You may have to make a custom monitor inf file to make this work right.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Depending on what your monitor reports as an available input res, most video drivers will pad any resolutions in between if you shut off scaling such that they will send whatever is set, but letter and pillar box it to fit.

You may have to make a custom monitor inf file to make this work right.
But, how do you get that to also scale by some amount, like in the OP, without blurring? 640x480 w/o upscaling is much smaller than intended, on today's monitors.

FI, a 17" CRT will have a height of around 9.5", though it varied by make and model (usually 1-2" of the diagonal weren't viewable).
1080 lines on a 24" will also be about 9.5" *.
480 unscaled lines on a 24" 1080P monitor, though, will be about 4.25".
Upscaled to 960, those 480 lines would be 8.5", which would be a much nicer compromise. Outside of DOSbox, however, I've found it difficult to impossible to make that kind of thing happen.
Upscaled to full and fuzzy size, even with the right AR, just looks way too bad.

* I wouldn't move up for a mere 30 pixels, but I will admit that 1080P is more popular and more important of a resolution than the superior 1920x1200, even if 1920x1200 would give a nice 800x600, and my own 1680x1050 would give a larger 480x2 actual size :).
 
Last edited:

reallyscrued

Platinum Member
Jul 28, 2004
2,618
5
81
Lol at display manufacturers touting this as a 'feature' rather than a cop-out. What about the change from 16:10 to 16:9 screens? Where were they then, when screen real estate kept getting shorter and shorter? Did they care about our jaggy text and scaling artifacts then?

It's just cheaper for them to stitch together four of their smaller screens into one big quadruple-size screen.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
This is basically a problem that hardly anyone has. Once it becomes a problem, GPUs will be faster. I expect they will employ tricks that vary the amount of detail rendered in different parts of the screen. It wouldn't surprise me if eyeball tracking were employed someday in the not-too-distant future to make dynamic, targeted level-of-detail adaptations.

Also, it's possible to do 2:1 scaling if the video card drivers added support for the weird non-square resolutions.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
3. Do you still say "give me native or give me death!"?

Upscaling to 4K from 1080p, or needing more GPU power to drive native 4K resolution, does not change the forecast that 4K screens/TVs are not expected to come down to mainstream prices for at least 5 years from now. Right now a 31.5-inch 4K Viewsonic monitor costs $20-30k. Even if we assume the price halves every 12 months, it'll be $10-15k by the end of 2013, $5-7.5k by the end of 2014, $2.5-3.75k by the end of 2015, $1.25-1.875k by the end of 2016, and $625-938 by the end of 2017 for a 31-inch 4K screen.
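
Writing that halving schedule out (the $20-30k starting price and the halving-every-12-months rate are the assumptions stated above, not market data):

Code:
# Price projection under the assumed halving-every-12-months rate.
low, high = 20000.0, 30000.0
for year in range(2013, 2018):
    low, high = low / 2, high / 2
    print(year, f"${low:,.0f} - ${high:,.0f}")
# 2013 $10,000 - $15,000 ... 2017 $625 - $938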

4K resolution on its own won't make graphics magically better by much. You still need next-generation physics, AI, textures, shaders, lighting, shadow effects and so on. The revolution in these will be driven more by the next generation of consoles and the pace of GPU innovation. As games get better looking even at 1080p and 1600p, modern GPUs will be brought to their knees without any need for 4K resolutions.

1343730217Q8scDBKVfx_2_5.gif


Because the current console generation is the longest one in history, many people think that we have too much GPU processing power and that 4K resolution is therefore the next step for our GPUs. As the next generation of games emerges with new consoles, video games will likely become 10x more graphically demanding even at 1080p (e.g., one character model in Unreal Tournament III had more polygons than an entire level of Unreal Tournament 2004). These kinds of leaps happen with every new generation of consoles. Before 4K becomes a factor, a new wave of next-generation game engines and games will arrive, but the majority of game developers are currently tied down by the 2005-2006 GPU hardware tech present in more than 200 million consoles.
 
Last edited:

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
But, how do you get that to also scale by some amount, like in the OP, without blurring? 640x480 w/o upscaling is much smaller than intended, on today's monitors.

You need to have, for example, for a 2560x1600 monitor, only the following resolutions available: 2560x1600, 1280x800, 640x400. At that point, don't allow any scaling. With properly working drivers, the output should always be one of those three resolutions with intermediate resolutions pillar and letter boxed to the next largest size.

This relies on your monitor properly performing the pixel doubling, however.
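
Generating that short list of "clean" input resolutions for the 2560x1600 example is straightforward; only whole-number divisors of both axes qualify (a rough sketch of the arithmetic only; the monitor/driver still has to be convinced to expose just these):

Code:
# Resolutions that divide a 2560x1600 panel evenly in both axes.
native_w, native_h = 2560, 1600
clean = [(native_w // d, native_h // d) for d in range(1, 9)
         if native_w % d == 0 and native_h % d == 0]
print(clean)
# [(2560, 1600), (1280, 800), (640, 400), (512, 320), (320, 200)]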

Edit: it's more a problem of monitors listing every resolution under the sun smaller than native as a valid input resolution (with no trivial way to override it), which is what makes the problem complex.
 
Last edited:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You need to have, for example, for a 2560x1600 monitor, only the following resolutions available: 2560x1600, 1280x800, 640x400. At that point, don't allow any scaling. With properly working drivers, the output should always be one of those three resolutions with intermediate resolutions pillar and letter boxed to the next largest size.

This relies on your monitor properly performing the pixel doubling, however.
I trust a monitor to do that less than I trust any other computer component to do what I want. Outside of business notebooks, it has been 5+ years since I have seen a display that doesn't do blurry scaling. I'm sure some are out there, but if it were done by the GPU, to a native-res framebuffer, then the monitor could be removed entirely as a variable, which would be a good thing.

Not merely that, but since the whole point would be lower-res videos and video games, even a substantial GPU performance hit would still result in both better FPS and better quality, compared to fully rendering 4x or more the number of pixels. As I'm sure you can guess, I'm more interested in the quality part.
 
Last edited: