BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I have some of those games and I haven't noticed any issues. What menus in UT3 does it corrupt?
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
I suspect that it's game engines not handling rendering at extremely large screen resolutions. From a technical standpoint you'd assume that SSAA would be the most compatible, since from the engine's perspective all it has to do is render out to a much bigger screen resolution (which is then scaled down).

In reality I'm betting many engines simply aren't built/tested for that; hell, even with a regular 2560x1600 panel (using no AA) you find games that handle it badly.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
In reality I'm betting many engines simply aren't built/tested for that; hell, even with a regular 2560x1600 panel (using no AA) you find games that handle it badly.
Usually when a developer sets a buffer size of 1280x720, they expect the driver to obey.
If the driver forces the buffer to be 2560x1440, it can cause some very strange things.

E.g. the developer maps the SSAO buffer to the screen using pixel coordinates, in this case from 0 to 1280 in x and from 0 to 720 in y.
Suddenly the game overlays the SSAO onto only the top-left quarter of the screen, and the player blames the developer for something they couldn't have predicted.

Also, what happens if the developer uses pixel shaders to calculate physics for particles?
It simply doesn't work, as the buffer sizes are wrong, etc.

Simply put, giving players that kind of control through the drivers creates a lot of extra work for developers and driver teams.
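
Roughly what that SSAO mismatch looks like in code, as a made-up sketch (the numbers and names are illustrative, not from any real engine):

Code:
// Sketch of the buffer-size mismatch described above: the "engine" assumes
// a 1280x720 SSAO buffer, but the driver has silently allocated 2560x1440.
#include <cstdio>

struct Extent { int w, h; };

void sampleCoverage(Extent assumed, Extent actual) {
    // The shader converts pixel coordinates to UVs using the *assumed* size,
    // so in the real buffer only [0, maxU] x [0, maxV] is ever written.
    float maxU = static_cast<float>(assumed.w) / actual.w;
    float maxV = static_cast<float>(assumed.h) / actual.h;
    std::printf("SSAO covers %.0f%% x %.0f%% of the buffer (top-left)\n",
                maxU * 100.0f, maxV * 100.0f);
}

int main() {
    Extent assumed{1280, 720};        // what the developer asked for
    Extent actual{2560, 1440};        // what the driver actually allocated
    sampleCoverage(assumed, actual);  // prints 50% x 50% -> top-left quarter
}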
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The developer maps the SSAO buffer to the screen using pixel coordinates, in this case from 0 to 1280 in x and from 0 to 720 in y.
Suddenly the game overlays the SSAO onto only the top-left quarter of the screen, and the player blames the developer for something they couldn't have predicted.
Why not just ask the driver what the resolution is? Most PCs are not going to be 1280x720 anyway, and future displays may have even greater resolutions, at different aspect ratios. The developer shouldn't have to predict it; they should test it on various displays with various settings, and if it doesn't work, fix it (or figure out that it is actually a driver problem, and whine to nV/AMD/Intel :)).

One workaround for AA not quite working right has been to set a custom high resolution and have the GPU scale it down for the display.
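
Something along these lines is all it would take; queryDisplaySize() here is just a stand-in for whatever the real windowing/API call is, not an actual function:

Code:
#include <cstdio>

struct Extent { int w, h; };

// Placeholder for whatever the real API provides (window / swap chain size).
Extent queryDisplaySize() {
    return {2560, 1440};
}

int main() {
    Extent screen = queryDisplaySize();
    // Derive auxiliary buffers from the queried size, e.g. a half-res SSAO buffer.
    Extent ssao{screen.w / 2, screen.h / 2};
    std::printf("screen %dx%d, SSAO buffer %dx%d\n",
                screen.w, screen.h, ssao.w, ssao.h);
}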
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
Why not just ask the driver what the resolution is? Most PCs are not going to be 1280x720 anyway, and future displays may have even greater resolutions, at different aspect ratios. The developer shouldn't have to predict it; they should test it on various displays with various settings, and if it doesn't work, fix it (or figure out that it is actually a driver problem, and whine to nV/AMD/Intel :)).
So after you tell the driver to set a certain resolution for a buffer, you then ask the driver what resolution and mode it actually set? :)
One workaround for AA not quite working right has been to set a custom high resolution and have the GPU scale it down for the display.
The reason this works is that the program knows what resolution the buffer is throughout the pipeline.
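
In sketch form (the pass names and sizes are only illustrative), every stage sees the same size because the program chose it itself:

Code:
#include <cstdio>

struct Extent { int w, h; };

// Each pass is told its render-target size explicitly; nothing gets
// resized behind the program's back.
void runPass(const char* name, Extent target) {
    std::printf("%-10s %dx%d\n", name, target.w, target.h);
}

int main() {
    Extent display{1920, 1080};
    Extent internal{display.w * 2, display.h * 2};  // user-set custom high res
    runPass("geometry", internal);
    runPass("ssao", internal);
    runPass("post", internal);
    runPass("scale-out", display);  // scaled down to the display at the very end
}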
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Usually when a developer sets a buffer size of 1280x720, they expect the driver to obey.
If the driver forces the buffer to be 2560x1440, it can cause some very strange things.

Exactly. But more than that, even if you assume buffer sizes get set correctly, there are often assumptions made by the developers about the range of resolutions the game has to render out to; some developers assume 1080p will be the largest screen resolution they'll see and make design decisions based on that.

There have been numerous games in the past that simply do not handle high resolutions, even if the engine correctly understands the screen resolution and can communicate that back and forth with the drivers.

It's bad practice for developers to make silly assumptions like that, but it happens. I suspect that when 4K becomes practical for gaming, a LOT of old games will struggle to render correctly at those resolutions because the developers simply didn't account for it.
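
A made-up example of the kind of hidden assumption I mean (not from any particular game):

Code:
#include <algorithm>
#include <cstdio>

int main() {
    int screenH = 2160;  // a 4K display the code never anticipated
    // UI scale silently clamped to "the largest resolution we'll ever see" (1080p).
    float uiScale = std::min(screenH / 1080.0f, 1.0f);
    std::printf("UI scale at %dp: %.2f (HUD ends up half the intended size)\n",
                screenH, uiScale);
}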
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I suspect that it's game engines not handling rendering at extremely large screen resolutions. From a technical standpoint you'd assume that SSAA would be the most compatible, since from the engine's perspective all it has to do is render out to a much bigger screen resolution (which is then scaled down).
But that's not really how modern SSAA works. Modern SSAA is an extension of MSAA, which is to say it's based on the concept of taking multiple samples of the scene, usually in a rotated or other non-orthogonal pattern. The effect of SSAA is that you render x times as many pixels, equivalent to a higher resolution, but internally the renderer is still working at a lower resolution.

What's happening in this case is that a buffer is not being supersampled correctly. When the samples are merged/reduced, it's working from presumably 1/4 as many samples as it expects, resulting in a buffer 1/4 the size. A game doesn't have to be capable of rendering at a high resolution for SSAA to work, and conversely just because it can render at a high resolution doesn't mean SSAA will work. Just take one look at BF3, for example.

Anyhow, to answer the question at hand, SSAA has so many compatibility issues because it's a complex process that very few developers are taking into account. Deferred rendering breaks traditional SSAA implementations just like it does MSAA, so right out of the gate it only works on forward renderers. Then you have to take into account how this will work with the various shader stages, and at what point you take the additional samples. This isn't so bad on DX9 games due to the relatively fixed nature of the pipeline (assuming we have a forward renderer), but DX10+ games can basically do whatever they want, which means MSAA and SSAA must be custom tailored to the game in question if the developer goes off the beaten path at all.
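
For what it's worth, here's a toy sketch of the sample-based resolve I'm describing; the box filter and numbers are purely illustrative, nothing API-specific:

Code:
#include <cstdio>
#include <vector>

// Resolve a multisampled buffer (samples stored contiguously per pixel)
// down to one value per pixel by averaging the samples.
std::vector<float> resolve(const std::vector<float>& samples,
                           int width, int height, int samplesPerPixel) {
    std::vector<float> out(width * height);
    for (int p = 0; p < width * height; ++p) {
        float sum = 0.0f;
        for (int s = 0; s < samplesPerPixel; ++s)
            sum += samples[p * samplesPerPixel + s];
        out[p] = sum / samplesPerPixel;
    }
    return out;
}

int main() {
    const int w = 4, h = 4, spp = 4;                // "4x SSAA" on a 4x4 screen
    std::vector<float> samples(w * h * spp, 1.0f);  // dummy per-sample shading
    std::vector<float> resolved = resolve(samples, w, h, spp);
    // The output stays w*h pixels; only the sample count per pixel changes.
    std::printf("resolved %zu pixels from %zu samples\n",
                resolved.size(), samples.size());
}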
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
So after you tell the driver to set a certain resolution for a buffer, you then ask the driver what resolution and mode it actually set? :)
The user has set the resolution. All values should be dependent on that set of x and y values. The developer shouldn't be setting 1280x720. The developer should be requesting the display size, calculating the size of the buffer based on that, and then setting its size based on that calculation, assuming it's not 1:1.

You're thinking of them as if they were console games, which, sadly, many are ported from. Compatibility for deferred tricks and post-processing effects should be tested and worked out, but often isn't (and if it isn't, or if they won't work, they should at least implement an optimized FXAA or MLAA).
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The user has set the resolution. All values should be dependent on that set of x and y values. The developer shouldn't be setting 1280x720. The developer should be requesting the display size, calculating the size of the buffer based on that, and then setting its size based on that calculation, assuming it's not 1:1.

You're thinking of them as if they were console games, which, sadly, many are ported from. Compatibility for deferred tricks and post-processing effects should be tested and worked out, but often isn't (and if it isn't, or if they won't work, they should at least implement an optimized FXAA or MLAA).

It is also quite possible they have a table of values to use based on the resolution, but don't have a table entry for every resolution. This would explain why 5760x1080 isn't always supported.
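
Hypothetically, something like this (the values are invented for illustration):

Code:
#include <cstdio>
#include <map>
#include <utility>

int main() {
    // Hand-tuned per-resolution settings; entries only exist for the
    // resolutions the developer expected to see.
    std::map<std::pair<int, int>, float> shadowBias = {
        {{1280, 720},  0.0015f},
        {{1920, 1080}, 0.0010f},
        {{2560, 1440}, 0.0008f},
    };

    std::pair<int, int> res{5760, 1080};  // triple-wide Eyefinity/Surround
    if (shadowBias.find(res) == shadowBias.end())
        std::printf("no entry for %dx%d -- behaviour is anyone's guess\n",
                    res.first, res.second);
}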