"Usually when a developer sets the buffer size to 1280x720, he expects the driver to obey."

In reality, I'm betting many engines simply aren't built/tested for that; hell, even with a regular 2560x1600 panel (using no AA) you find games that handle it badly.
"Why not just ask the driver what the resolution is? Most PCs are not going to be 1280x720 anyway, and future displays may have even greater resolutions, at different aspect ratios. The developer shouldn't have to predict it; they should test on various displays with various settings, and if it doesn't work, fix it (or figure out that it is actually a driver problem, and whine to nV/AMD/Intel 🙂)."

The developer maps the SSAO buffer to the screen using pixel coordinates, in this case from 0 to 1280 in x and from 0 to 720 in y.
Suddenly the game overlays the SSAO onto the top-left quarter of the screen, and the player disses the developer for something he couldn't have predicted.
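A quick sketch of that failure mode, using the numbers from this thread (the hard-coded constants are the bug; none of this is from any real engine):

```python
# Hypothetical SSAO sizing bug: the shader normalizes pixel
# coordinates by the *requested* buffer size, but the driver
# silently allocated a larger one.

ASSUMED_W, ASSUMED_H = 1280, 720    # what the developer asked for
ACTUAL_W, ACTUAL_H = 2560, 1440     # what the driver actually set

def assumed_uv(px, py):
    """UV as computed against the assumed 1280x720 buffer."""
    return px / ASSUMED_W, py / ASSUMED_H

# The SSAO pass writes pixels for x in [0, 1280) and y in [0, 720),
# which is only the top-left quarter of the real 2560x1440 buffer:
coverage = (ASSUMED_W * ASSUMED_H) / (ACTUAL_W * ACTUAL_H)
print(coverage)  # 0.25
```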
So after you tell the driver to set a certain resolution for a buffer, you ask the driver what resolution and mode it actually set? 🙂
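Sarcasm aside, that really is the robust pattern: create, then query, then trust only the query. A minimal sketch, where `FakeDriver` is a hypothetical stand-in for a real API query such as OpenGL's `glGetTexLevelParameteriv` or DXGI's `GetDesc`:

```python
# FakeDriver is made up for illustration; it models a driver that
# overrides the requested buffer size (e.g. forced 2x supersampling).

class FakeDriver:
    def create_buffer(self, w, h):
        self._w, self._h = w * 2, h * 2   # driver ignores the request
    def query_buffer_size(self):
        return self._w, self._h

drv = FakeDriver()
drv.create_buffer(1280, 720)        # what we asked for
w, h = drv.query_buffer_size()      # what we actually got

# Everything downstream should be derived from the queried values,
# never from the requested constants:
print(w, h)  # 2560 1440
```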
"One trick when AA isn't quite working right has been to set a custom high resolution and have the GPU scale it down for the display."

The reason this works is that the program knows what resolution the buffer is throughout the pipeline.
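The downscale itself is just an average over each block of high-res pixels. A toy sketch using a 2x2 box filter (real GPU scalers may use better filters); note how the hard edge in the high-res image becomes a gray in-between pixel, which is exactly the anti-aliasing effect:

```python
def downscale_2x2(img):
    """Average each 2x2 block of a row-major grayscale image."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x+1] + img[y+1][x] + img[y+1][x+1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A 4x4 "high res" render with a hard jagged edge...
hi = [[0, 0, 1, 1],
      [0, 0, 1, 1],
      [0, 1, 1, 1],
      [0, 1, 1, 1]]

# ...averages down to 2x2 with a 0.5 (gray) pixel on the stair-step:
lo = downscale_2x2(hi)
print(lo)  # [[0.0, 1.0], [0.5, 1.0]]
```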
Usually when a developer sets the buffer size to 1280x720, he expects the driver to obey. If the driver forces the buffer to 2560x1440, it can cause some very strange things.
"I suspect that it's game engines not handling rendering to extremely large screen resolutions. From a technical standpoint you'd assume that SSAA would be the most compatible, since from the engine's perspective all it has to do is render out to a much bigger screen resolution (which is then scaled down)."

But that's not really how modern SSAA works. Modern SSAA is an extension of MSAA, which is to say it's based on taking multiple samples of the scene, usually in a rotated or other non-orthogonal pattern. The effect of SSAA is that you render x times as many pixels, equivalent to a higher resolution, but internally the renderer is still working at the lower resolution.
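To illustrate the sampling view of it: each pixel is evaluated at several sub-pixel offsets and the results are averaged. The 4-sample rotated-grid pattern below is made up for illustration (real hardware sample positions differ), but it shows how an edge pixel ends up with fractional coverage instead of a hard 0-or-1 decision:

```python
# Illustrative 4-sample rotated-grid offsets, relative to the
# pixel center (not the positions any actual GPU uses).
ROTATED_GRID_4X = [(0.125, 0.375), (0.375, -0.125),
                   (-0.125, -0.375), (-0.375, 0.125)]

def coverage(px, py, edge_x, samples):
    """Fraction of sub-pixel samples falling left of a vertical edge."""
    hits = sum(1 for dx, dy in samples if px + dx < edge_x)
    return hits / len(samples)

# A pixel centered exactly on the edge gets 50% coverage,
# i.e. a smoothed edge rather than a jagged one:
print(coverage(10.0, 5.0, 10.0, ROTATED_GRID_4X))  # 0.5
```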
The user has set the resolution, and all values should be derived from that pair of x and y values. The developer shouldn't be hard-coding 1280x720; they should request the display size, calculate the buffer size from it, and set the buffer based on that calculation, assuming it's not 1:1.
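In code, that amounts to something like this sketch, where `display_resolution()` is a hypothetical stand-in for whatever query the platform actually provides (swap-chain size, framebuffer size, etc.):

```python
def display_resolution():
    """Stand-in for querying whatever the user actually set."""
    return 2560, 1440

def target_size(scale):
    """Size a render target as a fraction of the display size."""
    w, h = display_resolution()
    return max(1, int(w * scale)), max(1, int(h * scale))

full = target_size(1.0)    # main scene buffer, 1:1 with the display
ssao = target_size(0.5)    # half-res SSAO, works at any aspect ratio
print(full, ssao)          # (2560, 1440) (1280, 720)
```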
You're thinking of them as console games, which, sadly, many are ported from. Compatibility with deferred-rendering tricks and post-processing effects should be tested and worked out, but often isn't (and if those won't work, developers should at least implement an optimized FXAA or MLAA).