What's preventing it from using some kind of anti-aliasing in the games it runs? Either I've missed some crucial fact about this hardware, or I'm not understanding you.
If you can't hit a playable FPS at 900p, you won't be able to play with any sort of AA at 720p or 800p either. With a Vega APU at least, 1080p has a MASSIVE FPS impact, to the point that 900p (1600x900) is better in both quality and FPS. Compared to 1080p, 900p also gives you headroom for higher detail settings. For example, with an overclocked 2200G I got about 30-35 fps in Witcher 3 at 1080p all low, but at 900p I was able to play at 40 fps average using a mix of low/med/high/ultra settings, to the point that I couldn't tell the difference from when I had the RX 480 with everything on high-ultra without a side-by-side comparison.
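The raw pixel counts back this up. Here is a quick back-of-the-envelope sketch (assuming the standard 16:9/16:10 dimensions for each resolution name):

```python
# Rough pixel-count comparison between the resolutions discussed.
# More pixels shaded per frame ~ more GPU load. This is only a first-order
# approximation; real FPS also depends on CPU limits, memory bandwidth, etc.
resolutions = {
    "1080p": (1920, 1080),
    "900p":  (1600, 900),
    "800p":  (1280, 800),
    "720p":  (1280, 720),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / pixels['1080p']:.0%} of 1080p)")

# On paper, 900p -> 720p is actually a bigger relative cut (~36% fewer pixels)
# than 1080p -> 900p (~31% fewer), but FPS rarely scales linearly with pixel
# count once the CPU or memory bandwidth becomes the bottleneck -- which fits
# the "more FPS but not as much as you may think" experience on an APU.
print(f"1080p vs 900p: {pixels['1080p'] / pixels['900p']:.2f}x the pixels")
print(f"900p  vs 720p: {pixels['900p'] / pixels['720p']:.2f}x the pixels")
```

So 900p sits in a sweet spot: roughly 30% cheaper than 1080p per frame, but still enough pixel density that the missing AA doesn't sting the way it does at 720p.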
This is why I always hate tech sites' APU reviews: they have no idea what they're doing because they've never actually used one. They test 1080p low and show you bad FPS, then drop straight to 720p just to tell you it looks awful.
Going from 900p to 720p gives you more FPS, but not as much as you might think, and you really start having severe quality problems from the lack of AA. And if you apply AA at 720p, you're defeating the point of going to 720p in the first place.
So this is why I like 900p more. 800p might be the better fit here because of the screen size; I'd need to see it. But I don't think that choice has anything to do with performance, unless the alternative was a 1080p screen.