The reason I LOL'd at you is that you said "AA should not be needed, if the games were designed with 4K in mind."
The primary function of AA (anti-aliasing) is to eliminate jaggies (aliasing) on geometry, so your initial premise is completely false and indicated to me that you know nothing about the topic. Higher-resolution textures don't do anything for jaggies (aliasing), but rendering at a higher resolution does. Even if you meant to say "textures", you would still be wrong: some games ship optional 4K (true 4096x4096) and even 8K texture packs straight from the developer, and others have active modding communities dedicated to that very purpose. You should try it sometime; it looks incredible. The 4K era of gaming is upon us - you just clearly don't understand "how to even 4K".
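To put rough numbers on it (in the simplest ordered-grid case): 4x supersampling (SSAA) at 1920x1080 literally renders the scene at 3840x2160 and then averages every 2x2 block of samples down to one output pixel. "Render at a higher resolution" and "anti-alias" are two sides of the same coin - more geometry samples per displayed pixel - which is exactly why texture resolution has nothing to do with it.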
That's the problem with only using Wikipedia as a source and then cherry-picking what you want to see. To be clear, you are focusing on 4K DCI, which is not the same thing as 4K textures and image sensors (4096x4096) or 4K (UHD) televisions.
As shown in that very wiki article, the DCI spec allows many other resolutions and aspect ratios to qualify as 4K DCI, up to a MAXIMUM resolution capped so that all theaters are capable of achieving it. Those MAXIMUM values also let directors/DPs set target resolutions for their chosen aspect ratio in post-production. I'm sure you have noticed that many films are shot 'scope at 2.35:1, some at 2.20:1, and some even at 16:9, like TV shows. As the article itself puts it: "pixels are cropped from the top or sides depending on the aspect ratio of the content being projected."
You will rarely - if EVER - see a movie image in the theater displayed at the full 4096x2160; it's almost always cropped in one way or another. This is why consumer TVs and monitors use the 16:9 format - it began as a middle ground between 4:3 and 2.35:1, so home viewers could still get decent height from 4:3 images and decent width from 2.35:1 images and everything in between. Now that television has moved to 16:9 almost exclusively and more and more people are watching movies at home, we're seeing more 21:9 (roughly 2.33:1) displays being released.
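To illustrate with the actual DCI delivery numbers: the full 4K DCI container is 4096x2160 (about 1.90:1), but a "flat" 1.85:1 feature is delivered at 3996x2160 and a 2.39:1 'scope feature at 4096x1716 - in both cases part of the container simply goes unused. A 16:9 title inside that same container tops out at 3840x2160, which is exactly the consumer UHD format.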
What that Wikipedia article doesn't tell you is that the Consumer Electronics Association (CEA) itself defines 4K and UHD as the same thing: any television capable of displaying a MINIMUM of 3840x2160 qualifies.
"The group also defined the core characteristics of Ultra High-Definition TVs, monitors and projectors for the home. Minimum performance attributes include display resolution of at least eight million active pixels, with at least 3,840 horizontally and at least 2,160 vertically. Displays will have an aspect ratio with width to height of at least 16 X 9. To use the Ultra HD label, display products will require at least one digital input capable of carrying and presenting native 4K format video from this input at full 3,840 X 2,160 resolution without relying solely on up-converting."
https://www.ce.org/News/News-Releas...lectronics-Industry-Announces-Ultra-High.aspx
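The arithmetic behind that "eight million active pixels" figure is simple: 3840 x 2160 = 8,294,400 pixels, i.e. roughly 8.3 million, which is how every UHD panel clears the CEA's minimum.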
As I posted previously, you are free to buy commercial-grade 4K DCI equipment and enjoy your own version of a proper 4K DCI experience. I know that I can't afford it, but maybe you can.