Also remember that you will not see much improvement within a game until the game is using textures designed for 4K (without those textures, the only real improvement will be on object edges, where curved and non-linear lines get finer steps and produce a smoother-looking silhouette, but the texture maps themselves will still be 1920x1080 or less and simply upscaled to 4K). And even then, to game at 4K the textures should really be authored at a much higher resolution, such as 16K, so that when they are cut/zoomed/stitched onto the various models the visible portions are still at a resolution that can be downsampled to 4K.
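To put some rough numbers on that, here's a purely illustrative back-of-envelope sketch (not how any particular engine works; the crop and screen-coverage figures below are assumptions I picked just for the example):

```python
import math

def required_texture_size(screen_width_px: int,
                          screen_coverage: float,
                          visible_crop: float) -> int:
    """
    Minimum square texture edge (in texels) for roughly 1 texel per screen pixel.

    screen_width_px : horizontal resolution of the display (e.g. 3840 for 4K UHD)
    screen_coverage : fraction of the screen width the textured surface fills (0..1)
    visible_crop    : fraction of the texture's width actually shown on screen
                      after it is cut/zoomed/stitched onto the model (0..1)
    """
    pixels_covered = screen_width_px * screen_coverage
    return math.ceil(pixels_covered / visible_crop)

if __name__ == "__main__":
    # A wall filling half of a 4K frame while showing only a quarter of its
    # texture's width up close: 3840 * 0.5 / 0.25 = 7680 texels, i.e. you'd
    # want roughly an "8K" source texture for that one surface.
    print(required_texture_size(3840, 0.5, 0.25))   # -> 7680

    # The same wall with a 2048px texture only shows 2048 * 0.25 = 512 texels
    # across 1920 screen pixels, so each texel gets stretched over ~3.75 pixels,
    # which is why only the geometry edges end up looking sharper at 4K.
    print((3840 * 0.5) / (2048 * 0.25))             # -> 3.75
```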
You won't really see this outside of player mods or a handful of games because it is too performance-intensive (really, memory-size and bandwidth-intensive), especially for consoles, which can still barely output 1080p...
Indeed.
There are modern games coming out today with 512x512 textures all over the place (on objects, environments, characters, etc.), and some of them aren't even ports from aging or limited consoles like Nintendo's (purely talking about hardware limitations here, not game quality). The gist is that even some PC exclusives, or exclusives to the more powerful consoles, still aren't "fully developed" for 1080p to this very day, even though 1080p has been an industry standard for quite a long time now. I don't think that even 5 years from now 4K will be 'a thing' in most games (at least texture-wise). It takes a LOT of time and generations of hardware to make leaps like that (especially for developers, and even more so for smaller indie devs with less money and resources) and to establish new standards that end up applied everywhere, on every "capable" platform, by everyone.
I mean, how long did it take for 1024x768 to be 'flushed out' of the way? Even when people moved to 1920x1080 monitors, many stayed with their 4:3 CRTs. I myself bought my 24" 1080p monitor only in 2015. I literally stayed on my CRT from 2003-04 (not sure which exact year) until then and barely changed my resolutions. At first I stayed at 1024x768 (at the time I even played some games at 800x600, such as Diablo 2 and StarCraft 1), then increased it a bit every now and then, like going to 1280x960 by the late 2000s as I saw more and more games supporting higher resolutions. By around 2012 or so I 'bumped' it up to nearly 1080p (900p): since the CRT was capable of 1920x1440, I created custom resolutions in nearly all my games to play them at 1920x900. Having to create custom resolutions became my 'cue' for "hey, maybe it's time now, no?", and I finally decided to go for it.

After my purchase, I started gaming at proper native 1080p in the games I used to play at 900p and quickly realized that the only real difference was more physical screen space horizontally and... that was about it. In SOME select games I could see better details here and there, sure; it's not like the change meant absolutely nothing. There are always some games that will 'benefit' from such "new" or well-established industry standards, but not all of them do. I WAS, however, satisfied with watching movies in true 1080p Blu-ray HD quality.

But purely for gaming? I'd say by now, in 2017, the best thing to do is simply move closer to my monitor if I want to look at details... and I'm not even joking. I HAVE actually seen 4K gaming in action, and I mean... sure, it's not "ugly" or anything, it's just... larger? More... "clear", maybe? Placebo? Maybe not; some games are being designed with 4K in mind now, but... yeah, most of them aren't.
The 'real' current benefit of 4K nowadays is almost exclusively for watching movies on UHD televisions (which, on a side note, IS impressive; I've seen that myself a few times too).