I recently upgraded to a GTX 780ti 3GB. What are the chances of a future PS4/Xbox One game using more than 3GB as VRAM and having better looking textures than a PC version of the same game running on a PC with a GTX 780ti 3GB?
Pretty high. We've already seen Watch_Dogs stuttering on <4GB VRAM.
It's unlikely that the consoles will have better textures than PC for the simple reason that they don't have enough grunt to push out ultra-HD textures at a decent frame rate; thus, the best they can hope for is 2048 x 2048 textures (each texture object of this size takes about 16MB in VRAM).
PC gamers have had access to 4096 x 4096 textures (about 67MB in VRAM) for a long time now (via Fallout, Dragon Age and Skyrim Ultra-HD mods).
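For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope script (mine, purely illustrative): it assumes plain uncompressed RGBA8 at 4 bytes per pixel, which is where the ~16MB and ~67MB figures come from, and adds the usual ~1/3 on top for a full mip chain.

```python
# Back-of-the-envelope texture memory math, assuming uncompressed RGBA8 (4 bytes/pixel).
# Decimal megabytes; real engines add a bit of overhead on top of this.

def texture_bytes(width, height, bytes_per_pixel=4, with_mips=False):
    """Approximate memory footprint of a single 2D texture."""
    base = width * height * bytes_per_pixel
    # A full mip chain converges to roughly 1/3 extra on top of the base level.
    return int(base * 4 / 3) if with_mips else base

for size in (2048, 4096):
    mb = texture_bytes(size, size) / 1e6
    mb_mips = texture_bytes(size, size, with_mips=True) / 1e6
    print(f"{size} x {size}: ~{mb:.1f} MB base level, ~{mb_mips:.1f} MB with a full mip chain")
```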
Texture compression would like to stop and have a word with you. And note that the last-gen consoles supported 4096 x 4096 textures and texture compression, though it's hard to say how widely such large textures were used.
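To put rough numbers on the compression point (again just my own sketch, using the standard bits-per-pixel of the DXT/BC family, which the GPU samples directly without decompressing back to RGBA8):

```python
# Rough VRAM footprint of one 4096 x 4096 texture in common formats.
# Bits-per-pixel values are the standard ones for block compression; driver overhead ignored.

FORMATS_BPP = {
    "RGBA8 (uncompressed)": 32,
    "BC1/DXT1 (no alpha)": 4,
    "BC3/DXT5": 8,
    "BC7": 8,
}

def size_mb(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / 1e6

for name, bpp in FORMATS_BPP.items():
    print(f"4096 x 4096 {name}: ~{size_mb(4096, 4096, bpp):.1f} MB")
```

Even a 4096 x 4096 texture drops to roughly 8-17MB once block-compressed, which is why raw RGBA8 math overstates the cost.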
After all, unified memory access is a recent phenomenon in the gaming scene as far as I know, whereas dedicated VRAM has been the status quo for many years.

The Nintendo 64 had a unified memory pool, as did the first Xbox.
The PS4 only gives about 6GB total to the game, and that has to service not only the GPU but also the CPU. Most games need at least 2GB just for the game world, which puts us at more like 4GB for the GPU.
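Taking those figures at face value (they're rough estimates, not official numbers), the split is simply:

```python
# Rough console memory budget using the estimates above (not official figures).
total_for_game_gb = 6.0   # claimed app-available slice of the 8GB unified pool
cpu_side_gb = 2.0         # "game world": simulation, AI, audio, streaming buffers, etc.
gpu_side_gb = total_for_game_gb - cpu_side_gb
print(f"Roughly {gpu_side_gb:.0f} GB left for GPU resources (textures, render targets, buffers)")
```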
Sure, you can compress textures, but pumping out that many ultra-HD textures at a good FPS in a complex game is beyond PS4/Xbone hardware. The transfer of such large textures into their VRAM over a low-bandwidth bus (compared to the 512-bit buses and >320GB/s of our current top cards) would also be a bottleneck for consoles.
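For what it's worth, here are the peak bandwidth figures I'm going off (commonly quoted theoretical peaks for 2013/2014 hardware; sustained streaming rates are of course lower) and how long one uncompressed 4K texture takes to move at each:

```python
# Peak memory bandwidth comparison; figures are theoretical peaks, real throughput is lower.

texture_mb = 67.0  # one uncompressed 4096 x 4096 RGBA8 texture

peak_bandwidth_gbps = {
    "Xbox One DDR3 main memory": 68.3,
    "PS4 unified GDDR5": 176.0,
    "High-end 2014 PC card GDDR5 (per the post, >320 GB/s)": 320.0,
}

for name, gbps in peak_bandwidth_gbps.items():
    seconds = texture_mb / (gbps * 1000)  # GB/s -> MB/s using decimal units
    print(f"{name}: ~{gbps:.0f} GB/s peak, ~{seconds * 1000:.2f} ms to move that texture once")
```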
I read that WD runs at 900p and 30 fps on the PS4, with "High" textures, lower quality settings, and reduced draw distance.
Expecting their lackluster hardware to handle games with 4K textures is a bit much. Hence, PC gaming will still be the best.
Watch Dogs is an okay game (I'd say it's like GTA V but WITHOUT any life or personality; GTA V is a better game), but the PC version is this demanding with this uneven performance? If this is the future of console ports, count me out; I'll just be done with PC gaming. Watch Dogs looks substantially worse than the best-looking PC games yet apparently gobbles up VRAM like it's nothing... even 6GB Titan cards are seeing 5.8GB of VRAM used? Compressed textures, anyone? (Nope, Ubi can't be bothered.) Can developers even be bothered to account for the fact that the PC doesn't have unified memory? Or do they just say "screw it" and ship a poor port anyway?
That's a freakin' JOKE, man, considering the game DOES NOT LOOK very good. Crysis 3 and Metro LL both look WAY BETTER than Watch Dogs.
If this is the future of console ports, if developers CAN'T be bothered to use compressed textures or even understand that the PC does NOT have unified memory, count me out. I'll just use a PS4 and be done with it.
The visuals of Watch Dogs at ultra versus the VRAM required for them are just a HUGE joke. HUGE HUGE joke. TotalBiscuit is still getting texture popping and sub-60 fps framerates with Titan SLI at 1080p ultra. LOL. And with TXAA it uses 5.8GB of VRAM on some configurations.
Look, man, I'd be all for heavy VRAM use if it benefited us. But that isn't happening here. Instead, with this POS game and Wolf: TNO, we're getting increasing VRAM requirements with ZERO added visual fidelity compared to "previous generation" AAA titles. So here we are, the new status quo: because the consoles have unified memory, we get console ports with outrageous VRAM requirements but without better visuals than the prior gen's AAA titles. How hilarious is that? Just using VRAM for the heck of it. Count me out if that's the future.
Maybe, just maybe, non-idiot developers can actually build game engines that differentiate between system RAM and VRAM, so that high VRAM requirements actually give us a real benefit, instead of saying "screw it" and shipping a straight port that assumes the PC has unified memory (which artificially inflates VRAM requirements). I can think of DX9 titles that look better. Hell, The Witcher 2 can look better than Watch Dogs, but it sure doesn't use 4-6GB of VRAM even at insanely high resolutions.
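Something like this is what I mean, as a toy sketch (the class and numbers are made up for illustration, not from any shipping engine): keep the full texture set in system RAM, enforce a hard VRAM budget, and evict least-recently-used textures instead of assuming unified memory.

```python
# Toy illustration of a VRAM residency budget: textures live in system RAM,
# and only what's actually in use is resident in a fixed VRAM budget,
# with least-recently-used eviction when the budget is exceeded.

from collections import OrderedDict

class VramBudget:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # texture id -> size, ordered by last use

    def request(self, tex_id, size_bytes):
        """Make a texture resident, evicting the least-recently-used ones if needed."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)   # mark as recently used
            return []
        evicted = []
        while self.used + size_bytes > self.budget and self.resident:
            old_id, old_size = self.resident.popitem(last=False)
            self.used -= old_size
            evicted.append(old_id)              # would be re-uploaded from system RAM later
        self.resident[tex_id] = size_bytes
        self.used += size_bytes
        return evicted

# Example: a 3GB budget (like the 780 Ti in the OP) fed 4K BC3 textures (~17MB each).
budget = VramBudget(3 * 1000**3)
for i in range(250):
    budget.request(f"tex_{i}", 17_000_000)
print(f"Resident: {len(budget.resident)} textures, ~{budget.used / 1e9:.2f} GB used")
```

A real engine would stream individual mip levels rather than whole textures, but the point stands: a 3GB card can handle a game with far more than 3GB of texture data if the engine actually manages residency instead of pretending everything fits.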