3GB VRAM vs 8GB Unified RAM on Xbox One/PS4

Dave3000

Golden Member
Jan 10, 2011
1,466
102
106
I recently upgraded to a GTX 780ti 3GB. What are the chances of a future PS4/Xbox One game using more than 3GB as VRAM and having better looking textures than a PC version of the same game running on a PC with a GTX 780ti 3GB?
 

Majcric

Golden Member
May 3, 2011
1,409
65
91
Slim to none. Your VRAM may not be sufficient for the entire life of the consoles, but even if you have to reduce settings, the Ti will still look better.
 

NTMBK

Lifer
Nov 14, 2011
10,400
5,635
136
Pretty high. We've already seen Watch_Dogs stuttering on <4GB VRAM.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I recently upgraded to a GTX 780ti 3GB. What are the chances of a future PS4/Xbox One game using more than 3GB as VRAM and having better looking textures than a PC version of the same game running on a PC with a GTX 780ti 3GB?

I'd say the chances are slim to none, unless the developer in question intentionally wanted to screw over PC gamers, or was simply incompetent.

Even in Watch Dogs, the PS4 uses the High texture setting, whereas the PC gets access to Ultra. The thing is, most high-end gaming PCs are going to have a larger total pool of memory than the PS4. It may not be unified, but a good developer knows how to work around those issues.

After all, unified memory access is a recent phenomenon in the gaming scene as far as I know, whereas dedicated VRAM has been the status quo for many years.

Both methods have their pros and cons. I think for low- to mid-range machines, unified memory access may be better. But for the high end, dedicated will likely always be better, as you can get much greater bandwidth and a bigger memory pool.

Latency is an issue, but PCIe 3.0 provides a direct pathway from the CPU to the GPU, so I'd say it's minimal. Plus, GPUs are good at hiding latency anyway, from what I've read.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Pretty high. We've already seen Watch_Dogs stuttering on <4GB VRAM.

Even Titan owners have reported stutter, so the problem isn't simply due to lack of VRAM.

It's likely a lack of optimization of their streaming technology for the PC. It's clear that Ubisoft focused heavily on the PS4 in terms of resources. The PC probably got put on the back burner, as it's easier to patch.
 
Feb 19, 2009
10,457
10
76
It's unlikely that the consoles will have better textures than PC, for the simple fact that they don't have enough grunt to push out ultra-HD textures at a decent frame rate; thus, the best they can hope for is 2048 x 2048 textures (each texture object of this size would take 16MB in vram).

PC gamers have had access to 4096 x 4096 textures (67MB in vram) for a long time now (via Fallout, Dragon Age and Skyrim Ultra-HD mods).

As to whether 3GB vram cards will be the bottleneck: definitely. We're just starting into the next-gen console life cycle and it's already happening.
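
Just as a back-of-the-envelope sanity check on those texture numbers (a quick Python sketch, assuming uncompressed 32-bit RGBA and ignoring mipmaps):

Code:
# Uncompressed texture footprint: width * height * 4 bytes (RGBA8)
def uncompressed_mib(size):
    return size * size * 4 / (1024 ** 2)

print(uncompressed_mib(2048))  # 16.0 MiB
print(uncompressed_mib(4096))  # 64.0 MiB (~67 million bytes, hence the "67MB" figure)
# A full mipmap chain adds roughly another third on top of these numbers.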
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
It's unlikely that the consoles will have better textures than PC, for the simple fact that they don't have enough grunt to push out ultra-HD textures at a decent frame rate; thus, the best they can hope for is 2048 x 2048 textures (each texture object of this size would take 16MB in vram).

PC gamers have had access to 4096 x 4096 textures (67MB in vram) for a long time now (via Fallout, Dragon Age and Skyrim Ultra-HD mods).
Texture compression would like to stop and have a word with you.;) And note that the last gen consoles supported 4096 x 4096 textures and texture compression, though it's hard to say how widely such large textures were used.
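
Roughly what block compression does to the figures above (a sketch; BC1/DXT1 stores each 4x4 pixel block in 8 bytes, BC3/DXT5 in 16 bytes):

Code:
# Block-compressed footprint for a square texture
def bc_mib(size, bytes_per_4x4_block):
    blocks = (size // 4) ** 2
    return blocks * bytes_per_4x4_block / (1024 ** 2)

print(bc_mib(4096, 8))   # BC1/DXT1:  8.0 MiB (8:1 vs. uncompressed RGBA8)
print(bc_mib(4096, 16))  # BC3/DXT5: 16.0 MiB (4:1 vs. uncompressed RGBA8)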
 

dacostafilipe

Senior member
Oct 10, 2013
792
274
136
As far as I know, most big 3D engines already support resource streaming, but it's still limited by the current APIs (because of all the resource types). With DX12/Mantle, tiled resources should be a lot easier to implement and should help a lot in terms of VRAM usage. Devs will then have the ability to manage the VRAM and load more or fewer resources as VRAM is available. It should not reduce VRAM usage, because it makes no sense not to use available VRAM, but you should not hit the VRAM "wall", because the engine will unload/reload needed resources on-the-fly.

So, no I don't think that it will happen.
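
Conceptually, a streaming engine working against a fixed VRAM budget behaves something like this (a simplified, hypothetical sketch, not how any particular engine or API actually implements it):

Code:
class TextureStreamer:
    """Toy LRU-style resident set with a fixed VRAM budget, in bytes."""
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = {}  # name -> (size_bytes, last_used_frame)
        self.used = 0

    def request(self, name, size, frame):
        if name in self.resident:            # already in VRAM, just mark it used
            self.resident[name] = (size, frame)
            return
        # Evict the least-recently-used textures until the new one fits
        while self.used + size > self.budget and self.resident:
            victim = min(self.resident, key=lambda n: self.resident[n][1])
            self.used -= self.resident.pop(victim)[0]
        self.resident[name] = (size, frame)  # "upload" to VRAM
        self.used += size

The point being: with fine-grained control (tiled resources, per-mip streaming), the engine keeps VRAM full of whatever is useful right now instead of slamming into a hard wall.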
 

Galatian

Senior member
Dec 7, 2012
372
0
71
Slightly off-topic, but are we really talking about "unified memory" in the sense of AMD's hUMA, or are we simply talking about the CPU and the GPU sharing the system memory? Because my impression was that neither the XBOne nor the PS4 uses something like hUMA, but I might be wrong?
 
Feb 19, 2009
10,457
10
76
Texture compression would like to stop and have a word with you.;) And note that the last gen consoles supported 4096 x 4096 textures and texture compression, though it's hard to say how widely such large textures were used.

Sure you can compress it, but pumping out that many ultra-HD textures at a good fps in a complex game is beyond PS4/Xbone hardware. The transfer of such large textures to their vram over a low bandwidth bus (compared to the 512bit buses and >320GB/s for our current top cards) would also be a bottleneck for consoles.

I read that WD is run at 900p on PS4 at 30 fps, with "High" textures and lower quality settings as well as reduced draw distance.

Expecting their lack-luster hardware to handle 4K texture games is a bit much. Hence, PC gaming will still be the best. :D
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The PS4 only gives 6GB total to the game, and that has to service not only the GPU but also the CPU. Most games need at least 2GB just for the game world, which leaves more like 4GB for the GPU.
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
the best they can hope for is 2048 x 2048 textures (each texture object of this size would take 16MB in vram).

Neither of those things is true. Pretty much every GPU from the last 10 years has supported 4096-pixel textures and texture compression.

Most games need at least 2GB just for the game world

On the PC, maybe. The PS3 did just fine with 256MB.

Any GPU has a maximum number of texture units it can support, so you couldn't use 6GB on textures if you wanted to. Memory bandwidth and shader complexity will have a bigger impact at this point in their lifecycle. Also, games store way more than just textures in memory, like framebuffers and meshes.

Killzone is already using 3GB of video memory.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I recently upgraded to a GTX 780ti 3GB. What are the chances of a future PS4/Xbox One game using more than 3GB as VRAM and having better looking textures than a PC version of the same game running on a PC with a GTX 780ti 3GB?

Unless they decide to purposely gimp the PC version of the game, on PC you'll always have the option of better-looking textures than the console, and with a GTX 780 Ti I can't see why it wouldn't look better than the console for its entire lifetime.

These consoles were built to be cheap and to turn a profit quickly, as we can see by Sony already being able to make the PS4 profitable this early in its life cycle.

Once E3 came around and we saw what the "next gen" games were, I was disappointed. When I built a new gaming PC I realized that the next-gen consoles really are a joke if you want the best graphics experience. Next-gen consoles are an ecosystem, not a graphics-pushing titan.
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
They still have better textures. OP, the answer to your question is no.


You guys can rationalize it as "laziness" on developers' part or whatever; the fact is that Nvidia cards bought before 2014 now run like crap in new games, and it's pretty ridiculous. If Nvidia didn't bilk their early adopters they might actually gain market share.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Sure you can compress it, but pumping out that many ultra-HD textures at a good fps in a complex game is beyond PS4/Xbone hardware. The transfer of such large textures to their vram over a low bandwidth bus (compared to the 512bit buses and >320GB/s for our current top cards) would also be a bottleneck for consoles.

I read that WD is run at 900p on PS4 at 30 fps, with "High" textures and lower quality settings as well as reduced draw distance.

Expecting their lack-luster hardware to handle 4K texture games is a bit much. Hence, PC gaming will still be the best. :D

Sorry, but your last two posts hurt my head. Texture size has absolutely NOTHING to do with the resolution that the game is rendered at. There is zero correlation between them. You can have a game that uses 5x5 pixel textures rendered at 4K, or you can have a game with 8000x8000 textures that is rendered at 320x240. Please stop using the term "4K textures".

Loading large textures is not the issue you seem to think it is. I assure you, your storage system is FAR slower than even the cheapest VRAM bus used today. High bandwidth comes into play when pulling out of memory to the frame buffer and for post-processing. Texture size has little impact on performance in most cases.
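
To illustrate with some rough numbers (a sketch, assuming 4 bytes per pixel per render target and ignoring MSAA and extra G-buffer targets):

Code:
# Render-target footprint depends only on output resolution, not texture size
def render_target_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

for w, h in [(1600, 900), (1920, 1080), (3840, 2160)]:
    print(f"{w}x{h}: {render_target_mib(w, h):.1f} MiB")
# ~5.5 MiB at 900p, ~7.9 MiB at 1080p, ~31.6 MiB at 4K per target;
# tiny next to a multi-gigabyte texture pool, and independent of texture resolution.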

As for memory usage, I have no doubt that 3GB will not be enough in the next 1-2 years, just like 2GB is not enough for most newer games today.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Watch Dogs is an okay game (I'd say it's like GTA V but WITHOUT any life or personality; GTA V is a better game), but the fact that the PC version is so demanding, with uneven performance? If this is the future of console ports, count me out. I'll just be done with PC gaming. The fact that Watch Dogs looks substantially worse than the best-looking PC games yet apparently gobbles up VRAM like it's nothing...even 6GB Titan cards are seeing 5.8GB of VRAM used? Compressed textures, anyone? (Nope, Ubi can't be bothered.) Can developers even be bothered with the fact that the PC doesn't have unified memory? Or do they just say "screw it" and make a poor port nonetheless?

That's a freakin' JOKE, man, considering the game DOES NOT LOOK very good. Crysis 3 and Metro LL both look WAY BETTER than Watch Dogs.

If this is the future of console ports, if developers CAN'T be bothered to use compressed textures or even understand that the PC does NOT have unified memory, count me out. I'll just use a PS4 and be done with it.

The visuals of Watch Dogs at ultra and the VRAM required for them are just a HUGE joke. A HUGE, HUGE joke. TotalBiscuit is still getting texture popping and sub-60 fps framerates with Titan SLI at 1080p ultra. LOL. And with TXAA it uses 5.8GB of VRAM on some configurations. :rolleyes:

Look, man, I'd be all for VRAM use if it benefited us. But that isn't happening here. Instead, with this POS game and Wolf: TNO, we're getting increasing VRAM requirements with ZERO added visual fidelity compared to "previous generation" AAA titles. So here we are, the new status quo. Because the consoles have unified memory, we get console ports with outrageous VRAM requirements but without better visuals than the prior gen's AAA titles. How hilarious is that? Just using VRAM for the heck of it. Count me out if that's the future.

Maybe, just maybe, non-idiot developers can actually create game engines that differentiate between system RAM and VRAM in the future, so that VRAM requirements, if high, actually give us a real benefit, instead of saying "screw it" and shipping a straight PC port that assumes the PC has unified memory (which artificially inflates VRAM requirements). I can think of DX9 titles that look better. Hell, Witcher 2 can look better than Watch Dogs. But it sure doesn't use 4-6GB of VRAM even at insanely high resolutions.
 
Last edited:

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
New iMacs on the table at WWDC 2014. Have the same finish as the current MacBook Pro and look very sleek


Edit: wrong forum
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Consoles have 3GB of RAM reserved for OS use, so only 5GB for the game. Of those 5GB, about 3-4GB should be in use for GPU resources.

If you want to match console settings, then 3 or 4GB of VRAM plus 8GB of main system memory will do it for now. There's a possibility that the reserved memory on consoles will decrease and devs will get access to a little more. And resource management is less fine-grained on PC, so developers can't really optimize memory usage.

Consoles have unified memory, so they save on space where PCs will have a duplicate copy of certain things in both VRAM and system RAM.
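
A toy accounting sketch of that duplication (made-up numbers; real engines don't necessarily keep staging copies of everything):

Code:
# Hypothetical GPU-resident assets for one scene, in MB (made-up numbers)
assets_mb = {"textures": 2500, "meshes": 400, "render_targets": 300}

unified_total = sum(assets_mb.values())                      # console: one pool, one copy
pc_vram       = sum(assets_mb.values())                      # PC: GPU-side copies
pc_sysram     = assets_mb["textures"] + assets_mb["meshes"]  # PC: staging copies in system RAM

print(unified_total, pc_vram, pc_sysram)  # 3200 vs. 3200 + 2900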

EDIT: In my opinion, matching console settings is a pretty low bar. You should be shooting for 1440p+, decent AA, and higher resolution textures. So I think everyone here should really re-evaluate how much VRAM is really wanted (as opposed to needed). If all you want to do is match console settings, then buy a console and stop posting in this forum. I'll be upgrading to a 6GB or bigger card next gen. It's pretty clear consoles have raised the bar, and I'm going to stay ahead of it. The thought of having just enough VRAM to match console settings on a card several times faster... well, I find it repulsive.
 
Last edited:

PC_gamer5150

Member
May 30, 2014
100
0
0
Watch Dogs is an okay game (I'd say it's like GTA V but WITHOUT any life or personality; GTA V is a better game), but the fact that the PC version is so demanding, with uneven performance? If this is the future of console ports, count me out. I'll just be done with PC gaming. The fact that Watch Dogs looks substantially worse than the best-looking PC games yet apparently gobbles up VRAM like it's nothing...even 6GB Titan cards are seeing 5.8GB of VRAM used? Compressed textures, anyone? (Nope, Ubi can't be bothered.) Can developers even be bothered with the fact that the PC doesn't have unified memory? Or do they just say "screw it" and make a poor port nonetheless?

That's a freakin' JOKE, man, considering the game DOES NOT LOOK very good. Crysis 3 and Metro LL both look WAY BETTER than Watch Dogs.

If this is the future of console ports, if developers CAN'T be bothered to use compressed textures or even understand that the PC does NOT have unified memory, count me out. I'll just use a PS4 and be done with it.

The visuals of Watch Dogs at ultra and the VRAM required for them are just a HUGE joke. A HUGE, HUGE joke. TotalBiscuit is still getting texture popping and sub-60 fps framerates with Titan SLI at 1080p ultra. LOL. And with TXAA it uses 5.8GB of VRAM on some configurations. :rolleyes:

Look, man, I'd be all for VRAM use if it benefited us. But that isn't happening here. Instead, with this POS game and Wolf: TNO, we're getting increasing VRAM requirements with ZERO added visual fidelity compared to "previous generation" AAA titles. So here we are, the new status quo. Because the consoles have unified memory, we get console ports with outrageous VRAM requirements but without better visuals than the prior gen's AAA titles. How hilarious is that? Just using VRAM for the heck of it. Count me out if that's the future.

Maybe, just maybe, non-idiot developers can actually create game engines that differentiate between system RAM and VRAM in the future, so that VRAM requirements, if high, actually give us a real benefit, instead of saying "screw it" and shipping a straight PC port that assumes the PC has unified memory (which artificially inflates VRAM requirements). I can think of DX9 titles that look better. Hell, Witcher 2 can look better than Watch Dogs. But it sure doesn't use 4-6GB of VRAM even at insanely high resolutions.

Great post! Would it be best to just lower the texture setting of the game to use less VRAM, since even on ultra it does not look all that mind-blowing?
I agree it is a good-looking game, but it's certainly never going to make me say Crysis 3 or even BF4 looks lame compared to it.
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
But you can't compare directly; the consoles run with so much less bloat and run software coded exactly for that hardware, so they get more out of less. I remember my Dreamcast only had 8MB of VRAM, but with compression, as I understand it, developers were able to use it like 40-64MB of VRAM depending on the technique they used. At any rate, I'm happy I got a 3GB card; I can probably happily use this card until 20nm for 19x12 resolution. Despite what some people say, I think at least 3GB will be as low as you want to go for upcoming games made for the new consoles with the settings turned up at all.
 

Bubbleawsome

Diamond Member
Apr 14, 2013
4,834
1,204
146
Regardless of how much VRAM it does use, it's crazy. In another generation or two we will have cards with 8GB+ of VRAM. That is more than the average person's RAM! I'm sure a competent developer could remake Watch Dogs and have it run flawlessly on ultra at 1080p on a 2GB card. Screw Ubisoft and EA.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Ubisoft sucks because they didn't bother to tailor their PC "port" to the PC. It treats VRAM as unified memory, as is the case on consoles. This artificially and exponentially inflates VRAM requirements when it in fact doesn't have to.

The bar is not being raised. The bar is being lowered with more shoddy PC ports. I do hope this is not indicative of things to come. As I said, I'm all for more VRAM if it does in fact result in higher fidelity graphics compared to prior generation games. That is NOT HAPPENING here.

There are DX9 games that look leagues better than Watch Dogs. The fact that Ubisoft did more or less a straight port with some hacked-on PC features is not cool. The game engine should address the fact that the PC does not have unified memory, but it doesn't, and it simply inflates VRAM requirements for no good reason. Again, there are Titan SLI configurations using nearly 6GB with TXAA. That is simply absurd given that the game does not look better than prior-gen DX9 games - there are many excellent DX9 games which looked very good in terms of graphics. This is not what I consider raising the bar. I consider it lowering the bar with poor PC ports which don't play to the strengths of the PC.

Bottom line is this: if games use more VRAM, that's fine, but they should LOOK LIKE they're using more VRAM, with higher visual fidelity. That just ain't happening. We just get artificially inflated VRAM usage because the engine thinks the PC has unified memory for some absurd reason and treats the VRAM as such. In the end it may not matter, because surely the next-gen cards will have 6-8GB of VRAM as the norm if this is indicative of things to come. I hope it isn't, but if it is, the GPU market will react accordingly with the GTX 800 series and whatever AMD has. The GTX 880 was rumored to have 8GB base. Maybe that's true. Shrug. But it's a truly sad and stupid situation that VRAM is being inflated because of more shoddy PC ports treating VRAM as unified memory, and VRAM being less about assets and more about anti-aliasing. A very stupid situation. I hope Watch Dogs is a one-off and not all games are like this, because next-gen VRAM use should actually look... next gen. Watch Dogs does not "look" next gen at all.
 
Last edited:

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
These new consoles (in relation to the ones they superseded) are much easier to develop for, so some developers (Ubisoft) are being sloppy *because they can get away with it*. PC ports are generally an afterthought, so it doesn't surprise me that video cards with 2+GB of fast GDDR5 are suffering.

Unfortunately I don't see this trend changing anytime soon and I bet we will see more and more of this from other studios.

Nvidia really should have read the market better and followed AMD and outfitted at least their higher end cards with 4GB of memory by default. Now we're going to see a bunch of very powerful cards suffer because of lazy development practices.

Another thing to consider is that historically Microsoft and Sony have allowed developers access to more hardware resources when pressured by game studios. Although it's not nearly as imbalanced this time, I imagine history will repeat itself with the new consoles, so the lack of VRAM may become a bigger problem over time.

I suspect AMD and Nvidia will have to make up for lazy development and start outfitting discrete cards with much more VRAM (even the low and mid tiers) soon. This will likely drive up costs, which we'll have to pay for. Plus, with the push for higher resolution displays, this kind of makes sense anyway.
 
Last edited: