3GB VRAM vs 8GB Unified RAM on Xbox One/PS4


SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
I recently upgraded to a GTX 780ti 3GB. What are the chances of a future PS4/Xbox One game using more than 3GB as VRAM and having better looking textures than a PC version of the same game running on a PC with a GTX 780ti 3GB?

Very high, though that doesn't mean buying a slower card with 4GB is better either. Memory frequency is also very important: even if you have 3GB and the game needs 4GB, faster memory loads textures quicker, which makes it better.

So yes, we are going to see more games using more than 3GB in the future, but it's only going to be a handful of games over the next 2-3 years.

If anything, I think console games will cap memory use at about 3GB, since the hardware stays the same and both systems work with only 8GB of memory, around 2GB of which is used just for the OS. That leaves roughly 6GB to split between video RAM and regular RAM, so consoles are likely to hold video memory at about 3GB rather than push it to something like 4GB.

Right now Watch Dogs uses 3GB for Ultra textures, and I don't expect many games to go over that. Sure, as time goes by we are going to see more demanding games, that's just the way it is, but consoles are always going to be behind.

A GTX 760 or R7 270X is faster than the console GPUs.
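
For what it's worth, the budget arithmetic in that post works out roughly like this. A minimal sketch in Python, where the 8GB total and ~2GB OS reservation are the poster's figures and the even split between GPU assets and CPU-side data is a made-up illustration:

```python
# Rough console memory budget using the figures from the post above.
# The even split between GPU assets and CPU-side game data is a made-up
# illustration, not a published figure.

TOTAL_UNIFIED_GB = 8.0   # PS4 / Xbox One unified memory pool
OS_RESERVED_GB = 2.0     # roughly what the post assumes the OS takes

game_budget = TOTAL_UNIFIED_GB - OS_RESERVED_GB   # ~6 GB left for the game
gpu_assets = game_budget / 2                      # hypothetical split
cpu_data = game_budget - gpu_assets

print(f"Game budget:      {game_budget:.1f} GB")
print(f"GPU-side assets:  ~{gpu_assets:.1f} GB (textures, buffers)")
print(f"CPU-side data:    ~{cpu_data:.1f} GB (logic, audio, streaming)")
```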
 

PC_gamer5150

Member
May 30, 2014
100
0
0
Very high, though that doesn't mean buying a slower card with 4GB is better either. Memory frequency is also very important: even if you have 3GB and the game needs 4GB, faster memory loads textures quicker, which makes it better.

So yes, we are going to see more games using more than 3GB in the future, but it's only going to be a handful of games over the next 2-3 years.

If anything, I think console games will cap memory use at about 3GB, since the hardware stays the same and both systems work with only 8GB of memory, around 2GB of which is used just for the OS. That leaves roughly 6GB to split between video RAM and regular RAM, so consoles are likely to hold video memory at about 3GB rather than push it to something like 4GB.

Right now Watch Dogs uses 3GB for Ultra textures, and I don't expect many games to go over that. Sure, as time goes by we are going to see more demanding games, that's just the way it is, but consoles are always going to be behind.

A GTX 760 or R7 270X is faster than the console GPUs.
I agree. I just don't see why games would all of a sudden require 4-6GB of VRAM; it doesn't make sense. Over time, yes.
Watch Dogs is a broken game, so it is really hard to use as an indicator.
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
I love how when a game looks crappy everyone is all "omg wtf why didn't dev put super high res textures!!" Then when a game overruns their own card's amount of VRAM, it switches to "wtf dev didn't optimize!!!1!"



Really, a lot of members are eating crow right now about claiming 3GB would be plenty until 2015. And they are ITT complaining about having high-res, uncompressed textures.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,766
784
126
All those Nvidia 770s with 2 gigs of VRAM... ouch.

I think 3 gigs is probably OK for a couple of years. Watch Dogs is just a shoddy game. People are having problems on Titans, FFS. Look at TotalBiscuit's video.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,766
784
126
I love how when a game looks crappy everyone is all "omg wtf why didn't dev put super high res textures!!" Then when a game overruns their own card's amount of VRAM, it switches to "wtf dev didn't optimize!!!1!"



Really, a lot of members are eating crow right now about claiming 3GB would be plenty until 2015. And they are ITT complaining about having high-res, uncompressed textures.

Which game in 2014 is using more than 3 gigs of VRAM that isn't shoddily optimized? Watch Dogs can't even run correctly on Titans, so that's just a poor example.

I think The Witcher 3 could be very, very interesting, but that's still a year away and we don't know enough about the game to make a call either way.
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
Which game in 2014 is using more than 3 gigs of VRAM that isn't shoddily optimized? Watch Dogs can't even run correctly on Titans, so that's just a poor example.



I think The Witcher 3 could be very, very interesting, but that's still a year away and we don't know enough about the game to make a call either way.


Watch Dogs can use all the graphical fidelity it can get, and just because it's designed to use large amounts of VRAM doesn't mean it's poorly optimized. It actually means it's using all the resources available, and I'm glad they have found a use for 4GB-plus.
 

PC_gamer5150

Member
May 30, 2014
100
0
0
I love how when a game looks crappy everyone is all "omg wtf why didn't dev put super high res textures!!" Then when a game overruns their own card's amount of VRAM, it switches to "wtf dev didn't optimize!!!1!"



Really, a lot of members are eating crow right now about claiming 3GB would be plenty until 2015. And they are ITT complaining about having high-res, uncompressed textures.

The problem with your statement is that even the dev said it needs a patch to fix it.
There are many people with 4GB and 6GB cards still having issues with Watch Dogs. ;)
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The problem with your statement is that even the dev said it needs a patch to fix it.
There are many people with 4GB and 6GB cards still having issues with Watch Dogs. ;)

It's pretty obvious we are going to see escalating VRAM usage. 3 gigs will most likely be enough for 1080p until the next gen cards drop. After that???
 

Bubbleawsome

Diamond Member
Apr 14, 2013
4,833
1,204
146
My 2GB 770 is starting to look out of the game. o_O I just paid $300 for the dang thing. :|
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
My 2GB 770 is starting to look out of the game. o_O I just paid $300 for the dang thing. :|

Maybe sell it while it still has value and get a 780 or 290. You can get a 290 for not a whole lot more than you paid for the 770.

Or, if it's new enough, return it and get a different card.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Fact is, Nvidia cards bought before 2014 now run like crap in new games, and it's pretty ridiculous.

Not sure what games you are referring to. Up until I upgraded my monitor to 2560x1080, my 670 was running games really well. For the most part it handled them at the new res, but I wanted to be sure for the future, so I got my 780, just in time for Watch Dogs, which is running smoothly on my rig at max details.

Again, there are Titan SLI configurations using nearly 6GB with TXAA. That is simply absurd given that the game does not look better than prior-gen DX9 games - there are many excellent DX9 games which looked very good in terms of graphics. This is not what I consider raising the bar. I consider it lowering the bar with poor PC ports which don't play to the strengths of the PC.

I'm really not sure what Nvidia's obsession with TXAA is when SMAA does a great job without blurring textures.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
That's a freakin' JOKE, man, considering the game DOES NOT LOOK very good. Crysis 3 and Metro LL both look WAY BETTER than Watch Dogs.

Why do you continually make such false comparisons between Watch Dogs, a vast, dense, seamless open-world game, and Crysis 3 and Metro LL, both of which are much smaller, segmented games with a narrow focus?

For an open-world game, Watch Dogs looks amazing, if you ask me. Sure, it was downgraded, but even after all that, the PC version still looks incredible on max settings.

If this is the future of console ports, if developers CAN'T be bothered to use compressed textures or even understand that the PC does NOT have unified memory, count me out. I'll just use a PS4 and be done with it.
Yeah, I'm ticked off about that myself. I'm never going to preorder an Ubisoft game again. That said, I have no doubt that the data-streaming system on the PC will be fixed.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I'm really not sure what Nvidia's obsession with TXAA is when SMAA does a great job without blurring textures.

SMAA is good no doubt, but comparing it to TXAA is stretching things a bit. TXAA has much better edge correction, plus it almost totally eradicates temporal aliasing.

The blur is unfortunate, I agree, but compared to the advantages it's a small price to pay, IMO.
 

janii

Member
Nov 1, 2013
52
0
0
Pretty high. We've already seen Watch_Dogs stuttering on <4GB VRAM.
I don't think it has anything to do with VRAM.
I use a GTX 580 and set it to Ultra.
Note that it has 1.5GB of VRAM, and it was pretty much maxed out.

I only lowered AA and shadows a bit.
The game runs at an average of 45 FPS, but with that stutter issue. And the exact same stutter shows up on high-end cards.

It's not the first game that has maxed out my VRAM, but there were no weird stutter issues in those games.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There is a chance that a patch to Watch Dogs will fix the stuttering and pauses caused by the streaming texture loading, and Ultra will actually run on a 2GB GPU. It's likely a bug in the game or the driver (or the interplay of the two) that results in the poor performance.

A game like BF4, for example, will happily fill 2GB, 3GB, etc.; in most multiplayer maps it can fill 3.5GB with resources. But to render the scene ahead of it, the game doesn't actually need much more than about 1.2GB, and it runs smoothly on 2GB. So there is every chance that Watch Dogs can and will run smoothly once the streaming is fixed as well.

Ideal and required amounts of VRAM are very different, and we are now in a situation where it's impossible to measure the difference; only performance can tell the story. GPU-Z's VRAM usage graphs are now kind of useless due to everyone using much more advanced world-streaming technology.
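
BrightCandle's allocated-versus-required distinction can be sketched in code. The toy Python cache below is purely illustrative (the TextureCache class, sizes, and counts are invented, not from any real engine): it opportunistically fills whatever budget it is given, so a monitoring tool reports near-full VRAM even though the per-frame working set is far smaller.

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache: keeps textures resident until the budget is hit."""
    def __init__(self, vram_budget_mb):
        self.budget = vram_budget_mb
        self.resident = OrderedDict()   # texture id -> size in MB

    def request(self, tex_id, size_mb):
        # A hit only refreshes recency; a miss streams the texture in and
        # evicts least-recently-used textures once the budget is exceeded.
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)
            return
        self.resident[tex_id] = size_mb
        while sum(self.resident.values()) > self.budget:
            self.resident.popitem(last=False)   # evict the LRU texture

# ~60 visible textures of 20 MB each is a 1.2 GB per-frame working set,
# yet the cache keeps recently used textures around and grows toward its
# 3.5 GB budget -- so a tool reports "3.5 GB used" even though far less
# is strictly required to render any single frame.
cache = TextureCache(vram_budget_mb=3500)
for frame in range(200):
    for tex_id in range(frame, frame + 60):
        cache.request(tex_id, size_mb=20)
print(f"Resident: {sum(cache.resident.values()) / 1024:.2f} GB")
```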
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
High-res textures carry diminishing returns on quality when the output is only 1080p, so devs pushing texture sizes over 3GB is unlikely. When you consider that the entire OS, game world, and textures have to fit in the 8GB, with 3GB preallocated right off the top, I'm sure 3-4GB of VRAM will be plenty for texture parity this gen.
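
A rough sketch of the texture arithmetic behind that diminishing-returns point; the 4 bytes/texel (RGBA8), 0.5 bytes/texel (BC1/DXT1), and ~33% mip-chain overhead figures are standard approximations, and the helper function is illustrative:

```python
def texture_size_mb(width, height, bytes_per_texel, mips=True):
    """Approximate GPU memory for one texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mips else 1) / (1024 ** 2)

for res in (1024, 2048, 4096):
    uncompressed = texture_size_mb(res, res, 4.0)   # RGBA8: 4 bytes per texel
    bc1 = texture_size_mb(res, res, 0.5)            # BC1/DXT1: 0.5 bytes per texel
    print(f"{res}x{res}: {uncompressed:6.1f} MB uncompressed, {bc1:5.1f} MB BC1")

# A 4096x4096 texture is ~85 MB uncompressed but ~11 MB block-compressed,
# and quadrupling resolution quadruples memory -- which is why uncompressed
# ultra texture packs blow past 3 GB long before they look four times better
# on a 1080p screen.
```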
 

tential

Diamond Member
May 13, 2008
7,355
642
121
Watch Dogs is an okay game (I'd say it's like GTA V but WITHOUT any life or personality; GTA V is a better game), but the fact that the PC version is so demanding with such uneven performance? If this is the future of console ports, count me out. I'll just be done with PC gaming. The fact that Watch Dogs looks substantially worse than the best-looking PC games yet apparently gobbles up VRAM like it's nothing... even 6GB Titan cards are having 5.8GB of VRAM used? Compressed textures, anyone? (Nope, Ubi can't be bothered.) Can developers even be bothered with the fact that the PC doesn't have unified memory? Or do they just say "screw it" and make a poor port anyway?

That's a freakin' JOKE, man, considering the game DOES NOT LOOK very good. Crysis 3 and Metro LL both look WAY BETTER than Watch Dogs.

If this is the future of console ports, if developers CAN'T be bothered to use compressed textures or even understand that the PC does NOT have unified memory, count me out. I'll just use a PS4 and be done with it.

The visuals of Watch Dogs at Ultra, and the VRAM required for them, are just a HUGE joke. A HUGE, HUGE joke. TotalBiscuit is still getting texture popping and sub-60 FPS framerates with Titan SLI at 1080p Ultra. LOL. And with TXAA it uses 5.8GB of VRAM on some configurations. :rolleyes:

Look, man, I'd be all for VRAM use if it benefited us. But that isn't happening here. Instead, with this POS game and Wolfenstein: TNO, we're getting increasing VRAM requirements with ZERO added visual fidelity compared to "previous generation" AAA titles. So here we are, the new status quo. Because the consoles have unified memory, we get console ports with outrageous VRAM requirements but without better visuals than prior-gen AAA titles. How hilarious is that? Just using VRAM for the heck of it. Count me out if that's the future.

Maybe, just maybe, non-idiot developers can actually create game engines that differentiate between system RAM and VRAM in the future, so that VRAM requirements, if high, actually give us a real benefit, instead of saying "screw it" and shipping a straight PC port that assumes the PC has unified memory (which artificially inflates VRAM requirements). I can think of DX9 titles that look better. Hell, The Witcher 2 can look better than Watch Dogs. But it sure doesn't use 4-6GB of VRAM, even at insanely high resolutions.

I have zero intention of rewarding developers who put zero effort into their PC port.

If they don't fix it, I'm not going to reward them by picking up a PS4 and just playing it there. I'll just play another game. There are tons of games out there. Ubisoft, EA, etc. all need to stop getting rewarded (often by people preordering, and so having zero knowledge of what issues may be there) for making what are essentially unfinished games.
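
The unified-memory complaint in this post comes down to what has to be VRAM-resident. The sketch below is purely illustrative (the asset names and sizes are invented): budgeting everything against one pool demands far more VRAM than splitting GPU-visible assets from data that can stay in system RAM.

```python
# Hypothetical asset list: (name, size in MB, must it be GPU-visible?).
# The names and sizes are invented purely for illustration.
assets = [
    ("ultra_textures",    2400, True),
    ("geometry",           600, True),
    ("audio_banks",        900, False),
    ("ai_navmesh",         300, False),
    ("streaming_staging",  500, False),
]

# Unified-memory assumption: everything is budgeted against one pool.
unified_pool_mb = sum(size for _, size, _ in assets)

# Split-memory engine: only GPU-visible assets need to be VRAM-resident;
# everything else stays in system RAM.
vram_mb = sum(size for _, size, gpu in assets if gpu)
system_ram_mb = sum(size for _, size, gpu in assets if not gpu)

print(f"One unified pool:  {unified_pool_mb} MB demanded")
print(f"Split budgeting:   {vram_mb} MB VRAM + {system_ram_mb} MB system RAM")
```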
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Ubisoft sucks because they didn't bother to make their PC "port" for the PC. It treats VRAM as unified memory, as is the case on consoles. This artificially inflates VRAM requirements when it doesn't have to.

The bar is not being raised. The bar is being lowered with more shoddy PC ports. I do hope this is not indicative of things to come. As I said, I'm all for more VRAM if it does in fact result in higher-fidelity graphics compared to prior-generation games. That is NOT HAPPENING here.

There are DX9 games that look leagues better than Watch Dogs. The fact that Ubisoft did more or less a straight port with some hacked-on PC features is not cool. The game engine should account for the fact that the PC does not have unified memory, but it doesn't, and it simply inflates VRAM requirements for no good reason. Again, there are Titan SLI configurations using nearly 6GB with TXAA. That is simply absurd given that the game does not look better than prior-gen DX9 games - there are many excellent DX9 games which looked very good in terms of graphics. This is not what I consider raising the bar. I consider it lowering the bar with poor PC ports which don't play to the strengths of the PC.

Bottom line is this: if games use more VRAM, that's fine, but they should LOOK LIKE they're using more VRAM, with higher visual fidelity. That just ain't happening. We just get artificially inflated VRAM because the engine thinks the PC has unified memory for some absurd reason and treats the VRAM as such. In the end it may not matter, because surely the next-gen cards will have 6-8GB of VRAM as the norm if this is indicative of things to come. I hope it isn't, but if it is, the GPU market will react accordingly with the GTX 800 series and whatever AMD has. The GTX 880 was rumored to have 8GB base. Maybe that's true. Shrug. But it's a truly sad and stupid situation that VRAM is being inflated because of more shoddy PC ports treating VRAM as unified memory, and VRAM being less about assets and more about anti-aliasing. A very stupid situation. I hope Watch Dogs is a one-off and not all games are like this, because next-gen VRAM use should actually look... next-gen. Watch Dogs does not "look" next gen at all.


Just a thought... but what if the game actually isn't coded that badly so much as some hardware just isn't as capable as other hardware? In this case, 2GB cards just can't handle settings that those with more VRAM can handle. Don't get me wrong, I don't know of many big-developer PC games that don't get patches along the way, and I'm sure this one will be the same. It just seems like a lot of people are quick to jump on the bad-coding/bad-developer bandwagon, but is there any real proof that this was some kind of quick and very ugly port?
 
Last edited:

PC_gamer5150

Member
May 30, 2014
100
0
0
Just a thought... but what if the game actually isn't coded that badly so much as some hardware just isn't as capable as other hardware? In this case, 2GB cards just can't handle settings that those with more VRAM can handle. Don't get me wrong, I don't know of many big-developer PC games that don't get patches along the way, and I'm sure this one will be the same. It just seems like a lot of people are quick to jump on the bad-coding/bad-developer bandwagon, but is there any real proof that this was some kind of quick and very ugly port?

That's the thing: the rep from Ubi said there is a patch in the works to fix stuttering and crashes, BUT he also said that this game uses 3+GB of VRAM for Ultra textures.
I took that to mean that even once fixed, this game will need more than 3GB of VRAM for max settings.
Here is his quote: "A 3GB card is ok on Ultra. But pushing past 1080p or activating any MSAA/TXAA you can go over. You need 4GB for that."
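
That quote is consistent with simple render-target arithmetic. Below is a hedged sketch (it ignores TXAA's extra history buffers, G-buffers, shadow maps, and driver overhead; the target list is a deliberate simplification) showing how resolution and MSAA alone scale memory:

```python
def render_targets_mb(width, height, msaa_samples=1):
    """Very rough: one MSAA color target (RGBA8) + one depth/stencil target
    (D24S8), plus a resolved color buffer. Real engines have many more."""
    px = width * height
    msaa_color = px * 4 * msaa_samples
    msaa_depth = px * 4 * msaa_samples
    resolved = px * 4
    return (msaa_color + msaa_depth + resolved) / (1024 ** 2)

for w, h in ((1920, 1080), (2560, 1440), (3840, 2160)):
    for samples in (1, 4):
        mb = render_targets_mb(w, h, samples)
        print(f"{w}x{h} @ {samples}x MSAA: ~{mb:.0f} MB for the main targets")

# ~24 MB at 1080p with no MSAA versus ~285 MB at 4K with 4x MSAA -- and every
# G-buffer, shadow map, and post-process target scales the same way, which is
# how "over 3 GB at Ultra" becomes "you need 4 GB" once MSAA/TXAA or
# resolutions past 1080p come into play.
```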
 
Last edited:

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
That's the thing: the rep from Ubi said there is a patch in the works to fix stuttering and crashes, BUT he also said that this game uses 3+GB of VRAM for Ultra textures.
I took that to mean that even once fixed, this game will need more than 3GB of VRAM for max settings.
Here is his quote: "A 3GB card is ok on Ultra. But pushing past 1080p or activating any MSAA/TXAA you can go over. You need 4GB for that."

Must be a misprint, as all the experts said over and over that peeps will never, never, never need more than 3GB of VRAM in any game.
This time will be different, trust us.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
Just a thought... but what if the game actually isn't coded that badly so much as some hardware just isn't as capable as other hardware? In this case, 2GB cards just can't handle settings that those with more VRAM can handle. Don't get me wrong, I don't know of many big-developer PC games that don't get patches along the way, and I'm sure this one will be the same. It just seems like a lot of people are quick to jump on the bad-coding/bad-developer bandwagon, but is there any real proof that this was some kind of quick and very ugly port?

So you're saying that even though cards with 6GB of VRAM (Titan) are still having issues with this game, it's just the cards? That we just need cards with more VRAM?

So how much VRAM do you think we need for Watch Dogs if a 6GB Titan still shows issues? Maybe 8GB? 10GB?

It seems to me that enthusiasts just want games to use up resources, regardless of whether there is a quality improvement. What's the point of a game that not even a 6GB Titan's VRAM can handle if the game looks no better than Crysis?

Would you tout Minesweeper as the best game ever if it required 10GB of VRAM? Would we all be bragging about being able to run Minesweeper, or would we say the game is incredibly inefficient?

Only the PC enthusiast market would dream up scenarios in which we applaud devs for making games less efficient because now we can buy faster hardware to run them, lol...
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There is a bug in Watch Dogs which is causing poor performance with Ultra on Nvidia. According to the developers, a 780 ought to work, but it doesn't. I expect that will be fixed by either Ubisoft or Nvidia; it looks like a problem with the streaming.

The need for 3GB is still a bit odd. So few people have cards with more than 2GB that this really is a setting aimed at a very small subset of the market. Ultra generally runs on 680/7970-class hardware; it's literally just that one setting. I do wonder why they did that, but the console version is equivalent to High, so it's not because of the consoles. Maybe they just decided to target the tiny percentage of super-high-end users with that one setting, but it's inconsistent, especially to favour AMD cards in this way. This is the first time I have seen a game target a refresh card and not its predecessor.

I don't think we will see higher than 3GB unless it's an AMD-sponsored game, because it makes little sense to target less than 1% of the market with a setting; it's not economical to waste dev time that way. I am surprised to see such a massive increase in VRAM usage at this point. It doesn't match the hardware actually being used very well, which, as I suggest above, is an unfortunate consequence of something else. Could be wrong, probably am; my ability to predict the future is as useless as everyone else's.
 

PC_gamer5150

Member
May 30, 2014
100
0
0
BrightCandle summed it up when he said predicting the future is useless,
unless of course you are Nostradamus or something, and even he was wrong
A LOT! :cool: