
The main reasons for inflated VRAM requirements

Page 8 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
The PS3 had more theoretical performance on offer, but due to its odd architecture it took a lot more effort to unlock that potential. With most games being straight ports, the PS3 ended up being hard to extract performance from, and rather than spend considerable time playing to the PS3's strengths, which no other system would benefit from, many developers simply let the PS3 port have worse graphics and resolution.

To some extent I think we are seeing the reverse of that with the XB1. The fancy high-speed cache it has is faster than the PS4's memory, but properly utilising it requires specialist programming. With the base hardware somewhat slower and this specialist feature needed to accelerate it, the end result is that cross-platform developers are going for the easier option of reducing graphics.

The Xbox 360 and PS3 at least had strengths and weaknesses relative to each other, although I'd give the edge to the 360. (eDRAM + better GPU + more VRAM is a huge triple-threat advantage, although the Cell is essentially a second GPU for the PS3, just a really hard-to-use one with huge memory issues.)

The Xbox One's cache isn't particularly hard to use; it's just tiny, and not even all that fast. It seems like Microsoft bet on GDDR5 being way more expensive/limited than it turned out to be and lost horribly, with the Xbox One SoC being more expensive than the PS4 SoC while having slower computational resources and lower bandwidth. The ESRAM is comparable in bandwidth to the PS4's GDDR5, but it's a small pool of RAM.

Apparently the PS4 was originally supposed to have 2GB or 4GB of GDDR5; it was only last-minute luck that GDDR5 densities advanced enough and got cheap enough that Sony could fit 8GB. Had it been a 2GB or 4GB PS4 versus an Xbox One with ESRAM, things wouldn't be so clear cut.
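For reference, the peak-bandwidth figures usually thrown around in these comparisons fall out of simple arithmetic. The bus widths and data rates below are the commonly cited launch specs (assumptions for illustration, not measurements from this thread):

```python
# Peak bandwidth (GB/s) = bus width in bytes * per-pin data rate (Gbps).
# Commonly cited figures (assumed, not measured):
#   PS4:      256-bit GDDR5 at 5.5 Gbps/pin
#   Xbox One: 256-bit DDR3 at 2.133 Gbps/pin, plus a 32 MB ESRAM pool

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Theoretical peak bandwidth of a memory interface in GB/s."""
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gbs(256, 5.5))    # PS4 GDDR5: 176.0 GB/s
print(peak_bandwidth_gbs(256, 2.133))  # XB1 DDR3: ~68.3 GB/s
```

The ESRAM's quoted figure lands somewhere around 100-200 GB/s depending on how overlapped reads and writes are counted, i.e. roughly in the PS4's ballpark, which is the "comparable bandwidth, but a small pool" point above.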
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Having background features like game recording and a heavy OS made the 8 GB much more necessary. Outside of that, 4 GB of GDDR5 would not have been that bad but it would've been limiting as the years rolled by. Though, less memory is somewhat a blessing in disguise. With less memory to hold extraneous textures or geometry, that means devs can't overload the GPU as easily, and perhaps we'd have more 60 FPS games.
 

BD2003

Lifer
Oct 9, 1999
16,815
0
76
Having background features like game recording and a heavy OS made the 8 GB much more necessary. Outside of that, 4 GB of GDDR5 would not have been that bad but it would've been limiting as the years rolled by. Though, less memory is somewhat a blessing in disguise. With less memory to hold extraneous textures or geometry, that means devs can't overload the GPU as easily, and perhaps we'd have more 60 FPS games.

I'm glad the consoles are pumping out 30fps games....it means much better looking 60fps games on PC.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
745
126
The question is why in Unity are they using the CPU for global illumination lighting when lighting is best performed by the GPU? In all of the AMD Gaming Evolved titles such as Hitman: Absolution, Tomb Raider, DiRT Showdown, Sniper Elite III and Sleeping Dogs, the programmers used GCN's compute shaders for lighting. It is no wonder the difference for them was only 1-2 FPS between the consoles, since they didn't even take advantage of the PS4's 50% added GPU compute power. This also means they are leaving a lot of performance on the table and overloading the CPU with a very computationally intensive task.

If what they are saying is true, this could mean really bad optimization for the PC. Cards like the 280X/290X are loaded with GPU compute performance for global illumination lighting, but if Ubisoft offloaded this function to the CPU, then you start to become CPU bottlenecked, while the core competitive advantage of the underlying GPU architecture is basically swept under the rug/ignored. I guess we will see a large difference between i5 and i7 and 6-8 core models then.
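To illustrate why lighting is such a natural fit for GPU compute (a toy sketch, not anything from Ubisoft's engine): per-pixel shading terms are evaluated independently for every pixel, which is exactly the data-parallel shape compute shaders are built for.

```python
import numpy as np

# Toy data-parallel lighting: every pixel evaluates max(N . L, 0)
# independently, with no cross-pixel dependencies -- the reason this
# kind of work maps cleanly onto GPU compute shaders.

def lambert(normals, light_dir):
    """Diffuse (Lambert) term for an (H, W, 3) array of unit normals."""
    l = np.asarray(light_dir, dtype=np.float64)
    l = l / np.linalg.norm(l)            # normalize the light direction
    return np.clip(normals @ l, 0.0, None)

h, w = 1080, 1920
normals = np.zeros((h, w, 3))
normals[..., 2] = 1.0                    # every normal faces the camera
shade = lambert(normals, [0.0, 0.0, 1.0])
print(shade.shape)                       # (1080, 1920): one value per pixel
```

A CPU grinding through two million of these per frame (and real GI math is far heavier than a single Lambert term) steals cycles from simulation and draw-call submission, which is the bottleneck being described above.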

What stands out is he implies they did a really good job optimizing for a new-gen console so early in a generation, yet 9 months ago the game was only running at 9 FPS. That tells me they had no clue how to optimize for the PS4/XB1 and Unity is just their early effort. If they were extracting 99% out of the consoles, they wouldn't have been able to go from 9 to 30 FPS, because the game would have already been running at decent levels to begin with. Sounds to me like the full potential of current-gen consoles is nowhere near utilized by Ubisoft. And considering every recent Ubisoft game is an unoptimized turd performance-wise, I don't expect them to be the epitome of industry optimization.

As has been mentioned in this thread, for most games 3-4 GB of VRAM is still sufficient. I would say the culprit behind huge VRAM requirements has more to do with a lack of proper optimization in the game or the game engine itself. Games like Mordor, The Evil Within or Watch Dogs don't look anywhere near as good as Crysis 3, Metro: Last Light or Ryse: Son of Rome. Thus, if they have steeper requirements and look/run worse, it simply means they aren't as well optimized as the three best-looking PC games.
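As a rough sanity check on where VRAM actually goes (back-of-the-envelope arithmetic, not any particular game's budget): texture memory is just resolution times bytes per pixel, plus about a third extra for the mip chain.

```python
# Back-of-the-envelope texture memory estimate. Uncompressed RGBA8 is
# 4 bytes/pixel; a full mip chain adds ~1/3 on top of the base level
# (sum of the geometric series 1 + 1/4 + 1/16 + ... = 4/3).

def texture_mib(width, height, bytes_per_pixel=4, mips=True):
    """Approximate size of one texture in MiB."""
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mips else base
    return total / (1024 ** 2)

# One uncompressed 4096x4096 texture with mips is ~85 MiB; block
# compression (BC1/BC7) cuts that by 4-8x, so an engine that ships
# uncompressed or poorly streamed textures balloons VRAM use fast.
print(round(texture_mib(4096, 4096), 1))   # 85.3
```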

Who is to say that with 8-9 more months of optimization Ubisoft couldn't start using compute shaders and offload more of the workload to the GPU? With pressure from MS to have the XB1 version look nearly as good, and from NV with GameWorks, it would make no sense for Ubisoft to take full advantage of GCN compute power, since then the PS4 version would blow the XB1 version away and the 290X would smoke the 970/980 in a GameWorks title. You would pretty much piss off both of the partners helping to finance your game's development costs.

It won't be until Naughty Dog and the like code a game from scratch for the PS4 that we will see the full potential of these consoles. With deadlines to release games at certain times to achieve optimal earnings, we are going to see more and more unoptimized next-gen games, until our hardware catches up enough that it won't matter.
 

Carfax83

Diamond Member
Nov 1, 2010
6,064
868
126
The question is why in Unity are they using the CPU for global illumination lighting when lighting is best performed by the GPU?
Well, assuming it was all true to begin with, the lighting is baked. That means it's been precomputed and only has to be decompressed. Apparently there's so much pre-packaged data that the CPU has to help out with the decompression.

Anyway, I think people are really underestimating this title. AC Unity will be BY FAR the most technologically advanced title coming out this year. Here's what one guy had to say about the Xbox One preview build he got to play:

The draw distance in the game was impressive, according to Jindrax, and the Paris city view and scale from top of the building was just "Insane". He stated that "Paris city was spread out as far as the eye could see" and "It really felt like this huge city that was alive. The amount of people of the streets were also breathtaking."

Jindrax was unable to make out the resolution and anti-aliasing, as he was standing pretty close to the TV (the wire connected to the controller is pretty short), but he did share his view on FPS performance.

He stated the FPS did dip under 30 pretty frequently, every time a combat sequence was entered or exited or some crazy free-running stuff was performed. But as mentioned above, this was a demo build and there will be room for more improvement.

Jindrax completed his impression with: "FPS drop didn't take away from the experience, the scope of this game is insane. That first time I climbed and saw the city rendered as far as I could see I was really awe struck."
Source

Given the size and scope of the game, I can easily understand why the current-gen consoles would have difficulty running it. Also, the game may be using the Xbox One's cloud computing capability to offload AI and physics, which might equalize things a bit more than usual.
 

escrow4

Diamond Member
Feb 4, 2013
3,333
113
106
*Giggles*

An alive city? Like AC4? NPCs that do 3 or 4 pre-programmed actions with 3 or 4 pre-programmed snippets along 3 or 4 pre-programmed pathways? The pre-programming is so breathtaking that whenever you move, the FPS tanks into the 20s!

Next gen is already obsolete. The lack of CPU grunt is its Achilles' heel, combined with laughable porting efforts.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Aside from the obvious effects on PC porting, I think this coming gen will be fine for both console and PC gamers. I'm sure AMD, Sony and MS did plenty of research into what kind of hardware they could reasonably afford given price, die area, thermals, memory config, etc. The biggest hardware mistake by either company was MS's eSRAM solution, IMO.

Sure, a couple of Trinity modules would've made a good substitute, but you could fit the 8 Jaguar cores into the same die area and likely extract the same SIMD performance, with more overall IPC, at lower power and better thermals.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
130
106
Aside from the obvious effects on PC porting, I think this coming gen will be fine for both console and PC gamers. I'm sure AMD, Sony and MS did plenty of research into what kind of hardware they could reasonably afford given price, die area, thermals, memory config, etc. The biggest hardware mistake by either company was MS's eSRAM solution, IMO.

Sure, a couple of Trinity modules would've made a good substitute, but you could fit the 8 Jaguar cores into the same die area and likely extract the same SIMD performance, with more overall IPC, at lower power and better thermals.
But that's not really the case. Even if you double the performance numbers for the Kabini part, it's nowhere close overall.

http://anandtech.com/bench/product/1270?vs=1224

The performance difference is simply too big, especially in anything that doesn't scale almost perfectly.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Not much in most of the benchmarks as you can see:
http://anandtech.com/bench/product/1223?vs=1224
Interesting. While I think there are some obvious disadvantages to the console APUs, I am, however, optimistic about what they can achieve in the long term. I'm not a console shill (I swear!), but I certainly like to look at the technical side of things when it comes to these closed boxes. I'm sure I'll be impressed by future games, especially as GPGPU has a chance to become a true reality "thanks" in part to these consoles. In the end, all I see is another opportunity for the PC to get some good multiplatform titles that will clearly run better on PC while somewhat pushing the envelope of what to expect. The architecture commonality should also help the optimization process. :whistle:
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
130
106
Interesting. While I think there are some obvious disadvantages to the console APUs, I am, however, optimistic about what they can achieve in the long term. I'm not a console shill (I swear!), but I certainly like to look at the technical side of things when it comes to these closed boxes. I'm sure I'll be impressed by future games, especially as GPGPU has a chance to become a true reality "thanks" in part to these consoles. In the end, all I see is another opportunity for the PC to get some good multiplatform titles that will clearly run better on PC while somewhat pushing the envelope of what to expect. The architecture commonality should also help the optimization process. :whistle:
I think that hope died long ago, after the first console releases. If anything it's only gotten worse, a lot worse.
 

BD2003

Lifer
Oct 9, 1999
16,815
0
76
I think that hope died long ago, after the first console releases. If anything it's only gotten worse, a lot worse.

Has it really? Aside from a few devs who obviously don't give a damn about the PC, like Ubisoft, it seems like the only thing that's changed is VRAM usage.
 

escrow4

Diamond Member
Feb 4, 2013
3,333
113
106
Interesting. While I think there are some obvious disadvantages to the console APUs, I am, however, optimistic about what they can achieve in the long term. I'm not a console shill (I swear!), but I certainly like to look at the technical side of things when it comes to these closed boxes. I'm sure I'll be impressed by future games, especially as GPGPU has a chance to become a true reality "thanks" in part to these consoles. In the end, all I see is another opportunity for the PC to get some good multiplatform titles that will clearly run better on PC while somewhat pushing the envelope of what to expect. The architecture commonality should also help the optimization process. :whistle:
I'll give you the future bit. Shadow of Mordor is a great AAA port, even though gameplay-wise it's more or less old gen. Fantastic models, though. The Evil Within, Watch Dogs, and Dead Rising 3 are fat failures in comparison.
 

BD2003

Lifer
Oct 9, 1999
16,815
0
76
I'll give you the future bit. Shadow of Mordor is a great AAA port, even though gameplay-wise it's more or less old gen. Fantastic models, though. The Evil Within, Watch Dogs, and Dead Rising 3 are fat failures in comparison.

Those all run like garbage on the consoles too though.
 
Aug 11, 2008
10,451
641
126
Has it really? Aside from a few devs who obviously don't give a damn about the PC, like Ubisoft, it seems like the only thing that's changed is VRAM usage.
And requirements for system RAM, like the 6 GB for the new Call of Duty games, plus the requirements for hyperthreaded quad-core CPUs and 50+ GB installs.

All this without a major leap in gameplay or graphics, IMO. It actually seems like the similarity between the consoles and the PC, instead of producing much better ports, simply allows even lazier porting than we saw before.
 

BD2003

Lifer
Oct 9, 1999
16,815
0
76
And requirements for system RAM, like the 6 GB for the new Call of Duty games, plus the requirements for hyperthreaded quad-core CPUs and 50+ GB installs.

All this without a major leap in gameplay or graphics, IMO. It actually seems like the similarity between the consoles and the PC, instead of producing much better ports, simply allows even lazier porting than we saw before.

I dunno, seems to me like the latest batch of games looks fantastic. Ghosts looked really damn good and performed just fine with an i5 and a 660 as long as you turned off that ridiculously unoptimized depth-of-field shader. So it uses 6GB of system RAM, whoop-dee-doo; who doesn't have at least 8GB? I can understand the VRAM requirements causing pain, because most people don't have that much and it's tied to your video card, but HDDs and system RAM are dirt cheap.
 

escrow4

Diamond Member
Feb 4, 2013
3,333
113
106
I dunno, seems to me like the latest batch of games looks fantastic. Ghosts looked really damn good and performed just fine with an i5 and a 660 as long as you turned off that ridiculously unoptimized depth-of-field shader. So it uses 6GB of system RAM, whoop-dee-doo; who doesn't have at least 8GB? I can understand the VRAM requirements causing pain, because most people don't have that much and it's tied to your video card, but HDDs and system RAM are dirt cheap.
Ghosts looked OK for a rejigged engine from 5 years back. Last I played it, I seriously doubt it used 6GB of RAM; more like 3GB all up, and Windows 8 took around 2GB of that in single-player. Just because HDDs are dirt cheap (RAM isn't, depending on where you live) doesn't mean you should get lazy and let cheap hardware stand in for optimization.
 

BD2003

Lifer
Oct 9, 1999
16,815
0
76
Ghosts looked OK for a rejigged engine from 5 years back. Last I played it, I seriously doubt it used 6GB of RAM; more like 3GB all up, and Windows 8 took around 2GB of that in single-player. Just because HDDs are dirt cheap (RAM isn't, depending on where you live) doesn't mean you should get lazy and let cheap hardware stand in for optimization.

That's the normal course of things, though. Better hardware means you can spend more time making the game itself instead of optimizing for ancient hardware configs. If the game needed 10GB of RAM you'd have a point, but 8GB has been standard even for low-end gaming rigs for years, and a 1TB HDD costs less than the game itself. I get the VRAM thing, but this is just a silly complaint.
 