The main reasons for inflated VRAM requirements


Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Here is a little follow-up on the greatness of the console cores:
http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus

The CPU in the consoles is simply a disaster.

It also explains all the uncompressed audio and textures. The CPU in the consoles simply can't handle it otherwise.
Reads to me like what I'd expect from Ubisoft: incompetence and/or deception, and stated goals they failed to meet. The CPU performance was well known going in. They were never going to have PC-level CPU performance, and that was anything but a mystery.

And if it's all CPU, why the sub-1080p resolution?

I think the disaster is inside of Ubisoft.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Wasn't the Star Swarm demo running on an 8-core FX downclocked to 2 GHz? Which is about the same as Jaguar? If I recall correctly, that demo had more objects, AI, etc. going on than every Assassin's Creed game combined.

Star Swarm had joke AI running the squadrons. Star Swarm exists in a static universe where nothing has to be loaded from the HDD or saved once the demo starts. Star Swarm consists solely of the ships present. Star Swarm is inherently less complex; things such as clipping, audio, explosions, and other physics are completely glossed over.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Do you know what is responsible for compressing/uncompressing textures?
Oh, I do!

Textures: Ships compressed. Decompressed on the fly by the texture units in the GPU.

Audio: Ships compressed. Decompressed on the fly by the Tensilica audio DSP.

At no point is the CPU involved in either of these. Decompression is the job of fixed-function hardware for performance and power reasons.
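On the PC side this is how block-compressed (DXT/BCn) textures work anyway: the compressed blocks go straight to VRAM and the texture units decode them per sample. A minimal sketch in desktop OpenGL, assuming GLEW and an existing GL context; the DXT5 data and its file parsing are placeholders, not from any particular game:

Code:
// Illustrative only: upload a block-compressed texture so the GPU's texture
// units decode it on the fly during sampling. Assumes a current GL context.
#include <GL/glew.h>
#include <vector>
#include <cstdint>

GLuint uploadCompressedTexture(const std::vector<uint8_t>& dxt5Blocks,
                               int width, int height)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // The data stays in its compressed (BC3/DXT5) form in VRAM; the texture
    // units decompress individual 4x4 blocks as they are sampled, so the CPU
    // never touches the pixels.
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                           width, height, 0,
                           static_cast<GLsizei>(dxt5Blocks.size()),
                           dxt5Blocks.data());

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}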
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
Oh, I do!

Textures: Ships compressed. Decompressed on the fly by the texture units in the GPU.

Audio: Ships compressed. Decompressed on the fly by the Tensilica audio DSP.

At no point is the CPU involved in either of these. Decompression is the job of fixed-function hardware for performance and power reasons.


Yup. That's why they can get away with shitty CPUs - because they have a bazillion little coprocessors that handle what PCs need a CPU for.

Both consoles have DSPs for handling just about everything related to audio. Video encoding and decoding are both fixed function as well. Even basic decompression and decryption of data is handled by coprocessors. Physics? That's getting pushed off to the GPU. Low-level APIs reduce the CPU burden as well, and a unified memory pool means the CPU doesn't need to get tied up in PCIe memory transfers either.

The CPUs they tossed in there are absolutely the shittiest little cores they possibly could have used, but if a game is well optimized for the platforms there's practically nothing for them to do but run basic game code and AI.
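To make the PCIe point a bit more concrete with a PC-side analogy: the closest thing desktop OpenGL offers is a persistently mapped buffer, which removes the explicit per-frame upload call, even though a discrete card still pulls the data across the bus via DMA. On a true unified-memory console there is no separate transfer step at all. A rough sketch, assuming desktop OpenGL 4.4+ with GLEW and a current context; names are illustrative and fencing/synchronization is omitted:

Code:
// Hypothetical sketch: immutable, persistently mapped buffer storage the CPU
// writes into directly each frame instead of handing the driver a copy.
#include <GL/glew.h>

GLuint createStreamingBuffer(void** outCpuPointer)
{
    const GLsizeiptr size  = 4 * 1024 * 1024;   // 4 MB scratch buffer
    const GLbitfield flags = GL_MAP_WRITE_BIT |
                             GL_MAP_PERSISTENT_BIT |
                             GL_MAP_COHERENT_BIT;

    GLuint buf = 0;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);

    // Storage that stays mapped for the buffer's lifetime.
    glBufferStorage(GL_ARRAY_BUFFER, size, nullptr, flags);

    // The CPU keeps this pointer and writes vertex/constant data into it;
    // no glBufferSubData-style upload call is needed afterwards.
    *outCpuPointer = glMapBufferRange(GL_ARRAY_BUFFER, 0, size, flags);
    return buf;
}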
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The way they then port that is to take the uncompressed assets and stick them on the PC, rather than recompressing them for the DX API? Is this really what is happening?
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
The way they then port that is to take the uncompressed assets and stick them on the PC, rather than recompressing them for the DX API? Is this really what is happening?


I haven't seen any reliable indication they're doing that with textures, but that's totally what Titanfall did with the audio. There's no standard dedicated audio DSP on the PC, so they said screw it, let's just decompress the sound to disk, soak up 20 extra gigabytes of space and a good chunk of RAM, and that's one less performance issue to deal with. Just like the VRAM scenario, this isn't right or wrong, it's sacrificing one resource to spare another. Yes, it's a massive jump in disk space, but it absolutely does ease the burden on the CPU. It helps the guy with the big HDD and the shitty processor, and hurts the guy with the high-performance CPU and small SSD.

I think this is far less likely with textures, since texture compression is a standard part of any modern GPU. It's just a theory floating around and I've seen zero evidence that it's the case. But if it were... there's probably a performance-based reason why they'd do it. The idea that game devs are so stupid that they don't know how to use texture compression is preposterous. Honestly, that's the one thing that drives me crazy about this: random dudes on the internet second-guessing the intelligence, technical knowledge and decisions of professional game devs who do this day in and day out for a living... the audacity and ego that takes is absolutely beyond me. I have to deal with similar BS at work when my competence is challenged by people who mistake their vague understanding of something for expertise, when I've literally been doing this day in and day out for years.
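For a sense of scale on that audio trade-off, here is a back-of-the-envelope sketch; the stream format, hours of audio and compression ratio are round illustrative guesses, not Titanfall's actual asset numbers:

Code:
// Back-of-the-envelope comparison of shipping audio as raw PCM vs. compressed.
// All figures (16-bit stereo, 44.1 kHz, ~10:1 lossy ratio, 30 hours of audio
// across all languages/streams) are illustrative, not from any specific game.
#include <cstdio>

int main()
{
    const double sampleRate       = 44100.0;  // samples per second
    const double bytesPerSample   = 2.0;      // 16-bit
    const int    channels         = 2;        // stereo
    const double hoursOfAudio     = 30.0;     // music + dialogue + effects
    const double compressionRatio = 10.0;     // typical lossy codec

    double pcmBytesPerSec    = sampleRate * bytesPerSample * channels; // ~176 KB/s
    double totalPcmGB        = pcmBytesPerSec * hoursOfAudio * 3600.0 / 1e9;
    double totalCompressedGB = totalPcmGB / compressionRatio;

    std::printf("Raw PCM on disk:    %.1f GB\n", totalPcmGB);        // ~19 GB
    std::printf("Compressed on disk: %.1f GB\n", totalCompressedGB); // ~1.9 GB
    return 0;
}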
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
What I want to know is, why can't they hit 1080p on the PS4 if they aren't GPU limited?

Being CPU limited only affects you at lower resolutions, not at higher ones.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
What I want to know is, why can't they hit 1080p on the PS4 if they aren't GPU limited?

Being CPU limited only affects you at lower resolutions, not at higher ones.


They didn't say they were *never* GPU bound. You're presuming that they're rendering the exact same image and/or that the Xbox isn't dropping frames. We've seen this type of thing with a lot of games already, where they run at the same res but the Xbox version performs worse.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
We will know for sure when Digital Foundry does its performance comparison between the Xbone, which has a 1.75 GHz CPU clock, and the PS4, which has a 1.6 GHz CPU clock. The extra ~10% should give the Xbox quite a bit more performance, if not... ;)

That would be the first game that runs better on the Xbone than on the PS4.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
We will know for sure when Digital Foundry does its performance comparison between the Xbone, which has a 1.75 GHz CPU clock, and the PS4, which has a 1.6 GHz CPU clock. The extra ~10% should give the Xbox quite a bit more performance, if not... ;)

That would be the first game that runs better on the Xbone than on the PS4.


I doubt it. They'd have to be cutting it really, really close for that to matter.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
They didn't say they were *never* GPU bound. You're presuming that they're rendering the exact same image and/or that the Xbox isn't dropping frames. We've seen this type of thing with a lot of games already, where they run at the same res but the Xbox version performs worse.

While what you're saying is true, they insinuated that they weren't GPU bottlenecked when they said they could achieve 100 FPS if graphics were all they had to worry about.

So I'm guessing that they aren't draw-call limited if that's the case, and increasing the output resolution has no effect on the number of draw calls required anyway, since the frame rate is locked; the extra burden falls squarely on the GPU.
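A toy cost model of that split (the constants are made up purely for illustration, not measured from any game): CPU submission time tracks the number of draw calls, GPU shading time tracks the number of pixels, so raising the resolution moves only the GPU column.

Code:
// Toy frame-cost model: CPU submission work scales with draw calls (objects),
// GPU fill/shading work scales with pixels. All constants are invented.
#include <cstdio>

int main()
{
    const int    drawCalls        = 20000; // scene objects to submit
    const double usPerDrawCall    = 2.0;   // CPU/driver cost per call
    const double nsPerShadedPixel = 4.0;   // GPU cost per pixel

    const int resolutions[][2] = { {1600, 900}, {1920, 1080} };

    for (const auto& res : resolutions) {
        double cpuMs = drawCalls * usPerDrawCall / 1000.0;       // unchanged by res
        double gpuMs = res[0] * res[1] * nsPerShadedPixel / 1e6; // grows with res
        std::printf("%dx%d: CPU submit ~%.1f ms, GPU shading ~%.1f ms\n",
                    res[0], res[1], cpuMs, gpuMs);
    }
    return 0;
}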
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
While what you're saying is true, they insinuated that they weren't GPU bottlenecked when they said they could achieve 100 FPS if graphics were all they had to worry about.

So I'm guessing that they aren't draw-call limited if that's the case, and increasing the output resolution has no effect on the number of draw calls required anyway, since the frame rate is locked; the extra burden falls squarely on the GPU.


But obviously they have more than just graphics to worry about... quite a bit more if they have to drop it all the way down to 30. So it's a pointless statement. Sure, it's possible that they're arbitrarily knocking the PS4 version down to 900p just to appease Microsoft, but that's a little too far out there for me to accept as the most likely reason. I think it's far more likely that 900p was the resolution at which they could achieve a solid lock at 30fps on the PS4, and that lock will be a little less solid on the Xbox One. Maybe there will even be times when the Xbox One version comes out slightly ahead, but the differences in their GPUs are far greater than the differences in their CPUs, so I expect the GPU-bound frame drops to be more prevalent.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Guys, I think you're taking the quote out of context. I believe he meant that the CPU would be capable of 100 fps if it didn't have AI (etc.) to handle, NOT that the console's GPU could actually output 100 fps.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Here is a little follow-up on the greatness of the console cores:
http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus

The CPU in the consoles is simply a disaster.

It also explains all the uncompressed audio and textures. The CPU in the consoles simply can't handle it otherwise.

I wouldn't call it a disaster, considering the APUs were given the GPU grunt (well, at least the PS4 was) to handle graphics plus compute. What was needed coming off the last generation was an increase in graphics throughput, for both better graphics and compute, plus more general (non-SIMD) processing capability, which Sony and MS got with eight Jaguar cores. Yes, they are clocked low, but they would be in the same situation shoehorning their AI into a high-speed AMD dual-module part or even an i3. Anything more substantial would've cost extra die area (or graphics die area) and more TDP to dissipate, meaning more cooling hardware and, in the end, just more money to manufacture.

What I see here is either Ubisoft already pushing the limits of the new systems, or not getting creative enough to offload some of it to GPU compute (if that's even possible), or perhaps just being over-ambitious. These consoles have limitations. I'm sure the crowd AI will be more impressive in the next AssCreed game.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
And Windows in the background uses what, 0.1% CPU time?

It's not as simple as the CPU time you see being used when you pull up Task Manager at idle in Windows.

The amount of overhead a Windows PC has compared to a console is absolutely gigantic; easily in the 30-50% range or more as a performance penalty versus the direct-to-the-metal, low-level coding that consoles enjoy. Developers have significantly easier and more straightforward control of console hardware than they do of PCs. A simple way to understand it, if you're not someone who develops for both console and PC, is to build a Windows PC with console-like specs and see how well it runs games.

These new consoles have somewhere in the area of eight to ten times the CPU and GPU power of the previous gen consoles. It is natural to expect system requirements of console games ported to PC to increase compared to last gen.

People who say "this PC game requires X processor or Y amount of ram but none of the consoles are that powerful so it is bs" have a total lack of understanding of how all of this works.

A final note: most gamers have yet to grasp that we have reached a hard point of diminishing returns when it comes to visuals and the hardware needed to run them. A decade ago, a graphics card twice as powerful as another could run the same game at settings that looked worlds apart in visual fidelity. Right now there is hardly any difference at all in visuals when running a game on an R9 290X or a Titan Z. Pixels are already too small to benefit much from being slightly smaller, polygon counts are so high that everything that needs to be round is already round so more doesn't help, etc.

Example: http://cdn.wccftech.com/wp-content/uploads/2013/10/2486940-0248877224-ChsSw.png It is going to take ever-increasing graphics performance, on an exponential scale, to get meaningful improvements in visual fidelity going forward.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Example: http://cdn.wccftech.com/wp-content/uploads/2013/10/2486940-0248877224-ChsSw.png It is going to take ever-increasing graphics performance, on an exponential scale, to get meaningful improvements in visual fidelity going forward.

I somewhat agree with that, though resolution, AF, and AA all play a part in that issue too. Luckily we are past the days when we really had to analyse a polygonal object with crappy textures just to deduce what we were looking at D:

While there have been necessary increases in animation and physics, increasing graphics processing power since 2005 really hasn't been necessary from a gameplay standpoint, except in a few extreme cases where long-range draw distance plus decent graphics at ground level are an issue, like Battlefield or GTA.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
I somewhat agree with that, though resolution, AF, and AA all play a part in that issue too. Luckily we are past the days when we really had to analyse a polygonal object with crappy textures just to deduce what we were looking at D:

While there have been necessary increases in animation and physics, increasing graphics processing power since 2005 really hasn't been necessary from a gameplay standpoint, except in a few extreme cases where long-range draw distance plus decent graphics at ground level are an issue, like Battlefield or GTA.


VR is gonna change that in a major way. Not even a 980 is sufficient to drive a decent looking game at the resolution and frame rate it requires.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
VR is gonna change that in a major way. Not even a 980 is sufficient to drive a decent looking game at the resolution and frame rate it requires.

:hmm: Good point. But even VR doesn't require teraflops of compute power. Even early experiments in goggle VR wowed people long ago, just like using the Oculus with an older flight game can wow people today. What made VR difficult was getting the actual hardware and software to work in concert with each other while being cheap enough for people to buy, thanks to the scale of supply.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The amount of overhead a Windows PC has compared to a console is absolutely gigantic; easily in the 30-50% range or more as a performance penalty versus the direct-to-the-metal, low-level coding that consoles enjoy. Developers have significantly easier and more straightforward control of console hardware than they do of PCs. A simple way to understand it, if you're not someone who develops for both console and PC, is to build a Windows PC with console-like specs and see how well it runs games.

The overhead on the PC platform has been slowly diminishing over time with every Direct3D/OpenGL update, though. And now we have Mantle, which, if I had to estimate, brought the overhead down to about 10%.

DX12 will continue this trend and seal the deal. The days of high operating overhead on the PC are numbered.
 
Feb 19, 2009
10,457
10
76
There aren't even any good examples of those super high res textures providing superior visuals. They're just easy to offer because that's how the textures were made.

Umm, where have you been? The PC gaming scene has always been about mods, and in particular high-res mods. It's great that these game devs now ship ultra textures as an optional download at release!
 

kasakka

Senior member
Mar 16, 2013
334
1
81
Umm, where have you been? The PC gaming scene has always been about mods, and in particular high-res mods. It's great that these game devs now ship ultra textures as an optional download at release!

Yes, but now there are fewer cases where those ultra-high-res textures provide any significant improvement. I understand it for something like Skyrim, where the developer's high-res textures were not all that high-res, but in the case of Shadow of Mordor, for example, there is not much real difference in visuals.

While games on PC and consoles are now starting to look very close to each other, the PC still has the capacity for much higher resolutions, longer draw distances and superior lighting while still running at a higher framerate. Of course this requires a lot more powerful hardware and effort from developers to make good use of it. Instead of cramming in absurdly high-res textures that make little difference in the game, how about providing more realistic lighting and shadows for the PC?
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
:hmm: Good point. But even VR doesn't require teraflops of compute power. Even early experiments in goggle VR wowed people long ago, just like using the Oculus with an older flight game can wow people today. What made VR difficult was getting the actual hardware and software to work in concert with each other while being cheap enough for people to buy, thanks to the scale of supply.


But people are going to reject VR if the graphics have to look like an N64 game to not give you a headache or make you sick. Doing VR right means you need a GPU and CPU capable of putting out at least 960x1440 at 150fps (75fps per eye). I think people are gonna be surprised at how quickly you become CPU bound when you're trying to push 150fps at any resolution. Right now I'm playing Alien: Isolation at 1080p and I'm telling you it looks like you're playing at 320x240... it's blurry and the pixels are HUGE. Your eyes are like 2 inches from the screen; you need an unbelievable amount of resolution and pixel density for it to look good under those conditions. And despite that, I have to turn settings way down to maintain that 150fps, and it still looks like a mess.

I remember using VR back in the 90s, and it was cool because I'd never seen anything like it, but I don't think I could have tolerated it for longer than the 5 minutes they let us play. If you're buying it for personal use at home you're going to expect to use it for hours at a time, and that's going to require a serious rig to back the experience up. If it's even slightly juddery or laggy for more than a few seconds your head will start to spin and it gets super uncomfortable. So it raises the standards to a huge degree if you want to play modern games with it.
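Taking the figures in that post at face value (960x1440 at an effective 150fps), the raw pixel throughput alone works out well above a typical 1080p target, before even counting the latency headroom VR needs; a quick sketch using only those quoted numbers, not any particular headset's real render resolution:

Code:
// Rough pixel-throughput comparison; the VR figure is the one quoted above.
#include <cstdio>

int main()
{
    auto pixelsPerSecond = [](long w, long h, long fps) {
        return static_cast<double>(w) * h * fps;
    };

    double vr      = pixelsPerSecond(960, 1440, 150);  // figure from the post
    double pc1080  = pixelsPerSecond(1920, 1080, 60);  // typical PC target
    double console = pixelsPerSecond(1920, 1080, 30);  // typical console target

    std::printf("VR target: %.0f Mpixels/s\n", vr / 1e6);
    std::printf("1080p60:   %.0f Mpixels/s (VR is %.1fx)\n",
                pc1080 / 1e6, vr / pc1080);
    std::printf("1080p30:   %.0f Mpixels/s (VR is %.1fx)\n",
                console / 1e6, vr / console);
    return 0;
}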
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
But people are going to reject VR if the graphics have to look like an N64 game to not give you a headache or make you sick. Doing VR right means you need a GPU and CPU capable of putting out at least 960x1440 at 150fps (75fps per eye). I think people are gonna be surprised at how quickly you become CPU bound when you're trying to push 150fps at any resolution. Right now I'm playing Alien: Isolation at 1080p and I'm telling you it looks like you're playing at 320x240... it's blurry and the pixels are HUGE. Your eyes are like 2 inches from the screen; you need an unbelievable amount of resolution and pixel density for it to look good under those conditions. And despite that, I have to turn settings way down to maintain that 150fps, and it still looks like a mess.

I remember using VR back in the 90s, and it was cool because I'd never seen anything like it, but I don't think I could have tolerated it for longer than the 5 minutes they let us play. If you're buying it for personal use at home you're going to expect to use it for hours at a time, and that's going to require a serious rig to back the experience up. If it's even slightly juddery or laggy for more than a few seconds your head will start to spin and it gets super uncomfortable. So it raises the standards to a huge degree if you want to play modern games with it.


I haven't tried VR so I can't really say much. However, I believe optics are required to create a virtual image that is focused farther away than the 2 inches where the screen actually sits. Our eyes cannot accommodate an object that close. Being near-sighted, I have increased accommodation for close objects, without corrective lenses that is.

The size and position of the virtual image have to be correct for a 1080p image, which most people will say means no closer than approximately twice the height of the screen. With 4K you can be much closer, and hence have a much wider field of view. I'm not afraid that 1080p will be too little for a first-generation product, but I might be wrong.

No doubt 150fps is going to be extremely hard to accomplish, and even though Nvidia claims this new VR SLI tech, they haven't shown it to work properly without microstutter. I have almost no hope these people can get two GPUs to sync well enough for VR.

I'm seeing a lot of evidence suggesting the days of games not using more than 4 cores properly are over. Some recent benchmarks are showing 6- and 8-core CPUs coming out ahead for once. So it's going to take some high-end hardware and some great coding to pull this off. I'd say it's still about 2 years away, though. We can't even drive one 4K display properly today.