But the Xbox 360 has 512MB of shared memory (plus 10MB of eDRAM), and the PS3 has 256MB of system RAM plus 256MB of VRAM. You probably wouldn't want to play ports from those old consoles on a PC with a total of 512MB split between system RAM and VRAM. Likewise, the new consoles have roughly 5GB available to games out of 8GB total. So I wouldn't be surprised if, just as with the old consoles, you need a fair amount more RAM and VRAM in your PC than is packed inside the new consoles.
That's a contradictory statement about requirements.
It sounds like console developers developing for the PC for the first time. 4GB of VRAM and an i7?
It's usually also apples and oranges. Features, IQ, and functionality change between versions.
6GB VRAM needed for ultra textures in Shadow of Mordor
Lol, I guess that makes the GTX 970/980 the next paperweight card. Good thing I automatically assume game developers have the best technical talent and know what they're doing!
"How much VRAM can the consoles even dedicate for textures? 2GB, 3GB? They have puny CPUs/GPUs so I doubt they can push through more than 2GB reserved for textures, 3GB at most for Killzone but that was a showcase tech demo. I love how people always go into panic mode due to a couple of shitty optimized games with outdated engines."

You don't need horsepower to push higher-res textures. They use virtually no more power than low-res textures and only need video RAM.
Same with Watch Dogs. Turned out it runs just fine with 2GB of VRAM after a couple of patches and reasonable settings.
You don't need horsepower to push higher-res textures. They use virtually no more power than low-res textures and only need video RAM.
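For rough numbers on the VRAM side of that claim, here's a quick sketch (it assumes uncompressed RGBA8 textures with full mip chains; real games use block compression such as BC1/BC7, which cuts these figures several-fold, but the scaling is the same):

# Rough VRAM footprint of a single texture, including its mip chain.
def texture_bytes(width, height, bytes_per_texel=4, mipmapped=True):
    total = width * height * bytes_per_texel
    return total * 4 // 3 if mipmapped else total  # full mip chain adds ~1/3

for size in (1024, 2048, 4096):
    print(f"{size}x{size} RGBA8 + mips: ~{texture_bytes(size, size) / 2**20:.0f} MB")

Quadrupling the texture area quadruples the memory, but with mipmapping the number of texels actually sampled per frame is set by the screen resolution, not by the size of the source texture.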
I have a tough time believing that those sorts of requirements make sense. The consoles are really weak compared to PCs, but now we need 6GB of VRAM on the PC for ultra textures at 1080p in console ports?! :awe:
"How much VRAM can the consoles even dedicate for textures? 2GB, 3GB?"

As much as they see fit, within the 5-6GB the game can use.
"They have puny CPUs/GPUs so I doubt they can push through more than 2GB reserved for textures, 3GB at most for Killzone but that was a showcase tech demo. I love how people always go into panic mode due to a couple of shitty optimized games with outdated engines."

Video cards equivalent to what they have can readily use 2+GB of textures. They aren't high-end, but the GPUs are hardly "puny."
Not sure why you are shocked.
We saw this coming with the next-gen consoles having ~6GB of available VRAM, which is a huge leap from the older consoles. Of course any AAA dev worth his salt would take advantage of that and raise the bar in IQ.
Take an old game like Skyrim: it looks ugly by current standards, but add 4K texture mods and shader/lighting/shadow mods and it looks great. You can achieve a lot if you're given the resources to do so. In fact, it's terrible that AAA games don't do this themselves by shipping a proper "Ultra" setting.
The new consoles don't have 6GB of VRAM. They don't have VRAM at all; they have unified system RAM, an inordinate amount of which is reserved for the OS. The PS4 uses 3.5GB of RAM for the OS, though that is likely to change, and the Xbox One uses 3GB for its OS(es). So if you call it 4GB for system memory, that leaves a puny 1GB or less acting as VRAM. Even generously assuming a 33% reduction in total RAM usage from not duplicating assets between system RAM and VRAM, you still only get 2-3GB at the very best for VRAM (rough arithmetic sketched below). In real terms, the GTX 660/670 and 7850 are already basically equivalent in specs AND in RAM.

The real issue is how ridiculously hungry the background OSes are this time around. They might as well just run Vista at that rate...
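Putting that argument into numbers (all figures are the claims above plus an assumed CPU-side split, not official specs):

TOTAL_GB = 8.0
OS_RESERVE_GB = 3.5                     # claimed PS4 reservation; Xbox One ~3.0
game_budget = TOTAL_GB - OS_RESERVE_GB  # ~4.5GB left for the game
for cpu_side in (2.0, 3.0, 4.0):        # assumed "system RAM"-like working set
    gpu_side = game_budget - cpu_side
    # best case: unified memory avoids duplicating assets, saving ~1/3 of the CPU side
    gpu_side_best = game_budget - cpu_side * (2 / 3)
    print(f"CPU side {cpu_side:.1f}GB -> VRAM-like share {gpu_side:.1f} to {gpu_side_best:.1f}GB")

Which is roughly how you end up in the 2-3GB ballpark for "VRAM", not the 6GB that keeps getting quoted.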
http://www.ibtimes.com/ps4-vs-xbox-...-xbox-according-report-does-it-matter-1361395
In a statement issued to Eurogamer's Digital Foundry, Sony addresses a key technical matter from the original story.
We would like to clear up a misunderstanding regarding our "direct" and "flexible" memory systems. The article states that "flexible" memory is borrowed from the OS, and must be returned when requested - that's not actually the case. The actual true distinction is that:
"Direct Memory" is memory allocated under the traditional video game model, so the game controls all aspects of its allocation
"Flexible Memory" is memory managed by the PS4 OS on the game's behalf, and allows games to use some very nice FreeBSD virtual memory functionality. However this memory is 100 per cent the game's memory, and is never used by the OS, and as it is the game's memory it should be easy for every developer to use it.
We have no comment to make on the amount of memory reserved by the system or what it is used for.
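The actual PS4 SDK calls aren't public, but the "flexible memory" idea (the OS managing virtual memory on the game's behalf) is ordinary virtual-memory behaviour on any modern OS. A generic stand-in, nothing PS4-specific:

import mmap
# Generic illustration only, not the PS4 API: an anonymous mapping reserves
# address space up front; physical pages are typically committed lazily as
# they are first touched, which is what lets the OS manage it for you.
FLEXIBLE_BYTES = 64 * 1024 * 1024
flex = mmap.mmap(-1, FLEXIBLE_BYTES)  # reserve 64MB of anonymous memory
flex[0:4] = b"demo"                   # touching a page commits it
print(len(flex), flex[0:4])
flex.close()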
"This is untrue. :|"

It's true if one uses mipmapping and there is no magnification.
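For anyone wondering what mipmapping has to do with it: the hardware picks a mip level from how many texels a screen pixel covers, so a higher-resolution base texture mostly just shifts which mip gets sampled. A simplified sketch of that LOD pick (ignores anisotropy and per-axis gradients):

import math
def mip_level(texture_size, texels_per_pixel):
    # log2 of the screen-space texel footprint, clamped to the mip chain
    lod = max(0.0, math.log2(texels_per_pixel))
    return min(lod, math.log2(texture_size))

# Same surface covering ~512 pixels across, textured at 1K vs 4K:
for size in (1024, 4096):
    print(size, "-> mip", round(mip_level(size, size / 512), 1))

The 4K texture just lands two mip levels lower (both end up sampling a ~512-texel level), so the per-pixel sampling cost stays about the same as long as nothing is magnified; the extra resolution only pays off when you get close enough to need the top mips, and that's where the VRAM goes.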
"This is untrue. :|"

Please show me one game where using higher-res textures has any meaningful impact on framerates if you have enough VRAM. I have never seen more than a 1 FPS difference in any game from choosing very high textures over low.
"I don't see why I need to provide any examples. GPUs aren't magical, they can't process arbitrarily large textures at a fixed speed. At some point you will hit fillrate or bandwidth limits."

So you have no real-world proof then, just like I thought. I, on the other hand, have actually compared low-res to very-high-res textures and saw basically no performance difference.
"I don't see why I need to provide any examples. GPUs aren't magical, they can't process arbitrarily large textures at a fixed speed. At some point you will hit fillrate or bandwidth limits."

At some point, sure. The question is: where is that point?
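A rough way to locate that point: with mipmapping, the texels fetched per frame scale with screen resolution and overdraw, not with source texture size, so the bandwidth cost is bounded. Ballpark estimate (every input here is an assumption for illustration):

pixels = 1920 * 1080
fps = 60
overdraw = 3          # average shaded layers per pixel
taps = 8              # trilinear filtering: 8 texel fetches per sample
layers = 4            # albedo/normal/roughness/etc. per material
bytes_per_texel = 4   # uncompressed; block compression cuts this ~4-8x
gb_per_s = pixels * overdraw * taps * layers * bytes_per_texel * fps / 1e9
print(f"~{gb_per_s:.0f} GB/s of raw texture traffic before caches/compression")

That lands around ~48GB/s before texture caches and compression, against the 150-250GB/s of memory bandwidth on a mid-range card, which is why texture resolution barely moves the framerate until you genuinely run out of VRAM and start streaming over PCIe.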
The graph shows results that are quite normal for texture resolution-related settings: there is no difference in FPS at each level of the setting. The key issue with Texture Resolution is the amount of Video RAM on your GPU that is consumed to hold textures. If you have a graphics card with a lower amount of VRAM, then you may experience periodic stuttering, or visible texture streaming, if Texture Resolution is set too high. Texture Resolution of Medium is generally a good balance of image quality and smooth performance.
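To make the "amount of VRAM consumed" point concrete, here is a rough count of how many unique material textures fit in a given budget at each setting (assumes roughly 1 byte per texel after block compression, plus mips; real engines stream and reuse far more cleverly than this):

def texture_mb(size):
    return size * size * 1 * 4 / 3 / 2**20   # ~1 byte/texel compressed, plus mips

budgets_mb = {"2GB card": 1536, "4GB card": 3584}  # total minus framebuffers etc.
settings = {"Medium": 1024, "High": 2048, "Ultra": 4096}
for card, budget in budgets_mb.items():
    print(card, {name: int(budget // texture_mb(size)) for name, size in settings.items()})

A 2GB card that holds the High set comfortably simply can't keep the Ultra set resident, so the engine falls back to streaming, and that is where the stutter comes from.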