96Firebird (Diamond Member, joined Nov 8, 2010)
Quote:
Do you know what is responsible for compressing/uncompressing textures?

In the PS4? The APU.
Quote:
Here is a little follow up on the greatness of the console cores:
http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus
The CPU in the consoles is simply a disaster.
It also explains all the uncompressed audio and textures; the CPU in the consoles simply can't handle it otherwise.

Reads to me like what I'd expect from Ubisoft: incompetence and/or deception, and failed stated goals. The CPU performance was well known going in. They weren't going to have PC-level CPU performance from the start, and that was anything but a mystery.
Quote:
Wasn't the Star Swarm demo running on an 8-core FX downclocked to 2 GHz? Which is about the same as Jaguar? If I recall correctly, this demo had more objects, AI, etc. going on than every Assassin's Creed combined.

Star Swarm had joke AI running for its squadrons. Star Swarm exists in a static universe where nothing has to be loaded from the HDD or saved once the demo starts. Star Swarm consists solely of the ships present. Star Swarm is inherently less complex; things such as clipping, audio, explosions, and other physics are completely glossed over.
Quote:
Do you know what is responsible for compressing/uncompressing textures?

Oh, I do!
Textures: ship compressed; decompressed on the fly by the texture units in the GPU.
Audio: ships compressed; decompressed on the fly by the Tensilica audio DSP.
At no point is the CPU involved in either of these. Decompression is the job of fixed-function hardware, for performance and power reasons.
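To make the fixed-function point concrete, here is a minimal Direct3D 11 sketch (my own illustration, not something from the thread) of uploading a block-compressed texture. The data stays in BC1/DXT1 form in VRAM, and the GPU's texture units decode the 4x4 blocks on the fly when a shader samples it; the CPU never decompresses anything. The 'device' and 'bc1Blocks' names are hypothetical inputs supplied by the caller.

```cpp
#include <d3d11.h>

// Sketch: create an immutable 1024x1024 BC1 texture from pre-compressed blocks.
// The texture remains compressed in VRAM; decoding happens in the texture units.
ID3D11Texture2D* CreateBc1Texture(ID3D11Device* device, const void* bc1Blocks)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = 1024;
    desc.Height           = 1024;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_BC1_UNORM;   // 4x4 blocks, 8 bytes each
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem     = bc1Blocks;          // compressed blocks straight from disk
    init.SysMemPitch = (1024 / 4) * 8;     // bytes per row of BC1 blocks

    ID3D11Texture2D* tex = nullptr;
    if (FAILED(device->CreateTexture2D(&desc, &init, &tex)))
        return nullptr;
    return tex;
}
```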
And the way they then port that is to take the uncompressed assets and stick them on the PC, rather than recompressing them for the DirectX API? Is that really what is happening?
What I want to know is: why can't they hit 1080p on the PS4 if they aren't GPU limited? Being CPU limited only affects you at lower resolutions, not at higher ones.

We will know for sure when Digital Foundry do their performance comparison between the Xbone, which has a 1.75 GHz CPU clock, and the PS4, which has a 1.6 GHz CPU clock. The extra ~10% should give the Xbox quite a bit more performance; if not...

That would be the first game that runs better on the Xbone than on the PS4.
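As a toy model of that argument (all numbers invented for illustration): if per-frame CPU work is roughly resolution-independent while GPU work scales with pixel count, the frame time is set by whichever is slower, so a CPU-bound game gains essentially nothing by staying below 1080p.

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    const double cpuMs       = 33.0;            // hypothetical AI/submission cost per frame
    const double gpuMs900p   = 20.0;            // hypothetical GPU cost at 1600x900
    const double px900p      = 1600.0 * 900.0;
    const double px1080p     = 1920.0 * 1080.0;

    // GPU cost scales roughly with pixel count in this toy model.
    const double gpuMs1080p  = gpuMs900p * (px1080p / px900p);
    const double frame900p   = std::max(cpuMs, gpuMs900p);
    const double frame1080p  = std::max(cpuMs, gpuMs1080p);

    std::printf("900p:  %.1f ms/frame (%.0f fps)\n", frame900p, 1000.0 / frame900p);
    std::printf("1080p: %.1f ms/frame (%.0f fps)\n", frame1080p, 1000.0 / frame1080p);
    // Both land at ~33 ms: when the CPU is the bottleneck, raising the
    // resolution costs almost nothing, which is the point argued above.
}
```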
Quote:
They didn't say they were *never* GPU bound. You're presuming that they're rendering the exact same image and/or the Xbox isn't dropping frames. We've seen this type of thing with a lot of games already, where they run the same res but the Xbox version performs worse.

While it's true what you are saying, they insinuated that they weren't GPU bottlenecked when they said they could achieve 100 FPS if it was just graphics they had to worry about.

So I'm guessing that they aren't draw-call limited if that's the case, and increasing the output resolution has no effect on the number of draw calls required anyway, as the frame rate is locked and the burden is squarely on the GPU.

http://www.dsogaming.com/interviews/crytek-talks-ryse-tech-consoles-vs-pc-textures-resolution-mantle-vram-specs-lod-solution/
"Regarding VRAM, we adjust our texture pool based on the available graphics memory. While the maximum pool size is used with 3 GB of VRAM, 2 GB are enough to achieve great quality. Below that, we have to downscale some of the heavy textures a bit."
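The Crytek quote amounts to picking a texture-pool budget from the VRAM that is detected. Here is a hypothetical sketch of that sort of heuristic; the 3 GB / 2 GB break points echo the interview, while the budget values and function name are invented.

```cpp
#include <cstddef>
#include <cstdio>

// Invented heuristic: choose a texture streaming-pool budget from available VRAM.
std::size_t TexturePoolBudgetMB(std::size_t vramMB)
{
    if (vramMB >= 3072) return 2048;   // maximum pool size ("3 GB of VRAM")
    if (vramMB >= 2048) return 1536;   // still "great quality" per the interview
    return vramMB / 2;                 // below 2 GB: downscale the heaviest textures
}

int main()
{
    for (std::size_t vram : {1024u, 2048u, 3072u, 4096u})
        std::printf("%4zu MB VRAM -> %4zu MB texture pool\n",
                    vram, TexturePoolBudgetMB(vram));
}
```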
Quote:
Here is a little follow up on the greatness of the console cores:
http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus
The CPU in the consoles is simply a disaster.
It also explains all the uncompressed audio and textures; the CPU in the consoles simply can't handle it otherwise.

I wouldn't call it a disaster, considering the APUs were given the GPU grunt (well, at least the PS4's) to handle graphics + compute. What was needed coming off the last generation was an increase in graphics throughput, for both better graphics and compute, plus general (non-SIMD) processing capability, which Sony and MS got with eight Jaguar cores. Yes, they are clocked low, but they would be in the same situation shoehorning their AI into a high-speed AMD dual-module part or even an i3. Anything more substantial would've cost them extra die area, graphics die area, and/or more TDP to dissipate, meaning more cooling hardware and, in the end, just more money to manufacture.
Quote:
And Windows in the background uses what, 0.1% CPU time?

It's not exactly as simple as the CPU time you see being used when you pull up Task Manager in Windows at idle.
Quote:
Example: http://cdn.wccftech.com/wp-content/uploads/2013/10/2486940-0248877224-ChsSw.png
It is going to take ever-increasing graphics performance, on an exponential scale, to have meaningful improvements in visual fidelity going forward.

I somewhat agree with that, though resolution, AF, and AA all play a part in that issue too. Luckily we are out of the days when we really had to analyse a polygonal object with crappy textures just to deduce what we were looking at D:

While there have been necessary increases in animation and physics, increasing graphics processing power since 2005 really hasn't been necessary from a gameplay standpoint, except in a few extreme cases where long-range draw distance and decent graphics at ground level are an issue, like Battlefield or GTA.
Quote:
VR is gonna change that in a major way. Not even a 980 is sufficient to drive a decent-looking game at the resolution and frame rate it requires.

:hmm: Good point. But even VR doesn't require teraflops of compute power. Even early experiments into goggle VR wowed people long ago, just like using the Oculus on an older flight game can wow people today. What made VR difficult was getting the actual hardware and software to work in concert with each other while being cheap enough for people to buy, thanks to the scale of supply.
Quote:
The amount of overhead a Windows PC has compared to a console is absolutely gigantic; easily in the 30-50% range or more as a performance penalty versus the direct-to-the-metal, low-level coding that consoles enjoy. Developers have significantly easier and more straightforward control of console hardware than they do with PCs. A simple way to see it, if you're not someone who develops for both console and PC, is to build a Windows PC with console-like specs and see how well it runs games.

The overhead on the PC platform has been slowly diminishing over time with every Direct3D/OpenGL update, though. And now we have Mantle, which, if I had to estimate, brought the overhead down to about 10%.
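Taking the figures claimed in this exchange at face value (30-50% overhead for traditional APIs, roughly 10% for Mantle), the back-of-the-envelope math below shows how much raw GPU throughput a PC would need to match a console part such as the PS4's roughly 1.84 TFLOPS GPU. The overhead numbers are the posters' estimates, not measurements.

```cpp
#include <cstdio>

int main() {
    const double consoleTflops = 1.84;                    // PS4 GPU rated peak
    const double overheads[]   = {0.30, 0.40, 0.50, 0.10}; // claimed API overheads
    for (double o : overheads) {
        // If a fraction 'o' of throughput is lost to API/driver overhead,
        // the PC needs consoleTflops / (1 - o) on paper to break even.
        double neededPc = consoleTflops / (1.0 - o);
        std::printf("%2.0f%% overhead -> ~%.2f TFLOPS needed to match %.2f TFLOPS\n",
                    o * 100.0, neededPc, consoleTflops);
    }
}
```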
Quote:
There aren't even any good examples of those super-high-res textures providing superior visuals. They're just easy to offer because that's how the textures were made.

Umm, where have you been? The PC gaming scene has always been about mods, and in particular high-res mods. It's great that these game devs now ship ultra textures as an optional download at release!
Quote:
Umm, where have you been? The PC gaming scene has always been about mods, and in particular high-res mods. It's great that these game devs now ship ultra textures as an optional download at release!

Yes, but now there are fewer cases where those ultra-high-res textures provide any significant improvement. I understand this for something like Skyrim, where the high-res textures from the developer were not all that high-res, but in the case of Shadow of Mordor, for example, there is not much real difference in visuals.
Quote:
:hmm: Good point. But even VR doesn't require teraflops of compute power. Even early experiments into goggle VR wowed people long ago, just like using the Oculus on an older flight game can wow people today. What made VR difficult was getting the actual hardware and software to work in concert with each other while being cheap enough for people to buy, thanks to the scale of supply.

But people are going to reject VR if the graphics have to look like an N64 game to not give you a headache or make you sick. Doing VR right means you need a GPU and CPU capable of putting out at least 960x1440 at 150 fps (75 fps per eye). I think people are going to be surprised at how quickly you become CPU bound when you're trying to push 150 fps at any resolution. Right now I'm playing Alien: Isolation at 1080p, and I'm telling you it looks like you're playing at 320x240: it's blurry and the pixels are huge. Your eyes are like two inches from the screen; you need an unbelievable amount of resolution and pixel density for it to look good under those conditions. And despite that, I have to turn settings way down to maintain that 150 fps, and it still looks like a mess.

I remember using VR back in the 90s, and it was cool because I'd never seen anything like it, but I don't think I could have tolerated it for longer than the five minutes they let us play. If you're buying it for personal use at home, you're going to expect to use it for hours at a time, and that's going to require a serious rig to back that experience up. If it's even slightly juddery or laggy for more than a few seconds, your head starts to spin and it's super uncomfortable. So it raises the standards to a huge degree if you want to play modern games with it.
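For perspective on the numbers in that post (the poster's figures, not an official headset spec), here is the raw pixel-throughput comparison against ordinary 1080p60:

```cpp
#include <cstdio>

int main() {
    // Poster's claimed VR target: 960x1440 at a combined 150 fps (75 fps per eye).
    const double vrPixelsPerSecond = 960.0 * 1440.0 * 150.0;   // ~207 Mpix/s
    const double pcPixelsPerSecond = 1920.0 * 1080.0 * 60.0;   // ~124 Mpix/s

    std::printf("VR target: %.0f Mpix/s\n", vrPixelsPerSecond / 1e6);
    std::printf("1080p60:   %.0f Mpix/s\n", pcPixelsPerSecond / 1e6);
    std::printf("Ratio:     %.1fx\n", vrPixelsPerSecond / pcPixelsPerSecond);
    // Roughly 1.7x the raw fill/shading work of 1080p60, before counting the
    // frame-time headroom needed so a single slow frame doesn't cause judder.
}
```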