But if you look at the actual games, you'll see that most have a cap at 30 frames per second with v-sync. For example, Bioshock Infinite easily runs at 60 FPS with v-sync on a 7970 without ever dropping to 30 frames per second; an Xbox gets only 30-50 with tearing and no v-sync, or a pretty constant 30 with v-sync, as seen in this discussion.
Secondly, a developer CAN optimize for PC: there are two common sets of drivers, and each governs a family of video cards that are essentially identical apart from the amount of hardware available to perform the calculations. Most games only release for Windows, for which there are at most three relevant versions: XP, 7, and possibly 8.
Carmack is talking about actual games in the quote that you snipped and ignored.
Ideally, I'd like a conversation with a developer who can explain exactly what is happening to create that overhead, and how much. Galego has made a mighty effort to prove it, which I respect immensely--digging through archives can't be easy, even with Google. I just want a little more proof.
"So what sort of performance overhead are we talking about here? Is DirectX really that big a barrier to high-speed PC gaming? This, of course, depends on the nature of the game you're developing."
--Bit-Tech
Chances are it will put consoles on par with a 680 or 7970 at 1080p/60. According to Lottes, if a game dev is willing to code exclusively for the PS4 (with code designed specifically for the APU structure), the performance could go beyond a 680/7970 card. He doesn't say by how much, so that is the area of debate.
Second, a game developer cannot optimize for PC, because there is no one PC, but hundreds of millions of different PCs with different hardware, operating systems, drivers...
The last thing I heard about UE4 was that they were removing SVOGI from the engine altogether.
Sparse voxel octrees are a way of storing lighting data for a scene such that cone traces become reasonably cheap, which makes them a potentially attractive method for global illumination. I don't recall the paper very well, but they're apparently still complicated to update dynamically.
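For anyone curious what that actually looks like, here is a minimal sketch of the idea in C++. This is my own illustration of the concept, not Epic's code; the flat-array layout, the names, and the stop-when-the-cell-matches-the-cone-width heuristic are all assumptions:

```cpp
#include <array>
#include <vector>

// Hypothetical minimal sparse voxel octree for cone-traced GI.
// Each node stores the average radiance of the cell it covers, so
// sampling a coarse level approximates a wide cone footprint.
struct SvoNode {
    std::array<float, 3> radiance{};                            // pre-filtered average light
    std::array<int, 8>   child{-1, -1, -1, -1, -1, -1, -1, -1}; // -1 = empty (sparse)
};

struct Svo {
    std::vector<SvoNode> nodes;   // nodes[0] is the root covering the whole scene
    float rootSize = 1.0f;        // world-space edge length of the root cell

    // Descend to the node whose cell size roughly matches 'footprint'.
    // Assumes p lies inside the root cell. Coarser cells stand in for
    // the pre-filtered mip levels a real implementation would store.
    const SvoNode* sample(std::array<float, 3> p, float footprint) const {
        int idx = 0;
        float size = rootSize;
        std::array<float, 3> lo{0.0f, 0.0f, 0.0f};  // cell minimum corner
        while (size * 0.5f > footprint) {           // stop when cell ~ cone width
            float half = size * 0.5f;
            int oct = 0;
            for (int a = 0; a < 3; ++a)
                if (p[a] >= lo[a] + half) { oct |= 1 << a; lo[a] += half; }
            int c = nodes[idx].child[oct];
            if (c < 0) return nullptr;              // sparse: empty space
            idx = c;
            size = half;
        }
        return &nodes[idx];
    }
};

// March along the cone axis, widening the sampling footprint with
// distance and accumulating radiance from ever-coarser octree levels.
std::array<float, 3> coneTrace(const Svo& svo, std::array<float, 3> origin,
                               std::array<float, 3> dir, float tanHalfAngle) {
    std::array<float, 3> sum{0.0f, 0.0f, 0.0f};
    for (float t = 0.01f; t < svo.rootSize; t += tanHalfAngle * t + 0.01f) {
        std::array<float, 3> p{origin[0] + dir[0] * t,
                               origin[1] + dir[1] * t,
                               origin[2] + dir[2] * t};
        if (const SvoNode* n = svo.sample(p, tanHalfAngle * t))
            for (int a = 0; a < 3; ++a) sum[a] += n->radiance[a];
    }
    return sum; // a real tracer would also track occlusion and weighting
}
```

The expensive part, as noted above, is rebuilding or re-filtering this structure every frame when geometry or lights move, which is presumably why it stayed a prototype.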
Epic was only using it for second bounce and it was always a little overoptimistic of them. Rumors that they'd dropped it have been floating around the industry for a long time, starting almost right after they put out their original demos.
Some day we'll have real GI and ray tracing in games... just not right now. SVOGI is overkill for the hardware that's currently available. It's a matter of time, but just not now.
"[SVOGI] was our prototype GI system that we used for Elemental last year. And our targets, given that we've had announced hardware from Sony, that's where we're going to be using Lightmass as our global illumination solution instead of SVOGI," senior technical artist and level designer Alan Willard told Eurogamer, stressing that this new iteration of the technology has evolved significantly beyond the current-gen system used in titles like Mass Effect 3. Certainly, just the presence of so much more memory on next-gen platforms should improve lightmap quality on its own.
At GDC 2013 Epic showed a demo of the PS4 against an i7 (Ivy Bridge), 16 GB RAM, and a GTX 680.
The demo was prepared without time to optimize it, and the PS4 was already able to run the same AA, resolution, meshes, textures, DOF, motion blur... as the high-end gaming PC.
They replaced SVOGI with a different GI solution and slightly scaled down the number of particles for some FX. But this does not mean that the PS4 cannot shine at those as well, only that this is the best they could obtain in a few weeks using a development kit (and the kits are still in their infancy).
In fact, tessellation was broken on the PS4, but they are already working on fixing it.
I also agree that the performance will go beyond a 680/7970.
Your ratio isn't correct for a number of reasons. The 7970 can get 84 frames per second at the Ultra preset of Bioshock Infinite at 1080p (only 60 show on a normal screen, but there are 120 Hz screens out there). The Xbox 360 gets 30 frames per second at 720p on medium/low. So the PC is pushing 2.25x the pixels (1920x1080 versus 1280x720) along with far more detail and eye candy, especially demanding effects like shadows and anti-aliasing. I'd guess that those things at least double the workload for a GPU (far more if you do things like SSAA, which can literally multiply the requirements by 16 times or more).
Look at Assassin's Creed II. It runs at 60 frames per second on an Xbox 360 at 720p on the equivalent of a medium preset, with details turned low or off, and dips below 60 frames per second in intense scenes. I can run it at 60 frames per second with every detail maxed and 8x supersampling and never see the framerate drop. That's easily more than 10x the demand, yet my stock-clocked GTX 670 can do it better than the Xbox 360.
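To make the arithmetic behind those claims explicit, here's a quick back-of-the-envelope check in C++, using only the frame rates and resolutions quoted in this thread:

```cpp
#include <cstdio>

int main() {
    double pc   = 1920.0 * 1080.0;  // pixels per frame at 1080p
    double xbox = 1280.0 * 720.0;   // pixels per frame at 720p

    // Raw pixel counts: 1080p is 2.25x the pixels of 720p.
    std::printf("pixels per frame: %.2fx\n", pc / xbox);

    // Pixels shaded per second at the quoted frame rates (84 vs 30 fps).
    std::printf("pixel throughput: %.2fx\n", (84.0 * pc) / (30.0 * xbox));

    // Supersampling shades every sample of every pixel: 4x4 SSAA means
    // 16 shaded samples per displayed pixel, hence "16 times or more".
    std::printf("4x4 SSAA shading cost: %dx\n", 4 * 4);
    return 0;
}
```

That works out to 2.25x the pixels per frame and about 6.3x the pixel throughput, before counting the higher detail settings.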
There are "dozens" of versions of Windows? Okay, there's a statistically significant amount of Windows 7, Windows 8, Windows Vista, Windows XP, and there's the 32 and 64 bit version of each (according to Wikipedia). All the other versions, like Home Server, Enterprise, N, KN, Blade, whatever, are too small of a market share to even overcome Linux at 1.17% market share, or they are simply the normal Windows OS with a few features (like Windows Media Player) removed or added in. That's at best 8, and most of the underlying interactions with the hardware are the same, especially between Vista and 7.
No
http://www.youtube.com/watch?v=mFgVR2Ke0XY
Please watch. There are some clear differences. (If the PS4 were equally powerful, I would actually expect to see slightly more from it, as Epic has had more time to develop and tweak the demo.)
Edit:
http://www.youtube.com/watch?v=gtfCWYjOsvI
You can clearly see the frame rate is lower in the PS4 version; look at the flags at 0:47 and the falling debris at 1:55.
At 1:37 the fire glow is missing on the PS4.
The PS4 version is missing lots of things.
Trying to draw any conclusions about the PS4's power from a pre-release third-party tech demo is an exercise in futility. What you're basically seeing is the worst-case scenario... and despite the fact that they didn't get to the same level as the PC tech demo, it's pretty damn close, and it will only get better with time. As the bottom end of what we can expect going forward, it's pretty impressive IMO.
It was not running on a finished and polished PS4, but on a developer kit. Strike one.
The kits were sent to developers only a few weeks before, and the developers have not yet learned the new APIs and tools. However, they know the API and tools used on the PC very well, because those are years old. Strike two.
Third, as the chief explained, they had no time to merge the two different cinematics, which resulted in some visual differences that are not real but artifacts of that process. Strike three. You are out, but let us continue.
Fourth, the PS4 version had tessellation broken due to a bug that could not be fixed in time. That bug is responsible for other visual differences, such as the lava flow.
Fifth, they did basically a fast port from PC to PS4. They did not use any of the advanced capabilities of the console hardware, such as the eight threads, HSA, the extra RAM, or the physics effects.
Everyone knows that early demos and first games are not representative of the real power of a new console. One only needs to look at early demos and first PS3 games and compare them to the superb games developed years later for the same console.
The 7970 and the 670 are not giving those framerates on an Intel Atom D525, true? Comparing your GTX 670 (2460 GFLOPS) to the ATI GPU (240 GFLOPS) in the Xbox is only part of the equation; the 10x was applied only to the GPUs. There are additional reductions in performance coming from the CPU in the Xbox, from the memory in the Xbox...
Repeat the comparison using a much slower CPU and much slower RAM in your PC, reduce the RAM to 512 MB, and you will see how the gap shrinks.
The 10x factor is a rough estimate; see the arithmetic below. Nobody told you that it is an exact factor that applies in every situation. It is an overall estimate that works on average, an order of magnitude: it could be 6x or 13x or something like that, depending on lots of factors, including non-technical factors such as programmer ability or the time they had to port/optimize.
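The raw-GPU half of that estimate is easy to check with the GFLOPS figures already quoted; everything past the division is where the hand-waving lives:

```cpp
#include <cstdio>

int main() {
    double gtx670 = 2460.0;  // GFLOPS, as quoted above
    double xenos  = 240.0;   // GFLOPS, the Xbox 360 GPU figure quoted above
    std::printf("raw GPU ratio: %.2fx\n", gtx670 / xenos);  // ~10.25x
    // CPU speed, memory bandwidth, the 512 MB of shared RAM, and
    // programmer effort then move the *effective* ratio up or down.
    return 0;
}
```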
Take just Windows 7 as an example. There are six editions and two architectures, which gives 12 different versions of it. Which version of W7 should the programmer choose: the one supporting a maximum of 2 GB? One supporting 4 GB? One supporting 8 GB? 16 GB? The ones supporting 192 GB?
And each one of them can have different updates/patches. Which version of W7 Pro 64-bit? The one with the latest updates? The one with the FX patches installed manually?
And so on.
From Epic:
It always sounded like a really pie-in-the-sky brute-force method to me too - kind of like supersampling AA before they started to figure out ways to get nearly the same quality for much less performance cost - first MSAA, then CSAA, FXAA, and now SMAA.
Sounds like with the 8 GB of memory they can basically get away with prebaking GI/SSAO on all static objects by just using super-high-res lightmaps. It's not often you think of memory capacity actually contributing to performance. Give the industry a couple of years and they'll probably figure out how to fake GI at a fraction of the performance cost.
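As a rough illustration of how memory capacity buys lighting quality here: lightmap storage grows with the square of resolution. The texel format and sizes below are my assumptions, not Epic's actual settings:

```cpp
#include <cstdio>

int main() {
    const double bytesPerTexel = 8.0;  // e.g. an RGBA16F lightmap texel (assumed)
    // Doubling lightmap resolution quadruples the memory it needs,
    // which is why 8 GB of RAM makes very high-res baking practical.
    for (int res = 512; res <= 4096; res *= 2) {
        double mb = (double)res * res * bytesPerTexel / (1024.0 * 1024.0);
        std::printf("%4d x %-4d lightmap: %6.1f MB\n", res, res, mb);
    }
    return 0;
}
```

That runs from 2 MB at 512x512 up to 128 MB at 4096x4096, per lightmap, which simply doesn't fit in a 512 MB console but is trivial with 8 GB.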
"The dynamic lights are not our current performance limit. Our limits come more from the amounts of massive overlapping shadow-casting lights," Willard said. "That's not been a huge challenge because typically we don't need that many to get a perfect-looking scene.
Early PS3 games were poor because no one knew how to code for Cell. They don't have this problem this time.
Any API tools on the PS4 would be basically identical to the PC's (since they are basically using the same off-the-shelf components) for the demo (perhaps in the future this will be different).
You do realize that Win 7 Ultimate 64-bit and Win 7 Home Premium 64-bit will run the same executable in the exact same manner (because Ultimate is simply Home Premium with additional features).
And if you wish to compare screenshots:
[screenshot comparison images]
Wrong, again. No, it cannot.
Wrong, again.
http://www.pcworld.idg.com.au/article/386085/windows_7_home_premium_vs_windows_7_professional/
Memory support
Home Premium: 16GB of physical memory supported
Professional: 192GB of physical memory supported
On a computer that is running Windows 7, the usable memory (RAM) may be less than the installed memory:
For example, a 32-bit version of Windows 7 may report that there is only 3.5 GB of usable system memory on a computer that has 4 GB of memory installed.
Or, a 64-bit version of Windows 7 may report that there is only 7.1 GB of usable system memory on a computer that has 8 GB of memory installed.
Note: The amounts of usable memory in the examples are not exact. Usable memory is a calculated amount of the total physical memory minus "hardware reserved" memory.
[...]
For example, if you have a video card that has 256 MB of on-board memory, that memory must be mapped within the first 4 GB of address space. If 4 GB of system memory is already installed, part of that address space must be reserved by the graphics memory mapping. Graphics memory mapping overwrites a part of the system memory. These conditions reduce the total amount of system memory that is available to the operating system.
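A quick worked example of the address-space math Microsoft is describing. The reservation sizes below are hypothetical; the real values vary per machine:

```cpp
#include <cstdio>

int main() {
    const int addressSpace = 4096;  // MB of 32-bit address space, 4 GB RAM installed
    const int gpuAperture  = 256;   // video card memory mapping, per the example above
    const int otherMmio    = 300;   // chipset/PCI/firmware reservations (assumed)

    // RAM crowded out of the 4 GB space by hardware reservations.
    int usable = addressSpace - gpuAperture - otherMmio;
    std::printf("usable: %d MB (~%.1f GB)\n", usable, usable / 1024.0);
    // ~3.5 GB, matching Microsoft's figure. A 64-bit OS on a board that
    // supports memory remapping can relocate that RAM above 4 GB and reclaim it.
    return 0;
}
```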
Which is very good, because you don't. Some mobos needed remapping turned on, and others were plain broken; Windows could not do anything about that. On most systems today, the amount that remains truly unusable is typically <10 MB (2 MB on my desktop, for instance). Of course game developers know those basic facts...
We know this as well. The physical memory amount is not quite equal to the amount addressable. The PS4, for example, is anticipated to have only 7.2 GB available after the OS takes its share.
But you are nitpicking words. You said Win7 does not support 16 GB. We have shown that it does in fact, though the OS takes up anywhere from just under a gigabyte to only a few MB.
Another thing is the amount of memory available to applications. The amount of memory available to a game will be reduced further, since, as quoted above, "these conditions reduce the total amount of system memory that is available to the operating system."
