Are we allowed to ask if a certain poster has previously posted under another name?
Because I swear the provocative, outlandish claims and generally superior and combative attitude of a previous poster just seems to keep coming back, and no one seems to do anything to stop it.
Try to work out for yourself the main difference between how games evolve in the PC ecosystem and in the console ecosystem, and you will be answering your own question.
Hint: Read the bit-tech link given before.
Does the name of this other poster you're thinking of start with a g?
No, this makes absolutely no sense. CPU draw call disadvantages on DX11 should have absolutely no impact on what's available in a dev kit, particularly because, as you keep claiming anyway, devs program direct to metal.
I can't find an article about how the dev kits are stripped down. What I do find, in the first two links from Google, are about how the PS4 has many of the most resource-intensive features stripped out, like real-time lighting and advanced liquid physics, and that by nature of being a game engine demo, looks better than the games will.
The burden of proof is on the person who makes the original claim. Show us proof that the demo is in fact stripped down and that API and draw calls make a console 2x more powerful (you haven't yet. The Bit-tech article is about the difficulty of programming despite increased power, not how consoles gain a factor of power).
The Xbox 360's Xenos GPU has less than a tenth of the processing power of a top-end PC GPU, so why don't PC games look ten times better?
We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way.
[...]
The DirectX Performance Overhead
So what sort of performance-overhead are we talking about here?
[...]
On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.
Low level and API-free programming seems to be the future of game development and graphics programming on the PC.
Do you know that graphics hardware on PC is limited to a few thousand draw calls per frame (around 2,000 to 3,000), while the number of draw calls on a console can be 10,000 up to 20,000?
According to Richard Huddy (AMD's head of GPU developer relations), the limiting factor on PC is the performance overhead of the 3D API (mainly DirectX), while on consoles game developers can use low-level code to process more triangles than on PC. More render calls allow more creative freedom for game designers. The solution would be low-level access to PC graphics hardware (direct-to-metal programming).
A final note, because I am not going to keep repeating how API overhead affects GPU performance forever. My point is very well summarized in this article:
http://www.geeks3d.com/20110317/low-level-gpu-programming-the-future-of-game-development-on-pc/
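The back-of-the-envelope arithmetic behind those numbers is easy to check. This sketch is mine, not from the article; the only inputs are the draw-call counts quoted above, and it works out the implied CPU budget per draw call at 30 fps:

```python
# Implied CPU time budget per draw call, from the figures quoted above:
# ~20,000 calls/frame on console vs ~3,000 on PC, both at 30 fps.
FPS = 30
frame_budget_us = 1_000_000 / FPS  # microseconds available per frame

console_calls = 20_000  # upper console figure quoted above
pc_calls = 3_000        # upper PC figure quoted above

console_us_per_call = frame_budget_us / console_calls
pc_us_per_call = frame_budget_us / pc_calls

print(round(console_us_per_call, 2))   # 1.67 µs per call on console
print(round(pc_us_per_call, 2))        # 11.11 µs per call on PC
print(round(console_calls / pc_calls, 2))  # ~6.67x overhead factor
```

So the quoted numbers imply roughly a 7x per-call overhead on PC, which is in the same ballpark as the "only a tenth of the performance" worst case Huddy describes.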
They do. That's half the problem: people put vaseline all over the TV and monitor, and then say, "see, it doesn't look that much better!" Either there were not as many details, the game ran faster on the PC (and would probably get mods to use more VRAM), or the game looked better on the PC (for example, if the console game upscales, or only has blur AA, a PC version will look infinitely better).
"That mostly has to do with vsync." No, it doesn't. I also always use vsync, and still rock a Core 2 Duo, so I only get high FPS in really old games anyway.
Low amounts of upscaling always look like crap, regardless of any FPS cap. Blur AA looks like crap, no matter what else, too (I don't mean FXAA or SMAA, but the shader AAs predating them on console games and their ports). Tons of bloom looks like crap. Really fuzzy textures, downscaled from what the artists actually made, and especially textures of varied fuzziness in a scene, also look like crap.
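For what it's worth, the upscaling complaint is easy to quantify. A quick sketch of my own, assuming the common case of a 720p frame shown on a 1080p display:

```python
# Fraction of genuinely rendered pixels when a 720p frame
# is upscaled to fill a 1080p display.
native = 1280 * 720    # pixels actually rendered
output = 1920 * 1080   # pixels shown on screen

real_fraction = native / output
print(round(real_fraction, 3))  # 0.444
```

Less than half the pixels on screen carry real information; the rest are interpolated, which is exactly the softness people notice.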
However, disabling vsync is also known as a GPU killer unless you have great cooling.
I've been gaming on PC since Wolfenstein 3D and have never heard of vsync as a GPU killer, nor have I ever heard anyone claim you need better cooling with it disabled. Sure, it allows the GPU to get loaded heavier depending on the situation, but nothing that requires anything special. Sounds like a myth to me, especially considering I always run with it off and have always used stock cooling.

There was a Forceware driver bug, triggered by the menu screen of StarCraft II, which allowed the GPU to dangerously overheat without being throttled or shut off. Vsync was the early fix; then newer Forcewares fixed it, and SC2 limited the framerate of the menu screen.
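To illustrate why vsync helped in that bug: capping the framerate bounds how much work the GPU does per second. Here's a minimal frame-limiter sketch; the render() stand-in and TARGET_FPS are illustrative, not from any real engine:

```python
import time

# Minimal frame-rate limiter: sleeping off the unused part of each
# frame's time slice caps work per second, much as vsync does.
TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def limited_frames(n_frames, render=lambda: None):
    """Run n_frames of render(), never exceeding TARGET_FPS."""
    start = time.perf_counter()
    for i in range(n_frames):
        render()  # stand-in for the real draw work
        # sleep off whatever remains of this frame's time slice
        remaining = start + (i + 1) * FRAME_TIME - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
    return time.perf_counter() - start

# 30 frames at 60 fps should take at least ~0.5 s of wall time
elapsed = limited_frames(30)
```

An uncapped menu screen is the opposite case: render() runs back-to-back at thousands of FPS, which is how a trivial scene ends up maxing out the GPU.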
The more conservative overhead factor given by Carmack is 2x. Taking that, the PS4 would perform like a GTX 680.
How curious! Epic showed a PS4 vs PC demo at GDC 2013, where the PC was using a GTX 680. What is more interesting is that the demo was running on AH and only 30% of the final specs were used. Taking that, the PS4 would perform above an HD 7790, Titan, and GTX 690.
Game developers and hardware engineers must be using similar numbers to back up their public claims that the PS4 is much more powerful than a high-end PC.
On the other side, all I can read is what has been adequately described as "PC trolls" (not you, of course) claiming that the PS4 cannot do this or that; their entire line of argument runs from idiotic rants about tablet-like power consumption to people who believe that 1.84 TFLOPS in a console equates to 1.84 TFLOPS on a PC.
https://twitter.com/MarkRein/status/337627995323895808
RT @developonline: EA: Xbox One and PS4 a generation ahead of PC http://www.develop-online.net/news/44289/EA-Xbox-One-and-PS4-a-generation-ahead-of-PC <- no, they're not. I call bullshit on this one.