VulgarDisplay
Diamond Member
- Apr 3, 2009
It's not like dx12 is out right now. There is a lot more devs can get out of DX11 at this point since most games are just DX9 with a few DX11 features tacked on for marketing purposes.
> It's not like dx12 is out right now. There is a lot more devs can get out of DX11 at this point since most games are just DX9 with a few DX11 features tacked on for marketing purposes.
And then lock games to the same hardware performance for another 5-6 years... like today... introducing stagnation... history repeating itself ^^
What I don't really like though, is that if the next-gen consoles come in 2013, with DX11/11.1 class hardware, DX11/11.1 is likely to become the "new DX9".
The "consoles are underpowered" complaint stems from this generation simply lasting too long. The 360 launched in 2005; it's 2012 and we still don't have previews of its successor, just leaks.
That's definitely a huge part of it. Normally we would have seen a new generation in 2009-2010.
Xbox 1 = November 15, 2001
Xbox 360 = November 22, 2005 (that's 4 years life cycle between the 2)
Xbox 720 = November 2013 (?), that's 8 years or twice as long
Pretty amazing that consumers put up with this nonsense. The Xbox 360 250GB Kinect bundle still goes for $375, which is a joke for a 7-year-old console. MS is probably going to sell the Xbox 720 with Kinect 2.0 for $499 or less. How are they even selling these 7-year-old outdated consoles for almost $400? :sneaky: That's a lot of money to waste on a console that's almost EOL.
Never underestimate the absolutely brain-dead stupidity of the general public.
BTW, if the next gen overhypes another underpowered PowerPC crap chip again rather than simply using a top-end AMD APU, I'm just going to laugh my ass off.
PS3/360 Dark Souls, 1024x720: [screenshot]
vs. PC modded: [screenshot]
PS3/360: [screenshot]
PC modded: [screenshot]
We need new consoles ASAP!!
The detail on the walls and rubble looks amazing! Does the game use DX11 tessellation?
No, that's just a form of bump mapping. With the move to DX9, parallax mapping and displacement mapping built upon bump mapping to allow for improved depth effects without actually building the geometry. Tessellation is the next step, creating real 3D depth on objects, but it's currently a very costly technique.
You can have a quick look here on how parallax and displacement mapping look:
http://www.tomshardware.com/reviews/hollywood-nature-compared,2049-8.html
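The core trick behind parallax mapping described above can be sketched in a few lines: shift the texture lookup along the view direction by an amount proportional to a height map, so flat geometry fakes depth. This is a minimal illustrative sketch, not any engine's actual shader; the function name, `scale` value, and tangent-space convention are assumptions.

```python
# Hedged sketch of the parallax-mapping idea: instead of adding real geometry,
# offset the (u, v) texture coordinates toward the viewer based on a sampled
# height value. All names and constants here are illustrative.

def parallax_offset_uv(u, v, height, view_x, view_y, view_z, scale=0.04):
    """Shift (u, v) along the view direction based on the sampled height.

    height  -- value from the height map in [0, 1]
    view_*  -- normalized view vector in tangent space (z points out of surface)
    scale   -- artist-tuned strength of the effect
    """
    # Scale the offset by the height, then project along the view vector.
    offset = height * scale
    return u + (view_x / view_z) * offset, v + (view_y / view_z) * offset

# Looking straight down the surface normal (view = (0, 0, 1)) produces no shift:
print(parallax_offset_uv(0.5, 0.5, 1.0, 0.0, 0.0, 1.0))  # (0.5, 0.5)
```

A grazing view angle (large `view_x`/`view_y` relative to `view_z`) produces a larger shift, which is exactly why the effect sells depth at oblique angles and falls apart at extreme ones.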
Actually, I would argue that those Dark Souls screen shots do nothing more than prove that consoles are good enough. Sitting six feet from the screen, those extra details don't matter at all in enjoying a game. Not that consoles aren't antiquated - they certainly are. But for the average gamer, they still have a lot to offer. The jump to HD seven years ago was a huge leap and is still carrying these consoles through.
Take Borderlands 2, for example. I'm a computer enthusiast and just pre-ordered it on PS3, even after watching the Nvidia videos showing the major PhysX effects. Why? Because the game will still be a ton of fun, and my friends who play it will be buying it on PlayStation. I have no doubt the PC version looks better, but that will take very little away from the enjoyment of the game.
This thread wasn't really about PC vs. Console anyway. It's more about how the development of the two gaming arenas interact.
When we compare CPUs we strive for near-top performance and criticize AMD's Phenom/Bulldozer CPUs, etc., but for a console they would still be miles better than that PowerPC crap.
I can already see it: PS4, engineered for 4K graphics! (i.e., the HD 7000 series GPU will be capable of outputting 4K to a 4K TV, but good luck getting the PS4 to run games at 4096x2160 natively).
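The skepticism above is easy to ground with pixel-count arithmetic, using the resolutions mentioned elsewhere in this thread (the comparison itself is just illustrative):

```python
# Pixel counts behind the "4K native" skepticism. 1024x720 is the Dark Souls
# resolution on PS3/360 cited in this thread; 4096x2160 is the 4K figure above.

def pixels(w, h):
    return w * h

four_k  = pixels(4096, 2160)   # 8,847,360 pixels
sub_hd  = pixels(1024, 720)    # 737,280 pixels, what this gen actually pushes
full_hd = pixels(1920, 1080)   # 2,073,600 pixels

print(four_k / sub_hd)   # 12.0 -- twelve times the current console workload
print(four_k / full_hd)  # ~4.27x full HD
```

So "4K capable" output and rendering games natively at 4K are separated by roughly an order of magnitude in fill and shading work.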
An A8-3850 is actually pretty close to 60 fps in modern games, based on TechReport's quick overview. It would be a no-brainer for a console to just use a cheap quad-core AMD CPU and instead spend the $$ where it counts the most: the GPU.
x86 CPUs are very weak at gaming computations; the POWER architecture is far better suited (PC devs are the only ones you see complaining about in-order architectures, because they have been spoiled by Intel's phenomenal compilers). Nothing is stopping IBM from producing an OoO POWER-based CPU if the console makers are willing to take the rather sizeable hit in perf/watt. As of 2010, the fastest six-core i7 could only just exceed half of Cell's floating point performance, and in gaming the overwhelming majority of the code base is floating point. When we talk about GPUs it is very easy to see the superiority of PCs; when it comes to CPUs, PCs are actually shockingly poor when looking at gaming explicitly. In relative terms, the CPUs the consoles shipped with were four to five generations ahead of PCs when looked at from a GPU-based perspective.
Don't confuse limitations of the GPU or RAM with the CPUs. When Carmack was posting notes on Rage, one of the things he mentioned was that the PS3 had an advantage over the 360 and PC because it had so much raw CPU power that they could use an entirely different, more intensive compression method to save IO overhead, and this was versus the i7 too. That's the 'terrible' POWER architecture. It wasn't very good as a general-purpose CPU, but it is a *beast* at floating point computations as configured in the consoles, particularly when compared to the extremely weak x86 offerings of the era.
Polyphony will almost certainly run their games at 4K resolution on the PS4. Take a top-tier console developer and it is amazing what they can do with fixed hardware. It is a completely different development process than dealing with PCs, where you need to worry about code flexibility. That tends to be why console devs suck at PC ports and PC devs suck at console games (nothing this generation has changed that perception either).
Forgot to mention, perf/xtor is significantly higher on the POWER parts too; a "cheap" AMD CPU is considerably more expensive on a performance basis than the rather tiny POWER chips that MS used this generation. Not saying they won't go x86, but if they do, it won't be to save money.
I think you missed the extreme sarcasm.
Even after the resolution fix that game looks sub par.
You sound like you have chosen to ignore Anand's pulled article... ignoring facts won't make them go away.
> x86 CPUs are very weak at gaming computations, the POWER architecture is far better suited (PC devs are the only ones you see complaining about in-order architecture, they have been spoiled by Intel's phenomenal compilers).

BS. Everyone sane complains about in-order uarches, because for almost 50 years out-of-order designs have solved performance problems that either can't be solved any other way, or that would require too much bandwidth to solve some other way. And, you know what, it's turned out quite good in many ways. Even a wealth of home routers are running OoOE uarchs today (the MIPS 74K: in-order issue, in-order completion, OoOE, just what C wants).
> Nothing is stopping IBM from producing an OoO POWER based CPU if the console makers are willing to take the rather sizeable hit in perf/watt.

What perf/W loss? The PPC470 can be efficient enough, and it has been made at at least 2GHz now. It would still suck compared to modern PC CPUs, but it ought to be quite respectable, and actually worth the savings in space/power.
> in gaming the overwhelming majority of the code base is floating point.

That's a good reason for choosing PPC for the relevant games (most code is not necessarily floating point; most code in any one game might be), but it has nothing to do with in-order or not. Not all FP is vectorizable, and it benefits from OoOE, too. Some vector FP can even benefit from OoOE, though I think that will become rarer as time goes on (cache effects are leading to aligning data to cache lines, where that alignment and total sizing will have more of an effect for wide vectors and long-running vector loops).
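The vectorizable-vs-not distinction above can be sketched concretely: independent per-element FP maps onto vector units, while a loop-carried dependency has no lanes to fill and lives or dies on latency, which is where OoOE and branch prediction earn their keep. This is a pure-Python stand-in for the hardware behavior, purely illustrative.

```python
# Two FP loops with very different hardware profiles.

def vectorizable(xs, ys):
    # Every element is independent of the others: a SIMD unit (Altivec/VMX,
    # SSE) can process several lanes of this per cycle.
    return [x * y for x, y in zip(xs, ys)]

def serial_dependency(xs):
    # Each iteration consumes the previous result (a simple IIR filter):
    # there is no lane-level parallelism to exploit, so throughput is bounded
    # by the latency of the multiply-add chain, not by FLOPS.
    acc = 0.0
    out = []
    for x in xs:
        acc = acc * 0.5 + x   # depends on the prior acc
        out.append(acc)
    return out

print(vectorizable([1.0, 2.0], [3.0, 4.0]))   # [3.0, 8.0]
print(serial_dependency([1.0, 2.0]))          # [1.0, 2.5]
```

A chip with huge peak FLOPS but weak latency handling flies on the first loop and crawls on the second, which is the crux of the Cell-vs-x86 argument in this thread.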
> When we talk about GPUs it is very easy to see the superiority of PCs, when it comes to CPUs PCs are actually shockingly poor when looking at gaming explicitly. In relative terms, the CPUs the consoles shipped with were four to five generations ahead of PCs when looked at from a GPU based perspective.

LOL. Really, that deserves little more than that. The PS3 had a basically bandwidth-crippled 7800GT (not ahead at all), and the XB360's GPU was superseded within a year or so. The XB360's GPU was high-end for its time, and it did have features that gave it an advantage over the generation of PC GPUs with the same raw performance, but it was one generation ahead, not four or five.
> Don't confuse limitations of the GPU or RAM with the CPUs. When Carmack was posting notes on Rage one of the things he mentioned was that the PS3 had an advantage over the 360 and PC because it had so much raw CPU power they could use an entirely different and more intensive compression method to save IO overhead, this was versus the i7 too. That's the 'terrible' POWER architecture.

That's called luck and ingenuity. The Cell is very good at certain things; it just sucks at the other 99%. Also, it's PowerPC; POWER is another beast. And the Cell SPEs are their own little things, too.
> Wasn't very good as a general purpose CPU, it is a *beast* at floating point computations as configured in the consoles particularly when compared to the extremely weak x86 offerings of the time era.

Yes, a beast at floating point computations. That made it much faster at the things we already had enough power for 99% of the time, which is also why we haven't had sufficiently good vector extensions on the PC: it was fast enough that we didn't need them, until recently. The problem, though, is that you're living in the past, when games were simple loops. Today they are giant branchy messes, like most any other big application. Fast FP can make up for a few things being slower, but it can't make up for cache misses, branch mispredictions, or slow ALUs. All those threads and all those Altivec/VMX engines are needed just to barely reach what one- and two-core x86 CPUs could do. Lots of potential, but it's too hard to use, because some code is fundamentally sensitive to CPI.
> Forgot to mention, perf/xtor is significantly higher on the POWER parts too, a "cheap" AMD CPU is considerably more expensive on a performance basis than the rather tiny POWER chips that MS used this generation. Not saying they won't go x86, but if they do it won't be to save money.

Perf/xtor is not only filled-pipeline MIPS and FLOPS, though people can sometimes be sold on those. If it takes tens of cycles to do a function call, or a few shifts and adds, there's no hope, no matter how good the numbers look when you fill the pipelines.
> Using an excuse that the PowerPC architecture isn't given proper compilers like Intel CPUs are on x86 is irrelevant.

It's worse than just irrelevant: they have MS's compiler. Intel's might be better at automatically leveraging vector extensions, but MS's compiler on x86 is no slouch, and I doubt it's at all bad on PPC. Given that MS has historically not had microbenchmark blinders on, like GCC devs often have, I imagine it's fairly good. I have also heard nothing but praise for their dev tools. Personally, I'd just about bet that MS has kept basic IR->ISA backends alive internally for ARM and PPC for years, even after they went all-x86 around 2000. The added cost would be fairly small, while the ability to jump right into using HW based on such an ISA would be huge (also, the chances of needing to support anything more esoteric than PPC are pretty small today).
> The jump to HD seven years ago was a huge leap and is still carrying these consoles through.
Dark Souls only runs at 1024x720 on the PS3.
http://forum.beyond3d.com/showthread.php?t=46241
