The 360 has a 3.2 GHz PowerPC CPU while the Xbone has a 1.6 GHz tablet CPU; which is faster per core?
This shouldn't even be a comparison. Why did they go with such awful CPUs for next gen?
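(A rough back-of-the-envelope on the per-core question, with made-up IPC numbers just to illustrate why clock speed alone doesn't settle it: per-core throughput is roughly clock speed times average instructions per clock, and the 360's in-order Xenon cores get far lower IPC than the Xbone's out-of-order Jaguar cores.)
Code:
#include <cstdio>

int main() {
    // Per-core throughput ~ clock (GHz) * average instructions per clock (IPC).
    // These IPC values are illustrative guesses, NOT measured numbers: the
    // 360's Xenon cores are in-order and stall a lot, while Jaguar is
    // out-of-order and keeps its pipelines busier.
    double xenonGhz = 3.2, xenonIpc = 0.4;   // hypothetical IPC
    double jaguarGhz = 1.6, jaguarIpc = 1.0; // hypothetical IPC

    std::printf("Xenon:  ~%.2f billion instructions/s per core\n", xenonGhz * xenonIpc);
    std::printf("Jaguar: ~%.2f billion instructions/s per core\n", jaguarGhz * jaguarIpc);
    // With these assumed numbers the lower-clocked Jaguar core comes out
    // ahead, which is why raw GHz alone can't answer the question.
    return 0;
}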
Both the PS4 and Xbox One are faster and produce better graphics than their previous gen counterparts. All the games they publish for the systems run perfectly smooth in my experience. So what impact is the CPU really having (and what makes the AMD chip so bad, really)?
These aren't PCs and are purpose built for console games. What is the point of this thread?
This shouldn't even be a comparison. Why did they go with such awful CPUs for next gen?
I see a lot of posters complaining in forums all over that consoles are causing the porting issues with PC games like Arkham Knight, etc.
It's still up to the company making the game to port it. GTA V took so long because, it seems, there wasn't enough money to justify redoing the game without being able to sell it on next-gen consoles.
A lot of processing tasks are now handled by the GPU, so a powerful CPU isn't required anymore.
We have a lot of recent AAA multiplat games on PC that are very CPU-intensive, pushing i5s to the limit and hammering i7s hard, so it seems a powerful CPU is still required?
http://www.anandtech.com/show/7889/...w-level-graphics-programming-comes-to-directx
AnandTech said:
The result is that when looking at single threaded CPU performance, GPUs have greatly outstripped CPU performance growth. This in and of itself isn’t necessarily a problem, but it does present a problem when coupled with the high level APIs used for PC graphics. The bulk of the work these APIs do in preparing data for GPUs is single threaded by its very nature, causing the slowdown in CPU performance increases to create a bottleneck. As a result of this gap and its ever-increasing nature, the potential for bottlenecking has similarly increased; the price of abstraction is the CPU performance required to provide it.
Low level programming in contrast is more resistant against this type of bottlenecking. There is still the need for a “master” thread and hence the possibility of bottlenecking on that master, but low level programming styles have no need for a CPU-intensive API and runtime to prepare data for GPUs. This makes it much easier to farm out work to multiple CPU cores, protecting against this bottlenecking. To use consoles as an example once again, this is why they are capable of so much with such a (relatively) weak CPU, as they’re better able to utilize their multiple CPU cores than a high level programmed PC can.
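To make the "farm out work to multiple CPU cores" point concrete, here's a minimal sketch of multi-threaded command recording. CommandList, recordCommands, and submit are hypothetical stand-ins for a low-level API (D3D12/console style), not any real library:
Code:
#include <thread>
#include <vector>

// Hypothetical stand-ins for a low-level graphics API.
struct CommandList { /* recorded GPU commands would live here */ };
CommandList recordCommands(int first, int last) { return {}; } // stub
void submit(const std::vector<CommandList>&) {}                // stub

void renderFrame(int objectCount, int workerCount) {
    std::vector<CommandList> lists(workerCount);
    std::vector<std::thread> workers;
    int perWorker = objectCount / workerCount; // remainder ignored in this sketch

    // Each core records commands for its own slice of the scene in parallel,
    // instead of one thread pushing everything through a high-level API.
    for (int i = 0; i < workerCount; ++i)
        workers.emplace_back([&lists, i, perWorker] {
            lists[i] = recordCommands(i * perWorker, (i + 1) * perWorker);
        });
    for (std::thread& t : workers) t.join();

    // Only the final submission is serialized on the "master" thread.
    submit(lists);
}

int main() { renderFrame(10000, 4); return 0; }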
RISC processors back in the 90s were a lot more efficient than x86 processors. I had my hands on a number of enterprise and corporate-level machines (HP-UX, SPARC, AIX, etc.), and they were doing stuff that x86 couldn't do until dual CPUs or dual cores. The Xbox 360's PowerPC is still a powerful processor today, as is the PlayStation 3's Cell processor (some say it can act as a supercomputer).
Both the PS4 and Xbox One are faster and produce better graphics than their previous gen counterparts. All the games they publish for the systems run perfectly smooth in my experience. So what impact is the CPU really having (and what makes the AMD chip so bad, really)?
These aren't PCs and are purpose built for console games. What is the point of this thread?
We have a lot of recent AAA multiplat games on PC that are very CPU-intensive, pushing i5s to the limit and hammering i7s hard, so it seems a powerful CPU is still required?
RISC processors back in the 90s were a lot more efficient than x86 processors. I had my hands on a number of enterprise and corporate-level machines (HP-UX, SPARC, AIX, etc.), and they were doing stuff that x86 couldn't do until dual CPUs or dual cores. The Xbox 360's PowerPC is still a powerful processor today, as is the PlayStation 3's Cell processor (some say it can act as a supercomputer).
Graphics quality is primarily GPU. Game complexity is primarily CPU.
The big advancement to Xbox One/PS4 is that games no longer require crazy optimizations to perform well, but all the big companies already built game engines with crazy optimizations.
But at maxed-out performance, it does seem the PS4/Xbox One CPUs just barely beat the Cell in real workloads. The Xbox 360 is much further behind, because its best cases weren't as good as the Cell's, even though its average or worst cases could be better.
We don't have many benchmarks to go off of, but from what's been posted by developers:
In a floating-point-heavy graphics benchmark, the Xbox One/PS4 are roughly equal to the Cell, and all are 3x the speed of the Xbox 360 CPU. And I think a GPU-equivalent program was about 4x the speed of the best CPU results.
In a multi-threaded draw calls/s benchmark, PS4 is about 50% better than Cell.
As more and more FP-heavy code moves from the CPU to the GPU, these CPUs should look better, but they're still not a huge amount faster than last gen; they just require far less optimizing to max out.
As far as limitations go, having CPUs that aren't much faster keeps us at 30 fps in games. It also keeps the number of interactive elements (AI, physics, particle effects, etc.) at about the same levels as before. We have so much more memory, but not much more CPU power.
This can already be seen limiting games like Assassin's Creed Unity and Dead Rising 3. However, more and more computation will switch to GPU compute, which will help.
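To make the "computation switching to GPU compute" point concrete: work like the particle update below is FP-heavy and independent per element, so each loop iteration maps directly onto one GPU compute thread. This is just an illustrative CPU version, not code from any game mentioned:
Code:
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// FP-heavy, per-element independent work: exactly the kind of job that
// moves from CPU to GPU compute. On a GPU, each iteration of this loop
// would become one compute-shader thread.
void updateParticles(std::vector<Particle>& ps, float dt) {
    for (Particle& p : ps) {
        p.vy -= 9.8f * dt;  // gravity
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

int main() {
    std::vector<Particle> ps(100000, Particle{0, 100, 0, 1, 0, 0});
    for (int frame = 0; frame < 60; ++frame)
        updateParticles(ps, 1.0f / 60.0f);
    return 0;
}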
Quite possibly due to draw call limits and API overhead. Consoles basically have ways to bypass their APIs, and while DirectX does offer some features to improve draw call performance, it's not as good, and those features may not even be used in a port. Also, DirectX doesn't multi-thread well prior to DX12. OpenGL does offer ways to bypass the API, but only recently, and I can't remember the last time a major game used OpenGL.
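For a sense of where that overhead goes: every draw call pays a roughly fixed CPU cost in the API and driver, so engines batch many objects into one instanced call where they can. The draw and drawInstanced functions here are hypothetical stand-ins, not a real API:
Code:
// Hypothetical API stand-ins; assume each call costs a roughly fixed
// amount of CPU time for validation, state setup, and packet building.
void draw(int meshId) {}
void drawInstanced(int meshId, int instanceCount) {}

// Naive: one draw call per tree = treeCount * per-call CPU overhead.
void renderForestNaive(int treeCount) {
    for (int i = 0; i < treeCount; ++i)
        draw(/*meshId=*/42);
}

// Batched: per-instance data (positions, etc.) lives in a GPU buffer,
// so the CPU pays the per-call overhead exactly once.
void renderForestBatched(int treeCount) {
    drawInstanced(/*meshId=*/42, treeCount);
}

int main() {
    renderForestNaive(10000);   // 10,000 calls' worth of CPU overhead
    renderForestBatched(10000); // 1 call's worth
    return 0;
}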
Also, console games typically run at 30fps or below, which is basically considered unplayable on a PC.
There are other weird things too. Disk streaming performance seems worse on PCs (possibly due to DRM or not having exclusive access to the disk), so you need a faster system to make up for the poor disk streaming performance. Also, I'm pretty sure the recent Batman game used an outdated version of the PhysX library that wasn't well optimized for CPUs on PC.
Also, for transfers from CPU to GPU RAM (or vice versa), the consoles are faster than any PC, once again requiring an even faster CPU to speed up other parts of the processing.
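A minimal sketch of one way PC games hide slow streaming: read the next chunk off disk asynchronously while the CPU processes the current one. The file name and chunk size are made up for illustration:
Code:
#include <fstream>
#include <future>
#include <vector>

// Read one chunk of asset data from disk.
std::vector<char> readChunk(std::ifstream& file, std::size_t size) {
    std::vector<char> buf(size);
    file.read(buf.data(), static_cast<std::streamsize>(size));
    buf.resize(static_cast<std::size_t>(file.gcount())); // shrink on EOF
    return buf;
}

void processChunk(const std::vector<char>& chunk) { /* decompress, upload... */ }

int main() {
    std::ifstream file("assets.pak", std::ios::binary); // hypothetical pack file
    const std::size_t kChunk = 1 << 20;                 // 1 MiB per read

    auto pending = std::async(std::launch::async, readChunk,
                              std::ref(file), kChunk);
    while (true) {
        std::vector<char> chunk = pending.get();
        if (chunk.empty()) break;
        // Kick off the next read so the disk stays busy while we work.
        pending = std::async(std::launch::async, readChunk,
                             std::ref(file), kChunk);
        processChunk(chunk);
    }
    return 0;
}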
RISC versus CISC doesn't really matter. Intel's CPUs are among the most powerful CPUs you can find by any metric. AMD's console CPUs? Not so much. It's just expensive CPUs (the Cell was something like $2 billion in R&D and pushed manufacturing to its limits at the time, basically as expensive as a true quad core while most PCs were single or dual core) versus cheap CPUs.
I guess people find this fun on a purely theoretical level, but it's absolutely meaningless when features and game availability are exclusive to a certain console.
...
This generation is boring as crap. With both sides just being x86 computers, we know EXACTLY how powerful they are, minus this mysterious "console optimization" fudge factor (which should really be a "how lazy are the developers porting the PC version?" fudge factor). Unlike a 360 or a PS3, both have CPUs and GPUs that are like products we can buy, which means the only debate left is how much fudge factor one side has vs the other.
In fact, I would argue it's not exclusives that kill the debate. It is those cross-platform games. When they are made for the lowest common denominator platform (the Xbone) then the difference in power doesn't matter. THEN the debate is purely theoretical because the developer isn't using the difference. Exclusives are not really part of the debate because almost every decent console has exclusives, and often they are first party titles that have some extra special sauce that only a team close to the hardware could pull off.
I do see a lot more talk about exclusives nowadays than ever, though, mostly as a way to end discussion. "No, we can't compare the Xbone to the PS4, or the PS4 to the PC, because exclusives make them all so different!" In reality, most titles aren't exclusive, and now more than ever a gamer can't really make a bad choice when they choose which platform to game on, as most of the top games will find their way to all of them.