Xbox 360 and Xbone: which CPU has better single-core performance?

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
The 360 has a 3.2GHz PowerPC CPU while the Xbone has a 1.6GHz tablet CPU. Which is faster per core?
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
PowerPC is practically ARM (both RISC), which is what's found in most tablets/smartphones. The Xbone has a common x64 processor, the same kind we have in desktops, although it does have the performance of a tablet CPU.

Anyway, they are different architectures, so you can't really compare them.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
I heard somewhere that in vector performance, the Xbox 360 CPU is quite good, even close to equivalently clocked x86 levels. But it takes a lot of optimization to get that good, because the cores are in-order cores.

It does have a massive clock speed advantage, but clock for clock, in most cases the Xbox One CPU will be faster. Is it sufficiently faster clock for clock to make up for the clock speed disadvantage? I would think in most cases it is. Maybe in some edge cases, like vector calculations, the Xbox 360 will be faster, but for most other things I'd expect the Xbox One to be faster. Yes, I am talking single-threaded, not taking the extra cores into account.
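
To show what "a lot of optimization" means on an in-order core, here's a minimal C++ sketch (a toy example of mine, nothing console-specific): a naive dot product serializes every iteration on one accumulator, so an in-order core stalls for the full floating-point latency each time, while an out-of-order x86 core extracts the parallelism on its own. Splitting the sum across independent accumulators is the kind of hand-tuning Xenon code needed.

Code:
#include <cstddef>

// Naive version: each iteration depends on the previous one through
// 'sum', so an in-order core stalls on the FP-add latency every time.
float dot_naive(const float* a, const float* b, std::size_t n) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        sum += a[i] * b[i];
    return sum;
}

// Same math with four independent accumulators, exposing the
// parallelism explicitly. An out-of-order core finds this on its own;
// an in-order core needs it spelled out.
float dot_unrolled(const float* a, const float* b, std::size_t n) {
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += a[i + 0] * b[i + 0];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    for (; i < n; ++i)  // leftover elements
        s0 += a[i] * b[i];
    return (s0 + s1) + (s2 + s3);
}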
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Best information I could find has the Xbox 360 Xenon topping out at 77 GFLOPS while the Cell is 230 GFLOPS. That's according to this old Forbes article. Their source is IBM, who of course makes those chips, so the numbers should be taken with a grain of salt.
http://www.forbes.com/free_forbes/2006/0130/076.html

Those are just theoretical maximum numbers, but it does put their performance in line with a lot of top-end modern CPUs. So yes, on peak floating point, the older chips are faster than the Jaguar.
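
For what it's worth, those peak numbers are easy to reproduce from first principles. Here's a back-of-the-envelope check (the FLOPs-per-cycle figure is my assumption of a 4-wide single-precision fused multiply-add per vector unit, not something from the Forbes piece):

Code:
#include <cstdio>

int main() {
    // Peak GFLOPS = cores x GHz x FLOPs per cycle per core.
    // Assumption: each VMX128/SPU unit does a 4-wide single-precision
    // fused multiply-add, i.e. 8 FLOPs per cycle.
    double xenon = 3 * 3.2 * 8;        // 3 PPC cores @ 3.2 GHz = 76.8
    double cell  = (1 + 8) * 3.2 * 8;  // 1 PPE + 8 SPEs @ 3.2 GHz = 230.4
    std::printf("Xenon peak: %.1f GFLOPS\n", xenon);  // the article's ~77
    std::printf("Cell peak:  %.1f GFLOPS\n", cell);   // the article's ~230
    return 0;
}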

Both the PS3's and 360's CPUs are based on the Cell Broadband Engine Architecture.

The Cell is a weird chip. It uses 8 cores but doesn't work the same way a traditional multi-core CPU does, which is why they're so difficult to program for. There is one master core (the Power Processing Element) and 7 slave cores (Synergistic Processing Elements). I've read the Wikipedia article on it and I'm still confused about how it actually works, other than it's very good at single-precision floating point tasks. Whatever that means. There was a lot of math jargon involved.

The Cell was created to bridge the gap between CPU and GPU performance. Now a lot of that stuff can be offloaded to the GPU, which is why the PS4 and XB1 can get away with comparatively weak CPUs.

The 360 Xenon is basically 3 modified Power Processing Elements, made to form a traditional tri-core CPU.
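
If a picture helps, the master/slave arrangement is roughly like the sketch below, written with plain C++ threads purely as an analogy (the real thing uses IBM's SPE runtime and DMA into each SPE's 256 KB local store, none of which appears here): the "PPE" hands out jobs, and each "SPE" copies its input into a small private buffer, crunches it, and copies the result back.

Code:
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <thread>
#include <vector>

constexpr std::size_t kChunk = 256;  // stand-in for the tiny local store

// One "SPE" job: pull a chunk into private memory ("DMA in"), do the
// math there, push the result back out ("DMA out").
void spe_job(const float* in, float* out, std::size_t n) {
    float local[kChunk];
    for (std::size_t base = 0; base < n; base += kChunk) {
        std::size_t len = std::min(kChunk, n - base);
        std::memcpy(local, in + base, len * sizeof(float));   // "DMA in"
        for (std::size_t i = 0; i < len; ++i)
            local[i] *= 2.0f;                                 // the work
        std::memcpy(out + base, local, len * sizeof(float));  // "DMA out"
    }
}

int main() {
    constexpr std::size_t kN = 1024, kSpes = 7;  // 7 usable SPEs in a PS3
    std::vector<float> in(kN * kSpes, 1.0f), out(kN * kSpes);
    std::vector<std::thread> spes;
    for (std::size_t s = 0; s < kSpes; ++s)      // the "PPE" dispatches
        spes.emplace_back(spe_job, in.data() + s * kN,
                          out.data() + s * kN, kN);
    for (auto& t : spes) t.join();               // wait for all "SPEs"
    std::printf("out[0] = %.1f\n", out[0]);      // prints 2.0
    return 0;
}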
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Xbox One wins by basically any metric.
Comparing against the Cell would be a closer competition, though; the Cell's much higher peak floating-point performance would be tough for the Xbox One to overcome on a mixed workload.
 

Sonikku

Lifer
Jun 23, 2005
15,851
4,788
136
This shouldn't even be a comparison. Why did they go with such awful CPUs for next gen?
 

tdawg

Platinum Member
May 18, 2001
2,215
6
81
Sonikku said:
This shouldn't even be a comparison. Why did they go with such awful CPUs for next gen?

Both the PS4 and Xbox One are faster and produce better graphics than their previous-gen counterparts. All the games they publish for the systems run perfectly smoothly in my experience. So what impact is the CPU really having (and what makes the AMD chip so bad, really)?

These aren't PCs; they're purpose-built for console games. What is the point of this thread?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Sonikku said:
This shouldn't even be a comparison. Why did they go with such awful CPUs for next gen?

You mean current gen, and it's because of cost.

tdawg said:
Both the PS4 and Xbox One are faster and produce better graphics than their previous-gen counterparts. All the games they publish for the systems run perfectly smoothly in my experience. So what impact is the CPU really having (and what makes the AMD chip so bad, really)?

These aren't PCs; they're purpose-built for console games. What is the point of this thread?

I agree. For the purpose laid out, they work fine, and the 360 and PS3 can't even touch the graphics we have seen from the Xbox One and PS4. The difference is huge. I don't know how people can't see that.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Sonikku said:
This shouldn't even be a comparison. Why did they go with such awful CPUs for next gen?

Cost, mostly. You can go out and buy the quad-core version of the PS4's CPU for $50. They could have used more powerful Kaveri (A-series) or Vishera (FX-series) chips, but those would have cost more, not to mention the higher power and cooling requirements.

Both the PS3 and 360 predate GPGPU technologies. A lot of processing tasks are now handled by the GPU, so a powerful CPU isn't required anymore.
 

smackababy

Lifer
Oct 30, 2008
27,024
79
86
This thread reminds me of the Megahertz war before Conroe. In other words "herp a derp, they be clocked slower omgbbq they must be slower".
 

Rezist

Senior member
Jun 20, 2009
726
0
71
I see a lot of posters complaining in forums all over that consoles are causing the porting issues with PC games like Arkham Knight, etc.

It's still up to the company making the game to port it. GTA V took so long because, it seems, there wasn't enough money to justify redoing the game without being able to sell it on next-gen consoles.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Rezist said:
I see a lot of posters complaining in forums all over that consoles are causing the porting issues with PC games like Arkham Knight, etc.

It's still up to the company making the game to port it. GTA V took so long because, it seems, there wasn't enough money to justify redoing the game without being able to sell it on next-gen consoles.

In theory it should be easier to port console games to PC now, but it all boils down to terrible business practices by the publisher. WB decided that Rocksteady should put their resources into making day-one DLC for Arkham Knight, so the PC port was handed off to Iron Galaxy, the same folks who ported Arkham Origins, which also turned into a buggy mess. A lot of PC ports end up being rushed to meet launch schedules, so they don't get properly QA tested or optimized for Windows; they just adopt a "patch it later" attitude. We've seen this same mentality sneaking into consoles as well, now that Sony and MS have lifted patch size restrictions.

Publishers got away with it because gamers had no way to get their money back if the game was broken. Now that Steam is offering no-questions-asked refunds, that should lead to some big changes. AK was pulled from store shelves because WB, for the first time ever, was losing money from all the refund demands.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
mmntech said:
A lot of processing tasks are now handled by the GPU, so a powerful CPU isn't required anymore.

We have a lot of recent AAA multiplatform games on PC that are very CPU-intensive, pushing i5s to the limit and hammering i7s hard, so it seems a powerful CPU is still required?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
futurefields said:
We have a lot of recent AAA multiplatform games on PC that are very CPU-intensive, pushing i5s to the limit and hammering i7s hard, so it seems a powerful CPU is still required?


It depends. On a console with a strict set of hardware that does not vary, you can program GPGPU tasks that you can't do easily on the PC, because the hardware functions differently between brands. There also isn't a low-level API that works universally on Windows yet; not until DX12, and then we will see better resource management.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
While PCs can be very powerful game machines, they're pretty inefficient at it. In the simplest terms, games don't have direct access to the GPU. Rather, they have to call up the CPU first, which calls up the graphics API, processes the data, and sends it to the graphics driver, which then sends it to the GPU. That adds a lot of overhead and takes CPU power away from other tasks.

AnandTech said:
The result is that when looking at single threaded CPU performance, GPUs have greatly outstripped CPU performance growth. This in and of itself isn’t necessarily a problem, but it does present a problem when coupled with the high level APIs used for PC graphics. The bulk of the work these APIs do in preparing data for GPUs is single threaded by its very nature, causing the slowdown in CPU performance increases to create a bottleneck. As a result of this gap and its ever-increasing nature, the potential for bottlenecking has similarly increased; the price of abstraction is the CPU performance required to provide it.

Low level programming in contrast is more resistant against this type of bottlenecking. There is still the need for a “master” thread and hence the possibility of bottlenecking on that master, but low level programming styles have no need for a CPU-intensive API and runtime to prepare data for GPUs. This makes it much easier to farm out work to multiple CPU cores, protecting against this bottlenecking. To use consoles as an example once again, this is why they are capable of so much with such a (relatively) weak CPU, as they’re better able to utilize their multiple CPU cores than a high level programmed PC can.
http://www.anandtech.com/show/7889/...w-level-graphics-programming-comes-to-directx

DirectX 12 allows for more direct access to the GPU, which frees up CPU cycles and allows the GPU to do more work without having to wait. At least that's how I understand it. Feel free to correct me if I'm wrong.

Another bottleneck in PCs is memory. System RAM is a lot slower than VRAM, so if the game has to go through the CPU first, system memory adds to the lag time and the GPU has to wait longer. Console games get around this by using unified memory: the CPU and GPU share the same bank. It wouldn't be practical for PCs to do this. VRAM isn't well suited for general computing, and system RAM doesn't have enough bandwidth for high-performance graphics.
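
As a toy illustration of the AnandTech point above (generic C++ threads, not any real graphics API; all the names here are made up): under a high-level API, one thread prepares every draw, while under a low-level API each worker records its own command list in parallel and only the final submit stays on the master thread.

Code:
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

struct Cmd { int draw_id; };          // stand-in for a recorded draw call
using CmdList = std::vector<Cmd>;

int main() {
    constexpr int kWorkers = 4, kDrawsPerWorker = 10000;
    std::vector<CmdList> lists(kWorkers);
    std::vector<std::thread> workers;
    // Each worker records its own command list in parallel; this is the
    // part a high-level API forces onto a single thread.
    for (int w = 0; w < kWorkers; ++w)
        workers.emplace_back([&lists, w] {
            lists[w].reserve(kDrawsPerWorker);
            for (int d = 0; d < kDrawsPerWorker; ++d)
                lists[w].push_back(Cmd{w * kDrawsPerWorker + d});
        });
    for (auto& t : workers) t.join();
    // Only submission remains single-threaded on the "master" thread.
    std::size_t total = 0;
    for (const auto& l : lists) total += l.size();
    std::printf("submitted %zu recorded draws\n", total);
    return 0;
}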
 
Last edited:

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
RISC processors back in the 90s were a lot more efficient than x86 processors. I had my hands on a number of enterprise and corporate-level machines (HP-UX, SPARC, AIX, etc.), and they were doing stuff that x86 couldn't do until dual CPUs or dual cores. The Xbox 360's PowerPC is still a powerful processor today, as is the PlayStation 3's Cell processor (some say it can act as a supercomputer).
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
TeknoBug said:
RISC processors back in the 90s were a lot more efficient than x86 processors. I had my hands on a number of enterprise and corporate-level machines (HP-UX, SPARC, AIX, etc.), and they were doing stuff that x86 couldn't do until dual CPUs or dual cores. The Xbox 360's PowerPC is still a powerful processor today, as is the PlayStation 3's Cell processor (some say it can act as a supercomputer).

IIRC, Apple used to claim the G4 was something like 30% faster clock for clock while being more energy efficient. I see no reason to doubt those claims, as my old iBook G4 was plenty fast and had outstanding battery life for the time.

Apple never really said why they ditched PowerPC, AFAIK, but it probably had to do with heat and cost. The G5 ran so hot that some Power Macs of the era had to be water-cooled. No good for notebooks. Even the Cell-based chips ran too hot, which is a big part of why early 360s and PS3s cooked themselves to death.

The Cell is still a crazy powerful processor, but it was never the commercial success IBM hoped it would be. It never really saw the light of day outside game consoles. IBM was using it in servers at one point, but hasn't for a few years now.
 
Last edited:

Graze

Senior member
Nov 27, 2012
468
1
0
TeknoBug said:
RISC processors back in the 90s were a lot more efficient than x86 processors. I had my hands on a number of enterprise and corporate-level machines (HP-UX, SPARC, AIX, etc.), and they were doing stuff that x86 couldn't do until dual CPUs or dual cores. The Xbox 360's PowerPC is still a powerful processor today, as is the PlayStation 3's Cell processor (some say it can act as a supercomputer).


This RISC argument doesn't hold any water today. You need to read up on today's x86 chips. x86 has evolved so much that it is a completely different design than it was in the 90s, and modern chips are very RISC-like internally. Where have you been?

Edit:// I just read your post again and you are definitely living in the past. There is nothing supercomputer-class today about a cluster of PS3s.
 
Last edited:

Graze

Senior member
Nov 27, 2012
468
1
0
I also remember reading that those G5 machines were dogs. I always wanted one, though, it being the last of the PowerPC architecture. But I passed when a friend who did video editing offered me one, since it's basically worthless now, really :(

http://barefeats.com/imcd3.html
 
Last edited:

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
Current-gen console CPUs have poor SSE etc. speed; AMD just didn't even try to get decent performance there.
Thus the old-gen CPUs can be a lot faster in cases where you just do a lot of simple math (especially Cell, as each SPU had ~25 GFLOPS of performance and its own 25 GB/s of bandwidth to its own local memory).

Of course, the old gen was quite bad at general code, so in such tasks the new CPUs win quite easily.

Developers now have CPUs which are relatively easy to program and have somewhat decent performance, but if a program requires a lot of processing power they can become constrained (the maximum is around ~100 GFLOPS).
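
Those two figures check out on the back of an envelope, under my own assumption of 8 single-precision FLOPs per cycle per core (a 4-wide multiply plus a 4-wide add on Jaguar's 128-bit FPU, or a 4-wide fused multiply-add on an SPU), at the 1.6 GHz the OP quotes:

Code:
#include <cstdio>

int main() {
    // Peak GFLOPS = cores x GHz x FLOPs per cycle (my assumptions above).
    std::printf("8 Jaguar cores: %.1f GFLOPS\n", 8 * 1.6 * 8.0);  // ~100
    std::printf("One Cell SPU:   %.1f GFLOPS\n", 1 * 3.2 * 8.0);  // ~25
    return 0;
}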
 
Last edited:

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
tdawg said:
Both the PS4 and Xbox One are faster and produce better graphics than their previous-gen counterparts. All the games they publish for the systems run perfectly smoothly in my experience. So what impact is the CPU really having (and what makes the AMD chip so bad, really)?

These aren't PCs; they're purpose-built for console games. What is the point of this thread?

Graphics quality is primarily GPU. Game complexity is primarily CPU.

The big advancement with the Xbox One/PS4 is that games no longer require crazy optimizations to perform well, but all the big companies had already built game engines with crazy optimizations.

But at maxed performance, it does seem the PS4/Xbox One CPUs just barely beat the Cell in real workloads. The Xbox 360 is much further behind, because its best cases weren't as good as the Cell's, even though its average or worst cases could be better.

We don't have many benchmarks to go off of, but from what's been posted by developers:
In a floating-point-heavy graphics benchmark, Xbox One/PS4 are roughly equal to Cell, and all are 3x the speed of the Xbox 360 CPU. And I think a GPU-equivalent program was about 4x the speed of the best CPU results.
In a multi-threaded draw-calls-per-second benchmark, PS4 is about 50% better than Cell.

As more and more FP-heavy code moves from the CPU to the GPU, the CPUs should look better, but they're still not a huge amount faster than last gen; they just require far less optimizing to max out.

As far as limitations go, having CPUs that aren't much faster keeps us at 30fps in games. It also keeps the number of interactive elements (AI, physics, particle effects, etc.) at about the same levels as before. We have so much more memory, but not much more CPU power.
This can already be seen limiting games like Assassin's Creed Unity and Dead Rising 3. However, more and more computation will switch to GPU compute, which will help.

futurefields said:
We have a lot of recent AAA multiplatform games on PC that are very CPU-intensive, pushing i5s to the limit and hammering i7s hard, so it seems a powerful CPU is still required?

Quite possibly due to draw-call limits and API overhead. Consoles basically have ways to bypass their APIs, and while DirectX does offer some features to improve draw-call performance, they're not as good, and those features may not even be used in a port. Also, DirectX doesn't multi-thread well prior to DX12. OpenGL does offer ways to bypass the API, but only recently, and I can't remember the last time a major game used OpenGL.

Also, console games typically run at 30fps or below, which is basically considered unplayable on a PC.

There are other weird things too. Disk streaming performance seems worse on PCs (possibly due to DRM or not having exclusive access to the disk), so you need a faster system to make up for it. Also, I'm pretty sure the recent Batman game used an outdated version of the PhysX library that wasn't well optimized for CPUs on PC.

Also, for transfers from CPU to GPU RAM (or vice versa), the consoles are faster than any PC, which once again means the PC needs an even faster CPU to speed up other parts of the processing.

TeknoBug said:
RISC processors back in the 90s were a lot more efficient than x86 processors. I had my hands on a number of enterprise and corporate-level machines (HP-UX, SPARC, AIX, etc.), and they were doing stuff that x86 couldn't do until dual CPUs or dual cores. The Xbox 360's PowerPC is still a powerful processor today, as is the PlayStation 3's Cell processor (some say it can act as a supercomputer).

RISC versus CISC doesn't really matter. Intel's CPUs are among the most powerful CPUs you can find by any metric. AMD's console CPUs? Not so much. It's just expensive CPUs (Cell was something like $2 billion in R&D and pushed manufacturing to its limits at the time, making it basically as expensive as a true quad-core while most PCs were single- or dual-core) versus cheap CPUs.
 

tdawg

Platinum Member
May 18, 2001
2,215
6
81
Fox5 said:
Graphics quality is primarily GPU. Game complexity is primarily CPU.

The big advancement with the Xbox One/PS4 is that games no longer require crazy optimizations to perform well, but all the big companies had already built game engines with crazy optimizations.

But at maxed performance, it does seem the PS4/Xbox One CPUs just barely beat the Cell in real workloads. The Xbox 360 is much further behind, because its best cases weren't as good as the Cell's, even though its average or worst cases could be better.

We don't have many benchmarks to go off of, but from what's been posted by developers:
In a floating-point-heavy graphics benchmark, Xbox One/PS4 are roughly equal to Cell, and all are 3x the speed of the Xbox 360 CPU. And I think a GPU-equivalent program was about 4x the speed of the best CPU results.
In a multi-threaded draw-calls-per-second benchmark, PS4 is about 50% better than Cell.

As more and more FP-heavy code moves from the CPU to the GPU, the CPUs should look better, but they're still not a huge amount faster than last gen; they just require far less optimizing to max out.

As far as limitations go, having CPUs that aren't much faster keeps us at 30fps in games. It also keeps the number of interactive elements (AI, physics, particle effects, etc.) at about the same levels as before. We have so much more memory, but not much more CPU power.
This can already be seen limiting games like Assassin's Creed Unity and Dead Rising 3. However, more and more computation will switch to GPU compute, which will help.

Quite possibly due to draw-call limits and API overhead. Consoles basically have ways to bypass their APIs, and while DirectX does offer some features to improve draw-call performance, they're not as good, and those features may not even be used in a port. Also, DirectX doesn't multi-thread well prior to DX12. OpenGL does offer ways to bypass the API, but only recently, and I can't remember the last time a major game used OpenGL.

Also, console games typically run at 30fps or below, which is basically considered unplayable on a PC.

There are other weird things too. Disk streaming performance seems worse on PCs (possibly due to DRM or not having exclusive access to the disk), so you need a faster system to make up for it. Also, I'm pretty sure the recent Batman game used an outdated version of the PhysX library that wasn't well optimized for CPUs on PC.

Also, for transfers from CPU to GPU RAM (or vice versa), the consoles are faster than any PC, which once again means the PC needs an even faster CPU to speed up other parts of the processing.

RISC versus CISC doesn't really matter. Intel's CPUs are among the most powerful CPUs you can find by any metric. AMD's console CPUs? Not so much. It's just expensive CPUs (Cell was something like $2 billion in R&D and pushed manufacturing to its limits at the time, making it basically as expensive as a true quad-core while most PCs were single- or dual-core) versus cheap CPUs.

Here's the point, though: you can't play Uncharted 4 on a PS3, for example, so the differences between the PS4's CPU/GPU and the PS3's CPU/GPU don't actually matter. And for games that are built for multiple consoles, such as FIFA or Madden, it's safe to say that everything will still look better and play better on the newer console.

I guess people find this fun on a purely theoretical level, but it's meaningless when features and game availability are exclusive to a certain console.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
tdawg said:
I guess people find this fun on a purely theoretical level, but it's meaningless when features and game availability are exclusive to a certain console.

This "fun" has been had for years. Almost every generation dating back to the Atari 2600 vs the Intellivision tried to compare not alike hardware to each other. The SNES vs the Genesis and its blast processing. The Saturn and its dual CPUs vs a PS1 made to take on polygons, etc. Part of the fun of gaming is comparing an Apple to an Orange. Exclusives have always been there to play the spoiler, but that didn't stop the debate.

Last generation was the best ever at that. In one corner we had the 360, with what was probably the first modern GPU inside. In the other corner we had a PS3 with a supercomputer CPU but a dated GPU. The 360 was probably the better-designed machine on paper, but worse in reality (RROD). It was a blast to compare the two, even if for many gamers the real question was Uncharted or Halo.

This generation is boring as crap. With both sides just being x86 computers, we know EXACTLY how powerful they are, minus this mysterious "console optimization" fudge factor (which should really be a "how lazy are the developers porting the PC version?" fudge factor). Unlike the 360 or the PS3, both have CPUs and GPUs that are like products we can buy, which means the only debate left is how much fudge factor one side has vs the other.

In fact, I would argue it's not exclusives that kill the debate. It is those cross-platform games. When they are made for the lowest-common-denominator platform (the Xbone), the difference in power doesn't matter. THEN the debate is purely theoretical, because the developer isn't using the difference. Exclusives are not really part of the debate, because almost every decent console has exclusives, and often they are first-party titles with some extra special sauce that only a team close to the hardware could pull off.

I do see a lot more talk about exclusives nowadays than ever, though, mostly as a way to end discussion: "No, we can't compare the Xbone to the PS4, or the PS4 to the PC, because exclusives make them all so different!" In reality, most titles aren't exclusive, and now more than ever a gamer can't really make a bad choice when they choose which platform to game on, as most of the top games will find their way to all of them.
 

tdawg

Platinum Member
May 18, 2001
2,215
6
81
poofyhairguy said:
...

This generation is boring as crap. With both sides just being x86 computers, we know EXACTLY how powerful they are, minus this mysterious "console optimization" fudge factor (which should really be a "how lazy are the developers porting the PC version?" fudge factor). Unlike the 360 or the PS3, both have CPUs and GPUs that are like products we can buy, which means the only debate left is how much fudge factor one side has vs the other.

In fact, I would argue it's not exclusives that kill the debate. It is those cross-platform games. When they are made for the lowest-common-denominator platform (the Xbone), the difference in power doesn't matter. THEN the debate is purely theoretical, because the developer isn't using the difference. Exclusives are not really part of the debate, because almost every decent console has exclusives, and often they are first-party titles with some extra special sauce that only a team close to the hardware could pull off.

I do see a lot more talk about exclusives nowadays than ever, though, mostly as a way to end discussion: "No, we can't compare the Xbone to the PS4, or the PS4 to the PC, because exclusives make them all so different!" In reality, most titles aren't exclusive, and now more than ever a gamer can't really make a bad choice when they choose which platform to game on, as most of the top games will find their way to all of them.

I was reading this thread as PS3/360 vs. PS4/XB1, and, speaking directly to PS3 vs PS4, what's the point of comparing two things that can't be used interchangeably? I understand comparing Intel to AMD or NVIDIA to AMD, since I can put any combination in my PC, start a PC game, and directly say whether one combination outperforms another.

In regard to consoles, since I can't interchange parts or build my own, I choose the one that lets me play one or more specific games, not the one that provides 2 fps more. That's why comparing previous-generation consoles to their new counterparts doesn't make much logical sense.

The PS4 vs XB1 debate seems a bit more legitimate given the cross-platform options, but in my experience, both look pretty damn good and play damn well in my living room.