What are the advantages/disadvantages of using an x86-based CPU in a console?

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I heard that it makes it easier for people to hack (that could be wrong, I don't really understand that), but then the Xbox was basically a PC-in-a-box and it's the only console of its generation that doesn't have an emulator. The PS2 is emulated pretty well, the GC fine, and the only reason nulldc hasn't caught up yet is because the developers are taking quite some time to make the DX11 plugin.

So, what are all the possible disadvantages and advantages of using an x86-based CPU in a console?

If I were designing a console released a year from now, I'd probably use a Phenom II X4 architecture (cores @ 3.5 GHz, and L3 cache @ 3 GHz) built on a 28 nm process. Since it would be 28 nm, it would have fairly low TDP. I'd probably use 4 GB of DDR3 @ 1.2 GHz with extra low latency timings including a 1T command rate.
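For what it's worth, the absolute latency those timings buy is easy to put in numbers. A quick sketch of the conversion (the CAS values below are hypothetical, just to show the arithmetic):

```python
# Toy arithmetic: what "extra low latency timings" actually buy in nanoseconds.
# The CAS values (7 vs 11) at a 1200 MHz memory clock are made-up examples,
# not taken from any real DDR3 spec.

def cas_latency_ns(cas_cycles: int, memory_clock_mhz: float) -> float:
    """Convert a CAS latency in clock cycles to nanoseconds.
    DDR timings are counted in memory-clock cycles, so absolute
    latency = cycles / clock frequency."""
    return cas_cycles / memory_clock_mhz * 1000.0  # 1/MHz -> microseconds -> ns

tight = cas_latency_ns(7, 1200.0)
loose = cas_latency_ns(11, 1200.0)
print(f"CAS 7  @ 1200 MHz: {tight:.2f} ns")   # -> 5.83 ns
print(f"CAS 11 @ 1200 MHz: {loose:.2f} ns")   # -> 9.17 ns
```

A few nanoseconds per access either way, which is why (as posters below note) console designs tend to spend the money on bandwidth instead.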
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
The Xbox was Microsoft's attempt to break into the console market (which worked quite well). It wasn't meant to be a money maker. In fact, it wasn't until the 360 that Microsoft finally recovered the cost of entry.

Using the x86 architecture was essentially done because it was a quick way to get into the market. The x86 arch isn't the fastest, the most specialized, the best at power consumption, or the cheapest. What it is, is well known. MS already had 99% of the software written to interact with it. Not only that, but at the time (and now) DirectX was one of the best-known APIs out there. They essentially spent very little on research to get the Xbox 1 out the door, but paid a hefty price for it.

After getting their name established, they were able to do a more custom, cheaper, more power-efficient design. The cost of each unit begins to drop when they do this.

For a new platform such as a console, using the x86 architecture really doesn't make sense. The only reason it is hobbling along today is legacy support (the only other reason being that compilers are EXTREMELY optimized for it, more so than for just about any other platform). This is why you don't really see anyone making a phone or console using x86. It is bloated, power hungry, and not specialized enough. It has been tacking stuff on since the 70s, and it shows.
 

Soleron

Senior member
May 10, 2009
337
0
71
The most successful console in each generation has never been the most powerful. It is software that makes a console; hardware performance is of little importance.

Especially this time, with the Wii and DS, it's been shown that if you make software with mass appeal (Wii Sports, 2D Mario, Call of Duty, Wii Fit, Minecraft), people don't need high-end graphics for it to sell 20m+ copies. The gameplay needs to be fun and involving.

So consoles don't need x86 for higher performance; I'd actually prefer if console power didn't rise from this gen to next so developers could focus on making good games without needing a budget that will bankrupt them (Lair) or focusing on narrative/cutscenes to the detriment of gameplay (Other M).

Look at the 3DS. Within a month it has missed its financial targets; it is selling at the level of the Wonderswan, far below the DS or even PSP at this point. Because it has no games people want, it is expensive due to the fast hardware, and the developers are obsessed with the 3D gimmick.
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
The most successful console in each generation has never been the most powerful. It is software that makes a console; hardware performance is of little importance.

Especially this time, with the Wii and DS, it's been shown that if you make software with mass appeal (Wii Sports, 2D Mario, Call of Duty, Wii Fit, Minecraft), people don't need high-end graphics for it to sell 20m+ copies. The gameplay needs to be fun and involving.

So consoles don't need x86 for higher performance; I'd actually prefer if console power didn't rise from this gen to next so developers could focus on making good games without needing a budget that will bankrupt them (Lair) or focusing on narrative/cutscenes to the detriment of gameplay (Other M).

Look at the 3DS. Within a month it has missed its financial targets; it is selling at the level of the Wonderswan, far below the DS or even PSP at this point. Because it has no games people want, it is expensive, and the developers are obsessed with the 3D gimmick.

This is true. But I would argue that price is also a big factor. The PS3 released at what, $700? While the Xbox 360 released at around $500. The Wii, somewhere around $200. Which was the most popular? The Wii, followed by the Xbox and then by the PS3. The primary driver of cost is the hardware, not the software.

Couple that with the fact that it took forever for the PS3 to get any good games, and that every report I heard about its SDK was "This thing is awful!", and you can see why the Xbox 360 beat the PS3. Yet the Wii was the overall winner because it was innovative. Innovation doesn't always pay off, but it did here.
 

smartpatrol

Senior member
Mar 8, 2006
870
0
0
The PS3 released at what, $700? While the Xbox 360 released at around $500. The Wii, somewhere around $200. Which was the most popular? The Wii, followed by the Xbox and then by the PS3.

No. PS3 launched at $599 (and also had a $499 model), 360 launched at $399 (and also had the HDD-less Arcade model at $299 IIRC). Wii launched at $249.

Don't forget the previous generation: Dreamcast at $199, PS2 at $299, Gamecube at $199, Xbox at $299. PS2 won by a long shot, followed by Xbox, Gamecube, and then Dreamcast.

Anyway, the only possible reason to choose x86 at this point would be if one of the OEMs got an excellent deal on them. I really don't see it happening for any of the next-gen systems.
 

ncalipari

Senior member
Apr 1, 2009
255
0
0
Pros:

1) Lower R&D cost

2) A big developer base

3) Possibly better outcome due to the known SW+HW combo

Cons:

1) Higher cost per unit

2) Higher TDP

3) Lower performance in games
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
No. PS3 launched at $599 (and also had a $499 model), 360 launched at $399 (and also had the HDD-less Arcade model at $299 IIRC). Wii launched at $249.
Sorry, I was relying on a quick google and looking at wiki prices.

Don't forget the previous generation: Dreamcast at $199, PS2 at $299, Gamecube at $199, Xbox at $299. PS2 won by a long shot, followed by Xbox, Gamecube, and then Dreamcast.
True, and this would probably be the place where software, form, and other things came into play.

Anyway, the only possible reason to choose x86 at this point would be if one of the OEMs got an excellent deal on them. I really don't see it happening for any of the next-gen systems.
Sort of agreed. Another reason, which I pointed out earlier, is fast deployment time. The original Xbox was truly nothing more than a PC in a fancy box, running Windows NT with a tweaked GUI.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,324
1,462
136
I heard that it makes it easier for people to hack (that could be wrong, I don't really understand that)
It's easier to hack because there is an insane amount of historical baggage in x86, and unless the console designers take it all into account, they risk leaving holes in their fences. (At least if they make a new one now, they won't leave the A20 gate unaccounted for... :D)

Xbox was basically a PC-in-a-box and it's the only console of its generation that doesn't have an emulator.
That's because there's no point -- as it was a PC-in-a-box, basically every Xbox game of interest also has a PC version, so why run an emulator?

If I were designing a console released a year from now, I'd probably use a Phenom II X4 architecture (cores @ 3.5 GHz, and L3 cache @ 3 GHz) built on a 28 nm process.
Phenoms are hand-designed for a SOI process, and the 28nm processes available a year from now will be bulk. There will not be any 28nm Phenoms. Also, any future console will likely be a "fusion" chip to save manufacturing cost, and frankly, the GPU is the more important part. You could probably get a good enough improvement over current generation by swapping out the GPU for a modern one, keeping the cpu as it is, and doubling-quadrupling the memory and memory bus.

I'd probably use 4 GB of DDR3 @ 1.2 GHz with extra low latency timings including a 1T command rate.

... why? For a console that makes no sense. The key point of making a console is cost-efficiency -- getting the best performance and reliability out of the least dollars spent per console produced. For that reason, you want to use the most easily mass-produced and cheapest memory possible, and extra-low-latency timings aren't it. Also, due to cost reduction, the very first thing to go is two separate memory buses for graphics and the CPU, so bandwidth will need to be emphasized over latency. For a console designed for about a year from now, the sanest memory choice would probably be 128-bit GDDR5, with a total of 1-2GB.
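To put rough numbers on the bandwidth side (the per-pin rates below are assumptions in the ballpark of early GDDR5 and DDR3, not quotes from any spec):

```python
# Back-of-envelope peak bandwidth: bytes per transfer times transfers per second.
# bus_width_bits / 8 gives bytes moved per transfer; data_rate in GT/s gives
# billions of transfers per second, so the product is GB/s.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak theoretical bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gt_s

# Assumed rates: ~4 GT/s per pin for GDDR5, 1.6 GT/s for DDR3-1600.
print(peak_bandwidth_gb_s(128, 4.0))   # 128-bit GDDR5 sketch -> 64.0 GB/s
print(peak_bandwidth_gb_s(128, 1.6))   # 128-bit DDR3-1600 sketch -> 25.6 GB/s
```

On these assumptions the GDDR5 bus moves roughly 2.5x the data of same-width DDR3, which is the trade the post describes: pay in latency, win big in bandwidth for the shared CPU+GPU bus.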

(the only other reason being that compilers are EXTREMELY optimized for it. More so than just about any other platform).
This is a very strong point in its favor, and this situation is not likely to change anytime soon.

It is bloated, power hungry, and not specialized enough. It has been tacking on stuff since the 70s and it shows.
Come on now, x86 processors aren't really x86 anymore. They are RISC processors with an x86 translation layer in front -- and this layer costs basically nothing in performance, very little in core cost, but quite a lot in power. Which is why x86 phones are so elusive. They might still happen, simply because Intel wants them to happen and is so far ahead of everyone else in manufacturing that it's possible that a node or two from now they can ship an x86 part that is competitive against ARM in the same power envelope, despite the disadvantages of the architecture.
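The translation-layer idea can be caricatured in a few lines: a CISC-style instruction that touches memory gets cracked into simple load/op/store micro-ops that a RISC-like core can execute. Everything here (the mnemonics, the `tmp` register, the uop format) is invented for illustration; real decoders are vastly more involved:

```python
# Toy sketch of an "x86 front end over a RISC core": one instruction that
# operates on memory is split into simple micro-ops, while a register-only
# instruction passes through as a single uop. All names are hypothetical.

def decode(instr: str) -> list[str]:
    op, dst, src = instr.replace(",", "").split()
    uops = []
    if dst.startswith("["):           # memory destination: read-modify-write
        addr = dst.strip("[]")
        uops.append(f"LOAD tmp, {addr}")            # bring the value in
        uops.append(f"{op.upper()} tmp, tmp, {src}")  # do the ALU work
        uops.append(f"STORE {addr}, tmp")           # write the result back
    else:                             # register destination: one simple uop
        uops.append(f"{op.upper()} {dst}, {dst}, {src}")
    return uops

print(decode("add [rbx], rax"))  # -> ['LOAD tmp, rbx', 'ADD tmp, tmp, rax', 'STORE rbx, tmp']
print(decode("add rcx, rdx"))    # -> ['ADD rcx, rcx, rdx']
```

The point of the caricature: the per-instruction work is trivial, but doing it for a sprawling legacy ISA, every cycle, at full width, is where the power cost the post mentions comes from.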

The reason nobody would make an x86 console today is not any feature of the processor architecture, but the supply situation. Unlike Power or ARM, you cannot really get the "blueprints" of the core for yourself, so you would have to tie the future of your game company to either AMD (... stupid, what if they go under?) or Intel (which would probably not let you manufacture their cores, stopping you from shopping around for better deals later on. And reducing the cost of the console as fast as possible is very important for console economics.)

So, what are all the possible disadvantages and advantages of using an x86-based CPU in a console?

Quick summary:
- supply situation
- power use
- harder to secure
- less likely to be able to customize and integrate properly
+ compilers and other tools
+ economies of scale even at small console production volumes

The only reason to make an x86 console that I can see is that you are too small a company to make your own custom hardware.
 

BladeVenom

Lifer
Jun 2, 2005
13,540
16
0
The Xbox was Microsoft's attempt to break into the console market (which worked quite well). It wasn't meant to be a money maker. In fact, it wasn't until the 360 that Microsoft finally recovered the cost of entry.

They still haven't recovered the costs. The Xbox lost $4 billion during its four years, and the 360 lost $3 billion in its first two years.
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
They still haven't recovered the costs. The Xbox lost $4 billion during its four years, and the 360 lost $3 billion in its first two years.

I thought they had. The only article I can find on the matter was from 2010 reporting a single quarter gain of $165 million from 360 sales.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
... why? for a console that makes no sense. The key point of making a console is cost-efficiency -- getting the best performance and reliability out of the least dollars spent per console produced. For that reason, you want to use the most easily mass-produced and cheapest memory possible, and extra low latency timings isn't it. Also, due to cost reduction, the very first thing to go is two separate memory buses for graphics and the CPU, so bandwidth will need to be emphasized over latency. For a console designed for about a year from now, the sanest memory choice would probably be 128-bit GDDR5, with a total of 1-2GB.

AFAIK GDDR5 memory is designed to perform best as graphics memory. I haven't seen it used elsewhere. Would it also work well as a system memory?
 

ncalipari

Senior member
Apr 1, 2009
255
0
0
AFAIK GDDR5 memory is designed to perform best as graphics memory. I haven't seen it used elsewhere. Would it also work well as a system memory?

Basically, GDDR is memory for situations where you want a lot of bandwidth without spending too much.

By soldering the memory onto the board you save a lot of money on interconnects; DDR of the same speed would be much more expensive because it has to support swappable memory modules.

Plus you get some secondary benefits, like filling an entire bank with the same data at the cost of one operation. But that's not a big difference from DDR.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Okay, thanks for the replies everyone :) It makes a lot of sense now. True, RISC CPUs have almost always been used in consoles, with the Genesis, NeoGeo, and original Xbox being exceptions, of course. I know the Motorola MC68000, which the former two used, was very popular, although it was said not to be as good for a console as the SNES's CPU because the MC68000 was a CISC processor, so it got less done per clock cycle and watt. Nintendo still should've clocked the SNES's CPU a little bit faster, though.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
The advantage/disadvantage is the same as that for any general hardware versus fixed/dedicated hardware solution.

General hardware is always about reducing cost per unit based on volumes and time to market while leveraging proven solutions (this is what makes ARM so popular in non-x86 marketspace) as a means of risk management and mitigation.

Fixed hardware is usually about improving performance, locking in a captured audience (gross margins, barrier to entry and barrier to exit) and locking out your competitors.

COTS - commercial off-the-shelf - system integration is for those with a weak development budget or a pressing timeline to get to market.

If you've got a reasonable timeline to develop your product, and a reasonable budget to do it with, then management is going to expect you to develop something that is as proprietary as possible as that enables them to leverage the market dynamics to lock out competition and increase gross margins on the product.

This is universally true of all business, whether it is consoles or pharmaceuticals or airplanes.
 

smartpatrol

Senior member
Mar 8, 2006
870
0
0
AFAIK GDDR5 memory is designed to perform best as graphics memory. I haven't seen it used elsewhere. Would it also work well as a system memory?

The Xbox 360 uses GDDR3 for its shared system/graphics memory.

Anyway GDDR5 is based on DDR3 so I don't see why it can't be used for system memory. The only drawback is that graphics memory is optimized for maximum bandwidth, not low latency.
 

Gundark

Member
May 1, 2011
85
2
71
I heard that it makes it easier for people to hack (that could be wrong, I don't really understand that), but then the Xbox was basically a PC-in-a-box and it's the only console of its generation that doesn't have an emulator. The PS2 is emulated pretty well, the GC fine, and the only reason nulldc hasn't caught up yet is because the developers are taking quite some time to make the DX11 plugin.

So, what are all the possible disadvantages and advantages of using an x86-based CPU in a console?

Speaking of emulation, you are not very well informed. The Xbox is poorly emulated because it is not well documented, there are only 3 developers interested in working on it (one of them recently retired from it), and its approach is quite unique and different from classic emulation (though its hardware requirements are very low).
DX11 is irrelevant; it doesn't offer more functionality or performance compared to DX9 other than a cleaner interface (except in some rare cases).
Actually, the Xbox was the most powerful console at the time (even if on paper it maybe didn't look it). Even the GC looks better than the PS2, especially when you compare the two on an emulator in full HD. And NullDC is working fine.

I believe that all of the last-gen consoles have been hacked. I remember some Microsoft executive saying they are not interested in increasing the security of their consoles, because they will get hacked anyway. Instead they will focus on content that makes people want to play on the Xbox 360.
 

Gundark

Member
May 1, 2011
85
2
71
I believe that an x86-CPU-based console could be cheaper to produce. Also, we see that the PS3 is becoming a general-purpose machine (but at that task it is less effective than x86).
 

lamedude

Golden Member
Jan 14, 2011
1,206
10
81
With the Xbox, I think Intel had a bunch of P3s and was willing to give MS a nice price on them, but that eventually came back to haunt MS: since they didn't own the rights to the chip (or to the NV2A), they couldn't shrink it or buy it from somewhere else. So while Sony was able to do this and get their chips' cost down for the PS2, MS was probably paying the same price for the P3 & NV2A in '05 as they were in '01 (which is what the MS/Nvidia lawsuit was about, IIRC). MS learned from this and owns the 360's chips. I doubt Intel would do that deal, or that AMD even could.
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
I believe that an x86-CPU-based console could be cheaper to produce. Also, we see that the PS3 is becoming a general-purpose machine (but at that task it is less effective than x86).

Very few people have the licensing required to produce an x86 CPU, let alone the licenses required to use the extensions (and Intel isn't really in a sharing mood). x86 CPUs are just too expensive for consoles.

A company like IBM or ARM, on the other hand, is more than happy to license out the tech needed to produce CPUs that are (or were) faster and more power-friendly than any x86 CPU available.

Bottom line, Microsoft and Sony get a great deal on the processors they are currently fabbing. They are pretty much only paying for fabbing costs (which translate down to something like $10 per CPU or so). This only works at the large scale.
 

Gundark

Member
May 1, 2011
85
2
71
Well, then it's more complicated than I thought. Anyway, let's wait and see whether Microsoft's next console will be Intel (or AMD) based.
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
Well, then it's more complicated than I thought. Anyway, let's wait and see whether Microsoft's next console will be Intel (or AMD) based.

It isn't going to be either. The question is more "Will it be ARM or IBM based?". The only console that has ever been Intel x86 (that I know of) was the Xbox. The rest have been with someone else.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Didn't the original Xbox use a scaled-down Pentium 3 at 600 MHz?

Really an x86 based console is just a PC in disguise, IMO. The advantages would be that you could tap into what Intel has to offer, which could give you amazing performance. You also gain compatibility with the PC, making the process of porting games over really easy.
 

lamedude

Golden Member
Jan 14, 2011
1,206
10
81
A Coppermine P3 at 733 MHz with 128K of L2 on a 133 MHz FSB.
MS never switched to Tualatin (Intel probably kept it priced higher), nor could they merge the CPU with the GPU like Sony did with the PS2 or MS later did with the Valhalla 360s. Because MS was unable to take these cost-cutting measures, they lost billions on the Xbox, and I doubt anyone is going to repeat their mistakes.
 

(sic)Klown12

Senior member
Nov 27, 2010
572
0
76
Didn't the original Xbox use a scaled-down Pentium 3 at 600 MHz?

Really an x86 based console is just a PC in disguise, IMO. The advantages would be that you could tap into what Intel has to offer, which could give you amazing performance. You also gain compatibility with the PC, making the process of porting games over really easy.

The original Xbox had a 733 MHz Coppermine CPU.

Any advantage x86 has in porting games to the PC doesn't mean anything when the hardware cost would be too high compared to licensing an IBM or ARM design. Neither Intel nor AMD will let another company license a design and have Microsoft or Sony manufacture it wherever they want. Microsoft found this out the hard way with the Xbox, when they couldn't renegotiate with Intel or Nvidia for lower costs as new processes came around.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I heard that it makes it easier for people to hack (that could be wrong, I don't really understand that),
No. Trying to use DRM in which the box contains the information needed to hack it makes it easier to hack. x86 might add a few more holes, but if you made it only support x86_64, and customized the MMU a bit, I'm sure all that could be taken care of, making it about as hackable as a current-gen console.

Really, IMO, they need to get their heads out of their asses, and realize that all they need to protect from hacks are their multiplayer software systems. Hell, document the whole console, and release those docs for free. But, add a memory locking feature that allows the hardware to verify CRCs, prevent further writing to that memory, and do a full-on SSH/SSL connection, to verify those numbers with the server, as you log on. Easy? No. Cheap? Not too much. But, it would make it much harder to hack, and if the HW was exposed, overall, it would also reduce the need and desire for hacking. They need secure monetary transactions, and a fair multiplayer environment. Nothing else needs to be closed, at least for the gaming side of things.
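A minimal sketch of that verify-before-online idea, using a plain hash over a memory image (the snapshot bytes, the "known good" digest, and doing this in software are all stand-ins; a real console would anchor the check in hardware and wrap the exchange in TLS, as the post suggests):

```python
# Sketch of the verify-then-lock idea: hash the critical code/data region and
# check it against a server-known value before allowing multiplayer. hashlib
# and hmac are real stdlib modules; the memory snapshot and expected digest
# below are invented stand-ins for illustration.

import hashlib
import hmac

def region_digest(memory: bytes) -> bytes:
    """SHA-256 over a locked memory region (stand-in for a hardware CRC/hash)."""
    return hashlib.sha256(memory).digest()

def verify_client(memory: bytes, expected_digest: bytes) -> bool:
    """Server-side check; constant-time compare so timing leaks nothing."""
    return hmac.compare_digest(region_digest(memory), expected_digest)

game_code = b"\x90" * 4096                 # pretend firmware/game image
known_good = region_digest(game_code)      # what the server has on file

print(verify_client(game_code, known_good))             # -> True: allowed online
print(verify_client(game_code + b"\x00", known_good))   # -> False: tampered
```

Note this only gates the multiplayer service, which is exactly the post's point: the box itself can stay open and documented as long as the server refuses clients whose locked region doesn't verify.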
The PS2 is emulated pretty well, the GC fine, and the only reason nulldc hasn't caught up yet is because the developers are taking quite some time to make the DX11 plugin.

So, what are all possible disadvantage and advantages of using an x86 based CPU in a console?
Neither AMD nor Intel can yet give you a CPU for it. Intel is closer, and both AMD and Intel are headed towards customized CPUs and SoCs, but they aren't there, yet. IBM has been there for a couple decades, to the point that they've been making customized CPUs for customers, not just customized SoCs.

The Xbox360 is a great example. I'm not sure why they did three cores, instead of four, but they used plain in-order cores, apparently power-efficiency-tweaked precursors to the current Power, gave them SMT*, the ability to use cache as scratch space, and a few other goodies. The performance was alright out of the gate, including .NET, with plenty of room for improvements, which got taken advantage of.

Can AMD or Intel just take a chunk of a CPU, and swap it out with another kind? Where is the low-space/low-power in-order Core i int execution unit?

...and that's why they're all IBM. ARM has some promise, and is a shoo-in for hand-helds, but it seems that IBM can offer more of what the console makers want and need, right now.

* which makes a lot more sense for an in-order CPU that is likely to be cache-starved, and have limited or no re-ordering ability, than your typical high-performance x86 CPU
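The footnote's claim, that SMT pays off most on an in-order core with long exposed stalls, can be illustrated with a toy round-robin scheduler (all numbers are invented; real pipelines are far more complex):

```python
# Toy model: an in-order core issues one instruction per cycle; a thread that
# hits a "miss" sleeps for `stall` cycles. With SMT, another runnable thread
# issues during the sleep, so the stall cycles are no longer wasted.

def run(threads: int, n_instr: int, miss_every: int, stall: int) -> int:
    """Return total cycles for `threads` threads to each run `n_instr`
    instructions, stalling `stall` cycles after every `miss_every`-th one."""
    remaining = [n_instr] * threads
    wake = [0] * threads      # cycle at which each thread is runnable again
    issued = [0] * threads
    cycle = 0
    while any(r > 0 for r in remaining):
        for t in range(threads):  # issue from the first runnable thread
            if remaining[t] > 0 and wake[t] <= cycle:
                remaining[t] -= 1
                issued[t] += 1
                if issued[t] % miss_every == 0:
                    wake[t] = cycle + 1 + stall  # simulated cache miss
                break
        cycle += 1
    return cycle

print(run(1, 1000, 10, 50))  # -> 5950: one thread, stalls fully exposed
print(run(2, 1000, 10, 50))  # -> 5960: twice the work in ~the same time
```

With these made-up numbers, a second hardware thread nearly doubles throughput, because each thread's stalls are mostly covered by the other's work. On an out-of-order x86 with big caches the exposed stalls are far smaller, so SMT's payoff shrinks, which is the footnote's point.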

I thought they had. The only article I can find on the matter was from 2010 reporting a single quarter gain of $165 million from 360 sales.
Nah. If they do make money from it, they will spend it all making the next gen even more badass. Gaming is incidental. Everybody knows that eventually the desktop PC will not be used by normal people for normal things. Nobody knows how this will happen, or when it will happen, though. Netflix on gaming consoles would be a good example of a first tentative step towards this. MS wants a box in your living room, a box in your pocket, and a tablet in your hand, when it finally does happen, so that they can adapt and capitalize on it. As long as Server and Office make enough money to subsidize gaming, they have no need to make money from it. When and if that scenario finally begins, having a high quality multiplatform software development environment will allow MS to do...whatever, to keep people making software for their systems.

People laugh at this idea, but then go use their phone and Wii for what they used to use their computer for :).
 