Why are RISC CPUs more common in game consoles?


Anarchist420

Diamond Member
Feb 13, 2010
Lower resolution, fewer colors.
Genesis games typically ran at a higher resolution, although the Genesis couldn't display as many on-screen colors and had a much smaller palette. I always thought it would've been nice if the Super NES had used a 320x224 mode like the Genesis did for most of its games.
 

zephyrprime

Diamond Member
Feb 18, 2001
RISC is just a better architecture for chips that are pipelined, and virtually all chips in the last 20 years have been. With pipelining, load, store, and execute are separate stages, so it makes sense to target those stages with instructions tailored to them. There's simply no point in making a CISC chip except for compatibility; that's why all CISC x86 chips have actually been RISC inside since the Pentium Pro, decoding x86 instructions into RISC-like micro-ops. With bigger and bigger transistor budgets, though, the cost of the CISC-to-RISC translation layer matters less and less.
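To make the micro-op point concrete, here's a rough sketch in Python (the mnemonics are made up for illustration, not real decoder output) of how a single memory-to-register CISC instruction splits into RISC-like micro-ops, one per pipeline stage:

```python
# Illustrative only: one CISC instruction that bundles a memory read, an ALU
# operation, and a memory write, decomposed into RISC-like micro-ops that
# each map onto a single pipeline stage. The mnemonics are invented.
cisc_instruction = "add [mem], eax"

micro_ops = [
    "load  tmp   <- [mem]",      # load stage: fetch the memory operand
    "add   tmp   <- tmp + eax",  # execute stage: pure ALU work
    "store [mem] <- tmp",        # store stage: write the result back
]

print(cisc_instruction, "decodes to:")
for uop in micro_ops:
    print(" ", uop)
```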
 

Magic Carpet

Diamond Member
Oct 2, 2011
Genesis games typically ran at a higher resolution, although the Genesis couldn't display as many on-screen colors and had a much smaller palette. I always thought it would've been nice if the Super NES had used a 320x224 mode like the Genesis did for most of its games.
In some games, the SNES did use 512x224, and there were plenty of games where the Genesis used 256x224 instead. The most noticeable difference was in the color palette, though: 9-bit versus 15-bit color. Here's a nice comparison.
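For reference, those bit depths translate directly into master-palette sizes; a quick back-of-the-envelope check (3 bits per RGB channel on the Genesis, 5 on the SNES):

```python
# Master palette sizes implied by the color depths mentioned above.
genesis_palette = 2 ** 9    # 9-bit RGB (3 bits per channel) -> 512 colors
snes_palette    = 2 ** 15   # 15-bit RGB (5 bits per channel) -> 32,768 colors

print(f"Genesis master palette: {genesis_palette} colors")
print(f"SNES master palette:    {snes_palette} colors")
```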
 

Blitzvogel

Platinum Member
Oct 17, 2010
For starters, Intel would probably demand quite a bit of money in licensing/royalty fees after the initial run, though I know Intel undercut the original deal MS had with AMD in order to win the contract for the Xbox's CPU. I don't know if that cost covered an initial batch or was a fixed price over the entire production run. Of course, the Nvidia debacle made MS pay dearly to get into the console race, but it also gave Nvidia a somewhat bad name in the market. It makes me wonder what kind of deal Sony and Nvidia have over the RSX.

Back on topic... I think IBM was probably more accustomed to creating custom architectures for clients, especially since they have embedded products and offer processors meant to be customized to the needs of the buyer. Intel seems to want to keep their processor development in line with the PC and server world, and that business would be affected by a processor contract for a console, since I would expect them to be unwilling to relinquish production duties and allow their own chip to be fabricated in a non-Intel plant.

The "real semiconductor companies have fabs" statement has merit, but the fabless model does too.
 

Idontcare

Elite Member
Oct 10, 1999
The NeoGeo, Genesis, and Xbox used CISC processors, but they were exceptions to the rule.

Do RISC processors have an absolute advantage in consoles, or just a cost-effectiveness advantage?

It is all cost-benefit analysis.

The allure of COTS (commercial off-the-shelf) hardware like CISC processors from AMD/Intel/Via is that someone else has already borne the development cost.

So for a systems integrator, which MS was for the Xbox, for example, the risk that comes with developing proprietary hardware is a pretty big barrier to entry, and the cost-benefit analysis tends to favor COTS.

But if you are already in the industry and have all your business infrastructure in place, such that developing new products is merely an additive expense on top of existing operating expenses, then developing custom hardware (which does involve licensing costs and barriers, a la x86 and Nvidia) is one way to lower your costs while maximizing your gross margins.

Custom hardware should be superior to general-purpose hardware in the specific area for which it is being customized. If it isn't, then you've done it wrong and you should be going with a COTS-based product development model.

In the end, no matter how you characterize it, the answer is very simple: $$$ (it all comes down to money)
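As a toy illustration of that cost-benefit framing, with every figure invented for the sake of the example: the break-even volume where custom silicon starts beating COTS is just the one-time engineering cost divided by the per-unit saving.

```python
# All numbers are hypothetical; the point is the shape of the trade-off.
cots_unit_cost   = 60.0    # per-chip price from an external COTS vendor
custom_nre       = 150e6   # one-time design/licensing cost for custom silicon
custom_unit_cost = 25.0    # per-chip cost once the custom design exists

break_even = custom_nre / (cots_unit_cost - custom_unit_cost)
print(f"custom hardware pays off beyond ~{break_even:,.0f} consoles sold")
```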
 

ShintaiDK

Lifer
Apr 22, 2012
The biggest issue, and basically the sole reason why AMD/Intel/VIA are never going to supply consoles (yes, the original Xbox had one, and even MS said never again), and why all the GPU designs are licensed designs, is that the console maker can't control the production chain, in terms of process improvement, volume, and cost.

And then there is basically only one alternative on the CPU side: PowerPC.
 

zephyrprime

Diamond Member
Feb 18, 2001
The biggest issue, and basically the sole reason why AMD/Intel/VIA are never going to supply consoles (yes, the original Xbox had one, and even MS said never again), and why all the GPU designs are licensed designs, is that the console maker can't control the production chain, in terms of process improvement, volume, and cost.

And then there is basically only one alternative on the CPU side: PowerPC.
Yeah, that's basically the problem. Initially, using COTS parts from AMD/Intel is cheaper, easier, and faster. However, AMD and Intel never produce a given chip for very long before it becomes obsolete and new designs come out. The console maker would be screwed because the chips it depended on would no longer be manufactured. Of course, for enough money, the chip designers can keep process-shrinking the chips and keep assembly lines open. However, at that point you are paying extra for non-commodity services, and doing so loses you the entire advantage of going with COTS in the first place. So console makers need to keep control of the designs.
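To put rough numbers on that last point (all of them made up for illustration): once a console maker has to pay a vendor to keep an obsolete part in production, the commodity price advantage evaporates.

```python
# Hypothetical figures only, showing why paying for non-commodity services
# erases the advantage of going COTS in the first place.
commodity_price    = 40.0   # per chip while the part is in volume production
keep_line_open_fee = 30e6   # yearly fee to keep fabbing a discontinued part
units_per_year     = 2e6    # console production volume

effective_price = commodity_price + keep_line_open_fee / units_per_year
print(f"effective per-chip cost once the part goes non-commodity: ${effective_price:.0f}")
```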