I don't understand why console makers are bent on being so power-conservative

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Don't the major console makers (Microsoft, Sony, Nintendo) make a ton of money? Even the Nintendo consoles that are considered failures, like the Wii/Wii U, make the company a substantial amount of profit through game sales and so on. They say Xbox 360s and PS3s were sold at a loss at the beginning from a hardware perspective, but wouldn't the revenue recouped from the game sales themselves completely negate that? What percentage of video game sales goes to the console company itself (i.e., what percentage of GTA V money do Sony and Microsoft get)?

The excuse that is constantly bandied about as to why Microsoft (and to a certain extent Sony) have to "underpower" their consoles just does not make any sense at all to me. Did the Xbox 360 and PS3, in total, not make their respective companies gobs of cash?

I just don't understand what a company like Microsoft has to lose by making their console actually really powerful. It will sell more in the long run, last longer, games will look better and sell better. I honestly don't think Microsoft saves themselves any money by underpowering their console, as they have clearly done with the Xbone compared to the OG Xbox and Xbox 360, which were very powerful for their time (and both made the company gobs of cash).
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
I think everything you asked can be traced back to the Red Ring of Death. The current-gen consoles were beasts when they first came out (even if they were a step below the 8800 GTX, the top-end video card at the time, they were coupled with 3 GHz tri-core CPUs, which was very high end as well), and as MS found out, beastly hardware needs beastly cooling solutions. Also, the smaller the fans, the faster they spin and the more noise they create. No one wants a console that sounds like a vacuum cleaner, so you have to strike a balance between a console that is big enough, fans that are just the right size not to be noisy, and hardware that can actually be cooled by such a solution.

It is no different now, especially when you consider that AMD is currently stuck on a larger fabrication process than Intel because it outsources its chip fabrication; AMD chips are well known to use more power and create more heat than Intel's current generation of CPUs. So right off the bat, the systems are having to make do with a chip fabrication process that is a generation behind. The idea, however, is that the consoles will use AMD's next-generation roadmap of combining the CPU and GPU, both on-die and in how the code is written. Such a merger would give AMD the boost it needs, and I think we will see great things in the future (especially for AMD fans) once all games are created for the new architecture roadmap, a feat that simply could not be accomplished in such a short amount of time if the PC alone were trying to force that hand.
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
Don't the major console makers (Microsoft, Sony, Nintendo) make a ton of money? Even the Nintendo consoles that are considered failures, like the Wii/Wii U, make the company a substantial amount of profit through game sales and so on. They say Xbox 360s and PS3s were sold at a loss at the beginning from a hardware perspective, but wouldn't the revenue recouped from the game sales themselves completely negate that? What percentage of video game sales goes to the console company itself (i.e., what percentage of GTA V money do Sony and Microsoft get)?

The excuse that is constantly bandied about as to why Microsoft (and to a certain extent Sony) have to "underpower" their consoles just does not make any sense at all to me. Did the Xbox 360 and PS3, in total, not make their respective companies gobs of cash?

I just don't understand what a company like Microsoft has to lose by making their console actually really powerful. It will sell more in the long run, last longer, games will look better and sell better. I honestly don't think Microsoft saves themselves any money by underpowering their console, as they have clearly done with the Xbone compared to the OG Xbox and Xbox 360, which were very powerful for their time (and both made the company gobs of cash).

Is this a joke post? The PS3 lost tons of money for Sony, almost as much as the total profit from the PS1 and PS2 combined; the X360 is barely net positive, and the OG Xbox was a huge money hole.

Besides, there is also the issue of GPU TDPs having increased significantly since 2005-2006. Considering that the X360 and the PS3 both had mediocre reliability, a similarly sized console today with, say, a GTX 770/GTX 780/HD 7970 would be disastrous.
 

KaOTiK

Lifer
Feb 5, 2001
10,877
8
81
Console business is a rough business to be in.

The PS3 erased a lot of Sony's profits from the previous two PlayStations. The 360 made some money for MS but took big hits with the RROD fiasco, and as a whole the entire Xbox side is still deep in the red.

They are scared and are going for something a bit safer: something they can bring down in price much more quickly and that shouldn't have unforeseen problems later on (like the RROD). Basically, this gen they need to make money, otherwise they won't have another gen.

 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Consoles can't be too expensive; ultimately they have to be cheaper than PCs and other gaming devices to be compelling. In order to meet a low price point they make a lot of compromises. Similarly, they have to fit underneath the average TV, so they have to be relatively small, constrained devices, which puts a power-consumption limit on them. The combination of the two, I think, dominates what is economical for a console and its creation. It's not reasonable to lose money on every device, and certainly not a substantial amount.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
If they made them ultra powerful with higher-end hardware (think quad-core i7 and GTX 600-series levels of power), they would not only cost a lot more, they would also generate a lot of heat and use more power, which would somehow have to be pulled out of the small form factor. These things are even smaller than a mATX PC. So it would be quite difficult from an engineering standpoint to make them cool and quiet.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Because the previous generation had massive heat problems. That, and they want to make money right from the start on each console sold; both of those are the reasons the consoles aren't powerhouses.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
That's actually a good question, and guaranteed console reliability went out the window when the first PlayStation came out. The Sega Saturn was the last in the line of consoles that could be counted on to be reliable.

Also, Microsoft would've saved money if they hadn't made all of the Xbox 360s defective.

People will pay just about anything for consoles these days, so they ought to just make them $1k, powerful, and reliable.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I think everything you asked can be traced back to the Red Ring of Death. The current-gen consoles were beasts when they first came out (even if they were a step below the 8800 GTX, the top-end video card at the time, they were coupled with 3 GHz tri-core CPUs, which was very high end as well), and as MS found out, beastly hardware needs beastly cooling solutions. Also, the smaller the fans, the faster they spin and the more noise they create. No one wants a console that sounds like a vacuum cleaner, so you have to strike a balance between a console that is big enough, fans that are just the right size not to be noisy, and hardware that can actually be cooled by such a solution.

It is no different now, especially when you consider that AMD is currently stuck on a larger fabrication process than Intel because it outsources its chip fabrication; AMD chips are well known to use more power and create more heat than Intel's current generation of CPUs. So right off the bat, the systems are having to make do with a chip fabrication process that is a generation behind. The idea, however, is that the consoles will use AMD's next-generation roadmap of combining the CPU and GPU, both on-die and in how the code is written. Such a merger would give AMD the boost it needs, and I think we will see great things in the future (especially for AMD fans) once all games are created for the new architecture roadmap, a feat that simply could not be accomplished in such a short amount of time if the PC alone were trying to force that hand.

First of all, the 8800GTX was released in mid 2007. The xbox 360 was released in 2005 - it was actually more advanced in terms of performance than the highest PC GPUs at the time. The Xbox 360 GPU actually had unified shaders (which are now standard in everything) prior to the PC, and had higher overall throughput.

The highest-end GPU in 2005 was the ATI X1800, which was worse than the Xbox 360's GPU. In mid-2006 the PC finally got unified-shader GPUs, which closed the gap and then passed it, but the performance of the Xbox 360 at launch was pretty staggering. The GPU itself was more advanced (AT LAUNCH, IN 2005) than any PC GPU. That gap was obviously closed fast, though.

The key difference now is that cooling characteristics for PC GPUs have gone in a direction that is just not possible in a console form factor; in 2005 most GPUs were passively cooled or had a simple heatsink/fan. That is not the case any longer. You cannot easily put something like a GTX 780 inside a console-sized box, and you cannot easily put something like a 3930K in one either. The TDP and cooling requirements of high-end PC parts have gone in a direction that is just not possible inside a console-sized box, while things were far different in 2005. You could get a high-performing GPU in 2005 that didn't require a massive shroud for cooling. That just isn't the case now.

Consider the top PC GPU of 2005, the ATI X1800. You can't even run basic console ports such as Black Ops 2 on such a card; it would crawl at framerates below 10 even at 720p. The Xbox 360 was pretty amazing at launch, but as I said, the top GPUs of 2005 didn't have the massive GPU shrouds that high-end GPUs have today. What was possible for the high end then requires massive cooling now that isn't possible in a console-sized box. And that's aside from the fact that high-end GPUs cost WAY more now than they did in 2005.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Well, you could build a console with a 1.6 GHz AMD Jaguar chip and an HD 7850 GPU and sell it for $400...

OR, you could build a console with a 4 GHz AMD FX Vishera chip and an HD 7990 and sell it for $1200.

Throughout the entire history of game consoles, not once has the most powerful console of its generation been the best selling. Nor has the most expensive. So you have to balance performance with affordability. For example, the original 60GB PS3 was a media powerhouse. It was the best Blu-ray player at the time, played SACDs, had a multi-card reader for photos, a ridiculously powerful CPU, and full hardware PS2 backwards compatibility. It was also $600-$700, and nobody wanted it at that price. $400 is the sweet spot for consoles, so you have to work within that limitation without bankrupting yourself. Keep in mind hardcore gamers represent a minority, and those who are really concerned about performance will tend to gravitate towards PC.

Though it's been mentioned already, I'll go over TDP again. There are design considerations you have to take into account when building a console and selecting its components, issues that don't really affect gaming PCs or even gaming laptops. A console has to be compact. It's also going to be placed in an enclosed or semi-enclosed media centre, surrounded by other hot devices, and it has to be quiet. So you're limited in how much cooling you can have, and in its complexity, in order to keep costs down. Using lower-TDP chips therefore makes sense. Both Sony and Microsoft want to get away from the disastrous RROD and YLOD failures that plagued the early 7th-gen systems.
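Just to put some rough numbers on the airflow side of that argument: the amount of air a cooler has to move scales directly with the heat it has to carry away. This is only a back-of-envelope sketch using the standard air heat-capacity relation; the wattages and the 15 C temperature rise are illustrative assumptions, not figures for any real console.

Code:
# Rough airflow estimate: how many CFM of air a box has to move to carry
# away a given amount of heat. Back-of-envelope physics only, not a thermal
# design tool. Assumes sea-level air (density ~1.2 kg/m^3, cp ~1005 J/kg-K).

AIR_DENSITY = 1.2        # kg/m^3
AIR_CP = 1005.0          # J/(kg*K)
M3S_TO_CFM = 2118.88     # 1 m^3/s expressed in cubic feet per minute

def required_cfm(heat_watts, delta_t_c):
    """Airflow (CFM) needed to remove heat_watts with exhaust air
    delta_t_c degrees C warmer than the intake air."""
    m3_per_s = heat_watts / (AIR_DENSITY * AIR_CP * delta_t_c)
    return m3_per_s * M3S_TO_CFM

# Purely illustrative heat loads:
for watts in (100, 200, 450):
    print(f"{watts:>3} W at a 15 C rise -> ~{required_cfm(watts, 15):.0f} CFM")

Doubling the heat load roughly doubles the airflow (or the temperature rise) you need, which is why a low-TDP chip buys you smaller, quieter fans in an enclosed cabinet.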

PCs really don't have the same cooling issues, for a few reasons. The air in the case acts as a big heat reservoir. You can add more fans, or bigger fans, to move more air. Size isn't a big issue, so you can have bigger heatsinks or use more exotic cooling methods like water. Gaming laptops tend to be left out in the open, and they use mobile-tuned chips with a lower TDP, generally around 100 W for top-end GPUs. Why consoles haven't latched onto mobile parts is a mystery, though; Apple has been using them for years.
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
I just don't understand what a company like Microsoft has to lose by making their console actually really powerful. It will sell more in the long run

Thing is, it really won't. Hell, a lot of people said they couldn't tell the difference between the PS2 and the PS3 when it first came out.

It's highly doubtful that Sony would have made any more money by making this look better:
[image: GTA V screenshot]



What is it missing? What could you do with more memory that wouldn't have exhausted their budget anyway? Not terribly much.
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
Throughout the entire history of game consoles, not once has the most powerful console of its generation been the best selling. Nor has the most expensive.

Wii outsold the PS3 and Xbox 360
PS2 did better than the Gamecube and Xbox
Nintendo DS outsold the PSP
3DS is beating the Vita
Gameboy outsold Game Gear
 

purbeast0

No Lifer
Sep 13, 2001
53,038
5,920
126
Because it doesn't matter. It is only the vocal minority on tech forums like this that gives a shit about what equivalent card is in the console. Joe Blow and his mom, who are buying the next console, don't give two shits about what card is in it. They just care that the new console looks better than the previous ones. You have to make tradeoffs when it comes to hardware/price and hit a middle ground, and what they've currently chosen is where it's at.
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
First of all, the 8800GTX was released in mid 2007. The xbox 360 was released in 2005 - it was actually more advanced in terms of performance than the highest PC GPUs at the time. The Xbox 360 GPU actually had unified shaders (which are now standard in everything) prior to the PC, and had higher overall throughput.

The 8800 GTX came exactly one year after the Xbox 360, in Nov 2006, so the PC was just one generation of video cards away from technically surpassing the Xbox (although I never really felt the 8800 surpassed it in performance ^^).

The key difference now is that cooling characteristics for PC GPUs have gone in a direction that is just not possible in a console form factor; in 2005 most GPUs were passively cooled or had a simple heatsink/fan.

.... that made no sense. Active cooling has been required since 1998, in the TNT2 era, and consoles have been using it since the Dreamcast. Heatpipes, which let us transport heat from chips to a centralised heat block, started showing up on GPUs in 2004 and made their way into the PS3 and Xbox 360. Current cooling solutions still utilise heatsinks and heatpipes.


Consider the top PC GPU of 2005, the ATI X1800. You can't even run basic console ports such as Black Ops 2 on such a card; it would crawl at framerates below 10 even at 720p. The Xbox 360 was pretty amazing at launch, but as I said, the top GPUs of 2005 didn't have the massive GPU shrouds that high-end GPUs have today. What was possible for the high end then requires massive cooling now that isn't possible in a console-sized box. And that's aside from the fact that high-end GPUs cost WAY more now than they did in 2005.

Yeah, this kinda falls into my disappointment with the 8800 GTX (I had the Ultra, even; it was my system built for Crysis). It went beyond just the video hardware, though: they were making the games super efficient on every aspect of the console, and then just shiating all over the PC port. This is also why I don't expect the lower specs of the next-gen consoles to be a problem either. They are still way ahead of the current gen, and game makers will continue to be super efficient on them and make even better-looking games.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I don't know where you're from, but the Xbox 360 was released in November 2005. Maybe you're thinking of the PS3, but the Xbox 360 has and had a more advanced GPU than the PS3 - I remember this very well because I bought it day one. And yes, it had a more advanced GPU than any PC GPU at the time.

The main point, though, is that the cooling and TDP requirements of high-end PC equipment have gone in a direction that is not compatible with a console-sized box that is going into enclosed entertainment centers; that's all there is to it. You cannot do a GTX 780 in such a box because of cost and TDP. You cannot do a 3930K. The heating and cooling requirements of "high end" PC equipment have gone the way of ridiculousness, which wasn't the case in 2005. In fact, I remember OC'ing CPUs back then with stock CPU coolers. That is just unthinkable these days.

With that in consideration, you have to make trade-offs in a $400 console. That said, I think the next-generation consoles will perform very well for their target - the PS4 in particular should be much more graphically advanced than the PS3 before it. So it isn't a big deal.

.... that made no sense. Active cooling has been required since 1998, in the TNT2 era, and consoles have been using it since the Dreamcast. Heatpipes, which let us transport heat from chips to a centralised heat block, started showing up on GPUs in 2004 and made their way into the PS3 and Xbox 360. Current cooling solutions still utilise heatsinks and heatpipes.

There is a VAST difference between the size, heat-dissipation and volume characteristics of the PC CPU/GPU coolers used now compared to 2005 and earlier. The TNT2 had a simple heatsink fan. Now compare that heatsink fan to a massive GTX 780 shroud, or the hot and loud shroud used on the AMD 7970, which has to dissipate 250-300 W of generated heat. A console that will sit in an airtight, enclosed entertainment center with no room to dissipate 500 W+ of heat from the CPU and GPU? That just isn't going to happen; you can't have that high a TDP for a console. Hell, most GPUs had NOTHING BUT A HEATSINK back in those days. It is nothing like now, with huge GPU shrouds that must dissipate 300 W+ of heat and must be in an "open air" space to exhaust it out of the back of a PC.

Now look at CPU coolers like the Noctua NH-D14 - that cooler by itself is probably bigger than the Xbox 360.
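To make the gap concrete, here is a toy power-budget comparison. The console number is roughly what a launch 360 pulled at the wall under load, and the PC figures are nominal TDPs plus an assumed overhead for the rest of the system, so treat it as an illustration rather than a precise apples-to-apples measurement.

Code:
# Toy power-budget comparison: a 2005 console vs. a hypothetical high-end
# 2013 PC build. All numbers are approximate/nominal and only illustrative.

launch_xbox360_wall_draw = 175          # W, rough at-the-wall load for a 2005 unit

high_end_2013_pc = {                    # nominal TDPs, assumed for this sketch
    "Core i7-3930K": 130,
    "GTX 780": 250,
    "board/RAM/storage/PSU losses": 70,
}

pc_total = sum(high_end_2013_pc.values())
print(f"Xbox 360 (2005, whole system): ~{launch_xbox360_wall_draw} W")
print(f"Hypothetical high-end 2013 PC: ~{pc_total} W")
print(f"That is ~{pc_total / launch_xbox360_wall_draw:.1f}x the heat to dump "
      "into the same entertainment-center shelf")

Roughly two and a half times the heat in the same shelf space is the whole argument in one number.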
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Because it doesn't matter. It is only the vocal minority on tech forums like this that gives a shit about what equivalent card is in the console. Joe Blow and his mom, who are buying the next console, don't give two shits about what card is in it. They just care that the new console looks better than the previous ones. You have to make tradeoffs when it comes to hardware/price and hit a middle ground, and what they've currently chosen is where it's at.

This is also true. The target market for consoles isn't tech nerds. That being said, there are very real TDP considerations which make "cutting edge" simply NOT possible in a console. That is also a very large factor in it.

Put simply, Sony and MS did the best they could within the constraints of a console-sized box. You can't put cutting-edge PC components in such a box because the TDP and cooling requirements are ridiculous in comparison, and not sustainable for long-term use.
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
Umm.. I said the 8800 GTX came out a year after the Xbox, in Nov 2006. That would put it, look at that, one year after the Xbox's release date. -_-

Considering the TDP of the 8800 is probably close to that of the Xbox chip, it isn't far off from that of a GTX 670. (Looking at the 500 series.. looks like we see some insane TDP ratings in that series.) So if the Xbox's 2005 setup could handle 150-ish watts of TDP (well, for the most part ;) ), I don't see why today's better materials and airflow design couldn't easily take out 200 W of TDP. Also, GPU heatsinks haven't changed an awful lot since my 8800 Ultra (although I look at those 580 and 590 numbers and think DAMN, they had to do something special on those).
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
Umm.. I said the 8800 GTX came out a year after the Xbox, in Nov 2006. That would put it, look at that, one year after the Xbox's release date. -_-

Considering the TDP of the 8800 is probably close to that of the Xbox chip, it isn't far off from that of a GTX 670. (Looking at the 500 series.. looks like we see some insane TDP ratings in that series.) So if the Xbox's 2005 setup could handle 150-ish watts of TDP (well, for the most part ;) ), I don't see why today's better materials and airflow design couldn't easily take out 200 W of TDP. Also, GPU heatsinks haven't changed an awful lot since my 8800 Ultra (although I look at those 580 and 590 numbers and think DAMN, they had to do something special on those).
I don't suppose you know the exact TDP rating of the Xbox GPU, do you?

The PS3's GPU, I believe, was based on a 7800 GTX Ultra with cut-down ROPs and memory bandwidth. I wonder how the TDP on that part compares to the 8800 Ultra.
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
I don't suppose you know the exact TDP rating of the Xbox GPU, do you?

The PS3's GPU, I believe, was based on a 7800 GTX Ultra with cut-down ROPs and memory bandwidth. I wonder how the TDP on that part compares to the 8800 Ultra.

Nah, they never give out that info, and because it isn't an off-the-shelf chip, it's hard to test. I always thought of the Xbox GPU as a beta-stage 8800, with its unified shaders and similar strengths. The PS3 does have the weaker GPU and needs to use other tricks to achieve the same performance. The PS3 probably had better cooling too; Sony has been making equipment for decades, MS has not. If I recall, the Xbox has a power brick? The PS3 was able to keep its power supply internal and cool it too.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Umm.. I said the 8800 GTX came out a year after the Xbox, in Nov 2006. That would put it, look at that, one year after the Xbox's release date. -_-

Considering the TDP of the 8800 is probably close to that of the Xbox chip, it isn't far off from that of a GTX 670. (Looking at the 500 series.. looks like we see some insane TDP ratings in that series.) So if the Xbox's 2005 setup could handle 150-ish watts of TDP (well, for the most part ;) ), I don't see why today's better materials and airflow design couldn't easily take out 200 W of TDP. Also, GPU heatsinks haven't changed an awful lot since my 8800 Ultra (although I look at those 580 and 590 numbers and think DAMN, they had to do something special on those).

Do you know how stupidly hot all of the 8800 cards got? Do you know their failure rate after 2-3 years? (HIGH.) The point is that the shroud from the 8800 GTX will not work in a console. The heat generated is too high for a device that is intended to last 5+ years, and that is aside from the fact that consoles are put in air-constrained areas with limited ventilation. Think of enclosed entertainment centers with absolutely no airflow; that is how consoles have to be designed. You cannot even do an 8800 Ultra, because those stupid cards ran hot as heck (90C+) and after 1-2 years of use required the user to remove the dust from the shroud. I know this because I had one of those stupid 8800 Ultras, and after a year of use there was so much dust caked inside it that it was causing my PC to BSOD incessantly. It was an easy fix, but I don't expect an end user to deal with something like that on a consumer-level item - is the user supposed to open up their Xbox to remove a GPU shroud? I don't think so. The cooler is also designed to sit in a PC, in an open-air area with decent ambient temps; that is obviously not the case with a console, which must be designed to work in an air-constrained space while not increasing component heat too dramatically. Put an 8800 Ultra in an air-constrained space and watch the BSODs and TDRs happen.

Even WITH that design consideration, the Xbox 360 still had issues. You cannot put those stupid GPUs with the massive shrouds inside a console-sized box. You have to remember where consumers will be putting these things. Even the 7-year-old 8800 Ultra is designed to work in an open-air area and exhaust outside of a PC case. Consoles have to be designed to operate with constrained airflow around the chassis.

So with all that said, Sony and MS did the best they could (well, at least Sony did) with the currently available technologies. Anyone thinking a high-end GPU such as a 780 is doable in a console? No. Cost is one issue, but the more important issues are longevity, warranty, and heating/cooling requirements. Those heating/cooling requirements simply are not workable in a device intended to operate in an air-constrained space. Again, CPU/GPU cooling requirements in 2013 are far different from 2005 - I look at some CPU coolers now and just laugh because they are stupidly big (Noctua NH-D14), or GPUs such as the 7970 and Titan. That type of cooling/TDP is not compatible with a console box that will be put in an air-constrained entertainment area.
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
Looking at it from the point of view of a consumer and an idealist, you're right: the failure rate was stupidly high because they got too hot. I did have to replace my 8800 under warranty twice, but both times it was the memory that failed, not the GPU. I certainly wouldn't try sticking an x80 in a console... but there are x70s with low enough TDPs, and maybe you could even do 780M(obile) chips in SLI. But alas, that goes beyond the scope of this thread.

However, when was the last time you saw ANY console maker worried about how long their system lasted? Once they started getting CD drives in them, they were produced on the cheap and failed frequently. You would be hard-pressed to find working PS1 units these days. If anything, I think the PS3 and Xbox are constructed well (parts-wise.. the cooling design in the 360 is clearly a failure) compared with the PS1/PS2/Xbox/Saturn days. My Dreamcast probably still works, but I played burned Japanese games on it; you had to flip it upside down to get it to read the disc.

I don't think any console maker cares if their system lasts more than 2 years - more reason to buy another one. And many of my friends who play consoles have had to buy a replacement system (well.. Xbox.. none of my friends have PS3s -_-). Heck, one of my friends' Xbox gets so hot it destroys discs; he's replaced more than a few with used GameStop copies. ^^
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
0 fux gvn.

I'll stick with current gen for another 5 years if it continues to have better games.
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
The money is in the accessories for Sony and MS. Sure, they lost their ass on the systems, but how much does it cost to make a controller they sell for $50 in stores? They make a killing off controllers.
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
Accessories, maybe not so much; they don't restrict third-party equipment, and let me tell you, a lot of people will go for the cheaper stuff for that "second" controller or to replace one that was broken. That's from my experience selling the stuff. It's the games where it's all at. With the matured online stores, the profit cuts are even bigger too. Sadly, while Sony gave us a free pass when they had a fledgling online portal compared to Live, on the PS4 the free ride ends. :(
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Don't the major console makers (Microsoft, Sony, Nintendo) make a ton of money? Even the Nintendo consoles that are considered failures, like the Wii/Wii U, make the company a substantial amount of profit through game sales and so on. They say Xbox 360s and PS3s were sold at a loss at the beginning from a hardware perspective, but wouldn't the revenue recouped from the game sales themselves completely negate that?

The main reason the new consoles are so power-conservative is simply a side effect of designing consoles that will have a lower sticker price and won't lose Sony/Microsoft money on hardware sales for ~3 years. Both Sony and Microsoft are at turning points where, for various reasons, they can't afford to lose money on their console hardware.
-------

In the mid-2000s, Sony was still an industry titan - its electronics divisions were doing far better than the shell of their former selves they are today. And Microsoft was willing to lose money on its Xbox 360 essentially to "buy" market share as well.

http://gamerant.com/ps4-costs-lower-ps3/

Basically, the PS3 lost Sony a TON of money for several years. Selling hardware at a loss worked fine for Sony in the PS2 era, but development costs were even crazier in the PS3 era (remember, Cell alone cost >$1 billion to develop).

RROD cost Microsoft >$1 billion too: http://www.forbes.com/2007/07/05/msft-xbox-charge-tech-media-cx_rr_0705techmsft.html

Compounding the problem, the attach rate (# of software titles people bought per console) was considerably worse for the PS3 vs the PS2 - an average of 4.6 games per PS3 console, vs 6.5 on the PS2. See this page:

http://vgsales.wikia.com/wiki/Software_tie_ratio
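Here is some rough back-of-envelope math on what that attach rate means for the platform holder. The per-game licensing fee is an assumed ballpark (figures around $7 per disc were commonly reported back then, but the real rates are negotiated and not public), so treat every number here as illustrative only:

Code:
# Illustrative-only arithmetic: third-party licensing revenue per console
# sold. The royalty figure is an assumed ballpark, not a published number.

royalty_per_game = 7.0                    # USD per third-party disc, assumed
attach_rates = {"PS2": 6.5, "PS3": 4.6}   # tie ratios from the link above

for name, attach in attach_rates.items():
    revenue = attach * royalty_per_game
    print(f"{name}: ~${revenue:.0f} in licensing per console "
          f"({attach} games x ${royalty_per_game:.0f})")

# Against a launch hardware loss that was reportedly a couple hundred dollars
# per PS3, ~$30-45 of licensing revenue per console shows why software alone
# couldn't quickly erase the red ink.

So no, the hardware loss doesn't just vanish in game royalties, especially when the attach rate drops generation over generation.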

I think everything you asked can be traced back to the Red Ring of Death.

^ RROD certainly made the case for designing a beastly console this generation less rosy for Microsoft, but even without RROD, Microsoft can't afford to (or doesn't want to) sell its consoles at a loss for years this time around either.
-----

There are several ancillary reasons for cheaper consoles this generation too. Unemployment and the economic downturn mean the mass market can't afford a $600+ console this time.

Also, consoles are threatened by tablets - the iPad and iPhone, as well as Android devices, are a HUGE market. Remember, the Xbox 360 came out in 2005 and the PS3 in 2006. The iPhone came out in 2007, and in the past six years mobile phone/tablet gaming has gone from nothing to a multi-billion-dollar market with an installed user base of a billion devices. Pretty strong competition! It's not a coincidence that the Wii U's sales are so abysmal compared to the Wii's -- all of the casual gamers have moved on to mobile gaming!

Also, the PC market is surprisingly resurgent due to a number of factors, with Steam being a big part of that:

http://ca.ign.com/articles/2013/09/04/why-pc-gaming-has-exploded

Microsoft has catalyzed the convergence of PC and console, with its Xbox hardware and software environment being effectively very similar to DirectX on the PC. This has made porting games between the Xbox 360 and PC easier, so lots of AAA titles are non-exclusive. Heck, even Halo 1 & 2 and Gears of War came out on PC!
 