Console hardware: what Sony/MS went with versus what they should have

Page 5 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
An Intel i3 + custom Iris Pro variant should have been the choice. And before you mention price: Intel is flipping Atoms for next to nothing, and I'd bet they would have been flexible on price given how much work they've put into their iGPUs.

Both Intel and Nvidia charge insanely high licensing fees, which is why console manufacturers prefer AMD and IBM (previously).
 

lamedude

Golden Member
Jan 14, 2011
1,230
69
91
A 6-core out-of-order Xenon and a CableCARD TV tuner instead of Kinect.
Backwards compatibility with the 360 will hopefully stop players from switching to PS4, and being able to replace the DVR would make $400-500 an easier sell.
Ditch backwards compat on the WiiU and go with a 4-core 1.5-2GHz MIPS CPU.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Intel wasn't flexible with the original Xbox, so they burned that bridge. Intel wasn't going to happen because MS wasn't going back to them, and Sony saw what Intel had done, so Sony wasn't going to them either.

That just shows you shouldn't live in the past and hold petty grudges. Getting into consoles would be a decent coup for Intel. I wouldn't be surprised if they could make it work.
 
Apr 20, 2008
10,067
990
126
If other competitors could have come up with a better or similar product at a better price, it would've happened. Both consoles spec'd out exactly what they wanted and took that route. Anything else would not have hit the required price/performance ratio and minimum performance. Unless you want a repeat of Cell, or of the original Xbox being boned by nVidia, it was clearly the right call.

Consumers are loving the new consoles. Games look great and the consoles are profitable. There's just a ton of spec-whores who can't comprehend the difference between code that's close to the metal and a layer-heavy API.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
If other competitors could have come up with a better or similar product at a better price, it would've happened. Both consoles spec'd out exactly what they wanted and took that route. Anything else would not have hit the required price/performance ratio and minimum performance. Unless you want a repeat of Cell, or of the original Xbox being boned by nVidia, it was clearly the right call.

Consumers are loving the new consoles. Games look great and the consoles are profitable. There's just a ton of spec-whores who can't comprehend the difference between code that's close to the metal and a layer-heavy API.

So there is no middle ground between what happened with Cell, and what we have right now?

Ok....

I guess upping the Xbox One's GPU to HD 7950 levels instead of HD 7850 levels would have caused a Cell-like "fiasco".
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
That just shows you shouldn't live in the past and hold petty grudges. Getting into consoles would be a decent coup for Intel. I wouldn't be surprised if they could make it work.
And what about the GPU? I doubt anything from Intel came close to the ~7870 level of performance in the PS4, and even Nvidia wasn't making anything stellar, in terms of efficiency and minimal power draw, that could fit that form factor on an SoC — certainly not before Maxwell came along.
 
Apr 20, 2008
10,067
990
126
So there is no middle ground between what happened with Cell, and what we have right now?

Ok....

I guess upping the Xbox One's GPU to HD 7950 levels instead of HD 7850 levels would have caused a Cell-like "fiasco".

Fiasco? What the heck are you talking about? Fiasco? Really?

Cell was a dog to code for, was aimed at some in-between jack-of-all-trades role between CPU and GPU, and was never cheap to produce. nVidia held their GPU price sky-high, making the original Xbox unprofitable. The current consoles can be die-shrunk (CPU and GPU together) and are already profitable, similar to how Nintendo made cash from day one on the Wii. The difference is that games look great on the current consoles.

If you think the software engineers who are in the know and the hardware engineers who spec'd these machines didn't evaluate all available options, you're out of your minds. Microsoft used both IBM and Intel in the past, and Sony used custom hardware, IBM, and nVidia for GPUs. They wouldn't just up and switch vendors unless the best value couldn't otherwise have been achieved.
 
Aug 11, 2008
10,451
642
126
Fiasco? What the heck are you talking about? Fiasco? Really?

Cell was a dog to code for, was aimed at some in-between jack-of-all-trades role between CPU and GPU, and was never cheap to produce. nVidia held their GPU price sky-high, making the original Xbox unprofitable. The current consoles can be die-shrunk (CPU and GPU together) and are already profitable, similar to how Nintendo made cash from day one on the Wii. The difference is that games look great on the current consoles.

If you think the software engineers who are in the know and the hardware engineers who spec'd these machines didn't evaluate all available options, you're out of your minds. Microsoft used both IBM and Intel in the past, and Sony used custom hardware, IBM, and nVidia for GPUs. They wouldn't just up and switch vendors unless the best value couldn't otherwise have been achieved.

Apparently engineers in the same position made the decision to use Cell. I think the console CPU is a good compromise, but you can't explain how they made such a terrible decision before and a perfect one this time.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Something quite subtle that people may be forgetting is that the Xbone/PS4 APU is NOT a true 8-core chip; rather, it is two quad-core Jaguar clusters on die, connected by a bus. While L2 access on the same module is a minimum of 26 cycles, it increases to a whopping ~190 cycles when accessing cache on the other module (RAM latency is ~220 cycles).

This is a terrible barrier to effectively multithreading game code; for performance, code needed by both the two 'odd duck' cores and the four game cores must pretty much be present in both caches for prompt execution. Thus it seems quite likely that two of the six console cores will be used for relatively remote tasks that do not depend much on the main game code.

This is really interesting - didn't know this....

As you state, this really means half the cores (4 of the 8) are left to the OS and other tasks, and the game itself, without substantial penalties, is limited to 4 pretty slow cores. Wow...

I guess my perspective is that the current console generation is obviously a compromise. The market today is different than when the 360/PS3 launched: gaming is cheaper these days, and mobile gaming (phones, tablets, etc.) is getting bigger and bigger.

I am more OK with the PS4/Xbone design choice if they shorten the life-span of the systems to 3-4 years or so. The 360/PS3 were pretty good hardware the first few years, then 'bleh', and then just outright outclassed. Most folks don't mind refreshing their iPads every other year, or just getting a new tablet each Christmas, and those cost around what a console does, often more. Keep the consoles more iterative, maintain backwards compatibility, and the lower-end hardware (and thus lower prices) is more justified IMHO.
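The cross-module penalty quoted above can be put in rough numbers. This is a sketch, not measured data: the cycle counts come from the post, while the access-mix fractions are hypothetical illustration values.

```python
# Rough model of average memory access cost on a two-module Jaguar APU,
# using the cycle counts quoted above. Access-mix fractions are
# hypothetical illustration values, not measurements.

LOCAL_L2 = 26    # cycles: hit in the same module's L2
REMOTE_L2 = 190  # cycles: hit in the other module's L2
RAM = 220        # cycles: miss to main memory

def avg_latency(local_frac, remote_frac):
    """Weighted average access cost; the remainder goes to RAM."""
    ram_frac = 1.0 - local_frac - remote_frac
    return local_frac * LOCAL_L2 + remote_frac * REMOTE_L2 + ram_frac * RAM

# Keeping shared data resident in both modules' caches: mostly local hits.
print(avg_latency(local_frac=0.95, remote_frac=0.0))   # ~35.7 cycles
# Naively sharing across modules: even 30% remote hits more than doubles it.
print(avg_latency(local_frac=0.65, remote_frac=0.30))  # ~84.9 cycles
```

Because a remote-L2 hit costs almost as much as going to RAM, the model shows why the poster argues shared code effectively has to live in both caches.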
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Apparently engineers in the same position made the decision to use cell. I think the console cpu is a good compromise, but you cant say how they made such a terrible decision before and made a perfect one this time.


The economy was a lot different in ~2005, when the PS3 was being drawn up, compared to 2012/13, when the PS4 hardware choices were being looked at. Things are looking better today, but I can see why Sony and MS went with lower-priced consoles that take a smaller loss and will turn a profit sooner this time around.


I think the hardware isn't as bad as many make it out to be. We've already seen some pretty ambitious games on the PC over the last six months, likely mostly a side effect of the new console generation. The only thing I think AMD/MS/Sony might have been able to do that would have offered more flexibility and a better gaming experience (at the price the consoles launched and sell at today) is allowing some clock-speed control depending on how many cores are in use: keep a couple of cores at low-ish clocks to handle background tasks while a couple of cores jump to ~2GHz for games, something like that. 1.6GHz to 2.0GHz doesn't sound like a huge deal, but that's 25% more single-threaded CPU performance. Possibly the difference between 20 and 25 FPS, or 25 and 30+ FPS, in some situations.
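The arithmetic behind that claim is straightforward. A back-of-the-envelope sketch, assuming a fully CPU-bound game whose frame rate scales linearly with clock speed (an optimistic simplification):

```python
# Frame-rate scaling for the clock bump discussed above, assuming a
# CPU-bound game that scales linearly with clock (optimistic).

base_clock = 1.6   # GHz: PS4 Jaguar cores
boost_clock = 2.0  # GHz: hypothetical turbo for the game cores

speedup = boost_clock / base_clock
print(f"single-thread speedup: {speedup:.2f}x")  # prints 1.25x

for base_fps in (20, 25):
    print(f"{base_fps} FPS -> {base_fps * speedup:.1f} FPS")
```

Under that assumption, 20 FPS scales to 25 FPS and 25 FPS to just over 31 FPS, matching the ranges mentioned in the post; real games would see less, since they are rarely purely CPU-bound.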
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
Look, the APU is $100 of the console cost; what if it had been $200? Would anyone have cared if the PS4 was $500 instead of $400? No.
Now tell me how much you can improve with twice the budget for the APU...

Cost was not the issue; the problem is they don't care. They just wanted a cheap thing to make money fast. They don't need to make it good, devs have no option but to chop up their games to make them run, and fanboys will always find some way to defend it, as they are. No one should be defending Jaguar.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Look, the APU is $100 of the console cost; what if it had been $200? Would anyone have cared if the PS4 was $500 instead of $400? No.
Now tell me how much you can improve with twice the budget for the APU...
How about you tell us?
Cost was not the issue; the problem is they don't care. They just wanted a cheap thing to make money fast,
You are contradicting yourself.
...no one should be defending Jaguar.
It's not about defending; read the thread title. So here's the list: add in your $200 CPU/GPU, including specs.

CPU: $??
GPU: $??
GPU/CPU bridge: $??
GPU memory: $80
CPU memory: $65
Power supply: $20
Optical drive: $28
Hard drive: $37
Electro-mechanical (cooling?): $35
Other (motherboard and components?): $40
Controller: $18
Box/packing: $6
Manufacturing cost: $9
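Tallying the known line items makes the price pressure concrete. A sketch: the "??" entries are folded into a single combined APU figure, using the ~$100 claimed earlier in the thread (a poster's estimate, not an official number).

```python
# Summing the BOM list above. The CPU/GPU/bridge placeholders are folded
# into one APU line, using the ~$100 figure claimed earlier in the thread
# (an assumption, not an official cost breakdown).

bom = {
    "APU (CPU + GPU + bridge, claimed)": 100,  # thread's estimate
    "GPU memory": 80,
    "CPU memory": 65,
    "Power supply": 20,
    "Optical drive": 28,
    "Hard drive": 37,
    "Electro-mechanical (cooling)": 35,
    "Other (motherboard, components)": 40,
    "Controller": 18,
    "Box/packing": 6,
    "Manufacturing": 9,
}

total = sum(bom.values())
print(f"estimated BOM: ${total}")  # prints $438
```

Even with the $100 APU figure, the tally lands above a $399 retail price, which is consistent with the later claim that the current generation still sells at a loss.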
 

scannall

Golden Member
Jan 1, 2012
1,960
1,678
136
I think people have unrealistic expectations here, mostly because previous generations were sold at a loss, with the hope of making a profit on game sales. It turns out that business plan didn't work so well.

If you're going to discuss what they should have done, keep the cost to $375 or less. They ain't gonna throw in a pony and a lifetime supply of popcorn for $400.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I think people have unrealistic expectations here, mostly because previous generations were sold at a loss, with the hope of making a profit on game sales. It turns out that business plan didn't work so well.

If you're going to discuss what they should have done, keep the cost to $375 or less. They ain't gonna throw in a pony and a lifetime supply of popcorn for $400.

Shame, because we could have had cheap crypto-mining gear subsidized by MS and Sony... Oh well.

How about you tell us?

You are contradicting yourself.

It's not about defending; read the thread title. So here's the list: add in your $200 CPU/GPU, including specs.

Your sarcas-o-meter is borked ^^
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I think people have unrealistic expectations here, mostly because previous generations were sold at a loss, with the hope of making a profit on game sales. It turns out that business plan didn't work so well.

If you're going to discuss what they should have done, keep the cost to $375 or less. They ain't gonna throw in a pony and a lifetime supply of popcorn for $400.

The current generation is still sold at a loss. It's all about the expectations for game and service sales.

Obviously they have lowered that projection this time around. Plus Sony has some $5B to recover, while MS has $3B or so to recover from last time, not to mention the expected profit they are behind.

The real question is how long the console divisions will get to live on before they are axed for not living up to the companies' expected profit margins.

MS is further down the road than Sony in this regard. Elop is an example.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Look, the APU is $100 of the console cost; what if it had been $200? Would anyone have cared if the PS4 was $500 instead of $400? No.
Now tell me how much you can improve with twice the budget for the APU...

Cost was not the issue; the problem is they don't care. They just wanted a cheap thing to make money fast. They don't need to make it good, devs have no option but to chop up their games to make them run, and fanboys will always find some way to defend it, as they are. No one should be defending Jaguar.


If MS and Sony thought no one would care if the prices were $100 higher, the prices would be $100 higher.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
A faster CPU would've been nice, but Kaveri wasn't ready in time, and it might have been troublesome at the size of those PS4 / Xbox One chips.

I don't think there was a real choice. Intel would've been expensive and Nvidia didn't have a suitable CPU readily available.

They decided it was good enough and game developers would have to make do.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
That's no contradiction, and the thread makes no sense; there is no way for me or anyone else to know what AMD or anyone else could have offered for $200.

If an 8-core Jaguar + HD7870 costs just $100, I can extrapolate that to a $100 CPU (2x Athlon 5150) + $150 GPU ($180 R9 270 - $30 for 2GB of GDDR5), and if I double that I end up with a retail $200 CPU + $300 GPU in a similar range to the $200 APU Sony could have got. Needless to say, for that money you get an i5 and a 290X, to give you some idea.

Again, this thread makes no sense; there is no way but to guess at what could have been done.
 
Last edited:

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
That's no contradiction, and the thread makes no sense; there is no way for me or anyone else to know what AMD or anyone else could have offered for $200.

If an 8-core Jaguar + HD7870 costs just $100, I can extrapolate that to a $100 CPU (2x Athlon 5150) + $150 GPU ($180 R9 270 - $30 for 2GB of GDDR5), and if I double that I end up with a retail $200 CPU + $300 GPU in a similar range to the $200 APU Sony could have got. Needless to say, for that money you get an i5 and a 290X, to give you some idea.

Again, this thread makes no sense; there is no way but to guess at what could have been done.
Well, there was NO (good enough) GPU from Intel at any price range to match the 7870, and NO (good enough) CPU from Nvidia that came close to the eight-core Jaguar. So AFAIK neither of them could have made anything close to the console SoCs, and at least one of the departments (CPU or GPU) would've been seriously neglected.

As for AMD, they did the best they could in a limited time frame and on a very limited budget, wrt both TDP and price. I do think this gen will probably only keep its console hardware for 4-5 years, and that Sony/MS may want to upgrade the consoles sooner than most people expect, since they'll need better hardware to handle 4K, and pushing consoles as a disposable gadget (like smartphones and tablets) is also in their best interest. If this happens, as I predict, then AMD will probably power the next gen as well :awe:
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Your list is flawed, unless it's a 16GB memory console :p
Unless the CPU/GPU is a single die, there are two memory controllers. But if you think the list is flawed, then post what you think it should be.

That's no contradiction, and the thread makes no sense; there is no way for me or anyone else to know what AMD or anyone else could have offered for $200.

If an 8-core Jaguar + HD7870 costs just $100, I can extrapolate that to a $100 CPU (2x Athlon 5150) + $150 GPU ($180 R9 270 - $30 for 2GB of GDDR5), and if I double that I end up with a retail $200 CPU + $300 GPU in a similar range to the $200 APU Sony could have got. Needless to say, for that money you get an i5 and a 290X, to give you some idea.

Again, this thread makes no sense; there is no way but to guess at what could have been done.
You are all over the place here; I honestly have no idea what you're trying to say.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
The PS4 (1.84 TFLOPS / 1152 stream processors @ 800MHz GPU clock) is far closer to a 7850 (1.76 TFLOPS / 1024 stream processors @ 860MHz) than a 7870 (2.56 TFLOPS / 1280 stream processors @ 1000MHz). (Xbox One = 1.31 TFLOPS / 768 stream processors @ 853MHz.)
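These figures all follow from the standard GCN peak-throughput formula: shader count × 2 FLOPs per cycle (fused multiply-add) × clock. A quick check, using the 7850's 860MHz reference clock (which is what yields the quoted 1.76 TFLOPS):

```python
# Single-precision peak = shaders x 2 FLOPs/cycle (FMA) x clock (GHz),
# giving GFLOPS; divide by 1000 for TFLOPS.

def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

print(f"PS4:      {tflops(1152, 0.800):.2f} TFLOPS")  # 1.84
print(f"HD 7850:  {tflops(1024, 0.860):.2f} TFLOPS")  # 1.76
print(f"HD 7870:  {tflops(1280, 1.000):.2f} TFLOPS")  # 2.56
print(f"Xbox One: {tflops(768, 0.853):.2f} TFLOPS")   # 1.31
```

The gap between the PS4 and a 7870 is mostly clock speed, not shader count, which is why the 7850 is the better comparison point.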
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Red Ring fear drove the console designs dramatically. They focused on low clock speeds and poured a ton of BOM and R&D into heatsinks, fans, and case design. They are designed for a very low chance of heat-related failure, and this was prioritized over speed and pretty much everything else.

I would be a lot happier with the consoles if they just ran the CPU at 2.2GHz instead of 1.6GHz and 1.75GHz.
 

turtile

Senior member
Aug 19, 2014
634
315
136
I think a lot of people are forgetting the extra costs and time a $200 APU would incur. More power means a bigger power supply with better cooling, more expensive motherboard components, and a bigger, more powerful fan/heatsink. Since everything would be bigger, a larger case would have to be used. Heavier components mean more cost to ship to final assembly, then more shipping costs when shipping to wholesalers, and yet again when shipping to retail markets.

On top of that, it would take twice the fab space to produce the chips in the same amount of time.

Point being, a $200 APU would push the final price up by closer to $200, not just $100.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Unless the CPU/GPU is a single die, there are two memory controllers. But if you think the list is flawed, then post what you think it should be.

Memory could easily be unified, even if it had to travel over a HT/QPI link.

But even with two memory pools, you don't need 2x8GB. You would need something like ~2GB and ~6GB.