Console hardware: what Sony/MS went with versus what they should have


Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
And yet both Sony and MS could have done this if they thought it was a good idea; if they wanted to blow 50 watts of their TDP they could have run their 8 cores @ 2.4GHz, that's 50/33% (depending which way you want to count) more perf.

Math just doesn't work that way. 2.4GHz is 50% more than 1.6GHz, or equivalently 1.6GHz is 33% less than 2.4GHz; it's the same gap counted from two different baselines. There's only one way to count that.
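To make the arithmetic concrete, here's a minimal C sketch of the two ways to express that one gap (the numbers are straight from the post above):

```c
#include <stdio.h>

int main(void) {
    const double base = 1.6, boosted = 2.4;  /* clocks in GHz */

    /* Same gap, two baselines: divide by the lower clock or the higher one. */
    printf("%.0f%% more than 1.6GHz\n", 100.0 * (boosted - base) / base);     /* 50%   */
    printf("%.1f%% less than 2.4GHz\n", 100.0 * (boosted - base) / boosted);  /* 33.3% */
    return 0;
}
```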
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Something quite subtle that people may be forgetting is that the xbone/PS4 APU is NOT a true 8-core chip; rather, it is two quad-core Jaguar clusters on a die, connected by a bus. While L2 access on the same module is a minimum of 26 cycles, it increases to a whopping ~190 cycles when accessing cache on the other module (RAM latency is ~220 cycles).

It's the same technique the Core2Quad used; it's no different. And the C2Q didn't perform badly at all compared to native quad-cores.
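For what it's worth, that cross-module penalty is the reason engine threads that share data are usually kept on one cluster. Below is a minimal Linux-only sketch of the idea; the 0-3 core numbering for a cluster, the trivial worker job, and the use of pthreads are illustrative assumptions, not anything from the console SDKs (build with -pthread):

```c
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>

/* Stand-in for a game job that shares data with its siblings through L2. */
static void *worker(void *arg) { (void)arg; return NULL; }

int main(void) {
    /* Restrict the thread to cores 0-3, assumed here to be one quad-core
       cluster, so shared L2 hits stay on-module (~26 cycles) instead of
       crossing the inter-module bus (~190 cycles, nearly the ~220-cycle
       trip to RAM quoted above). */
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int core = 0; core < 4; core++) CPU_SET(core, &set);

    pthread_attr_t attr;
    pthread_attr_init(&attr);
    pthread_attr_setaffinity_np(&attr, sizeof(set), &set);

    pthread_t t;
    pthread_create(&t, &attr, worker, NULL);
    pthread_join(t, NULL);
    return 0;
}
```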


Consumers are loving the new consoles. Games look great and the consoles are profitable. There's just a ton of spec-whores who can't comprehend the difference between code that runs close to the metal and code that goes through a layer-heavy API.

If you think the software engineers who are in the know and the hardware engineers who wrote the specs didn't evaluate all available options, you're out of your minds. Microsoft used both IBM and Intel in the past, and Sony used custom hardware, IBM, and nVidia for the GPU. They wouldn't just up and switch vendors unless the best value couldn't have been achieved otherwise.

Awesome posts, and I agree with all of them. Whoever says this console generation is unsuccessful isn't tuned in to the latest console news. Console gamers are loving it.

The real question is, how long will the console divisions get to live on before they are axed for not living up to the expected profit margins for the companies.

Why see this as a bad thing?
If consoles become less mainstream and lucrative, Microsoft gets out, and Nintendo and Sony become the two remaining players in the field. Game development becomes less graphics-focused and more gameplay/replay/fun-focused, with more and more indie developers, empowered by high-level development tools such as Havok and UE4, making better and better games.
The console market is as doomed as the whole PC market is. Better said, it's a thing that will NEVER die.

Just to summarize: from what I can tell, nobody here has so far been able to present an alternative hardware spec that would be better than the AMD APU used in the PS4/XBONE, assuming the TDP and price of those units stay the same as they currently are. Is that correct?

You won the thread. :p
 

tential

Diamond Member
May 13, 2008
7,348
642
121
There was a target that MS/Sony went for: cost/heat/performance/form factor. Could things have been done better? Maybe, but remember these consoles are over a year old at this point, so you have to keep that in mind when talking about faster APUs and the like.

Isn't the thread titled "Console hardware: what Sony/MS went with versus what they should have" and not
"Console hardware: what Sony/MS went with versus what they should have, given their set price/TDP/etc., for the consoles already in production"?

Changing those price/TDP/etc. requirements is quite essential for an actual discussion of what some users think should happen.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Makes sense, seeing as there is only one clock-speed version. But at the same time, AMD seems very conservative with their APUs; it's usually very easy to push them past stock clocks.

But this is 20% beyond the highest-clocked SKU. There would be yield problems for sure at those clocks.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Isn't the thread titled "Console hardware: what Sony/MS went with versus what they should have" and not
"Console hardware: what Sony/MS went with versus what they should have, given their set price/TDP/etc., for the consoles already in production"?

Changing those price/TDP/etc. requirements is quite essential for an actual discussion of what some users think should happen.

Well then, the answer becomes easy if it can be anything we want it to be: the most powerful hardware available at the lowest price they can charge.

The size is probably about as small as the manufacturer is comfortable with for keeping the hardware protected and cooled. Had the APU under the hood been a 200-watt unit, the case would probably be bigger. But so would the price.

I could just say they should have gone with a socket 2011 hex-core CPU and an R9 290 / GTX 780 class GPU and sold it for under $100. But I'm trying to approach this theoretical discussion with some ties to reality. Where it gets tricky is that "some ties to reality" is a relative idea to each of us.

With that in mind, I don't think MS or Sony were willing to build an $800+-to-produce console given where the world economy was when the Xbox One and PS4 were being designed. I don't see either company being willing to take a huge initial loss that doesn't turn into a profit until three or four years into the cycle. The prices are probably about right for what casual gamers and parents are willing to spend.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Isn't the thread titled "Console hardware: what Sony/MS went with versus what they should have" and not
"Console hardware: what Sony/MS went with versus what they should have, given their set price/TDP/etc., for the consoles already in production"?

Changing those price/TDP/etc. requirements is quite essential for an actual discussion of what some users think should happen.
I'm having a hard time deciphering this. The thread title is meant to ask what, at the time of the consoles' release (and during the engineering choices leading up to it), should have or could have been done differently/better. Some may choose the narrative of the consoles being released now, or in the future, or 8 months later than they were.
But this is 20% beyond the highest-clocked SKU. There would be yield problems for sure at those clocks.
Possible or even likely; I'd have to defer to Sony/MS on this, because it would be odd for them to clock the processor lower if they didn't have to. But what makes me wonder is the thought of them taking guidance from AMD and being told those are the stable clocks at a given yield. AMD seems very conservative in my experience; that's what I was getting at.
 
Apr 20, 2008
10,067
990
126
Apparently engineers in the same position made the decision to use Cell. I think the console CPU is a good compromise, but you can't claim they made such a terrible decision before and a perfect one this time.

It was a failed promise from Sony, which had pitched Cell as an APU of sorts. Once test production started, Cell couldn't really handle both CPU and GPU duties concurrently, so Sony went with nVidia for graphics, which was the best option at the time.

AMD had off-the-shelf parts that could go right into the hands of console makers. The only speculation had to do with the effect of the different memory controllers. A fiasco like IBM's miscalculation of Cell's abilities was not to be repeated.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
I'm having a hard time deciphering this. The thread title is meant to ask what, at the time of the consoles' release (and during the engineering choices leading up to it), should have or could have been done differently/better. Some may choose the narrative of the consoles being released now, or in the future, or 8 months later than they were.

I already told you: they could have chosen to go for a slightly higher price and double the APU budget; it would have yielded something better, either better CPU cores or a better GPU, or maybe both. There was no specific reason to sell the PS4 at $400 other than selling it at cost, because they make money from games and services.

But then you want me to tell you what they could have done, and that's something there is no way to tell; we can't be sure what AMD could have done. They may have used Piledriver modules like they did on Richland, and I don't think that would have increased the APU cost by 100%.

But the thing is, they did not need to. It's not their job to make games run on it; if a game does not run properly, it's someone else's problem.
Just don't come to me and say that using a CPU core architecture meant for tablets and ultraportables was the best option for gaming consoles; this whole thread seems to try to justify exactly that.
 
Apr 20, 2008
10,067
990
126
I already told you: they could have chosen to go for a slightly higher price and double the APU budget; it would have yielded something better, either better CPU cores or a better GPU, or maybe both. There was no specific reason to sell the PS4 at $400 other than selling it at cost, because they make money from games and services.

But then you want me to tell you what they could have done, and that's something there is no way to tell; we can't be sure what AMD could have done. They may have used Piledriver modules like they did on Richland, and I don't think that would have increased the APU cost by 100%.

But the thing is, they did not need to. It's not their job to make games run on it; if a game does not run properly, it's someone else's problem.

Double the APU budget, a slightly higher price? They wouldn't raise the price by only exactly how much more it cost to build. It would also make the consoles much bigger and uglier, with much more power consumption, and that doesn't fit the future SFF revisions most console lines move to. When they get to a ~14nm process, you can bet these will be about the size of the PSone.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I'm having a hard time deciphering this. The thread title is meant to ask what, at the time of the consoles' release (and during the engineering choices leading up to it), should have or could have been done differently/better. Some may choose the narrative of the consoles being released now, or in the future, or 8 months later than they were.

Possible or even likely; I'd have to defer to Sony/MS on this, because it would be odd for them to clock the processor lower if they didn't have to. But what makes me wonder is the thought of them taking guidance from AMD and being told those are the stable clocks at a given yield. AMD seems very conservative in my experience; that's what I was getting at.

I'm saying that locking the TDP/price/etc. requirements for this discussion is silly; it's a hypothetical thread. I'm not talking about the timing relationship.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
For the price point and release timeline, I'd say Sony and MS did just fine with their hardware selections.

Obviously the market was fine with it as well, given the number of units both have sold, to be contrasted with the Wii U, for example.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
I think both companies got pretty good setups. I'm interested to see how PS4 devs leverage the extra compute capability, though, especially considering Sony's history of vector and SIMD processing with the Emotion Engine and Cell. I guess it's just another logical step in their way of looking at things; the difference now is that it's not short-sighted like the EE and Cell were.

I think everyone here is a bit crazy to think that either system is so absolutely horrible when we know that MS and Sony likely ran the numbers and economics of numerous configurations. They are good systems that deliver much more graphics horsepower while providing a common ISA and vastly increased IPC compared to the previous generation. Most developers are probably very happy with the PS4 especially. The Ubisoft debacle with AC Unity is just an example of ambition exceeding finite capabilities.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
The only alternative that would have made any kind of sense at the same cost and TDP points would have been an nVidia SoC with an 8-core Cortex-A15 clocked at 2GHz or so, plus a mid-range Kepler-based GPU. But that would have had the major disadvantages of being 32-bit, of having to use LPAE to access all of the RAM from the CPU, and of not being able to reuse x86-optimized PC middleware.
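To spell out the 32-bit/LPAE drawback: LPAE widens physical addressing to 40 bits, but pointers stay 32-bit, so any single game process can still only map 4GB at a time. A tiny sketch of that arithmetic (compile as 32-bit vs. 64-bit to see the ceiling move):

```c
#include <stdio.h>

int main(void) {
    /* On a 32-bit ARM build pointers are 4 bytes: 4GB of virtual space per
       process, even though LPAE lets the kernel address 2^40 bytes (1TB)
       of physical RAM. A 64-bit build lifts the per-process ceiling. */
    unsigned bits = (unsigned)(sizeof(void *) * 8);
    printf("pointer width: %u bits -> %llu GB of virtual address space\n",
           bits, (unsigned long long)1 << (bits - 30));
    return 0;
}
```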
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
It's the same technique the Core2Quad used; it's no different. And the C2Q didn't perform badly at all compared to native quad-cores.

Awesome posts, and I agree with all of them. Whoever says this console generation is unsuccessful isn't tuned in to the latest console news. Console gamers are loving it.

Why see this as a bad thing?
If consoles become less mainstream and lucrative, Microsoft gets out, and Nintendo and Sony become the two remaining players in the field. Game development becomes less graphics-focused and more gameplay/replay/fun-focused, with more and more indie developers, empowered by high-level development tools such as Havok and UE4, making better and better games.
The console market is as doomed as the whole PC market is. Better said, it's a thing that will NEVER die.

You won the thread. :p

Really? AAA gaming in 2014 was mostly generic rubbish.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
Double the APU budget, a slightly higher price? They wouldn't raise the price by only exactly how much more it cost to build. It would also make the consoles much bigger and uglier, with much more power consumption, and that doesn't fit the future SFF revisions most console lines move to. When they get to a ~14nm process, you can bet these will be about the size of the PSone.

Who knows; it's all speculation. We have no idea how much using Piledriver modules would have influenced TDP and power. Then again, this whole thread wants people to wild-guess on just about everything.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
For the price point and release timeline, I'd say Sony and MS did just fine with their hardware selections.

Obviously the market was fine with it as well, given the number of units both have sold, to be contrasted with the Wii U, for example.

I reject the premise that using a tablet-architecture processor is anywhere near "fine" for a mainstream gaming device.

It's like wanting to build a PC for gaming and using an Avoton C2750 (forget about its price for a moment) instead of an FX-8300 or an i5... I'd just be stupid if I did that.
 
Apr 20, 2008
10,067
990
126
I reject the premise that using a tablet-architecture processor is anywhere near "fine" for a mainstream gaming device.

It's like wanting to build a PC for gaming and using an Avoton C2750 (forget about its price for a moment) instead of an FX-8300 or an i5... I'd just be stupid if I did that.

When you stop calling it a tablet CPU, then you can get over it. Low-voltage Core 2 Duos were in first-gen (and expensive) tablets. They're not the same thing.

The NES, SNES, Genesis, PS2, and Dreamcast, among others, had processors with less than half the performance of high-end off-the-shelf hardware. Guess what? Those had awesome games.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
The NES, SNES, Genesis, PS2, and Dreamcast, among others, had processors with less than half the performance of high-end off-the-shelf hardware. Guess what? Those had awesome games.

Indeed, I can't think of a single console that used a CPU that wasn't grossly weaker at "normal" code than what high end computers of the same time used.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Both the CPU and GPU selections were the best possible solutions before release. As many have pointed out, nothing from Nvidia, IBM, or Intel presented a better case.

Today, however, we have progressed a bit, and Maxwell along with a more powerful CPU would provide better performance and efficiency.

So to answer the question: before release, nothing could beat the AMD combo; today there are alternatives, and those include AMD too.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
I honestly don't get the point of these discussions.

For one, those who complain are generally not the ones who buy. They are complaining about things they do not own and will never own. Why bother complaining about them, then?

Secondly, the number one metric for determining what hardware to include is price. If Intel could have made Sony a chip that was 3x faster and used half the power, but cost 5x as much, Sony would have still gone with AMD. The accountants work out how much money can be spent on internals, and everything has to fit within that. Within their budget, they need to get the most performance at the least power consumption they can.

Oh, and the hardware has to be available a year before launch, or even earlier, so that game development can begin. That put Kaveri, which arrived too late, out of the picture.

So everyone saying it should have had a 7970 and 4 Bulldozer cores is missing the point: you would then be paying $800 for a console, and no one would buy it.

Bulldozer cores are not as power-efficient as Jaguar cores, which means there is less power left over for the GPU. Bulldozer is extremely power-hungry compared to Jaguar and needs high clock speeds to shine, which makes its power consumption worse still. So it had to be the little cat cores, because big cores would have meant unacceptable power consumption at the performance level they wanted.
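A back-of-the-envelope illustration of that power trade-off; every wattage below is an assumed, made-up figure purely for the sake of the arithmetic, not a real console number:

```c
#include <stdio.h>

int main(void) {
    /* Assumed numbers for illustration only: a fixed SoC power budget
       split between the CPU cluster and the GPU. */
    const double soc_budget_w   = 100.0;
    const double jaguar_cores_w = 25.0;  /* assumed: 8 low-power cat cores  */
    const double big_cores_w    = 65.0;  /* assumed: 4 high-clock big cores */

    printf("GPU budget with cat cores: %.0f W\n", soc_budget_w - jaguar_cores_w);
    printf("GPU budget with big cores: %.0f W\n", soc_budget_w - big_cores_w);
    return 0;
}
```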

The other thing to note is that fabbing the chip as a single die will eventually lead to cost savings. This goes back to what I said about cost being the most important metric. However, I would be willing to bet that AMD was not willing to put another manufacturer's CPU on its die, which meant that only AMD CPUs were available. They could have gone with Nvidia, but that would have meant using off-the-shelf ARM cores, which are generally not as quick as the little cat cores.

If I had to guess why they both went AMD, it was simply because AMD bargained the hardest. They were willing to do the most work at the lowest cost, and could also get it to market quickly. Time to market is HUGE in consoles.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
When you stop calling it a tablet CPU, then you can get over it. Low-voltage Core 2 Duos were in first-gen (and expensive) tablets. They're not the same thing.

The NES, SNES, Genesis, PS2, and Dreamcast, among others, had processors with less than half the performance of high-end off-the-shelf hardware. Guess what? Those had awesome games.

More excuses. What are you, Sony PR?

Yes, there are tablet Haswells as well, so what? Unlike all the other examples, Jaguar was designed for low-power ultraportables and tablets (and it failed there, btw).

My example is very valid: I have the option to build a gaming PC with a Bay Trail or Kabini CPU, or with the ultraportable-intended Celeron xxxxU parts. Does that mean "it's fine" if I do?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
More excuses. What are you, Sony PR?

Yes, there are tablet Haswells as well, so what? Unlike all the other examples, Jaguar was designed for low-power ultraportables and tablets (and it failed there, btw).

My example is very valid: I have the option to build a gaming PC with a Bay Trail or Kabini CPU, or with the ultraportable-intended Celeron xxxxU parts. Does that mean "it's fine" if I do?


I guess I don't get it. I don't see a problem with the CPU choice given the options and pricing. If the performance is there, games run well enough on a low-cost console, and there is an obvious leap from the previous generation, I don't care. I'll probably pick up a PS4 when they are closer to $300 and have plenty of hours of enjoyable gaming without thinking too much about the CPU under the hood.

Also, how a piece of hardware performs in a PC is not comparable to how it performs in a console.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
My example is very valid: I have the option to build a gaming PC with a Bay Trail or Kabini CPU, or with the ultraportable-intended Celeron xxxxU parts. Does that mean "it's fine" if I do?

It's not Kabini and it's not Bay Trail; it's a custom 8-core Jaguar. Do you have the option to buy that? No? Then how is this a relevant comparison?