Console hardware: what Sony/MS went with versus what they should have


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
By June 25, 2013, which is cutting it crazy close to the PS4's release date for testing and manufacturing, NV released the GTX 760 with 161W power usage, or 68% more power than an HD 7850 2GB, while being only 45% faster.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_760/25.html

[Chart: TechPowerUp performance per watt, 1920x1080]
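A quick sanity check of the perf/W implication, using just the two relative figures quoted above (a back-of-the-envelope sketch, nothing more):

```python
# Relative perf/W of the GTX 760 vs. the HD 7850 2GB, using the figures above:
# the 760 is ~45% faster but draws ~68% more power.
gtx760_perf, gtx760_power = 1.45, 1.68   # relative to the 7850 (= 1.00, 1.00)

ratio = gtx760_perf / gtx760_power       # 7850 baseline is 1.00 / 1.00
print(f"GTX 760 perf/W relative to HD 7850: {ratio:.2f}x")  # ~0.86x, i.e. ~14% worse
```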


However, that launch was too close to realistically make it into the PS4 given the testing, manufacturing and logistics needed for a Nov 2013 time-frame. To make the launch date, you would have had to choose a GTX 660/660 Ti/670/680, but all of them draw more than 100W, so they are not directly comparable to a 7850/7970M. Your only option at this point is the 680M with its 100W TDP.

The problem is the 680M has a voodoo power rating of 143 vs. 141 for the 7850. You gain essentially nothing, while the former probably costs $300+ from NV. The 680MX had a 122W TDP, so that wouldn't work either.

That brings us to the 770M, 775M and 780M. The 770M is slower than the 680M, so that's a fail. The 775M is basically identical to a 680M in performance, so that's not gonna work either. Finally, the 780M would have worked and provided 30% more performance than the 680M/7970M/PS4's GPU. The 780M had a retail MSRP of $750 USD. With NV's profit margin of about 55-56%, NV would have had to sell that GPU at cost, around $330 USD, just to get 30% more performance than PS4's GPU on launch date.

Considering that in 4-5 years we'll have GPUs 5-10X faster than the PS4, and that an extra 30% would hardly make a difference for the PS4's longevity in 2018-2019, it's fair to say it would have been the dumbest decision in the world to put a 780M inside a PS4/XB1 for a mere 30-40% faster GPU and have Sony take a $400+ USD loss on the GPU alone, because there is no way NV would have sold that GPU with $0 profit.

Therefore, Sony's choice of a 1152 SP GCN part was hands down the best option possible, given that 2 CUs were disabled to protect yields. Of course, if Sony had wanted to take greater losses and live with supply constraints for 6 months or so, they could have released a full-blown 1280 SP GCN part, but in the grand scheme of the console's 5-6 year useful life that's not going to matter. To really make a bigger difference, the GPU would have had to be at 7970GHz/780 level or so, something not possible given their power usage.
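For reference, the raw throughput math behind the 1152 vs. 1280 SP choice (a sketch; 800 MHz is the commonly reported PS4 GPU clock, and GCN does 2 FLOPs per shader per clock):

```python
def gcn_tflops(shaders, clock_mhz):
    # Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per shader per clock
    return shaders * 2 * clock_mhz * 1e6 / 1e12

ps4  = gcn_tflops(1152, 800)   # ~1.84 TFLOPS, the shipping configuration
full = gcn_tflops(1280, 800)   # ~2.05 TFLOPS with all CUs enabled
print(f"1152 SP: {ps4:.2f} TFLOPS, 1280 SP: {full:.2f} TFLOPS (+{full/ps4 - 1:.0%})")
```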

If you throw out the power usage and the integrated PSU, replace air cooling with water cooling, and throw out the $399 price, heck, let's build a $3,000 PC and call it a PS4/XB1. How logical would that be?

-----------
TL;DR:

If we want to get technical, yes, you could have built a much faster PS4:

1) Take the May 30, 2013 GTX 780M, a 100W TDP GPU (about 30% faster than a 7970M/7850 2GB) with an MSRP of $750; with NV's profit margins of 55-56% or so, that implies NV could have sold it at cost for a minimum of ~$330-340.

2) Mobile Core i7 4930MX 57W TDP (or even a 47W TDP i7 such as the 4950HQ).
$657 USD, or with Intel's profit margin of about 60%, about $260-270 at cost.
http://www.notebookcheck.net/Mobile-Processors-Benchmarklist.2436.0.html

Such a console would use about 180W of real world power and cost $600 USD in BOM for the CPU+GPU alone if you could convince NV and Intel to make $0 off the deal. Awesome, what do I win? Nothing, cuz Sony was paying just $100 for the entire APU in PS4 and no way would NV/Intel sell you their parts for even 15% profit margins.
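For anyone who wants to check the arithmetic, here it is spelled out (the MSRPs and margin percentages are the estimates used above, not official figures):

```python
def at_cost(msrp, gross_margin):
    # Rough manufacturing cost implied by an MSRP and an assumed gross margin
    return msrp * (1 - gross_margin)

gpu = at_cost(750, 0.55)   # GTX 780M: ~$338
cpu = at_cost(657, 0.60)   # i7-4930MX: ~$263
print(f"GPU ~${gpu:.0f} + CPU ~${cpu:.0f} = ~${gpu + cpu:.0f} for CPU+GPU alone")
# ~$600, versus the ~$100 Sony reportedly paid for the entire PS4 APU
```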
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
It's not Kabini and it's not Bay Trail, it's a custom 8 core Jaguar. Do you have the option to buy that? No? Then how is this a relevant comparison?

I'm talking about the CPU cores here. It's a custom Jaguar, but it's still a Jaguar; actually, I don't think it should be very far off what an Athlon 5350 + HD 7850/R9 270 + Mantle can do.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
It's not Kabini and it's not Bay Trail, it's a custom 8 core Jaguar. Do you have the option to buy that? No? Then how is this a relevant comparison?

It's really two standard 4-core Jaguar clusters bolted together via a bus.


If we want to get technical, yes, you could have built a PS4 with the May 30, 2013 GTX 780M (about 30-40% faster at most), a 100W TDP GPU, paired with a mobile Core i7 4930MX at 57W TDP (or even a 47W TDP i7 such as the 4950HQ).
http://www.notebookcheck.net/Mobile-Processors-Benchmarklist.2436.0.html

Such a console would use about 180W of real world power and cost $700-800 USD in BOM for the CPU+GPU alone. Awesome, what do I win?

I've already gone into the absurdity of using highly binned and expensive mobile chips for comparison in the other thread.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
2) Mobile Core i7 4930MX 57W TDP (or even a 47W TDP i7 such as the 4950HQ).
$657 USD, or with Intel's profit margin of about 60%, about $260-270 at cost.
http://www.notebookcheck.net/Mobile-Processors-Benchmarklist.2436.0.html

Such a console would use about 180W of real world power and cost $600 USD in BOM for the CPU+GPU alone if you could convince NV and Intel to make $0 off the deal. Awesome, what do I win?

A truckload of chips that didn't make the cut for the top-tier mobile bin. And a "shortest-lived company ever" badge.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Such a console would use about 180W of real world power and cost $600 USD in BOM for the CPU+GPU alone if you could convince NV and Intel to make $0 off the deal.
Of course Intel/Nvidia will NOT go all charity case and give their stuff away. So realistically, if we add everything up, such a console would cost $1000+. Sony has sold over 18 million PS4s so far, an amazing install base for just over a year on the market; a $1000+ unit would be sales suicide.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Double the APU budget, slightly bigger price? They wouldn't just raise the price by exactly how much more it'd cost. It'd make the consoles much bigger and uglier, with much more power consumption. That also doesn't fit the future SFF revisions most consoles move to. When these get near a ~14nm process you can bet they will be about the size of the PSone.

If MS sacrificed some safety margin they could easily cool an APU drawing 25% more power with their current design; they over-engineered the cooling solution. But your point still stands for Sony, whose cooling solution is not overdone at all like that of the Xbox One.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I've already gone into the absurdity of using highly binned and expensive mobile chips for comparison in the other thread.

My point was that a highly binned mobile GPU from AMD/NV is the only possible option to surpass the performance of an ~HD7850 2GB/7970M level of GPU in the PS4 within a still reasonable 100W power envelope. If we remove the power envelope, why not put an HD7990 or a GTX690 inside a PS4? You see what I mean? It's not hard to assemble a "PS4/XB1" with way more power in November 2013: Core i7 4770K @ 4.7GHz on custom water, GTX690 SLI or HD7970GHz Quad-Fire, put it in a large case, slap a Sony PS4 logo on the box. That's not how consoles work. So how do we design a console with a similar power usage to the PS4? The ONLY way to beat that AMD GPU was to use a binned mobile GPU such as the 780M.

Of course Intel/Nvidia will NOT go all charity case and give their stuff away. So realistically, if we add everything up, such a console would cost $1000+. Sony has sold over 18 million PS4s so far, an amazing install base for just over a year on the market; a $1000+ unit would be sales suicide.

I am not even entertaining that idea as being reasonable from a business perspective for Sony; I just provided it as an example. It's pointless to me and I don't know why people argue. Even before the XB1 and PS4 launched, I just looked at the top mobile 100W GPU from NV and AMD and knew right away that's the HIGHEST level of GPU performance the PS4/XB1 could possibly use. 12 months later, it's a variation of the HD7970M part from AMD. :awe: There was hardly a better GPU besides the 780M that Sony could have used. The only point of serious contention is really the CPU part of the consoles -- i.e., going with 4 faster AMD cores or 8 slow ones.

Looking at the power usage of the FX series, they would have needed to severely downclock those chips to hit their power target. Even a desktop i5 would have eaten too much of the total console power budget of 150-180W when paired with an HD7970M-style GPU. And this is without the 100W GPU!

[Chart: CPU power consumption]
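To see why even a midrange desktop CPU blows through that budget, here is a rough ledger (illustrative ballpark figures, not measurements; the Jaguar and i5 load numbers in particular are assumptions):

```python
# Rough power ledger against a 150-180W console target.
# All component figures below are illustrative assumptions, not measurements.
TARGET_W = 180

builds = {
    "desktop i5 + 7970M-class GPU":    {"GPU": 100, "CPU": 75, "RAM/board/drive/fan": 25, "PSU loss": 20},
    "8-core Jaguar + 7970M-class GPU": {"GPU": 100, "CPU": 30, "RAM/board/drive/fan": 25, "PSU loss": 16},
}

for name, parts in builds.items():
    total = sum(parts.values())
    verdict = "over" if total > TARGET_W else "within"
    print(f"{name}: ~{total}W ({verdict} the {TARGET_W}W target)")
```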


What Sony did was maximize the power budget for the graphics card, which by default left almost no choice but to use a low-power CPU. At that point they chose 8 cores, thinking developers could at least try to optimize for more threads. It was a compromise. Now, if they had allowed the console to use 240-250W of power, then sure, we could have been exploring all kinds of possibilities. PS3 paid a heavy price for having a weak GPU and an expensive Cell CPU; the system never lived up to its graphics potential. Sony did the right thing by focusing on the GPU, as it is the most important factor limiting graphical performance in modern games. Since the PS4 isn't trying to hit 60 fps in all games but mostly 30 fps, the CPU plays a less critical role than the GPU in 90% of games. Sure, there will be some cases where a game is heavily CPU-limited, but when allocating the power and cost budget of a console, it's logical to devote 80%+ to the GPU, because that is how modern games tend to behave.

A key point that hasn't been touched on is the lowest common denominator. Even if one of the consoles had a Core i7 5960X and GTX 980 Quad-SLI, since the other 2 consoles would have been underpowered, and since most PCs do not have those specs, 90% of developers would not have coded games to such high specs to begin with. Even though the N64 was more powerful than the PS1, and the original Xbox and GameCube were more powerful than the PS2, developers of cross-platform games rarely took advantage of that extra hardware. We are seeing the same thing with most PS4 games looking and running only slightly better than the Xbox One versions. If you make a console too powerful, it is bound to be much more expensive, and thus has a higher risk of not selling as well, which can seriously hamper early adoption and long-term success. Frankly, having way more powerful hardware in a console doesn't guarantee success - Sega Dreamcast, anyone?
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
RussianSensation, good as your analysis is, you are wasting your time. None of the people complaining will ever buy one of these consoles, and yet they complain. Go figure.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Personally I think the 8 core design wasn't a terrible idea, but I think they should have put at least one big core in there, with more execution units and more IPC to handle the main program thread. I think the SoC would have been a lot more powerful overall with a single large core occupying the space of two smaller ones, for a total core count of 7, maybe even 6. I just don't see any console needing 8 cores, even taking into consideration audio, network, recording, streaming, etc.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
RussianSensation, good as your analysis is, you are wasting your time. None of the people complaining will ever buy one of these consoles, and yet they complain. Go figure.

Thanks! In another thread I already dispelled the myths that Xbox360/PS3 were super powerful consoles to begin with as their hardware actually aged at alarming rates due to how fast PC hardware was evolving at that time.

Just 20 months out from PS3's launch, a GTX280 provided 5.75X the GPU power.
http://forums.anandtech.com/showpost.php?p=37051515&postcount=69

The point is, even if the PS4 used a GTX680, it would have been a matter of time before it was outdated. Can I play Zelda U, Mario Kart 8, Super Smash Bros., Bayonetta 2, Halo 5: Guardians, Forza 6, Gran Turismo 7, or Uncharted 4 on Quad-SLI GM200s and a 5960X? No, I cannot. There is nothing wrong with buying most cross-platform games on the PC, but that doesn't mean there is no room for consoles for social multi-player, sports games, and exclusives, even though consoles can't reach Quad-SLI GM200 levels of graphics. :cool:

Personally I think the 8 core design wasn't a terrible idea, but I think they should have put at least one big core in there, with more execution units and more IPC to handle the main program thread. I think the SoC would have been a lot more powerful overall with a single large core occupying the space of two smaller ones, for a total core count of 7, maybe even 6. I just don't see any console needing 8 cores, even taking into consideration audio, network, recording, streaming, etc.

But again, now you are talking about taking a similar approach to a Cell custom design, which took 4 years and $400+ million from 2001-2005, and ultimately was not worthwhile for any of the partners involved:
http://en.wikipedia.org/wiki/Cell_%28microprocessor%29

This has already been tried and it was a massive failure in PS3. Don't forget that the Cell was never designed as a gaming CPU but was intended for super-computing which at that time justified the investment in R&D by the firms which collaborated on the idea. AMD/NV/Intel aren't about to spend $500 million-1 billion to make a custom CPU for PS4/XB1 only since you wouldn't recoup the costs. Think about who is actually going to finance such a custom CPU specifically for consoles? You won't be able to justify taking a chance at designing a $500+ million custom CPU 3-4 years out for a chance to bid for console design wins. That means the only companies which can possibly get console design wins are those who have already designed the product and have shown it to Sony/MS with minor tweaks possible for the final design. That's why already finished products like Jaguar could have been used for bidding.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
My point was that a highly binned mobile GPU from AMD/NV is the only possible option to surpass the performance of an ~HD7850 2GB/7970M level of GPU in the PS4 within a still reasonable 100W power envelope. If we remove the power envelope, why not put an HD7990 or a GTX690 inside a PS4? You see what I mean? It's not hard to assemble a "PS4/XB1" with way more power in November 2013: Core i7 4770K @ 4.7GHz on custom water, GTX690 SLI or HD7970GHz Quad-Fire, put it in a large case, slap a Sony PS4 logo on the box. That's not how consoles work. So how do we design a console with a similar power usage to the PS4? The ONLY way to beat that AMD GPU was to use a binned mobile GPU such as the 780M.

And my point is that that is an artificial distinction. The chips in the PS4 do not go through such a rigorous binning process (no die harvesting except for two redundant CUs), which is why clocks are low. Thus there is no reason to pick a mobile chip. Mobile chips are the same silicon as their desktop counterparts. Furthermore, much of the efficiency gain in mobile chips comes from the lower speeds they run at (the 780M runs well under 900 MHz compared to the 680/770 at 1100 MHz+). Simply lower the clocks on a stock desktop chip to whatever the desired power numbers allow (yes, mobile will be a little better, but not terribly significantly so).
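The clock-speed point can be put in rough numbers with the usual dynamic-power approximation P ~ f x V^2 (a sketch; the voltages are illustrative, not measured GK104 values):

```python
def relative_dynamic_power(freq_mhz, volts, ref_freq_mhz=1100, ref_volts=1.10):
    # Dynamic power scales roughly with frequency times voltage squared
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Desktop GK104 at ~1100 MHz / ~1.10 V vs. a mobile-style ~850 MHz / ~0.95 V part
print(f"~{relative_dynamic_power(850, 0.95):.0%} of the desktop part's dynamic power")
# roughly 58% -- most of the "mobile efficiency" is simply lower clocks and voltage
```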

Not to mention that the 7970m at 2.18 TFLOPS is 18% more powerful than the 1.84 TFLOP PS4.

My point is not that SLI or a 3770K would ever be possible; it's that mobile chips are not required.

The only other contenders for a console would be Pitcairn, GK104 (downclocked) or GK106. Pitcairn would probably have been the best choice of the three.

Looking at the power usage of the FX series, they would have needed to severely downclock those chips to hit their power target. Even a desktop i5 would have eaten too much of the total console power budget of 150-180W when paired with an HD7970M-style GPU. And this is without the 100W GPU!

Again, downclock the chips. Even Sandy Bridge had no problem running 4C/8T at 2.2 GHz in a 45W envelope (with IGP). With Ivy Bridge, lowering the clocks could easily fit 4C/8T @ 3.0 GHz in 45W (no IGP).

[Charts: CPU power consumption]


50-55W for the 3770K CPU alone at stock clocks. Reduce the clocks to 3.2 GHz and it should easily fit in a 45W TDP. A lot of power consumption can also be cut relative to desktop boards simply because the motherboard and BIOS would be trimmed down to only what is required (which is how a laptop motherboard manages to consume about half the power of a desktop board and POST in 1-3 seconds). An i5 would take even less power.

Taking a 3.2 GHz IVB quad (4GB DDR3) + a 7870 (4GB GDDR5, perhaps with the core clock at 850 MHz to reduce power) would not have put the system over the desired 150-180W power range. It would have been too expensive, but it would have fit the power budget.
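Adding that up with the numbers assumed in this post (a sketch; none of these are measured figures):

```python
# Hypothetical IVB quad + downclocked HD 7870 console, using this post's assumptions
parts = {
    "IVB quad @ ~3.2 GHz":            45,   # assumed to fit a 45W envelope
    "HD 7870 @ ~850 MHz core":       110,   # downclocked from the ~175W desktop card
    "DDR3 + GDDR5 + board/drive/fan": 20,
}
total = sum(parts.values())
print(f"~{total}W before PSU losses")   # ~175W, inside the 150-180W target
```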
 

Piroko

Senior member
Jan 10, 2013
905
79
91
How about this:
Sony/AMD should have given each of those Jaguar modules (4 cores apiece, for lack of a better word) its own power and clock tree. That way they would have had the option to clock four cores at 1.8-2.0 GHz and the other quad at 1.2-1.4 GHz. That would have given them more single-threaded performance for a very reasonable power increase, with enough performance left on the remaining quad to perform system tasks and non-critical gaming tasks (e.g. background streaming of assets).
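A rough feel for the power cost of that split, using the same P ~ f x V^2 hand-waving (the 15W-per-quad baseline and the voltages are pure assumptions, not measured Jaguar figures):

```python
def cluster_watts(freq_ghz, volts, base_freq=1.6, base_volts=1.0, base_watts=15.0):
    # Scale an assumed 15W-per-quad-cluster baseline by frequency and voltage squared
    return base_watts * (freq_ghz / base_freq) * (volts / base_volts) ** 2

stock = 2 * cluster_watts(1.6, 1.00)                          # two quads @ 1.6 GHz
split = cluster_watts(2.0, 1.10) + cluster_watts(1.2, 0.90)   # fast quad + slow quad
print(f"stock ~{stock:.0f}W, split ~{split:.0f}W")            # ~30W vs ~32W, roughly a wash
```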

Now Microsoft... Well, they really have more of a problem with their GPU performance as it seems, so no easy hack there.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
How about this:
Sony/AMD should have given each of those Jaguar modules (4 cores apiece, for lack of a better word) its own power and clock tree. That way they would have had the option to clock four cores at 1.8-2.0 GHz and the other quad at 1.2-1.4 GHz. That would have given them more single-threaded performance for a very reasonable power increase, with enough performance left on the remaining quad to perform system tasks and non-critical gaming tasks (e.g. background streaming of assets).

Now Microsoft... Well, they really have more of a problem with their GPU performance as it seems, so no easy hack there.

If power is the limiter they should have at least allowed this higher clock with some cores disabled. Although it may not even have per-core power gating either. It's also possible that it lacks the dynamic voltage scaling necessary for this.

I wouldn't totally rule out the possibility that a non-insignificant number of the chips can't clock much higher than they do, regardless of power consumption.
 

naukkis

Golden Member
Jun 5, 2002
1,030
854
136
The only other contenders for a console would be Pitcairn, GK104 (downclocked) or GK106. Pitcairn would probably have been the best choice of the three.

And you don't know that the PS4's GPU part is Pitcairn? Sure, in its shipping configuration it has 2 CUs disabled, but without that die harvesting, yields would have been much lower with a fully enabled chip.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
If power is the limiter they should have at least allowed this higher clock with some cores disabled. Although it may not even have per-core power gating either. It's also possible that it lacks the dynamic voltage scaling necessary for this.
Yes, very likely imho. I think that is something not even Beema has?

I wouldn't totally rule out the possibility that a non-insignificant number of the chips can't clock much higher than they do, regardless of power consumption.
Of course it would need to be within acceptable yield parameters, but considering that MS pushed their chip to 1.75 GHz very late in the evaluation process, I think Sony & AMD might have lowballed the clocks. A split power domain with four cores running at even lower speeds should take some pressure off yields as well.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
And you don't know that the PS4's GPU part is Pitcairn? Sure, in its shipping configuration it has 2 CUs disabled, but without that die harvesting, yields would have been much lower with a fully enabled chip.

It's based off Pitcairn, but it is not the actual Pitcairn chip launched in 2012. I mean Pitcairn as in the physical standalone chip.
It's slightly modified from Pitcairn, but the same could have been done for it as well (1252 cores @ 900 MHz). It would allow for die harvesting (with harvested parts sold as consumer GPUs).
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Taking a 3.2 GHz IVB quad (4GB DDR3) + a 7870 (4GB GDDR5, perhaps with the core clock at 850 MHz to reduce power) would not have put the system over the desired 150-180W power range. It would have been too expensive, but it would have fit the power budget.
Did you completely miss my post where I already said they could have used a mobile 780M GPU from NV and an i7 4950HQ on a power-efficient mobile Intel board, and would have achieved way more performance within their 180W power usage? You keep arguing that no one would use a mobile part, but we can safely use the 780M as the upper-bound GPU spec within a 100W TDP at that time, and the 4950HQ as close to the upper-bound CPU spec. That's the whole point of using those components as a point of reference.

Of course, none of this matters, since component cost and availability are major factors in the console budget and in the forecasted profitability of the console business model over its lifecycle. The conclusion of this thread remains the same -- unless Sony (or any other 3rd party involved) wanted to take a hit on yields, thus unlocking the full 1280 SPs in the GPU and upping the clocks of the CPU to 1.75-1.8GHz (by allocating an extra $20 towards a beefier heatsink) -- it was impossible to manufacture a more powerful console than the PS4 when taking into account the console's cost, power usage and form factor. This is not surprising considering that professionals at AMD advised Sony on the best component selection -- people who know more about perf/watt, yields, cost and all things CPUs and GPUs than anyone on our forum. The fact that Sony was able to get nearly 7970M-level performance and 8GB of GDDR5, along with a built-in PSU, is an outstanding achievement. Sales of 18.5 million in the first year reflect that their engineers/designers/marketers did an A+ job.

Because these consoles have not been overspecced and overpriced, we are likely to see a PS5/XB2 by 2019 instead of 2021 (Xbox 360 in 2005 --> XB1 in 2013 = an 8-year generation!). For long-term PC gaming and for console gaming, I would rather have the current $399 PS4, with a PS5 launch by 2019, than a $599 PS4 with a 30-40% faster 780M and a PS5 by 2021 due to the extra years required to recoup the losses.
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
Because these consoles have not been overspecced and overpriced, we are likely to see a PS5/XB2 by 2019 instead of 2021 (Xbox 360 in 2005 --> XB1 in 2013 = an 8-year generation!). For long-term PC gaming and for console gaming, I would rather have the current $399 PS4, with a PS5 launch by 2019, than a $599 PS4 with a 30-40% faster 780M and a PS5 by 2021 due to the extra years required to recoup the losses.
I expect the next gen of consoles to hit the market near ~2018, and unless something major happens, like ARM catching up with low-power x86 chips, AMD is realistically the only chipmaker that'll power them. Of course, by then we'll have full-time HSA, hUMA, Sabertooth (Jaguar's evolution), HBM, GCN 3.x et al, and will probably see the full array of AMD-promoted tech at its very best.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
I think NV have shown enough already that they could in theory do something based on Tegra to cover the next set of consoles.

No idea what they'll use in the end of course - lots will change by then anyway :)
 

MeldarthX

Golden Member
May 8, 2010
1,026
0
76
We all know the spat that happened over the first Xbox between MS, Intel and Nvidia. There was a major reason why the 360 didn't have Nvidia in it. Sony wasn't happy with what Nvidia gave them either... *and those chips were also part of bumpgate*

Nintendo has been very happy with what ATI/AMD has been giving them, so don't expect to see any changes there.

Also, don't expect Nvidia to honestly be in the running unless they drop their margins *which they won't, and the same goes for Intel*...

People need to understand the 3 major factors in the design decisions for the Xbox One and PS4: power, price, and wanting it as a single package *which leads back to price*.

Nvidia had nothing that could fit the bill, plus they've burnt their bridges with both Sony and MS...

Intel again had nothing that could fit the bill either, for two reasons. One: their graphics suck *while they have gotten better, their drivers still suck, especially for color reproduction compared to Nvidia and AMD*. Two: price. Intel doesn't cut margins...

AMD had the best combo for what both Sony and MS needed at the time. People keep saying Jaguar cores are weak, but they really aren't, and it is a custom 8-core CPU... it's not tapped out at all.

I mean, the PS3 was only just finally tapped out last year... there's still quite a bit of power left untapped... and well, the Xbox One needs its tools fixed ASAP as they are a mess, which is why DX12 is coming there first.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0

Everything you've said above is true except that last paragraph. The PS3 and X360 were functionally tapped out in terms of performance around 2010, give or take a year. Since then, visual quality in games stopped progressing while they started dropping resolution and frame rate targets to compensate.

Also, according to Phil Spencer directly, one shouldn't expect miracles from DX12 on the Xbox One.
- http://wccftech.com/phil-spencer-directx-1-xbox-one-hardware-graphics/
 

MeldarthX

Golden Member
May 8, 2010
1,026
0
76
Everything you've said above is true except that last paragraph. The PS3 and X360 were functionally tapped out in terms of performance around 2010, give or take a year. Since then, visual quality in games stopped progressing while they started dropping resolution and frame rate targets to compensate.

Also, according to Phil Spencer directly, one shouldn't expect miracles from DX12 on the Xbox One.
- http://wccftech.com/phil-spencer-directx-1-xbox-one-hardware-graphics/


Oh, I'm not expecting miracles exactly, but DX12 will help compared to the crap tools they have out now... ;)

Second, people in the field said pretty much the opposite; while 2010 was a big year with Uncharted 2... when you look at Uncharted 3, which was late 2011, there was a world of difference between the two.

Then look at The Last of Us; it again was a step up from Uncharted 3 and Beyond: Two Souls... as I said, the PS3 really didn't get fully tapped out until then, which was 2013... ;)

Even then there is most likely still some performance left on the table with the 360 and PS3, but with the new consoles out... it will never be pushed.

*if you want links I can provide them; Carmack's interview from 2013; Richmond's comments in 2011* ;)
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
In my opinion, for a console the PS4 has great specs, and if devs work to tap it like they did with the Cell SPUs, great things will come out.

GPU: 8GB of GDDR5, 32 ROPs and almost 2 TFLOPS of performance is very respectable. For a system targeting 1080p 30 FPS it is fine.

The CPU is lower end, but having 6-7 Jaguar cores is okay-ish if the following are considered:

1) Jaguar has ~1/3 of a modern core's integer performance, and if you have 6-7 of them, you end up with a Cinebench MT score of 2.x or so @1.5GHz (see the rough sketch after this list).
2) You are required to offload serious FP load to GPU compute.
3) Less graphics API overhead means less pressure on the main game rendering thread. Low-level access could even allow multithreaded submission to the GPU as long as the dev controls things.
4) The usual console benefits of having clear target hardware and total control of threads (as in, if you schedule a thread on a core you know it will complete and will not get interrupted by the user's browser burning CPU on Flash video).
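Putting very rough numbers on point 1 (illustrative only; the 1/3 ratio comes from the list above, while the reference single-thread score and MT scaling factor are assumptions):

```python
# Ballpark Cinebench 11.5 MT estimate for 6-7 Jaguar cores available to games
MODERN_CORE_ST = 1.6             # assumed CB 11.5 single-thread score of a modern desktop core
jaguar_st = MODERN_CORE_ST / 3   # "~1/3 of a modern core's integer performance"

for cores in (6, 7):
    mt = jaguar_st * cores * 0.8      # assume ~80% multi-threaded scaling
    print(f"{cores} Jaguar cores: ~{mt:.1f} CB points")
# ~2.6-3.0, i.e. the "2.x or so" ballpark described above
```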

Seeing what PS3 devs did with the Cell, I have every faith that great things will come out of the PS4, and much sooner in the console cycle. Sure, it will get "tapped out" earlier and perf/pixel is fixed, but rendering techniques will continue to improve and 8GB is a massive playground.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Personally I think the 8 core design wasn't a terrible idea, but I think they should have put at least one big core in there, with more execution units and more IPC to handle the main program thread. I think the SoC would have been a lot more powerful overall with a single large core occupying the space of two smaller ones, for a total core count of 7, maybe even 6.
But where is it going to fit? In practice, BD is barely wider than Jaguar, but a module would take up all the space of the quad Jaguar cluster, and probably then some on top of that, plus it needs a ton more power to reach the faster clocks required to perform well. In addition, an SR module (1 unit) has less FP performance per thread than a quad Jaguar cluster (1 unit) at similar clocks.

Of course a fat core would be nice to have, but it would also be relatively expensive to include.

I just don't see any console needing 8 cores, even taking into consideration audio, network, recording, streaming, etc.
Audio and network will not be CPU bottlenecks: a lot of the audio is handled in dedicated hardware, and the video encoding is done in dedicated hardware. None of that makes much use of the CPUs.