Microsoft releases Xbox One details: 5 billion transistors, GPU/CPU shared memory


insertcarehere

Senior member
Jan 17, 2013
712
701
136
Taking into account that the Xbox One has a bigger die, while the PS4 has GDDR5 instead of DDR3 + eSRAM and a different GPU, I wonder which console is actually more expensive to produce at the end of the day? (Let's take Kinect out of the equation for a moment.)

I am guessing that the PS4 will be more expensive to manufacture, with GDDR5 and a smaller cooling system that needs to dissipate more power. Considering that the PS4 is being sold at a 'small' loss, I wouldn't be surprised if the X1 is breaking even at launch, even with Kinect.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
It might get worse for the XB1 as developers learn new tricks on the PS4, because the PS4 is "the only console that will support the company’s next-generation heterogeneous unified memory architecture (hUMA)." ~ Source

Or it might NOT!

I mean, what's the point? One out of two in the console space and one out of three in the PC gaming space.

All of a sudden hUMA does not sound so unified, does it?
So much for developers having one standardized target architecture.
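
To make the hUMA claim concrete: the promise is that the CPU and GPU share one coherent address space, so data never has to be staged back and forth. Here's a minimal C sketch of the idea; gpu_process() is just a hypothetical stand-in for GPU work, not any real API:

```c
/* Conceptual sketch only: gpu_process() is a stand-in for real GPU work.
 * On a non-unified system the CPU must copy data into a separate GPU
 * buffer; under hUMA both devices can dereference the same pointer. */
#include <stdlib.h>
#include <string.h>
#include <stdio.h>

#define N 1024

/* Pretend this runs on the GPU. */
static void gpu_process(float *data, size_t n) {
    for (size_t i = 0; i < n; i++)
        data[i] *= 2.0f;
}

int main(void) {
    float *cpu_buf = malloc(N * sizeof *cpu_buf);
    for (int i = 0; i < N; i++) cpu_buf[i] = (float)i;

    /* Discrete / non-unified path: stage a copy into "GPU memory". */
    float *gpu_buf = malloc(N * sizeof *gpu_buf);   /* stand-in for VRAM */
    memcpy(gpu_buf, cpu_buf, N * sizeof *cpu_buf);  /* PCIe-style copy   */
    gpu_process(gpu_buf, N);
    memcpy(cpu_buf, gpu_buf, N * sizeof *cpu_buf);  /* copy results back */

    /* hUMA-style path: one coherent address space, no staging copies. */
    gpu_process(cpu_buf, N);

    printf("cpu_buf[1] = %f\n", cpu_buf[1]);  /* 1 * 2 * 2 = 4 */
    free(gpu_buf);
    free(cpu_buf);
    return 0;
}
```

On a discrete setup the two memcpy() calls are the round trip developers try to hide; under hUMA they simply disappear.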
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So much for developers having one standardized target architecture.

1st party developers will squeeze the most out of the PS4, just like they did with the PS3. We probably aren't going to see major differences in 3rd party titles, but in 1st party titles we should see more dramatic differences. The way I am looking at it now, the PC will be my main platform for cross-platform 3rd party titles, while, if the PS3 is anything to go by, the PS4 will have better 1st party games than the XB1. That leaves the XB1 in an odd position, being somewhat redundant for anyone who buys a PC + PS4. The Xbox 360 suffered the same fate. With games like Titanfall and The Division coming to the PC, consoles themselves are being undermined: the fewer exclusives they have, the less exciting they are for a PC gamer.
 

Makaveli

Diamond Member
Feb 8, 2002
4,960
1,557
136
40% isn't going to be enough to create a huge IQ separation. I think the PS4 will look slightly better, but I don't think console users will care.

This is petty stuff computer nerds argue over.

Agreed, balla.

The mouth breathers :) who make up the majority of console players won't be able to tell the difference, or won't care.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I am guessing that the PS4 will be more expensive to manufacture, with GDDR5 and a smaller cooling system that needs to dissipate more power. Considering that the PS4 is being sold at a 'small' loss, I wouldn't be surprised if the X1 is breaking even at launch, even with Kinect.

Did they actually say the PS4 is being sold at a loss?
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Did they actually say the PS4 is being sold at a loss?

From what I've heard, they're at least selling it at cost. While GDDR5 is a little extravagant, the technology has been around long enough that it's reasonable, and the Jaguar-based CPU and the Pitcairn-esque GPU aren't exactly expensive. The Kinect 2.0 in the XB1, though, is so expensive that Microsoft is actually the one selling at a loss.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Wait, the numbers are counting the transistors of the shared memory? Wat? Or is that just the eDRAM?
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
40% isn't going to be enough to create a huge IQ separation. I think the PS4 will look slightly better, but I don't think console users will care.

This is petty stuff computer nerds argue over.

When it comes to FPS games, the PS4 will offer a decent frame-rate advantage, and that's something console users will definitely care about.

Also, the GDDR5 will allow the PS4 to throw around larger textures. The One's eSRAM will really only be useful for the framebuffer (AA performance, etc.). 3-4 years from now I bet we'll see a difference.
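
Some back-of-the-envelope numbers on that framebuffer/AA point, assuming plain RGBA8 color and a 32-bit depth buffer (real GPUs compress these, so treat the figures as upper bounds):

```c
/* Back-of-the-envelope render target sizes at 1080p, assuming RGBA8
 * color and a 32-bit depth buffer (real GPUs compress, so treat these
 * as upper bounds). */
#include <stdio.h>

int main(void) {
    const double px  = 1920.0 * 1080.0;         /* pixels per target */
    const double mib = 1024.0 * 1024.0;

    double color = px * 4 / mib;                /* 4 bytes per pixel */
    double depth = px * 4 / mib;
    printf("No AA:   color %.1f MiB + depth %.1f MiB = %.1f MiB\n",
           color, depth, color + depth);

    /* 4x MSAA stores 4 samples per pixel for both color and depth. */
    printf("4x MSAA: color %.1f MiB + depth %.1f MiB = %.1f MiB\n",
           color * 4, depth * 4, (color + depth) * 4);
    return 0;
}
```

At 4x MSAA a single 1080p color + depth pair is already around 63 MiB, roughly double the XB1's 32 MB of eSRAM, which is why AA is the obvious pressure point.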
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Good to know that the Xbox will have an audio processor; I thought that only the PS4 was going to have one...

...well, more free CPU cycles, never too bad :biggrin:
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Wait, the numbers are counting the transistors of the shared memory? Wat? Or is that just the eDRAM?
That would be just the APU die. So that's the GPU + CPU + eSRAM + odds & ends (IMC, buses, etc).
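
For a rough sense of how much of the 5 billion the eSRAM alone accounts for, assuming a classic 6-transistor SRAM cell and ignoring tags, redundancy, and peripheral logic:

```c
/* Rough share of the 5B-transistor budget taken by 32 MiB of eSRAM,
 * assuming a classic 6-transistor SRAM cell and ignoring tag,
 * redundancy, and peripheral logic. */
#include <stdio.h>

int main(void) {
    const double bits    = 32.0 * 1024 * 1024 * 8; /* 32 MiB in bits  */
    const double esram_t = bits * 6.0;             /* 6T per SRAM bit */
    const double total_t = 5e9;                    /* quoted die total */

    printf("eSRAM: %.2f billion transistors (%.0f%% of the die)\n",
           esram_t / 1e9, 100.0 * esram_t / total_t);
    return 0;
}
```

So roughly a third of the transistor budget is just the 32 MiB of eSRAM, which is how the XB1 die can be bigger yet carry the weaker GPU.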
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It does have some eSRAM that helps boost the memory bandwidth. The PS4 uses the same CPU/GPU config but with GDDR5.



It's just memory bandwidth; texture caching will be slower and such. Maybe it will hinder AA performance as well in some circumstances.

The PS4 does not have the same GPU as the XBone. The PS4's GPU should be roughly 50% faster, not even taking the memory differences into account.
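
The 40% and 50% figures floating around this thread both fall out of the same arithmetic, assuming the widely reported specs: GCN CUs with 64 lanes doing 2 FLOPs per clock, 18 CUs at 800 MHz for the PS4 and 12 CUs at the announced 853 MHz for the XB1.

```c
/* Where the "40-50% faster" figures come from, assuming the widely
 * reported specs: GCN CUs with 64 lanes doing 2 FLOPs/cycle (FMA),
 * 18 CUs @ 800 MHz for PS4 vs 12 CUs @ 853 MHz for XB1. */
#include <stdio.h>

static double tflops(int cus, double mhz) {
    return cus * 64 * 2 * mhz * 1e6 / 1e12;  /* lanes * FMA * clock */
}

int main(void) {
    double ps4 = tflops(18, 800.0);   /* ~1.84 TFLOPS */
    double xb1 = tflops(12, 853.0);   /* ~1.31 TFLOPS */

    printf("PS4 %.2f TFLOPS, XB1 %.2f TFLOPS\n", ps4, xb1);
    printf("CU advantage:    %.0f%%\n", 100.0 * (18 - 12) / 12.0);
    printf("FLOPS advantage: %.0f%%\n", 100.0 * (ps4 - xb1) / xb1);
    return 0;
}
```

So: 50% more CUs, but about 40% more raw FLOPS once the XB1's higher GPU clock is counted, which is why both numbers show up in this thread.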
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
From what I've heard, they're at least selling it at cost. While GDDR5 is a little extravagant, the technology has been around long enough that it's reasonable, and the Jaguar-based CPU and the Pitcairn-esque GPU aren't exactly expensive. The Kinect 2.0 in the XB1, though, is so expensive that Microsoft is actually the one selling at a loss.

Using GDDR5 gives Sony a long-term advantage, as it'll get cheaper to produce once the industry moves away from DDR3 next year.
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
Using GDDR5 gives Sony a long-term advantage, as it'll get cheaper to produce once the industry moves away from DDR3 next year.
GDDR5 is based on DDR3 and has been around for a while. When the industry moves to DDR4, I'm guessing we'll see a GDDR6 based on it.
 

CakeMonster

Golden Member
Nov 22, 2012
1,620
798
136
DDR3 is even cheaper to produce, and the eSRAM counteracts the GDDR5 advantage to some degree. I suspect the PS4 has the performance advantage, but GDDR5 is not cheaper than DDR3.
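
For reference, the peak-bandwidth math behind this comparison, using the widely reported specs: 256-bit GDDR5 at 5.5 Gbps per pin for the PS4, 256-bit DDR3-2133 for the XB1, plus the ~102 GB/s figure quoted for the eSRAM.

```c
/* Peak bandwidth math for the two memory systems, using the widely
 * reported specs: 256-bit GDDR5 at 5.5 Gbps/pin (PS4) vs 256-bit
 * DDR3-2133 (XB1), plus ~102 GB/s quoted for the eSRAM. */
#include <stdio.h>

static double gbps(double bus_bits, double gbps_per_pin) {
    return bus_bits / 8.0 * gbps_per_pin;  /* bytes/s in GB/s */
}

int main(void) {
    double ps4_gddr5 = gbps(256, 5.5);    /* 176 GB/s   */
    double xb1_ddr3  = gbps(256, 2.133);  /* ~68.3 GB/s */

    printf("PS4 GDDR5: %.1f GB/s across all 8 GB\n", ps4_gddr5);
    printf("XB1 DDR3:  %.1f GB/s, plus ~102 GB/s but only for 32 MB\n",
           xb1_ddr3);
    return 0;
}
```

The catch is that the eSRAM's ~102 GB/s only applies to a 32 MB window, while the PS4's 176 GB/s covers the whole 8 GB pool.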
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
DDR3 is even cheaper to produce, and the eSRAM counteracts the GDDR5 advantage to some degree. I suspect the PS4 has the performance advantage, but GDDR5 is not cheaper than DDR3.

The eSRAM in the XB1 is a very small amount, which I suspect will be the problem.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
There is not much debate. If you can afford it, GDDR5 is superior to DDR3 + a small amount of eSRAM/eDRAM. The fact that all high-end GPUs are 256/384-bit bus cards over GDDR5 supports this. The XB1's approach is a cost-cutting solution. They used a similar approach for the Xbox 360, and it didn't really work to provide the 360 with the free 4x AA they promised. The PS4's GPU sub-system most closely resembles a modern GPU, where you have access to the memory bandwidth without having to go through any optimizations.

I was not talking about consoles specifically, but rather the APU market as a whole.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Why did they choose 32 MB of eSRAM?

For a 1080p image it seems tiny.

Most likely cost and lack of space on the die. And it is tiny. But don't expect many games to run at 1080p. The majority of them will be 720p, just like on the 360.
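
The arithmetic behind the 720p prediction, assuming a deferred-style setup with four RGBA8 render targets plus a 32-bit depth buffer (an illustrative layout; real engines vary):

```c
/* Why 32 MiB pushes developers toward 720p: a deferred-style setup
 * with four RGBA8 render targets plus a 32-bit depth buffer (an
 * assumed layout for illustration; real engines vary). */
#include <stdio.h>

static double gbuffer_mib(double w, double h) {
    const double bpp = 4.0;                    /* bytes per pixel/target */
    return w * h * bpp * 5.0 / (1024 * 1024);  /* 4 color + 1 depth */
}

int main(void) {
    printf(" 720p G-buffer: %.1f MiB (fits in 32 MiB eSRAM)\n",
           gbuffer_mib(1280, 720));
    printf("1080p G-buffer: %.1f MiB (does not fit)\n",
           gbuffer_mib(1920, 1080));
    return 0;
}
```

A 720p G-buffer fits in the eSRAM with room to spare; a 1080p one doesn't, so something has to spill out to the much slower DDR3.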
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Why did they choose 32 MB of eSRAM?

For a 1080p image it seems tiny.

Here's an interesting quip from AnandTech on the size of Crystalwell (GT3e). The same logic should apply to the XB1.

There’s only a single size of eDRAM offered this generation: 128MB. Since it’s a cache and not a buffer (and a giant one at that), Intel found that hit rate rarely dropped below 95%. It turns out that for current workloads, Intel didn’t see much benefit beyond a 32MB eDRAM; however, it wanted the design to be future proof. Intel doubled the size to deal with any increases in game complexity, and doubled it again just to be sure. I believe the exact wording Intel’s Tom Piazza used when explaining why it’s 128MB was “go big or go home”. It’s very rare that we see Intel be so liberal with die area, which makes me think this 128MB design is going to stick around for a while.

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3
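
To see why that 95% hit rate is the headline number, here's a toy weighted-average model of access cost; the relative costs are illustrative placeholders, not Crystalwell or XB1 measurements:

```c
/* Why a 95% hit rate matters: a simple weighted-average model of
 * effective access cost. The cost numbers are illustrative
 * placeholders, not Crystalwell or XB1 measurements. */
#include <stdio.h>

static double avg_cost(double hit_rate, double hit_cost, double miss_cost) {
    return hit_rate * hit_cost + (1.0 - hit_rate) * miss_cost;
}

int main(void) {
    const double edram = 1.0;   /* relative cost of an eDRAM/eSRAM hit */
    const double dram  = 4.0;   /* relative cost of going to DRAM      */

    for (double h = 0.80; h <= 0.96; h += 0.05)
        printf("hit rate %.0f%% -> average cost %.2fx\n",
               100 * h, avg_cost(h, edram, dram));
    return 0;
}
```

Every few points of hit rate shave a meaningful slice off the average cost, which is why a cache that "rarely drops below 95%" can stand in for a much wider external bus.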