Microsoft releases Xbox One details: 5 billion transistors, GPU/CPU shared memory

 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Still DDR3, sadly. They can't really expect this to last 10 years, can they? At the breakneck pace of smartphone SoC improvements, we'll have phones which are better than the Xbox One in 5. The same kind of goes for the PS4, though they might have a year or two more of relevancy thanks to the beefier graphics chip and GDDR5 memory.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
Still DDR3, sadly. They can't really expect this to last 10 years, can they? At the breakneck pace of smartphone SoC improvements, we'll have phones which are better than the Xbox One in 5. The same kind of goes for the PS4, though they might have a year or two more of relevancy thanks to the beefier graphics chip and GDDR5 memory.
Ehh, I'm thinking that breakneck pace will slow pretty soon, possibly due to power limits, and will soon resemble something like Intel's incremental improvements.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Still DDR3, sadly. They can't really expect this to last 10 years, can they? At the breakneck pace of smartphone SoC improvements, we'll have phones which are better than the Xbox One in 5. The same kind of goes for the PS4, though they might have a year or two more of relevancy thanks to the beefier graphics chip and GDDR5 memory.

It does have some eSRAM that helps boost the memory bandwidth. The PS4 uses the same CPU/GPU config but with GDDR5.

just how big of a difference in graphics performance does it make between DDR3 and GDDR5?

It's just memory bandwidth; texture caching will be slower and such. Maybe it will hinder AA performance as well in some circumstances.
 
Last edited:

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
just how big of a difference in graphics performance does it make between DDR3 and GDDR5?
GDDR5 has roughly three times the bandwidth (varies based on clock speed/implementation). The main downsides are higher cost and power consumption.
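Rough back-of-the-envelope numbers, since peak bandwidth is just effective data rate times bus width (the configurations below are the ones the two consoles are reported to use, so treat the exact figures as assumptions):

Code:
def bandwidth_gb_s(data_rate_mt_s, bus_width_bits):
    # Peak theoretical bandwidth: transfers per second x bytes per transfer
    return data_rate_mt_s * (bus_width_bits / 8) / 1000

print(bandwidth_gb_s(2133, 256))  # DDR3-2133, 256-bit bus   -> ~68 GB/s (XB1-style)
print(bandwidth_gb_s(5500, 256))  # 5.5 Gbps GDDR5, 256-bit  -> ~176 GB/s (PS4-style)

Those match the 68 GB/s and 176 GB/s figures quoted further down in the thread.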

Today's CPUs aren't built to take advantage of lots of bandwidth, so there would be almost no point in installing GDDR5 in a desktop computer (if it were possible, that is). However, bandwidth is a graphics processor's best friend, and there is a lot of debate in regards to what solution to the "bandwidth problem" is optimal.

The XBONE uses DDR3 + eSRAM. Haswell GT3e uses DDR3 + eDRAM. The PS4 uses GDDR5. GDDR5M has been proposed as another possible solution to APU bandwidth issues. Finally, wider memory buses could theoretically work; however, they present issues in regards to cost and they don't work well in certain form factors.

It appears that stacked memory is the "ideal" solution; however, it is not quite ready for market. DDR4 would help as well, but it is also not on the market yet, nor is the eventual GDDR6, which is based on DDR4.

In short, there are a lot of really important technologies on the horizon, however they are not here in time, and the industry players have adopted many different methods in an attempt to buy time.
 
Last edited:

Greenlepricon

Senior member
Aug 1, 2012
468
0
0
It does have some eSRAM that helps boost the memory bandwidth. The PS4 uses the same CPU/GPU config but with GDDR5.



It's just memory bandwidth, texture caching will be slower and such. Maybe it will hinder AA performance as well in some circumstances.

It's claimed that the eSRAM will get 204 GB/s peak bandwidth, which is a pretty good speed compared to the DDR3.

https://twitter.com/Daniel_Bowers/status/372056251611365377

With only 32 MB of that though, and the possibility that it could be used for something like the OSes at the same time, I'm not sure about it as a foolproof solution. Still better than nothing, and it might work for some small frame buffers or something. We'll have to wait for something more concrete on performance when it actually comes out.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
just how big of a difference in graphics performance does it make between DDR3 and GDDR5?

Should be a very large difference in the overall context of GPU performance.

Based on specs, the GPU in XB1 is around HD7770 level. The HD7770 has 1.28 TFLOPS at a 1 GHz clock with 640 SPs, 16 ROPs, and 72 GB/s of memory bandwidth. The GPU in XB1 has about 1.31 TFLOPS, is clocked at ~850 MHz, and has 768 SPs, 16 ROPs, and 68 GB/s of memory bandwidth.

PS4 is actually slightly faster than the HD7850 2GB. The HD7850 has 1.76 TFLOPS at 860 MHz with 1024 SPs, 32 ROPs, and 154 GB/s of memory bandwidth. The GPU in PS4 supposedly has 1.84 TFLOPS at 800 MHz with 1152 SPs, 32 ROPs, and 176 GB/s of memory bandwidth.
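A quick sanity check on those TFLOPS figures, since GCN does 2 FLOPS per shader per clock (the clocks are the approximate ones quoted above):

Code:
def tflops(shaders, clock_ghz):
    # Theoretical single-precision throughput: shaders x 2 FLOPS/clock x clock
    return shaders * 2 * clock_ghz / 1000

print(tflops(640, 1.00))    # HD7770  -> 1.28 TFLOPS
print(tflops(768, 0.85))    # XB1 GPU -> ~1.31 TFLOPS
print(tflops(1024, 0.86))   # HD7850  -> ~1.76 TFLOPS
print(tflops(1152, 0.80))   # PS4 GPU -> ~1.84 TFLOPS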

With XB1's eSRAM, it might squeeze closer to HD7790 in performance.

Case 1: eSRAM is not efficiently used --> XB1 = HD7770
At 1080p with no AA, the HD7850 is 57% faster; with AA, 55% faster
http://www.computerbase.de/artikel/grafikkarten/2013/nvidia-geforce-gtx-770-im-test/4/

Case 2: eSRAM is used to push XB1 to HD7790 level of performance
At 1080p with no AA, the HD7850 is 24% faster; with AA, 21% faster

It might get worse for XB1 as developers learn new tricks with PS4 because PS4 is "the only console that will support the company’s next-generation heterogeneous unified memory architecture (hUMA)." ~ Source

The 1st wave of games like NFS is already going to look better on PS4, and that's without developers tapping into the compute or HSA advantages of PS4's GPU:
http://www.videogamer.com/ps4/need_...r_on_one_next-gen_console_than_the_other.html

I have little doubt that PS4's 1st party games will look better than XB1's over the life of the console.

It's claimed that the eSRAM will get 204 GB/s peak bandwidth, which is a pretty good speed compared to the DDR3.

https://twitter.com/Daniel_Bowers/status/372056251611365377

With only 32 MB of that though, and the possibility that it could be used for something like the OSes at the same time, I'm not sure about it as a foolproof solution.

“On this platform I’d be concerned with memory bandwidth. Only DDR3 for system/GPU memory paired with 32MB of “ESRAM” sounds troubling. 32MB of ESRAM is only really enough to do forward shading with MSAA using only 32-bits/pixel color with 2xMSAA at 1080p or 4xMSAA at 720p. Anything else to ESRAM would require tiling and resolves like on the Xbox 360 (which would likely be a DMA copy on 720) or attempting to use the slow DDR3 as a render target.

I’d bet most titles attempting deferred shading will be stuck at 720p with only poor post process AA (like FXAA). If this GPU is pre-GCN with a serious performance gap to PS4, then this next Xbox will act like a boat anchor, dragging down the min-spec target for cross-platform next-generation games.”
~ Source
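The arithmetic behind that MSAA claim is easy to reproduce. A rough sketch, assuming a single 32-bit color target plus a 32-bit depth/stencil buffer and no compression (those assumptions are mine, not from the quote):

Code:
def target_mb(width, height, bytes_per_pixel, samples):
    # Size of one multisampled render target in MB
    return width * height * bytes_per_pixel * samples / (1024 * 1024)

for w, h, samples in [(1920, 1080, 2), (1280, 720, 4)]:
    color = target_mb(w, h, 4, samples)
    depth = target_mb(w, h, 4, samples)
    print(w, h, samples, round(color + depth, 1), "MB")  # vs 32 MB of eSRAM

Both cases land just under 32 MB (about 31.6 MB at 1080p/2x and 28.1 MB at 720p/4x), which is why anything heavier means tiling or spilling render targets to DDR3.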

XB1 costs $100 more than the more powerful PS4, has a non-upgradeable HDD, and its GPU is going to end up 25-50% slower when pushed to the limit. Right now XB1 looks underpowered and overpriced for what you get.

Today's CPUs aren't built to take advantage of lots of bandwidth, so there would be almost no point in installing GDDR5 in a desktop computer (if it were possible, that is). However, bandwidth is a graphics processor's best friend, and there is a lot of debate in regards to what solution to the "bandwidth problem" is optimal.

There is not much debate. If you can afford it, GDDR5 is superior to DDR3 + a small amount of eSRAM/eDRAM. The fact that all high-end GPUs are 256/384-bit bus cards with GDDR5 supports this. XB1's approach is a cost-cutting solution. They used a similar approach for the Xbox 360, and it didn't really work to provide the 360 with the free 4xAA they promised. PS4's GPU sub-system most closely resembles a modern GPU, where you have access to the memory bandwidth without having to go through any special optimizations.
 
Last edited:

beginner99

Diamond Member
Jun 2, 2009
5,315
1,760
136
OMG. For me the most important thing I saw was TSMC. This basically means this APU does absolutely 0 to help AMD with the stupid WSA with GF. I guess that once more GF was not capable of getting high enough yields...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
OMG. For me the most important thing I saw was TSMC. This basically means this APU does absolutely 0 to help AMD with the stupid WSA with GF. I guess that once more GF was not capable of getting high enough yields...

It might have to do with the GCN component; that could be why TSMC was used.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
OMG. For me the most important thing I saw was TSMC. This basically means this APU does absolutely 0 to help AMD with the stupid WSA with GF. I guess that once more GF was not capable of getting high enough yields...

This was always going to be the problem. GF is way behind TSMC at the 28nm node. TSMC has been in volume production of 28nm with high-k metal gates for almost 2 years.

http://www.tsmc.com/tsmcdotcom/PRListingNewsAction.do?action=detail&newsid=6181

TSMC announced volume production of 28nm HP / 28nm HPL / 28nm LP on Oct 24, 2011. The HD 7970 launched on Jan 9, 2012.

GF only recently announced 28nm SLP in volume production, with Rockchip as a customer. Remember, GF has still not even started volume production on its 28nm high-performance process. It will have to start soon, as Kaveri is scheduled for an early CES 2014 launch.

But the maturity and yields of TSMC's 28nm process are way above what GF can even dream of.

http://www.globalfoundries.com/newsroom/2013/20130617.aspx

I expect the PS4 SoC to also be fabbed on TSMC 28nm. Both of these are very high-volume products with quite big die sizes: Xbox One is 363 mm²; PS4 should be 300-350 mm².

AMD has to keep Kaveri production at GF 28nm to meet the wafer commitments to GF.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
“On this platform I’d be concerned with memory bandwidth. Only DDR3 for system/GPU memory paired with 32MB of “ESRAM” sounds troubling. 32MB of ESRAM is only really enough to do forward shading with MSAA using only 32-bits/pixel color with 2xMSAA at 1080p or 4xMSAA at 720p. Anything else to ESRAM would require tiling and resolves like on the Xbox 360 (which would likely be a DMA copy on 720) or attempting to use the slow DDR3 as a render target.

I’d bet most titles attempting deferred shading will be stuck at 720p with only poor post process AA (like FXAA). If this GPU is pre-GCN with a serious performance gap to PS4, then this next Xbox will act like a boat anchor, dragging down the min-spec target for cross-platform next-generation games.”
~ Source

That's an old quote, from when fewer details about the XB1 were known (for example, it's long been confirmed that the XB1's GPU is GCN-based). 32 MB of eSRAM is definitely troublesome, but to the point of 720p with only FXAA? That's what current-generation consoles do. I'm fairly confident that the XB1 will handily surpass current consoles. MSAA under deferred shading will probably be out of reach most of the time, though; I foresee 1080p or upscaled 900p with some sort of FXAA.
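For a sense of why MSAA under deferred shading is the sticking point, here's a rough sketch. The four-target G-buffer layout and 32-bit depth are generic assumptions, not confirmed XB1 details:

Code:
def gbuffer_mb(width, height, num_targets, samples=1, bytes_per_pixel=4):
    # Approximate footprint of the G-buffer targets plus one depth/stencil buffer
    per_target = width * height * bytes_per_pixel * samples
    return (num_targets + 1) * per_target / (1024 * 1024)

print(gbuffer_mb(1920, 1080, 4))              # ~40 MB, already over 32 MB
print(gbuffer_mb(1920, 1080, 4, samples=2))   # ~79 MB with 2xMSAA
print(gbuffer_mb(1600, 900, 4))               # ~27 MB, a 900p G-buffer fits

So even without MSAA, a G-buffer of that size overflows the eSRAM at 1080p, while 900p squeezes in, which lines up with the 900p-upscaled guess.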

XB1 costs $100 more than the more powerful PS4, has non-upgradeable HDD and the GPU is going to end up 25-50% slower when pushed to the limit. Right now XB1 looks underpowered and overpriced for what you get.

But...but...Kinect! You're getting Kinect! And everyone wants Kinect and will get a lot of use out of it. Right?...Right? :'(
 

Saylick

Diamond Member
Sep 10, 2012
3,923
9,142
136
Should be a very large difference in the overall context of GPU performance.

Based on specs, the GPU in XB1 is around HD7770 level. The HD7770 has 1.28 TFLOPS at a 1 GHz clock with 640 SPs, 16 ROPs, and 72 GB/s of memory bandwidth. The GPU in XB1 has about 1.31 TFLOPS, is clocked at ~850 MHz, and has 768 SPs, 16 ROPs, and 68 GB/s of memory bandwidth.

PS4 is actually slightly faster than the HD7850 2GB. The HD7850 has 1.76 TFLOPS at 860 MHz with 1024 SPs, 32 ROPs, and 154 GB/s of memory bandwidth. The GPU in PS4 supposedly has 1.84 TFLOPS at 800 MHz with 1152 SPs, 32 ROPs, and 176 GB/s of memory bandwidth.

With XB1's eSRAM, it might squeeze closer to HD7790 in performance.

Case 1: eSRAM is not efficiently used --> XBO = 7770
At 1080p with no AA, the HD7850 is 57% faster; with AA, 55% faster
http://www.computerbase.de/artikel/grafikkarten/2013/nvidia-geforce-gtx-770-im-test/4/

Case 2: eSRAM is used to push XB1 to HD7790 level of performance
At 1080p with no AA, the HD7850 is 24% faster; with AA, 21% faster

/snip

Another thing to note is the difference in texture units provided to each console's GPU.

Assuming both will utilize GCN, XBO's GPU will have 48 TMUs and the PS4's GPU will have 72 TMUs.

Therefore, even going by the numbers you presented, RS, we're looking at an even wider gap than a straight 7790 vs 7850 comparison would suggest. As you presented, PS4's GPU is slightly faster than a 7850 (I'm estimating it to be ~4% faster), but the XBO's GPU sits halfway between the 7770 and the 7790 in pure specs alone.

So, if we use your source again we have:

Case 1: eSRAM is not efficiently used --> XBO = 7770
At 1080p with no AA, PS4 is ~63% faster; with AA, ~61% faster

Case 2: eSRAM is used efficiently --> XBO = hypothetically halfway between 7770 and 7790: "7780"
At 1080p with no AA, PS4 is ~44% faster; with AA, ~41% faster

*crosses fingers and hopes math is right*
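For what it's worth, the chain works out roughly like this (the "halfway" step is a plain average of the two scaling factors, which is an assumption about how a hypothetical "7780" would behave):

Code:
# Scaling factors from the computerbase numbers quoted above (1080p, no AA)
hd7850_vs_7770 = 1.57   # HD7850 is 57% faster than HD7770
hd7850_vs_7790 = 1.24   # HD7850 is 24% faster than HD7790
ps4_vs_7850    = 1.04   # assumed ~4% PS4 advantage over the HD7850

# Case 1: XBO roughly equal to HD7770
print((hd7850_vs_7770 * ps4_vs_7850 - 1) * 100)   # ~63%

# Case 2: XBO roughly equal to a hypothetical "7780" halfway between 7770 and 7790
xbo_factor = (hd7850_vs_7770 + hd7850_vs_7790) / 2
print((xbo_factor * ps4_vs_7850 - 1) * 100)       # ~46%

Case 1 matches the ~63% above; Case 2 comes out a couple of points higher than ~44%, depending on how you combine the percentages.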
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
40% isn't going to be enough to create a huge IQ separation. I think the PS4 will look slightly better, but I don't think console users will care.

This is petty stuff computer nerds argue over.
 

seitur

Senior member
Jul 12, 2013
383
1
81
Taking into account that the Xbox One has a bigger die, while the PS4 has GDDR5 instead of DDR3 + eSRAM and a different GPU, I wonder which console is actually more expensive to produce at the end of the day? (Let's take Kinect out of the equation for a moment.)
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
That's an old quote, from when fewer details about the XB1 were known (for example, it's long been confirmed that the XB1's GPU is GCN-based). 32 MB of eSRAM is definitely troublesome, but to the point of 720p with only FXAA? That's what current-generation consoles do. I'm fairly confident that the XB1 will handily surpass current consoles. MSAA under deferred shading will probably be out of reach most of the time, though; I foresee 1080p or upscaled 900p with some sort of FXAA.



But...but...Kinect! You're getting Kinect! And everyone wants Kinect and will get a lot of use out of it. Right?...Right? :'(

I'm looking at the games we have coming day one. They look very good, and Forza running at 1080p @ 60fps is pretty impressive. It will be a long time before we see any performance gap, and I think the gap will only amount to better AA and small things unless it's first party. I would like to revisit this when the 360 and PS3 third-party support dies out in a year or two. Right now there are some games still built around that baseline.
 
Last edited: