The Official PS4 Thread

Page 13

ader098

Member
Mar 9, 2010
99
3
66
At this point, I'd also think it's too late for Microsoft to just say "Ok, we're switching to GDDR5 from DDR3" without also blowing their budget.

Microsoft doesn't need to switch from DDR3.
The next Xbox has been designed to overcome its bandwidth limitation by incorporating ESRAM on the GPU while also having a Data Move Engine to help shuffle data around.
While it may require more work from developers/programmers, it also means that the next Xbox will have much lower memory latencies compared to a straight GDDR5 memory system.

I think it will be interesting to see how these two different types of memories will affect first party games that are designed around each console.

High bandwidth + higher latency memory vs. lower bandwidth + lower latency memory.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Does anyone have a link to watch the presentation? I had to leave like right in the middle of it :( .
 

American Gunner

Platinum Member
Aug 26, 2010
2,399
0
71
How much of a loss would MS take selling the console at $400 with Kinect included in the system? That is my major problem with the next Xbox. I feel like they will have to go cheap on stuff to put Kinect in every box.
 

ManBearPig

Diamond Member
Sep 5, 2000
9,175
6
81
I always thought memory latency was not very important. Of course, I read about it when I built my computer and that was soooooo long ago. IIRC, it was only a few percent speed increase from the lowest- to the highest-latency memory. I think I probably have no idea what I'm talking about though.
 

ManBearPig

Diamond Member
Sep 5, 2000
9,175
6
81
Leaks have pretty much been spot on. The days of anything being secret are long gone.

2x the RAM that was previously "reliably" rumored. PS4 has the same amount but faster RAM than the 720. PS4 will have approx 50% more GPU/compute power than the 720.

Basically, if you read between the lines, the PS4 will likely be heavily favored for multiplatform titles in terms of which looks the "prettiest". And it's not like the supposed hardware advantage the PS3 had over the 360; this time it's just higher clocks and more GPU functional units. I don't expect any developers to say "OMG programming for the PS4 is SO HARD".

Thanks for the info.

I wonder why none of the new consoles are using Intel chips or Nvidia GPUs. Maybe AMD just gave them an insane deal since it's a long-term contract.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Thanks for the info.

I wonder why none of the new consoles are using Intel chips or Nvidia GPUs. Maybe AMD just gave them an insane deal since it's a long-term contract.

APU vs dedicated CPU and GPU. Keeps a lot of things down like heat, power usage, and probably price.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
Microsoft doesn't need to switch from DDR3.
The next Xbox has been designed to overcome its bandwidth limitation by incorporating ESRAM on the GPU while also having a Data Move Engine to help shuffle data around.
While it may require more work from developers/programmers, it also means that the next Xbox will have much lower memory latencies compared to a straight GDDR5 memory system.

I think it will be interesting to see how these two different types of memories will affect first party games that are designed around each console.

High bandwidth + higher latency memory vs. lower bandwidth + lower latency memory.

The PS4 unquestionably has the advantage here.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So, how can they use an x86 CPU with 8GB of RAM? I understand an x86 CPU can use 8GB of RAM (the 4GB limit was always software-based on the PC end), but why didn't they go with a 64-bit CPU? If this console is to last another 8 years, I'm a bit confused about them choosing an x86 CPU instead of x64. Or am I reading this wrong?



Considering last year's rumours of an HD6670/7670-class GPU, this is extremely impressive.

1.84 TFLOPS with 176GB/sec of memory bandwidth implies a GPU that's very similar to the HD7970M: 1152 Stream Processors @ 800MHz, 32 ROPs, 72 TMUs, and a 256-bit bus with 5500MHz-effective GDDR5.
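
If you want to sanity-check those figures, here's a quick back-of-the-envelope calc (my own math, using the standard 2 FLOPs per SP per clock for fused multiply-add; all the inputs are the rumoured specs, nothing confirmed):

Code:
# Rough peak-throughput math for the rumoured PS4 GPU
sps = 18 * 64                 # 18 GCN CUs x 64 stream processors each = 1152
clock_ghz = 0.8               # 800MHz
tflops = sps * 2 * clock_ghz / 1000
print(f"Peak compute: {tflops:.2f} TFLOPS")              # ~1.84 TFLOPS

bus_bytes = 256 / 8           # rumoured 256-bit GDDR5 bus
data_rate_mts = 5500          # 5.5 GT/s effective data rate
bandwidth_gbs = bus_bytes * data_rate_mts / 1000
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/sec")   # 176 GB/sec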

GPU comparison

PS4: 18 CUs = 1152 SPs @ 800MHz (1.84 TFLOPS), 176 GB/sec memory bandwidth

HD 7850: 16 CUs = 1024 SPs @ 860MHz (1.76 TFLOPS), 154 GB/sec memory bandwidth
HD 7970M: 20 CUs = 1280 SPs @ 850MHz (2.17 TFLOPS), 154 GB/sec memory bandwidth
HD 7870: 20 CUs = 1280 SPs @ 1000MHz (2.56 TFLOPS), 154 GB/sec memory bandwidth

The memory bandwidth in the PS3/360 GPUs was cut roughly in half compared to equivalent PC GPUs. The HD7970M is just 5% slower than the GTX680M with the latest drivers, and NV charges $350 more at retail for the GTX680M, so it makes sense why Sony went with AMD. Going with a desktop chip like an HD7950/GTX670 would have pushed power consumption too high: a single 1.6GHz quad-core Jaguar has a TDP of 25W, which suggests a custom 8-core version would use 40-50W, leaving little thermal headroom for a desktop-class GPU. Costs were likely a factor here too.

Given that consoles are power consumption / heat dissipation constrained, this was nearly the best possible mobile GPU available from AMD until Q3 2013 when higher end HD8000M parts replace HD7970M.

Overall the GPU & memory subsystem are pretty good. The CPU is less impressive though, since even the highest-bin Jaguar CPU tops out at 1.85GHz, and this one is probably binned at just 1.6GHz. Thankfully modern games are starting to use 6 to 8 cores, which should be good for both PC and console gaming optimizations.



Let's not forget that the Cell was just a single-core CPU with 6 supporting SPE/SPU engines. Those engines were not real stand-alone cores and went mostly underutilized over the PS3's life. The main single-core CPU (PowerPC-based) in the PS3 had worse IPC than a 3.2GHz Pentium 4. This was a typical case for the PS3's CPU in a game like Killzone 2:

The white areas are all the time where the Cell's SPEs sat completely underutilized (six rows, one for each SPE/SPU) in a typical PS3 game.


Switching to the out-of-order Jaguar, with its 128-bit floating point / AVX support, isn't going to set the world on fire against quad-core Haswell and future CPUs, but it's still a true 8-core OoO CPU, unlike the turds that were in the 360/PS3. The other benefit of switching to x86 is that a future PS5/PS6, etc. can be backwards compatible and keep using modern x86 CPUs for the next 10-20 years. Let AMD/Intel do all the R&D on high-end chips while console makers focus on games, media, and the overall console experience.
 
Last edited:

Keylimesoda

Member
May 26, 2011
43
0
0
The CPU architecture means that PC games won't be straight ports.

Most PC games today can take advantage of 2-3 cores. This is why Intel beats AMD CPUs in most gaming: Intel has a significant per-core performance lead right now.

Splitting the workload from a PC game evenly between 8 smaller cores will still take some black magic coding.
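
Just to illustrate the general idea (a toy sketch, not how any real engine is written): the easy part is farming independent per-entity work out to a pool of workers; the black magic is in the dependencies between tasks, which this completely ignores:

Code:
# Toy example: spreading "embarrassingly parallel" per-frame work across 8 workers.
# Real engine work (physics, AI, rendering) has dependencies between tasks,
# which is exactly where it stops being this easy.
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity_id):
    # stand-in for per-entity simulation work
    return entity_id * entity_id

entities = range(10_000)
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(update_entity, entities))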
 

Dari

Lifer
Oct 25, 2002
17,134
38
91
I wasn't expecting a price or release date, but not showing the actual console itself is definitely a departure from the norm. The link to the Major Nelson tweet posted a few posts back pretty much says it all.

Nonetheless, I'm more interested in the PS4 than I was before the announcement, so that's something. I'm curious to see how much momentum this gives Sony; the games weren't really anything we haven't played before, unfortunately. It didn't feel like a new console announcement, more like a run-of-the-mill mid-cycle E3 announcement. It's probably difficult to do that anymore though, with the internet leaks. We know a lot of stuff already; it kind of ruins the excitement.

There have been precedents. I remember when the PS2's Emotion Engine was first announced. It was at some semiconductor conference. They then announced the PS2 elsewhere...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Given that both next gen systems will be nearly identical, and MS has a console as well as OSes on millions of computers, isn't it a given that developers will just develop for the new Xbox and then port it over to the PS4?

Not even close. The GPU in the next Xbox is widely rumored to be a 12 Compute Unit / 768 SP part clocked at 800MHz with 68GB/sec of memory bandwidth as a result of DDR3. That is very similar to an HD7770. The GPU inside the PS4 is much closer in performance to the HD7850, and it even has more memory bandwidth than the 7850. The difference between these GPUs is ~50% in performance.
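
To put a number on that gap, here's my own rough math using the rumoured specs (12 CUs @ 800MHz with 256-bit DDR3-2133 for Durango; treat these as placeholder figures, not confirmed hardware):

Code:
# Rumoured Durango vs. PS4 GPU, 2 FLOPs per SP per clock
ps4_tflops     = 1152 * 2 * 0.8 / 1000    # 18 CUs -> ~1.84 TFLOPS
durango_tflops =  768 * 2 * 0.8 / 1000    # 12 CUs -> ~1.23 TFLOPS
print(f"PS4 compute advantage: {ps4_tflops / durango_tflops - 1:.0%}")   # ~50%

ps4_bw     = 256 / 8 * 5500 / 1000        # GDDR5 @ 5.5 GT/s -> 176 GB/sec
durango_bw = 256 / 8 * 2133 / 1000        # DDR3-2133        -> ~68 GB/sec
print(f"PS4 bandwidth advantage: {ps4_bw / durango_bw:.1f}x")            # ~2.6x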

Regarding the OS, MS is once again at a disadvantage. MS's OS will have API overhead, while Sony is encouraging developers to code directly to the metal.

"Though the architectures of the next-gen Xbox and PlayStation both resemble that of PCs, several development sources have told us that Sony’s solution is preferable when it comes to leveraging power. Studios working with the next-gen Xbox are currently being forced to work with only approved development libraries, while Sony is encouraging coders to get closer to the metal of its box. Furthermore, the operating system overhead of Microsoft’s next console is more oppressive than Sony’s equivalent, giving the PlayStation-badged unit another advantage."
http://www.edge-online.com/news/the...hand-games-50gb-blu-ray-discs-and-new-kinect/

Sony has done a complete reversal from the PS3 -- easier to code for, ditched the expensive, horribly inefficient/unoptimized Cell for an x86 CPU, and dumped as much money as reasonably possible into the GPU, even going as far as 8GB of GDDR5!

Unless those Durango specs are fake or MS goes back to the drawing board, their console is looking ~50% slower on paper. This time it matters, since both the CPUs and GPUs are from the exact same architectures. That means PS4 cross-platform games will either run faster or look better than the 720's, and most likely PS4 exclusives will look better too. The advantage for the PS4 will only pile on if it has SATA 3 / an upgradable after-market SSD option vs. the Xbox 720's rumoured SATA 2 and continued use of proprietary MS HDDs.
 
Last edited:

American Gunner

Platinum Member
Aug 26, 2010
2,399
0
71
Not to mention that it sounded a lot like Sony actually listened to developers in making this console. If true, that will go a long way with the developers next generation.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Not to mention that it sounded a lot like Sony actually listened to developers in making this console. If true, that will go a long way with the developers next generation.

Yep...I bet self-publishing games to PSN will be a big success in the long run.

I'm glad to see Sony sitting down with developers and asking them what is missing and what they want. Much better than dictating how things will be and expecting them to comply.
 

EightySix Four

Diamond Member
Jul 17, 2004
5,121
49
91
Just because I'm curious, IF it is true that they lock the games to an account with a chip in each Blu-ray disc, are you still as excited?
 

ManBearPig

Diamond Member
Sep 5, 2000
9,175
6
81
Weird, Sony actually listening to someone. Appreciate all the info, guys (esp. Russian). What's this about self-publishing? Self-publishing vs. what?
 

Dari

Lifer
Oct 25, 2002
17,134
38
91
The CPU architecture means that PC games won't be straight ports.

Most PC games today can take advantage of 2-3 cores. This is why Intel beats AMD CPUs in most gaming: Intel has a significant per-core performance lead right now.

Splitting the workload from a PC game evenly between 8 smaller cores will still take some black magic coding.

When it comes to games, consoles are where the money's at. So developers code for consoles and port them back to Windows. Hence, unless you're talking about all the FPS and MMORPGs on PCs, ports usually go the other way.

As for the cores, they don't have to use all 8. At least one will be allocated for the OS.

Yep...I bet self-publishing games to PSN will be a big success in the long run.

I'm glad to see Sony sitting down with developers and asking them what is missing and what they want. Much better than dictating how things will be and expecting them to comply.

This is good. The days of publishers screwing over developers will come to an end, hopefully.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Microsoft doesn't need to switch from DDR3.
The next Xbox has been designed to overcome its bandwidth limitation by incorporating ESRAM on the GPU while also having a Data Move Engine to help shuffle data around.

Going with eSRAM/eDRAM is actually a cost-saving solution compared to the native GDDR5 setup modern GPUs use. You cannot overcome a memory bandwidth limitation (68GB/sec vs. 176GB/sec) with eSRAM; if you could, NV and AMD would have gone this route for high-end cards. Clearly, then, the eSRAM/eDRAM is a serious compromise for budget reasons. Here is proof:

Timothy Lottes, the creator of FXAA at NV, notes:

"A fast GDDR5 may be the desired option for developers. All of the interesting cases for good anti-aliasing require a large number of bandwidth and RAM. A tiny 32MB chunk of ESRAM cannot fit that need even for forward rendering at 1080p. I feel some developers could hit 1080p@60fps with the rumored Orbis specs in spite of good AA. My personal project is targeting 1080p@60fps with great AA on a 560ti that is a bit slower than the rumored Orbis specs. There isn’t any way my engine would hit that concentrate on at the rumored 720 specs. Ultimately on Orbis I suppose devs target 1080p/30fps (with some motion blur) and leverage the lower latency OS stack and scan out at 60fps (double scan frames) to supply a very great lower-latency experience. Maybe a similar title on 720 would render at 720p/30fps, and perhaps Microsoft is dedicating just a few CPU hardware threads to the GPU driver stack to take away the latency problem (assuming it is a “Windows” OS under the covers)."

Lottes seems concerned about Microsoft using a large amount of DDR3 memory because it limits memory bandwidth. On this issue he says:

“On this platform I'd be concerned with memory bandwidth. Only DDR3 for system/GPU memory paired with 32MB of “ESRAM” sounds troubling. 32MB of ESRAM is only really enough to do forward shading with MSAA, using 32-bits/pixel color with 2xMSAA at 1080p or 4xMSAA at 720p. Anything else in ESRAM would require tiling and resolves like on the Xbox 360 (which would likely be a DMA copy on 720), or attempting to use the slow DDR3 as a render target. I'd bet most titles attempting deferred shading will be stuck at 720p with only poor post-process AA (like FXAA).”
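
To see where those numbers come from, here's my own rough render-target arithmetic (assuming 32-bit colour plus a 32-bit depth buffer per sample, which is roughly what Lottes appears to be counting):

Code:
# Rough MSAA render-target footprint: (colour + depth) bytes per sample
def rt_mib(width, height, msaa, bytes_per_sample=4 + 4):   # 32-bit colour + 32-bit depth
    return width * height * msaa * bytes_per_sample / (1024 ** 2)

print(f"1080p 2xMSAA: {rt_mib(1920, 1080, 2):.1f} MiB")   # ~31.6 MiB -- just squeezes into 32MB
print(f" 720p 4xMSAA: {rt_mib(1280,  720, 4):.1f} MiB")   # ~28.1 MiB
print(f"1080p 4xMSAA: {rt_mib(1920, 1080, 4):.1f} MiB")   # ~63.3 MiB -- needs tiling or the slow DDR3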

Lottes also says:

“If PS4 has a real-time OS, with a libGCM style low level access to the GPU, then the PS4 1st party games will be years ahead of the PC simply because it opens up what is possible on the GPU. Note this won’t happen right away on launch, but once developers tool up for the platform, this will be the case.

As a PC guy who knows hardware to the metal, I spend most of my days in frustration knowing damn well what I could do with the hardware, but what I cannot do because Microsoft and IHVs won't provide low-level GPU access in PC APIs. One simple example: draw calls on PC have easily 10x to 100x the overhead of a console with a libGCM style API.”


Source
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I wonder why none of the new consoles are using Intel chips or Nvidia GPUs. Maybe AMD just gave them an insane deal since it's a long-term contract.

NV charges $350 more at retail for a mobile high-end GPU (the GTX680M) that is just 5% faster on average than the HD7970M. Mobile GPUs are the most suitable parts for consoles because their binning process results in reduced power consumption, and NV charges even more of a premium for them.

Don't forget that all NV GPUs default to limited-range (washed-out) RGB when running 1080p content over HDMI.
http://blog.metaclassofnil.com/?p=83

http://community.futuremark.com/for...RGB-Range-without-custom-resolution-EASY-tool!

http://forums.guru3d.com/showthread.php?t=360587

This is why PS3 games have washed-out colours/black levels compared to Xbox 360 games. On the PC this isn't a problem, since you can apply your own fixes via tools/registry tweaks and most PC gamers use DVI. The added benefit of going with an AMD GPU is getting accurate native colours over HDMI out of the box. This is a minor gripe since obviously most PS3 users never noticed. If you fire up a PS4 and notice much more vibrant colours and deeper blacks than your PS3 delivered, then you know why ;)

Finally, GCN is more advanced on the compute side, which makes it slightly more future-proof for next-gen games than the Kepler GK104 chips. This doesn't matter much for PC gamers who upgrade every 2-3 years, but for consoles it could, since it gives programmers/developers more flexibility to use GCN's compute shaders for advanced graphical effects like shadows, ambient occlusion, post-processing, and advanced anti-aliasing.

On the Intel question, they charge a lot of $ for their CPUs because Intel maintains > 60% gross margins. That option was likely too costly.

My only gripe with the PS4 is that I would have personally gone with a quad-core A8-6500, even if it meant delaying the PS4 to Q1 2014. 65W with a 4.1GHz boost on each core, plus that 18 CU GPU at about 100W, would still be manageable. Sounds like they ran out of $ on the CPU selection.
 
Last edited:

American Gunner

Platinum Member
Aug 26, 2010
2,399
0
71
I love seeing all the MS fanboys on other sites talking trash. "The 720 will be way more powerful" and crap like that. I'm sure they will enjoy the Kinectbox, but it's comical that they are already talking shit when they have no idea what MS will do.
 

ManBearPig

Diamond Member
Sep 5, 2000
9,175
6
81
I know a lot of people here hate on people who get used games, but as a person who buys used as well as new games, I've gotta say, probably 80% of the new games I buy are sequels or related to the used games I have played. Plus my brother and I like to exchange games and stuff so I'm glad there's no such protection.

Oh yeah...what exactly is self-publishing? Can game makers not do that now?
 
Last edited: