PS4 Pro GPU


Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
The way I read Mark Cerny's Twitter post leads me to believe it's Polaris with custom elements added at Sony's request. This would also make the most sense for both AMD and Sony: it makes far more sense for AMD to port Jaguar down to a newer process than to port Polaris back to an older process and raise the Jaguar clocks there, which would reduce yields further.

In addition, the large PSU doesn't mean power consumption is radically higher. Sony likes to put relatively large PSUs in its consoles: for example, a PS3 Slim uses a 210W/250W PSU, but power consumption with a 45nm CELL and 65nm RSX is around 96W when gaming, according to CNET. Power consumption on a launch PS3 with GS+EE, a 90nm CELL, and a 90nm RSX is 206W, yet that console uses a 380W PSU. My PS3 Slim uses even less power than the figures given by CNET because it's a newer model with a 40nm RSX; however, it still uses the same 250W/210W PSU.
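
Just to put numbers on that, here's a quick back-of-the-envelope check of those PS3 figures (the wattages are the ones quoted above from CNET, not my own measurements):

Code:
# Ratio of measured gaming draw to PSU rating for the PS3 models cited above.
consoles = {
    # name: (PSU rating in W, measured gaming draw in W, per the CNET figures above)
    "PS3 launch (90nm CELL + 90nm RSX)": (380, 206),
    "PS3 Slim (45nm CELL + 65nm RSX)":   (250, 96),
}

for name, (rating, draw) in consoles.items():
    print(f"{name}: {draw}W of {rating}W rated -> {draw / rating:.0%} of the PSU")

# ~54% and ~38% respectively, so a 310W PSU by itself doesn't tell you
# what the PS4 Pro actually draws.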

There is also no need to stay with a GCN 1.1 ISA, because consoles still use APIs that provide some level of abstraction; even previous consoles used APIs. I don't even think bypassing the APIs is an option for devs on current systems the way it used to be.
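
As a rough illustration of the abstraction point (the function names here are made up for the sake of the example, this is not Sony's actual GNM/GNMX API):

Code:
# Sketch only: the game submits high-level shader source / API calls, and the
# ISA target is the compiler's or driver's problem underneath.
def compile_shader(source, target_isa):
    # `source` is ignored in this toy example; a real backend would lower it.
    if target_isa == "gcn2":
        return ["v_mul_lo_i32 ..."]   # encoding that exists on GCN 1.1-era parts
    if target_isa == "gcn3":
        return ["v_mul_lo_u32 ..."]   # newer equivalent (see the ISA list later in the thread)
    raise ValueError(f"unknown ISA: {target_isa}")

# Same source, two different targets; the game-side code never changes.
print(compile_shader("out = a * b;", "gcn2"))
print(compile_shader("out = a * b;", "gcn3"))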
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Tbh it's the backwards compatibility that makes me think it isn't proper Polaris. There are hundreds of PS4 games, all using different code, different compiler versions, different low-level optimisations. Sony wants them all to just work; it can't be wasting millions putting in all sorts of special driver coding to work around game X or game Y's design.

Now you can't really claim GCN 1.3/4 and GCN 1.1 are all the same at the API level because that's not true; there will be specific optimisations just for 1.1 that don't work anything like the way they do in 1.3. Even if the code worked, the performance would not be the same, and once you change that, games can hit odd errors due to different timing. The game devs won't fix these, and Sony can't fix the game devs' code, so Sony would be left with the complex job of low-level driver hacking to try to get the games stable.
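
A hypothetical example of the kind of timing bug I mean (not tied to any specific PS4 title, just the classic pattern):

Code:
# Game logic tuned around the original hardware's frame time instead of the
# measured delta time. Fine at the original frame rate, misbehaves when frames
# suddenly complete faster on quicker hardware.
ORIGINAL_FRAME_TIME = 1 / 30

def update_animation(measured_frame_time):
    step = ORIGINAL_FRAME_TIME   # bug: ignores measured_frame_time entirely
    return step

print(update_animation(1 / 30))  # matches reality on the old GPU
print(update_animation(1 / 60))  # on faster hardware everything now runs 2x speed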

The easiest way to make it compatible is to have the same CPU and GPU cores running at the same speeds as they do on the PS4. They clearly have the same CPU cores (higher clocked, but you just downclock them a bit for a PS4 game). It makes logical sense to go with the same GPU cores too. Then everything will just work and Sony doesn't have compatibility problems.
 

jpiniero

Lifer
Oct 1, 2010
14,599
5,218
136
I believe the developer has to set a flag; otherwise the game will run with similar specs to the original PS4. However, any game released after the Pro's launch has to have it enabled.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Now you can't really claim GCN 1.3/4 and GCN 1.1 are all the same at the API level because that's not true,
Actually there are instructions and image data formats in GCN 1.1 that were removed for GCN 1.2. This is why backcompat requirements are such a critical question. If those instructions/formats need to be there, then Sony would have to stay on GCN 1.1-ish.

http://32ipi028l5q82yhj72224m8j.wpe..._GCN3_Instruction_Set_Architecture_rev1.1.pdf

Summary of kernel instruction changes from Generation 2 to 3

• Modified many of the microcode formats: VOP3A, VOP3B, LDS, GDS, MUBUF, MTBUF, MIMG, and EXP.
• SMRD microcode format is replaced with SMEM, now supporting reads and writes.
• VGPR Indexing for VALU instructions.

• New Instructions
– Scalar Memory Writes.
– S_CMP_EQ_U64, S_CMP_NE_U64.
– 16-bit floating point VALU instructions.
– “SDWA” – Sub Dword Addressing allows access to bytes and words of VGPRs in VALU instructions.
– “DPP” – Data Parallel Processing allows VALU instructions to access data from neighboring lanes.
– V_PERM_B32.
– DS_PERMUTE_B32, DS_BPERMUTE_B32.

• Removed Instructions
– V_MAC_LEGACY_F32.
– V_CMPS* – now supported by V_CMP with the “clamp” bit set to 1.
– V_MULLIT_F32.
– V_{MIN, MAX, RCP, RSQ}_LEGACY_F32.
– V_{LOG, RCP, RSQ}_CLAMP_F32.
– V_{RCP, RSQ}_CLAMP_F64.
– V_MUL_LO_I32 (it’s functionally identical to V_MUL_LO_U32).
– All non-reverse shift instructions.
– LDS and Memory atomics: MIN, MAX and CMPSWAP on F32 and F64 data.

• Removed Image Data Formats
– snorm_lz (aka: snorm_ogl)
– ubnorm
– ubnorm_nz (aka: ubnorm_ogl)
– ubint
– ubscal

Mind you, it should be a really awesome console regardless. I just like pondering whether GCN 4 would actually be compatible here.
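
For what it's worth, some of the removed ops in that list do have obvious replacements, so a translation layer isn't inconceivable; whether Sony/AMD would actually do anything like that (in the shader compiler or elsewhere) is exactly the open question. A toy sketch of the idea:

Code:
# Toy mapping built from the "Removed Instructions" notes above; purely
# illustrative, not how any real compiler or driver is known to work.
GCN2_TO_GCN3 = {
    "V_MUL_LO_I32": ("V_MUL_LO_U32", None),        # listed as functionally identical
    "V_CMPS_EQ_F32": ("V_CMP_EQ_F32", "clamp=1"),  # V_CMPS* -> V_CMP with the clamp bit
}

def translate(op):
    if op in GCN2_TO_GCN3:
        replacement, extra = GCN2_TO_GCN3[op]
        return replacement + (f" [{extra}]" if extra else "")
    return None  # e.g. the removed F32/F64 atomics have no 1:1 substitute listed

print(translate("V_MUL_LO_I32"))    # V_MUL_LO_U32
print(translate("V_CMPS_EQ_F32"))   # V_CMP_EQ_F32 [clamp=1]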
 

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
It's a good bit slower than the 480... it's like locking a 480 at 910MHz with 6.8GHz memory.

But still, it looks pretty good overall compared to the previous consoles; a shame that it's kind of going to be limited by the 4K target and the restrictions Sony imposes for original PS4 compatibility and so on... if they could make games exclusive to the PS4 Pro it would be a lot more interesting.

Considering the Xbox One S sits at 1.4 TF at $299, this thing at 4.2 TF for $399 looks nice, but a 470 is faster than the PS4 Pro GPU anyway.
Maybe Sony will eventually deliver a patch with a 25% OC to get to 480 levels.
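
The TF figures being thrown around come straight from the usual GCN math (the CU counts and clocks here are the commonly reported ones, so treat them as approximate):

Code:
# FLOPS = CUs * 64 shaders * 2 ops per clock (FMA) * clock
def gcn_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1e6

print(gcn_tflops(36, 911))          # PS4 Pro: ~4.2 TFLOPS
print(gcn_tflops(36, 1266))         # RX 480 boost: ~5.8 TFLOPS
print(gcn_tflops(32, 1206))         # RX 470 boost: ~4.9 TFLOPS
print(gcn_tflops(36, 911) * 1.25)   # a 25% OC would land around 5.2 TFLOPS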
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
Actually there are instructions and image data formats in GCN 1.1 that were removed for GCN 1.2. This is why backcompat requirements are such a critical question. If those instructions/formats need to be there, then Sony would have to stay on GCN 1.1-ish.

There actually isn't a need to stay on GCN 1.1. These instructions could very easily be added at the request of Sony.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
Tbh it's the backwards compatibility that makes me think it isn't proper Polaris. There are hundreds of PS4 games, all using different code, different compiler versions, different low-level optimisations. Sony wants them all to just work; it can't be wasting millions putting in all sorts of special driver coding to work around game X or game Y's design.

Now you can't really claim GCN 1.3/4 and GCN 1.1 are all the same at the API level because that's not true; there will be specific optimisations just for 1.1 that don't work anything like the way they do in 1.3. Even if the code worked, the performance would not be the same, and once you change that, games can hit odd errors due to different timing. The game devs won't fix these, and Sony can't fix the game devs' code, so Sony would be left with the complex job of low-level driver hacking to try to get the games stable.

The easiest way to make it compatible is to have the same CPU and GPU cores running at the same speeds as they do on the PS4. They clearly have the same CPU cores (higher clocked, but you just downclock them a bit for a PS4 game). It makes logical sense to go with the same GPU cores too. Then everything will just work and Sony doesn't have compatibility problems.
The purpose of an API is to provide abstraction. So it's very possible.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
What I'm trying to say is that what the PS4 Slim uses most likely won't be comparable to the PS4 Pro unless you choose to run games at PS4 settings/frame rates.

The only thing that matters when comparing the efficiency between the PS4 Slim and the PS4 Pro (and thus whether or not the PS4 Pro can be considered an inefficient 14/16nm design) is the load percentage. If the load percentage is similar for the two, then the only determinant of power usage should be the amount of hardware inside of them. In other words it doesn't matter if they are both running at 50% load or 100% load on average, since either way the PS4 Pro would still be activating roughly twice the amount of hardware and thus be expected to use roughly twice the amount of power.

Only if the relative load is different does it make a difference. If the relative load on the PS4 Slim is lower than the PS4 Pro, then that would entail that the PS4 Pro is activating more than twice the amount of hardware, and thus only using twice the amount of power would actually be quite efficient.

If the PS4 was using 130W before the die shrink, what do you think the die shrink would bring it down to? 100W? Less?

No idea, to be honest, since the 130W includes all of the components in the PS4 and not all of them have gone through a die shrink. But either way, if we assume that the ratio of average power usage to rated max power stays the same, then the original PS4 used 60% (150W out of a 250W max), and so the PS4 Slim would likely use 90W (60% of 150W).

So that would be a reduction from 130W for the PS4 revision to 90W for the PS4 Slim.

Exactly. And this ties back to one of my original comments: this thing is either a horrendously inefficient 14/16nm chip or it's still 28nm.

When a 480 is higher clocked, has more cores, is running with two sets of memory and a higher-clocked CPU, and isn't frame limited, yet the whole system comes in @ 301W, then a console with fewer cores, lower clocks, less RAM, and an anemic CPU coming in even at 190W is just a giant red flag to me. Also, I just came across the leaks; they state 2.1GHz on the CPU (not much of a difference, but it means it should use less power overall).

Either way, just me pondering.

Except it isn't horrendously inefficient for a 14/16nm design, quite the contrary. Having a max power usage twice that of the PS4 Slim is pretty much exactly what you would expect, given the difference in hardware.

If the average power usage is 190 (or 186W if taking 60% of 310W), then that would still only be about twice that of the PS4 Slim at 90W, so again perfectly reasonable.

I don't see why a PC with comparable (but higher clocked) components using 50% more power somehow makes the PS4 Pro look bad.

It's certainly possible that the PS4 Pro will turn out to be somewhat inefficient once we get some actual power measurements from it (and from the PS4 Slim), but with the info available today there really isn't evidence of that, imho.
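
To make the arithmetic explicit (these are the thread's numbers and the 60% assumption from above, not measurements):

Code:
# Assume average gaming draw is ~60% of the PSU rating, as the launch PS4
# figures quoted above (150W measured vs 250W rated) suggest.
def estimated_draw(psu_rating_w, load_fraction=0.60):
    return psu_rating_w * load_fraction

slim = estimated_draw(150)    # PS4 Slim estimate: ~90W
pro = estimated_draw(310)     # PS4 Pro estimate: ~186W
print(slim, pro, pro / slim)  # ratio ~2.1x, in line with roughly 2x the hardware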
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
@railven

Well, according to https://twitter.com/PlayStation/status/773600156356866048/photo/1, it says AMD's Polaris tech. Why do you think it's not? You think they lie?

What is with these odd confrontational posts? I've got Russian basically demanding that I admit I'm wrong, and this post accusing me of saying/thinking Sony lied.

I addressed this tweet - notice the wording. "based on elements of AMD's Polaris tech" when it could have easily said "based on AMD's Polaris tech."

Read the AT article; even Ryan is open to the possibility that this thing might not be a straight-up Polaris chip. It would be really interesting if AMD can even do that and create custom chips borrowing from all their different IP/tech.

Or did you confuse the PSU with the PS4 Pro? Or do you think that because they said 310W, the PS4 Pro can consume up to 310W? Right?
I can replace my PSU with one rated at 1000W; does that mean my PC can consume up to 1000W?

Um, yeah it sort of does. Haha.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
@railven

Well, according to https://twitter.com/PlayStation/status/773600156356866048/photo/1, it says AMD's Polaris tech. Why do you think it's not? You think they lie?
"GPU based on elements of AMD's Polaris tech, and some beyond"

What does that mean? There's a lot of stuff that could be called "Polaris tech": the display engine with its HDR and 4K support, for example. Taking elements of that doesn't confirm they have changed the GCN cores.

@railven
Or did you confuse the PSU with the PS4 Pro? Or do you think that because they said 310W, the PS4 Pro can consume up to 310W? Right?
I can replace my PSU with one rated at 1000W; does that mean my PC can consume up to 1000W?

Now even you know you are talking rubbish. The PS4 isn't some dodgy home build; it will have exactly the amount of PSU power its components require. Over-speccing increases costs, and when you are selling millions of these, every penny counts. Hence, if it's got a 310W PSU, you can be sure it needs a 310W PSU.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
More official words from Sony, look at the wording (using)

http://blog.us.playstation.com/2016...id=774079896226177024&adbpl=tw&adbpr=10671602

PS4 Pro: Technical Specifications
Q: How does PS4 Pro compare to the standard PS4?
PS4 Pro is significantly more powerful than the standard PS4 model. PS4 Pro’s advanced graphics processor unit incorporates many features from AMD’s latest “Polaris” architecture, as well as some fully custom hardware innovations, and is considerably more powerful than the GPU included in the standard PS4.
All in all, this increase in processing power enables developers to tap into far more demanding visual features for PS4 Pro owners, including smoother or more stable framerates, support for 4K rendering, advanced graphics features, and more.

This thing is clearly more than simply Polaris in a console.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Now even you know you are talking rubbish. The PS4 isn't some dodgy home build; it will have exactly the amount of PSU power its components require. Over-speccing increases costs, and when you are selling millions of these, every penny counts. Hence, if it's got a 310W PSU, you can be sure it needs a 310W PSU.

From several posts before yours:

In addition, the large PSU doesn't mean power consumption is radically higher. Sony likes to put relatively large PSUs in its consoles: for example, a PS3 Slim uses a 210W/250W PSU, but power consumption with a 45nm CELL and 65nm RSX is around 96W when gaming, according to CNET. Power consumption on a launch PS3 with GS+EE, a 90nm CELL, and a 90nm RSX is 206W, yet that console uses a 380W PSU. My PS3 Slim uses even less power than the figures given by CNET because it's a newer model with a 40nm RSX; however, it still uses the same 250W/210W PSU.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
PS4 Pro’s advanced graphics processor unit incorporates many features from AMD’s latest “Polaris” architecture, as well as some fully custom hardware innovations, and is considerably more powerful than the GPU included in the standard PS4.

Sounds like it's based on Polaris with some tweaks relative to the desktop parts, so not true "Polaris". I don't see why everyone is arguing over it :D

The GPUs in the XB1 and PS4 were similar, but the PS4's was expanded with more ACEs. Sounds like they just modified the Polaris design to fit their hardware better.
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
Still, I'm not convinced that "up to 310W" means it can actually go up to 310W. And even "up to 310W" doesn't mean it's not Polaris or that it's based on 28nm.

Now even you know you are talking rubbish. The PS4 isn't some dodgy home build; it will have exactly the amount of PSU power its components require. Over-speccing increases costs, and when you are selling millions of these, every penny counts. Hence, if it's got a 310W PSU, you can be sure it needs a 310W PSU.

Remember, making a PSU with 80% efficiency is cheaper than making one rated at 90% efficiency.

Still waiting for a datasheet.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Sounds like it's based on Polaris with some tweaks relative to the desktop parts, so not true "Polaris". I don't see why everyone is arguing over it :D

The GPUs in the XB1 and PS4 were similar, but the PS4's was expanded with more ACEs. Sounds like they just modified the Polaris design to fit their hardware better.

Speaking of ACEs, doesn't the PS4 GPU have 8 ACEs while Polaris only has 4? For proper backwards compatibility, that could be one of the PS4 Pro's differences from Polaris right there.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Speaking of ACEs, doesn't the PS4 GPU have 8 ACEs while Polaris only has 4? For proper backwards compatibility, that could be one of the PS4 Pro's differences from Polaris right there.

Polaris has 8 ACEs in total: 4 standard ACEs and 4 in the HWS.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
I have read that, but has the "2 HWS = 4 ACEs" equivalence been explored? For console games carefully programmed to use all 8 ACEs on the PS4 GPU, is it as simple as using the 4 ACEs plus the 2 HWSes?

Curious, as I don't really have the technical knowledge. Just speculating about a possible modification for near-100% PS4 compatibility purposes (which they would probably need).
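
Purely as a counting exercise (nothing here reflects how the real hardware scheduler works), if each HWS really can stand in for two of the old queue front-ends, the eight PS4-era pipes do at least have somewhere to go:

Code:
# Speculative sketch: spread 8 PS4-style compute pipes over 4 ACEs + 2 HWS,
# counting each HWS twice on the assumption it covers two ACEs' worth of work.
ps4_pipes = [f"pipe{i}" for i in range(8)]
polaris_frontends = ["ACE0", "ACE1", "ACE2", "ACE3", "HWS0", "HWS0", "HWS1", "HWS1"]

assignment = dict(zip(ps4_pipes, polaris_frontends))
for pipe, frontend in assignment.items():
    print(pipe, "->", frontend)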
 

Byte

Platinum Member
Mar 8, 2000
2,877
6
81
The original PS4 and the Xbone were a joke, I agree. But this new one is quite powerful in the GPU department and quite cheap at $399. The CPU doesn't worry me, as they tend to code console games "to the metal". We've all seen how PC ports seem to waste CPU cycles like crazy with insane overhead.

Maybe DX12 will fix this, considering everything is on x86, but these consoles always have their tricks. For example, I can't for the life of me understand why an original Xbox can't be emulated. A 733MHz P3?

In pure teraflop terms it meets the level of a GTX 970, which is the minimum requirement for Vive and Oculus.

More like a Celeron. But the XBMC video app is amazing. It can fast forward more smoothly than ANYTHING, PERIOD. I haven't tried that many video players on PC, but nothing was that smooth.