How the PlayStation 4 is better than a PC


galego

Golden Member
Apr 10, 2013
1,091
0
0
Sometimes I could cook an egg on my PS3...how is Sony planning on handling something running probably twice as hot

The PS4 uses an APU design instead of a traditional CPU + dGPU design; the latter requires dissipating heat from two separate chips. Moreover, the Cell was power hungry, whereas the Jaguar cores are based on a power-efficient design.
 

joshhedge

Senior member
Nov 19, 2011
601
0
0
The PS4 uses an APU design instead of a traditional CPU + dGPU design; the latter requires dissipating heat from two separate chips. Moreover, the Cell was power hungry, whereas the Jaguar cores are based on a power-efficient design.

The overall TDP of the system should be similar though, right? I'm sure Sony would have allocated a maximum TDP for the system somewhere close to that of the PS3 and hence the amount of thermal dissipation would be similar?

Granted, the Jaguar cores are based on a more power-efficient design, but that GPU will still generate a solid amount of heat.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Xbox One GPU is 50% weaker than PS4's

http://www.digitalspy.com/gaming/ne...er-in-graphics-than-xbox-one-says-report.html

only 768 shaders on the Xbox One GPU, less than even a 6850.

It's not just that your math is off: you can't compare 768 shaders on one architecture with a shader count on another.
[perfrel.gif: relative performance chart]


6870 = 1120 shader units @ 900 MHz
7850 = 1024 shader units @ 860 MHz
The 7850 has roughly 10% less raw shader power but is about 20% faster.

We don't even know whether the next-gen GPUs will be GCN, GCN2, or something else entirely.
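
A quick back-of-the-envelope check of that point, using the shader counts and clocks listed above (the ~20% performance gap is the chart's relative-performance figure, not something derived here):

```python
# Raw "shader power" as shader units x clock -- only a rough proxy, and only
# really comparable within the same architecture.
hd6870 = 1120 * 900e6   # VLIW5 shaders @ 900 MHz
hd7850 = 1024 * 860e6   # GCN shaders @ 860 MHz

print(f"7850 / 6870 raw shader throughput: {hd7850 / hd6870:.1%}")
# -> ~87.4%, i.e. roughly 10-13% less raw shader power, yet the 7850
# benchmarks ~20% faster. Shader counts alone tell you little across
# architectures.
```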
 

joshhedge

Senior member
Nov 19, 2011
601
0
0
It's not just that your math is off: you can't compare 768 shaders on one architecture with a shader count on another.
[perfrel.gif: relative performance chart]


6870 = 1120 shader units @ 900 MHz
7850 = 1024 shader units @ 860 MHz
The 7850 has roughly 10% less raw shader power but is about 20% faster.

We don't even know whether the next-gen GPUs will be GCN, GCN2, or something else entirely.

I'm sure we can assume that they are based on the same architecture.
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
The PS4 uses an APU design instead of a traditional CPU + dGPU design; the latter requires dissipating heat from two separate chips. Moreover, the Cell was power hungry, whereas the Jaguar cores are based on a power-efficient design.

I'm of the opinion that even if the two chips are integrated, a watt is a watt: whatever goes in must come out as heat. The original PS3 (most likely the model he meant if he was talking about cooking an egg) could draw around 200 W while gaming; the most power-efficient PS3 revision consumed only about 35% of that, at around 70 W. I would imagine the PS4 will land between those two values (70-200 W) initially; revisions will probably get more efficient. It'll start out running a little hotter than the most recent PS3, and possibly end up cooler after some revisions.
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
I'm of the opinion that even if the two chips are integrated, a watt is a watt.

That's crazy talk. Next thing you know you'll be telling me that energy is conserved, or that entropy always increases in closed systems! :p
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
The thing I find most impressive about the new consoles is the Engadget article saying they will use slightly more than 100 watts. I think a lot of people were expecting close to 200 W from these new consoles. It's going to be very hard for a 100 W PC to compete with the new consoles.

I'm more concerned that with only ~100 watts of power these consoles aren't going to last very long before even tablet and phone SoCs can keep up (I'm exaggerating a little here). The Wii U uses even less power, but I'll bet that in a couple of years an IGP will easily keep up with it; the Xbox One and PS4 may take 5+ years before an IGP can keep up.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,667
2,537
136
I'm more concerned that with only ~100 watts of power these consoles aren't going to last very long before even tablet and phone SoCs can keep up (I'm exaggerating a little here). The Wii U uses even less power, but I'll bet that in a couple of years an IGP will easily keep up with it; the Xbox One and PS4 may take 5+ years before an IGP can keep up.

Assuming perfect 50% power scaling at each node, a reasonable TDP of 4 W for a tablet chip, and full nodes every 2 years, 100 W today can fit into a tablet in ~10 years. I'd consider that a very reasonable lifetime for the consoles. These estimates are very optimistic; in reality it will take longer.
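
A minimal sketch of that back-of-the-envelope estimate (the 4 W tablet budget, 50% scaling per node, and 2-year node cadence are the assumptions stated above):

```python
import math

console_power = 100.0   # W, roughly what the new consoles are reported to draw
tablet_power = 4.0      # W, assumed TDP budget for a tablet SoC
years_per_node = 2      # assumed cadence of full process nodes

# Each full node is assumed to halve power at constant performance, so we
# need log2(100 / 4) shrinks to fit the same performance into a tablet.
nodes_needed = math.log2(console_power / tablet_power)
years = math.ceil(nodes_needed) * years_per_node
print(f"~{nodes_needed:.1f} node shrinks -> roughly {years} years")
# -> ~4.6 shrinks, i.e. about five full nodes or ~10 years under these
# very optimistic assumptions.
```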
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Assuming perfect 50% power scaling at each node, a reasonable TDP of 4 W for a tablet chip, and full nodes every 2 years, 100 W today can fit into a tablet in ~10 years. I'd consider that a very reasonable lifetime for the consoles. These estimates are very optimistic; in reality it will take longer.

I was thinking more about IGPs. (But don't forget that new architectures will be introduced that can sharply shrink the gap, e.g. unified shaders on modern PCs vs. the PS3 in the past.)

[53969.png: GPU benchmark comparison chart]


The iPad 4 (almost a year old) competes well with the 7900 GS (released fall 2006): about six years between them.
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
I'm more concerned that with only ~100 watts of power these consoles aren't going to last very long before even tablet and phone SoCs can keep up (I'm exaggerating a little here). The Wii U uses even less power, but I'll bet that in a couple of years an IGP will easily keep up with it; the Xbox One and PS4 may take 5+ years before an IGP can keep up.

I have yet to see a smartphone or tablet SoC even begin to match the Xbox 360/PS3, and those things are 7-8 years old. Plus, tablets and phones have the downside of needing to push 1080p and higher resolution displays; you can't sacrifice resolution to get better graphical effects.

The ONLY mobile system with PS3-esque performance is the PS Vita, which even today is still top-of-the-line mobile GPU hardware; on top of that you have developers coding specifically against it, less of an OS in the way, and Sony-specific optimizations.

Besides, even if the power were there, would it matter? Have you ever seen an original game on mobile like Final Fantasy X, Metal Gear Solid 2, God of War or Ico? The hardware is certainly there; the desire to make those types of games isn't, especially when a 2D game like Angry Birds gives a far better return on the money put in (and no one is going to buy a $20+ iPhone game).
 

insertcarehere

Senior member
Jan 17, 2013
712
701
136
I have yet to see a smartphone or tablet SoC even begin to match the Xbox 360/PS3, and those things are 7-8 years old. Plus, tablets and phones have the downside of needing to push 1080p and higher resolution displays; you can't sacrifice resolution to get better graphical effects.

Last thing I heard, the RSX in the PS3 was a castrated 7800GT.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Interesting, but still it's a pre-built PC box.

Please, define 'PC'...

I was thinking more about IGPs. (But don't forget that new architectures will be introduced that can sharply shrink the gap, e.g. unified shaders on modern PCs vs. the PS3 in the past.)

[53969.png: GPU benchmark comparison chart]


The iPad 4 (almost a year old) competes well with the 7900 GS (released fall 2006): about six years between them.

Nope. It is well below an 8500 GT... which is what, 2-3x slower than a 7900 GS? In that review there are clear hints that you can't compare them directly due to architectural differences; those benchmarks favor unified architectures with small memory buses, etc.
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
PCPer is out with an article about how the PS4 version of the UE4 demo was scaled back. I guess at the end of the day, raw GFLOPS matter (and can't be replaced by efficiency gains).

This shows what I have understood for a long time: console efficiency and "to the metal" programming are real but over-hyped. Slightly lower texture resolution, removal of some lighting and particle effects, and lower AA/resolution/frame rates are why games that can't be maxed out at 60 FPS on a PC can still be played on consoles that are much weaker on a pure FLOP basis.
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
So from this

In both our Xbox One and PS4 articles I referred to the SoCs as using two Jaguar compute units - now you can understand why. Both designs incorporate two quad-core Jaguar modules, each with their own shared 2MB L2 cache. Communication between the modules isn’t ideal, so we’ll likely see both consoles prefer that related tasks run on the same module.

We can surmise that the

Xbox One = Jaguar + Jaguar + 7750 + misc

GCN units = 2 + 2 + 8 = 12 = 768 shader units

PS4 = Jaguar + Jaguar + 7790 + misc

GCN units = 2 + 2 + 14 = 18 = 1152 shader units
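
For what it's worth, here is a minimal sketch of the shader-count conversion only; 64 stream processors per GCN compute unit is the standard figure, while the CU totals themselves are the surmise above:

```python
SHADERS_PER_GCN_CU = 64  # each GCN compute unit contains 64 stream processors

for name, cus in [("Xbox One", 12), ("PS4", 18)]:
    print(f"{name}: {cus} CUs x {SHADERS_PER_GCN_CU} = {cus * SHADERS_PER_GCN_CU} shader units")
# -> 768 and 1152 shader units, i.e. the PS4 GPU has 50% more shaders.
```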
 
Last edited:

Tuna-Fish

Golden Member
Mar 4, 2011
1,667
2,537
136
So from this

We can surmise that the

Xbox One = Jaguar + Jaguar + 7750 + misc

GCN units = 2 + 2 + 8 = 12 = 768 shader units

PS4 = Jaguar + Jaguar + 7790 + misc

GCN units = 2 + 2 + 14 = 18 = 1152 shader units

Umm, no. It's all on a single chip of silicon, and the Jaguar cores are not attached to their own shader arrays. The whole point of the Jaguar design is that it's modular -- you don't need to integrate the whole thing; you can just plop down as many Jaguar CUs (as in, 4 Jaguar cores + shared cache) as you want, along with whatever else, on the same chip.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
PCPer is out with an article about how the PS4 version of the UE4 demo was scaled back. I guess at the end of the day, raw GFLOPS matter (and can't be replaced by efficiency gains).

Honestly, that editorial is complete rubbish, although Epic can certainly take some of the blame for throwing out random numbers with imprecise language.

Let's do some math:

A GTX 680 is 3.09 TFLOPs, so in order to render a scene that requires a "theoretical" 2.5 TFLOPs the GTX 680 needs to be (2.5 / 3.09 * 100) 80.9% efficient, which is incredibly high.

Let's look at the latest ratings for 1920x1200 from TPU. It's a bit imprecise, but will fit our needs adequately.

GTX 580 = 60% at 1581 GFLOPs, or 26.35 GFLOPs per %

GTX 680 = 76% at 3090 GFLOPs or 40.66 GFLOPs per %

Based on these numbers we can tell that GTX 680 is only (26.35 / 40.66 * 100) 64.8% efficient per FLOP compared to GTX 580. Can you see the issue now?

Assuming that GTX 580 is 100% efficient, which it obviously isn't -- but just for the sake of driving the argument home we'll assume it is -- this leaves GTX 680 with a (2500 - (3090 * 0.648)) 497.7 GFLOP performance shortfall compared to the 2.5 TFLOPs that are supposedly required to run the demo. This shortfall would be far greater still if we were able to factor in the real efficiency of GTX 580 instead of assuming 100%.
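
The arithmetic in that argument, spelled out (the 60% and 76% figures are the TPU 1920x1200 summary ratings quoted above, and Fermi is charitably treated as 100% efficient):

```python
# Per-FLOP efficiency comparison from TPU's 1920x1200 relative-performance ratings
gtx580_rating, gtx580_gflops = 60, 1581
gtx680_rating, gtx680_gflops = 76, 3090

gflops_per_pct_580 = gtx580_gflops / gtx580_rating   # ~26.35 GFLOPs per %
gflops_per_pct_680 = gtx680_gflops / gtx680_rating   # ~40.66 GFLOPs per %

# Kepler needs more theoretical FLOPs per unit of delivered performance than Fermi
rel_efficiency = gflops_per_pct_580 / gflops_per_pct_680   # ~0.648

# Even crediting Fermi with 100% efficiency, the GTX 680's "effective" FLOPs
# fall short of the claimed 2.5 TFLOP requirement for the demo.
shortfall = 2500 - gtx680_gflops * rel_efficiency
print(f"Kepler per-FLOP efficiency vs Fermi: {rel_efficiency:.1%}")
print(f"Shortfall vs 2500 GFLOPs: ~{shortfall:.0f} GFLOPs")
# -> ~64.8% and a shortfall of roughly 500 GFLOPs, in line with the figures above.
```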

The fact that GTX 680 can run the Samaritan demo at all debunks the claim that it requires 2.5 TFLOPs of theoretical power -- GTX 680 isn't anywhere close to efficient enough to bring that amount of power to bear on the task.

Epic is guilty of pulling some random numbers out of their ass (and/or misusing the word "theoretical"), and the author of the editorial is guilty of not seeing how he's contradicting himself.

This isn't to say that PS4 can or cannot run the Samaritan demo, just that the editorial is complete garbage.
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
Let's look at the latest ratings for 1920x1200 from TPU. It's a bit imprecise, but will fit our needs adequately.

GTX 580 = 60% at 1581 GFLOPs, or 26.35 GFLOPs per %

GTX 680 = 76% at 3090 GFLOPs or 40.66 GFLOPs per %

Based on these numbers we can tell that GTX 680 is only (26.35 / 40.66 * 100) 64.8% efficient per FLOP compared to GTX 580. Can you see the issue now?

That is an average of frame rates, which is not how GFLOPS work. Anyway, I think he was talking about peak GFLOPs, not effective GFLOPs. Also, regardless of what the number is, the PC demo is still clearly running more effects.
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
Umm, no. It's all on a single chip of silicon, and the Jaguar cores are not attached to their own shader arrays. The whole point of the Jaguar design is that it's modular -- you don't need to integrate the whole thing; you can just plop down as many Jaguar CUs (as in, 4 Jaguar cores + shared cache) as you want, along with whatever else, on the same chip.

That's not what I got from the quote.

Until we get the actual details, what I said could be the case.

I would like to add that AMD doesn't seem to produce any 12- or 18-CU GCN layouts.

They come in (CU counts):

7750 = 8
7770 = 10
7790 = 14
7850 = 16
7870 = 20
7870xt = 24
7950 = 28
7970 = 32

There has to be some reason for such modularity.
 
Last edited:

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
That is an average of frame rates

The fact that it's an average (which means that the Samaritan scene may be better for Kepler, but it could also be worse) is completely overshadowed by the fact that we're assuming 100% efficiency for Fermi -- no best case scenario is going to overcome that.

which is not how GFLOPS work.

It lets us compare average architectural efficiency. Obviously, there can be bottlenecks that aren't shader related, but those still end up affecting shader throughput.

If you want to single out shader performance you get pretty much the same story. Have a look at this page of TR's GTX 680 review. Absolute best case, in Perlin noise (which is about as close as you can get to a test of pure, stupid shading power), Kepler is only 80.2% as efficient as Fermi per FLOP, and in most shader tests Kepler is far less efficient than that. Given that Fermi isn't 100% efficient to begin with, we still have a significant shortfall.

Anyway, I think he was talking about peak GFLOPs, not effective GFLOPs.

That's a huge assumption, but even if you're correct, GTX 680 still isn't going to reach that in an actual rendered scene when it falls shy of 2.5 TFLOPs even in Perlin noise.

I don't think you understand exactly how difficult 80.9% efficiency outside of a power virus (e.g. for a complex task) actually is.

Also, regardless of what the number is, the PC demo is still clearly running more effects.

I didn't make an argument stating otherwise, so your bringing up this point is at best a straw man and at worst betrays a significant bias.
 
Last edited:
Status
Not open for further replies.