How the PlayStation 4 is better than a PC

Status
Not open for further replies.

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
That's a huge assumption, but even if you're correct, the GTX 680 still isn't going to hit that in an actual rendered scene when it's still shy of 2.5 TFLOPS in Perlin noise.

I don't think you understand exactly how difficult 80.9% efficiency outside of a power virus (e.g., for a complex task) actually is.
Indeed.
It took a long time to get GPU designs anywhere near as efficient as today's, even for fixed-function units.

Now that GPUs (and CPUs) have hit the power wall, we will see overall power efficiency increase, but single thread calculation efficiency will most likely get even worse.
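For context, the 80.9% figure comes from dividing the measured Perlin-noise throughput by the card's theoretical peak. A quick sketch; the GTX 680 shader count and clock used here are the commonly cited launch specs, not figures from this thread:

```python
# Efficiency = measured throughput / theoretical peak throughput.
# GTX 680: 1536 shaders, 2 FLOPs per shader per cycle (FMA), ~1.006 GHz.
peak_tflops = 1536 * 2 * 1.006e9 / 1e12   # ~3.09 TFLOPS theoretical peak
measured_tflops = 2.5                      # Perlin-noise result cited above

efficiency = measured_tflops / peak_tflops
print(f"{efficiency:.1%}")                 # prints 80.9%
```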
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
I'm of the opinion that even if the two chips are integrated, a watt is a watt.

The key here is that an integrated chip is not the same as the sum of the discrete parts. The Radeon dGPU has a TDP of about 140 W, but the whole PS4 APU (which includes the CPU) is expected to be in the sub-100 W range.

Since a watt is a watt, as you correctly note, Sony only has to dissipate less than 100 W, whereas it would have to dissipate nearly 200 W if it had selected a traditional CPU+dGPU design.
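The watt arithmetic above can be made explicit. A trivial sketch: the discrete CPU TDP is my own ballpark assumption; the other two figures are the ones quoted in the post.

```python
# Integrated APU vs. traditional CPU + dGPU, in TDP terms.
radeon_dgpu_tdp = 140   # W, the discrete Radeon figure quoted above
desktop_cpu_tdp = 60    # W, assumed ballpark for a mainstream desktop CPU
ps4_apu_tdp = 100       # W, the rumoured sub-100 W ceiling, rounded up

discrete_total = radeon_dgpu_tdp + desktop_cpu_tdp
print(discrete_total)                 # 200 W for the discrete design
print(discrete_total - ps4_apu_tdp)   # ~100 W Sony never has to dissipate
```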


PCPer is out with an article about how the PS4 version of the UE4 demo was scaled back. I guess at the end of the day, raw GFLOPS matter (and can't be replaced by efficiency gains).

This shows what I have understood for a long time: console efficiency and "to the metal" programming are real but over-hyped. Slightly lower texture resolution, removal of some lighting and particle effects, and lower AA/resolution/frame rate are why games that can't be maxed out at 60 FPS on a PC can be played on consoles that are much weaker on a pure-FLOP basis.

That article ends with a tweet from Mark Rein saying "I call bullshit on this one". That also applies to the article itself.

Epic has already explained what happened with the UE4 demo shown at GDC. Here is a summary:

  • Both demos are essentially the same, but the PS4 version has different cinematics, created by joining the earlier PC demo with the extended part shown at the GDC PS4 event.
  • The main differences are that the PS4 version had SVOGI replaced by another GI technique, tessellation was broken due to a bug, and some particle effects were slightly scaled down.
  • The PS4 version was not running on a PS4 but on a dev kit that they received only a couple of weeks before the GDC show. They had no time to optimize anything or to fix the bugs when porting from the PC (see below).
  • The dev kit used non-final APIs, whereas the PC version is based on the mature DX11.
  • The PC version was slightly scaled back as well. To maintain 30 fps, the PC version, running on a GTX 680 (2 GB) + 16 GB RAM + i7, targeted a sub-1080p resolution. The PS4 version targets 1080p.
  • It was leaked on the web that the dev kit running the PS4 version was delivering between 27% and 29% of the performance of the final PS4 hardware.
  • Although the dev kit has 2.2 GB of VRAM, some game developers have leaked that some early kits had only 1.5 GB enabled. Recall that the demo requires a minimum of 2 GB of VRAM.
http://www.psu.com/forums/showthrea...ummies/page3?p=6023661&viewfull=1#post6023661
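If the leaked 27-29% figure is taken at face value, the dev kit's effective throughput is easy to bound, assuming performance scales linearly with the widely reported 1.84 TFLOPS final spec:

```python
# Implied dev-kit throughput from the leaked 27-29% figure.
final_ps4_tflops = 1.84                  # widely reported GPU peak
low = 0.27 * final_ps4_tflops
high = 0.29 * final_ps4_tflops
print(f"{low:.2f}-{high:.2f} TFLOPS")    # prints 0.50-0.53 TFLOPS
```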
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Galego - do you honestly believe a PS4 is equivalent to >2 Titans in performance? You don't think guys like Mark Rein have ulterior motives to sell you on a console like the PS4 so that you'll purchase their games? Use some common sense, man. And why are you trying to market a PS4 to PC enthusiasts? Your time would be better spent spinning these fantasies on an Xbox forum.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Ahahaha



hahaha.

AMD needs to sprinkle some magic pixie dust on Epic.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Galego - do you honestly believe a PS4 is equivalent to >2 Titans in performance? You don't think guys like Mark Rein have ulterior motives to sell you on a console like the PS4 so that you'll purchase their games? Use some common sense, man. And why are you trying to market a PS4 to PC enthusiasts? Your time would be better spent spinning these fantasies on an Xbox forum.

I have said in this thread that a PC with 3 Titans will outperform the PS4.

I have quoted in this thread a known analyst (an expert on graphics cards) saying that a PC with 3 Titans will outperform the PS4.

I have quoted an Nvidia developer saying that the PS4 will be ahead of PCs in gaming performance. I doubt his motivation was "to sell you on a console like PS4 so that you'll purchase their games".

I prefer a person like Mark Rein, whose relation to the PS4 is well known, over anonymous posters in forums, whose relations to Intel and Nvidia are unknown, especially when some of those anonymous posters have a long record of bashing AMD.

I am posting in a thread titled "How the Playstation 4 is better than a PC". What better place to discuss this? If you are not interested, you can unsubscribe.

Common sense is the least common of the senses. There are posts in these forums where you can find posters arguing (in the past) why no AMD APU could end up in the PS4 and the Xbox. They argued that was "common sense".
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,667
2,537
136
That's not what I got from the quote.

Until we get the actual details, what I said could be the case.

We actually have all the details. The VGLeaks leak has been verified to be spot-on by several reliable independent parties, including Digital Foundry. Also, using such a separated system would be completely pointless, given that the whole point of Jaguar, the thing AMD advertised to potential customers, was how quickly it can be integrated into a semi-custom solution with whatever else the customer wants.

I would like to add: AMD doesn't seem to produce any 12- or 18-CU GCN layouts.

There has to be some reason for such modularity.

The only layouts AMD is producing are 32, 20, 14, 10 and 6 CUs. The rest are harvests. In one of the technical presentations near the GCN release, an AMD rep said that they can integrate CUs in any amount they want. But masks are expensive, so you only want enough chips to serve all market segments, and since vendors don't want too many models with very similar performance, you limit harvesting to about three models per chip.

The consoles won't physically have 12 and 18 CUs. Redundancy and harvesting are absolutely necessary for yields when building large chips on TSMC's 28 nm process. By the last numbers they released (somewhat outdated now), their pre-harvesting yields were ~40% for a much smaller chip. Fully enabled chips can *only* be produced economically if you can harvest the failed ones into lower-value models. When you can't do that, you must include some redundancy in every chip so that you can use the majority of chips that have a failure or two.

Given the programmer-visible CU counts, I'd expect there to actually be 14 and 20 CUs on the chip.
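The redundancy argument can be illustrated with a toy yield model. This is a sketch only: the Poisson defect assumption and the 0.05 defects-per-CU rate are illustrative numbers of mine, not TSMC data, and real yield models also have to cover the non-CU area of the die.

```python
import math

def yield_with_spares(cus_on_die, cus_needed, defects_per_cu=0.05):
    """Probability that at least `cus_needed` of the physical CUs are
    defect-free, assuming independent Poisson-distributed defects."""
    p_good = math.exp(-defects_per_cu)   # P(a single CU has no defect)
    return sum(
        math.comb(cus_on_die, good)
        * p_good**good * (1 - p_good)**(cus_on_die - good)
        for good in range(cus_needed, cus_on_die + 1)
    )

# 18 usable CUs with no spares vs. 20 physical CUs with 2 spares:
print(f"{yield_with_spares(18, 18):.1%}")   # every CU must be perfect
print(f"{yield_with_spares(20, 18):.1%}")   # two spares absorb most defects
```

With these assumed numbers the spare-less die yields roughly 41% while the 20-CU die with two spares yields over 90%, which is exactly the economic case being described.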
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
I have said in this thread that a PC with 3 Titans will outperform the PS4.

I have quoted in this thread a known analyst (an expert on graphics cards) saying that a PC with 3 Titans will outperform the PS4.

I have quoted an Nvidia developer saying that the PS4 will be ahead of PCs in gaming performance.

I prefer a person like Mark Rein, whose relation to the PS4 is well known, over anonymous posters in forums, some of whom have a long record of bashing AMD.

I am posting in a thread titled "How the Playstation 4 is better than a PC". What better place to discuss this? If you are not interested, you can unsubscribe.

Common sense is the least common of the senses. There are posts in these forums where you can find posters arguing (in the past) why no AMD APU could end up in the PS4 and the Xbox. They argued that was "common sense".
galego, to me you come across as someone who's genuinely interested in a serious debate, albeit with a bit of hyperbole (not unlike me). However, there is no meaningful quantitative analysis that can be done between a PS4 and a PC, any PC for that matter! It's like comparing a custom-made motorbike to a factory-tuned eight-cylinder SUV: sure, the motorbike will come out faster in the rarest of rare cases, but in the other 99% of real-world scenarios the SUV will indeed be faster in nearly every one of them! So personally I'd suggest you stop making this particular point over and over again, because it cannot be proven by Mark Rein or anyone else at MIT/NASA et al., since the only metric comparable in each case is their respective theoretical GFLOP values and not much else, I'm afraid! So I'll reiterate: let it go, man!
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I'm of the opinion that even if the two chips are integrated, a watt is a watt. Whatever goes in must come out as heat. The original PS3 (which is most likely what he was talking about with cooking an egg) could use around 200 W while gaming; the most power-efficient PS3 consumed only 35% of that, at around 70 W. I would imagine that the PS4 is going to be between those two values (70-200 W) initially; revisions will probably get more efficient. It'll start out running a little hotter than the most recent PS3, and possibly end up cooler after some revisions.

Integrating all these things into one chip is very beneficial in terms of power usage:
- only one piece of silicon (yes, a bigger one, but still) needs to be under voltage
- power doesn't need to be delivered via the PCB (very thin traces with high resistance)
- some parts are shared. Memory controllers are among the most power-hungry components, and the PS4's memory controllers will be used by the CPU and GPU at the same time.

The 7790 takes what, 80 watts? 8 Jaguar cores shouldn't take more than 40 watts. Add 6 GB of GDDR5, Blu-ray, the PSU, etc., and the whole console will be about 150 W at most.
We can expect power consumption to get lower and lower over the life cycle of the console. Maybe they'll even go down a node (20 nm) in a few years?
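Summing the post's own ballpark figures makes the 150 W claim explicit. None of these are measurements, and the 30 W line for memory, drive, and PSU losses is my own filler to reach the post's stated total:

```python
# Back-of-the-envelope console power budget from the estimates above.
budget_watts = {
    "GPU (7790-class, ~80 W)": 80,
    "8 Jaguar cores": 40,
    "GDDR5 + Blu-ray + PSU losses (assumed)": 30,
}
total = sum(budget_watts.values())
print(total)   # prints 150, the "about 150 W at most" estimate
```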
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
It's good that you're all being champions for the PC master race, but the PS4 is a generation ahead in one metric: gaming performance per watt.

There is no way you will get similar gaming performance at around 100 watts on a PC.
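The one metric that can be compared here is theoretical GFLOPS per watt. A hedged sketch: the PS4 numbers are the widely reported specs, and the CPU-side figures for the PC system are my own rough assumptions:

```python
# Theoretical GFLOPS per watt (not measured gaming performance).
systems = {
    "PS4 APU (rumoured specs)": (1840, 100),   # (GFLOPS, watts)
    "GTX 680 card alone": (3090, 195),
    "i7 + GTX 680 system (assumed CPU figures)": (3090 + 110, 195 + 130),
}
for name, (gflops, watts) in systems.items():
    print(f"{name}: {gflops / watts:.1f} GFLOPS/W")
```

By this crude measure the PS4 comes out ahead per watt even though the PC delivers far more total throughput, which is the distinction being argued.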
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
It's good that you're all being champions for the PC master race, but the PS4 is a generation ahead in one metric: gaming performance per watt.

There is no way you will get similar gaming performance at around 100 watts on a PC.

You should create a new thread where perhaps there may be others besides yourself trying to argue this point. There are two reasons why this isn't being debated:

1) Everyone knows.
2) No one cares.
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
It's good that you're all being champions for the PC master race, but the PS4 is a generation ahead in one metric: gaming performance per watt.

There is no way you will get similar gaming performance at around 100 watts on a PC.
I'd say that's a gap which cannot be overcome in a generation; two generations at the earliest on a PC, because the console parts will also get a die shrink to 20 nm, say 2-3 years down the line, making them even more efficient/refined than they are at present!
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
It's good that you're all being champions for the PC master race, but the PS4 is a generation ahead in one metric: gaming performance per watt.

There is no way you will get similar gaming performance at around 100 watts on a PC.

Very true. But that's a cop-out argument. Let's take your logic further: will the PS4 be 50x the speed of current smartphones? ;) I doubt it... they use about 2 W and can play lots of games...
 

showb1z

Senior member
Dec 30, 2010
462
53
91
It's good that you're all being champions for the PC master race, but the PS4 is a generation ahead in one metric: gaming performance per watt.

There is no way you will get similar gaming performance at around 100 watts on a PC.

And nobody has claimed that. Gotta keep those 850 W PSUs busy with something.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
You should create a new thread where perhaps there may be others besides yourself trying to argue this point. There are two reasons why this isn't being debated:

1) Everyone knows.
2) No one cares.

The two people building 100 W gaming rigs care :colbert:
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
The two people building 100 W gaming rigs care :colbert:

No, just one. The other is trying to build a theoretical gaming PC that MIGHT be able to compete with the PS4, so he's going with tri-SLI Titans and a processor that doesn't exist yet (the current crop of i7s are way too slow) in hopes it can match this "not just an APU" APU. ;)
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
The two people building 100 W gaming rigs care :colbert:

No, just one. The other is trying to build a theoretical gaming PC that MIGHT be able to compete with the PS4, so he's going with tri-SLI Titans and a processor that doesn't exist yet (the current crop of i7s are way too slow) in hopes it can match this "not just an APU" APU. ;)

You two are wasting my bandwidth...
People have actually complained about how hot consoles run. So performance per watt is the second most important thing when it comes to consoles (performance per dollar is first).
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
You two are wasting my bandwidth...
People have actually complained about how hot consoles run. So performance per watt is the second most important thing when it comes to consoles (performance per dollar is first).

Valid point.

Keep in mind when the PS3/X360 launched, they actually had a pretty potent GPU. The CPU was not the fastest, but neither option was power-efficient.

Jump forward to now, and you are still getting poor CPU performance (by current standards), but it is a lot more efficient. You are also getting a pretty mediocre GPU, but again an efficient design.

If the new consoles were like the last, you would see a modified 7970 shoehorned into the system alongside a Bulldozer-esque CPU. You can bet those would be sucking down the juice in terms of power.

Edit: Also keep in mind that the first round of consoles was negatively impacted by 'Bumpgate' type issues with the transition to lead-free solders. This gen doesn't have to worry about that most likely...
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
You two are wasting my bandwidth...
People have actually complained about how hot consoles run. So performance per watt is the second most important thing when it comes to consoles (performance per dollar is first).
This is how I like to pick my PC parts, except that I'm willing to spend more on efficiency than your average Joe is, or even hardcore gamers for that matter!
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Very true. But that's a cop-out argument. Let's take your logic further: will the PS4 be 50x the speed of current smartphones? ;) I doubt it... they use about 2 W and can play lots of games...

Your comparisons are not logical. No smartphone GPU can come anywhere close to the sophistication of the PS4 GPU. Raw GFLOPS are not everything; architecture and programmability are very important. The PS4 GPU is a modified GCN design and is excellent at graphics and compute. The PS4 APU has 1,840 GFLOPS with 8 Jaguar cores at 1.6-1.8 GHz (CPU clocks not yet confirmed). The PS4 SoC is expected to be 100 W. It's a very efficient piece of silicon.
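For reference, the 1,840 GFLOPS figure falls straight out of the widely reported GPU configuration (18 GCN CUs at 800 MHz; these are the leaked specs, not confirmed by Sony at the time):

```python
# Peak single-precision throughput of an 18-CU GCN GPU at 800 MHz.
cus = 18
lanes_per_cu = 64        # four SIMD-16 units per GCN compute unit
flops_per_lane = 2       # fused multiply-add counts as 2 FLOPs per cycle
clock_mhz = 800

gflops = cus * lanes_per_cu * flops_per_lane * clock_mhz / 1000
print(gflops)            # prints 1843.2, i.e. the ~1.84 TFLOPS quoted above
```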
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Your comparisons are not logical. No smartphone GPU can come anywhere close to the sophistication of the PS4 GPU. Raw GFLOPS are not everything; architecture and programmability are very important. The PS4 GPU is a modified GCN design and is excellent at graphics and compute. The PS4 APU has 1,840 GFLOPS with 8 Jaguar cores at 1.6-1.8 GHz (CPU clocks not yet confirmed). The PS4 SoC is expected to be 100 W. It's a very efficient piece of silicon.

Again I will say: the PS4 has no chance of being 50x faster than a top-tier phone. Deal with it. It is NOT the efficiency king.

Edit: Sophisticated? How? It's a bottom-feeder CPU with a mid-range GPU. Do you think an Atom or Bobcat is 'sophisticated'? No. Efficient? Yes. The most efficient around? No. :)
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
You two are wasting my bandwidth...
People have actually complained about how hot consoles run. So performance per watt is the second most important thing when it comes to consoles (performance per dollar is first).

Worrying about heat is not the same thing as worrying about consumption. It was an issue due to inadequate cooling, which can happen even with more efficient designs.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Valid point.

Keep in mind when the PS3/X360 launched, they actually had a pretty potent GPU. The CPU was not the fastest, but neither option was power-efficient.

Jump forward to now, and you are still getting poor CPU performance (by current standards), but it is a lot more efficient. You are also getting a pretty mediocre GPU, but again an efficient design.

If the new consoles were like the last, you would see a modified 7970 shoehorned into the system alongside a Bulldozer-esque CPU. You can bet those would be sucking down the juice in terms of power.

Edit: Also keep in mind that the first round of consoles was negatively impacted by 'Bumpgate' type issues with the transition to lead-free solders. This gen doesn't have to worry about that most likely...

As general-purpose CPUs, yes, Cell and Xenon were barely comparable to the Athlon 64 X2s (64-bit FPUs, IIRC) of old, but even Xenon (3x 128-bit VMX units!) could obliterate them in raw GFLOPS, an important necessity for running games and anything media-centric. Would you expect an Athlon 64 X2 (64-bit FPU) to run Battlefield 3 competently? No. Not even an Athlon II X2 (128-bit FPU) would manage decent performance unless you stuck to very small maps and 16 players or so max. Don't expect decent performance unless you have an older quad core with 128-bit SIMD, or the dual-core/module 256-bit SIMD capability you get with the newer AMD or Intel architectures.

No, GFLOPS are not everything, but they are an important part of the gaming equation. It's much of the reason why the Wii U is going unsupported by third-party developers. It's not so much the 1.24 GHz clock speed as it is the 64-bit SIMD per core of the three-core CPU that's the issue! Silly Nintendo!
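The SIMD-width argument can be made concrete with peak single-precision FLOPS. A sketch under stated assumptions: the clocks and widths are the commonly cited specs, flops-per-lane assumes fused multiply-add, and real game code lands far below these peaks:

```python
# Peak single-precision GFLOPS from core count, SIMD width, and clock.
def peak_gflops(cores, simd_bits, flops_per_lane, clock_mhz):
    lanes = simd_bits // 32                  # single-precision lanes per core
    return cores * lanes * flops_per_lane * clock_mhz / 1000

print(peak_gflops(3, 128, 2, 3200))  # Xenon: 3 cores, 128-bit VMX @ 3.2 GHz
print(peak_gflops(2, 64, 2, 2000))   # Athlon 64 X2-class: 64-bit FPU @ 2.0 GHz
print(peak_gflops(3, 64, 2, 1240))   # Wii U-style: 3 cores, 64-bit SIMD @ 1.24 GHz
```

Even at these optimistic peaks, the 64-bit-FPU parts trail Xenon several times over, which is the gap the post is pointing at.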
 

Lean L

Diamond Member
Apr 30, 2009
3,685
0
0
Your comparisons are not logical. No smartphone GPU can come anywhere close to the sophistication of the PS4 GPU. Raw GFLOPS are not everything; architecture and programmability are very important. The PS4 GPU is a modified GCN design and is excellent at graphics and compute. The PS4 APU has 1,840 GFLOPS with 8 Jaguar cores at 1.6-1.8 GHz (CPU clocks not yet confirmed). The PS4 SoC is expected to be 100 W. It's a very efficient piece of silicon.

psst... his quote was meant as a throwaway argument that highlights why the prior argument doesn't make sense to him.
 