How the PlayStation 4 is better than a PC


2is

Diamond Member
Apr 8, 2012
Thinking doesn't hurt

Which begs the question: why don't you do more of it? Your last post did the same thing as the three before it... You're apparently arguing with yourself.

So now that we know PC > PS4, you're arguing power consumption... Yes, it's more efficient; I never said otherwise.
 

Erenhardt

Diamond Member
Dec 1, 2012
Which begs the question: why don't you do more of it? Your last post did the same thing as the three before it... You're apparently arguing with yourself.

So now that we know PC > PS4, you're arguing power consumption... Yes, it's more efficient; I never said otherwise.

And where did I say PS4 > PC in terms of performance?
All I'm saying is that consoles have some upsides compared to PCs, and the differences in design are very interesting. On top of that, the new pricing policy from GPU manufacturers is making consoles more appealing.

How about streaming all the good-looking stuff?
http://hexus.net/gaming/news/hardware/55721-xbox-one-uses-cloud-render-latency-insensitive-graphics/
But knowing M$, it will most likely be a paid service...
 

Lepton87

Platinum Member
Jul 28, 2009
And where did I say PS4 > PC in terms of performance?
All I'm saying is that consoles have some upsides compared to PCs, and the differences in design are very interesting. On top of that, the new pricing policy from GPU manufacturers is making consoles more appealing.
The pricing policy is just awful. First they released a mid-range card with a high-end price tag, and a lot of people believed it was high-end and argued arduously when someone called it mid-range. Then they released their true high-end for $1000, and once everybody willing to pay that much for a graphics card had bought it, they released an insignificantly slower card for $650. They couldn't have done better in terms of profit.
 

galego

Golden Member
Apr 10, 2013
Kabini with quad-core Jaguar slightly exceeds the performance of a Core i3 (dual core with HT) at the same clocks in the Cinebench R11.5 multithreaded benchmark.

http://hothardware.com/Reviews/AMD-2013-ASeries-Kabini-and-Temash-Mobile-APUs/?page=5

To match 8 Jaguar cores you need a Core i7 (quad core with HT) at the same clocks. The PS4 has eight Jaguar cores, reportedly at 1.6 GHz (clocks not yet officially disclosed), so you are looking at a 1.6 GHz Core i7 to match it. The PS4 APU has an 800 MHz GCN GPU with 1152 SPs on die. The HD 7970M has 1280 SPs at 850 MHz with a TDP around 100 W; the PS4 SoC is expected to draw 100 W.

A Core i7 notebook with an HD 7970M GPU has two separate chips, each with its own memory controller and its own physical memory, communicating over an external PCI-E bus. The minute you step off a chip you need a lot of power to drive signals, and power consumption rises significantly.

The PS4 APU is a single die, and all communication between CPU and GPU goes over an on-die high-bandwidth, low-latency bus. That is much lower power too, as you don't need to step off the chip. A single 256-bit GDDR5 memory controller handles all the memory requests of both CPU and GPU.

There is no way a notebook can come close to matching the efficiency of the PS4 APU. :whistle:
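For reference, the shader-throughput figures quoted above multiply out as follows. A minimal sketch: the 1152 SP / 800 MHz and 1280 SP / 850 MHz numbers are the ones quoted in the post, and 2 FLOPs per ALU per clock (one FMA) is the standard GCN rate.

```python
# Back-of-the-envelope shader throughput from the figures quoted above.
# GCN executes one FMA (2 FLOPs) per shader ALU per clock.
def gflops(shaders: int, clock_mhz: int, flops_per_clock: int = 2) -> float:
    return shaders * clock_mhz * flops_per_clock / 1000.0

ps4_gpu = gflops(1152, 800)   # rumored PS4 GPU configuration
hd7970m = gflops(1280, 850)   # HD 7970M

print(f"PS4 GPU : {ps4_gpu:7.0f} GFLOPS")    # ~1843
print(f"HD 7970M: {hd7970m:7.0f} GFLOPS")    # ~2176
print(f"Ratio   : {ps4_gpu / hd7970m:.2f}")  # ~0.85
```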

Yes, but Cinebench R11.5 is one of those benchmarks that skew the results. Cinebench uses ICC and its Cripple_AMD function, which misrepresents the real scores of AMD chips.

At the risk of repeating myself...

1) everyone knows
2) no one cares

Fixed it for you.
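For context on what that claim refers to: a vendor-checking CPU dispatcher can be sketched as below. This is an illustrative toy in Python, not ICC's actual code, and whether Cinebench's scores are materially affected is exactly what is being disputed here.

```python
# Toy illustration of the disputed behavior: a dispatcher that picks the
# optimized path by vendor string instead of by actual feature support.
# This is NOT ICC source code; it only mimics the alleged logic.
def pick_code_path(vendor: str, supports_sse2: bool) -> str:
    if vendor == "GenuineIntel" and supports_sse2:
        return "sse2_optimized"
    # Non-Intel CPUs fall through to the slow baseline path even when
    # they support the very same instruction-set extensions.
    return "generic_baseline"

print(pick_code_path("GenuineIntel", True))   # sse2_optimized
print(pick_code_path("AuthenticAMD", True))   # generic_baseline
```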
 

toyota

Lifer
Apr 15, 2001
Ahahaha



hahaha.

AMD needs to sprinkle some magic pixie dust on Epic.
Let's make sure this gets posted on every page to remind galego.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
Yes, but Cinebench R11.5 is one of those benchmarks that skew the results. Cinebench uses ICC and its Cripple_AMD function, which misrepresents the real scores of AMD chips.

Ahahaha, this is too funny!

What are you talking about? Cinebench was a staple of AMD performance during Phenom II!

4.4 GHz 965 [Cinebench score screenshot]

4.5 GHz 1090T [Cinebench score screenshot]



The demo was running on AH, not on the PS4, and it was targeting 1080p, not sub-1080p...

You mean 90% of 1080p for PC, and 1080p with insanely gimped visual effects for PS4, right?

You've got jokes, I'll give you that... A regular riot.
 

raghu78

Diamond Member
Aug 23, 2012
That's Sandy Bridge; Ivy is a little bit stronger and more efficient. So the PS4 CPU equals a 1.6 GHz Sandy i7 (I can extrapolate to a 1.5 GHz Ivy i7 or a 1.9 GHz Ivy i5). That is still far weaker than any desktop CPU (around i3 performance).

As for efficiency

From Notebookcheck: 680M vs 7970M

That includes the 17.3-inch screen. Take that off (~10 watts at max brightness) and you end up with 116 watts for BF3 (single-player); add 20 watts of CPU power for multiplayer and you get about 136 watts. That's very close, considering we don't know what the PS4 draws as a full system (not just the SoC estimate).

The 1.5 GHz Kabini was 4% faster than 1.5 GHz Sandy Bridge in Cinebench R11.5 multithreaded. Ivy is also around 3-5% faster than Sandy at the same clocks, so they are on par in performance.

Also, you have to look at performance within a given power and die-size budget. Kabini with 4 Jaguar cores is extremely power efficient; read the AnandTech article.

http://www.anandtech.com/show/6981/...ality-of-mainstream-pcs-with-its-latest-apu/2

"I also suspect the 15W TDP is perhaps a bit conservative, total platform power consumption with all CPU cores firing never exceeded 12W (meaning SoC power consumption is far lower, likely sub-10W)."

So Kabini with 4 Jaguar cores at 1.5 GHz does not draw even 10 W, and that is with an integrated southbridge. Jaguar at 1.5-1.6 GHz seems to draw about 2 W per core.

A Jaguar core is 3.1 mm²; two Jaguar cores are smaller than a single Ivy core, which is 10 mm². Eight Jaguar cores are 25 mm², while four Ivy cores are 40 mm². So on both perf/mm² and perf/watt, Jaguar at 1.6 GHz is more efficient than a notebook Ivy at 1.6 GHz.

You have to understand that even if the PS4 SoC's TDP is 100 W, it will rarely come close to using all of it.

There is no way a Core i7 chip at 1.6 GHz with an HD 7970M GPU can come close to the efficiency of the single-chip PS4 APU. AMD has shaved 2 CUs (128 SPs) off the HD 7970M for the PS4 GPU and clocked it 50 MHz slower. Also, the PS4 SoC will not have a PCI-E controller the way the Core i7 and HD 7970M do; all CPU/GPU communication is on-die. Driving signals off the chip takes a lot of power; ask any electrical engineer. So without any external GPU bus, this fully integrated single-chip APU is much more efficient than a 1.6 GHz Ivy Core i7 with an HD 7970M. AMD and Sony have chosen clocks that yield the best combination of performance and efficiency for both the GCN GPU and the Jaguar-based CPU.

Just the fact that the HD 7970M is a 100 W GPU while the entire PS4 SoC is 100 W should give you the idea. If you still don't believe it, you can wait for AnandTech's PS4 launch article, where they will measure platform power consumption. :thumbsup:
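Taking the per-core figures in this post at face value (3.1 mm² and ~2 W per Jaguar core, 10 mm² per Ivy Bridge core), the arithmetic works out as claimed; a quick sketch:

```python
# Multiplying out the die-area and power figures stated in the post above.
JAGUAR_CORE_MM2 = 3.1   # mm^2 per Jaguar core (as stated)
IVY_CORE_MM2 = 10.0     # mm^2 per Ivy Bridge core (as stated)
JAGUAR_CORE_W = 2.0     # W per Jaguar core at 1.5-1.6 GHz (as estimated)

print(f"8 Jaguar cores: {8 * JAGUAR_CORE_MM2:.1f} mm^2")    # 24.8 mm^2
print(f"4 Ivy cores   : {4 * IVY_CORE_MM2:.1f} mm^2")       # 40.0 mm^2
print(f"8 Jaguar cores: ~{8 * JAGUAR_CORE_W:.0f} W total")  # ~16 W
```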
 

Erenhardt

Diamond Member
Dec 1, 2012
You mean 90% of 1080p for PC, and 1080p with insanely gimped visual effects for PS4, right?
If the visuals are broken and you don't get the expected effect, that doesn't mean the visuals are less demanding.
If there is a bug, it most likely takes more resources than it would once fixed.
Visuals and compute are two separate things; better graphics don't always need more processing.
 

Enigmoid

Platinum Member
Sep 27, 2012
The 1.5 GHz Kabini was 4% faster than 1.5 GHz Sandy Bridge in Cinebench R11.5 multithreaded. Ivy is also around 3-5% faster than Sandy at the same clocks, so they are on par in performance.

Also, you have to look at performance within a given power and die-size budget. Kabini with 4 Jaguar cores is extremely power efficient; read the AnandTech article.

http://www.anandtech.com/show/6981/...ality-of-mainstream-pcs-with-its-latest-apu/2

"I also suspect the 15W TDP is perhaps a bit conservative, total platform power consumption with all CPU cores firing never exceeded 12W (meaning SoC power consumption is far lower, likely sub-10W)."

So Kabini with 4 Jaguar cores at 1.5 GHz does not draw even 10 W, and that is with an integrated southbridge. Jaguar at 1.5-1.6 GHz seems to draw about 2 W per core.

A Jaguar core is 3.1 mm²; two Jaguar cores are smaller than a single Ivy core, which is 10 mm². Eight Jaguar cores are 25 mm², while four Ivy cores are 40 mm². So on both perf/mm² and perf/watt, Jaguar at 1.6 GHz is more efficient than a notebook Ivy at 1.6 GHz.

You have to understand that even if the PS4 SoC's TDP is 100 W, it will rarely come close to using all of it.

There is no way a Core i7 chip at 1.6 GHz with an HD 7970M GPU can come close to the efficiency of the single-chip PS4 APU. AMD has shaved 2 CUs (128 SPs) off the HD 7970M for the PS4 GPU and clocked it 50 MHz slower. Also, the PS4 SoC will not have a PCI-E controller the way the Core i7 and HD 7970M do; all CPU/GPU communication is on-die. Driving signals off the chip takes a lot of power; ask any electrical engineer. So without any external GPU bus, this fully integrated single-chip APU is much more efficient than a 1.6 GHz Ivy Core i7 with an HD 7970M. AMD and Sony have chosen clocks that yield the best combination of performance and efficiency for both the GCN GPU and the Jaguar-based CPU.

Just the fact that the HD 7970M is a 100 W GPU while the entire PS4 SoC is 100 W should give you the idea. If you still don't believe it, you can wait for AnandTech's PS4 launch article, where they will measure platform power consumption. :thumbsup:

That's the SoC TDP, NOT the PS4's TDP (add RAM, HDD, and motherboard power use). The entire notebook draws about 130 watts in BF3 (with the more efficient 680M), minus the screen.

Again, TDP != power use. Under gaming loads the 7970M/680M does not draw anything like 100 watts (the 3DMark06 results are around 106 watts for the entire platform, so about 85-90 watts for the GPU alone).

Sure, they cut off 2 CUs and reduced the clock, which will reduce power consumption, but it will also reduce performance (so efficiency may stay the same).

Let's look at that again.

(You could also use a 3.2 GHz mobile i5 or a desktop i3.)

130 watts for a 3.1 GHz i7 and 680M (estimated under BF3 MP).

How many watts for a 1.6 GHz i7 + 680M?

Guessing around 110 watts.

Not a major difference in efficiency.
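To make the arithmetic in these two posts explicit, here is a sketch using only the figures and guesses already stated above; the rest-of-system split is my assumption, added for illustration:

```python
# Reproducing the notebook power estimates from the posts above.
# None of these numbers are PS4 measurements; they are the posts' figures.
total_with_screen = 126.0   # W, notebook under BF3 SP incl. screen (116 + 10)
screen = 10.0               # W, 17.3" panel at max brightness (approx.)
mp_cpu_extra = 20.0         # W, extra CPU load guessed for multiplayer

bf3_mp_no_screen = total_with_screen - screen + mp_cpu_extra
print(f"Notebook, BF3 MP, no screen: ~{bf3_mp_no_screen:.0f} W")   # ~136 W

# TDP vs. actual draw: splitting the cited 3DMark06 platform figure.
platform_3dmark06 = 106.0   # W, whole notebook platform (cited above)
rest_of_system = 18.0       # W, CPU + board + drives (assumed split)
gpu_alone = platform_3dmark06 - rest_of_system
print(f"GPU alone under load: ~{gpu_alone:.0f} W of a 100 W TDP")  # ~88 W
```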
 

galego

Golden Member
Apr 10, 2013
If the visuals are broken and you don't get the expected effect, that doesn't mean the visuals are less demanding.
If there is a bug, it most likely takes more resources than it would once fixed.
Visuals and compute are two separate things; better graphics don't always need more processing.

Yes. An Epic representative has already explained on Eurogamer that, due to different cinematics, different lighting, and bugs, the early demo cannot be compared visually with the PC version.
 

Yuriman

Diamond Member
Jun 25, 2004
Kabini with quad-core Jaguar slightly exceeds the performance of a Core i3 (dual core with HT) at the same clocks in the Cinebench R11.5 multithreaded benchmark.

http://hothardware.com/Reviews/AMD-2013-ASeries-Kabini-and-Temash-Mobile-APUs/?page=5

To match 8 Jaguar cores you need a Core i7 (quad core with HT) at the same clocks. The PS4 has eight Jaguar cores, reportedly at 1.6 GHz (clocks not yet officially disclosed), so you are looking at a 1.6 GHz Core i7 to match it. The PS4 APU has an 800 MHz GCN GPU with 1152 SPs on die. The HD 7970M has 1280 SPs at 850 MHz with a TDP around 100 W; the PS4 SoC is expected to draw 100 W.

A Core i7 notebook with an HD 7970M GPU has two separate chips, each with its own memory controller and its own physical memory, communicating over an external PCI-E bus. The minute you step off a chip you need a lot of power to drive signals, and power consumption rises significantly.

The PS4 APU is a single die, and all communication between CPU and GPU goes over an on-die high-bandwidth, low-latency bus. That is much lower power too, as you don't need to step off the chip. A single 256-bit GDDR5 memory controller handles all the memory requests of both CPU and GPU.

There is no way a notebook can come close to matching the efficiency of the PS4 APU. :whistle:

Bolded the relevant parts. An i3 at 3.2 GHz is roughly equal to eight Jaguar cores at 1.5 GHz. I'm really not sure about efficiency and silicon cost, but the math says an off-the-shelf i3 is a match.

EDIT: Ivy Bridge desktop i3s have a TDP of 55 W, which includes the iGPU. I would suspect the CPU itself doesn't draw more than 25-30 W.
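The math being referenced, spelled out as a sketch; it assumes Cinebench-style multithreaded throughput scales roughly linearly with both clock and core count, per the review result quoted above:

```python
# From the quoted review: 4 Jaguar cores ~= a dual-core+HT i3 at equal clocks.
# Doubling the Jaguar core count then calls for roughly double the i3 clock.
jaguar_clock = 1.6                   # GHz, PS4's rumored clock
jaguar_cores, baseline_cores = 8, 4  # PS4 vs. the Kabini review part

i3_clock_needed = jaguar_clock * (jaguar_cores / baseline_cores)
print(f"i3 clock to match 8 Jaguar cores: ~{i3_clock_needed:.1f} GHz")  # ~3.2
```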
 

Erenhardt

Diamond Member
Dec 1, 2012
Bolded the relevant parts. An i3 at 3.2 GHz is roughly equal to eight Jaguar cores at 1.5 GHz. I'm really not sure about efficiency and silicon cost, but the math says an off-the-shelf i3 is a match.

EDIT: Ivy Bridge desktop i3s have a TDP of 55 W, which includes the iGPU. I would suspect the CPU itself doesn't draw more than 25-30 W.
http://www.cpu-world.com/CPUs/Core_i3/Intel-Core i3 Mobile I3-330M CP80617004122AG.html
i3-330M (45 nm) Thermal Design Power:
35 W (package)
25 W (CPU core)
12.5 W (graphics core)
As you can see, the TDP for the whole package is less than CPU + GPU combined.

i5-3330: TDP 77 W (with HD 2500)
i5-3350P: TDP 69 W (no iGPU)
So the HD 2500 takes roughly 8-10 W and the CPU about 70 W.
A 55 W i3 should be something like:
package: 55 W
CPU: 50 W
GPU: 10 W

A whole 4-core Jaguar with IGP (the A4-5000) takes about the same amount of power as the HD 2500 alone.
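The subtraction being done above, written out (TDP is a rated ceiling, not measured draw; the i5 TDPs are Intel's published figures):

```python
# Estimating the HD 2500 iGPU's share by differencing two near-identical
# Ivy Bridge i5s, one with the iGPU enabled and one without, then
# comparing against Kabini's measured whole-chip draw.
i5_3330_tdp = 77.0    # W, i5-3330 with HD 2500 iGPU (Intel spec)
i5_3350p_tdp = 69.0   # W, i5-3350P, iGPU disabled (Intel spec)
a4_5000_draw = 10.0   # W, upper bound per AnandTech (SoC "likely sub-10W")

hd2500_share = i5_3330_tdp - i5_3350p_tdp
print(f"HD 2500 share of TDP: ~{hd2500_share:.0f} W")  # ~8 W
print(f"Whole A4-5000 draw  : <{a4_5000_draw:.0f} W")  # same ballpark
```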
 

Lonbjerg

Diamond Member
Dec 6, 2009
I have said in this thread that a PC with 3 Titans will outperform the PS4.

I have quoted in this thread a known analyst (an expert on graphics cards) saying that a PC with 3 Titans will outperform the PS4.

I have quoted an Nvidia developer saying how the PS4 will be ahead of PCs in gaming performance. I doubt his motivation was "to sell you on a console like PS4 so that you'll purchase their games?".

I prefer a person like Mark Rein, whose relation to the PS4 is well known, over anonymous posters in forums whose relations to Intel and Nvidia are unknown, especially when some of those anonymous posters have a long record of bashing AMD.

I am posting in a thread titled "How the Playstation 4 is better than a PC". What better place to discuss this? If you are not interested, you can unsubscribe.

Common sense is the least common of the senses. There are posts in these forums where you can find posters arguing (in the past) about why no AMD APU could end up in the PS4 and the Xbox. They argued that was "common sense".

I'll bet you that my socket 1366 rig will slap the PS4 around in gaming performance...single GPU only.
 

Enigmoid

Platinum Member
Sep 27, 2012
Ok, let us know how your rig plays first-party titles.

Which is the most inane point ever, since it has nothing to do with how powerful or good the PC is compared to the PS4 (or vice versa).

The PS4 has gotta be much weaker than the Wii, since I can't play the graphically unimpressive and undemanding Mario games on it.
 

5150Joker

Diamond Member
Feb 6, 2002
I get the feeling Galego is an AMD shill; he has to be, to write the ludicrous crap he does on these forums. AMD: hey guys, now that we can't compete on the high-end desktop or notebook front, come buy the PS4, our APU > 2 Titans! Galego will tell you all about it!!11
 

notty22

Diamond Member
Jan 1, 2010
This would have to be his first 'job', and with a résumé showing no experience with PC gaming or, for that matter, a functional console.
 