How the PlayStation 4 is better than a PC


poohbear

Platinum Member
Mar 11, 2003
Except that there's someone pretending the PS4's computational power is, at worst, equal to the most powerful PC hardware available and, at best, a lot better. It's hardly about comparing equal hardware and saying the PS4 would be more capable. That debate would have ended soon after it started. Well, so did this one, really, but the guy is just persistent.

Why do you care? It's one guy arguing a position based entirely on theory. When the PS4 gets released, the real debates can begin.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
Are those figures double precision? Games use single precision, if I'm not mistaken...

It's float (single precision); it doesn't even matter...
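
(For the curious: peak-GFLOPS figures are just cores × clock × FLOPs per cycle, and double-precision throughput per cycle is typically about half the single-precision figure. A minimal sketch, assuming the rumored 8-core ~1.6 GHz Jaguar configuration and the commonly cited 8 SP FLOPs/cycle per core; none of these are confirmed PS4 specs:)

```cpp
#include <cstdio>

// Theoretical peak = cores * clock (GHz) * FLOPs per core per cycle.
// All inputs below are assumptions (rumored clock, commonly cited
// per-cycle throughput), not confirmed PS4 specifications.
double peak_gflops(int cores, double ghz, int flops_per_cycle) {
    return cores * ghz * flops_per_cycle;
}

int main() {
    // Jaguar-class cores have 128-bit FP pipes: ~8 single-precision
    // FLOPs/cycle, and roughly half that in double precision.
    std::printf("SP peak: %.1f GFLOPS\n", peak_gflops(8, 1.6, 8)); // 102.4
    std::printf("DP peak: %.1f GFLOPS\n", peak_gflops(8, 1.6, 4)); //  51.2
    return 0;
}
```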

This. The behavior has gone well past "hi, I'm new and don't quite know how to play well with others" to "hi, I'm here to troll you day in and day out".

I'd advise ignoring the one person that has been trying to keep this farce going.
 

Baasha

Golden Member
Jan 4, 2010
PS...what?

n00b please... I wonder if I should send the PS4 team my rig specifications and see if they still want to claim that their console will "beat" PCs. :rolleyes:
 

Silver Prime

Golden Member
May 29, 2012
Why do you care? It's one guy arguing a position based entirely on theory. When the PS4 gets released, the real debates can begin.

Indeed, it's all speculation. I'm hoping to see some mind-f**ked, reality-warping graphics, but I have to be logical: graphics are already almost counter to reality, so it will depend on the artistic value and, of course, the gameplay. I heard that the latest Ninja Gaiden sucked badly because it tried to copy the God of War feel, and the gameplay was inactive almost all the way through; it was just like watching scene after scene, pressing only one or two buttons while a prolonged cut scene does the rest... that... sucks.
 

galego

Golden Member
Apr 10, 2013
They are Linpack/Intel Burn Test numbers.

They are not Linpack values but theoretical GFLOPS computed by Intel.

But were both tested on the same app for a basis?

As I said, they are not measured values.

It's the real-world performance that counts. Just for kicks: calculated theoretically, the Xbox 360 is around 77 GFLOPS and the PS3 is 230 GFLOPS, but the extractable performance is a tiny fraction of that.

Yes, real-world performance is what counts, and a pair of benchmarks for Jaguar cores were given. But lacking benchmarks for the PS4's 8-core Jaguar chip, we use GFLOPS for now to get an idea of its performance.

Note as well that the problems with the 360 and PS3 were the difficulty of programming the PowerPC-based chips (a problem that does not exist now) and the surrounding hardware, such as the slow and small memory (also gone now), rather than chip performance.
 

Enigmoid

Platinum Member
Sep 27, 2012
They are not Linpack values but theoretical GFLOPS computed by Intel.

As I said, they are not measured values.

Yes, real-world performance is what counts, and a pair of benchmarks for Jaguar cores were given. But lacking benchmarks for the PS4's 8-core Jaguar chip, we use GFLOPS for now to get an idea of its performance.

Note as well that the problems with the 360 and PS3 were the difficulty of programming the PowerPC-based chips (a problem that does not exist now) and the surrounding hardware, such as the slow and small memory (also gone now), rather than chip performance.

If you run Linpack/Intel Burn Test you will get similar numbers.

GFLOPS is a horrible way to evaluate CPUs.

If you are going to look at PS4 CPU performance, take some Jaguar benchmarks (I demoed the quad-core 1.4 GHz A6-1450 subnotebook in another thread; you can also find results on the internet) and multiply them by about three.
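
(As a sanity check on that rule of thumb, the naive scaling math; note the PS4's ~1.6 GHz clock was only a rumor at this point, and real workloads won't scale perfectly:)

```cpp
#include <cstdio>

int main() {
    // A6-1450: 4 Jaguar cores, 1.0 GHz base / 1.4 GHz boost (known).
    // PS4: 8 Jaguar cores at a rumored ~1.6 GHz (assumption).
    // Naive estimate: scale by core count and by clock ratio.
    double vs_base  = (8.0 / 4.0) * (1.6 / 1.0); // 3.2x vs base clock
    double vs_boost = (8.0 / 4.0) * (1.6 / 1.4); // ~2.3x vs full boost
    std::printf("vs 1.0 GHz base:  %.1fx\n", vs_base);
    std::printf("vs 1.4 GHz boost: %.1fx\n", vs_boost);
    return 0;
}
```

Depending on whether the A6-1450 held its base or boost clock during the benchmark, the naive factor lands between roughly 2.3x and 3.2x, presumably where the "about three" comes from.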
 

Erenhardt

Diamond Member
Dec 1, 2012
If you run Linpack/Intel Burn Test you will get similar numbers.

GFLOPS is a horrible way to evaluate CPUs.

If you are going to look at PS4 CPU performance, take some Jaguar benchmarks (I demoed the quad-core 1.4 GHz A6-1450 subnotebook in another thread; you can also find results on the internet) and multiply them by about three.

What benchmarks? The 8-core Jaguar in the PS4 will not be running Windows, benchmarking who knows what. It will most likely be running code designed strictly to unleash every single FLOP available in this, and only this, chip.
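
(For a flavor of what "unleash every single FLOP" means in practice, here's a generic peak-FLOPS microkernel sketch, not actual PS4 code: independent add and multiply chains on 128-bit vectors, targeting the commonly reported Jaguar FPU layout of one FP add pipe plus one FP multiply pipe, with no FMA.)

```cpp
#include <immintrin.h>
#include <cstdio>

// Generic peak-FLOPS microkernel sketch (not PS4 code). Independent
// accumulator chains hide instruction latency; pairing adds with
// multiplies targets a Jaguar-style FPU with separate add/mul pipes.
int main() {
    __m128 a0 = _mm_set1_ps(0.0f), a1 = _mm_set1_ps(1.0f);
    __m128 m0 = _mm_set1_ps(1.0f), m1 = _mm_set1_ps(1.0f);
    const __m128 step = _mm_set1_ps(1e-7f);
    const __m128 gain = _mm_set1_ps(1.0000001f);

    for (long i = 0; i < 100000000L; ++i) {
        // 2 vector adds + 2 vector muls = 16 SP FLOPs per iteration.
        a0 = _mm_add_ps(a0, step);
        a1 = _mm_add_ps(a1, step);
        m0 = _mm_mul_ps(m0, gain);
        m1 = _mm_mul_ps(m1, gain);
    }

    // Fold the accumulators so the compiler can't discard the loop.
    float out[4];
    _mm_storeu_ps(out, _mm_add_ps(_mm_add_ps(a0, a1), _mm_add_ps(m0, m1)));
    std::printf("checksum: %f\n", out[0]);
    return 0;
}
```

Wrap the loop in a timer and divide the 1.6e9 FLOPs executed by the elapsed time to estimate achieved GFLOPS per core; generic compiled code comes nowhere near this.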
 

Enigmoid

Platinum Member
Sep 27, 2012
What benchmarks? The 8-core Jaguar in the PS4 will not be running Windows, benchmarking who knows what. It will most likely be running code designed strictly to unleash every single FLOP available in this, and only this, chip.

Yes, but so what? Theoretical FLOPS do not equal practical FLOPS, which do not equal real-world FLOPS.

The Xbox 360 and PS3 were not able to use more than a fraction of their theoretical power.
 

Erenhardt

Diamond Member
Dec 1, 2012
Yes, but so what? Theoretical FLOPS do not equal practical FLOPS, which do not equal real-world FLOPS.

The Xbox 360 and PS3 were not able to use more than a fraction of their theoretical power.

So comparing PC CPU benchmarks is as pointless as comparing theoretical FLOPS.
 

Lavans

Member
Sep 21, 2010
What resolution is the PS3 rendering those games at? If other purdy games rendering at sub-720p resolutions are any indication, those probably are as well.

Nothing has been confirmed on Beyond Two Souls or Metal Gear, but it has been confirmed that The Last of Us will run natively at 720p.

Also, the PS3 version of Mass Effect 3 has been confirmed to go up to 720p, while the Xbox 360 version goes up to 1080p. Then again, an Nvidia 7900-series GPU struggles to hit 30 fps at 1280x1024 in Mass Effect 1.
 

Cerb

Elite Member
Aug 26, 2000
So comparing PC CPU benchmarks is as pointless as comparing theoretical FLOPS.
This is why we have things like SPEC and TPC, which could actually be run on the PS4 if they allow an Other OS option (if they don't lose money on the hardware, I don't see why they wouldn't). Everybody knows everybody cheats a bit, but they also know how to interpret the results anyway.
 

-Slacker-

Golden Member
Feb 24, 2010
Nothing has been confirmed on Beyond Two Souls or Metal Gear, but it has been confirmed that The Last of Us will run natively at 720p.

Also, the PS3 version of Mass Effect 3 has been confirmed to go up to 720p, while the Xbox 360 version goes up to 1080p. Then again, an Nvidia 7900-series GPU struggles to hit 30 fps at 1280x1024 in Mass Effect 1.

I'd love to see a decent source for that claim that the 360 is running ME3 at 1920x1080, and I don't mean stuff like this:
http://www.lensoftruth.com/head2head-mass-effect-3-analysis/


It should be straight from the horse's mouth.
 

Lavans

Member
Sep 21, 2010
I'd love to see a decent source for that claim that the 360 is running ME3 at 1920x1080, and I don't mean stuff like this:
http://www.lensoftruth.com/head2head-mass-effect-3-analysis/


It should be straight from the horse's mouth.

[Attached image: 44077-mass-effect-3-old-full.jpg]
 

Enigmoid

Platinum Member
Sep 27, 2012
Nothing has been confirmed on Beyond Two Souls or Metal Gear, but it has been confirmed that The Last of Us will run natively at 720p.

Also, the PS3 version of Mass Effect 3 has been confirmed to go up to 720p, while the Xbox 360 version goes up to 1080p. Then again, an Nvidia 7900-series GPU struggles to hit 30 fps at 1280x1024 in Mass Effect 1.

A lot of the PC vs. console arguments forget that modern games are not optimized for really old GPUs, and there are no drivers for those really old GPUs that let them really play the game. Pretty much any AAA game sees a massive gain when Nvidia and AMD release drivers for it.

A 6450 can get 30 fps at 720p in ME3.
 

Lavans

Member
Sep 21, 2010
A lot of the PC vs. console arguments forget that modern games are not optimized for really old GPUs, and there are no drivers for those really old GPUs that let them really play the game. Pretty much any AAA game sees a massive gain when Nvidia and AMD release drivers for it.

A 6450 can get 30 fps at 720p in ME3.

Mass Effect 1 was released in 2008, when the 7xxx series was mainstream and common in most gaming computers. Considering that a 7900 GT couldn't even break a 20 fps average in Tom's review of the card, I sincerely doubt it would fare any better in Mass Effect 3.

http://www.tomshardware.com/charts/gaming-graphics-charts-2008-q3/compare,771.html?prod[2096]=on
 

galego

Golden Member
Apr 10, 2013
If you run Linpack/Intel Burn Test you will get similar numbers.

Yes. Linpack for the 3770K gave something like 92% of the theoretical value of 112 GFLOPS given by Intel.
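
(For reference, the 112 GFLOPS figure falls out of the usual peak formula; a quick reconstruction of the arithmetic, including the ~92% Linpack efficiency quoted above:)

```cpp
#include <cstdio>

int main() {
    // i7-3770K: 4 cores at 3.5 GHz. With AVX, each core can issue one
    // 256-bit (4-wide) DP add plus one 256-bit DP multiply per cycle,
    // i.e. 8 double-precision FLOPs per cycle.
    double peak    = 4 * 3.5 * 8;   // = 112 GFLOPS theoretical
    double linpack = 0.92 * peak;   // ~92% efficiency -> ~103 GFLOPS
    std::printf("peak: %.0f GFLOPS, Linpack: ~%.0f GFLOPS\n", peak, linpack);
    return 0;
}
```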

GFLOPS is a horrible way to evaluate CPUs.

If you are going to look at PS4 CPU performance, take some Jaguar benchmarks (I demoed the quad-core 1.4 GHz A6-1450 subnotebook in another thread; you can also find results on the internet) and multiply them by about three.

GFLOPS can be useful or not, like any other index/score. I have not used only GFLOPS; I have complemented them with a pair of benchmarks of Jaguar cores given earlier by two other posters. One of them wrote, regarding the Windows benchmarks:

8 Jaguar cores will match a quad-core Sandy Bridge with HT at the same clocks.

All that data together, plus the additional fact that Epic has selected an i7, plus the fact that the console is not running a bloated OS such as Windows 7/8, leads to the conclusion that the CPU in the console competes with an i7 CPU in a PC running Windows 7/8.
 

Accord99

Platinum Member
Jul 2, 2001
8 Jaguar cores will match a quad-core Sandy Bridge with HT at the same clocks.
At the same clock. That also means 8 Jaguar cores will (AT BEST) match a typical dual-core desktop i3 when there are 8 CPU-intensive applications, and at worst will be 1/4 of the speed of the i3 when there are only 2 intensive threads.
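
(The arithmetic behind that range, as a rough sketch: it takes the quoted "8 Jaguar ≈ quad-core Sandy Bridge with HT at equal clocks" at face value, and the clocks are typical retail values, not official specs:)

```cpp
#include <cstdio>

int main() {
    // Units: "Sandy Bridge core-GHz" of throughput. Taking the quoted
    // equivalence at face value, one Jaguar core ~ half an SNB core
    // at the same clock. Clocks below are typical, not official.
    const double jaguar_core = 0.5 * 1.6; // 8 cores at ~1.6 GHz (rumored)
    const double i3_core     = 1.0 * 3.3; // dual-core i3 at ~3.3 GHz

    // 8 busy threads: every core loaded on both chips -> roughly even.
    std::printf("8 threads: %.1f vs %.1f\n", 8 * jaguar_core, 2 * i3_core);
    // 2 busy threads: only two cores loaded on each chip -> ~1/4.
    std::printf("2 threads: %.1f vs %.1f\n", 2 * jaguar_core, 2 * i3_core);
    return 0;
}
```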
 

raghu78

Diamond Member
Aug 23, 2012
At the same clock. That also means 8 Jaguar cores will (AT BEST) match a typical dual-core desktop i3 when there are 8 CPU-intensive applications, and at worst will be 1/4 of the speed of the i3 when there are only 2 intensive threads.

"At the same clock" matters because the PS4 SoC has a limited power budget (say 100 W), of which the GPU gets the majority share (75 W). People keep forgetting that a desktop Core i3 has a much higher CPU power budget. So for a given power and die-size constraint, Jaguar provides excellent multithreaded performance. Also, the PS4 SoC is architecturally superior to a PC with a CPU and a dGPU. But the raw performance of a high-end PC can never be matched by a PS4.

Here is a presentation by Guerrilla Games which talks about wide multithreading being important for getting the best performance out of the PS4:

http://www.slideshare.net/guerrillagames/killzone-shadow-fall-demo-postmortem
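
(For anyone curious what "going wide" looks like in code, a minimal generic job-system sketch: workers draining a shared queue. This is an illustration of the pattern, not code from the presentation:)

```cpp
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal "wide" job system: split frame work into many small jobs
// that any core can pick up. Generic illustration of the pattern.
class JobSystem {
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    std::vector<std::thread> workers_;
    bool done_ = false;

public:
    explicit JobSystem(unsigned n) {
        for (unsigned i = 0; i < n; ++i)
            workers_.emplace_back([this] {
                for (;;) {
                    std::function<void()> job;
                    {
                        std::unique_lock<std::mutex> lk(m_);
                        cv_.wait(lk, [&] { return done_ || !jobs_.empty(); });
                        if (jobs_.empty()) return; // done_ set and drained
                        job = std::move(jobs_.front());
                        jobs_.pop();
                    }
                    job(); // run outside the lock so other workers proceed
                }
            });
    }
    void submit(std::function<void()> job) {
        { std::lock_guard<std::mutex> lk(m_); jobs_.push(std::move(job)); }
        cv_.notify_one();
    }
    ~JobSystem() { // drain the remaining jobs, then stop the workers
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_all();
        for (auto& w : workers_) w.join();
    }
};

int main() {
    JobSystem js(8); // e.g. one worker per console CPU core
    for (int i = 0; i < 64; ++i)
        js.submit([i] { std::printf("job %d\n", i); });
} // JobSystem destructor joins the workers
```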
 

Lavans

Member
Sep 21, 2010
On Ultra quality... At a higher res than the consoles use.

:rolleyes:

Ultra, yes. Ultra being 0xAA, 0xAF, dynamic shadows likely enabled, and textures at high. That's pretty much what consoles use.

Also, do you sincerely believe that dropping the res from 1280x1024 to 1280x720 would enable the 7900 GT to run ME1 at a 30 fps average? Because I don't.
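
(The pixel math backs that up. Assuming the game were purely pixel-bound, which is the best possible case, the frame-rate gain is just the ratio of the pixel counts:)

```cpp
#include <cstdio>

int main() {
    // Best case: frame rate scales inversely with pixels rendered
    // (only true if the GPU is purely pixel-bound).
    double speedup = (1280.0 * 1024) / (1280.0 * 720); // ~1.42x
    std::printf("max speedup: %.2fx\n", speedup);
    std::printf("a 20 fps average becomes at most ~%.0f fps\n",
                20 * speedup); // ~28 fps, still short of 30
    return 0;
}
```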
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
The 8-core CPU in the PS4 is AMD doing itself a huge favor. It will 'force' game/engine developers to put more weight on multithreading their games/engines and on optimization for AMD CPUs. That should be reflected in desktop parts' performance. Who knows, we might see the FX-8150 matching the i7-2700K like it did in Crysis 3.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Ultra, yes. Ultra being 0xAA, 0xAF, dynamic shadows likely enabled, and textures at high. That's pretty much what consoles use.

Also, do you sincerely believe that dropping the res from 1280x1024 to 1280x720 would enable the 7900 GT to run ME1 at a 30 fps average? Because I don't.


Ultra being ultra: it says Trilinear, not 0xAF...

My belief is the PS3 had Cell to help it out, while the 360 had a better GPU than the 7900 GT.

My belief is that if you compared the game on consoles to the PC version on a 7900 GT, you'd be hard-pressed to see any real difference.

My belief is that if you move up one generation on the PC, you get noticeably better picture quality, better AA, almost no performance loss from 16x AF, and a higher resolution.
 

Spjut

Senior member
Apr 9, 2011
Assassin's Creed was console-exclusive for some months. The PC port was pretty punishing on DX9-era GPUs.
http://www.pcgameshardware.de/Assas...erie/797259/#?a_id=637474&g_id=-1&i_id=797260

I'd like to see some X1950 Pro and 7900 GT benches for more modern games, like BC2 and FC2 for example, to see how they stack up against the consoles. Nvidia did keep their DX9 cards supported in the main drivers up until some months ago.


In terms of console vs. PC efficiency, I don't think it's "fair" to use early versions of UE3 for comparison. UE3 was a PC engine at first, and apparently it didn't even use the eDRAM for "free" AA at first on the 360.
 

Olikan

Platinum Member
Sep 23, 2011
The 8-core CPU in the PS4 is AMD doing itself a huge favor. It will 'force' game/engine developers to put more weight on multithreading their games/engines and on optimization for AMD CPUs. That should be reflected in desktop parts' performance. Who knows, we might see the FX-8150 matching the i7-2700K like it did in Crysis 3.

Errr... today's consoles use multithreading as well o_O

I have to say... at the SemiAccurate forums there is a really good discussion suggesting that the next Xbox might use a Volcanic Islands-based GPU.
 