How the PlayStation 4 is better than a PC


Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
As funny as I find this, I can't help but worry that someone, somewhere is taking this seriously.

Hahah, yeah, this is ludicrous.

PS4 is going to be a great console.

It's still not going to be appreciably better than a pretty midrangish PC at launch. And totally outclassed by high end PCs.

Each will have their own major major pluses though.

First-party Gran Turismo, Uncharted, etc. will be great on PS4. Not to mention sports games, if you like them; PC gamers get boned on that front.

For PC, you can have REAL FPS, RTS, RPGs, mods, etc, etc, too many plusses to count.

If I had to choose one, it'd be PC and I wouldn't even think twice. Hell I'd choose a PC even if I were locked into mediocre hardware like 660ti/8gb/i5. But it'll be nice to do both, as long as PS4 doesn't go full retard with the DRM like Xbone is apparently doing.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Do you have hard latency numbers?

Until you attach the memory to that CPU, we don't know the latency numbers. I thought I could measure it with SiSoft Sandra, but that measures CPU-to-GPU memory latency.

So instead I went digging through the datasheets from Hynix and the like, and found the following for 6.0 Gbps operation (with data bus inversion off; it likely needs to be on, and then it's normally +1 on these figures):

GDDR5 timings as provided by the Hynix datasheet:
CAS latency = 10.6 ns
tRCD = 12 ns
tRP = 12 ns
tRAS = 28 ns
tRC = 40 ns


DDR3 timings for some Corsair 2133 RAM (11-11-11-28):
CAS = 10.3 ns
tRCD = 10.3 ns
tRP = 10.3 ns
tRAS = 26.2 ns


So it's up to about 16% higher latency on the raw access timings, in the measures people are used to from DDR3. The complication is that the command rate runs at half the data frequency, so I doubt it translates this directly at all.

In converted timings it's impressively high, because the command clock achieved is actually 1500 MHz, so the GDDR5 runs something like 16-18-18-42.
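If anyone wants to sanity-check the conversion, here's a quick back-of-envelope sketch. The command clocks are just the rough assumptions above (~1500 MHz for 6.0 Gbps GDDR5, ~1066 MHz for DDR3-2133), not datasheet-exact values:

```python
# Rough conversion between memory timings in nanoseconds and clock cycles.
# Assumed command clocks: ~1500 MHz for 6.0 Gbps GDDR5, ~1066 MHz for DDR3-2133.

def ns_to_cycles(ns, clock_mhz):
    """Timing in clock cycles for a latency given in nanoseconds."""
    return ns * clock_mhz / 1000.0

def cycles_to_ns(cycles, clock_mhz):
    """Latency in nanoseconds for a timing given in clock cycles."""
    return cycles / clock_mhz * 1000.0

# GDDR5 datasheet-style timings (ns) expressed as cycles at ~1500 MHz
for name, ns in [("CAS", 10.6), ("tRCD", 12.0), ("tRP", 12.0), ("tRAS", 28.0)]:
    print(f"GDDR5 {name}: {ns:5.1f} ns ~= {ns_to_cycles(ns, 1500):.0f} cycles")

# DDR3-2133 at 11-11-11-28 expressed in ns at ~1066 MHz
for name, cyc in [("CAS", 11), ("tRCD", 11), ("tRP", 11), ("tRAS", 28)]:
    print(f"DDR3  {name}: {cyc:5d} cycles ~= {cycles_to_ns(cyc, 1066):.1f} ns")
```

That gives roughly 16-18-18-42 for the GDDR5 and ~10.3/26.3 ns for the DDR3, which is where the numbers above come from.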

That is enough of a change in timing to have an impact. Someone with an AMD Llano/Jaguar CPU could tell us roughly how much impact it will have. I'll ask in the CPU forum.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Bob Feldstein, who previously worked for AMD, has revealed the valuation of the Xbox One deal between AMD and Microsoft.

He is currently the VP of Technology Licensing at Nvidia, and he says that the project is valued at over $3 billion. The amount should probably include all the expenses like design, and should also cover the costs of parts supplied over the course of the system's lifecycle.

In retrospect, Sony had spent $400 million on the Cell processor which was a joint project with Toshiba, so this has to be a multi-year deal with Microsoft.

On his LinkedIn account (membership required), he states that his "involvement was focused on business management and supply agreement negotiations."

He also adds that, "AMD provided a custom silicon solution to Microsoft for the Xbox One, a game console and entertainment device.

"This required the coordination of multiple functional teams within AMD, as well as regular customer meetings with leadership teams responsible for handling the challenges of complex, multi-year deals. This project is valued at $3+B."

Sony and Nintendo have also used AMD parts in their consoles. Based on the Xbox One estimate, it looks like Sony and AMD may have struck a similar deal, and Sony uses a significantly more powerful GPU in the PS4 compared to what's in the Xbox One, so it's hard to say exactly how much the deal must have cost.

One thing is clear though: AMD will make a mark in the next console generation, and they should be pleased with that.
http://www.gamechup.com/amd-xbox-one-project-cost/
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
Thanks for the numbers!

Honestly, it's a lot lower latency than I expected. I'm not sure if 16% is going to be noticeable, especially compared to desktop Jaguar, which is probably a bit bandwidth-starved with its single-channel memory controller. We also have no idea what memory chips the PS4 will use and how they'll play with the clocks and timings, and with GDDR5 it seems the memory controller is of at least equal importance.

Latency will obviously be a lot higher than the Xbone with its eSRAM, but I doubt that alone will make up for the Xbone's other failings.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
At least the last generation of consoles were comparable. Right now the PS4 is the undisputed performance leader; whereas multi-platform games usually looked/ran better on the Xbox 360, now it's going to be the other way around. Most games will be designed to be playable on the lowest common denominator, which is the Xbox One, so multi-platform games are just going to stutter less or look better on PS4. I'm curious what the real performance difference between the two is.

The Xbox One is clearly designed to avoid another RROD. MS must have lost a huge amount of money on that if they outright decided not to compete with Sony on specs and instead built a console that draws as little power as possible while still delivering marginally acceptable performance, just enough to make multi-platform games feasible. Does anyone know how much power 8GB GDDR5 draws as compared to DDR3?
 

videoclone

Golden Member
Jun 5, 2003
1,465
0
0
Does anyone know how much power 8GB GDDR5 draws as compared to DDR3?

GDDR5 uses a lower base voltage than DDR3, but it draws around 25% more power :) RAM in general doesn't use much power compared to the CPU/GPU/HDD etc.! Also, the first GDDR5 on the 56nm node wasn't as efficient as the 46nm, 40nm and 28nm GDDR5 that's currently used on the HD 7970 GE and will be in the PS4; even though the base clock has risen to 6000 MHz+, it uses less than half the power of 56nm GDDR5.

DDR3 is in the same boat, and the current Samsung 20nm-class Green DDR3 RAM is just as efficient as the upcoming 28nm DDR4.

The consoles will shrink the RAM further over the lifetime of the machine, so power usage shouldn't be an issue.

Performance, on the other hand, IS an issue, and sadly the XB1 is lacking in the memory department, which will hurt its performance as the machine ages. DDR3 is at the end of its life, while GDDR5 should be around for many more years to come.
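To put some rough numbers on that memory-department gap, here's a quick bandwidth sketch using the commonly quoted bus widths and data rates (256-bit GDDR5 at ~5500 MT/s for the PS4, 256-bit DDR3-2133 for the XB1); treat the inputs as ballpark figures rather than official specs:

```python
# Peak memory bandwidth = data rate (MT/s) * bus width (bits) / 8, in bytes/s.
# Inputs are the widely reported figures, not official spec-sheet values.

def peak_bandwidth_gbs(data_rate_mts, bus_width_bits):
    """Peak bandwidth in GB/s for a given transfer rate and bus width."""
    return data_rate_mts * 1e6 * bus_width_bits / 8 / 1e9

ps4_gddr5 = peak_bandwidth_gbs(5500, 256)   # ~176 GB/s
xb1_ddr3  = peak_bandwidth_gbs(2133, 256)   # ~68 GB/s

print(f"PS4 GDDR5: {ps4_gddr5:.0f} GB/s")
print(f"XB1 DDR3:  {xb1_ddr3:.0f} GB/s")
print(f"Ratio:     {ps4_gddr5 / xb1_ddr3:.1f}x")
```

(This ignores the XB1's 32 MB of eSRAM, which is what the earlier latency comments were about.)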
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
This seems worth discussing: another performance claim.
Microsoft: Cloud Will Quadruple Power of Xbox One
http://www.tomshardware.com/news/Xbox-One-Cloud-Jeff-Henshaw-Matt-Booty-Adam-Pollington,22775.html

http://www.oxm.co.uk/54748/xbox-one...e-equivalent-of-three-xbox-ones-in-the-cloud/


The mention of larger levels could refer to storage vs. the initial download or disc, but there also seems to be a claim of processing power from the 'cloud'.

Are they hinting at some kind of server/client setup where physics or other game processes are done remotely?
If you look to the cloud as something that is no doubt going to evolve and grow over time, it really spells out that there's no limit to where the processing power of Xbox One can go.


From the Tom's Hardware link:
So what exactly will the Xbox One do with all that cloud processing power and storage? General Manager of Redmond Game Studios and Platforms Matt Booty gave Ars Technica a scenario, describing a scene where the user is moving through a rugged terrain shrouded by volumetric fog.
"Let’s say you’re looking at a forest scene and you need to calculate the light coming through the trees, or you’re going through a battlefield and have very dense volumetric fog that’s hugging the terrain," he said. "Those things often involve some complicated up-front calculations when you enter that world, but they don’t necessarily have to be updated every frame. Those are perfect candidates for the console to offload that to the cloud—the cloud can do the heavy lifting, because you’ve got the ability to throw multiple devices at the problem in the cloud."
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I posted this info way back when... but you were more interested in flame wars and spam...
Does anyone know if this is even possible? There is OnLive, but they stream the whole display in some kind of video format. Given the pretty low internet bandwidth in comparison to other hardware bandwidths, it seems like a problem.

How about streaming just the good-looking stuff?
http://hexus.net/gaming/news/hardware/55721-xbox-one-uses-cloud-render-latency-insensitive-graphics/
But knowing M$ it will most likely be a paid service...
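For a feel of why bandwidth is the sticking point, here's a back-of-envelope comparison between an uncompressed 1080p60 stream, a typical home connection, and the PS4's local memory bandwidth. The broadband figure is an assumption for illustration:

```python
# Bandwidth back-of-envelope: streamed pixels vs. internet vs. local memory.

frame_bytes   = 1920 * 1080 * 3        # one uncompressed 24-bit 1080p frame
stream_bps    = frame_bytes * 60       # bytes/s for 60 fps, no compression
internet_bps  = 20e6 / 8               # assumed ~20 Mbit/s broadband, bytes/s
gddr5_bps     = 176e9                  # PS4's quoted ~176 GB/s, bytes/s

print(f"uncompressed 1080p60: {stream_bps / 1e6:.0f} MB/s")
print(f"typical broadband:    {internet_bps / 1e6:.1f} MB/s "
      f"({stream_bps / internet_bps:.0f}x too small)")
print(f"local GDDR5:          {gddr5_bps / internet_bps:.0f}x broadband")
```

Which is why OnLive leans on heavy video compression, and why anything sent from the cloud has to be small and latency-insensitive rather than raw frame data.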
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Sounds idiotic, to be perfectly honest.
Remember: almost every gaming use of "the Cloud" that isn't backup of saves, and that doesn't apply to an MMO, is about planned obsolescence and eroding the used-game market.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Did you even bother to read the article that Rajat Taneja wrote? Or did you just find the article where the author put it in the best possible light? Rajat never wrote anything about "five years." That was the words of the author of the summary piece you quoted.

Even 2 Titans is ridiculous. Please explain to me, other than the GDDR5, what is so supercharged about the PS4. And don't say HSA, just another marketing term by AMD, or hUMA; you do realize that they will be releasing APUs for the PC market that have this too. As has been pointed out to you many, many times before, the CPU is targeted at tablets and other small form factor devices. If it was anywhere near as powerful as an FX or i5 or i7, they would have already released it for high-end systems, where there is a lot more margin than in consoles.

What do first party titles have to do with how powerful or efficient a machine is? If this was the case, I could argue that since neither the PS4 or XBox One have Mario, then the Wii U must be more powerful than them. In fact since you can't play Mario on the Department of Energy's Titan supercomputer, then the Wii U must be more powerful than the most powerful supercomputer in the world.

From the link "According to EA CTO Rajat Teneja, Sony and Microsoft have systems so magically powerful, they're five years ahead of the most extravagant PCs."

Five years ahead is what he meant by a generation ahead. Tom's Hardware and others drew the same conclusion, because a generation is five years or so. The point here is that he said the PS4 is ahead of a high-end gaming PC of 2018, and he was completely wrong, as Mark Rein noted.

The supercharged architecture was discussed before.

You continue repeating the mistake of comparing a CPU in a console to a CPU in a PC. Nobody here is saying that the PS4's CPU, put in a PC, would be faster than an i7 or an FX-8350.

Again, read what was said: a first-party game can be optimized for the console hardware, instead of only targeting the lowest common denominator shared with PCs.

More FUD.
Sorry for you, but when the PS3 launched, I had a...get ready for it:
A P4 2.4GHz with RDRAM...it got an AGP Gainward Bliss 7800GS+

As you can see here:
http://www.theinquirer.net/inquirer/news/1028220/gainward-agp-wolf-sheep-clothing

Not only did it perform better than the PS3...it also had better I.Q.

I wasn't limited to upscaled 720P with no AA, no AF, subpar shadowmaps and the rest of the crappy "features" of consoles.

And I am sorry to disappoint you...but I actually think my rig consumes less power today...than when I first built it.

It was an i7 920 (45nm) + GTX 285 (55nm) rig with a slight OC.
It's now an i7 990X (32nm) + Titan (28nm) with no OC.

I bet you that I use less power...but have higher performance.
And thus you fail once again.

What are your PC specs again?

This was discussed before. You fail to understand the difference between the PC and console ecosystems. Nobody is going to optimize the games for the console from day one. That will be done later. The first games will look like they came from a PC about 2x more powerful, or so. Only later games will require much more powerful PCs.

Moreover, your link is to a GPU with the same VRAM as the full memory (RAM+VRAM) of the PS3 or Xbox 360. Try to play Crysis 2 on a PC with 256 MB RAM + 256 MB VRAM. Let us know if you can even load it...

Look buddy, I linked the original post by the EA guy on LinkedIn. Where does it say five years? Nowhere.
You're not only twisting and turning things out of context now; you're plainly making things up and lying to further your delusions. Just because you put something between quote tags doesn't make it a reality. And just because the truth doesn't fit your agenda doesn't make it any less true.

What does it matter what a GTX 680 got in some demo? Get another one, or wait till 20nm next year. Ta-dah, your performance just doubled. Cause you know, that's how it goes in PC tech.

Well, have fun with your PS4 that you probably won't even buy.
About the EA guy stuff see above.

As I said before, the PC ecosystem is driven by hardware upgrades. The console ecosystem is driven by optimization for fixed hardware (over its lifetime, of course).
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
This was discussed before. You fail to understand the difference between the PC and console ecosystems. Nobody is going to optimize the games for the console from day one. That will be done later. The first games will look like they came from a PC about 2x more powerful, or so. Only later games will require much more powerful PCs.

So it would be a waste buying games for the new consoles for the first 2-3 years :whistle:

Moreover, your link is to a GPU with the same VRAM as the full memory (RAM+VRAM) of the PS3 or Xbox 360. Try to play Crysis 2 on a PC with 256 MB RAM + 256 MB VRAM. Let us know if you can even load it...
Ah...the fallacy of moving the goalposts.
Too bad for you I don't speak "FUD".
Case:
My PC was faster than the PS3 at launch and played titles with better performance/I.Q. than the PS3.

Case:
My PC is faster than the PS4 and will play games with better performance/I.Q. than the PS4...at launch...and until the PS4 goes EOL.

None of your babble will change that.

But let's look again at what you wrote:

Moreover, your link is to a GPU with the same VRAM as the full memory (RAM+VRAM) of the PS3 or Xbox 360. Try to play Crysis 2 on a PC with 256 MB RAM + 256 MB VRAM. Let us know if you can even load it...
You think Crysis 2 on a console is the same as on a PC?
http://www.youtube.com/watch?v=2WJG14uLA3k

The ignorance is strong with this one...

What are your machine specs again?
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Do you have hard latency numbers?

They are available here on AT. Search for them.

Note: some people compare a GPU's DDR vs. GDDR. This is incorrect; GPU designers are not worried about latency and use cheap memory controllers in GPUs. The PS4 APU will use a performance GDDR5 memory controller. Latency will not be an issue, as developers have said the PS4 has no bottlenecks, unlike PCs.


Right from the article

With the fixed platform and low-level access you can optimize heavily. Fixed platform and low-level access do nothing unless you actually use the fact that they are available and optimize for them.

I don't get it, you toss out 2 titans and then quote sources which say only 2x. :confused:

I already gave that quote, and it shows that he did not mention "heavy optimization" like you did/do.

Moreover, he says at least 2x (as do other developers). 2x is only the API overhead. The Titan comparison was not based on API overhead alone.

Not to mention the topic was the anemic tablet cpu, 2x slow is slow x 2.

Benchmarks show that the Jaguar CPU matches an i7 (with HT) at the same clock. 1.6 GHz x 2 = 3.2 GHz, and this ignores performance gains on the PS4 APU, such as HSA, which are not available to the i7.

Unsurprisingly, Epic and other developers selected an i7 in their PS4-PC comparison. Now that you are educated, you can ignore it and continue trolling against the PS4.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Exactly, because everyone knew it's a meaningless metric. No one would be talking about it now either, except you keep bringing it up.

Thanks for confirming the double standards in this thread. When they are used by posters or by Nvidia against the PS4, nobody is worried. When they are put in the right context and used to compare performance with a 680 or a Titan, then they are meaningless...

The fact is that they are meaningless if you don't know how to interpret them, which is the case, as has been explained to you many times before.


So it would be a waste buying games for the new consoles for the first 2-3 years :whistle:

Ah...the fallacy of moving the goalposts.
Too bad for you I don't speak "FUD".
Case:
My PC was faster than the PS3 at launch and played titles with better performance/I.Q. than the PS3.

Case:
My PC is faster than the PS4 and will play games with better performance/I.Q. than the PS4...at launch...and until the PS4 goes EOL.

None of your babble will change that.

But let's look again at what you wrote:

You think Crysis 2 on a console is the same as on a PC?
http://www.youtube.com/watch?v=2WJG14uLA3k

The ignorance is strong with this one...

What are your machine specs again?

Sorry, but ignoring the points made and repeating the same mistakes is not going to change anything. About your "it would be a waste buying games for the new consoles for the first 2-3 years" I can only say... LOL
 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
The first games will look like they came from a PC about 2x more powerful, or so. Only later games will require much more powerful PCs.

Please cut it out with this crap. You haven't the slightest idea how much more powerful a PC needs to be than a console to play a port of that console's games; you're making things up.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Please cut it out with this crap. You haven't the slightest idea how much more powerful a PC needs to be than a console to play a port of that console's games; you're making things up.

I have backed my claims with quotes from three or four developers. You are only denying...
 

showb1z

Senior member
Dec 30, 2010
462
53
91
From the link "According to EA CTO Rajat Teneja, Sony and Microsoft have systems so magically powerful, they're five years ahead of the most extravagant PCs."

Five years ahead is what he meant by a generation ahead. Tom's Hardware and others drew the same conclusion, because a generation is five years or so. The point here is that he said the PS4 is ahead of a high-end gaming PC of 2018, and he was completely wrong, as Mark Rein noted.

Uh-huh, let's try to make this clear again:
Rajat Taneja never said what he meant by a generation. Give me a direct quote from the man himself where he says he meant five years.
Unlike you, I'll admit I was wrong and won't try to BS my way out of it.
Here, I'll link the original again:
http://www.linkedin.com/today/post/article/20130522214715-10904058-the-technology-behind-xbox-one
And here's the post Mark Rein responded to on Twitter (nothing about five years there either):
http://www.develop-online.net/news/44289/EA-Xbox-One-and-PS4-a-generation-ahead-of-PC

galego said:
a generation is five years or so
Really? In what? Surely not in PC technology, where Moore's law is still upheld. What defines a generation anyway?
Was Titan the next generation? Is Haswell a new generation? Will 20nm GPUs be the next generation? DDR4? Are C2Ds only one generation old? Nope, that's just nonsense to support your own case.
Might as well say he meant a human generation; maybe he said the PS4 is ahead of PCs from 2040.
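For scale, here is what five years means at the Moore's-law cadence mentioned above; the ~2-year doubling period is the usual rule of thumb, not a guarantee:

```python
# Rough scale of "five years ahead" if transistor budgets double every ~2 years.
doubling_period_years = 2.0
years_ahead = 5.0

improvement = 2 ** (years_ahead / doubling_period_years)
print(f"{years_ahead:.0f} years at one doubling per {doubling_period_years:.0f} years "
      f"~= {improvement:.1f}x the transistor budget")  # ~5.7x
```

That is the size of the hardware gap the "five years ahead" reading would imply over 2013 high-end PCs.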

All in all this is completely unimportant, but it's a nice showcase of the lengths you go to defend your ridiculous claims.
All this nonsense to try and weasel your way out of acknowledging Mark Rein's quote.
Just drop the subject already.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Benchmarks show that the Jaguar CPU matches an i7 (with HT) at the same clock. 1.6 GHz x 2 = 3.2 GHz, and this ignores performance gains on the PS4 APU, such as HSA, which are not available to the i7.

Unsurprisingly, Epic and other developers selected an i7 in their PS4-PC comparison. Now that you are educated, you can ignore it and continue trolling against the PS4.

So let's just pretend that makes sense.

Current stock Ivy Bridge CPUs are already equal to the console's Jaguar CPU.

Amazingly though, people 'round these here parts overclock. So we've already got quad-core CPUs more powerful than what that overhead advantage lets the Jaguar CPU reach over a PC. Let's not forget we've had CPUs faster than Ivy i7s since the 1366 hex-cores... :whistle:

Unsurprisingly, you'd presume they needed an i7, when in reality, given the type of demo it was, there was probably about as much CPU load as a Heaven benchmark, meaning an i3 clocked the same would most likely have produced the same results :hmm:
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I already gave that quote, and it shows that he did not mention "heavy optimization" like you did/do.

Moreover, he says at least 2x (as do other developers). 2x is only the API overhead. The Titan comparison was not based on API overhead alone.

Benchmarks show that the Jaguar CPU matches an i7 (with HT) at the same clock. 1.6 GHz x 2 = 3.2 GHz, and this ignores performance gains on the PS4 APU, such as HSA, which are not available to the i7.

Unsurprisingly, Epic and other developers selected an i7 in their PS4-PC comparison. Now that you are educated, you can ignore it and continue trolling against the PS4.

No, he implied heavy optimization. Without any degree of optimization you are not going to see any improvements.

Jaguar doesn't come close to matching an i7 at the same clock. At best it matches an i3 (which is slightly better, but that's at 1.8 GHz vs. the 1.5 GHz of the Jaguar chips being compared).
 

Rebel_L

Senior member
Nov 9, 2009
454
63
91
as developers have said the PS4 has no bottlenecks, unlike PCs.

It looks like the PS4's bottleneck, at the very least, is overall processing power.


I already gave that quote, and it shows that he did not mention "heavy optimization" like you did/do.

Moreover, he says at least 2x (as do other developers). 2x is only the API overhead. The Titan comparison was not based on API overhead alone.


Are you really trying to say that 50%+ of the work a graphics card does in a PC is related to API waste? For you to use a 2x multiplier in your math, that has to be the case. I thought the API overhead on a console could be eliminated so that there is almost no waste, say down to 1% of total resources; then your PC, with 10x the API waste, would sit at say 10%... That still gives the PC 90% of its specs to compete with the console, not half like your math always implies.
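To make that arithmetic concrete, here is a small sketch. The raw throughput numbers are the commonly quoted ones (PS4 ~1.84 TFLOPS, a Titan-class card ~4.5 TFLOPS) and the overhead percentages are purely illustrative assumptions:

```python
# How much raw GPU throughput is left after API/driver overhead, for an
# assumed overhead fraction. Inputs are illustrative, not measurements.

def effective(raw_tflops, overhead_fraction):
    """Throughput remaining once a fraction is lost to API/driver overhead."""
    return raw_tflops * (1.0 - overhead_fraction)

console_raw, pc_raw = 1.84, 4.5   # TFLOPS: PS4 vs. a Titan-class card

# For "2x from removing API overhead" to hold, the PC would have to waste
# ~50% of its GPU on overhead; ~10% is the more plausible figure argued above.
for pc_overhead in (0.50, 0.10):
    print(f"PC overhead {pc_overhead:.0%}: "
          f"PC {effective(pc_raw, pc_overhead):.2f} TFLOPS usable vs. "
          f"console {effective(console_raw, 0.01):.2f} TFLOPS usable")
```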
 