The PS3 is like an onion

Queasy

Moderator<br>Console Gaming
Aug 24, 2001
31,796
2
0
In an interview with IndustryGamers, SCEA VP Scott Rohde shared this thought about the PS3:

"The way I like to look at what the PS3 can do is that we're still peeling back layers of the onion and finding even more that the SPUs can pull off. From first-hand experience, when you talk to developers and they realize - especially when talking about a sequel - that they can throw a lot of tasks at those SPUs, freeing up the main processor to do a lot more than they thought, that's exciting to developers. It's exciting for them when they see they've just scratched the surface of what PS3 can do," he said.

ahem

Shrek: Ogres are like onions.
Donkey: They stink?
Shrek: Yes. No.
Donkey: Oh, they make you cry.
Shrek: No.
Donkey: Oh, you leave em out in the sun, they get all brown, start sproutin' little white hairs.
Shrek: NO. Layers. Onions have layers. Ogres have layers. Onions have layers. You get it? We both have layers.
[sighs]
Donkey: Oh, you both have layers. Oh. You know, not everybody like onions.

Anyways, it's actually a pretty interesting interview where Rohde goes on to talk about Sony's first-party games strategy, the wand motion controller, etc. Worth a read even if you have to get around some of the corporate speak.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
So far the onion just seems to be making developers cry. The Fallout 3 image quality comparison to the 360 was not flattering at all.

Competition helps keep MS less evil and I do like my PS3. So I hope Sony is smart enough to make the PS4 just 2 PS3s duct-taped together instead of making developers start the learning (and crying) process over from scratch.
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: DaveSimmons
Competition helps keep MS less evil and I do like my PS3. So I hope Sony is smart enough to make the PS4 just 2 PS3s duct-taped together instead of making developers start the learning (and crying) process over from scratch.
Terrible idea. Sony needs to ditch the Cell and move back to a normal SMP architecture like everyone else. A PS3 with 14 SPUs is not going to be a lot of fun for anyone.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
So far the onion just seems to be making PC native developers cry.

Fixed that for ya :)

The Fallout 3 image quality comparison to the 360 was not flattering at all.

Is there a game developer in the world worse at extracting visuals from any platform than Bethesda? Fallout 3 is ugly by 360 standards, it is ugly by PC standards, and it is hideous by PS3 standards. At least it performs poorly on all the systems, though, and crashes, and is riddled with random bugs; overall, certainly a very typical Bethesda game. I think Bethesda has some of the best ideas overall in the gaming market, and the poorest coders by a huge margin (this certainly isn't the first time I've mentioned this).

A more apt comparison would be say KZ2 to GoW2, or GT5 to Forza3. Generic code will perform the best on PCs, then the 360, then the PS3. When you get into specialized uses the PS3 starts to perform significantly better.

Sony needs to ditch the Cell and move back to a normal SMP architecture like everyone else.

First off, they can't move back to an architecture they have never utilized. The PS3 is very much an evolutionary step from the PS2; even console-native devs were upset with how rough it was to utilize VU0 and VU1 (some of them anyway, Polyphony didn't seem to mind at all). As for moving back to the retro computing style of older, non-gaming architectures: chip makers are realizing they have to change in order to continue improving. Look at Larrabee, Intel is heading in Cell's direction; I doubt you will see the reverse of that.

A PS3 with 14 SPUs is not going to be a lot of fun for anyone.

Depends on how good devs are at extracting parallelism at that point. Right now the ROI for going beyond 3 threads isn't very high unless you are making a PS3 exclusive (the 360 can still scale depending on how thread balancing works out, but it is certainly an issue of diminishing returns at that point).
 

purbeast0

No Lifer
Sep 13, 2001
53,637
6,515
126
i love my PS3 for some of the exclusives. any cross platform games i'm interested in i always get on 360 because of Xbox Live and the Xbox360 controller.

but if I couldn't play games like Uncharted, Ratchet and Clank, God of War, and Hot Shots Golf, those would all be selling points for me to go buy a PS3.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
It seems like the PS3 (and the PS2 before it) was originally designed around a conceptual OpenGL 1.0 implementation: a fast but dumb rasterizer, coupled with any number of coprocessors to handle what is nowadays done on the GPU (geometry setup, vertex transforms, pixel shaders).
The PS3 somewhat avoided this by using a real GPU. The 360's CPU has enough brute force to handle that style of programming (offloading the computationally complex stuff onto the CPU) as well. On a PC, it's not reasonable unless you have a quad core, and even then you'd probably need an i7.
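To picture the kind of work those coprocessors were there to soak up, the canonical job is transforming a batch of vertices by a 4x4 matrix: the sort of tight loop you'd hand to a VU, an SPU, or a spare CPU core instead of a GPU vertex stage. A rough sketch, with types and names invented purely for illustration:

```c
/* Illustrative only: the classic "coprocessor job", transforming a batch of
   vertices by a 4x4 matrix. Types and names are invented for this sketch. */
typedef struct { float x, y, z, w; } Vec4;
typedef struct { float m[4][4]; } Mat4;   /* row-major */

void transform_batch(const Mat4 *mvp, const Vec4 *in, Vec4 *out, int count)
{
    for (int i = 0; i < count; i++) {
        const Vec4 v = in[i];
        out[i].x = mvp->m[0][0]*v.x + mvp->m[0][1]*v.y + mvp->m[0][2]*v.z + mvp->m[0][3]*v.w;
        out[i].y = mvp->m[1][0]*v.x + mvp->m[1][1]*v.y + mvp->m[1][2]*v.z + mvp->m[1][3]*v.w;
        out[i].z = mvp->m[2][0]*v.x + mvp->m[2][1]*v.y + mvp->m[2][2]*v.z + mvp->m[2][3]*v.w;
        out[i].w = mvp->m[3][0]*v.x + mvp->m[3][1]*v.y + mvp->m[3][2]*v.z + mvp->m[3][3]*v.w;
    }
}
```

On the PS2 this is roughly what VU1 spent its time doing; on a modern PC the GPU's vertex stage handles it, which is the point about the PS3 getting a real GPU.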
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Originally posted by: erwos
Originally posted by: DaveSimmons
Competition helps keep MS less evil and I do like my PS3. So I hope Sony is smart enough to make the PS4 just 2 PS3s duct-taped together instead of making developers start the learning (and crying) process over from scratch.
Terrible idea. Sony needs to ditch the Cell and move back to a normal SMP architecture like everyone else. A PS3 with 14 SPUs is not going to be a lot of fun for anyone.

With duct-tape the PS4's Cell GTXP+ processor might have 2-3 general purpose cores and 10-16 SPUs.

It needs to keep everything in the current Cell to run existing engines and code libraries. It could add more of whatever developers can use, such as another general-purpose core to ease porting from PC/360.
 

Hadsus

Golden Member
Aug 14, 2003
1,135
0
76
Originally posted by: purbeast0
i love my PS3 for some of the exclusives. any cross platform games i'm interested in i always get on 360 because of Xbox Live and the Xbox360 controller.

but if I couldn't play games like Uncharted, Ratchet and Clank, God of War, and Hot Shots Golf, those would all be selling points for me to go buy a PS3.

Blu-ray too. I wasn't even looking at a PS3 last month when I went shopping for a Blu-ray player, but I ended up getting one 'cause the Blu-ray player I was interested in buying at BB was only about $50 cheaper. The newer Blu-ray players don't just play discs; they also stream from YouTube, Netflix, Hulu, etc. The PS3 can do that too, via one of several apps you can install on your PC. And the 360, IMO, is just too loud and buzzy to enjoy movies. IMHO. :D
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: BenSkywalker
First off, they can't move back to an architecture they have never utilized. The PS3 is very much an evolutionary step from the PS2, even console native devs were upset with how rough it was to utilize VU0 and VU1(some of them anyway, Polyphony didn't seem to mind at all). As far as moving back to the retro computing style used by older non gaming architectures- they are realizing they have to change in order to continue to improve. Look at Larrabee, Intel is heading in Cell's direction, I doubt you will see the reverse of that.
This is, at best, disingenuous. Please read the wiki entry on why Larrabee isn't the PITA that the Cell is:

http://en.wikipedia.org/wiki/L..._Cell_Broadband_Engine

Larrabee is a GPU using an extended x86 instruction set. It's not the Cell, and programming it is a very different proposition.

Depends on how devs are doing extracting parallelism at that point. Right now the ROI isn't very high unless you are making a PS3 exclusive for going beyond 3 threads(360 can still scale depending on how thread balancing works out, but it is certainly an issue of diminishing returns at that point).
I remember hearing this when the PS3 came out, too. How long until people realize that having DSPs masquerading as CPUs is a dumb design decision, and that extracting parallelism is a tremendously hard problem?

I really don't think many people understand the _programming_ challenges involved here; they keep reverting to what the hardware can do as an argument.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Coding for the SPUs can't be any worse than the 6502 assembly I wrote as a kid :)

My point is that moving to Larrabee means throwing out all the work developers have done so far, and throwing away the chance to share code between the two generations during the years when they're both on the market.

With the magic of duct tape, developers working on a PS3 version already have a working PS4 version as well. That version needs extra work to use the additional cores and new graphics hardware, but some of that could be a one-time engine upgrade that just needs a different build switch for 640p vs. 1080p rendering.
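That kind of build switch really can be as dumb as a couple of defines; something like the following, where the flag name and numbers are purely hypothetical:

```c
/* Hypothetical build switch: same engine, different render target.
   BUILD_PS4 is a made-up flag; 1152x640 is just one common sub-HD target. */
#if defined(BUILD_PS4)
  #define RENDER_WIDTH   1920
  #define RENDER_HEIGHT  1080
#else
  #define RENDER_WIDTH   1152
  #define RENDER_HEIGHT  640
#endif
```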
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: DaveSimmons
Coding for the SPUs can't be any worse than the 6502 assembly I wrote as a kid :)
Au contraire, it's far worse. Give me assembly any day of the week over trying to write for the Cell. Haven't done any soft real-time multi-threaded computing lately, eh?

My point is that moving to Larrabee means throwing out all the work developers have done so far, and throwing away the chance to share code between the two generations during the years when they're both on the market.
They could always use Larrabee plus the Cell. I think people are somewhat confused about what Larrabee is really about.

With the magic of duct tape, developers working on a PS3 version already have a working PS4 version as well. That version needs extra work to use the additional cores and new graphics hardware, but some of that could be a one-time engine upgrade that just needs a different build switch for 640p vs. 1080p rendering.
If they're going to do that, they need to lean toward putting in more PPEs than SPEs. The appropriate response to "developers find our hardware extremely difficult to program on" is not "too damn bad".
 

squatchman

Member
Apr 1, 2009
50
0
0
Scott Rohde needs to understand that the PS3 wasn't unearthed as the last vestige of some ancient technologically advanced civilization. Sony seems to sell people on the idea that nobody in the world could POSSIBLY understand the capability of their magical miracle box every generation, and the armchair-developer-console-fanboys eat it up every time.
 

PieIsAwesome

Diamond Member
Feb 11, 2007
4,054
1
0
Yeah, the PS3's power is not fully realized, developers have yet to discover its potential blah blah blah blah, heard it a million times before. It all means nothing until we see results.
 

galperi1

Senior member
Oct 18, 2001
523
0
0
Originally posted by: PieIsAwesome
Yeah, the PS3's power is not fully realized, developers have yet to discover its potential blah blah blah blah, heard it a million times before. It all means nothing until we see results.

Killzone 2 and Uncharted 2 say hello......
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Originally posted by: Hadsus
Originally posted by: purbeast0
i love my PS3 for some of the exclusives. any cross platform games i'm interested in i always get on 360 because of Xbox Live and the Xbox360 controller.

but if I couldn't play games like Uncharted, Ratchet and Clank, God of War, and Hot Shots Golf, those would all be selling points for me to go buy a PS3.

Blu-ray too. I wasn't even looking at a PS3 last month when I went shopping for a Blu-ray player, but I ended up getting one 'cause the Blu-ray player I was interested in buying at BB was only about $50 cheaper. The newer Blu-ray players don't just play discs; they also stream from YouTube, Netflix, Hulu, etc. The PS3 can do that too, via one of several apps you can install on your PC. And the 360, IMO, is just too loud and buzzy to enjoy movies. IMHO. :D

As a BD player, the PS3 is still the best value. You can't buy a brand new BD-Live player and an Xbox 360 Pro for $400; well not anywhere around here. (Though some smart a-- will probably tell me I can :p )

The 360 doesn't really have any games that interest me. Most of the ones that I do like usually end up on PC or PS3 at a later date. I never owned a Playstation 1 or 2 but I must say I'm quite happy with my George Foreman grill console.
 

cheesehead

Lifer
Aug 11, 2000
10,079
0
0
I might be wrong on this, but the PS3 has not had any of the 360's whole "spontaneous combustion" issues.

I really respect a console that actually, y'know, works.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: Cheesehead
I might be wrong on this, but the PS3 has not had any of the 360's whole "spontaneous combustion" issues.

I really respect a console that actually, y'know, works.

To be fair, Microsoft has made it as painless as possible to help people who have had problems. (Something Sony has never done when problems plagued launch PS2s.) The new Jasper models should have significantly better reliability.
 

cheesehead

Lifer
Aug 11, 2000
10,079
0
0
Originally posted by: dguy6789
To be fair, Microsoft has made it as painless as possible to help people who have had problems. (Something Sony has never done when problems plagued launch PS2s.) The new Jasper models should have significantly better reliability.

There was a long period when it was not so painless. Any company that sells a $300 box that might melt any time you turn it on has some serious QC issues.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
This is, at best, disingenuous. Please read the wiki entry on why Larabee isn't the PITA that the Cell is:

Is that a joke? Dealing with cache hierarchy differences is why Cell is so rough? There are a lot of ways Cell is different to deal with than LRB, but dealing with DMA-level code is far more a matter of control versus laziness than of extreme complexity. It is more time consuming to manually handle it all, but it certainly helps avoid stalls from cache misses.
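To make the control-versus-laziness point concrete, here is roughly what an SPU-side loop looks like when you do the data movement yourself: DMA the next chunk into local store while you work on the current one. The MFC intrinsics are the real ones from spu_mfcio.h as I recall them, but the buffer size and process_chunk() are made up, so treat this as a sketch rather than shipping code.

```c
/* Sketch of SPU-side double buffering: explicit DMA instead of a cache.
   CHUNK and process_chunk() are invented for the example; total is assumed
   to be a multiple of CHUNK. */
#include <spu_mfcio.h>

#define CHUNK 16384                                   /* bytes per transfer */
static char buf[2][CHUNK] __attribute__((aligned(128)));

void process_chunk(char *data, unsigned int size);    /* your actual work */

void run(unsigned long long ea, unsigned int total)
{
    unsigned int tag[2] = { 0, 1 };
    int cur = 0;

    mfc_get(buf[cur], ea, CHUNK, tag[cur], 0, 0);     /* kick off first transfer */

    for (unsigned int off = 0; off < total; off += CHUNK) {
        int next = cur ^ 1;

        if (off + CHUNK < total)                      /* prefetch the next chunk */
            mfc_get(buf[next], ea + off + CHUNK, CHUNK, tag[next], 0, 0);

        mfc_write_tag_mask(1 << tag[cur]);            /* wait only on current DMA */
        mfc_read_tag_status_all();

        process_chunk(buf[cur], CHUNK);               /* overlaps the next DMA */
        cur = next;
    }
}
```

The whole point is that the compute overlaps the next transfer, which is exactly the stall avoidance described above.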

Larabee is a GPU using an extended x86 instruction set. It's not the Cell, and programming it is a very different proposition.

Larrabee is entirely in-order execution units relying extremely heavily on vectorization to extract reasonable performance. The days of vomiting on your keyboard and having Intel's compilers make it run well are over.
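For what "relying heavily on vectorization" means in practice, here is a trivial scalar-versus-SIMD example using plain SSE intrinsics (not actual Larrabee code, whose wider vector ISA you can't program this way today): an in-order core only gets fed if you rewrite loops to work on several elements per instruction.

```c
/* Illustrative only: scalar vs. 4-wide SSE version of y[i] = a*x[i] + y[i].
   Larrabee's vectors are much wider, but the idea is the same. */
#include <xmmintrin.h>

void saxpy_scalar(float a, const float *x, float *y, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

void saxpy_sse(float a, const float *x, float *y, int n)
{
    __m128 va = _mm_set1_ps(a);
    int i = 0;
    for (; i + 4 <= n; i += 4) {                      /* 4 floats per iteration */
        __m128 vx = _mm_loadu_ps(x + i);
        __m128 vy = _mm_loadu_ps(y + i);
        _mm_storeu_ps(y + i, _mm_add_ps(_mm_mul_ps(va, vx), vy));
    }
    for (; i < n; i++)                                /* leftover elements */
        y[i] = a * x[i] + y[i];
}
```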

How long until people realize that having DSPs masquerading as CPUs is a dumb design decision, and that extracting parallelism is a tremendously hard problem?

The programmers who come to that realization need to rehearse this phrase over and over: "Would you like fries with that?" It will help them out tremendously within the next decade. Single-core performance has ceased to progress in any meaningful way, and there is nothing on the horizon that hints that is going to change. If you are a programmer and can't handle it, I recommend you go back to school and find yourself a different profession; you aren't smart enough for the job moving forward. The current trend isn't going to change anytime soon. If you want to be a remotely decent programmer, you learn to work in the new model. Crying that you suck too much to do the job isn't going to get you anywhere. Crying that someone needs to come up with an easier solution is a good way to land yourself a top-notch cell phone app job until those go multi-core not all that long from now.

I really don't think many people understand the _programming_ challenges involved here; they keep reverting to what the hardware can do as an argument.

And how do those programming challenges compare to the laws of physics? Since more talented coders have already managed to extract parallelism from game code without too much of an issue, even on Cell, I would say it is more reasonable to assume that top-tier coders will figure out how to program effectively under the new model of architectures than to assume the processor manufacturers will figure out how to make the laws of physics not apply to them.

They could always use Larrabee plus the Cell. I think people are somewhat confused about what Larrabee is really about.

Why would they cripple their system with a wannabe Cell instead of using Cell with a GPU? Actually, in your world they should probably use a Pentium 4; it was easier to develop for than anything that came after it.

The appropriate response to "developers find our hardware extremely difficult to program on" is not "too damn bad".

It shouldn't be; it should be 'there is the door, don't let it hit you in the ass on the way out'. I have no sympathy for people who whine about their job being hard, in any profession. You don't like what you do? Find another job. Any person, any job. Every processor manufacturer is heading in Cell's direction and away from single-core OoO architectures. If you don't like it, find something else to do.

If they're going to do that, they need to lean toward putting in more PPEs than SPEs.

Since the top tier developers all seem to be having no problem with the current setup, I don't think that is likely to happen. If they wanted to target the platform for the shittiest coders in the world they could have just thrown a single core x86 in there and been done with it, clearly that wasn't their goal.

Sony seems to sell people on the idea that nobody in the world could POSSIBLY understand the capability of their magical miracle box every generation, and the armchair-developer-console-fanboys eat it up every time.

Look up the fastest computer in the world right now. Go ahead, check out what it uses for processors. Performance per transistor or per watt, whichever metric you choose, Cell is still the most powerful CPU on the planet. Does that mean it is going to be easy? Of course not. Is it the ideal choice for a console? That would depend on who you ask, it seems. The most talented console developers seem to have little trouble utilizing it and extracting superior results, as does IBM's team, who put together the most powerful computer in the world using them and have the numbers to show for it. Last gen I was absolutely in favor of MS's approach over Sony's; this gen, until they added a GPU to the PS3 design, I was laughing because they were under the moronic assumption that something like Cell was going to compete with a GPU (Intel is making that same mistake, but I digress). I call them like I see them. While it would have been very nice to have a G80 GPU in the PS3, I don't think Cell was a bad design choice by Sony at all.

It all means nothing until we see results.

Uncharted 2, KZ2, GT5. We know the 360 has a stronger GPU than the PS3; there isn't really an argument on that one. So why is the PS3 offering flat-out superior visuals in games? How can that be possible? How is it that Sony is doing things in games that the 360 can't match?
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: dguy6789
Originally posted by: Cheesehead
I might be wrong on this, but the PS3 has not had any of the 360's whole "spontaneous combustion" issues.

I really respect a console that actually, y'know, works.

To be fair, Microsoft has made it as painless as possible to help people who have had problems. (Something Sony has never done when problems plagued launch PS2s.) The new Jasper models should have significantly better reliability.

Not even close to the same thing. Many, many more 360s have had problems. Most people have had more than one die on them.
 

Kromis

Diamond Member
Mar 2, 2006
5,214
1
81
Ahem....

DEVELOPERS, DEVELOPERS, DEVELOPERS, DEVELOPERS!

DEVELOPERS, DEVELOPERS, DEVELOPERS, DEVELOPERS!

DEVELOPERS, DEVELOPERS, DEVELOPERS, DEVELOPERS!

DEVELOPERS, DEVELOPERS, DEVELOPERS, DEVELOPERS!
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,600
6,084
136
I much prefer Sony's approach of making the (crappy) developers cry as opposed to Microsoft's approach of making the customer cry with their insane hardware failure rates. Although I guess we should really be blaming the enviro-nuts who wanted lead-free solder. Yeah, that worked out REALLY well for the X360 chips. BGA fail.
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
What the hell, I'll bite.

Originally posted by: BenSkywalker
Is that a joke? Dealing with cache hierachy differences is why Cell is so rough? There are a lot of different ways that Cell is a bit different to deal with then LRB, but dealing with DMA level code is far more one of control versus laziness then extreme levels of complexity. It is more time consuming to manually handle it all, but it certainly helps avoid stalls from cache misses.
Have you ever even read the Cell documentation on how to do multi-threading? _It is insane_. And you can always tell the armchair-developers from the real deal when they tell you that "time consuming" is an A-OK trait for a development platform. TIME IS MONEY. "Cache misses"? These are video games. How many cache misses are you expecting? Think about how video games use memory for like ten seconds.

Larrabee is entirely in order execution units relying extremely heavily on vectorization to extract reasonable performance. The days of vomiting on your keyboard and having Intel's compilers make it run well are over.
Larrabee is a _GPU_, and it is not even pretending to be a CPU. This is _not_ the same as the Cell! I don't know how much more clear I can be.

And, again, typical armchair-developer talk. If the compiler can optimize, I damn well want it to optimize. TIME IS MONEY. Sony's compilers certainly optimize, I promise you that.

The programmers who come to that realization need to rehearse this phrase over and over: "Would you like fries with that?" It will help them out tremendously within the next decade. Single-core performance has ceased to progress in any meaningful way, and there is nothing on the horizon that hints that is going to change. If you are a programmer and can't handle it, I recommend you go back to school and find yourself a different profession; you aren't smart enough for the job moving forward. The current trend isn't going to change anytime soon. If you want to be a remotely decent programmer, you learn to work in the new model. Crying that you suck too much to do the job isn't going to get you anywhere. Crying that someone needs to come up with an easier solution is a good way to land yourself a top-notch cell phone app job until those go multi-core not all that long from now.
You seem to not even understand what's being talked about here. Writing multi-threaded code is hard on an SMP architecture. Writing it on something like the Cell, which is asymmetric, is even more difficult. Unless there is some compelling performance reason for this (and as best I can tell, there is not), it was a dumb design decision.
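To illustrate the asymmetry being argued about: on an SMP box a worker is just another thread, while on Cell the PPU has to create, load, and run a separate SPE context for each SPU program (and usually wrap it in a pthread anyway, since spe_context_run blocks). A minimal libspe2 sketch, with the embedded SPU program name made up and error handling omitted:

```c
/* Rough libspe2 sketch; "spu_worker" is a hypothetical embedded SPU ELF. */
#include <libspe2.h>
#include <pthread.h>

extern spe_program_handle_t spu_worker;        /* hypothetical SPU program image */

static void *run_spe(void *arg)
{
    spe_context_ptr_t ctx = (spe_context_ptr_t)arg;
    unsigned int entry = SPE_DEFAULT_ENTRY;

    spe_context_run(ctx, &entry, 0, NULL, NULL, NULL);   /* blocks until SPU exits */
    return NULL;
}

int main(void)
{
    spe_context_ptr_t ctx = spe_context_create(0, NULL);
    pthread_t thread;

    spe_program_load(ctx, &spu_worker);        /* load the SPU ELF into the context */
    pthread_create(&thread, NULL, run_spe, ctx);

    /* ... PPU does its own work and feeds the SPU via DMA/mailboxes ... */

    pthread_join(thread, NULL);
    spe_context_destroy(ctx);
    return 0;
}
```

Multiply that by six usable SPEs, plus the explicit data movement each one needs, and that's the extra bookkeeping compared to plain SMP threads.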

And, just FYI: I used to write soft real-time space flight simulators for NASA in my last job. Each component was multi-threaded, with pieces running simultaneously across multiple systems. I have a bit of expertise with writing high-performance multi-threaded code. What about you? Seriously, do you even have a degree in CS? (I do, and from a highly-ranked school, too.) I really don't appreciate being told I'm incompetent by someone who can't even understand what's being talked about.

And how do those programming challenges compare to the laws of physics? Since more talented coders have already managed to extract parallelism from game code without too much of an issue, even on Cell, I would say it is more reasonable to assume that top-tier coders will figure out how to program effectively under the new model of architectures than to assume the processor manufacturers will figure out how to make the laws of physics not apply to them.
You seem to not comprehend that "programming for the Cell" is not the same problem as "writing multi-threaded code". If the latter were so hard, the 360 would be experiencing myriad performance issues. Interestingly, it's not.

Why would they cripple their system with a wannabe Cell instead of using Cell with a GPU? Actually, in your world they should probably use a Pentium 4; it was easier to develop for than anything that came after it.
Because, um, Larrabee is a GPU, and the Cell has already proven itself not to be? Honestly, did you even read what you were responding to before responding in a fanboy rage?

Since the top tier developers all seem to be having no problem with the current setup, I don't think that is likely to happen. If they wanted to target the platform for the shittiest coders in the world they could have just thrown a single core x86 in there and been done with it, clearly that wasn't their goal.
Ah, so the definition of "top-tier" is reduced to "people who write reasonably-performing PS3 games". That's a definition that only a fanboy could love.

I feel like you didn't even comprehend what I wrote, which is starting to not surprise me. SPEs are hard to fill, because they're so specialized in what they do well. PPEs are generalized, and thus easier to fill. If programmers are finding that the ratio of SPEs to PPEs is too heavily weighted towards SPEs, it's time to adjust that ratio. Why do you think Sony has a magic crystal ball that told them that 7:1 is the right ratio?

Uncharted 2, KZ2, GT5. We know the 360 has a stronger GPU than the PS3; there isn't really an argument on that one. So why is the PS3 offering flat-out superior visuals in games? How can that be possible? How is it that Sony is doing things in games that the 360 can't match?
Funny that your examples of "unmatched visuals" are games that haven't even come out yet. Let me know how reality matches up with bullshots.
 

brblx

Diamond Member
Mar 23, 2009
5,499
2
0
no offense dude, but spouting off experience and qualifications on an internet forum makes you come off like a bit of a douche. not that i don't totally agree with and/or believe you, it just belittles your argument, imho.

for the record, killzone 2 is out and looks great, but it didn't come close to delivering what it promised, much like the original. there really aren't any technically superior titles on the PS3, and that's just a simple fact. and it's already been pointed out that some of the non-exclusives actually look like poo because of the development difficulties.

i don't see how the failure rate of the MS console comes into play here. i've broken two and i barely even play them, but it still remains that there are a ton of them out there, and their hardware failures have no impact on how hard the console is to develop for.

edit- also, developers, developers, developers, developers.
 

PieIsAwesome

Diamond Member
Feb 11, 2007
4,054
1
0
Originally posted by: BenSkywalker
It all means nothing until we see results.

Uncharted 2, KZ2, GT5. We know the 360 has a stronger GPU than the PS3; there isn't really an argument on that one. So why is the PS3 offering flat-out superior visuals in games? How can that be possible? How is it that Sony is doing things in games that the 360 can't match?

Eh? Since when? :confused:

I'm not saying those games look bad but they aren't exactly proof that the PS3 is capable of outdoing the 360 graphically.

It doesn't help that in multiplatform games, the PS3 often ends up with worse visuals, worse performance, or both. I lost the link, but there was a blog where the framerate of several games was measured in the same scene on both the 360 and PS3, and the PS3 often had worse framerates. In some scenes where the 360 held a constant 60 FPS in Call of Duty 4, the PS3 dropped below 50, for example.