The PS3 is like an onion

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: PieIsAwesome
Originally posted by: BenSkywalker
It all means nothing until we see results.

Uncharted 2, KZ2, GT5. We know the 360 has a stronger GPU than the PS3; there isn't really an argument on that one. Why is the PS3 offering flat-out superior visuals in games? How can it be possible? How is it that Sony is doing things in games that the 360 can't match?

Eh? Since when? :confused:

I'm not saying those games look bad but they aren't exactly proof that the PS3 is capable of outdoing the 360 graphically.

It doesn't help that in multiplatform games, the PS3 ends up with worse visuals, worse performance, or both. I lost the link, but there was a blog where the framerate of some games was measured in the same scene on both the 360 and PS3, and the PS3 often had worse framerates. In some scenes where the 360 held a constant 60 FPS in Call of Duty 4, the PS3 dropped below 50, for example.

Agreed. Just looking at screen shots on IGN, Killzone 2 doesn't look any better than Gears of War 2. Gran Turismo 5 doesn't look any better than Forza 3. The PS3 offering "flat out superior visuals" is a ridiculous claim.

Killzone 2

Gears of War 2

Gran Turismo 5

Forza 3
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
And you can always tell the armchair-developers from the real deal when they tell you that "time consuming" is an A-OK trait for a development platform.

Is English not your first language? Because I have yet to see anyone post anything resembling that in this thread. Perhaps you can tell me your native tongue and I can try to explain in that language instead?

"Cache misses"? These are video games. How many cache misses are you expecting?

It greatly depends on what you are doing. Using SPEs for tessellation/vertex deformation, or using them for shadow creation or lighting simulation: there are many different areas where you could see cache misses and create stalls if cache management were automatic.
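A rough sketch of the pattern I mean, simplified to plain C (real SPE code would issue mfc_get/mfc_put DMA transfers against the 256KB local store; the names here are purely illustrative): the programmer explicitly fetches the next chunk of data while computing on the current one, so the SPU never sits waiting on memory the way it would on an automatic-cache miss.

```c
#include <string.h>

#define CHUNK 64

/* stand-in for a DMA transfer from main memory into local store */
static void dma_get(float *local, const float *main_mem, int n) {
    memcpy(local, main_mem, n * sizeof(float));
}

/* process one chunk in place (think of a vertex deformation pass) */
static void process(float *local, int n) {
    for (int i = 0; i < n; ++i)
        local[i] *= 2.0f;
}

/* double-buffered loop: start fetching chunk N+1 before computing on
   chunk N, so the transfer overlaps the math instead of stalling it */
void transform(float *out, const float *in, int total) {
    float buf[2][CHUNK];
    int cur = 0;
    dma_get(buf[cur], in, CHUNK);                 /* prime first buffer */
    for (int off = 0; off < total; off += CHUNK) {
        int next = cur ^ 1;
        if (off + CHUNK < total)                  /* kick off next fetch early */
            dma_get(buf[next], in + off + CHUNK, CHUNK);
        process(buf[cur], CHUNK);                 /* compute on current chunk */
        memcpy(out + off, buf[cur], CHUNK * sizeof(float));
        cur = next;
    }
}
```

The extra bookkeeping is exactly the "time consuming" part people complain about, but it is also why a well-written SPE job doesn't stall.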

Larrabee is a _GPU_, and it is not even pretending to be a CPU. This is _not_ the same as the Cell! I don't know how much more clear I can be.

You can think yourself as concise as you like; it doesn't change anything. Cell's initial design goal was to remove the necessity for a GPU, much like Larrabee. In the abstract you can claim Larrabee is a GPU, but a bunch of modified P54C cores with some basic texture sampling hardware seems a lot more like a vector processor with some graphics functionality than what any reasonable person would consider a GPU. Larrabee needs to have an abstraction layer running just for rasterization; if that is what you consider to be a GPU, then by all means, knock yourself out.

Writing multi-threaded code is hard on an SMP architecture. Writing it on something like the Cell, which is asymmetric, is even more difficult. Unless there is some sort of compelling performance reason for this, and best I can tell, there is not, it was a dumb design decision.

Since you can't see why anyone would use an AMP setup, it must be wrong. Yeah, brilliant display of logic there. The performance differences are rather clear and easy to demonstrate: run a physics simulation on the 360's CPU and on Cell (best if you use one that someone else handled, by the sounds of it). It isn't remotely difficult to see where Cell is significantly more powerful than its SMP counterpart running certain types of code. Cell is, without a doubt, more powerful on a per-watt or per-transistor basis. There is debate over whether it was worth the trade-off given the increased development time required; the fact that Cell is more powerful isn't in question, however.

You seem to not comprehend that "programming for the Cell" is not the same problem as "writing multi-threaded code". If the latter were so hard, the 360 would be experiencing myriad performance issues. Interestingly, it's not.

Where are you getting this logic from? The PS3 isn't experiencing myriad performance issues so it must be as straightforward to code for as the 360 using your exacting logic.

Because, um, Larrabee is a GPU, and the Cell has already proven itself not to be? Honestly, did you even read what you were responding to before responding in a fanboy rage?

Break down for us, then, exactly why Larrabee is a GPU. Larrabee is closer to Cell's architecture than it is to any GPU we have seen to date. Just look at the amount of abstraction required to get Larrabee to handle simple rasterization; Abrash published a nice article on it not that long ago. Maybe it's just me, but I have a hard time taking seriously that something is a dedicated device when its most vocal supporter needs to explain how you emulate a GPU in order to get it to perform basic rasterization.

Ah, so the definition of "top-tier" is reduced to "people who write reasonably-performing PS3 games". That's a definition that only a fanboy could love.

A top tier developer can write code that performs well on any platform. If the most someone can handle is getting x86 code to perform well on a single core then they are far removed from it.

I feel like you didn't even comprehend what I wrote, which is starting to not surprise me. SPEs are hard to fill, because they're so specialized in what they do well.

Of course; who isn't aware of this? This is no different than the PS2 was with its vector units, which is something you keep dodging. Why? With all of your self-claimed expertise, why do you ignore the fact that Cell, for all of its issues, is still easier to deal with than the EE was at this point in its life cycle? If you have nearly the level of experience you claim, this should be well known to you. Cell *is* easier to develop for than what it replaced.

If programmers are finding that the ratio of SPEs to PPEs is too heavily weighted towards SPEs, it's time to adjust that ratio.

If Polyphony, Naughty Dog or Insomniac start saying it, I would listen. I would take the word of people who have proven themselves utilizing manually coded vector units, and so far they think it works very well, based on everything they have said.

Why do you think Sony has a magic crystal ball that told them that 7:1 is the right ratio?

You think of it bass-ackwards; that's your problem. You think it is the hardware company's job to make your life easier. As a developer, your job is to utilize the hardware you are dealing with in the best possible manner. It is why we are currently seeing an increasingly large rift between platform-exclusive titles in terms of what they offer.

Funny that your examples of "unmatched visuals" are games that haven't even come out yet. Let me know how reality matches up with bullshots.

KZ2 came out quite a while ago, GT5P came out a while before that and UC2 was playable at E3. Do you know anything about the PS3?

for the record, killzone 2 is out and looks great, but didn't come close to delivering what it promised, much like the original.

Playing the game and the old E3 trailer side by side, outside of better AA, what was the big difference?

Just looking at screen shots on IGN

Play the games side by side and get back to me. Hook up both your 360 and your PS3 side by side on comparable displays and try it out. It really isn't all that close (not huge, but the PS3 clearly has an edge on exclusive titles).
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Neat job ignoring my asking for any sort of programming credentials. I don't argue with fanboys who don't even know how to program, and cannot even be troubled to _read my sig_ where I clearly denote that I own a PS3 (which I play frequently). The rest of you can make your decision about who knows what they're talking about - someone with real experience in the field, or a PS3 fanboy with none.
 

EvilComputer92

Golden Member
Aug 25, 2004
1,316
0
0
Originally posted by: BenSkywalker

Playing the game and the old E3 trailer side by side, outside of better AA, what was the big difference?

Rubbish. Now I'm starting to doubt you have actually played KZ2. I finished the entire campaign on my PS3 and it was a tremendous difference from the E3 2005 trailer. Take a look at the pre-rendered target. Look at the lack of the ridiculous amounts of motion blur that pervade the retail game. Look at the character models and the smoke effects, which far surpass the full game. Above all, take a look at the fluidness of the animations, which was what blew everyone away at E3 2005 and is totally lacking in the full game. At that time it looked like Crysis was going to be surpassed by this game. Instead, in the final product we got a motion-blur-ridden game that took its color palette straight out of Gears of War (which sucks). Did I mention the level design blows and is atrociously linear? The only redeeming quality of the game was its multiplayer.




Play the games side by side and get back to me. Hook up both your 360 and your PS3 side by side on comparable displays and try it out. It really isn't all that close (not huge, but the PS3 clearly has an edge on exclusive titles).

KZ2 is not one of them. GT5 is probably the best looking game on the PS3 currently, and Forza 3 matches that at the minimum.


You think of it bass-ackwards; that's your problem. You think it is the hardware company's job to make your life easier. As a developer, your job is to utilize the hardware you are dealing with in the best possible manner. It is why we are currently seeing an increasingly large rift between platform-exclusive titles in terms of what they offer.

As a developer, your job is to utilize it in the best possible manner in the time allotted. Not every company has the luxury of trying to decrypt the inner workings of the Cell processor and extract the maximum benefit out of it.
Guess what: whether you like it or not, the majority of developers are under serious time constraints to finish a game.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: BenSkywalker
If Polyphony, Naughty Dog or Insomniac start saying it, I would listen. I would take the word of people who have proven themselves utilizing manually coded vector units, and so far they think it works very well, based on everything they have said.

Subsidiary companies aren't going to readily badmouth their parents...

While Polyphony is working their asses off to try and master the bitch that Cell is to produce a competitive GT5, Turn 10 Studios will have pumped out Forza 2 and Forza 3 in the same time.

No matter how you try to spin it, Cell was a huge mistake.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Neat job ignoring my asking for any sort of programming credentials.

You think for an instant people are going to believe what you say your credentials are? Did you just recently discover the internet? You could say you are Ken Kutaragi; it wouldn't change my end of the discussion at all.

where I clearly denote that I own a PS3

Why the silly comment about unreleased games then? You expect anyone to believe that you own a PS3, are an industry insider, have extensive coding experience on the consoles, and yet you paid no attention to what Guerrilla did. I guess you could get some people believing you; not sure if those people would be capable of reading, though :)

I finished the entire campaign on my PS3 and it was a tremendous difference from the E3 2005 trailer. Take a look at the pre-rendered target.

E3 2K5 demo. OK.

Look at the lack of the ridiculous amounts of motion blur that pervade the retail game.

DoF?

Above all, take a look at the fluidness of the animations, which was what blew everyone away at E3 2005 and is totally lacking in the full game.

What level of fluidness are you finding so impressive in the 2K5 trailer as far as animation goes? It really isn't that great (not that it's bad, but nothing to write home about).

As far as the smoke, the trailer does look better, but then you have counterexamples, like when your craft lands: there is no debris kicked up in the trailer, while there is in-game.

KZ2 is not one of them. GT5 is probably the best looking game on the PS3 currently, and Forza 3 matches that at the minimum.

You must have seen some very different Forza footage than I have. What I saw looked marginally better than Forza 2.

Guess what: whether you like it or not, the majority of developers are under serious time constraints to finish a game.

When have I said anything different at all? Is anyone here under the impression that the current situation is any different than it was with the PS2? It's one of the reasons we keep seeing successive generations of games looking better and better; developers aren't going to get it all figured out even after a few years. Take a look at GT4: if you showed footage of that at the PS2's launch, people would have laughed at you (I would have been one of them, in all honesty). It is a trade-off, peak performance versus ease of development. If anyone reads carefully, they would notice I haven't bashed MS's decision to go the way they did either; it is a trade-off. Where I see an issue is with people under the delusion that Sony's choice is all downsides.

While Polyphony is working their asses off to try and master the bitch that Cell is to produce a competitive GT5, Turn 10 Studios will have pumped out Forza 2 and Forza 3 in the same time.

Is that for real? How are the differing weather models working in Forza? How about rally racing? Is that coming along well? Appreciating the NASCAR element Turn 10 added? Psyched up about having half the cars (disk swapping required!) that a portable racing game has? Is Forza 3 going to manage to get more than half the cars racing online that Polyphony's *demo* has? Trying to blame Cell for GT5 taking longer is obnoxious given the staggering disparity in what GT5 offers compared to Forza as far as content goes. Turn 10 seems to have its eyes squarely on the NFS franchise this generation, not Polyphony.
 

brblx

Diamond Member
Mar 23, 2009
5,499
2
0
wow, you, really, really need to un-apply your lips from sony's anus. GT5 is not the second coming of christ and KZ2 was another boring 'next-gen' shooter. get over yourself.

okay, as an avid racing fan, i can't leave it at that.

newsflash- rallying in GT4 sucked. GT4 was in fact little different from GT3 when it comes to the driving model. which kinda sucked.

forza 2 was at a level of simulation well past that of GT4, and they're going much further with forza 3. what on earth makes you think that GT5 is so head and shoulders above every racer ever created? another tacked-on rally mode and weather?
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: BenSkywalker
Why the silly comment about unreleased games then? You expect anyone to believe that you own a PS3, are an industry insider, have extensive coding experience on the consoles, and yet you paid no attention to what Guerrilla did. I guess you could get some people believing you; not sure if those people would be capable of reading, though :)
I said none of these things, except that I own a PS3. I have done some game development, but it's entirely amateur stuff, no professional studios. It's sad that your arguments are so poor that you've got to resort to making up stuff about me wholesale. I am well aware of what's being done on the PS3 - it's impressive, too. But ignoring the fact that almost all of those really impressive games have seen serious delays is ludicrous. Sony's internal studios don't have a choice as to what platform they're coding on, and it shows - really great games, but the project timetable is just a complete disaster. Either Sony is full of incompetent project managers, or the Cell is a PITA to program on - your choice. Personally, I'm going with the latter.

I don't think the 360 or the Wii are perfect, either, and I'd be happy to explain all the ways they should have been designed better, including a fair few features the PS3 got right. I just think that using the Cell was a poor design decision for the PS3, and that it unnecessarily burdened developers while giving very few useful performance advantages over the competition. There's no shame in agreeing to disagree, but I won't tolerate being hit ad hom for actually knowing what I'm talking about, versus just having read some gaming websites and decided I know how software development works.

Clearly, neither of us are changing our position, so I guess I'm done here.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
wow, you, really, really need to un-apply your lips from sony's anus.

Last gen people were saying the same thing about MS to me.

GT5 is not the second coming of christ

Are you for an instant under the delusion that Forza 3 is in the same league as GT5 content-wise? Not even Turn 10 is trying to pretend it is.

KZ2 was another boring 'next-gen' shooter.

When did I say anything at all about the gameplay, level design or how enjoyable KZ2 was? I don't think GoW2 or KZ2 were very good, but they are currently the most visually impressive titles on their respective systems. Personally, I have had the most fun with Geometry Wars out of everything on the 360, by a rather large margin; that doesn't mean I would use it when discussing the systems' capabilities.

forza 2 was at a level of simulation well past that of GT4, and they're going much further with forza 3. what on earth makes you think that GT5 is so head and shoulders above every racer ever created? another tacked-on rally mode and weather?

More tracks, more cars, more events and more racing modes than Forza 3 by a long shot, on top of weather. Why do you keep comparing to GT4, by the way? GT5P is out and is clearly well ahead of Forza 2 in terms of engine refinement. For the record, I bought a 360 for Forza 2 and was shocked how small the improvements were over the first Forza.

I have done some game development, but it's entirely amateur stuff, no professional studios.

So you are the only person qualified to discuss the topic then without credentials? Given your comments that you would only converse on the matter with qualified people perhaps I shouldn't have read into that that you had some actual experience.

Either Sony is full of incompetent project managers, or the Cell is a PITA to program on - your choice. Personally, I'm going with the latter.

Really? What about option 3? Releasing your major system movers before you are at a price point to hit the installed base to maximize revenue isn't in your best interest? It would make little sense for Sony to stack slim/price cut and all of their system moving software into one holiday season, they would run the risk of supply limitations hindering the capability of the software driving hardware sales. Better to push the software back and spread it out to drive hardware sales next year after the bump from the price drop slows down and the holiday rush on UC2 and R&C is fading.

There's no shame in agreeing to disagree, but I won't tolerate being hit ad hom for actually knowing what I'm talking about, versus just having read some gaming websites and deciding I know how software development works.

You assume that I am only reposting that which I have read elsewhere, and when I give examples you ignore them. You have tried to talk about how Larrabee is dedicated hardware; when I point out that even Abrash doesn't agree with you, you dodge that. You ask for examples of code types that would have potential issues using automated cache management; when I give examples, you dodge that too, and you say I am only reposting things read on some fan site. Your answer to all this is that you know software development, and you assume others don't. As most people who post in forums such as these know, what matters is the content of your posts; nothing outside of it is relevant.
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Owned? I think not.

Originally posted by: BenSkywalker
So you are the only person qualified to discuss the topic then without credentials? Given your comments that you would only converse on the matter with qualified people perhaps I shouldn't have read into that that you had some actual experience.
I am certainly qualified to talk about how multi-threaded, soft real-time software works. You'll notice some similarities to the gaming realm of problems, possibly because they are practically the same. Again, where are your supposed credentials? Or am I supposed to be at a _disadvantage_ because I have them? It amazes me that you can lash out at me over this, but given your general tone thus far, maybe not.

Really? What about option 3? Releasing your major system movers before you are at a price point to hit the installed base to maximize revenue isn't in your best interest? It would make little sense for Sony to stack slim/price cut and all of their system moving software into one holiday season, they would run the risk of supply limitations hindering the capability of the software driving hardware sales. Better to push the software back and spread it out to drive hardware sales next year after the bump from the price drop slows down and the holiday rush on UC2 and R&C is fading.
I think option 3 is a joke, and could only be taken seriously by a fanboy. Do you think they're deliberately announcing release windows and then blowing them? If Sony had wanted to do what you're describing, they would have just set the appropriate release window. This isn't just a problem with GT5 or GoW 3, it's been a consistent problem throughout the system's lifetime.

You assume that I am only reposting that which I have read elsewhere, and when I give examples you ignore them. You have tried to talk about how Larrabee is dedicated hardware; when I point out that even Abrash doesn't agree with you, you dodge that. You ask for examples of code types that would have potential issues using automated cache management; when I give examples, you dodge that too, and you say I am only reposting things read on some fan site. Your answer to all this is that you know software development, and you assume others don't. As most people who post in forums such as these know, what matters is the content of your posts; nothing outside of it is relevant.
I don't _ignore them_. Your examples are generally wrong and cannot distinguish between the problems of programming for the Cell and programming for the general multi-core case. That's the irony, really - you accuse me of ignoring almost unrelated flaws in my arguments while ignoring the gaping holes I'm opening in yours.

As for Abrash's article, let me quote it for you:
Because initial configurations are designed for use as GPUs

Larrabee is a GPU. Follow-on processors derived from it might not be, and when that happens, it'll be intriguing. But right now, it's not, and "calling me out" for understanding this simple fact is silly. The fact that you've moved from attacking my arguments to attacking me is encouraging, really: it means you don't really have good technical responses to them.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Again, where are your supposed credentials? Or am I supposed to be at a _disadvantage_ because I have them?

Here is what you don't seem to be getting: I don't believe you have any credentials at all. That is nothing against you in particular; people who come into discussion threads and try to post resumes are almost always full of it.

I think option 3 is a joke, and could only be taken seriously by a fanboy. Do you think they're deliberately announcing release windows and then blowing them? If Sony had wanted to do what you're describing, they would have just set the appropriate release window. This isn't just a problem with GT5 or GoW 3, it's been a consistent problem throughout the system's lifetime.

Clearly you never took any marketing classes. Do you try to drive demand for your product at a higher price point, or wait to drive demand after you drop your price point? Obviously there is only one sensible approach. This isn't a Sony thing; this is a straightforward business decision. What are the specs on AMD's or nVidia's upcoming GPUs? We don't know; in fact, we know a lot more about Larrabee than we do the much closer parts coming from ATi or nVidia. Why would that possibly be? Perhaps they are a bit better at handling marketing than what you seem to indicate a company should be. Intel has nothing on the market currently to compete against Larrabee, so their best approach is to try to devalue the existing parts as much as possible against their upcoming offering; ATi and nVidia both rely heavily on revenue from their existing offerings, so they don't want to devalue them in the face of a launch that is right around the corner. By giving the illusion that you are going to have more support for a product at a higher price point, you may help drive additional sales, resulting in increased revenue. This isn't a difficult concept, and it certainly isn't a technique only Sony uses.

Your examples are generally wrong and cannot distinguish between the problems of programming for the Cell and programming for the general multi-core case.

The only issue you have brought up is manually handling on-die cache for the SPEs. I stated from the beginning that it was more time consuming but that it would help avoid stalls; you implied you wouldn't get stalls because it's a game; I provided examples of where you could. That is the only area you have touched outside of the abstract. I have further inquired how the SPEs are harder to deal with than the VUs in the EE, to which you have given no response.

That's the irony, really - you accuse me of ignoring almost unrelated flaws in my arguments while ignoring the gaping holes I'm opening in yours.

What gaping holes, precisely? You keep harping on 'Cell is harder'; no one has said it isn't at any point. The point I have made is that Cell offers higher peak performance than the comparable, easier-to-develop-for solution. This is much as it was with the EE versus either the PPC chip in the GameCube or the Celery/P3 hybrid in the original Xbox (although the graphics disparity was much larger in those systems, as were the RAM limitations).

Because initial configurations are designed for use as GPUs

And you come to the conclusion that-

Larrabee is a GPU.

I have to go back to asking what your native language is. The first round of Larrabee-based products we are going to see are designed for use as GPUs; no one has argued that. I think I explicitly mentioned the fact that they were shipping with texture sampling hardware along with the modified P54C cores. That doesn't mean that they are GPUs by any definition we have been using to date. Larrabee is forced into complex emulation to handle rasterization. I can see you looked up the article I was talking about, so you must realize how much of a stretch it would be to reasonably call Larrabee a GPU instead of what it is: a vector processor with some rasterizer hardware thrown on that can be used as a GPU via emulation.
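To make "rasterization as code" concrete, here is the textbook edge-function rasterizer in plain C. Larrabee would run something like this in software, tiled across 16-wide vector lanes, where a conventional GPU has fixed-function hardware for it. This is a simplified illustration of the technique, not Intel's actual pipeline:

```c
/* edge function: positive when p lies to the left of directed edge a->b */
static float edge(float ax, float ay, float bx, float by,
                  float px, float py) {
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

/* Rasterize one counter-clockwise triangle into an 8x8 coverage grid by
   testing each pixel center against all three edges. Returns the number
   of covered pixels. */
int rasterize(float x0, float y0, float x1, float y1,
              float x2, float y2, unsigned char cov[8][8]) {
    int count = 0;
    for (int y = 0; y < 8; ++y) {
        for (int x = 0; x < 8; ++x) {
            float px = x + 0.5f, py = y + 0.5f;   /* sample at pixel center */
            int inside =
                edge(x0, y0, x1, y1, px, py) >= 0.0f &&
                edge(x1, y1, x2, y2, px, py) >= 0.0f &&
                edge(x2, y2, x0, y0, px, py) >= 0.0f;
            cov[y][x] = (unsigned char)inside;
            count += inside;
        }
    }
    return count;
}
```

On a conventional GPU none of this is visible to the programmer; on Larrabee it is ordinary code competing for the same cores as everything else, which is the whole point of the abstraction-layer argument.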
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: BenSkywalker
Here is what you don't seem to be getting, I don't believe you have any credentials at all. That is nothing against you in particular, people who come into discussion threads and try to post resumes are almost always full of it.
I can provide a paper published at a major aerospace conference relating to the topic I was discussing if you want it. PM me for a link, I think the abstract is on Google. If you don't PM me, at least have the common decency to stop calling me a liar.

Clearly you never took any marketing classes. Do you try to drive demand for your product at a higher price point, or wait to drive demand after you drop your price point?
In fact, I have taken a marketing class as part of getting an MBA, but that's beside the point, really. The answer to your question is "it depends". I see why you'd answer it a certain way, but neither of us has access to the market and financial data that would be used to make that decision. It could also be that Sony would prefer to sell more now, but their software development schedule is slipping. Who knows? I've got my theory, anyways.

It is _bad management_ on Sony's part to not have a software development schedule that they 1) meet and 2) actually take into account when new hardware is coming out. So, it's more like you're going for option 1 in my opinion. That's fine.

The only issue you have brought up is manually handling on-die cache for the SPEs. I stated from the beginning that it was more time consuming but that it would help avoid stalls; you implied you wouldn't get stalls because it's a game; I provided examples of where you could. That is the only area you have touched outside of the abstract. I have further inquired how the SPEs are harder to deal with than the VUs in the EE, to which you have given no response.
Two things:
1. I actually think that the thread management bits of the Cell and associated libraries are just as annoying as the cache management aspects - and I did bring this up previously.
2. Whether the Cell is harder to program for than the PS2's EE is completely irrelevant to the discussion of the Cell's disadvantages versus traditional SMP architectures, which is what the competition is using.

What gaping holes, precisely? You keep harping on 'Cell is harder'; no one has said it isn't at any point. The point I have made is that Cell offers higher peak performance than the comparable, easier-to-develop-for solution. This is much as it was with the EE versus either the PPC chip in the GameCube or the Celery/P3 hybrid in the original Xbox (although the graphics disparity was much larger in those systems, as were the RAM limitations).
Well, it'd be the threading thing, which you totally ignored. But I think the point I'm trying to make is that it doesn't really seem to me like the higher performance of the Cell in a few situations is worth the massive increase in the time cost of programming on it. This is an opinion, but it is an opinion shared by a fair few developers.

I have to go back to asking what your native language is. The first round of Larrabee-based products we are going to see are designed for use as GPUs; no one has argued that. I think I explicitly mentioned the fact that they were shipping with texture sampling hardware along with the modified P54C cores. That doesn't mean that they are GPUs by any definition we have been using to date. Larrabee is forced into complex emulation to handle rasterization. I can see you looked up the article I was talking about, so you must realize how much of a stretch it would be to reasonably call Larrabee a GPU instead of what it is: a vector processor with some rasterizer hardware thrown on that can be used as a GPU via emulation.
I am using GPU in the sense of "it's a graphics chip, not a general-purpose CPU, and is not intended to be running an OS on its own, for the moment". My apologies for not having been more clear. We are essentially splitting hairs on this point, IMHO.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
If you don't PM me, at least have the common decency to stop calling me a liar.

I am not calling you a liar; I won't go that route, as I don't know you. It is why I mentioned that it is nothing against you in particular. Anyone who posts their resume on any forum, I auto-assume they are full of it. We have had posters in here who have claimed to work in many different fields without knowing some extremely basic elements.

I see why you'd answer it a certain way, but neither of us has access to the market and financial data that would be used to make that decision.

Market data for sales trends based on projected software for a given platform? If you say you don't have access to that data I'll take your word for it. As I have said previously, posting resumes mean nothing but I will stand by my projections.

1. I actually think that the thread management bits of the Cell and associated libraries are just as annoying as the cache management aspects - and I did bring this up previously.

Thread management is going to be considerably more difficult on any AMP (asymmetric multiprocessing) setup, and one with cores as different as Cell's is obviously going to compound that. In terms of library development, given how new Cell is in relation to the more conventional POWER-based cores, that is to be expected. Not saying it isn't a valid point, but it is one that improves the longer the platform is in use.
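To make the difference between the two programming models concrete, here is a deliberately simplified sketch. On an SMP machine you just walk main memory and let the cache hierarchy sort out data movement; on a Cell SPE the programmer has to stage data into the small local store explicitly (real SPE code issues MFC DMA transfers via IBM's SDK, not copies like these - all names and sizes below are made up for illustration):

```python
# Sketch of the explicit-staging pattern an SPE-style core forces on you.
# Hypothetical stand-in: a tiny fixed-size buffer plays the role of the
# SPE's local store, and list slicing stands in for DMA transfers.

LOCAL_STORE_SLOTS = 16  # pretend the "local store" only fits 16 elements

def smp_style(data):
    # SMP/cached style: just walk main memory; hardware caches do the rest.
    return [x * 2 for x in data]

def spe_style(data):
    # AMP/local-store style: explicitly pull a chunk in, process it,
    # push the results back out, then fetch the next chunk.
    out = []
    for base in range(0, len(data), LOCAL_STORE_SLOTS):
        local = data[base:base + LOCAL_STORE_SLOTS]   # "DMA in"
        local = [x * 2 for x in local]                # compute on local data
        out.extend(local)                             # "DMA out"
    return out

data = list(range(100))
assert smp_style(data) == spe_style(data)
```

Same result either way; the point is that in the second version all the data movement is the programmer's problem, and that bookkeeping (plus the double-buffering refinements of it) is exactly where the extra development time goes.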

2. Whether the Cell is harder to program for than the PS2's EE is completely irrelevant to the discussion of the Cell's disadvantages versus traditional SMP architectures, which is what the competition is using.

When talking about Cell as a business decision it is absolutely essential to consider how it relates to the EE. The EE was more difficult to handle than Cell at comparable points in its life cycle, and it utterly dominated the market, far more so than the Wii is doing this generation. We cannot discuss why Sony would decide upon the Cell without acknowledging that they had a ten-year history of dominating a multi-billion-dollar industry and stuck with their typical pattern for success.

But the point I'm trying to make is that the higher performance of the Cell in a few situations doesn't really seem, to me, worth the massive increase in the time cost of programming for it.

As time progresses the higher performance of Cell is displayed more frequently. When tool development and libraries have five or six years of evolution behind them, I think the situations where Cell is leveraged for superior results will increase by a considerable amount. Do I know that to be fact? No, but history would back that assertion, as we have seen it play out repeatedly. Less than a decade ago pretty much every developer was saying that single-core performance increases were the only way to go; now everyone is on the same page that it's multi-core by default. I think vector processors have a lot more potential than you seem to think they do; the development tools are something that simply need more work.

This is an opinion, but it is an opinion shared by a fair few developers.

I would be interested in what the consensus will be at the end of this generation. It seems more likely than not at this point that Cell is coming back next generation too; perhaps they will make some of the changes the developers have been talking about, perhaps they won't. I would imagine that will depend heavily on the level of consensus amongst their in-house development studios within the next year or two.

I am using GPU in the sense of "it's a graphics chip, not a general-purpose CPU, and is not intended to be running an OS on its own, for the moment". My apologies for not having been more clear. We are essentially splitting hairs on this point, IMHO.

I'm using it in the original sense: a rasterizer with dedicated hardware for handling graphics. And I agree that is a hair-splitting issue ;)
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
"The PS3 is like an onion"

When I saw this thread title, I assumed it meant that the PS3 smelled worse as you peeled back the layers...:p

It's pretty well known that the Cell was not a great choice... it wasn't good enough, so Sony had to add a GPU to make the system competitive. That's a fact.

The real payoff to this thread (and thousands like it) will be if Sony keeps the Cell in the next PS version (PS4?). If they kick it to the curb, you can pretty much infer that they want nothing more to do with it.
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: BenSkywalker
I would be interested in what the consensus will be at the end of this generation. It seems more likely than not at this point that Cell is coming back next generation too; perhaps they will make some of the changes the developers have been talking about, perhaps they won't. I would imagine that will depend heavily on the level of consensus amongst their in-house development studios within the next year or two.
Totally agree - the ultimate test will indeed be whether they reuse it for a theoretical PS4. I also think they will, but I also think that the PPE:SPE ratio will be changed somewhat based on the problems people found trying to fill those SPEs. Maybe more like 3:10? Just my opinion.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Totally agree - the ultimate test will indeed be whether they reuse it for a theoretical PS4. I also think they will, but I also think that the PPE:SPE ratio will be changed somewhat based on the problems people found trying to fill those SPEs. Maybe more like 3:10? Just my opinion.

I think even if developers were very pleased with the current setup late in the life cycle, it still may end up closer to that ratio, as I would imagine the next GPU used by Sony will be GT4xx/GT5xx class or comparable, so a good deal of what is currently well suited to the SPEs will be better handled by the GPU anyway (physics springs quickly to mind).
 

EvilComputer92

Golden Member
Aug 25, 2004
1,316
0
0

No, absolutely not. Depth of Field is far more intensive on the GPU and has a very noticeable effect where the foreground is sharp and the background is blurry, in sort of a quasi-3D effect.

Depth of field done right

Killzone 2 uses motion blur which is either full-screen or applied to specific moving objects. The problem with this is that it is used on even slow-moving objects, where it starts to look ridiculous. The slightest movement of your weapon results in blur being applied, almost like a cheap form of anti-aliasing.


What level of fluidity are you finding so impressive in the 2K5 trailer's animation? It really isn't that great (not that it's bad, but nothing to write home about). As far as the smoke, the trailer does look better, but then you have the counterbalance: when your craft lands there is no debris kicked up in the trailer, while there was in-game.

From the beginning of the trailer, the gun animations are unsurpassed by anything I've seen in any game. They move realistically without feeling mechanical the way they do in most games. The grenade launcher, the weapon reloading, and the general movement of the player throughout are much better than in the retail game.

This makes sense because the whole thing was prerendered, so of course all the animations were done superbly. The point is that this never translated to the full game.

You must have seen some very different Forza footage than I have. What I saw looked marginally better than Forza 2.

Really now? What footage are you looking at, or did you forget what Forza 2 looks like?

Forza 2
Forza 3
Forza 2
Forza 3

I would call that much more than marginally better.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: ExarKun333
"The PS3 is like an onion"

When I saw this thread title, I assumed it meant that the PS3 smelled worse as you peeled back the layers...:p

It's pretty well known that the Cell was not a great choice... it wasn't good enough, so Sony had to add a GPU to make the system competitive. That's a fact.

The real payoff to this thread (and thousands like it) will be if Sony keeps the Cell in the next PS version (PS4?). If they kick it to the curb, you can pretty much infer that they want nothing more to do with it.

It's a fact insofar as Sony always planned for the PS3 to have a GPU. The original GPU was just going to be akin to the PS2's (dumb rasterizer + general-purpose processor), but instead they went with an off-the-shelf chip.

Oh, and Cell is theoretically about twice as powerful as the Xbox 360 CPU, but it's also about twice the size: the Xbox CPU is dual-core sized, Cell is quad-core sized. However, Cell's architecture does seem to allow much higher real-world efficiency in many tasks (primarily graphics-related) than the Xbox 360 CPU. That might be hard to make good use of in a game, since a powerful GPU is already present, but the OpenGL programming model allows it to be done a bit more easily than DirectX does.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
No, absolutely not. Depth of Field is far more intensive on the GPU

Motion blur is more intensive than DoF.

The slightest movement of your weapon results in blur being applied, almost like a cheap form of anti-aliasing.

Are you talking about Quincunx? I'm not seeing what you are talking about unless you mean artifacts from Quincunx, which does have a type of blur filter applied. It isn't motion blur.

From the beginning of the trailer, the gun animations are unsurpassed by anything I've seen in any game. They move realistically without feeling mechanical the way they do in most games. The grenade launcher, the weapon reloading, and the general movement of the player throughout are much better than in the retail game.

Not seeing the big difference; the gun itself is very mechanical in both the trailer and the game, which it should be since, you know, it is mechanical. The trailer animations are reasonable in the IK displayed, but the retail game seemed a bit more polished there to me. Neither the game nor the trailer included cloth effects, and the trailer lacked proper animation on vehicle suspension and was missing numerous particle effects.

Really now? What footage are you looking at, or did you forget what Forza 2 looks like?

I was looking at actual in game footage from E3, not press release comparison shots. I put close to no faith in screenshots for numerous reasons.

It's a fact insofar as Sony always planned for the PS3 to have a GPU. The original GPU was just going to be akin to the PS2's (dumb rasterizer + general-purpose processor), but instead they went with an off-the-shelf chip.

The PS2 had a rasterizer. While Erwos and I may have been splitting hairs over Larrabee's classification, there is no way you can call the GS a GPU; it doesn't process anything, it just displays. The PS3's original design had a rasterizer, not a GPU.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
The PS2 had a rasterizer. While Erwos and I may have been splitting hairs over Larrabee's classification, there is no way you can call the GS a GPU; it doesn't process anything, it just displays. The PS3's original design had a rasterizer, not a GPU.

I'm including VU0 and VU1 as part of the graphics 'chip'. They're definitely not rasterizers; one is fixed-function IIRC, and the other is a very general-purpose chip.
 

Krakn3Dfx

Platinum Member
Sep 29, 2000
2,969
1
81
PS3's difficulty curve is nothing new for Sony. If you look at the lifespan of both the PS1 and PS2, the launch games vs. EOL titles are like night and day. Look at Fantavision vs. God of War 2. Look at Ace Combat 2 on the PSX vs. Ace Combat 3. Sony has made it pretty clear over and over that they're building systems that are meant to age like a fine wine, more so than other console makers IMO. The PS3 is likely no different.

There are always amazing looking games from Sony devs like Uncharted, Killzone 2, Motorstorm. I imagine in a few years, those games will look pretty primitive compared to what'll be out in 2011 or 2012.

The Xbox 360 was the same way too. I remember how people fawned over Perfect Dark Zero's graphics, and now that game just looks awful in comparison to Gears of War 2 or Halo 3.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I'm including VU0 and VU1 as part of the graphics 'chip'.

They were functional units on the EE; they weren't part of the GS at all. Using that definition would be akin to saying that a Voodoo1 paired with a Pentium III was a GPU because the P3 supported SSE.
 

EvilComputer92

Golden Member
Aug 25, 2004
1,316
0
0
Motion blur is more intensive than DoF.

No, it is not. Try disabling DoF in any game that uses it and enabling motion blur, like in Crysis. DoF has a huge performance impact due to the background and foreground layer calculations that are not present in a simple motion blur effect.

Are you talking about Quincunx? I'm not seeing what you are talking about unless you mean artifacts from Quincunx, which does have a type of blur filter applied. It isn't motion blur.

It may be partially from Quincunx, but it's very clearly motion blur that is being applied on purpose, not as a side effect.

Not seeing the big difference; the gun itself is very mechanical in both the trailer and the game, which it should be since, you know, it is mechanical. The trailer animations are reasonable in the IK displayed, but the retail game seemed a bit more polished there to me. Neither the game nor the trailer included cloth effects, and the trailer lacked proper animation on vehicle suspension and was missing numerous particle effects.

The human beings using the weapons are not mechanical, however, and that's what matters in the animations: movement and fluidity. It shouldn't look like a preset animation is being played; in the trailer it doesn't, while in the game it does.

I was looking at actual in game footage from E3, not press release comparison shots. I put close to no faith in screenshots for numerous reasons.

Those are not press-release comparison shots. I don't know why you would make such accusations unless you're blatantly biased. If I posted GT5 screenshots I doubt you would say the same.

I got them from this site, and the poster clearly says that he went into Forza 2 and took the shots to make them as close as possible to the Forza 3 ones. There are many on that site, and they prove that there is a large difference between the two games graphically.