Poor console CPU performance, claim game devs


BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
It's especially true for Playstation 2 developers because it's necessary to use assembly in some parts of the Playstation 2 development process.
Bolded for emphasis. I mean you don't really expect 500,000 lines of game logic to be written in assembler and spread across multiple threads do you? More than likely some of the rendering code is assembler and this seems to be supported by them talking about OpenGL.

Given both links are dead for me it's hard to say as I can't see the whole article so I have to rely on your selective quoting.

Creating self contained threads and working with assembly on consoles is normal right now
It was also normal in the dark days of PCs. Fortunately we've evolved from that thanks to advancements in CPU architecture and compiler technology, advancements you're now claiming should be dropped because it's not the way of the future.

Likewise producing reusable and maintainable code is probably not important on a console given it's a fixed closed system with no real concept of patches or mods.

So why not dump data warehouses for $300 emachines
I'm not the one using "Cell" interchangeably whenever it suits me.

How fast can your PC decode a dozen HD video streams at once?
Has anyone made the same effort to create a program that can take advantage of the PC in such a fashion as they did on the PS3?

How fast does Cell run a 32 player UT2004 or Battlefield 2 botmatch? How fast does Cell run Far Cry or Doom 3 at 1600x1200?

Cell had no issue pulling it off at the PS3's unveiling.
You mean like the Battlezone "in-game rendering" which we later found out was nothing more than a pre-rendered movie?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
If you know so much, then maybe you should track down your own inside sources, talk with a few people who have worked with the actual dev kits, get their information, and write a full rebuttal to Anand's article.

I've already been over it with Anand; let's just say he was also quite in the dark on console development.

Bolded for emphasis. I mean you don't really expect 500,000 lines of game logic to be written in assembler and spread across multiple threads do you? More than likely some of the rendering code is assembler and this seems to be supported by them talking about OpenGL.

Given both links are dead for me it's hard to say as I can't see the whole article so I have to rely on your selective quoting.

You aren't registered on Gamasutra? I thought you were big into following game development?

Register and read the article- they made their own subset of OpenGL that runs on the PS2(created in assembly)- I saved a lot of the best quotes for you to read yourself(some of the devs that post over there get a little testy if you quote too much of their articles). They had to make their own tools at the assembly level to get the performance they wanted- they also had to manage separate threads in their code, some of which were dedicated to running on VU1- this is nothing out of the ordinary for console development, nor has it ever been- it only seems that way to lazy PC devs.

Fortunately we've evolved from that thanks to advancements in CPU architecture and compiler technology, advancements you're now claiming should be dropped because it's not the way of the future.

When did I say it should be dropped? Devs are going to have to rework how they write code- and it is going to get harder for some time before progress is made to simplify it again. Compiler advancements have only been major on processor architectures dating back to the transistor stone age- the ones you seem to be claiming are the holy nirvana of all things tech based.

I'm not the one using "Cell" interchangeably whenever it suits me.

No, what you are doing is downplaying the processor while knowing nigh nothing about it. Why don't you check out the lower clocked versions- comparable in layout to what Sony is using- that they are going to be putting in high end applications. They are simply using more of them together with higher end support hardware.

Has anyone made the same effort to create a program that can take advantage of the PC in such a fashion as they did on the PS3?

They have tried coming up with one that handles decoding a single stream properly- that has proven problematic due to the incredible weakness of the x86 architecture- the one you claim is so incredible.

How fast does Cell run a 32 player UT2004 or Battlefield 2 botmatch?

High def decode was on the PC first; I'm not asking a jack @ss question like you need to. Ut2K4 is a seriously outdated title technology-wise- BF2 is coming to the PS3 although obviously they are increasing the visuals as the PC is simply too weak to output the level of graphics that are expected of a next gen console.

How fast does Cell run Far Cry or Doom 3 at 1600x1200?

Low res mode? I'm not sure- if they are ported to the PS3 I'm sure they will be running 1920x1080 4x by default(well, FC may use HDR and give up AA). No idea how well they would stack up running in low resolution.

You mean like the Battlezone "in-game rendering" which we later found out was nothing more than a pre-rendered movie?

Are you claiming they were in game? Because no one at the Sony presentation said anything remotely resembling that they were- I have it saved to my hard drive and they never so much as gave a vague hint that the game was running in real time- let alone the warped attempt at a 'lie' some whackos make up in their own mind.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Register and read the article- they made their own subset of OpenGL that runs on the PS2(created in assembly)
Well, that sort of backs up what I was saying.

When did I say it should be dropped? Devs are going to have to rework how they write code- and it is going to get harder for some time before progress is made to simplify it again.
I don't necessarily see multithreading as a requirement for future performance. Just because Intel and AMD have released dual core CPUs it doesn't mean clock frequencies won't get higher, RAM won't get faster, caches won't get fatter, etc.

I mean people have been proclaiming the doom of MHz for about 20 years and every time a new advancement enables higher frequencies. Organic and optic CPUs have hardly even been touched and there's massive potential in them.

No, what you are doing is downplaying the processor while knowing nigh nothing about it.
I'm downplaying the BS marketing surrounding these "PC killer" consoles. The point is you need to put a hell of a lot of effort into getting reasonable performance and even then they're no PC killers by any means.

They have tried coming up with one that handles decoding a single stream properly- that has proven problematic due to the incredible weakness of the x86 architecture- the one you claim is so incredible.
All they need to do is to start using the 16 rendering pipelines sitting dormant in the GPU. It's only a matter of time before they start exploiting them in such a fashion, especially with Longhorn's unified WGF system coming up.

High def decode was on the PC first,
And?

Ut2K4 is a seriously outdated title technology-wise
Perhaps; but those consoles would run it rather poorly.

although obviously they are increasing the visuals as the PC is simply too weak to output the level of graphics that are expected of a next gen console.
Are we talking about Cell or the GPU? If the GPU, let's wait until the consoles are actually released before looking at the PC GPUs available, okay?

Low res mode?
1600x1200 is low res? So what do you call 1080i and 720p?

if they are ported to the PS3 I'm sure they will be running 1920x1080 4x by default
Perhaps at a grand 30 FPS, just like your glorious X-Box and PS2 currently run games at.

Are you claiming they were in game?
No. Are you?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Well, that sort of backs up what I was saying.

Before they started writing any code they had a team spend months studying the available information on the architecture and learning assembly, and then they took the time to write their own tools in assembly so that they could code portions of the graphics engine in a higher level language. That same approach will work with the new consoles also.

I don't necessarily see multithreading as a requirement for future performance. Just because Intel and AMD have released dual core CPUs it doesn't mean clock frequencies won't get higher, RAM won't get faster, caches won't get fatter, etc.

The titles that fail to use multithreading will fall behind those that do by an increasingly large amount until they end up several generations apart.

I mean people have been proclaiming the doom of MHz for about 20 years and every time a new advancement enables higher frequencies. Organic and optic CPUs have hardly even been touched and there's massive potential in them.

The difference is that we have always had the fabrication processes to overcome the predicted clock frequency troubles before we ran into them. This time around, we ran into unexpected clock frequency challenges long before we have a viable way of mass producing higher frequency chips. Don't get me wrong, I certainly think there is a lot of room left for single processor performance- several orders of magnitude over the next couple of decades- but that doesn't change the benefit of compounding it with multicore.

I'm downplaying the BS marketing surrounding these "PC killer" consoles.

Of course they are going to kill the PC when they launch in terms of titles available for both platforms(XB360/PS3 v PC)- consoles always do. PCs have to deal with LCD (lowest common denominator) hardware and they have to deal with poorly written generalized code.

The point is you need to put a hell of a lot of effort into getting reasonable performance and even then they're no PC killers by any means.

How close to the consoles' peak performance do you think they will get? If they manage to hit 10% it will be beyond the theoretical limits of today's fastest desktop processors.
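
To put rough numbers on that- a back-of-envelope sketch, using the commonly quoted theoretical peak figures (assumed numbers, not measurements):

    // Back-of-envelope peak-FLOPS comparison. Both figures are the
    // oft-quoted theoretical peaks from 2005-era coverage, not measured.
    #include <cstdio>

    int main() {
        // Cell @ 3.2GHz: 8 SPEs x ~8 single-precision flops per cycle each
        const double cell_peak = 3.2 * 8 * 8;   // ~204.8 GFLOPS
        // Athlon 64 FX @ 2.6GHz: ~4 single-precision flops per cycle via SSE
        const double a64_peak  = 2.6 * 4;       // ~10.4 GFLOPS
        std::printf("10%% of Cell peak: %.1f GFLOPS vs desktop peak: %.1f GFLOPS\n",
                    cell_peak * 0.1, a64_peak);
    }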

All they need to do is to start using the 16 rendering pipelines sitting dormant in the GPU.

Cell was handling all of the computation for the HD demo they did- I'm sure if they had added GPU decode they could have increased their numbers considerably. I was pointing that out as an example of what they have already done with these 'weak' console CPUs that PCs can't come close to. It can be done- and it has been done- even if it is a lot more difficult to get up and running than it would be on the PC.

Perhaps; but those consoles would run it rather poorly.

Why do you think that?

Are we talking about Cell or the GPU? If the GPU, let's wait until the consoles are actually released before looking at the PC GPUs available, okay?

Almost everything. Cell has already demonstrated that it is capable of blowing away desktop CPUs in certain elements where it is likely to be used(physics calculations as an example) and the consoles have significantly more system level bandwidth than a PC; they are also both packing more powerful GPUs than anything available for the PC(that may change, the rest won't). You also have LCD to deal with. Even if the highest end PC is capable in a theoretical sense of handling everything in the next gen games, the PC won't be able to see it due to LCD for quite some time. PCs do have a RAM advantage, but that isn't as much of a concern on the consoles as you can stream from storage as needed(and unlike PCs you can figure out exactly when it is needed)- this is already done today on current consoles, btw.

1600x1200 is low res? So what do you call 1080i and 720p?

1080i is 1920x1080.

Perhaps at a grand 30 FPS, just like your glorious X-Box and PS2 currently run games at.

Why is it that you think that a chip faster than the 7800GTX or X850XTPE would all of a sudden have problems with titles as simplistic as FarCry or Doom3? UE3 is already up and running on the next gen consoles without problem. As far as framerates go- it greatly depends on what titles you are playing. GT4 runs at a nigh constant 60FPS all the time, while Halo is a whole lot slower. A lot like current PC games.

No. Are you?

Well who did? It wasn't Sony. I think that a title comparable to what they showed may be possible on the hardware at some point in the future- but then again I think the same about PCs when they have a little bit more processing power or the PPUs start shipping.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
1600*1200 = 1920000
1920*1080 = 2073600

Oh wow, look how much more resolution there is! And the xbox, it will be pushing 720P? 1280*720 = 921600. Significantly less than 1600*1200.

The reason that the consoles will have problems running those *simplistic* games is lack of memory. PC games already *use* 1 gig - BF2 is taking 2 for smooth gameplay. Then again, as a console gamer you must be used to having 6 textures in a whole game, now you will get how many high-res ones? lol, not too many.

Time for lunch, more rebuttal to come later :)

Nat
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
If Cell is so good in Physics calculations why is there a thing called the Ageia PPU coming out soon which can do, well, 20 to 30 THOUSAND physics calculations on the fly?

Cell is good at specific uses; a conventional CPU can do everything well.

So Cell can play a heavily physics laden game while decoding or encoding, or maybe even burning a DVD?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The reason that the consoles will have problems running those *simplistic* games is lack of memory.

If you look at the Nurburgring track from GT4 it has more geometry data for that single track than the PS2 has system RAM- and this completely ignores the cars, textures, and code that needs to be run. Because consoles are a fixed platform they are able to stream that data off the disk on the fly and load it into RAM in sections- a level of optimization not possible on PCs. In order to have the same track on a PC you would need ~384MB of RAM minimum for smooth operation(between OS overhead, sloppy PC code and loading all of the track and the rest of the relevant game data).
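
The streaming itself is conceptually simple- something like this sketch (hypothetical file layout and names, nothing from a real PS2 codebase):

    // Section-based streaming: only the chunks of track data near the
    // player stay resident in RAM; the rest is evicted and re-read from
    // disc as the player advances.
    #include <cstdio>
    #include <cstdlib>
    #include <map>
    #include <vector>

    struct Chunk { std::vector<char> data; };

    std::map<int, Chunk> resident;             // chunks currently in RAM
    const long kChunkBytes = 2 * 1024 * 1024;  // fixed-size sections on disc

    void stream_around(int playerChunk, std::FILE* disc) {
        // Evict chunks outside a +/-1 window around the player.
        for (auto it = resident.begin(); it != resident.end();) {
            if (std::abs(it->first - playerChunk) > 1) it = resident.erase(it);
            else ++it;
        }
        // Load any chunk in the window that isn't resident yet.
        for (int c = playerChunk - 1; c <= playerChunk + 1; ++c) {
            if (c < 0 || resident.count(c)) continue;
            Chunk chunk;
            chunk.data.resize(kChunkBytes);
            std::fseek(disc, c * kChunkBytes, SEEK_SET);
            size_t got = std::fread(chunk.data.data(), 1, kChunkBytes, disc);
            (void)got;                         // sketch: ignore short reads
            resident[c] = std::move(chunk);
        }
    }

    int main() {
        if (std::FILE* disc = std::fopen("track.dat", "rb")) {  // hypothetical disc image
            stream_around(0, disc);
            std::fclose(disc);
        }
    }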

PC games already *use* 1 gig

Shocking how poor the coding is, isn't it? Even Doom3, which runs on the XBox at settings comparable to the low quality mode on the PC, is using less than one sixth of the system level RAM that the PC version uses- and that's Carmack's code. PCs are hopelessly wasteful with resources when going toe to toe with the consoles.

Then again, as a console gamer

I'm not a console gamer, I'm a gamer.

If Cell is so good in Physics calculations why is there a thing called the Ageia PPU coming out soon which can do well 20 to 30 THOUSAND physics calculations on the fly?

Because Cell would be far too costly and far too bandwidth limited to be used as an add in board. The Ageia PPU is simply a much weaker version of what Cell's vector processors handle. Adding a couple hundred MBs of ~25GB/sec RAM and the power requirements of the processor itself would make Cell far too costly- the cost needs to be kept low to gain market acceptance.

So Cell can play a heavily physics laden game while decoding or encoding, or maybe even burning a DVD?

Actually the chip is extremely well suited for doing exactly that(not that it will be done). Use an SPE each for decoding and ripping and the other five for physics- it would still be more than a match for Ageia's setup running physics code alone.
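
In thread form the split looks something like this- std::thread standing in for SPEs, every name made up for illustration (real Cell code uploads a separate program to each SPE):

    // The division of labor described above, sketched with plain threads.
    #include <thread>
    #include <vector>

    void decode_stream()        { /* one SPE-equivalent: video decode */ }
    void burn_disc()            { /* one SPE-equivalent: DVD burning  */ }
    void physics_island(int id) { /* five SPE-equivalents: physics */ (void)id; }

    int main() {
        std::vector<std::thread> spes;
        spes.emplace_back(decode_stream);                 // 1 SPE
        spes.emplace_back(burn_disc);                     // 1 SPE
        for (int i = 0; i < 5; ++i)
            spes.emplace_back(physics_island, i);         // 5 SPEs
        for (auto& t : spes) t.join();
    }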
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
Um, the texture load between Xbox D3 and the PC version is a lot different as far as my eyes tell me. Besides that, LQ 640*480 = sh!t, in my opinion. Graphics were the only thing that game even had going for it from my standpoint.

The streaming feature is nice, but not all games can really do that. A racing game is nice and linear, whereas games like Far Cry or BF2 are not. It is a boon that the Xbox is getting a faster drive, and the PS3 could sport some really awesome data bandwidth, but optical media is still much slower than a conventional hard drive, not to mention the speed of the hard drives these consoles will be using.

Wasteful? Doubt it. If consoles can put up graphics as good as this year's PC games will look, that will be impressive in my eyes. But a couple years from now... not so much. I guess the same can be said of the last couple console generations though, so this should be no surprise...

The reason I have a console is that not all the games are available on the computer, and there is something about playing MP or Kart with a bunch of people on the same big screen that appeals to me... meeting time, sorry for the OT...

Nat
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: HDTVMan
Originally posted by: Avalon
Originally posted by: HDTVMan
How can you argue with these graphics.

XBOX 360 Capture

I'm sold. Microsoft gets my money next round for sure.

The thread needed a little humor.
How can wars be held on what is not available to make judgement?

Wait---See---Decide

Then Game On.

No, I definitely agree. We needed some humor. I thought that was really funny.
:)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Um, the texture load between Xbox D3 and the PC version is a lot different as far as my eyes tell me. Besides that, LQ 640*480 = sh!t, in my opinion. Graphics were the only thing that game even had going for it from my standpoint.

The low quality settings on the PC are directly comparable to the XBox port. Graphics being the only thing the game had going for it makes it a good comparison to PC titles, as that is pretty much the entirety of the argument as to why PCs are supposedly better than consoles.

The streaming feature is nice, but not all games can really do that. A racing game is nice and linear, whereas games like Far Cry or BF2 are not. It is a boon that the Xbox is getting a faster drive, and the PS3 could sport some really awesome data bandwidth, but optical media is still much slower than a conventional hard drive, not to mention the speed of the hard drives these consoles will be using.

You don't stream data for a shooter the same way you would in a racer- set the level up in a grid and use a geometric LOD to simplify the level data, then use a staged LOD uptake based on the area of the grid you are in. Remember that consoles can code this tightly because they are a fixed platform.
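
Sketched out, the staging is about this simple (a hypothetical sketch of the scheme, not code from any shipping engine):

    // Staged LOD uptake over a level grid: the player's cell gets full
    // detail, the surrounding ring gets medium, everything further stays
    // on the coarse mesh.
    #include <algorithm>
    #include <cstdlib>

    enum Lod { COARSE, MEDIUM, FULL };

    // Ring distance (Chebyshev) from the player's cell picks the stage.
    Lod lod_for_cell(int cx, int cy, int px, int py) {
        int ring = std::max(std::abs(cx - px), std::abs(cy - py));
        if (ring == 0) return FULL;
        if (ring == 1) return MEDIUM;
        return COARSE;
    }

    int main() { return lod_for_cell(3, 4, 3, 3) == MEDIUM ? 0 : 1; }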

Wasteful? Doubt it. If consoles can put up graphics as good as this year's PC games will look, that will be impressive in my eyes.

That would be a horrific failure on the part of the console industry the likes of which they have yet to suffer. This year's console titles look far beyond what is hitting the PC, by a long shot too.

The reason I have a console is that not all the games are available on the computer

Shouldn't that read hardly any games? Outside of FPSs and RTSs PCs have next to nothing to offer a gamer.

and there is something about playing MP or Kart with a bunch of people on the same big screen that appeals to me...

And by your comments I assume that you realize that this has not and likely will not be emulated by PCs now or any time soon.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
Originally posted by: Avalon
Originally posted by: HDTVMan
Originally posted by: Avalon
Originally posted by: HDTVMan
How can you argue with these graphics.

XBOX 360 Capture

I'm sold. Microsoft gets my money next round for sure.

The thread needed a little humor.
How can wars be held on what is not available to make judgement?

Wait---See---Decide

Then Game On.

No, I definitely agree. We needed some humor. I thought that was really funny.
:)

LOL, definitely, this thread got so serious I got really tired of even coming back to read it - I mean, what point is there to it? It comes down to the games that you want to play, and whether or not this gen of consoles will be powerful enough. We evidently have strong, and somewhat differing ;), opinions, but we don't honestly know squat now and will probably only really have our answers a year from now.

The only thing for sure for me at this point is that I will be buying neither MS's nor Sony's consoles, probably ever :p

Nat
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
Originally posted by: BenSkywalker
Shouldn't that read hardly any games? Outside of FPSs and RTSs PCs have next to nothing to offer a gamer.
wow what a blanket statement. I bet you couldn't count the number of PC games released since the PS2 was released. Some are great and some are crap, just like console games.

speaking of adventure games, now I remember the good ones I've played over the years like the Kings Quest series, Gabriel Knight 2, Leisure Suit Larry 7, Blade Runner, Sanitarium. These are not like the platform action adventures found on consoles. There haven't been many adventure games like these today though; the only good ones I know are Broken Sword 3 and Pirates. LSL 8 is a huge disappointment as Sierra tried to design it for consoles. As for racing games, the realism in FIA-GTR puts all GT games to shame.

anyway this thread has become a bit tasteless. I think these kinds of console vs pc threads were all over the place when the PS2 was introduced. There's no end to arguing over some not-yet-released console hardware vs current pc hardware. If the games and graphics of these future consoles are really that good, and the price of high end video cards keeps skyrocketing, I might start playing more console games, but... until then.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
The titles that fail to use multithreading will fall behind those that do by an increasingly large amount until they end up several generations apart.
I don't believe so. In fact I think its impact will be similar to the likes of SSE, which is near zero in terms of gaming. Sweeney seems to think the biggest benefit of dual-core will be masking the overhead of the DirectX/OpenGL API, and all games should get that due to OS and GPU driver enhancements automatically giving it to them for free.
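
Conceptually it's just a command queue drained by a second core- a minimal sketch of the idea, not how any actual driver implements it:

    // The "free" dual-core win: the game thread queues draw commands
    // while a second thread absorbs the API overhead.
    #include <condition_variable>
    #include <functional>
    #include <mutex>
    #include <queue>
    #include <thread>

    std::queue<std::function<void()>> commands;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;

    void render_thread() {                      // core 2: issues the API calls
        std::unique_lock<std::mutex> lk(m);
        while (!done || !commands.empty()) {
            cv.wait(lk, [] { return done || !commands.empty(); });
            while (!commands.empty()) {
                auto cmd = std::move(commands.front());
                commands.pop();
                lk.unlock();
                cmd();                          // the expensive driver work
                lk.lock();
            }
        }
    }

    void submit(std::function<void()> cmd) {    // core 1: game logic stays busy
        { std::lock_guard<std::mutex> lk(m); commands.push(std::move(cmd)); }
        cv.notify_one();
    }

    int main() {
        std::thread renderer(render_thread);
        for (int i = 0; i < 3; ++i) submit([] { /* pretend draw call */ });
        { std::lock_guard<std::mutex> lk(m); done = true; }
        cv.notify_one();
        renderer.join();
    }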

This time around, we ran into unexpected clock frequency challenges long before we have a viable way of mass producing higher frequency chips.
Intel did but that's because they were pushing MHz too hard. I mean the P3 1.13 GHz had to be recalled due to the chip being outside of the limits of current manufacturing at that time. And here we are today, almost at 4 GHz.

This is nothing more than a minor speed barrier which will soon be solved. In order for dual-core and MT to be mandatory on the PC, MHz ramping would have to stop completely along with all other platform advancements (essentially turning PCs into fixed systems like consoles), but I don't see that happening anytime soon. Certainly not in the next 10-20 years anyway.

If they manage to hit 10% it will be beyond the theoretical limits of today's fastest desktop processors
I don't believe this at all. Sure there's sloppy code out there but there's also masterfully tuned code which the likes of Carmack, Sweeney and Croteam give us. AMD's A64/FX processors in particular are astonishingly good at running all types of code very fast.

and the consoles have significantly more system level bandwidth than a PC,
Dual channel 667 memory is already available on PCs. Besides, that ignores the fact that the consoles' RAM is shared as VRAM and that is currently slower than the likes of a 6800U or X800 PE, much less a 7800GTX doing load-splitting across SLI to essentially double its bandwidth.
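
For reference, the theoretical peak dual-channel DDR2-667 works out to, assuming standard 64-bit channels (plain arithmetic, not a measured figure):

    // Peak theoretical bandwidth of dual-channel DDR2-667:
    // 667 million transfers/sec x 8 bytes per transfer x 2 channels.
    #include <cstdio>

    int main() {
        const double transfers = 667e6;   // per channel
        const double width     = 8.0;     // bytes per transfer (64-bit channel)
        const double channels  = 2.0;
        std::printf("~%.1f GB/s peak\n", transfers * width * channels / 1e9);  // ~10.7
    }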

they are also both packing more powerful GPUs than anything available for the PC(that may change, the rest won't).
I'd say a pair of 7800GTX GPUs in SLI will give any of those consoles a run for their money and we can already get those, unlike the consoles. And if ATi's Fudo is as good as they say it is, slapping two of those in Crossfire will be an absolute monster.

1080i is 1920x1080.
No, it's 1920x540.

while Halo is a whole lot slower. A lot like current PC games.
You'd struggle to find any PC that runs Halo as slow as the original X-Box runs it, and that's even with the sh*t porting job Gearbox did with the shaders.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Sweeney seems to think the biggest benefit of dual-core will be masking the overhead of the DirectX/OpenGL API, and all games should get that due to OS and GPU driver enhancements automatically giving it to them for free.

Sweeney is all set to start pushing PPUs- he is contradictory in his claims.

Intel did but that's because they were pushing MHz too hard. I mean the P3 1.13 GHz had to be recalled due to the chip being outside of the limits of current manufacturing at that time. And here we are today, almost at 4 GHz.

And where did the last process refinement get us? Barely a slight bump. These are more serious issues than any we have seen before in terms of hitting the limits, by far.

In order for dual-core and MT to be mandatory on the PC, MHz ramping would have to stop

Not at all. If Intel manages to push out a 6GHz P5 while AMD is running quad core 4GHz A64 x4s, who do you think is going to be winning the benches? That would assume that single core processors have a future and that is looking like a very long shot ATM. Even if they did, in theory, have wonderful potential- who is going to be making them? Intel and AMD are moving everything to multicore now.

I don't believe this at all. Sure there's sloppy code out there but there's also masterfully tuned code which the likes of Carmack, Sweeney and Croteam give us. AMD's A64/FX processors in particular are astonishingly good at running all types of code very fast.

If you take absolutely perfectly optimized PC code running in the fastest possible manner at the absolute peak level of performance on an x86 processor and compare that to Cell operating at 10% efficiency, Cell is faster if we are talking about any of the computations its SPEs can handle. It does not matter how highly tuned the code is- strictly talking about what the chips can do per clock- that isn't much for x86.

Dual channel 667 memory is already available on PCs.

PCs have almost 20% of the bandwidth consoles have.

Besides, that ignores the fact that the consoles' RAM is shared as VRAM

PS3 has 256MB assigned to the G70

slower than the likes of a 6800U or X800 PE, much less a 7800GTX doing load-splitting across SLI to essentially double its bandwidth

Which is significantly slower than the eDRAM used for the R500.

I'd say a pair of 7800GTX GPUs in SLI will give any of those consoles a run for their money

I wouldn't count on it. Look at what the souped up Voodoo1 is doing in the PS2. Let me frame that- I think that 7800GTXs in SLI COULD outgun the upcoming consoles' GPUs IF they weren't in a PC.

No, it's 1920x540.

Take your choice- it's 1920x1080@30FPS or 1920x540@60FPS and the PS3 supports 1080p x2.

You'd struggle to find any PC that runs Halo as slow as the original X-Box runs it

You haven't seen it running on a P3 733 with 128MB of RAM and a GF4, I take it. The XBox obliterates that setup with ease.

and that's even with the sh*t porting job Gearbox did with the shaders.

The shaders are scaling in an almost perfect linear fashion with increased shader power of newer hardware- that doesn't tend to indicate poor coding(that would be something like the original Unreal engine, which runs much slower than Quake3 on modern hardware).
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
wow what a blanket statement. I bet you couldn't count the number of PC games released since the PS2 was released.

If they aren't any good, they don't offer anything for a gamer.

speaking of adventure games, now I remember the good ones I've played over the years like the Kings Quest series, Gabriel Knight 2, Leisure Suit Larry 7, Blade Runner, Sanitarium.

Which is why I brought up Killer7- it is more along the lines of those titles than other, more action oriented adventure games.

As for racing games, the realism in FIA-GTR puts all GT games to shame.

That a joke? I figured I should check it out since everyone was talking so highly about it and was shocked to find an extremely poorly executed second tier title. First up on the sim element- every car I checked out handles like a mid/rear engine offering- it's as if the devs are Porsche fiends and expect everything to react the same. The visuals are significantly below what is currently on the consoles and MoTec doesn't work nearly as well as the data collected in Forza(when it works it's decent- but then sometimes it won't work at all).

A ~700bhp GTR car can't start off in third gear.....why? In limited traction situations it is sometimes handy to use less leveraged gearing to help keep traction in check- not an option with this 'realistic' sim. Also- the number of cars is extremely limited in comparison to the titles I was speaking of, as is the track selection and the types of races available(although it makes no qualms about being a GT-R sub class simulator).

I guess the biggest shock to me was how poor the visuals were though. It's not like you need to have killer visuals to have a killer game but this looks like something from the Voodoo3 days- extremely dated.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Not at all. If Intel manages to push out a 6GHz P5 while AMD is running quad core 4GHz A64 x4s, who do you think is going to be winning the benches?
That depends entirely on the architecture of the P5.

PCs have almost 20% of the bandwidth consoles have.
Closer to 30%. Besides, even now PCs have more GPU bandwidth and SLI increases that even further. And how are those consoles' caches?

PS3 has 256MB assigned to the G70
Which has less bandwidth than a single 7800GTX, much less two of them.

Take your choice- it's 1920x1080@30FPS or 1920x540@60FPS and the PS3 supports 1080p x2.
Are you honestly telling me there's a choice there? 30 FPS is a choice? Okay then, I'd say a 7800GTX could run Far Cry or Doom 3 at 1920x1440 @ 30 FPS quite easily.

The XBox obliterates that setup with ease.
You'd need a pretty slow PC to get 640x480 @ 30 FPS from that game and I wouldn't expect its specs would need to be much better than the X-Box. Besides, SM 2.0 increases the quality over the X-Box version.

The shaders are scaling in an almost perfect linear fashion with increased shader power of newer hardware
That really doesn't mean much. Besides, we already have Gearbox on record stating the shaders suck.

There's a GOTY version of the game around that has much faster performance, but because the shaders are tied to the maps (utterly hideous programming design if you ask me) they can't just make a simple patch to enable them; you can only get the faster performance if you download the new maps. From user accounts online the performance gain is enough to increase the resolution two notches or so and still get better performance than before.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
That depends entirely on the architecture of the P5.

It would have to be an insanely powerful architecture to come close to quad core A64s.

Closer to 30%.

Depends which system you are talking about- the PS3 has 25.6GB/sec bandwidth compared to the sub 5GB/sec of the fastest PC.

Besides, even now PCs have more GPU bandwidth and SLI increases that even further.

If we ignore the 256GB/sec data rate for the eDRAM on the XBox360.

And how are those consoles' caches?

In terms of size? Not as big as the PC's, but with the lethargic level of system bandwidth on PCs they really need a lot more cache.

Which has less bandwidth than a single 7800GTX, much less two of them.

It does, and if bandwidth becomes a major issue then that will be an area where the XB360 will obliterate it(along with SLI'd 8800Ultras/GTX or whatever nV is calling them).

Are you honestly telling me there's a choice there? 30 FPS is a choice?

I'm saying it depends on the terminology you use.

Okay then, I'd say a 7800GTX could run Far Cry or Doom 3 at 1920x1440 @ 30 FPS quite easily.

I would hope so; they are quite archaic compared to what the upcoming consoles will be pushing. Those are games that run on the XBox1.

You'd need a pretty slow PC to get 640x480 @ 30 FPS from that game and I wouldn't expect its specs would need to be much better than the X-Box. Besides, SM 2.0 increases the quality over the X-Box version.

SM 2.0 parts decrease the quality of numerous effects in Halo; I force my rig into 1.1 mode as the downgrade in certain effects isn't worth the trade-off. As for performance, Anand posted scores @1024x768 of just under 35FPS using a 9600Pro in a P4 2.8GHz system with 512MB of RAM. Let's cut the processor clock speed by 75% and reduce the amount of RAM by 70% and see how it does. Link.

There's a GOTY version of the game around that has much faster performance, but because the shaders are tied to the maps (utterly hideous programming design if you ask me)

It makes sense to code them that way when you are looking at a fixed platform. Shared memory for vid/system allows you to load them once and avoid moving data around too much. You look at where your bottlenecks are and deal with them.

From user accounts online the performance gain is enough to increase the resolution two notches or so and still get better performance than before.

They went with simpler shaders- no surprise it increases performance. The same was done for HL2 too.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Ben skywalker,

You have totally ignored my post. You're basing all your arguments on speculation. You have not worked with the cell. You do not know how it operates. Find someone who has worked with the cell and knows how it operates, come back here and then post your arguments. Talking it over with Anand is not doing your own research.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Find someone who has worked with the cell and knows how it operates, come back here and then post your arguments.

Already have- where do you think my arguments come from? I was talking to people working on Cell based titles some time prior to Anand posting anything on the new consoles. I have talked to them about the development environment and how impressed they are with how much easier it is to deal with Cell than it was with the EE pre-launch.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
If we ignore the 256GB/sec data rate for the eDRAM on the XBox360.
All 10 megs of it. Add up the caches on a GPU (such as ATi's X800 series Z cache which holds enough data for 1920x1020 resolution) and you'll probably get quite close.

Those are games that run on the XBox1
Sure, at 640x480 <= 30 FPS with seriously reduced effects and IQ.

SM 2.0 parts decrease the quality of numerous effects in Halo; I force my rig into 1.1 mode
I don't think so and not even the developers agree with you there. From Bungie's website:

Pixel shaders 2.0 (DirectX 9.0)
In this code path, you are making absolutely no compromises on the visual quality of the game. You are seeing everything as best as possible, as engineered by our team. All the effects are in their most demanding form (as complex of a calculation as necessary to generate the best visual result possible).

Pixel shaders 1.4 (DirectX 8.0)
When running in PS1.4, you are compromising only a subset of effects. Specifically:
- No bumped mirrored surfaces
- Some video effects are two-pass

Pixel shaders 1.1 (DirectX 8.0)
PS1.1 is probably the most widespread pixel shader version currently. When running in the PS1.1 rendering code path, the visual compromises are (in addition to the PS1.4 compromises):
- No model self-illumination (excluding some specific environmental models)
- No animated lightmaps
- Fog calculations are triangle based, not pixel based
- No specular lights

Anand posted scores @1024x768 of just under 35FPS
I don't see any 640x480 scores in there. PC minimum requirements are 733 MHz + 32 MB GPU, which is usually a target of around 25-30 FPS at 640x480 & lowest detail levels, exactly what the X-Box is.

It makes sense to code them that way when you are looking at a fixed platform.
Perhaps, but it's another stunning example of console development bucking major software engineering practices to produce hideously poor code.

They went with simpler shaders- no surprise it increases performance.
I have seen no evidence that suggests the IQ was degraded in any way.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
All 10 megs of it. Add up the caches on a GPU (such as ATi's X800 series Z cache which holds enough data for 1920x1020 resolution) and you'll probably get quite close.

First off, there is nothing remotely close to 10MBs worth of cache on any of the current GPUs. Next- what on Earth gives you the impression that they would remove the cache from the R500's design? The eDRAM is for tiling out the back buffer data and storing Z data- the main issues with bandwidth(which the X360 won't have).
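
The arithmetic on why 10MB is enough for that job is straightforward (plain math, no inside information):

    // A 1280x720 back buffer with 32-bit color and 32-bit Z fits in the
    // eDRAM outright; larger or multisampled targets get split into tiles.
    #include <cstdio>

    int main() {
        const double pixels = 1280.0 * 720.0;
        const double bytes  = pixels * (4 + 4);   // color + depth per pixel
        std::printf("720p color+Z: %.2f MB of the 10MB eDRAM\n",
                    bytes / (1024.0 * 1024.0));    // ~7.03 MB
    }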

Sure, at 640x480 <= 30 FPS with seriously reduced effects and IQ.

Settings the average new PC sold today still can't handle- but the point was that they are very far removed from what next gen titles are already looking like.

I don't think so and not even the developers agree with you there. From Bungie's website:

Active camo is totally hosed under 2.0. Bungie can say what they want- check it out yourself.

I don't see any 640x480 scores in there. PC minimum requirements are 733 MHz + 32 MB GPU, which is usually a target of around 25-30 FPS at 640x480 & lowest detail levels, exactly what the X-Box is.

Quite clearly you haven't seen the PC running in the lowest quality settings- check it out for yourself(the XBox is far beyond the low quality settings for the PC- actually the only setting on the PC beyond what the XB has is some shader effects).

Perhaps, but it's another stunning example of console development bucking major software engineering practices to produce hideously poor code.

Optimized for the platform- what most devs consider real code. GT4 is designed to stream data off of the optical drive as its RAM can't hold it(with no indication to the end user that it is ever happening)- if the sky started falling and a decent racing game such as that were ported to the PC, would that be another example of poor coding?

I have seen no evidence that suggests the IQ was degraded in any way.

I find that statement highly amusing given your rabid anti-shader-swap stance- no matter if it affects IQ or not- that you held not that long ago. IIRC, which I do, I tended to be the one on the side of 'if it doesn't impact IQ who cares' while you vehemently opposed that line of thought. Stance change in light of how many 'cheats' ATi is pulling off, or due to the context of the discussion?
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Ok ben, I trust you and I won't ask you to reveal your sources. Are you sure those people aren't just Sony/IBM hired henchmen? Well anyways, I still doubt that IBM/Sony's cell would be the processor of the future. Maybe if they came out with a cell that was optimized for PC use instead of gaming use.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Bohemia, the developers who made Operation Flashpoint, are using all their expertise from the Xbox console, upgrading that, and making Allied Assault (I might have got the name wrong)- which is really Op Flash 1.5- for the PC.

The only reason they were making advances on the console at all is because that's what their publisher wanted.

They've even said that it's easier to code for the PC than for the console, because consoles require masses upon masses of optimisation.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Are you sure those people aren't just Sony/IBM hired henchmen?

Some of the people I've talked to I've been talking to since prior to the launch of the PS2, and I can assure you that they have no qualms ripping Sony- badly- if there are issues. They tend to be more forgiving of certain design decisions than I would be(GS operating as a native 16x0 part- stupid design choice IMO) however they have always been rather realistic in their assessment of what the machines are capable of(they just think certain features are a lot less important than others). When they talk about working with Cell there are certainly difficulties in getting the hang of what you should and shouldn't do, but it is nothing like the EE, where there was an enormous amount of pulling hair out trying to figure out what you simply could and could not do.

Well anyways, I still doubt that IBM/Sony's cell would be the processor of the future.

It won't replace x86 by any means, too much legacy and momentum to do anything like that. Not even Intel could manage to replace x86 as of this point- unless some quantum leap in emulation performance shows up. Intel and AMD are both heading in Cell's direction though- check out Intel's roadmap. The Cell design is the direction that the processor industry is headed in, just with x86 main cores.