

Originally posted by: Clauzii
7 Cell SPEs each running at 3.2GHz with 256KB of local memory will do a LOT of physics calculations - that is where IBM, SONY and Toshiba hit it right, I think, since future games will rely HEAVILY on realtime physics.

Too bad you're misled, and also too bad that it has nothing to do with PCs or the Xbox 360.
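For what it's worth, the quoted SPE spec is easy to turn into a peak throughput number. This is my own back-of-the-envelope sketch, not from the thread; it assumes each SPE retires one 4-wide single-precision fused multiply-add per cycle (8 FLOPs/cycle), which is the commonly cited figure. Real physics workloads see far less than peak, which is really what the argument here is about.

```python
def spe_peak_gflops(spes=7, ghz=3.2, flops_per_cycle=8):
    # flops_per_cycle = 4-wide SIMD * 2 (multiply + add) per SPE per cycle
    return round(spes * ghz * flops_per_cycle, 1)

print(spe_peak_gflops())  # → 179.2 GFLOPS peak across 7 SPEs
```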
 
I MUST have been in a deep coma not to notice the date of the last post... VERY old....

But the information I stated was actually in line with the earlier discussion...

Whatever... 😉
 
Right now, from what we've heard, the real-world performance of the Xenon CPU is about twice that of the 733MHz processor in the first Xbox. Considering that this CPU is supposed to power the Xbox 360 for the next 4 - 5 years, it's nothing short of disappointing. To put it in perspective, floating point multiplies are apparently 1/3 as fast on Xenon as on a Pentium 4.

xbox360 been ownt
 
Or not. Games like Battlefield 2 hardly rely on the CPU at all; it's all the GPU. We may just see more titles using this form of architecture.
 
Didn't you guys read Anandtech's article about the Cell processor and how incapable it is? It was compared to a PIII at 1.4 GHz....

 
Originally posted by: bobsmith1492
Didn't you guys read Anandtech's article about the Cell processor and how incapable it is? It was compared to a PIII at 1.4 GHz....
Yeah, because Anandtech so like totally designed the Cell and know exactly how it will run (i.e. like crap).

Dude, do you yourself even know anything about processors? And haven't you seen any of the demos?

Originally posted by: compgeek89

Right now, from what we've heard, ...

xbox360 been ownt
Yeah, owned, by hearsay. 😕

... the real-world performance of the Xenon CPU is about twice that of the 733MHz processor in the first Xbox. Considering that this CPU is supposed to power the Xbox 360 for the next 4 - 5 years, it's nothing short of disappointing. To put it in perspective, floating point multiplies are apparently 1/3 as fast on Xenon as on a Pentium 4.

And if you hadn't noticed, the first Xbox had an Intel CPU, one that had already been out. This time around it's a COMPLETELY different case.
 
"to end to all computers?"

Even if I overlook the serious grammatical errors in that statement, it is still laughable. Do you seriously believe a console would bring all computers to the scrapheap?
 
Originally posted by: SumYungGai
Originally posted by: bobsmith1492
Didn't you guys read Anandtech's article about the Cell processor and how incapable it is? It was compared to a PIII at 1.4 GHz....
Yeah, because Anandtech so like totally designed the Cell and know exactly how it will run (i.e. like crap).

Dude, do you yourself even know anything about processors? And haven't you seen any of the demos?

Originally posted by: compgeek89

Right now, from what we've heard, ...

xbox360 been ownt
Yeah, owned, by hearsay. 😕

... the real-world performance of the Xenon CPU is about twice that of the 733MHz processor in the first Xbox. Considering that this CPU is supposed to power the Xbox 360 for the next 4 - 5 years, it's nothing short of disappointing. To put it in perspective, floating point multiplies are apparently 1/3 as fast on Xenon as on a Pentium 4.

And if you hadn't noticed, the first Xbox had an Intel CPU, one that had already been out. This time around it's a COMPLETELY different case.

OK then

The most ironic bit of it all is that according to developers, if either manufacturer had decided to use an Athlon 64 or a Pentium D in their next-gen console, they would be significantly ahead of the competition in terms of CPU performance.


That's developers talking right there, and PC CPUs are so far ahead of the "incredible CPUs" in the next-gen consoles that PC hardware comparable to them wouldn't even sell.
 
The most ironic bit of it all is that according to developers, if either manufacturer had decided to use an Athlon 64 or a Pentium D in their next-gen console, they would be significantly ahead of the competition in terms of CPU performance.

That's PC developers trying to take x86 code and compile it for the consoles. It is shocking the games run as fast as they do under those circumstances. Console-native devs seem to have quite a different perspective.
 
Dude, the CPUs in the consoles are weak. Why do you think they can make them so cheaply?

You'd just better hope those video cards are what matters. But we know nVidia already has a unified-architecture card waiting for release in spring-summer '06, so your oh-so-great Xenos advantage is almost gone too (if it ever existed).
 
Dude, the CPUs in the consoles are weak. Why do you think they can make them so cheaply?

Simple- they can't. Go look around for what Ageia has to say about the console CPUs and PC processors right now. Please.

You'd just better hope those video cards are what matters. But we know nVidia already has a unified-architecture card waiting for release in spring-summer '06, so your oh-so-great Xenos advantage is almost gone too (if it ever existed).

Xenos's advantage is that it has eDRAM; unified shaders mean nothing in terms of performance (they are simply a different approach). I am quite certain that the RSX will be considerably faster overall than Xenos even though the RSX has dedicated (non-unified) shader hardware. I am also not expecting nV to have a unified-shader part ready in the spring despite current rumors, although I don't see that as meaning anything other than that they don't see a need for it yet (which, from what I have seen, I would tend to agree with them on). Not saying it won't happen, but it isn't going to be close to the major event some people are thinking it will be.
 
The eDRAM is very little advantage, if any. "Whoopee, 2xAA almost free!!"

4xAA is nigh free too; it's supposed to be less than a 5% performance hit. That is where its performance advantage lies, however, not in its unified-shader approach. As a side note, nothing is stopping any company from utilizing eDRAM on the PC as a performance booster. If you are interested in the technology instead of being a platform bigot, you may want to think about things like that.
 
Originally posted by: BenSkywalker
The eDRAM is very little advantage, if any. "Whoopee, 2xAA almost free!!"

4xAA is nigh free too; it's supposed to be less than a 5% performance hit. That is where its performance advantage lies, however, not in its unified-shader approach. As a side note, nothing is stopping any company from utilizing eDRAM on the PC as a performance booster. If you are interested in the technology instead of being a platform bigot, you may want to think about things like that.
eDRAM for the PC is a bad idea; I think Anand already covered that topic in his article. The reason is that PCs have to target too many resolutions.

16MB of eDRAM is enough for 4xAA at 720p (1280 X 720).
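That figure is easy to sanity-check. The sketch below is my own back-of-the-envelope math, not from the thread; it assumes an uncompressed 4-byte color value plus a 4-byte depth/stencil value stored per AA sample, with no tiling or compression.

```python
def framebuffer_mib(width, height, aa_samples, bytes_per_sample=8):
    # 4 bytes RGBA8 color + 4 bytes depth/stencil for every AA sample
    return width * height * aa_samples * bytes_per_sample / 2**20

for aa in (1, 2, 4):
    print(f"1280x720 @ {aa}xAA: {framebuffer_mib(1280, 720, aa):.1f} MiB")
# → 7.0 MiB, 14.1 MiB, 28.1 MiB
```

Under these assumptions, 4xAA at 720p needs about 28 MiB, so 16MB only covers it with tiling or compression; the same math puts 1600x1200 at 4xAA around 59 MiB.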
 
Originally posted by: compgeek89
Isn't 720p 1024x768?.... yes

Neither, actually... It is 720 lines of vertical resolution. It is very close to 768, but far off from the 960 that Crazy quoted.

In addition to that, the horizontal resolution varies based on pixel type/shape. Some 16:9 screens use rectangular pixels at 1024 x 768, or even 1024 x 1024, and rely on a pixel processor to correctly scale the image. Most LCD TVs will run 1366 x 768 or 1280 x 768. It all depends...
 
Either way, that's very low res; we'd need about 32-64MB of that for 1600x1200. And it ain't cheap; more efficient to add a pipeline, I think.
 