Before I begin: I'll keep this fairly general, but I certainly invite the discussion to be highly technical (obviously). I'm also sure this has been discussed to some extent before, but I already ran a search and didn't find quite what I was after.
Where do I begin? On paper, and quite obviously, the PC is much more powerful than the console as far as hardware goes.
For example, the Xbox has a modified 733 MHz Pentium III-class CPU (essentially a Celeron/PIII hybrid), a 233 MHz NVIDIA NV2A GPU with four pixel pipelines, and roughly 64 MB of shared RAM. (I know I'm leaving out a lot of other details; again, I'm being general to get the point across.)
vs. the PC (approximate specs of a PC in 2002):
For example, an Athlon XP 1800+, an ATI Radeon 9600 GPU clocked at 325+ MHz, and the standard 256 MB of RAM of its time.
I don't have specific frame-rate comparisons or other benchmarks, but from what I've seen with several games, particularly sports games, the Xbox will more than likely outperform that PC setup.
Now I'll state the obvious: the PC's resources are also being spent on Windows and whatever other applications you have open. The game you play must run through Windows, while the console is solely concentrated on the game and nothing else. Though I'm a noob, this need not be reposted for apparent reasons.
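As a rough (and admittedly crude) way to see the OS-overhead point, here's a minimal Python sketch of my own: it times the same fixed chunk of work over and over, and on a multitasking OS the scheduler can preempt the loop, so the worst iteration typically comes out noticeably slower than the average. The function name and workload size are just my illustration, not anything from a real game engine:

```python
import time

def measure_jitter(iterations=200, work=20000):
    """Run a fixed, deterministic chunk of work repeatedly and record
    how long each iteration takes. On a general-purpose OS, background
    processes and the scheduler can preempt the loop, so some iterations
    take far longer than the mean -- the kind of frame hitch a console's
    dedicated environment largely avoids."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        total = 0
        for i in range(work):  # identical workload every iteration
            total += i * i
        samples.append(time.perf_counter() - start)
    mean = sum(samples) / len(samples)
    worst = max(samples)
    return mean, worst

mean, worst = measure_jitter()
print(f"mean iteration: {mean * 1e3:.3f} ms, worst iteration: {worst * 1e3:.3f} ms")
```

The gap between the mean and the worst case is the jitter the OS introduces; a console with nothing else scheduled can budget against a much tighter worst case.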
I'll assume much the same with the next generation of consoles: a well-programmed game for the PS3 or Xbox 360 will outperform the same game programmed for the PC (an attempt to answer my own question; maybe it is the programming). Even if the PC is an Athlon 64 X2 4800+ with a next-generation G70 (maybe even in SLI) and at least 2 GB of RAM.
Theoretically, it shouldn't be this way. So my real question is: what if you cut the crap and ran a game solely off the raw power of the PC hardware? Not on top of Windows, not alongside Windows, nothing; basically, if you built a game console around that AMD processor with 2 GB of RAM (I needn't mention the GPU, since the consoles deliver similarly for the time being). What would the bottlenecks be, if any? I'd imagine there wouldn't be many.
This of course leads to talk of Bill Gates licensing the Xbox software... but back to the focus: the PC should destroy the console's performance, so why doesn't it? Or, for all I know, I may be wrong; maybe the X2 system I described with a G70 SLI setup would crush an Xbox 360... let the discussion begin.