- Oct 27, 2004
- 214
- 0
- 0
I would like to know, technically, how and why a console such as the original Xbox (or the PS2) can handle graphically intensive games, compared to a PC with the same or slightly higher specs than the Xbox. I am really curious how the consoles do it, especially the Xbox with just these specs: a modified 733 MHz Pentium III processor and an "XGPU" (I've even read somewhere that it's basically a GeForce 4 variant) with 64 MB of RAM on an NVIDIA chipset.
I know that if you have a PC with the same specs as the Xbox, say a 733 MHz processor and an older GPU with 64 MB of RAM, and you try to play the PC version of a game like Halo or Splinter Cell, there's definitely no chance you can run it, or at least not at a decent frame rate with decent graphics. Even with specs higher than the Xbox's, say a 1.5 GHz processor and a GeForce 3, I think it's still impossible to play the PC version of Halo properly, while on an Xbox or PS2 you can play it at its intended frame rate and at the highest graphical detail. And on the PC, if you want to play games the way a console does (games that have PC counterparts), you need extreme specs: a 2-3+ GHz processor and a GeForce 6600 or 7800 to get the highest detail at a decent frame rate.
Moreover, the Xbox's hardware can still handle new games made for both the Xbox and the PC (2005), while those same new games can no longer be handled by PC hardware that is only a year old. Take Doom 3: if you bought an Xbox three years ago, you will likely still be able to play Doom 3, but if you bought a PC two or three years ago, say an Athlon XP or a Sempron with a GeForce 4 Ti or just a GeForce FX 5200, you have no chance of playing Doom 3 the way it was intended to be played (and worse, you might not be able to play it at all on three-year-old specs).
I was wondering: could there be some arrangement between game developers and hardware manufacturers so that, to play new PC games, you always have to buy new hardware, while on consoles it's a different story? Technically, I have learned that consoles use a different set of processing instructions for their hardware than a PC does, but what about the Xbox? And lastly, correct me if I'm wrong, but I was also thinking that PC hardware from two years ago is much stronger and more powerful than consoles from three years ago (Xbox, PS2), so why can't it still handle newer games the way consoles handle newer games?
Correct me if I'm wrong, guys.