Console's Graphic Power

oconnect

Member
Jun 29, 2004
50
0
0
I was wondering how a console - for example, the Xbox - outputs such great graphics on a fraction of the computing power of modern PCs. The Xbox's hardware consists of a 733 MHz Celeron-based CPU, a GeForce 3 Ti (I think), 64 MB of DDR memory, and a 10 GB hard drive. Compared to my computer, that is a crap box. It makes me wonder why Windows is so poorly configured for gaming. Something is really off balance here.

I've tried configuring my system for optimum gaming performance - shut down services, tweaked drivers - and I still don't get performance comparable to an Xbox. If an Xbox were built from my computer's hardware, it could display 20x to 30x the graphics it currently outputs.

I've heard the Xbox runs a modified version of Windows 2000. If that's true, why doesn't Microsoft release a version of Windows designed purely for gaming performance? I wonder if Microsoft is in bed with all the hardware companies. I wonder if they purposely make gaming performance poor compared with what it could be, so that when the hottest new game comes out we will rush out and spend another G upgrading our computers.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
If everyone in the entire world had exactly:
1 P4-3.2 GHz
1 256 MB Radeon 9800XT
1 GB of RAM

Then you'd see some great graphics (well, greater), because the developers wouldn't have to worry about anything else and could write their games to that exact spec.
 

nowayout99

Senior member
Dec 23, 2001
232
0
76
Televisions are low-resolution and run at no more than 60 frames per second, but they have a much higher brightness level than monitors. It's a matter of what you can do with that low resolution and limited framerate, as opposed to trying to push for more frames and higher resolutions.

Consoles are also "permanent" hardware, in that developers don't have to worry about weaker systems. All Xboxes are the same, so developers try to max out the hardware as best they can.

Part of it is perception, also. What looks good on a TV might look like crap on a monitor. If I stretch a 640x480 image to full screen on my 1280x1024 display, it looks weak and I can see all of the flaws pretty clearly.
 

Mday

Lifer
Oct 14, 1999
18,647
1
81
not to mention that you will not be:
video editing on an xbox
image editing on an xbox
using a word processing program on an xbox
running matlab on an xbox
running any sort of CAD on an xbox
compiling large code on an xbox

how messy would your room be if you did everything in it? and i mean everything.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: nowayout99
Televisions are low-resolution and run at no more than 60 frames per second, but they have a much higher brightness level than monitors. It's a matter of what you can do with that low resolution and limited framerate, as opposed to trying to push for more frames and higher resolutions.

Consoles are also "permanent" hardware, in that developers don't have to worry about weaker systems. All Xboxes are the same, so developers try to max out the hardware as best they can.

Part of it is perception, also. What looks good on a TV might look like crap on a monitor. If I stretch a 640x480 image to full screen on my 1280x1024 display, it looks weak and I can see all of the flaws pretty clearly.

TVs generally refresh at ~40Hz, I believe (although maybe HDTVs are faster). Basically, the Xbox runs everything at 640x480 with no AA and no AF (although the Xbox is capable of putting out 4:3 720p, which would be 960x720), and all the games are tweaked to look good at those settings.
 

FrankSchwab

Senior member
Nov 8, 2002
218
0
0
Originally posted by: Matthias99

TVs generally refresh at ~40Hz, I believe


Bzzt! Wrong answer.

NTSC TVs update at 59.94 Hz - why it's not exactly 60 Hz, I don't know, but let's assume it is.

This overstates the graphics load by a factor of two, though, because each field only updates half of the scan lines in the picture. This means that the graphics processor only needs to generate a 640x240 picture every 1/60 of a second, or a 640x480 picture every 1/30 of a second.

If you live in a part of the world where the PAL standard is used, you've got a similar situation with slightly different numbers: PAL gives you roughly 640x576, so it's higher resolution, but it only paints 50 fields (25 full frames) per second.

Taking the NTSC case, you have (640x480x30) = 9.2 MPixels/second that you're calculating and displaying. Compare that to a 1024x768 computer monitor at an 85 Hz refresh: (1024x768x85) = 67 MPixels/second. You can have a graphics subsystem that's 1/7th the speed and still render the same scene at the same rate.
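If you want to check those numbers yourself, here's the arithmetic as a tiny C program (the resolutions and refresh rates are just the example figures from above, not universal constants):

#include <stdio.h>

/* Back-of-the-envelope pixel throughput: interlaced NTSC vs. a typical
   CRT monitor. The resolutions and refresh rates are the example
   figures from the post above, not universal constants. */
int main(void)
{
    double ntsc    = 640.0 * 480.0 * 30.0;   /* one full frame per 1/30 s */
    double monitor = 1024.0 * 768.0 * 85.0;  /* progressive scan at 85 Hz */

    printf("NTSC:    %4.1f MPixels/s\n", ntsc / 1e6);    /* ~9.2  */
    printf("Monitor: %4.1f MPixels/s\n", monitor / 1e6); /* ~66.8 */
    printf("Ratio:   %4.1fx\n", monitor / ntsc);         /* ~7.3  */
    return 0;
}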

That's why Consoles can create good graphics.

/frank


Link for the dubious
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
The resolutions actually are 720x480 and 720x576, respectively. And you only need to produce 30 or 25 full images per second, since the TV displays in an interlaced fashion. Finally, TV screens have a huge dot pitch, and the standard composite or S-Video signal has such low bandwidth that you don't even need to bother with anti-aliasing - the display technology adds the blurriness at zero expense.

On the programming end, when you've got a hardware platform that's 100 percent defined and constant, you can (a) code much more specifically, and (b) dispose of many of the in-between abstraction layers in the driver and system architecture.

These two facts stirred together, and there's your answer.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Console gaming systems are nowhere near as powerful as PCs. If you output your PC games to a TV at 640x480, they will be as fast, if not faster, than what the Xbox delivers. On an LCD projector, the Xbox's poor resolution is apparent and looks kind of crappy. As an example, we compared Finding Nemo played from a PS2 and from a computer, both going through an LCD projector. The PS2 looked like crap, and the PC delivered a stellar picture at 800x600. My point is, speed is attained through low resolution and, to some degree, dedicated graphics power.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: Peter
The resolutions actually are 720x480 and 720x576, respectively. And you only need to produce 30 or 25 full images per second, since the TV displays in an interlaced fashion. Finally, TV screens have a huge dot pitch, and the standard composite or S-Video signal has such low bandwidth that you don't even need to bother with anti-aliasing - the display technology adds the blurriness at zero expense.

On the programming end, when you've got a hardware platform that's 100 percent defined and constant, you can (a) code much more specifically, and (b) dispose of many of the in-between abstraction layers in the driver and system architecture.

These two facts stirred together, and there's your answer.

NTSC is actually pretty complicated. In short, you're both right. NTSC has more than one resolution format.

The problem with NTSC and its ilk is that its resolution is partially analog. In effect, its horizontal resolution could be effectively infinite if your hardware were good enough. In practice, it's nowhere near infinite. Also, they have a funny way of counting the resolution, because they sometimes count the vertical blank and closed captioning as additional lines, which wouldn't be done under modern conventions.

link

more link
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
True, the horizontal signal is analog, and you can call it any resolution you like. However, the broadcast signal path and all the associated hardware have a bandwidth somewhere in the 6 MHz range, and that allows little more than about 360 properly separated pixels per line. DVD movies are typically rendered at 720, using the inherent blurring of the low-bandwidth signal path as a cheap anti-alias.
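For the skeptical, here's that estimate as a few lines of C. The rule of thumb is that one full cycle of the luma signal can at best encode two adjacent pixels (one light, one dark); the ~3.4 MHz effective luma bandwidth and ~52.7 microsecond active line time below are my approximations, not exact spec values:

#include <stdio.h>

/* Rough NTSC horizontal resolution from signal bandwidth. One cycle of
   the luma signal encodes at most two adjacent pixels, so usable pixels
   per line ~= 2 * bandwidth * active line time. Both figures below are
   approximations, not exact spec values. */
int main(void)
{
    double luma_bw     = 3.4e6;    /* Hz; effective composite luma bandwidth */
    double active_line = 52.7e-6;  /* seconds of visible picture per line    */

    printf("~%.0f distinguishable pixels per line\n",
           2.0 * luma_bw * active_line);   /* ~358 */
    return 0;
}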

I hope there's no debate that NTSC uses 480 visible lines, though ;)
 

PhoenixOrion

Diamond Member
May 4, 2004
4,312
0
0
Low resolution with high frame rates (TV gaming) vs. high resolution with high frame rates (PC gaming)?

People wonder why, on a TV, you really don't need high-powered hardware to smooth out the edges of a tree line or the outline of a mountain range at a 1.2-mile draw distance (which is what AA/AF are for). It's because it's a blur, and that's why it's impossible to make a headshot on an opponent a mile away in an FPS game.

I'll take pc gaming for all the eye candy :eek:
 

MalikChen

Senior member
Jan 5, 2004
236
0
0
Originally posted by: PhoenixOrion
It's because it's a blur, and that's why it's impossible to make a headshot on an opponent a mile away in an FPS game.

And you say this like it's a bad thing?
 

tart666

Golden Member
May 18, 2002
1,289
0
0
the Xbox's bandwidth in 720p mode is comparable to a PC monitor at 1024x768, no? yet the Xbox still outperforms a PC with a 733 Celeron / GF3, no?
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
Dedicated hardware that's been coded for and tweaked over years, vs. revolving graphics standards. I wonder if the Xbox could render Doom 3 at 1280x1024 with 4x AA and 8x anisotropic filtering at 50 to 100 fps. I would wait and compare Doom 3 for the Xbox with Doom 3 on a PC with a new processor and graphics card, both maxed out, and see.
 

tweeve2002

Senior member
Sep 5, 2003
474
0
0
I agree that the resolution of TVs has something to do with it, but the consumer also shares some of the blame for TVs' low resolution. Do you really want to buy a 32" TV with the same resolution as a monitor that size? No - the cost would be too great.

Also, console games are made to take full advantage of the hardware that comes with the system... haven't you noticed that as a console ages, the games start looking better than the first-gen games? That's because the developers have figured out how to squeeze every last drop from the hardware.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
Originally posted by: tart666
the Xbox's bandwidth in 720p mode is comparable to a PC monitor at 1024x768, no? yet the Xbox still outperforms a PC with a 733 Celeron / GF3, no?

I am sure the Xbox simply can't output at 720p without taking a major hit. Something HAS to be taken away to achieve that, and it's most likely a higher frame rate. 720p is exactly 1280x720, since it's widescreen.

That being said, even though TVs are interlaced, the full 640x480 framebuffer is used (mostly), even at 60 fps. The other half of the vertical information is filtered in, similar to the "flicker filter" you have on PC TV-out. That's why it's no sweat for consoles to run in 480p, which is double the resolution of 480i: the information is already there, and all of it gets used. Some early PS2 games are exceptions to this.
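For anyone curious what a "flicker filter" actually does, here's a minimal sketch in C, assuming the common 3-tap vertical blend (weights 1/4, 1/2, 1/4) over a 32-bit framebuffer. Real console hardware does this in the video output stage, not in software:

#include <stdint.h>

/* Minimal 3-tap vertical flicker filter (weights 1/4, 1/2, 1/4).
   Blending each scan line with its neighbors hides the line-to-line
   shimmer a full-height frame would otherwise show on an interlaced
   TV. src and dst are w*h framebuffers of packed 8-bit channels. */
void flicker_filter(uint32_t *dst, const uint32_t *src, int w, int h)
{
    for (int y = 0; y < h; y++) {
        const uint32_t *above = src + (y > 0     ? y - 1 : y) * (long)w;
        const uint32_t *line  = src + (long)y * w;
        const uint32_t *below = src + (y < h - 1 ? y + 1 : y) * (long)w;

        for (int x = 0; x < w; x++) {
            uint32_t out = 0;
            for (int shift = 0; shift < 32; shift += 8) {
                uint32_t a = (above[x] >> shift) & 0xFFu;
                uint32_t c = (line[x]  >> shift) & 0xFFu;
                uint32_t b = (below[x] >> shift) & 0xFFu;
                out |= ((a + 2u * c + b) >> 2) << shift;
            }
            dst[(long)y * w + x] = out;
        }
    }
}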

As far as why it's so much better on a console, I'd say it's half the hardware and half the developers.

The hardware can be so much more powerful, but it's not completely standardized. So lots of contingencies need to be taken into account, and the end product is more like a good compromise that will work for everyone, rather than the best it could possibly be for one architecture.

But the developers know how much excess speed they have, and know that it would be near impossible to get it 100% bug-free and efficient to begin with, so they don't even try as hard. You might need an elegant solution on the consoles, because you have a limited amount to work with, but with PC games it appears a brute-force method is applied, and the user is expected to have better hardware to compensate. Why waste money and time on programming well, when you can just shift that cost onto the people buying the games?

What really shocks me about this generation is that last time around, it didn't take too long for PC hardware and games to look WAY better than consoles. This time around, a lot of console games STILL look better than PC games.
 

warcrow

Lifer
Jan 12, 2004
11,078
11
81
Great thread, guys. I'd just like to add a few things:

- Console developers get what they call "free anti-aliasing" from a TV: because of the low, low resolution of most standard TVs (read: not HDTVs), the set meshes the jaggies together. There's no need to waste computing cycles on what's already there.

- Consoles are *built* from the ground up to do exactly what they do: play games. Your PC was not built from the ground up to play games. Now, the GPU on your X800 helps it along (in a very good way, I might add), but it only helps.

- Exactly like someone already said, a console developer knows EXACTLY what their hardware target is, and how to eke out every last clock cycle. The PC is much more complicated. Take Half-Life 2, for example. Valve has made it so highly scalable that it will run on anything from a TNT2 ----> X800, so that there's a bigger audience capable of playing their game.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
Build a 700 MHz P3 with a GeForce 4 and 64 MB of video RAM, try running Quake III at 512x448 interlaced with maximum settings, and set com_maxfps to 30 or 60. It will run quite smoothly, as will just about any PC equivalent of a game you'll find on the Xbox. If it looks sucky, just try a CTF or custom map where there is some color, and it will look as good as any console game. What about Halo? I'm convinced they made Halo suck on purpose, probably a condition imposed by Microsoft for a PC release... if Carmack had worked on the PC port of Halo, it would have been fine.

TV is low resolution. You only need to render at 640x480 @ 30 fps, or 640x240 @ 60 fps. In reality it's even smaller than that, because you have to account for NTSC overscan, so your framebuffer area ends up being 512x224 or 512x448. This also happens to fall on an even page boundary for systems like the PS2, where video memory is allocated in pages, so video memory is used efficiently without a single byte wasted.
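If you want to check that page claim, here's the arithmetic, assuming the 8 KB page size I believe the PS2's GS uses and a 32-bit framebuffer (both assumptions, not gospel):

#include <stdio.h>

/* Framebuffer sizes vs. an assumed 8 KB video memory page at 32 bpp. */
int main(void)
{
    const int page = 8 * 1024;
    const int sizes[][2] = { { 512, 224 }, { 512, 448 } };

    for (int i = 0; i < 2; i++) {
        int bytes = sizes[i][0] * sizes[i][1] * 4;
        printf("%dx%d = %7d bytes = %3d pages, %d bytes left over\n",
               sizes[i][0], sizes[i][1], bytes, bytes / page, bytes % page);
    }
    return 0;
}

Both come out to a whole number of pages (56 and 112), with nothing left over.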

Certain effects are also cheaper on TV. Metroid Prime, as awesome as it looks, uses nothing more than static lightmaps and dynamic vertex lighting. That's right, vertex lighting. But it looks great on a TV, even with poly counts slightly higher than, say, Q3A's. Another example is greatly reducing the backdrop texture resolution and poly count during cinematics and focusing ALL graphics power on the characters. This is pretty much how games like Xenosaga and DOA get away with such detailed character models.

This also means you can get away with lower-resolution textures. The majority of textures used on TV screens are 32x32 or 64x64, and usually 8-bit. Low-res textures are evident even in Halo. A PC game, meanwhile, has to work on every system, even ones without 64 MB of VRAM, so main RAM is used as a staging area for textures (driving the system RAM requirement up) and textures are shipped to the graphics card via AGP. On consoles, memory is usually unified and there isn't a bus bottleneck (which also allows vertex data to be used by either the GPU or the CPU with no penalties or shuffling between memory types). Even if all textures stayed resident in VRAM, the game engine still has to be prepared for a task switch on a PC, where the GDI can come in and clobber your video memory. Backups are maintained in system RAM in order to replace the textures immediately when they are needed. On a console, none of this is a problem, since you have a fixed memory layout, free from competition from other apps, from the time a level is loaded to the time it is unloaded.

64 MB is a lot of memory when you don't have an OS, a shell, and a ton of device drivers loaded while running a game. The microkernels that consoles run are very small and generally built into the BIOS ROM, so they aren't even factored into RAM consumption! Also, console programmers are historically used to working with little memory and go through great efforts to minimize memory consumption, up to and including compressing all the text in a simple game cartridge! PC programmers are used to having relatively infinite RAM and HD space. This is why you don't see hundreds of uncompressed .tga files on a console CD-ROM like you do in your 3 GB install dir on a PC. This is especially true for load times, where consoles are limited to streaming compressed data off a slow optical drive.
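As a taste of those cartridge-era tricks, here's a sketch of one classic text compression scheme (often called DTE, or byte-pair encoding): byte values above 0x7F expand to common letter pairs from a table. The pair table and message here are made up for illustration:

#include <stdio.h>

/* Byte values >= 0x80 index a table of common letter pairs, roughly
   halving the storage needed for typical English text. */
static const char *pairs[] = { "th", "he", "in", "er", "an" };

static void print_compressed(const unsigned char *s)
{
    for (; *s; s++) {
        if (*s >= 0x80)
            printf("%s", pairs[*s - 0x80]);
        else
            putchar(*s);
    }
}

int main(void)
{
    /* 11 bytes of storage decode to the 15-character
       message "the hero ran in". */
    const unsigned char msg[] = { 0x80, 'e', ' ', 0x81, 'r', 'o', ' ',
                                  'r', 0x84, ' ', 0x82, 0 };
    print_compressed(msg);
    putchar('\n');
    return 0;
}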

More efficient code? The general case is the slowest case on the PC. Even if you bloat your code with 20 different rendering paths, one for each version of vertex and pixel shader, the added overhead of converting generalized data to a particular texture format, or of deciding at runtime how to achieve something based on hardware capabilities, starts to add up. Programmers get frustrated supporting 30 extensions on each vendor's card and just don't support certain features at all. Even Carmack says Doom 3 is the last time he is catering to multiple vendors with specialized rendering paths; the next engine will use all stock GL, and vendors had better get their act together.
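To make the multiple-paths point concrete, here's roughly what that runtime decision looks like in C with OpenGL. The extension names are real ones from that era; the three-way split itself is just an illustration, not any particular engine's code:

#include <string.h>
#include <GL/gl.h>

/* Pick a rendering path by probing the driver's extension string.
   Requires a current GL context. */
typedef enum {
    PATH_FIXED_FUNCTION,          /* lowest common denominator       */
    PATH_NV_REGISTER_COMBINERS,   /* GeForce-specific fragment path  */
    PATH_ARB_FRAGMENT_PROGRAM     /* DX9-class programmable hardware */
} render_path;

render_path choose_render_path(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);

    if (ext && strstr(ext, "GL_ARB_fragment_program"))
        return PATH_ARB_FRAGMENT_PROGRAM;
    if (ext && strstr(ext, "GL_NV_register_combiners"))
        return PATH_NV_REGISTER_COMBINERS;
    return PATH_FIXED_FUNCTION;
}

And every one of those paths then has to be written, tested, and maintained separately - which is exactly the cost console developers never pay.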

On a PC, a call to draw something goes something like this:

1) Draw
2) kernel32.dll (or the GDI display driver dll like nv4_disp.dll?)
3) ntdll.dll
4) Interrupt 02Eh kernel mode switch
5) current thread yields remaining CPU time and OS has the ball
6) validate user mode parameters and copy to kernel mode memory
7) call display driver
8) call HAL
9) modify hardware
10) find a different thread to run (system calls yield the remaining CPU time to another thread)

Note that on a PC, with a secure operating environment where no user process can crash the system, this level of abstraction is desired!

On a console it's more like this (I've written a small OpenGL lib for my PS2 on bare metal):

1) glDrawArrays
2) build command packet and start DMA
3) return to calling thread

None of that validation and OS abstraction is needed. Because the programmers know Kazaa and BonziBuddy and explorer.exe won't be running at the same time, there is no need for such a classical, rigid OS paradigm. If the program doesn't crash running by itself in the developer's environment, it won't crash on the user's, because it's impossible for the user to install other programs, change settings, etc. In fact, an OS on a console pretty much serves the purpose of convenience, providing some standard services, and it can sometimes be bypassed altogether!
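Here's that three-step path fleshed out a little in C. The packet format is invented for illustration - a real PS2 GIF packet looks different - but the shape of the code is the point: no kernel transition, just append to a buffer and go:

#include <stdint.h>
#include <string.h>

/* Invented command packet; real console command formats differ. */
typedef struct {
    uint32_t opcode;       /* e.g. CMD_DRAW_ARRAYS             */
    uint32_t vertex_count;
    uint64_t vertex_addr;  /* address of vertex data in memory */
} draw_cmd;

enum { CMD_DRAW_ARRAYS = 1 };

static uint8_t dma_buf[64 * 1024];  /* one frame's worth of commands */
static unsigned dma_used;

void gl_draw_arrays_console(const void *verts, uint32_t count)
{
    draw_cmd cmd = { CMD_DRAW_ARRAYS, count, (uint64_t)(uintptr_t)verts };

    memcpy(dma_buf + dma_used, &cmd, sizeof cmd);
    dma_used += sizeof cmd;

    /* On real hardware: write dma_buf's address to the DMA channel's
       address register and set the start bit. The GPU consumes the
       packet while the CPU returns straight to game code. */
}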

And even with all that, a fairly optimized 700-800 MHz PC with a GeForce 4 can run some pretty impressive graphics and push at LEAST 30 fps, either matching or beating all but the most painstakingly hand-optimized console titles! Remember to set the resolution to 640x480 and the texture detail to medium or low to be on equal terms with a console. To ease the memory requirements that a PC has over a console, you can kill everything but the bare minimum - including even the most 'vital' services that allow the PC to be a PC instead of a game console, and the 30 or so megs the shell (explorer.exe) takes up by itself; just alt-tab back from taskmgr. If you don't think this is true, make sure you run the PC on TV-out instead of a VGA monitor. You'll be surprised.
 

iwantanewcomputer

Diamond Member
Apr 4, 2004
5,045
0
0
may be a stupid question, but
would it be possible to just dump the entire operating system to memory or hard drive and boot only a game, with everything it needs that the OS would normally supply? it seems like game programmers could make a game that saves everything in the OS, stops processing everything but the game and needed drivers, etc., and then when the game quits, has a thread that tells the computer to reload the OS and other programs.
another similar idea:
remember those old Apple IIs that you could just boot with a big floppy to run only the game? why not make games exactly like they do for the xbox, so they don't use any external programs? you could set your BIOS to boot from the CD drive, pop the game in, and play it without windows and everything else. then you'd have doom III for xbox played on an "xbox" with its processor upgraded to an amd 64, its vidcard upgraded to a 9800, and a gig of ram... about enough of a hardware upgrade to make up for the performance lost by going to whatever refresh rate and 1024x768, or even 16x12

it makes sense to me anyway
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
You could probably do that with DOS and Win9x...

But look how great an environment those OSes made (lockups, reboots, etc)

There is no hardware-level I/O standard for anything beyond standard VGA. It would be impossible to boot the computer with a game CD and expect anything beyond VGA support. That is one purpose of having an OS, along with DirectX, OpenGL, driver subsystems, etc. Remember the old DOS setup programs that came with games and supported only a few vendors' cards?

If you did that, you would have to write code for the 80386 and for standard VGA only, to make sure it would run on any system out there that didn't support special features. Or write an OS with a driver API and a bunch of drivers... oh wait, it's already been done...

Besides, when I write a game, I don't want to spend weeks writing keyboard interrupt handlers and file system layouts. I just want to get some pretty graphics on the screen and let the OS do what it's there for.

Those Apple II games will only run on an Apple II btw :)