
Unfortunately most reviewers are too fixated on benchmarking and raw numbers to cover image quality in any great depth. The situation is further complicated by how much depends on your own environment, settings, monitor and setup, as well as it basically being a very subjective area.

But for IMAGE QUALITY there is very little in it between the Rad8500 and GF4TI until 1600x1200x32 at 75Hz+ refresh rates, where the Radeon seems to fare better. Matrox are still the masters here, but their cards are over-priced and underperform in the 3D dept.

For TVOUT the GF4 cards are improved over the GF2/3 cards by shipping with dual RAMDACs as standard, meaning the image can be displayed on TV and monitor simultaneously, each with its own resolution and refresh rate. The way the TVout chips are now implemented also means all software and drivers should correctly detect and use the card's TVout ability, something often lost in GF2/3 days; some people still find they have to use very old drivers from the manufacturer to get TVout working.

However, the TVout chips nVidia use are still 3rd party and the drivers don't tweak the output at all. The major drawbacks of this are that you often get a noticeable black border around the edges of the TV picture (lack of overscan), the image is often off-centre, and the sharpness and vibrance of the TV image are lower than a Radeon's. nVidia could address almost all of this with quite simple driver tweaking, but they don't, so nVidia users who want to enhance TVout are forced to use the 3rd party prog TV-Tool. So Radeons' TVout is a bit clearer, sports centring and overscan adjustment, and is all-round much better than any current GeForce card's.