GeForces (esp. GF4TI) and the quality of the TVout port.

AnAndAustin

Platinum Member
Apr 15, 2002
:( I was aware that the GeForce3 cards (and GF2s) had issues with the resolutions available over TVout and with picture quality (off-centre picture and black borders). This was very evident on the VO-only cards, which used Chrontel or Conexant chips. The cards featuring VIVO (TV/Video In & Out) used Philips chips and were better, but still far from satisfactory.

VIVO section of the TomsHW roundup of the pre-GF4 nVidia Ti cards

:D I had initially heard that the GeForce4 chips, particularly the GF4TI chips (which interest me the most), were better, but have since heard that the GF4 cards are also plagued by inadequate TVout. I know the GF4s have twin RAMDACs and so overcome the 'one refresh rate serves all' problem.

;) What are your experiences and recommendations regarding TVout quality? I'd heard that some software (TV-Tool?) can help with this, or perhaps new drivers from nVidia. Is this a driver problem or a hardware problem?
 

AnAndAustin

Platinum Member
Apr 15, 2002
:p It's an extra £20 for an AOpen GF4TI4200 VIVO over an Inno3D GF4TI4200 VO, but I'm already past the amount I have available to spend. I won't need the VI option, but I do want decent-quality VO for my large-screen TV.

:Q I'm afraid the Radeon cards, which I know have good VIVO, are ruled out as they are way too pricey here in the UK.
 
Oct 16, 1999
I have a PC with a GF2 w/ the Philips VIVO chip that's now a permanent resident of my entertainment center. The TV out isn't THAT bad, but it is noticeably worse than on my Radeon (which I would use instead, but it has no video in). I was able to get it looking fairly decent by playing with the color adjustments (of the main display; I have yet to use a driver version that lets me adjust these settings and the saturation for the TV out specifically, as I have seen pictured in various places).

My picture was off center at default, but I was able to adjust that out within the drivers. And on my TV the borders are not that wide, maybe only 3/4 of an inch on the top and bottom and 1/4 on the sides, without any picture blooming.

If Nvidia would get on the ball and provide within their drivers the options that TVtool offers (which can help to a great extent), I don't think many people would be complaining about the TV out quality that much. Maybe if we all bitch together now it will happen.

I should also add that although I am outputting 32-bit to my TV, I am getting some color banding that makes it look like it's running in 16-bit. I didn't notice this on the Radeon, so either that really was outputting 32-bit, or it was being dithered down to 16-bit much better than this GeForce is.
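
For what it's worth, that banding is exactly what you see when an 8-bit-per-channel gradient gets truncated to 16-bit color (5/6 bits per channel) without dithering. Here's a minimal sketch in Python, purely illustrative: real cards do this in hardware, and the Bayer matrix below is just the classic ordered-dither textbook example, not what ATI actually uses.

```python
# Illustrative only: why dithering hides 16-bit color banding.
# Real GPU dithering happens in hardware; this just shows the idea.

# 4x4 Bayer matrix, entries 0..15, used as per-pixel thresholds.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def quantize(value_8bit, bits):
    """Truncate an 8-bit channel value to `bits` bits (plain banding)."""
    shift = 8 - bits
    return (value_8bit >> shift) << shift

def quantize_dithered(value_8bit, bits, x, y):
    """Ordered-dither an 8-bit value down to `bits` bits.

    The Bayer threshold adds a position-dependent bias before
    truncation, so neighbouring pixels round in different directions
    and the eye averages them back toward the original shade.
    """
    step = 1 << (8 - bits)                      # size of one quantization step
    threshold = BAYER_4X4[y % 4][x % 4] / 16.0  # 0.0 .. 0.9375
    biased = min(255, value_8bit + int(threshold * step))
    return quantize(biased, bits)

if __name__ == "__main__":
    # A smooth 8-bit horizontal gradient, one row of pixels.
    plain = [quantize(x, 5) for x in range(64)]
    dithered = [quantize_dithered(x, 5, x, 0) for x in range(64)]
    print("plain:   ", plain[:16])     # long runs of identical values = bands
    print("dithered:", dithered[:16])  # flips between adjacent levels
```

Plain truncation gives long runs of identical output values, which your eye sees as bands; the dithered row alternates between adjacent levels, so from viewing distance it averages back toward the smooth gradient. That's presumably the difference I'm seeing between the Radeon's output and this GeForce's.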