Originally posted by: rbV5
I believe ATI's TV tuners have 9-bit Philips tuners, and I'm not sure, but I think that PVR (hardware MPEG-2 encoder onboard) cards all use 9- or 10-bit chips.
It depends on which "decoder" chip the TV tuner card uses, rather than which "tuner" it uses.
*ATI TV cards based on the "Theater 550" chip have 12-bit ADCs (TV Wonder Elite)
*ATI TV cards based on the "Theater 200" chip also have 12-bit ADCs (AIW X800 XT to AIW 9000 Pro, TV Wonder USB 2.0, eHome Wonder)
*ATI TV cards based on the "Rage Theater" chip have 9-bit ADCs (AIW 128 Pro 32MB to AIW 8500 128MB)
*ATI TV cards based on Brooktree/Conexant chips have 8-bit ADCs, IIRC (TV Wonder VE/standard/Pro, older AIW cards)
The A/D conversion depends greatly on the signal provided, but all things being equal, a 12-bit analog-to-digital converter retains more of the original information and therefore produces higher-quality output.
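To put rough numbers on that, here's a quick Python sketch (my own illustration, nothing to do with ATI's actual hardware) of how a simplified N-bit ADC quantizes the same sample, and how much error each bit depth leaves behind:

def quantize(value, bits):
    # Map a value in [0.0, 1.0) to an N-bit code and back.
    levels = 2 ** bits          # 8-bit: 256, 9-bit: 512, 12-bit: 4096
    code = int(value * levels)  # the integer the ADC outputs (truncating model)
    return code / levels        # reconstructed value the rest of the pipeline sees

sample = 0.62
for bits in (8, 9, 12):
    error = abs(sample - quantize(sample, bits))
    print(f"{bits:2d}-bit: {2 ** bits:4d} levels, error = {error:.6f}")

The 12-bit converter has 16x as many levels as the 9-bit one, and each extra bit roughly halves the worst-case quantization error.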
However, the quality of the analog broadcast itself, and the quality of the signal that actually reaches the decoder chip, play just as important a role. Signal processing and filtering quality, and ultimately driver support and software, all factor into the final result. So while a 12-bit card could certainly be better, a good 8-bit card with solid drivers, good software and decent signal strength can be very good as well. Analog video is relatively low resolution at its very best.
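For a sense of scale, the textbook ceiling for an ideal N-bit ADC is an SNR of about 6.02*N + 1.76 dB (again, a back-of-the-envelope figure, not a measurement of these cards):

for bits in (8, 9, 12):
    # Ideal quantization SNR for a full-scale input, in decibels.
    print(f"{bits:2d}-bit ADC: ideal SNR ceiling ~ {6.02 * bits + 1.76:.1f} dB")

That works out to roughly 50 dB for 8-bit versus 74 dB for 12-bit, but a mediocre antenna or cable feed can easily deliver less SNR than even the 8-bit ceiling, at which point the extra bits are mostly digitizing noise.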