
video capture card question

CU

Is 10bit vs 8bit really noticeable? Are there some clips somewhere I can download that were recorded in 8bit and 10bit to compare them? To be fair, they'd need to have the same encoding settings too.
 
Originally posted by: MisterChief
What do you mean by 8bit vs. 10bit?

He means the controller chip on the capture card. For example, my older WinTV Theatre uses a BT878 (8-bit chip). I believe ATI's TV tuners have 9-bit Philips tuners, and I'm not sure, but I think that PVR (hardware MPEG2 encoder onboard) cards all use 9 or 10 bit chips.
There's really no way to tell, because most of the time a newer chip means newer, more efficient components are onboard too, but I'd generally go with a newer card. I believe some card descriptions on Newegg mention whether the chip is 8, 9, or 10 bit.
 
Yeah, 10 bit should be better; I was just wondering if it was noticeable. I really just want to know how much better a PVR 150/250 would look than a cheap BT878 card. I know about hardware vs. software encoding, but I'm just interested in the quality difference.
 
I believe ATI's TV tuners have 9-bit Philips tuners, and I'm not sure, but I think that PVR (hardware MPEG2 encoder onboard) cards all use 9 or 10 bit chips.

It depends on which "decoder" chip the TV tuner card uses, rather than which "tuner" it uses.

*ATI TV cards based on the "Theater 550" chip have 12-bit ADCs (TV Wonder Elite)
*ATI TV cards based on the "Theater 200" chip also have 12-bit ADCs (AIW X800 XT to AIW 9000pro, TV Wonder USB2.0, eHome Wonder)
*ATI TV cards based on the "Rage Theater" chip have 9-bit ADCs (AIW 128 pro 32MB to AIW 8500 128MB)
*ATI TV cards based on Brooktree/Conexant chips have 8-bit ADCs <IIRC> (TV Wonder VE/standard/pro, older AIW cards)

The A/D conversion depends greatly on the signal provided, so all things being equal, the 12-bit analog-to-digital converters will retain more of the original information and therefore produce higher-quality output.

However, the quality of the analog broadcast signal itself, and of the signal that actually reaches the decoder chip, plays just as important a role. Signal processing and filtering quality, and ultimately driver support and software, all factor into the final quality. So while a 12-bit card certainly "could" be better, a good 8-bit card with solid drivers, good software, and decent signal strength can also be very good. Analog video is relatively low resolution at its very best.
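The "more bits retain more information" point can be illustrated with a quick quantization sketch. This is not from the thread; it's an idealized ADC model (uniform quantizer, full-scale sine input) just to show why each extra bit is worth roughly 6 dB of signal-to-noise ratio, and why signal quality ahead of the ADC matters at least as much:

```python
import math

def quantize(x, bits):
    """Quantize a sample in [-1, 1] with an ideal uniform ADC of the given bit depth."""
    levels = 2 ** bits
    step = 2.0 / levels
    q = round((x + 1.0) / step)        # nearest level index
    q = min(max(q, 0), levels - 1)     # clamp to the representable range
    return q * step - 1.0              # map back to [-1, 1)

def snr_db(bits, n=10000):
    """Estimate quantization SNR (dB) for a full-scale sine wave."""
    sig_power = 0.0
    err_power = 0.0
    for i in range(n):
        x = math.sin(2 * math.pi * i / n)
        e = x - quantize(x, bits)
        sig_power += x * x
        err_power += e * e
    return 10 * math.log10(sig_power / err_power)

for bits in (8, 9, 12):
    print(f"{bits:2d}-bit ADC: {2**bits:5d} levels, ~{snr_db(bits):.1f} dB SNR")
```

The estimates track the textbook rule SNR ≈ 6.02·bits + 1.76 dB, so a 12-bit converter has roughly 24 dB more headroom than an 8-bit one. In practice, though, noisy analog broadcast signals rarely come close to even 8-bit SNR, which is why the front-end signal quality and filtering the post describes can matter more than the converter's bit depth.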
 
Originally posted by: rbV5
Depends on what "decoder" chip the TV Tuner card uses, rather than the "Tuner" the card uses.
[...]
Yeah, I noticed I'd confused tuner and decoder; all the ATI cards use Philips tuners, but the decoders differ.

As for telling the difference between a PVR 250 and a BT878-based card: I went from a WinTV Theatre (BT878) to a WinTV PVR250MCE, and the difference is noticeable, though the Theatre was one of the best 878 cards out there.
 
I don't suppose you have any recordings made with both the BT878 and the PVR 250 for me to look at?
 
Best analog TV quality out there is on cards that have not only the Philips 713x series decoder chip but also the Philips "silicon tuner", rather than the old analog-mess-under-a-large-shield arrangement.

LifeView makes those. Available also in disguise from many OEMs like ASUS.
 
I use a Canopus ADVC-100 over FireWire... best quality I've ever seen converting analog to digital. I use WinDV to dump the video to my computer. I've never lost a frame, and the audio is always in sync.
 