Lately I've been thinking about building a system for use as a DVR (among other things) in my living room. I've been looking at lower-end video capture cards, reading up on television signaling standards, and coming up with more questions than answers.
First and foremost (as it affects every other question), I live in Madison, WI (USA). The signals broadcast in my area should be NTSC, correct? If I am incorrect, stop reading, because so far I have assumed NTSC encoding. Assuming NTSC is in fact what is used in my area, what resolution is used, and how would I go about determining or verifying this? Does any of this change for cable television services?
For my calculations I have used a resolution of 525x480 (mostly because it was the figure I saw mentioned most frequently, and it is below the 576-line vertical resolution of PAL signals). I assume video capture cards transmit the captured data over the PCI bus as 24-bit, packed RGB pixels. Also, the only figure I found for the NTSC frame rate was 30 frames/second. Using these figures I get:
(525 * 480) * 24 * 30 = 181,440,000 bits/sec, or about 173 Mbit/sec
Obviously we can't push 173 Mbit/sec over your average 33 MHz, 32-bit PCI bus, so the 5:1 compression ratio I see mentioned in the product summaries for most capture cards must take place on the card, before the data is transferred over the PCI bus. 173 Mbit/sec / 5 = 34.6 Mbit/sec, so now we have overcome the PCI bus.
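For reference, here is the same arithmetic spelled out as a small Python sketch; the resolution, bit depth, frame rate, and compression ratio are all my own assumptions from above, not verified specs:

    # Bandwidth arithmetic from above; all input figures are my assumptions.
    width, height = 525, 480        # assumed active resolution
    bits_per_pixel = 24             # packed RGB
    frames_per_sec = 30             # nominal; actual NTSC is 29.97 frames/sec
    compression_ratio = 5           # 5:1, as quoted in the card summaries

    raw_bps = width * height * bits_per_pixel * frames_per_sec
    raw_mbit = raw_bps / 2**20
    compressed_mbit = raw_mbit / compression_ratio

    print(f"raw: {raw_bps} bits/sec = {raw_mbit:.1f} Mbit/sec")
    print(f"after 5:1 compression: {compressed_mbit:.1f} Mbit/sec")
    # -> raw: 181440000 bits/sec = 173.0 Mbit/sec
    # -> after 5:1 compression: 34.6 Mbit/sec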
Higher-end cards like the Pinnacle Systems DC2000 have a sustained data rate of 50 Mbit/sec. Obviously these can capture all 30 frames/sec of an NTSC signal. What about the lower-end cards, such as the Pyro ProDV, which all seem to have a sustained data rate substantially lower than 34.6 Mbit/sec; how do they account for their handicap? I would hope they don't just drop frames at random. Are my calculations flawed? I can only imagine it getting worse with PAL signals if my calculations are correct.
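To put a number on the frame-dropping worry, here is the same math run backwards for a hypothetical lower-end card; the 25 Mbit/sec sustained rate below is purely an illustrative stand-in, not a quoted spec for any of the cards mentioned:

    # Hypothetical: how many frames/sec fit through a card whose sustained
    # rate is below the 34.6 Mbit/sec computed above, assuming the same
    # 5:1 compression? The 25 Mbit/sec figure is illustrative only.
    bits_per_frame = 525 * 480 * 24 / 5   # compressed frame size, in bits
    sustained_mbit = 25                   # hypothetical lower-end card
    sustained_bps = sustained_mbit * 2**20

    max_fps = sustained_bps / bits_per_frame
    print(f"max sustainable frame rate: {max_fps:.1f} frames/sec")
    # -> about 21.7 frames/sec, i.e. frames would have to be dropped (or
    #    compressed harder) to keep up with the full 30 frames/sec signal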
Are standard desktop AGP video cards, like the ATI Radeon 8500 DV, able to achieve the sustained data rate required for full-quality video capture? If not, which capture cards would you recommend for DV capture from a TV signal?
Regards,
Drew Vogel