Hauppauge WinTV HVR-1800 jaggies/tearing on analog source?

Elixer

Lifer
May 7, 2002
10,371
762
126
A friend (no tech experience at all) got this card to replace his old WinFast TV2000 (which I installed for him), and I offered to help with the setup again.

The main issue is that no matter what recording quality level we use, the analog-sourced recordings come out very poor. WinTV's highest quality setting is pretty crappy compared to the TV2000, which uses software encoding via WinFast PVR2.
There are lots of jaggies/tearing going on, and it really makes the 1800's recordings look like crap vs. the software-encoded recordings. I am thinking the MPEG2 encoder on the 1800 isn't that good.
He is still using CATV, so this is an important aspect.

The card did find clear QAM, and I also tested some ATSC. Although that still had some jaggies/tearing, it was far better than the analog-sourced recordings.

I also tested out GVPVR & Dscaler, and the results pretty much mimic what WinTV does.

Is this the norm for this card?
 

nickbits

Diamond Member
Mar 10, 2008
4,122
1
81
If you are having issues with ATSC then it is your decoder. ATSC capture cards capture the raw bitstream--nothing is encoded. To me it sounds like whatever you are using for MPEG2 decoding isn't deinterlacing properly. Try a different MPEG2 decoder.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
Originally posted by: nickbits
If you are having issues with ATSC then it is your decoder. ATSC capture cards capture the raw bitstream--nothing is encoded. To me it sounds like whatever you are using for MPEG2 decoding isn't deinterlacing properly. Try a different MPEG2 decoder.

I forgot that ATSC capture is raw, and doesn't need encoding. :eek: I'll chalk up the issues from that to the cable company feeds, and their crappy compression.

I also tried several MPEG2 decoders, and it didn't make a difference. :(

At this point in time, I am more worried about the analog output though.



*edit: looks like it is a driver problem of sorts. From another forum:
To confirm this I uninstalled the drivers and installed 3.2.
-Good analog, NO Digital / QAM

Went back to Drivers 4.2D1
QAM / Digital works, Crappy analog.
[snip]
I have one of these and would love just to get decent analog. All the drivers I've tried have really horrible interlacing artifacts over all analog sources.
 

imported_BoilerMaker

Junior Member
Jan 23, 2009
4
0
0
I think I know what your problem is, as I have experienced the same issues with the HVR-1800. You wouldn't happen to be using an LCD monitor, would you? If you view the video on a CRT monitor it will look perfect, but if you try to play it on an LCD it will not deinterlace properly and will basically look terrible regardless of what video card you are using (I have tried both ATI and Nvidia). If you want to view your recorded video on an LCD monitor then the only viable solution I have found is to transcode the output video and use a deinterlace filter. It's a bit of a pain, but you're most likely going to want to edit out commercials from what you've recorded anyway. Once you have transcoded with the deinterlace filter the video will display properly on an LCD monitor.
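If you want to script that transcode step, here is a rough sketch of one way to do it in Python by shelling out to ffmpeg and its yadif deinterlace filter. Treat it as an assumption on my part: the post above doesn't name a specific tool, it needs ffmpeg on your PATH, and the filenames are just placeholders.

# Rough sketch: transcode an interlaced capture to progressive video.
# Assumes ffmpeg is installed and on PATH; filenames are placeholders.
import subprocess

src = "recording.mpg"        # interlaced MPEG2 capture from the tuner card
dst = "recording_deint.mpg"  # deinterlaced output for LCD playback

subprocess.run(
    [
        "ffmpeg",
        "-i", src,
        "-vf", "yadif",        # yadif deinterlace filter
        "-c:v", "mpeg2video",  # re-encode the video as MPEG2
        "-q:v", "4",           # quality level (lower = better)
        "-c:a", "copy",        # pass the audio through untouched
        dst,
    ],
    check=True,
)

Any transcoding app with a deinterlace filter gets you the same result; the point is just that the deinterlacing happens at encode time instead of playback time.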
 

PurdueRy

Lifer
Nov 12, 2004
13,837
4
0
Originally posted by: BoilerMaker
I think I know what your problem is, as I have experienced the same issues with the HVR-1800. You wouldn't happen to be using an LCD monitor, would you? If you view the video on a CRT monitor it will look perfect, but if you try to play it on an LCD it will not deinterlace properly and will basically look terrible regardless of what video card you are using (I have tried both ATI and Nvidia). If you want to view your recorded video on an LCD monitor then the only viable solution I have found is to transcode the output video and use a deinterlace filter. It's a bit of a pain, but you're most likely going to want to edit out commercials from what you've recorded anyway. Once you have transcoded with the deinterlace filter the video will display properly on an LCD monitor.

As much as I respect a fellow boilermaker...the monitor has nothing to do with the deinterlacing process so your explanation really doesn't make much sense.

Oh and welcome to Anandtech :eek:
 

imported_BoilerMaker

Junior Member
Jan 23, 2009
4
0
0
Originally posted by: PurdueRy
As much as I respect a fellow boilermaker...the monitor has nothing to do with the deinterlacing process so your explanation really doesn't make much sense.

Oh and welcome to Anandtech :eek:

I think you may have misinterpreted my post a little bit. You are correct that the monitor plays no role in the de-interlacing process, but that wasn't the reason I asked if the OP was using a CRT. A CRT monitor does not need the source to be deinterlaced, so an interlaced video will look perfectly normal on a CRT monitor. Conversely, an LCD monitor needs a progressive video source, so an interlaced video will have a noticeable "combing" effect when viewed on an LCD if no deinterlacing filter is used. The following link explains this a lot better than I did: What is Deinterlacing? Facts, solutions, examples.

Ideally the decoder should do the deinterlacing for you, but in many cases the default option on decoders is to perform no deinterlacing. Take VLC for example: VLC has deinterlacing disabled by default. The reason for this is that if you try to de-interlace a progressive video source (most videos you are likely to have on your computer will be progressive) it won't play right.

In any case, if you want to have interlaced video display correctly on an LCD monitor you will need to either use a decoder with its deinterlacing option turned on (and remember to turn it off when you want to watch a progressive-format video), or re-encode the interlaced video using a deinterlace filter.
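To make the combing effect concrete, here is a toy Python sketch (numpy only; the moving bar and the field timing are invented for the demo, not taken from anyone's capture) showing why weaving two fields captured at different instants looks jagged on a progressive display, while a simple bob deinterlace does not:

# Toy illustration of interlacing artifacts on a progressive display.
# Assumes numpy; the scene (a moving vertical bar) is made up for the demo.
import numpy as np

H, W = 10, 20

def frame_with_bar(x):
    """Progressive frame containing a white vertical bar at column x."""
    f = np.zeros((H, W), dtype=np.uint8)
    f[:, x] = 255
    return f

# Two fields captured 1/60 s apart: the bar moves between them.
field_t0 = frame_with_bar(5)[0::2]   # even lines, time t0
field_t1 = frame_with_bar(8)[1::2]   # odd lines, time t1

# "Weave": interleave the fields into one frame, which is what a progressive
# display shows when nothing deinterlaces -> the bar splits into a comb.
woven = np.zeros((H, W), dtype=np.uint8)
woven[0::2] = field_t0
woven[1::2] = field_t1
print("Woven frame (combing -- the bar alternates between two columns):")
print((woven > 0).astype(int))

# "Bob": show one field at a time, doubling its lines to full height.
# Half the vertical detail, but no combing.
bobbed = np.repeat(field_t0, 2, axis=0)
print("Bobbed field t0 (no combing):")
print((bobbed > 0).astype(int))

A CRT fed an interlaced signal effectively shows one field at a time anyway, which is why the same recording can look fine there and terrible on an LCD.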


Thanks for the welcome by the way. I'm actually an RPI alum and not a Purdue alum. The screen name comes from the fact that I design utility class (>200MW) boilers for a living.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
Originally posted by: BoilerMaker
Originally posted by: PurdueRy
As much as I respect a fellow boilermaker...the monitor has nothing to do with the deinterlacing process so your explanation really doesn't make much sense.

Oh and welcome to Anandtech :eek:

I think you may have misinterpreted my post a little bit. You are correct that the monitor plays no role in the de-interlacing process, but that wasn't the reason I asked if the OP was using a CRT. A CRT monitor does not need the source to be deinterlaced, so an interlaced video will look perfectly normal on a CRT monitor. Conversely, an LCD monitor needs a progressive video source, so an interlaced video will have a noticeable "combing" effect when viewed on an LCD if no deinterlacing filter is used. The following link explains this a lot better than I did: What is Deinterlacing? Facts, solutions, examples.

Ideally the decoder should do the deinterlacing for you, but in many cases the default option on decoders is to perform no deinterlacing. Take VLC for example: VLC has deinterlacing disabled by default. The reason for this is that if you try to de-interlace a progressive video source (most videos you are likely to have on your computer will be progressive) it won't play right.

In any case, if you want to have interlaced video display correctly on an LCD monitor you will need to either use a decoder with its deinterlacing option turned on (and remember to turn it off when you want to watch a progressive-format video), or re-encode the interlaced video using a deinterlace filter.


Thanks for the welcome by the way. I'm actually an RPI alum and not a Purdue alum. The screen name comes from the fact that I design utility class (>200MW) boilers for a living.


For what it is worth (since the card is going back now), we tried 3 different output devices: 1 desktop LCD, 1 19" CRT, and an LCD TV.
The image issues were apparent on all of them, namely a severe case of jaggies.

When we recorded the same show, it was a night & day difference; the dedicated analog card with software MPEG2 encoding trumped the hardware MPEG2 encoding card.

After doing some more reading, it looks like the Saber DA-1N1-E Combo Analog/Digital PCIe TV Tuner Card would be his best choice, but it doesn't seem to support clear QAM on XP. I am guessing the best thing to do is just get 2 dedicated cards and be happy with that.

http://www.vistaview.tv/content/view/51/116/
 

imported_BoilerMaker

Junior Member
Jan 23, 2009
4
0
0
Originally posted by: Elixer
For what it is worth (since the card is going back now), we tried 3 different output devices: 1 desktop LCD, 1 19" CRT, and an LCD TV.
The image issues were apparent on all of them, namely a severe case of jaggies.

When we recorded the same show, it was a night & day difference; the dedicated analog card with software MPEG2 encoding trumped the hardware MPEG2 encoding card.

After doing some more reading, it looks like the Saber DA-1N1-E Combo Analog/Digital PCIe TV Tuner Card would be his best choice, but it doesn't seem to support clear QAM on XP. I am guessing the best thing to do is just get 2 dedicated cards and be happy with that.

http://www.vistaview.tv/content/view/51/116/

Sounds like you may have had some different issues than I did. If you want to try messing around with the HVR-1800 again you may want to look at the signal strength. I've found that my card seems to be very sensitive to a signal that is either too weak or too strong. I actually had to put a 3dB attenuator on the line feeding the analog side of mine to get the image quality to an acceptable level.

 

Elixer

Lifer
May 7, 2002
10,371
762
126
Originally posted by: BoilerMaker
Sounds like you may have had some different issues than I did. If you want to try messing around with the HVR-1800 again you may want to look at the signal strength. I've found that my card seems to be very sensitive to a signal that is either too weak or too strong. I actually had to put a 3dB attenuator on the line feeding the analog side of mine to get the image quality to an acceptable level.


Now that I didn't think of. I know the WinTV signal meter for the digital side read 40 dB (maxed out, full green bar), but I didn't think to check (or see a utility that reports) what the analog signal strength was.
Hmmm.
 

imported_BoilerMaker

Junior Member
Jan 23, 2009
4
0
0
Originally posted by: Elixer
Now that I didn't think of. I know the WinTV signal meter for the digital side read 40 dB (maxed out, full green bar), but I didn't think to check (or see a utility that reports) what the analog signal strength was.
Hmmm.

To my knowledge there isn't a utility that gives the analog signal strength; you need a rather expensive tester to check it. Thankfully I had written down what the signal strength was on each of the cable jacks in my house when Comcast checked them after I signed up for cable service. I had often read that the "sweet spot" for signal strength was between 0 and -10 dB, and I was running at +5 dB. On a hunch I ran down to the local electronics shop and picked up a few coax attenuators of various sizes (they're only about $2-$3 each) and tried them out. After trying a few combinations it was clear that reducing the signal strength did in fact improve the reception on the TV card. I eventually settled on using a single -3 dB attenuator, as weakening the signal more than this didn't seem to be giving any additional benefit.
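If it helps to see what those attenuator values mean in linear terms, here is a tiny sketch (just the standard dB formula, nothing specific to the HVR-1800; the +5 dB level is the one from the post above):

# Convert attenuation in dB to a linear power ratio.
def db_to_power_ratio(db):
    return 10 ** (db / 10.0)

for db in (-3, -6, -10):
    print(f"{db:+d} dB -> {db_to_power_ratio(db):.2f}x power")
# -3 dB is roughly half power, so a single -3 dB pad takes a +5 dB line
# down to about +2 dB.

So each -3 dB pad roughly halves the power reaching the tuner's analog front end.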