Just how do ATI / AMD video cards do deinterlacing?

velis

Senior member
Jul 28, 2005
600
14
81
Since I only have an HD 3300 (the integrated 790GX), coupled with a 3 GHz Athlon II X2, I'm naturally having issues with deinterlacing, especially when a full HD stream needs processing.
So I tried setting the output to 50 Hz interlaced so that the card wouldn't have to deinterlace at all. My TV is more than capable of doing that with excellent quality.
However, this yielded no improvement whatsoever: the CCC options didn't change, and I still notice stutter if I choose vector adaptive. This leads me to believe the chip is still deinterlacing even though it's evident that it isn't needed...

So is there any way I can convince my hardware not to deinterlace? Should I just choose weave and trust the HW not to complicate things?
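For reference, my understanding is that weave simply slots the two fields back together with no interpolation at all, which is why it should be nearly free. A rough numpy sketch of the idea (sizes and names are just illustrative, nothing ATI-specific):

Code:
import numpy as np

# Two 1080i fields: the top field holds the even scanlines, the bottom field the odd ones.
top_field = np.zeros((540, 1920), dtype=np.uint8)
bottom_field = np.ones((540, 1920), dtype=np.uint8)

# Weave: interleave the fields back into one full frame -- no interpolation, no motion search.
frame = np.empty((1080, 1920), dtype=np.uint8)
frame[0::2] = top_field
frame[1::2] = bottom_field

# Bob, by contrast, has to build a full frame out of each single field (simple line
# doubling here), and vector adaptive adds motion search on top -- hence the extra GPU load.
bob_frame = np.repeat(top_field, 2, axis=0)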
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Setting your TV to interlaced mode usually isn't enough. The software you're using is probably still calling for deinterlacing, so the image is being deinterlaced and then reinterlaced for transmission. You would have to tell us more about the software you're using, as that's where the answer lies.
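To put it another way, the round trip amounts to roughly this (an illustrative numpy sketch, not actual driver code):

Code:
import numpy as np

# The player asked for deinterlacing, so the driver has already built a full
# progressive frame (vector adaptive being the expensive way to get there).
progressive = np.arange(1080 * 1920, dtype=np.uint32).reshape(1080, 1920)

# Because the display mode is interlaced, the output stage then just pulls
# alternate scanlines back out as two fields for transmission...
top_field, bottom_field = progressive[0::2], progressive[1::2]

# ...so the expensive deinterlacing pass produced data that is immediately
# torn back apart instead of being skipped in the first place.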
 

themisfit610

Golden Member
Apr 16, 2006
1,352
2
81
How do you do playback?

Try using Media Player Classic with ffdshow doing the decoding and the EVR-CP renderer. This shouldn't deinterlace by default...

If you then use CCC to force 1080i, your TV should handle things.
 

velis

Senior member
Jul 28, 2005
600
14
81
I do playback with MediaPortal through EVR, using the MS codec, which uses HW acceleration AFAIK (CPU usage drops vs. ffdshow).
Still, the issues I'm seeing hint that even though the "card" is the one doing all the deinterlacing / reinterlacing, it pays no attention to the interlaced output mode and still does all this unnecessary work.

So the real question is: does anyone know of a 100% sure way to make the ATI driver NOT do deinterlacing even though the content pushed through the card IS interlaced? All it should have to do is match the frames to the actual interlaced output.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
The quick & dirty answer is to use a decoder that isn't DXVA accelerated; I don't use ffdshow, but I'd assume it's suitable for this purpose. When the decoding is done in software it should bypass any hardware deinterlacing.
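As a rough mental model (toy code only; these functions are stand-ins, not real DirectShow/EVR or driver calls):

Code:
# Toy model of the two decode paths -- illustrative only.

def decode_dxva(packet):
    # The decoded picture stays on the GPU as a video surface, so the driver's
    # video processor (and whatever CCC deinterlacing mode is set) gets involved.
    return {"packet": packet, "driver_deinterlaces": True}

def decode_software(packet):
    # e.g. ffdshow: the CPU decodes and the renderer is handed finished frames,
    # keeping the driver's deinterlacer out of the loop.
    return {"packet": packet, "driver_deinterlaces": False}

for decode in (decode_dxva, decode_software):
    print(decode.__name__, "-> driver deinterlacing:", decode(b"field pair")["driver_deinterlaces"])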
 

velis

Senior member
Jul 28, 2005
600
14
81
Well, I forced deinterlacing to weave and set the monitor to 50 Hz interlaced, which should exactly match the source video feeds (live TV).
Set up like that, it looks like no deinterlacing should need to be done at all.

Either the card is marking all the frames as progressive, or... I have no idea what it's doing.
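The numbers should line up exactly, assuming this is a PAL broadcast:

Code:
# Sanity check of the field-for-field match, assuming PAL live TV
# (25 interlaced frames per second, i.e. two fields per frame):
content_fields_per_second = 25 * 2
display_refreshes_per_second = 50          # the 50 Hz interlaced mode
print(content_fields_per_second / display_refreshes_per_second)
# -> 1.0: one field per refresh, so with weave there is nothing temporal left to compute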