Since I only have a Radeon HD 3300 (the GPU integrated into the 790GX chipset), paired with a 3 GHz Athlon II X2, I'm naturally having issues with deinterlacing, especially when a full HD stream needs processing.
So I tried to set it to 50Hz interlaced so that the card wouldn't even have to deinterlace. My TV is more than capable of doing it with excellent quality.
However, this yielded no improvement whatsoever. The CCC options didn't change, and I still notice stutter if I choose vector adaptive. This leads me to believe that the chip is still deinterlacing, even though it's more than evident that it isn't needed...
So is there any way I can convince my hardware not to do deinterlacing? Should I just choose weave and trust the hardware not to complicate things?