Hello all. I'm currently trying to figure out the best way to get "video in" functionality on my computer, and need a bit of help. My current video card is a GeForce3 Ti200, and I was thinking the best way to do this was to buy one of those 8 MB PCI All-in-Wonders for about $30 on eBay and just use that (my monitor has two inputs). The other option is to sell off my GeForce3 and spend $300 on a Radeon 8500DV AIW. I need answers to a few questions to help me decide.
#1: Is there any quality difference in either VGA out or video in? I was convinced that all the AIW cards used the same chip for handling the VIVO stuff, but ati.com has a blurb about the AIW 8500DV's new and improved IVTC algorithm (which matters, since I'm going to be doing some VHS captures with this). Do they or don't they use the same chip? And if there is a quality difference, how noticeable is it?
#2: Is the software the same? I've heard nothing but good things about ATI's TV software, but I want to make sure there isn't some catch with the older AIW cards. The digital vs. analogue cable tuner doesn't matter to me (unless there's a quality difference there too . . .).
Thanks in advance,
-Chu