Basically, the video capture features are the same between the cards. The 8500dv uses a tuner on silicon (similar picture quality, but it generates a bit more heat), while the original AIW uses an analog Philips tuner like all the other AIW models. The 8500dv also uses a different cable that carries both the video-in and video-out (VI and VO) on the same connector.
Would a dedicated capture card be that much better? Is it the size of the capture that determines quality, e.g. 640x480 vs 720x576?
AIW cards have pretty good capture quality, definitely comparable to stand-alone cards. Quality is determined by many factors, with signal quality probably being the biggest. 480i is full NTSC resolution (basically 640x480 in PC terms); some lower-quality cards only capture a single field (240 lines, basically 320x240), so stay away from those if you want the best captures. Analog-to-digital conversion is another factor: the 8500dv and original AIW use 9-bit ADCs, the new AIWs use 12-bit, and stand-alone cards are generally 8-bit, while some newer ones use 10-bit ADCs. I'm not sure these differences matter all that much in the end, but potentially, the more bits retained during the analog-to-digital conversion, the better the final output "can" be.
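To put some rough numbers on the field-vs-frame and ADC-bit-depth points, here's a quick Python sketch. The ~6 dB-per-bit SNR figure is the standard ideal-ADC approximation, not something measured on these particular cards:

```python
# Rough numbers behind the capture-quality points above.
# Assumes NTSC: ~480 visible lines, interlaced as two 240-line fields.

def pixels(width, height):
    return width * height

# Full-frame capture vs single-field capture
full_frame = pixels(640, 480)    # both fields captured
single_field = pixels(320, 240)  # one field, half the horizontal detail too
print(f"full frame: {full_frame} px, single field: {single_field} px "
      f"({full_frame // single_field}x the pixel count)")

# Quantization levels and ideal SNR per ADC bit depth
# (ideal-ADC approximation: SNR ~= 6.02 * bits + 1.76 dB)
for bits in (8, 9, 10, 12):
    levels = 2 ** bits
    snr_db = 6.02 * bits + 1.76
    print(f"{bits}-bit ADC: {levels} levels, ~{snr_db:.1f} dB ideal SNR")
```

So a full-frame capture carries 4x the pixels of a single-field one, and each extra ADC bit doubles the quantization levels; whether those extra bits survive the rest of the capture chain is another question.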
All in all, the 8500dv will likely not make much difference over your current AIW as far as quality is concerned. The biggest feature difference IMHO (not counting 3D gaming performance) is that the dual RAMDACs on the 8500dv support "theater mode", so you can watch a video using TV-out without having to "clone" the primary monitor (leaving your desktop free for browsing or whatnot).