PCI TV or video-in cards, just like the video-in functions built into some graphics cards, feed the uncompressed raw video stream directly into the graphics card's so-called overlay buffer. The graphics card in turn runs it through its color space conversion, scaling and deinterlacing filters right onto the screen.
Sure, the TV/video grabbers can only scale down, not up, so all they deliver is the original 640x480 or 768x576 of the NTSC or PAL signal. But the graphics card's rendering stage DOES scale it up to fullscreen at any resolution.
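If you want to poke at this yourself on Linux, here's roughly what driving that overlay path looks like through the Video4Linux2 API. This is just a bare sketch under a few assumptions: the grabber sits at /dev/video0, its driver advertises V4L2_CAP_VIDEO_OVERLAY, and the framebuffer target has already been negotiated (a real setup usually also involves VIDIOC_S_FBUF, or an Xv-aware app handling that for you). The point is that the destination window size is what asks the graphics card to do the upscaling:

```c
/* Sketch: start a V4L2 hardware overlay with a fullscreen destination.
 * Assumes /dev/video0 is a grabber whose driver supports video overlay. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open /dev/video0"); return 1; }

    /* Check that the driver can do direct-to-display overlay at all. */
    struct v4l2_capability cap;
    if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0 ||
        !(cap.capabilities & V4L2_CAP_VIDEO_OVERLAY)) {
        fprintf(stderr, "no overlay support\n");
        return 1;
    }

    /* Request a fullscreen destination window; the card's overlay engine
     * does the color space conversion and the upscaling from the PAL/NTSC
     * source resolution on the way to the screen. */
    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof fmt);
    fmt.type = V4L2_BUF_TYPE_VIDEO_OVERLAY;
    fmt.fmt.win.w.left   = 0;
    fmt.fmt.win.w.top    = 0;
    fmt.fmt.win.w.width  = 1400;   /* e.g. a 1400x1050 panel */
    fmt.fmt.win.w.height = 1050;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

    int on = 1;   /* 1 starts the overlay, 0 stops it again */
    if (ioctl(fd, VIDIOC_OVERLAY, &on) < 0) { perror("VIDIOC_OVERLAY"); return 1; }

    pause();      /* video now flows card-to-card without touching the CPU */
    return 0;
}
```

Note how the program never reads a single frame itself; once the overlay is switched on, the whole stream bypasses the CPU, which is exactly why this works fine even on low-end cards.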
I regularly do exactly that, with games consoles of all generations, onto a 1400x1050 screen, with a measly Radeon 9200SE. No lag, no complaints ... other than that you get to see how low a resolution these games consoles actually run at, something you normally don't notice on the much blurrier tubes of TV sets. Until recently I had just a 7500LE; I swapped it for the 9200SE for the sole reason that the latter does hardware deinterlacing. Speed-wise, the 7500LE was more than fast enough for this.