It's more complicated. Above 1600x1200, graphics cards pull some tricks to extend resolution support: "reduced blanking" timings, as well as generally lowered frequencies, let single-link DVI reach further.
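To put rough numbers on that blanking trick, here's a quick back-of-the-envelope sketch in Python. The horizontal/vertical totals are the published VESA CVT and CVT-RB figures for 1920x1200 (taken as given here, not derived from the CVT formulas, and at a nominal 60 Hz), compared against the 165 MHz pixel clock ceiling of a single DVI link:

```python
# Why 1920x1200@60 only fits on single-link DVI once blanking is reduced.
# Timing totals below are the published VESA CVT / CVT-RB figures for this
# mode, used here as illustrative assumptions.

SINGLE_LINK_LIMIT_MHZ = 165.0  # max TMDS pixel clock on one DVI link

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (incl. blanking) * frames/sec."""
    return h_total * v_total * refresh_hz / 1e6

modes = {
    # name: (h_total, v_total, refresh) -- totals include blanking intervals
    "1920x1200@60 CVT (standard blanking)": (2592, 1245, 60),
    "1920x1200@60 CVT-RB (reduced blanking)": (2080, 1235, 60),
}

for name, (ht, vt, hz) in modes.items():
    clk = pixel_clock_mhz(ht, vt, hz)
    verdict = "fits" if clk <= SINGLE_LINK_LIMIT_MHZ else "exceeds"
    print(f"{name}: {clk:.1f} MHz -> {verdict} the 165 MHz limit")
```

With standard blanking the mode needs roughly 194 MHz, well past what one link can carry; trimming the blanking drops it to about 154 MHz, which fits comfortably.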
I've seen stock NVidia cards do 1920x1200 no problem (Apple Cinema Display). ATi cards know those tricks too. NVidia, at least, has added more trickery to the 50-series drivers to properly support even displays with really odd native resolutions - like the smaller Apple display that is natively 1680x1050. Works like a charm.
With the DVI engine integrated into the main chip in both NVidia and ATi solutions, there's little a card maker can screw up in that area. Drivers are where things go wrong, if anywhere.