<< GF2s have poor image quality over all, and all nvidia cards are limited to the 3d quality that the nvidia drivers provide. The MX cards are the worst of them all for IQ. >>
Untrue. All GF1 and GF2 chipsets have identical 3D rendering quality (and identical feature sets too, contrary to what nVidia wants you to believe). 2D image quality, however, varies greatly from one manufacturer to another. Since they are generally assembled from the cheapest available components, MX series cards tend to have worse 2D image quality on average than GTS/Pro/TI cards.
On to the original question: these cards are based on the "real" GeForce2 chip, and differ from each other only in their default core/memory clock speeds:
GeForce2 TI (250MHz core, 400MHz memory)
GeForce2 Pro (200MHz core, 400MHz memory)
GeForce2 GTS (200MHz core, 333MHz memory)
These are based on the value GeForce2 core, which has roughly half the performance of the normal core, assuming no memory bandwidth limitations. MX200-based cards are considerably slower than MX400-based ones, since they have less than half the memory bandwidth at their disposal:
GeForce2 MX400 (200MHz core, 183MHz SDRAM memory on 128bit bus)
GeForce2 MX200 (175MHz core, 166MHz SDRAM memory on 64bit bus)
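To put rough numbers on that bandwidth claim, here's a quick back-of-the-envelope sketch. The clock and bus-width figures come from the lists above; treating the GTS/Pro/TI memory clocks as DDR-effective rates (so bandwidth is just effective clock times bus width) is my assumption:

```python
# Rough peak memory bandwidth for the cards listed above.
# Figures are (effective memory clock in MHz, bus width in bits);
# SDR MX clocks as listed, GTS/Pro/TI assumed DDR-effective.
cards = {
    "GeForce2 TI":    (400, 128),
    "GeForce2 Pro":   (400, 128),
    "GeForce2 GTS":   (333, 128),
    "GeForce2 MX400": (183, 128),
    "GeForce2 MX200": (166, 64),
}

def bandwidth_gbs(clock_mhz, bus_bits):
    # MB/s = clock (MHz) * bus width in bytes; divide by 1000 for GB/s
    return clock_mhz * (bus_bits / 8) / 1000

for name, (clk, bus) in cards.items():
    print(f"{name}: {bandwidth_gbs(clk, bus):.1f} GB/s")
```

This works out to about 2.9 GB/s for the MX400 and about 1.3 GB/s for the MX200, so the MX200 really does sit below half the MX400's bandwidth, which is why the core clock difference alone doesn't explain the performance gap.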