First of all, Matrox uses the G400 in its RT2000 video editing system, and the images that come out of it have to be of "broadcast quality", i.e. good enough to go on TV. So they had to make the image quality superior. Most other companies used "shortcuts" to gain more fps, and many of those shortcuts compromise the image quality a bit.
The difference in image quality comes from all over the place! A higher quality RAMDAC that converts the digital image into the analog signal for your monitor, better filtering, higher precision calculations so the colors come out right, etc. The filtering alone can make a big difference. When programmers use DirectX, all they can tell the card is "use this kind of filtering for the image", but in practice the card is free to do whatever it wants with that request. I don't know of any apps on the net that can show the difference between a well filtered 3D scene, a badly filtered one and an unfiltered one, but there's a big difference.
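To give an idea of how little control the programmer actually has, here's a minimal sketch of what "telling the card what filtering to use" looks like. I'm using the later Direct3D 9 interface here rather than the DirectX 6/7 calls of the G400 era (same idea, just an API I can quote reliably), and the `device` and `texture` pointers are hypothetical, assumed to be created elsewhere:

// Minimal sketch, assuming an already-created IDirect3DDevice9* and a loaded
// IDirect3DTexture9* (both hypothetical). The app can only *request* a
// filtering mode; how faithfully the hardware implements it is up to the card.
#include <d3d9.h>

void RequestFiltering(IDirect3DDevice9* device, IDirect3DTexture9* texture)
{
    // Bind the texture to stage 0.
    device->SetTexture(0, texture);

    // Ask for linear (bilinear) filtering on magnification and minification,
    // plus linear blending between mipmap levels (trilinear).
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

    // That's the full extent of the application's control: the driver is free
    // to approximate these filters however it likes (fewer taps, lower
    // precision blending, etc.), which is exactly where the image quality
    // differences between cards come from.
}

So two cards can receive the exact same "linear filtering please" request and produce visibly different results.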
Basically, at any resolution and color depth you'll see a very noticeable difference, even if you don't have a very high quality monitor. I haven't seen the image quality on the "newer" cards like the GeForce2 or the Radeon, but if you compare a TNT2 or a Voodoo2 card with a G400... there's a BIG difference! If you can, try to borrow a G400 for a few days and play a couple of hours a day (or more!) of any 3D games on it. When you switch back to your old card, if it's anything like a TNT2 or a Voodoo2... well, you'll probably go buy a G400! Of course you won't get a lot of FPS with it, but it's still very playable for most games at 800x600 or higher in 32-bit color, and it looks better in 16-bit than a lot of cards do in 32-bit, because of the filtering differences, calculation precision, etc.