I've thought about this... nVidia gives card vendors a reference design, and the card vendors probably *never* actually test the bandwidth of the RF filters on the RGB lines; they just put in whatever parts nVidia spec'd in the reference design.
Component "cheapness" doesn't really work. It's a filter, and even with cheap, low-Q parts you can design a filter that will be fine.
Board layout might be a factor, but even at 350MHz it's not going to make that much of a difference (unless they do something dumb like mess up the PCB stackup or confuse a 6-layer stackup with a 4-layer one).
My best guess (and the same conclusion everyone else has reached) is that nVidia picked some fairly expensive parts for their reference design and the board makers picked the cheapest... and in this case the difference in parasitics and Q between those parts is big enough to knock the filter way off.
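To put some rough numbers on what "knocked off" might look like, here's a quick Python sketch of the usual 3rd-order pi low-pass you find on VGA RGB lines, driven and terminated in 75 ohms. Every component value in it is my own guess for illustration, not anything out of nVidia's reference design; the point is just how much the loss at the pixel clock moves when the effective shunt capacitance creeps up from parasitics and sloppier parts.

# Rough sketch, not a real board: pi low-pass (shunt C, series L, shunt C)
# between 75 ohm terminations, like a VGA RGB output filter.
# All values are illustrative guesses, not nVidia's reference design.
import math

RS = 75.0   # DAC-side termination (ohms)
RL = 75.0   # monitor-side termination (ohms)

def pi_gain(f, c, l):
    """|Vout/Vsrc| for RS -> shunt C -> series L -> shunt C -> RL."""
    w = 2 * math.pi * f
    zc = 1 / (1j * w * c)
    zl = 1j * w * l
    z_out = (zc * RL) / (zc + RL)           # output cap in parallel with load
    z_right = zl + z_out                    # plus the series inductor
    z_in = (zc * z_right) / (zc + z_right)  # input cap in parallel with all that
    v_node = z_in / (RS + z_in)             # divider against the source R
    return abs(v_node * z_out / z_right)    # then the L / (C || RL) divider

def loss_db(f, c, l):
    """Attenuation at f relative to the filter's own low-frequency response."""
    return 20 * math.log10(pi_gain(f, c, l) / pi_gain(1.0, c, l))

f_pix = 230e6   # ballpark pixel clock for 1600x1200 @ 85Hz

print("tight parts  :", round(loss_db(f_pix, 6.8e-12, 68e-9), 2), "dB")
print("extra shunt C:", round(loss_db(f_pix, 15e-12, 68e-9), 2), "dB")

With those made-up values the first case is only a fraction of a dB down at ~230MHz, while the second is around 2dB down at the pixel clock, which is plenty to visibly soften text at 1600x1200. That's the kind of shift I mean when I say the parts choice can knock the filter off.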
I have a background in RF, and I've messed with the filters on a TnT and my Geforce2 GTS. I have noticed slight improvements at the highest resolutions by optimizing the filters, but I don't think my cards were really *bad* to begin with.
I wish I were still at my old job; I'd like to buy a card with *bad* 2D and analyze it. But I still know some people there; perhaps I could have them take a look...
If anyone has a cheap Geforce2 (MX, MX400, MX200, whatever) with pisspoor 2D (I want to see BAD 2D) and would be interested in parting with it for a modest price (or trading cards, I have a few lying about) so I could try to figure out the difference between its "bad" filters and the "good" filters on my Geforce2 or my Geforce3, please let me know.