What I don't understand is why the graphics companies keep busting their asses to come out with expensive-to-produce, super-fast $500 video cards that double in performance every year or so, when very few gamers buy them. Meanwhile, there hasn't been a good midrange card since the 6600GT. I thought that when that card came out, Nvidia finally got it - come out with a $200 card (the average gamer's price range) with half the horsepower of the $400-$500 card, and make it cheap to produce by basically designing the chip the exact same way as the high-end one, but with only half the transistor count (8 pipes vs. 16, a 128-bit bus vs. 256-bit, half the memory, etc.). Everybody is happy: the cards aren't cut down, so you get what you pay for; the cards run nice and cool; and Nvidia doesn't cannibalize its own sales by selling $200 cut-down cards that people softmod into $500 ones.
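Just to put rough numbers on the "half of everything" idea - quick Python sketch, and note the clock speeds are ballpark figures I made up for illustration, not real specs:

# Back-of-the-envelope math for the "half of everything" design.
# Clock speeds are ballpark illustration numbers, not actual specs.

def mem_bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    """Peak memory bandwidth in GB/s (DDR: two transfers per clock)."""
    return bus_width_bits / 8 * mem_clock_mhz * 2 / 1000

def fill_rate_mpix(pipes, core_clock_mhz):
    """Peak pixel fill rate in Mpixels/s (one pixel per pipe per clock)."""
    return pipes * core_clock_mhz

# High-end part: 16 pipes, 256-bit bus
high_bw   = mem_bandwidth_gbs(256, 500)   # ~32 GB/s
high_fill = fill_rate_mpix(16, 400)       # 6400 Mpix/s

# "Honest" midrange: same design, everything halved, same clocks
mid_bw   = mem_bandwidth_gbs(128, 500)    # ~16 GB/s
mid_fill = fill_rate_mpix(8, 400)         # 3200 Mpix/s

print(f"midrange/high-end bandwidth: {mid_bw / high_bw:.0%}")     # 50%
print(f"midrange/high-end fill rate: {mid_fill / high_fill:.0%}") # 50%

Same clocks, half the units, and every headline number lands at exactly 50% of the big chip - which is exactly what a $200-vs-$400 lineup should look like.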
Instead, what we get is essentially minor speed bumps on the midrange cards - a 20% bump with the 7600, another 20% bump with the 8600. Meanwhile, over the same time period, the 8800GTX is probably 3-4 times faster than a 6800 Ultra. So now there are $200 crappy midrange cards that no one wants, and the graphics companies come up with crippled versions of their flagship cards, or keep producing their old high-end cards (remember when ATI kept producing 9800 Pros to compete with the 6600GT?). This is bad for them because a) those cards are expensive to produce and b) people softmod them, spending hundreds less than they otherwise might. If they had, for example, made the 8600 a card with a 256-bit memory bus, half the stream processors, and clock speeds comparable to the 8800's, people would eat it up for $200-250, and Nvidia would have a nice, cheap-to-make, non-blast-furnace of a video card on their hands.
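Two 20% bumps don't compound to much. Quick math (the 3-4x flagship figure is just my rough estimate from above, so take the exact gap with a grain of salt):

# Midrange stagnation in numbers: two 20% generational bumps compound
# to roughly 1.44x, while the flagship line went up an estimated 3-4x
# over the same two generations.

midrange_gain = 1.20 * 1.20   # 6600GT -> 7600 -> 8600
flagship_gain = 3.5           # rough midpoint of the 3-4x estimate

print(f"midrange after two bumps: {midrange_gain:.2f}x")  # 1.44x
print(f"flagship over same span:  {flagship_gain:.1f}x")
print(f"gap that opened up:       {flagship_gain / midrange_gain:.1f}x")  # ~2.4x

So the flagship pulled roughly 2.4x further ahead of the midrange in just two generations, which is the whole problem.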