I'm not talking about product refreshes, which usually bring no real benefit; I'm talking about gaps between generations of video cards. The 8800 was certainly a strong part, maybe one of the strongest, but I wouldn't call it an exception. In general, the gaps between generations have become slimmer.
The reference I always tell people to keep in mind is that as graphics technology improves, it becomes harder to improve further. Big tweaks in efficiency, e.g. unified shaders, which dramatically increased the efficiency of GPUs, are one-time deals; once they've been implemented, that's one less thing to improve upon. GPU manufacturers also have to deal with power constraints, as they've expanded up to the ~300W PCIe wall. On top of that, new node technology is harder to work with, as TSMC is under pressure on schedule, manufacturing, and supply. All of these factors and more add up.
I've bought a flagship card from every single generation, from before the GeForce 4 range all the way through to the GTX 580 I have now. That's a good £400-550 every 18 months for maybe 15 years now.
The 680 is the first time I've looked at the market and said, without a doubt, that it's simply not worth the upgrade, and funnily enough this is the time in my life when I have the most disposable income to throw at hardware upgrades. Sure, prices of some of the parts are cheaper, but the performance increase is just... severely underwhelming.
Going from a GTX 580, I can perfectly understand this. The GTX 680 is only ~30-40% faster. However, you have to realize that Nvidia shifted gears with Kepler and focused on performance per watt, which in turn reduces the absolute maximum performance of the chip. This is further compounded by the fact that Kepler chips are voltage locked, so their efficient designs can't be exploited through overclocking as in the past.
A large part of that right now is that I simply don't need that small amount of extra power. My 30" panel @ 2560x1600 generally runs well in all the games I play, with mostly max settings and maxed AF plus some AA. My 120Hz panel also gets high frame rates at 1080p with this card.
If there were some really pressing need for that extra bit of performance, you might be able to justify the purchase, but let's face it: with the endless stream of Quake 3 engine based CoD games coming out, directly ported from consoletown... there's no possible way to justify it without running something insane like 3x1080p panels or bigger resolutions.
Maybe when the new consoles land we'll see a sharp increase in quality and flip back to being GPU-limited in our games. Then maybe I could justify buying a 30% increase from an underwhelming new generation, but right now it's kind of laughable, and pathetic.
That comes down to personal experience, I think. BF3 will certainly crush your GTX 580 at 2560x1600 without turning down settings, as will newer powerhouses like Crysis 3, Metro: Last Light, etc. However, if you don't mind turning down settings in a few games, then you can certainly save some money.
To relay my experience: last January I upgraded to a 7970 at release because reviews showed that with a good overclock it would double the performance of my then-current "6970" (unlocked 6950). I got the card, benched it, and it turned out I was right:
https://docs.google.com/spreadsheet...FlmUGVQMUZReHI0bFg4czR1Z3AwdXc&hl=en_US#gid=1 . Furthermore, it mined bitcoins at nearly double the hash rate with only a fractional power increase. The card paid for itself this summer when bitcoin prices rose and I dumped my wallet, and it now continually makes me a profit. All in all, this makes the 7970 hands down the best graphics card I have ever purchased.
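To make the "paid for itself" arithmetic concrete, here's a minimal sketch of a mining payback calculation. The function and every number in it are hypothetical illustrations, not figures from my spreadsheet or wallet; plug in your own card cost, hash rate, and electricity price.

```python
# Hypothetical back-of-the-envelope payback estimate for a mining card.
# All figures below are assumptions for illustration only.

def payback_days(card_cost, hashrate_mhs, btc_per_day_per_mhs,
                 btc_price, extra_watts, electricity_per_kwh):
    """Days until mining revenue, net of power cost, covers the card."""
    daily_revenue = hashrate_mhs * btc_per_day_per_mhs * btc_price
    daily_power_cost = extra_watts / 1000 * 24 * electricity_per_kwh
    daily_profit = daily_revenue - daily_power_cost
    if daily_profit <= 0:
        return float("inf")  # card never pays for itself at these rates
    return card_cost / daily_profit

# Made-up example: a £430 card at 650 MH/s, with assumed BTC yield,
# price, extra power draw, and a £0.15/kWh electricity rate.
days = payback_days(card_cost=430, hashrate_mhs=650,
                    btc_per_day_per_mhs=0.0002, btc_price=30,
                    extra_watts=120, electricity_per_kwh=0.15)
print(round(days))  # roughly four months under these assumptions
```

The point isn't the exact numbers; it's that once you fold in the power cost, doubling the hash rate for only a fractional wattage increase is what makes the payback period short.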
I think the thing to take away is that graphics cards are more advanced, and involve a more rigorous purchasing decision, now than ever before. Gone are the days when you could look up a few benchmarks and clearly settle on your next purchase. It really comes down to each individual's needs and expectations.