I just don't get why it's impossible to consider a few possibilities as not mutually exclusive. For example, didn't early nVidia drivers accidentally expose the 680 as a 670, with the card only renamed once AMD's 79xx series performance levels were revealed? I just don't think the fact that the GK110 (or its successors) was moved to 2013 for potential use in the consumer line (and perhaps beyond) proves anything other than that nVidia learned their lesson from Fermi.
It's very clear that GK104 is an evolution of the same design mentality that produced the 460 and 560. It's also very clear that the GK110 is an evolution of the same design mentality that produced the 470/480 and 570/580. I don't think nVidia had any intention of releasing the GK110 product to consumers any time this year once they started assessing actual fabrication of said products. I believe they said as much repeatedly. That said, I don't think that was their original intention back in the day and what I think most people would blame AMD for in this regard is not the fact that the nVidia high end wound up as "just" the GK104 product, but that AMD set the bar so low with the Tahiti line that nVidia was able to swing the GK104 as "high end."
I don't think it was designed to be high end. I think it was designed as a successor to the 460/560 line, and they were going to release the "mid-range" this year, hoping a dual-GPU card would fill the high end. Instead, AMD made them happy (and they said that, too, at the time of the R7970 launch) with performance levels so unimpressive that nVidia could justify bumping the price and branding up on a card they were about to release as a 670 or lower.
That'd explain why nVidia is making more money than AMD currently, btw. nVidia is regularly getting $500 for a GPU that costs considerably less to make, since it was initially designed for cards selling at $200-$300. The amount of RAM in the default config should help you see that. Meanwhile, AMD's busy producing a larger GPU and charging less for it, probably hitting a point far closer to their cost to manufacture than they'd like.
My point is basically that I think AMD set the bar very low and nVidia gleefully capitalized on that by marking their product up to higher levels than they'd dreamed possible. From what I remember, last year a lot of the rumor sites were saying GK110 was looking late, possibly slipping into 2013, and unlikely to have any effect at all on the consumer market. More to the point, I remember having the distinct impression that nVidia was going to try and weather the storm with mid-range products and dual-GPU as the high end. That's exactly what they've done, except instead of "weathering the storm" they've pwned the storm.
This is all a direct result of AMD's failure to produce a card that pushed the high end. nVidia had what they had, and their card was designed to satisfy the mid-range, but suddenly they looked like prophets given its performance level and the way it matched AMD's performance in the high end. Neither company produced cards that really pushed the boundaries in the way we're used to, so each company focused on its strengths (ironically, the two have swapped strengths since last generation: suddenly AMD cares only about compute and nVidia only about efficiency). But GK110 was too big and complicated to produce reliably on a relatively new 28nm process.
It was meant to be the high end, but nVidia learned from their past mistakes (i.e., Fermi in particular), and instead of delaying and waiting as they did with Fermi, they planned ahead. They knew they could use the mid-range to keep the boat afloat while waiting for fabrication technology to catch up to GK110's complexity. In the meantime, they could use on-card SLI to keep up with AMD when AMD decided to bring it.
Except AMD didn't bring it. They were a wet noodle. Suddenly, nVidia was so smug they couldn't resist saying so in response to AMD performance benchmarks, admitting they thought AMD's performance levels would be higher. That was early this year.
So now, why WOULD nVidia rush to bring out a complex GK110 part to face down a minor performance advantage in a few benchmarks by the 7970GE this close to the next generation of cards? Why, when they can still charge $400 for the 670 and $500 for the 680 versus Radeon cards with their pricing in freefall? Clearly, that would be the most stupid thing in the world to do. Especially when GK110 is so complex they'd take a beating trying to keep its pricing reasonable.
Instead of pushing out cards with bad fab and/or leakage, having to hack off parts of the chip (Fermi) to get them out the door, they can just shrug and say, "It'd cost us an arm and a leg. After Fermi, we need to milk the market for a while." So they'll just use their smaller, cheaper, more focused part until such time as fabrication catches up to the complexity of the GK110. They realized this was what they'd have to do before they saw the Tahiti performance benchmarks. What those benchmarks let them do, though, is realize they could still have high end pricing. Before that, they seem to have intended the GK104 series to be at mid-range pricing with what became the 690 being the high end if they needed it to match whatever AMD produced.
And if you think nVidia would release a GK110--a whole new chip--just to win a couple of benchmarks (the ones they're not already matching or winning) against the poorly named and poorly launched Radeon 7970GE, then I think you really need to reassess. AMD hasn't given them a good reason to push harder on fabricating the GK110. Everything this year has just proven they can sit on the GK104 and its successors for the near term, because AMD doesn't have anything anywhere close to the GK110 level, so they don't need the Fermi-like tradeoff of performance for more heat.
I suspect nVidia is realizing, wholly by accident, what AMD professed several years ago: it is better to have a smaller GPU that is less generically awesome but razor-focused on the core usage model of discrete GPUs than it is to have a Fermi- or Tahiti-like chip. nVidia happened upon it by accident when GK110 turned out to be a whale of a chip to fab. That's why they pushed ahead hard with GK104, and that's why, when Tahiti showed up and they were like, "That's it?", you could practically hear the palpable joy in the way they said it.
It's like gearing up for the fight of your life, knowing you twisted your ankle on the way there, only to arrive and find your opponent is a one-armed man. Suddenly, the fight's in your favor. That's why all the rumors from last year that had been saying, "GK110 isn't coming next year, but nVidia's mid-range is" suddenly turned into, "nVidia's mid-range is now being renamed 680 because it can match performance."
Hell, I remember reading forums of people saying, "There's no way nVidia's producing a mid-range card that can match the 7970!" The rumors were that pervasive. And whaddya know? That's exactly what nVidia did, except they figured if they had a card that matched the high end bar set by AMD then they should get the pricing of it, too.
So, I think AMD probably would have been better served depleting any excess inventory they had in the channel (and there were a lot of 68xx, 69xx cards for sale for a long time after the 78xx and 79xx series) with solid pricing and showing up in June with truly refined drivers (12.7-like), higher clocks, and pricing that matched up to slightly higher than what they have now.
RS likes to talk about "First Mover" being a great strategy, but I suspect somewhat lower prices paired with awesome drivers providing Geforce-beating performance, with reviews all unequivocally saying, "Radeons are faster, with great power efficiency and more RAM, for lower prices!", would have made the whole line do much better against nVidia than what's happened.
AMD released their cards before they had proper drivers for them. That gained them a reputation for lower performance than they have now and for worse power-efficiency numbers than they really have, plus a reputation for Crossfire problems that continues to this day, and it let nVidia get a few thousand reviews all saying, "Kepler is incredibly more efficient than Tahiti yet performs the same and costs an equivalent amount! You'd be stupid to buy the hotter card." That legacy continues to haunt the 7970 series. This is part of the reason for the 7970GE launch, but even that was tainted by AMD's baffling decision to pre-launch the card months in advance, and to use the worst possible cards with the highest power usage they could find to do that pre-launch. By the time the real cards showed up, those reviews were not indicative of the final product, but good luck finding any reviewer going back to edit their reviews on that fact.
Newer reviews might say otherwise, but few care about those reviews. Most are reading that initial review where the whole card, its technology, its legacy and history, and all its promise are described in minute detail. Remember the "first reviewer advantage" is always to the first few reviews of a given product and what they say about said product. That defines the product and only incredible shifts can change that early perception once a perception has set in.
What's the first thing people remember about the Radeon 7970? That it launched at high prices for a low performance gain. The first thing about the 680? Great efficiency, much better than the R7970, and it pushed Radeon pricing into freefall. The 670? 95% of the 680 for 20% lower prices. The R7950? Overpriced until very recently. Etc., etc.
Everything about the Radeon line is tainted by those early reviews using drivers that put these cards in a bad light. If you google these cards, more often than not you get reviews from early this year. Not everyone knows the little things, like the 12.7 Catalyst drivers being truly awesome and helping the cards push past the Kepler products in many ways, or that 7970GE cards from the best OEMs are actually awesome in efficiency.
They don't know these things because AMD did such a crappy job of a first impression and nVidia used that crappy impression to make their first impression sterling by comparison. AMD seems to have learned their lesson if their delaying the 8xxx series is any indication.
But just because nVidia doesn't apply the GK110 or its successors as the high end of the consumer division next go-round, that does not prove to me that they NEVER intended it to take on that role. They don't NEED it to take the role. Having the high end crown matters not at all if you're able to charge more for your high end card and sell out of them, even while your opponent wins a few more benchmarks and yet has to sell their card for $100 less to lure anyone to buy it. I think they intended to push harder for GK110, but once it became clear it was unnecessary, they de-prioritized it. It just wasn't required. Nothing in the consumer market really requires it, and nothing AMD's making comes close to making the competition necessary.
So now, nVidia has no motivation to do anything differently. Fermi ate up a lot of profits, and why would they want that repeated this go-round? They can take their time. This generation has been exceptionally kind to them in every way. They could release their cards as slowly as they liked, launch mid-range cards at high end pricing and still be called the winner, and keep the channel well stocked yet sell out dramatically across the board, all while charging more for less performance than their competition.
And all thanks to some really drab reviews and horrible word of mouth (all those former Radeon users who sold their Radeons to get 680s and 670s to escape the pre-12.7 crap). Much of this was fueled at least in part by poor launch planning by AMD and a legacy of bad drivers still pungent from late last year (especially at Rage's launch), when, in the build-up to the Radeon 7970 launch, the driver teams had to focus on the very new and different GCN architecture to the detriment of the older Radeon products. It didn't help that Crossfire was completely broken in their first few drivers for the 7970, either.