I don't think it was Sandy Bridge; I think it was the fact that Intel's 32nm HKMG was a process node built like a tank, one that could take the abuse of the 1.5V OCs needed to push SB to 5GHz without killing the chip through degradation or sudden death syndrome.
Sandy Bridge the microarchitecture really is a power-miser that sips watts when it's clocked where it was intended to clock. Ivy Bridge shows us this.
But what Ivy Bridge doesn't have going for it is a 22nm node built like a tank the way 32nm was. Instead, 22nm was built to be dainty and enable all sorts of magical sub-10W SKUs for tablet and phone form factors. Push the voltage as needed to get the clockspeeds out of a 3770K and suddenly your temperatures and power consumption go to 11.
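To put rough numbers on why pushing voltage hurts so much: dynamic power scales roughly as C·V²·f, so the voltage term hits quadratically. A back-of-the-envelope sketch (the stock/OC voltages and clocks below are illustrative round numbers, not measured values for any particular chip):

```python
# Dynamic power scaling: P_dyn is roughly proportional to C * V^2 * f.
# The voltages and clocks below are illustrative, not measured values.

def relative_dynamic_power(v_stock, f_stock, v_oc, f_oc):
    """Ratio of overclocked dynamic power to stock, with capacitance C canceling out."""
    return (v_oc / v_stock) ** 2 * (f_oc / f_stock)

# e.g. ~1.2V @ 3.4GHz stock vs ~1.5V @ 5.0GHz overclocked
ratio = relative_dynamic_power(1.2, 3.4, 1.5, 5.0)
print(f"~{ratio:.1f}x the dynamic power")  # roughly 2.3x
```

And that's before leakage, which gets worse with voltage and temperature too, which is exactly where a node tuned for low-power SKUs gives up headroom.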
IMO Sandy Bridge the CPU product (like my 2600K) is only viewed as being great compared to its successors not because of its microarchitecture but because of the underlying process node (32nm 2nd gen HKMG).
(Bolding the part I'm commenting on.)
Not to make this sound like just another vapid "I agree with you" post, but I do wholeheartedly agree with you on this one. As my interest in all things gaming-related has shifted from GPUs to CPUs over the last few years, I've begun to realize that with all architectures, the home is only as good as the foundation it's laid on. I remember being blown away by my roommate's Nehalem-based Core i7-920 when it was first released, but Intel gave us another "Core2 moment" with the release of Sandy Bridge. Thanks in large part to the underlying 2nd gen 32nm HKMG process as you described above, Sandy Bridge gave us nice improvements in efficiency, clock speeds, and IPC, which combined blew the doors off an already great CPU architecture. It was like Intel opening up a new restaurant down the road and beating everyone in price, convenience, and quality all at once.
I also want to comment on the 22nm part. It wouldn't surprise me if Intel's corporate culture is similar to Microsoft's. When you lack real competitors, what do you do? Where do you go? That's not to say that either company releases bad products or stops pushing boundaries where possible, but companies that make it to the top tend to adopt more conservative approaches to product development to protect the assets they already have. Basically, what I'm saying is that companies become more "reactive" in nature.
Intel got beaten by ARM in the ULP mobile category, and Intel is now reacting.
And that changes the foundation their products are built on. As we both know, you can't make a process node that excels in both high-power and low-power applications at the same time. And from my understanding, it's easier (and cheaper) to develop a low-power process node than it is to develop a good, low-leakage, high-performing one. It's a shame that we probably won't see another tank-like equivalent of the 2nd gen 32nm HKMG node come out of Intel ever again, but given the economics of where the market is going, are we surprised?
I also look back to an article I read in PC Gamer while I was in high school back in the late '90s. The article stated that one day, the CPU and the GPU would ultimately be combined into the same package, and I thought to myself, "Well, where's the fun in that?" What I didn't realize at the time was that GPUs would ultimately become much, much more versatile and more capable, and that programmers would utilize this added functionality to both enhance and accelerate the programs they created. I don't know about you guys, but I'm streaming 1080p Starcraft 2 from TwitchTV on my second monitor and have another tab open from when I was searching for Baskin Robbins on Google Maps earlier when my sister wanted ice cream. This, obviously, is a highly trivial task for any modern computer to do these days, but when I read that article in PC Gamer years ago, that definitely wasn't the case.
How I lived this long without Starcraft 2 I haven't a clue.
So that raises the question: is this trend necessarily a bad thing for Intel? If Intel manages to "hold the line" on clockspeeds with all architectures/process nodes moving forward while vastly improving the low end all along, then I can't really see that as a bad thing. Sure, it'll mean we won't see another Sandy Bridge-like release at the high end for a long, long while, but that doesn't stop programmers from pushing boundaries with current hardware. Yes, it'll make things we've enjoyed in the past like overclocking largely unexciting (or worse, impossible), but there are bound to be benefits that maybe we haven't realized yet.
Like Starcraft 2... we could always use more hidden gems like Starcraft 2...
In the end, I think we'll see a product from Intel that we can all agree is much better than what's available out there now. It just may not be what we've been accustomed to having these last 10 years. Sometimes change isn't a bad thing.