I would argue that since 3D graphics cards have become prevalent, this is about as much as a CPU has ever mattered for gaming.
There are workloads where the CPU is critically important, but for gaming, the actually noticeable difference between a "good" CPU and a "great!" CPU amounts to a little more latency in less than 5% of total frames.
Yes, the extreme budget CPUs are not good, but it has always been the case that the noticeable difference between a moderate CPU and a high-end CPU is pretty negligible for gaming, since the graphics card is usually the primary bottleneck.
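Roughly speaking, if you logged per-frame times, that kind of difference would look something like this minimal sketch (the frame-time numbers below are made up purely for illustration, not benchmarks):

```python
# Hypothetical frame-time logs (ms per frame): a "good" CPU produces slightly
# more slow frames than a "great" one, but the bulk of frames are identical.
frame_times_good_cpu = [16.7] * 950 + [33.3] * 50    # ~5% slow frames
frame_times_great_cpu = [16.7] * 990 + [33.3] * 10   # ~1% slow frames

def slow_frame_share(frame_times_ms, threshold_ms=20.0):
    """Fraction of frames that took longer than the threshold."""
    slow = sum(1 for t in frame_times_ms if t > threshold_ms)
    return slow / len(frame_times_ms)

print(f"good CPU:  {slow_frame_share(frame_times_good_cpu):.1%} slow frames")
print(f"great CPU: {slow_frame_share(frame_times_great_cpu):.1%} slow frames")
```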
I think we are seeing a little bit of a resurgence of people focused on CPU importance for three reasons:
1) Game graphics stagnation + 1080p stagnation + GPU advancements + post-process AA have led to "entry level gaming" GPUs that are very capable of displaying good quality images at 60+ FPS, revealing more CPU bottlenecks than an entry level GPU could in the past. Once you move to enthusiast quality cards, it's becoming nearly standard that a $200 card is running 4x AA at the popular resolution. That never happened before. The CPU bottlenecks are being revealed because the GPUs, relative to what's necessary for 30 FPS, are just way better... and thus more often expose CPU issues in reviews.
These bottlenecks have probably always been there, and they probably aren't large enough for the average enthusiast gamer to even notice, but once a bottleneck shows up in a review, even if that review is for two cards in Crossfire or SLI that are better than what this person has, the psychology has that person knowing there is a constraint somewhere. Then the overkill "more is always better" nature of gaming takes over and people buy more than they need.
2) Poorly ported console games tend to require significant CPU power.
Example: Skyrim before whatever patch it was. Then that patch came out and the CPU issues completely disappeared... because they weren't CPU issues, they were software issues.
Also GTA 4, the worst ported (highest CPU dependency) AAA title I know of.
3) 120Hz monitors have gained some traction, and this is a case where the CPU bottlenecks in #1 potentially become real issues. *This is one case where most CPUs are actually inadequate for current software.* Few games can run at a full 120Hz, regardless of the CPU / GPU combo, largely because of CPU bottlenecks.
I think it will be a LONG time before 120Hz becomes reasonable on mainstream CPUs, simply because the work a game gives the CPU is work that you can't really scale well. On the graphics card you can always have features that can be enabled and disabled that only affect visual quality and not the core gameplay: AA, resolution, depth effects, texture quality, etc... However, the work the CPU does is AI, motion / direction / physics, tracking objects... things that affect the core of the game. On the GPU side, you can scale visual quality, so you can get the same core gaming experience on hardware with 1/10th or less the capability of the top tier hardware, but on the CPU side, there are lots of things you can't afford to scale at all without dramatically changing the core gaming experience. You have to develop a game for the average CPU or shut out a huge portion of your potential market.
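To make that asymmetry concrete, here's a minimal sketch with made-up per-frame costs (the presets and numbers are hypothetical, not from any real game): lowering GPU-side quality settings keeps buying you frame rate only until you hit the fixed CPU simulation cost, and 120Hz leaves only about 8.3 ms per frame for everything.

```python
# Minimal sketch of why GPU work scales with settings but CPU work doesn't.
# All per-frame costs are made-up milliseconds, purely for illustration.

GPU_COST_BY_PRESET = {"low": 4.0, "medium": 8.0, "high": 14.0, "ultra": 22.0}  # ms of GPU work per frame
CPU_SIM_COST = 9.0  # ms of AI / physics / object tracking per frame -- fixed by the game design

def max_fps(preset):
    # CPU and GPU work largely overlap, so the slower of the two sets the frame rate.
    frame_time_ms = max(GPU_COST_BY_PRESET[preset], CPU_SIM_COST)
    return 1000.0 / frame_time_ms

for preset in GPU_COST_BY_PRESET:
    print(f"{preset:>6}: ~{max_fps(preset):.0f} FPS")

# Going from "ultra" to "low" more than doubles the frame rate, but then you hit
# the ~9 ms CPU wall that no graphics setting can move -- and 120Hz requires the
# whole frame, CPU work included, to fit in 1000 / 120 ≈ 8.3 ms.
```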