Most games are actually *NOT* CPU-limited at the resolutions and settings normal people play at.
In the cases where the CPU does affect the average FPS, it's usually only raising the MAX end of the range; the MIN FPS stays the same.
For example, I did some benchmarking of Battlefield Vietnam last night where I artificially capped the MAX FPS at 80 (I logged the FPS with FRAPS, copied the results into Excel, and ran a macro that replaced any number larger than 80 with 80).
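If anyone wants to repeat that post-processing without Excel, here's a minimal sketch in Python of what the macro boils down to. It's not my actual macro, just the same idea; the file name fps_log.csv and the one-FPS-value-per-line format are placeholders for whatever your FRAPS per-second log looks like.

    # cap_fps.py - rough equivalent of the Excel macro's "cap at 80" step
    # Assumes fps_log.csv holds one FPS reading per line (FRAPS-style per-second log);
    # the file name and format are stand-ins, adjust for your own log.
    CAP = 80.0

    with open("fps_log.csv") as f:
        fps = [float(line) for line in f if line.strip()]

    capped = [min(v, CAP) for v in fps]

    print("raw average:   ", round(sum(fps) / len(fps), 1))
    print("capped average:", round(sum(capped) / len(capped), 1))

Comparing the raw and capped averages tells you how much of the "CPU advantage" lives entirely above 80 FPS.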
I used an unlocked Athlon XP and ran it at 200 MHz x 5, 6, 7, 8, 9, 10, and 11 (i.e. 1000 MHz up through 2200 MHz).
There was no significant difference in average FPS until the XP got down to 1200 MHz, and even at 1000 MHz the minimum FPS was no different than it was at 2200 MHz.
In most cases the CPU is simply not the limiting factor in how smooth a game feels. Looking at average FPS alone is close to useless; you need to look at a histogram of the data, or at the very least a MIN / AVG / MAX breakdown. Who cares if you get an increase in MAX FPS? Do you even notice it? HardOCP is the only site that really gets this. Even that Firingsquad article doesn't tell enough of the story: you can tell when a CPU is having SOME effect, but not what that effect is. If it only shows up when the FPS is already over 80 or 100, who really cares?
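For what it's worth, here's a quick sketch of the MIN / AVG / MAX plus a crude text histogram from the same kind of log, again assuming one FPS value per line in a placeholder file called fps_log.csv:

    # summarize_fps.py - MIN / AVG / MAX plus a crude text histogram of an FPS log
    from collections import Counter

    with open("fps_log.csv") as f:
        fps = [float(line) for line in f if line.strip()]

    print("min:", min(fps), " avg:", round(sum(fps) / len(fps), 1), " max:", max(fps))

    # Bucket into 10 FPS bins so you can see where the frames actually spend their time.
    buckets = Counter(int(v // 10) * 10 for v in fps)
    total = len(fps)
    for lo in sorted(buckets):
        bar = "#" * round(40 * buckets[lo] / total)   # scale bars to 40 chars max
        print(f"{lo:3d}-{lo + 9:3d} FPS  {bar} ({buckets[lo]})")

If two CPUs produce the same bars below 80 FPS and only differ in the 100+ buckets, that's exactly the case where the extra MHz buys you nothing you can feel.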
No, it's the graphics card that affects the smoothness of gameplay. Of course the CPU makers probably don't want people to know this, but it's true.