I don't really think "new and exciting" has stopped. You need to widen the perspective. Different parts of computing move in spurts at different times.
Intel has sort of gotten stuck, that is true. Their 3.8GHz part may very well not perform more than 1-5% better at sustained load than CPUs available 2 years ago (3.06@533, 2.8C), due to advanced power management features regulating heat and clock. But AMD has crept steadily forward. And had we been on 64-bit software, that progress might have been slightly more noticeable as well.
But OK, CPU action is slow at the moment.
But other areas have improved. Hard drives, for instance, have grown tremendously in recent years.
More recently, graphics have made huge leaps forward. Much faster than Moore's law, and I think faster progress than ever before.
Personally, I think doing without AF (anisotropic filtering) is a huge loss. AF looks so much better. Same with AA, even at 1280x1024. Regardless of resolution, AF & AA make the 3D world *come together*, and lose the pixel noise and that cardboard feel.
Also, while I don't normally play FPS games, I noticed FC looks a lot better on some of the very highest settings than on the next highest (light, textures and detail/environment). It's also much smoother in terms of 'instantaneous' framerates, as well as less straining on the eyes.
All these things are very good reasons to get a very good videocard.
I find that the OP somewhere motivated his restraint on videocards by the very fact that the progress was so fast, so the money was "wasted" or better spent on the CPU. That's a bit of peculiar logic :/ ...considering the topic line...
For personal private use, I always buy cards featuring the high-end chipset. Not the premium and most expensive, down a bit in clocks is OK, but the top-end architecture: GF3 Ti, GF4 Ti, FX5900, R9800, 6800, whatever that is at the moment. It's so much more rewarding than the 'medium' cards. I've noticed some people, even gamers, put more emphasis on the CPU, often financing the CPU with a pedestrian videocard. I'd suggest the very opposite.
Recently, many hardware introductions have been largely for the manufacturers' benefit, like PCI-e, SATA, DDR2. These will not offer the user any sudden revolutionary breakthrough, but will make higher specs easier to manufacture in the long run.
For the user, videocards are currently the most exciting area.
But I'm hoping that the future will bring some excitement through software improvements: 64-bit, better multithreading for multi-CPU, dual-core and hyperthreading, better vector optimization. The software has to change a bit, then we can probably get the CPU train rolling again. If/when it does, it might be almost as hard and expensive to keep up as it currently is with videocards.
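To illustrate the kind of change "better multithreading" asks of software: a serial loop has to be restructured so independent chunks of work can run on separate cores at once. A minimal, hypothetical sketch (the function names are mine, not from any particular program) of splitting a CPU-bound sum across worker processes:

```python
# Hypothetical sketch: restructuring a serial computation so that
# multiple CPUs/cores can each take a chunk of the work.
from concurrent.futures import ProcessPoolExecutor


def partial_sum(bounds):
    """Sum of squares over one chunk [lo, hi) -- runs on one core."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))


def parallel_sum_squares(n, workers=4):
    """Split [0, n) into chunks and sum them in parallel."""
    step = n // workers
    chunks = [
        (i * step, (i + 1) * step if i < workers - 1 else n)
        for i in range(workers)
    ]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    # Parallel and serial versions must agree on the result.
    print(parallel_sum_squares(10_000))
```

The point is that none of this parallel structure appears for free: unless applications are rewritten along these lines, extra cores and hyperthreading sit idle.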