You know, I agree with the general wisdom of the OP's premise (GPU>CPU) in
most applications. We might note AT's own review of CPU performance in Oblivion, however, specifically with regard to CPU/GPU scaling tests done in town:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2747&p=4
Outdoors, where the GPU demands are higher, only CrossFire X1900 XTs required the use of anything more than the lowest-end Athlon 64 (remember the time frame in which the article came out, pre-Conroe). But in towns, with the much higher concentration of NPCs, AT found that even an X1800 XL paired with a 2.6GHz Athlon 64 would soundly beat CrossFire X1900 XTs paired with a 1.8GHz model.
BUT, losing ~9fps (from the low 40s to the low 30s) in that 2.6GHz/X1800 XL vs. 1.8GHz/X1900 XT "town" comparison isn't nearly as important, of course, as the difference in the "outdoor" benchmark: ~12fps, from the upper 20s down to the upper teens.
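To put rough numbers on that (these are just my eyeballed figures from AT's graphs, not exact readings from the article), here's a quick back-of-the-envelope in Python showing why the outdoor drop hurts more:

# Rough illustration of why a similar fps delta hurts more at low framerates.
# The fps values below are placeholders based on my reading of the graphs
# (low 40s vs. low 30s in town, upper 20s vs. upper teens outdoors).

def impact(fps_fast, fps_slow):
    """Return the fps drop, the relative drop, and the added frame time in ms."""
    drop = fps_fast - fps_slow
    pct = drop / fps_fast * 100
    added_ms = 1000 / fps_slow - 1000 / fps_fast
    return drop, pct, added_ms

for label, fast, slow in [("town (CPU-limited)", 42, 33), ("outdoor (GPU-limited)", 28, 17)]:
    drop, pct, added_ms = impact(fast, slow)
    print(f"{label}: -{drop} fps ({pct:.0f}%), +{added_ms:.1f} ms per frame")

The town drop costs you roughly 6-7ms per frame and leaves you above 30fps; the outdoor drop costs you over 20ms per frame and lands you in the teens, which is where the game actually becomes unpleasant.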
Thus the GPU>CPU wisdom generally holds true even in CPU-intensive titles.
Still, I would *not* want to play Oblivion on a Celeron processor. Sure, you can get semi-acceptable framerates if you pair it with ultra-high-end graphics cards, but even then it will be seriously sluggish (19.9 fps in the Oblivion Gate bench, and 21.5 in the town bench with CrossFire X1900 XTs). Considering just how much you can stretch the performance of even a single X1900 with better processors, a Celeron owner would have been better served ditching the second X1900 and upgrading their CPU to at least an Athlon 64 3000+. That setup would have beaten the performance of X1900 CrossFire by 10fps in every benchmark and would probably have been cheaper, even taking a new motherboard & RAM into consideration.
Situation, situation, situation. There is no single magic bullet for upgrading your GPU/CPU. Your GPU is likely to be more of a bottleneck in most games. If you can find the information, however, it just may be that your favorite game needs a slightly better balance of CPU & GPU.