Well, one way to tell is by benchmarking specific games.
For instance:
CPU LIMITED:
-the more the framerate varies across different CPUs with the same videocard and the same game settings, the more CPU limited the game is
GPU LIMITED:
-the more the framerate varies across different videocards with the same CPU and the same game settings, the more GPU (graphics card) limited the game is
Therefore, a general rule of thumb is: the smaller the workload on the graphics card, the more CPU limited the game is; the larger the workload on the graphics card, the more GPU limited it is. This helps explain why a game can be CPU limited at 640x480 and turn GPU limited at 1600x1200 while the CPU and videocard stay the same: raising the resolution shifts the workload onto the graphics card.
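The benchmarking logic above can be sketched in a few lines of Python. The framerates and hardware names below are made up purely for illustration; the point is the comparison, not the numbers:

```python
# Sketch: decide whether a game/settings combo is more CPU- or GPU-limited
# by comparing how much the framerate moves when you swap one component.
# All fps numbers below are hypothetical.

def relative_spread(fps_values):
    """Framerate spread as a fraction of the fastest result."""
    fps = list(fps_values)
    return (max(fps) - min(fps)) / max(fps)

# Same videocard, different CPUs (hypothetical benchmark results):
fps_by_cpu = {"P4 2.4GHz": 58, "P4 3.0GHz": 71, "A64 3500+": 83}

# Same CPU, different videocards (hypothetical benchmark results):
fps_by_gpu = {"9600Pro": 34, "9800Pro": 55, "6800GT": 88}

cpu_spread = relative_spread(fps_by_cpu.values())
gpu_spread = relative_spread(fps_by_gpu.values())

# Whichever swap moves the framerate more is the bigger bottleneck.
if gpu_spread > cpu_spread:
    print("More GPU limited at these settings")
else:
    print("More CPU limited at these settings")
```

Run the same comparison at low and high resolution and you will typically see the verdict flip from CPU limited to GPU limited, matching the rule of thumb above.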
However, this question is difficult to answer in general. For instance, take an older game like Call of Duty. Compare two systems, one with a 9600Pro and one with a 9800Pro, both using a 3.0GHz P4. At 1280x1024 with 2xAA/8xAF, the 9600Pro would be significantly GPU limited and struggling, while the 9800Pro would handle it. But as soon as we switch to a game like Doom 3, both graphics cards become GPU limited with the same 3.0GHz P4. Why? Because the graphics workload has increased. So it depends on the game and on the settings/image quality used.
At the same time, in games where AI matters and the CPU has to calculate a lot of physics, the CPU might be the more important component.
With the latest games and the direction the gaming industry is heading, it's a safe bet to upgrade the videocard twice for every one CPU upgrade. Today the graphics card is far more important than it used to be: an Athlon XP 2500+ paired with a 6800GT will pretty much smoke an A64 4000+ paired with a 9800XT.
You could also read this article, which helps explain GPU/CPU limitations:
Highend CPU's vs High End Graphics cards
In most cases you'll be both CPU and GPU limited to some degree. But in the latest DX9 games you'll be more GPU limited due to the heavy use of intensive shaders.