The real measure of any CPU is a benchmark. That benchmark should be a program that you actually use. If you are a gamer, then that benchmark is Q3/UT/etc. If all you do is surf the web, then a 533 MHz Celeron is as good as a 1.3 GHz Athlon.
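To make that concrete, here is a minimal sketch of "benchmark the program you use": a bare-bones clone of time(1) that just runs whatever command you pass it and reports wall-clock time. The workload you feed it is the whole point; nothing here is specific to any particular CPU, and the program name is whatever you happen to run all day.

/* Minimal wall-clock timer: run the program you actually use and time it. */
#include <stdio.h>
#include <sys/time.h>
#include <sys/wait.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s command [args...]\n", argv[0]);
        return 1;
    }

    struct timeval start, end;
    gettimeofday(&start, NULL);

    pid_t pid = fork();
    if (pid == 0) {                     /* child: run the real workload */
        execvp(argv[1], &argv[1]);
        perror("execvp");
        _exit(127);
    }
    waitpid(pid, NULL, 0);              /* parent: wait for it to finish */

    gettimeofday(&end, NULL);
    double secs = (end.tv_sec - start.tv_sec)
                + (end.tv_usec - start.tv_usec) / 1e6;
    printf("wall-clock time: %.3f s\n", secs);
    return 0;
}

Run it against your own workload on the two CPUs you are comparing and you have the only benchmark number that matters to you.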
Another example: the Athlon beats many individual supercomputer CPUs in the number of floating point operations per second (flops) it can perform. Those supercomputer CPUs, on the other hand, cannot play Quake3. (Well, they could if someone ported it. Someone did exactly that once, as a demo for an immersive virtual reality experiment. But that is getting off on a tangent.)
What good is a supercomputer, then? The benchmark to use on one is not Quake3, or even raw flops. The benchmark to use is the program that you intend to run. For example, CTH (a shock physics code), GASP (computational fluid dynamics), or GAMESS (computational chemistry). Supercomputers use up to thousands of CPUs with a fast interconnect that gigabit Ethernet cannot touch.
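If you want to see the interconnect difference for yourself, the classic test is an MPI ping-pong between two nodes. This is only a rough sketch (message size and iteration count are arbitrary choices for illustration), but the round-trip time over a proper supercomputer interconnect versus gigabit Ethernet makes the gap obvious immediately.

/* Rough MPI ping-pong sketch: measure average round-trip time between rank 0 and rank 1. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    const int iters = 1000;
    char buf[1024];                     /* 1 KB messages, arbitrary size */

    double t0 = MPI_Wtime();
    for (int i = 0; i < iters; i++) {
        if (rank == 0) {
            MPI_Send(buf, sizeof buf, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(buf, sizeof buf, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        } else if (rank == 1) {
            MPI_Recv(buf, sizeof buf, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            MPI_Send(buf, sizeof buf, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
        }
    }
    double t1 = MPI_Wtime();

    if (rank == 0)
        printf("average round trip: %.1f us\n", (t1 - t0) / iters * 1e6);

    MPI_Finalize();
    return 0;
}

Of course, that is still a micro-benchmark; the real test remains CTH, GASP, GAMESS, or whatever code you actually intend to run on the machine.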
Even then, a Beowulf cluster of Athlons can beat a supercomputer, if you choose the right benchmark.
To close, I will repeat myself.
The real measure of any CPU is a benchmark. That benchmark should be a program that you use.