And in the Sandy Bridge review Anand used low resolutions (even 1024 x 768) and no AA/AF in order to make the Phenom II look bad compared to Intel.
No matter that nobody runs a fast CPU and video card at those settings.
Copied and pasted from another thread where I had to make this same explanation:
It's not "showing off" or "making a CPU look bad"; it's demonstrating the contribution the CPU makes to the task. Obviously both the CPU and GPU need to be fast for most games to work properly, but some games just don't care and are heavily CPU- or GPU-dependent. If you bottleneck the game by running GPU settings that are too high, you can't see potential CPU bottlenecks.
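To put rough numbers on that idea, here's a minimal sketch with made-up frame-rate caps (none of these are figures from any review): a scene can't run faster than its slower component allows, so cranking the GPU load makes very different CPUs look identical.

```python
# Hypothetical illustration: frame rate is capped by whichever component
# is the bottleneck. All numbers are invented.
def effective_fps(cpu_cap_fps, gpu_cap_fps):
    """A scene can't run faster than the slower of the two limits."""
    return min(cpu_cap_fps, gpu_cap_fps)

# High resolution + AA/AF: the GPU cap dominates, so a fast CPU and a
# slow CPU post the same score and look "equal".
print(effective_fps(cpu_cap_fps=120, gpu_cap_fps=40))  # -> 40
print(effective_fps(cpu_cap_fps=60,  gpu_cap_fps=40))  # -> 40

# Low resolution, no AA/AF: the GPU cap goes way up and the CPU
# difference finally shows in the result.
print(effective_fps(cpu_cap_fps=120, gpu_cap_fps=300))  # -> 120
print(effective_fps(cpu_cap_fps=60,  gpu_cap_fps=300))  # -> 60
```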
Remember that a benchmark number is an average over the whole run.
Let's say half a benchmark run is easy on the GPU and half is hard on it at high resolution (say, indoors and outdoors). You end up with an average frame rate of 30 FPS because indoors is getting 50 and outdoors is getting 10. That average hides whether the CPU is actually supplying enough data for some theoretical faster video card to run the outdoor section well above 10 FPS. Once the GPU-bottlenecking details are turned down or off and the game can 'run free', you can see whether the CPU will let it run any faster. If it's still only doing 10 FPS outdoors, the CPU is also holding it back, and you'd need a faster CPU (and GPU) to run the game better. On the other hand, if the cheapest CPU on the market runs that 10 FPS section at 400 FPS with the GPU-dependent details turned down, it's safe to say a faster CPU is not needed to improve the game.
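As a toy calculation with the same made-up numbers (nothing here is measured data), assuming equal time spent indoors and outdoors:

```python
# Hypothetical indoor/outdoor run, equal time in each half; all numbers invented.
indoor_fps, outdoor_fps = 50, 10
average_fps = (indoor_fps + outdoor_fps) / 2
print(average_fps)  # -> 30.0, the headline "benchmark result"

# Re-run just the outdoor half with the GPU-heavy details turned down.
# If it jumps, the GPU was the only limit; if it barely moves, the CPU
# is a ceiling too and a faster CPU would actually help.
outdoor_fps_low_detail_good_cpu = 400  # CPU can feed frames far beyond 10
outdoor_fps_low_detail_weak_cpu = 10   # CPU itself can't push past 10
```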
This is why you must try to isolate the components. Make a CPU decision based on CPU-isolated tests, and make a GPU decision based on GPU-isolated tests.