Here we go again: the endless debate about CPU gaming benchmarks. I agree that both kinds of benchmark are needed, as they tell you different things. The problem is when one benchmark is used to show something that it doesn't.
GPU-limited benchmarks show only one thing: at the settings tested, every CPU that reaches the same frame rate is "good enough" to drive the test GPU to its maximum in that particular game at those settings. With a more powerful GPU, or a more CPU-limited game, the results would most likely be different. The problem comes when someone extrapolates that because CPU A equals CPU B in one game at one GPU-limited setting, they are equal as gaming CPUs.
Low-resolution benchmarks show the differences in CPU performance more clearly, though the same relative differences obviously will not hold once the GPU becomes the limiting factor. I would say, however, that if one CPU outperforms another CPU of the same architecture at low resolution, it can pretty safely be extrapolated that it will ultimately allow higher performance with a powerful graphics card. This is the usefulness of low-resolution tests. The user must determine whether their ultimate use will be GPU- or CPU-limited.
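To make that concrete, here is a toy model in which the delivered frame rate is simply the lower of the CPU's and the GPU's frame-rate caps. All the numbers are hypothetical; the point is only to show why two different CPUs look identical at GPU-limited settings but not at low resolution:

```python
# Toy bottleneck model of a game benchmark: the delivered frame rate is
# capped by whichever component is slower. All numbers are hypothetical.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate the benchmark would report: the lower of the two caps."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU frame-rate caps (frames each CPU can prepare per second).
cpu_a = 200.0
cpu_b = 150.0

# Hypothetical GPU frame-rate caps at two settings levels.
gpu_4k_ultra = 90.0    # heavy settings: GPU-limited
gpu_720p_low = 400.0   # light settings: CPU-limited

# At GPU-limited settings both CPUs look identical...
print(delivered_fps(cpu_a, gpu_4k_ultra))  # 90.0
print(delivered_fps(cpu_b, gpu_4k_ultra))  # 90.0

# ...while low-resolution testing exposes the real CPU gap.
print(delivered_fps(cpu_a, gpu_720p_low))  # 200.0
print(delivered_fps(cpu_b, gpu_720p_low))  # 150.0
```

In this sketch, the GPU-limited test says only that both CPUs clear the 90 fps bar, not that they are equal.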
If only one test system is available, I would prefer to see a very high-end card, with tests done at lower settings to evaluate the CPUs and at increasingly demanding settings to see where the GPU becomes limiting. One could then extrapolate from that data to get some idea of GPU vs. CPU limitations with other cards.
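A rough sketch of that methodology, again with entirely hypothetical frame-rate caps: sweep the settings from light to heavy on one high-end card and note where the bottleneck flips from the CPU to the GPU:

```python
# Sketch of the proposed methodology: one high-end card, settings swept from
# light to heavy, noting where the GPU overtakes the CPU as the bottleneck.
# All frame-rate numbers are hypothetical.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_cap = 160.0  # hypothetical CPU frame-rate cap for this game

# Hypothetical GPU caps for a high-end card at increasingly heavy settings.
settings_sweep = [
    ("720p low",     420.0),
    ("1080p medium", 240.0),
    ("1440p high",   140.0),
    ("4K ultra",      75.0),
]

for name, gpu_cap in settings_sweep:
    fps = delivered_fps(cpu_cap, gpu_cap)
    limiter = "CPU" if cpu_cap < gpu_cap else "GPU"
    print(f"{name:13s} {fps:6.1f} fps ({limiter}-limited)")
```

The crossover row (here, between 1080p and 1440p) is the useful datum: settings below it measure the CPU, settings above it measure the card.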