Can you explain what you mean by this? Why is it pointless to run CPU benchmarks at low resolutions?
Because no one games at 800x600 or 1024x768 with a GTX 680 or similar. The point of CPU and GPU testing is to give PC builders an assessment of
real-world gaming performance. If you fire up a game like Crysis 3, Tomb Raider, or Metro: Last Light with a GTX 680, 780 Ti, etc. at common resolutions like 1080p/1440p/1600p, and the results show little to no difference in gaming performance between an i7-2600K and an i7-4770K, then that is the correct conclusion. The point of benchmarking is not to skew results to sell products, but to show us whether or not the new part will provide a tangible, real-world gain. In most games now, even at 'budget' 1080p resolution, we are more GPU- than CPU-limited with any modern Intel i5/i7. The biggest differences between CPUs show up in minimum frame rates.
Even then, an upgrade from a Titan/GTX 780 Ti to the next flagship from NV/AMD will net a greater performance improvement in 98% of PC games than a move from an i5-2500K to Devil's Canyon. Even if you run into a CPU-limited situation where you are above 100 fps, you can increase visual fidelity via a variety of AA settings like SSAA and shift the load to the GPU. One common exception is Blizzard titles, which tend to be dual-threaded and rely heavily on clock speed and IPC.
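To make the bottleneck argument concrete, here is a toy sketch (my own simplification, with made-up frame times purely for illustration) of why a CPU "win" at a low-resolution test can vanish at real-world settings: assume each frame is gated by whichever of the CPU or GPU takes longer on its share of the work.

```python
# Toy bottleneck model: frame rate is roughly set by the slower of the
# CPU and GPU per-frame times. Numbers below are hypothetical.

def fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """Frames per second if the slower component gates each frame."""
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

cpu_a_ms = 6.0   # faster, newer CPU (~166 fps if purely CPU-bound)
cpu_b_ms = 7.5   # older CPU, e.g. a Sandy Bridge (~133 fps if CPU-bound)

gpu_lowres_ms = 4.0   # 800x600-style CPU test: GPU barely works
gpu_1440p_ms  = 14.0  # demanding 1440p settings: GPU sets the pace

print(fps(cpu_a_ms, gpu_lowres_ms), fps(cpu_b_ms, gpu_lowres_ms))
# ~167 vs ~133 fps: the low-res test shows a ~25% "CPU advantage"...

print(fps(cpu_a_ms, gpu_1440p_ms), fps(cpu_b_ms, gpu_1440p_ms))
# ~71 vs ~71 fps: ...which disappears once the GPU is the bottleneck.
```

Piling on SSAA or higher settings raises the GPU frame time further, which is the "shift the load to the GPU" point above.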
Those of us who'll be upgrading from an i5-2500K or similar to some future Intel CPU (Devil's Canyon or Broadwell/Skylake) will do so because we: (1) are bored and want something new to play with; (2) want next-gen features like Ultra M.2; or (3) are keeping up with our hobby. I can't see how a 5.0GHz Haswell will provide any tangible performance improvement in games over a 4.4-4.5GHz Sandy Bridge. The money is better spent on a GPU or SSD upgrade when it comes to real-world performance gains.