I would like the benchmarking methods to be detailed enough that I can repeat the same test on my system. If I need to download any demos or whatnot, they should be available. I know that reviewing a video card isn't a science, but the whole point of scientific rigor is to only publish repeatable results.
Publishing minimum framerates is also a must. Who cares if I can get over 60fps on average if it dips to 20 every time I shoot? Or turn?
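Even a quick pass over a frame-time log shows why the average alone hides the dips. Here's a minimal sketch in Python, assuming a hypothetical log with one frame time in milliseconds per line (the file name and format are just placeholders for whatever the review tool actually dumps):

```python
# Rough sketch: average vs. minimum FPS from a frame-time log.
# Assumes one frame time in milliseconds per line; file name is a placeholder.

def fps_stats(path):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    fps = [1000.0 / t for t in frame_ms if t > 0]
    avg_fps = sum(fps) / len(fps)
    min_fps = min(fps)  # the worst single frame; the number that matters when you shoot or turn
    return avg_fps, min_fps

avg_fps, min_fps = fps_stats("frametimes.txt")
print("average: %.1f fps, minimum: %.1f fps" % (avg_fps, min_fps))
```

Two cards with the same average can have wildly different minimums, and the minimum is what you actually feel.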
Also, at least one set of benchmarks should compare whatever cards are the main subject of the review to some older cards. For example, I just upgraded from an X1900 XT last week, and it was a huge pain in the balls trying to figure out what kind of performance gains I would get.
It would be nice if the benchmarking method stayed consistent from review to review, so that if you compare a chart published on March 30th to one published on January 30th, the numbers would line up the way you would expect. At least for one set of benchmarks per review. And 3DMark doesn't count. Granted, a lot can change in the market in just two months, and sometimes it doesn't make sense to review a new enthusiast (or budget) video card with the same settings you used for a card last month, but as much as possible should stay the same, for consistency's sake. I like it when I see an apples-to-apples comparison within a review, but it would be nice to get apples to apples to apples to apples between different reviews.
On the other hand, lately I've been liking and trusting the reviews on HardOCP more and more. Apples to apples is nice, but it's also nice to get an editorial perspective saying, "here is the best gameplay experience you will get with this card, and this is how it compares to the best gameplay experience you will get with this other card."
Lastly, this site does a really good job of comparing stock-clocked cards to overclocked cards, but in reality not everyone will be able to achieve the same overclocks. How about calculating some kind of performance-scaling index that says, clock for clock, how much of a framerate increase (both minimum and average) you could expect per incremental clock increase? Also, does a maximum overclock actually result in a better gameplay experience? Is it smoother? Can you enable higher settings?
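Something like this is all I'm picturing; a rough Python sketch, with every clock and framerate below made up purely for illustration:

```python
# Rough sketch of a "performance increase index": percent framerate gained
# per percent of core clock gained, for both average and minimum FPS.
# All clocks and framerates here are made-up placeholders, not real results.

def scaling_index(stock_clock, oc_clock, stock_fps, oc_fps):
    clock_gain = (oc_clock - stock_clock) / float(stock_clock)
    fps_gain = (oc_fps - stock_fps) / float(stock_fps)
    return fps_gain / clock_gain  # 1.0 = perfect scaling, near 0 = bottlenecked elsewhere

# hypothetical 10% core overclock (600 -> 660 MHz)
avg_index = scaling_index(600, 660, stock_fps=55.0, oc_fps=58.5)
min_index = scaling_index(600, 660, stock_fps=28.0, oc_fps=31.0)
print("average FPS scaling index: %.2f" % avg_index)
print("minimum FPS scaling index: %.2f" % min_index)
```

An index like that would let me estimate what my own, probably more modest, overclock is worth instead of just seeing the reviewer's best-case result.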
I've noticed that, especially with Nvidia and for roughly the last three years, graphics cards differ in specifications other than clock speed, like the number of ROPs or texture units, and those have more of an impact on performance than any overclock you can possibly achieve. A few years ago the manufacturers would ship the exact same chip at a few different clock speeds and price points. Now they just release "OC" versions at different price points. It really doesn't matter to me, because I'm going to make my decision based on real-world performance, not model numbers or clock speeds; however, it would be nice to see some editorial perspective, like "no matter how high the overclock is, the real-world benefit is negligible," or "gamers interested in playing this game at this resolution will get extra value from the higher-clocked cards."
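Just to illustrate the point (every spec below is hypothetical, not any real card): theoretical pixel and texel fillrate scale with ROP and TMU count times core clock, so a wider chip at stock speed can still beat a narrower chip with a healthy factory overclock.

```python
# Rough illustration: theoretical fillrates scale with unit count * core clock,
# so extra ROPs/TMUs can matter more than a clock bump.
# All specs here are hypothetical, not taken from any real card.

def fillrates(rops, tmus, core_mhz):
    pixel_gpps = rops * core_mhz / 1000.0  # theoretical Gpixels/s
    texel_gtps = tmus * core_mhz / 1000.0  # theoretical Gtexels/s
    return pixel_gpps, texel_gtps

narrow_stock = fillrates(rops=16, tmus=32, core_mhz=600)  # lower-end chip, stock clock
narrow_oc    = fillrates(rops=16, tmus=32, core_mhz=700)  # same chip, factory "OC"
wide_stock   = fillrates(rops=24, tmus=48, core_mhz=600)  # wider chip, stock clock

print("narrow stock:", narrow_stock)
print("narrow OC:   ", narrow_oc)
print("wide stock:  ", wide_stock)
```

On paper numbers like these the overclocked narrow chip never catches the wider one, which is exactly why I'd rather see real-world results plus a sentence of editorial judgment than a model number or a clock speed.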