Originally posted by: lifeguard1999
Originally posted by: Sunrise089
3 changes, in order of desire:
1) Include many cards in each comparison - stop leaving out popular cards like the X800 XL and X850 XT in reviews of new games.
2) Test on a variety of hardware - even though many of us overclock, we don't all have 2.8GHz FX-57s, so maybe test with a 512KB-cache Athlon 64, or at least with something clocked at around 2.4GHz.
3) Test as many resolutions as possible with and without AA/AF - especially low resolutions for those of us with CRTs who may drop the resolution to improve playability, plus widescreen resolutions.
The problem with 2 & 3 is that the CPU becomes the limiting factor. Would you really want to buy the 7800 GTX if a 6600 GT is just as fast because you are CPU limited??? (In theory I agree with you, though.)
Although that is true, I think many people are considering staying with their AGP-based motherboards (P4 and Athlon XP systems), so it might be interesting to see just how much the CPU limits some games, i.e. do one set of tests on an uber system and another set on a more common system.
All I'd like to see is a 95th-percentile FPS rating, i.e. "95% of the time (while running this test, playing this demo) the frame rate stayed above xxx fps." This would even be useful at lower resolutions, where many cards perform really well overall but one card handles a particularly heavy scene better than another; this rating would show that very well, since a single heavy scene would really drag down the rating of a card that couldn't cope with it.
And isn't that really what we want to know: not that the average frame rate will be xxx, but that 95% of the time I'm playing this game, I will get above X frames per second. It's also better than the minimum frame rate, because min fps only tells you that for one second of a long 4-minute test the frame rate dropped to that value, for whatever reason (HDD swap? maybe something else?).
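For what it's worth, here's a rough sketch of how that rating could be computed, assuming the benchmark tool can dump per-frame render times to a log (the file name and function here are just hypothetical, not anything a current review tool actually outputs):

```python
# Minimal sketch (assumption, not from any real benchmark tool) of the
# "95% of the time the frame rate stayed above X fps" rating, given a log
# of per-frame render times in milliseconds, one value per line.

def fps_exceeded_95pct(frame_times_ms):
    """Return the frame rate met or exceeded for 95% of frames.

    Equivalent to converting the 95th-percentile frame time to fps:
    only the slowest 5% of frames fall below the reported value.
    """
    if not frame_times_ms:
        raise ValueError("no frame times recorded")
    ordered = sorted(frame_times_ms)                        # fastest to slowest frame
    idx = min(len(ordered) - 1, round(0.95 * (len(ordered) - 1)))
    return 1000.0 / ordered[idx]                            # ms per frame -> fps

# Hypothetical usage with a frame-time log from a 4-minute demo run:
# times = [float(line) for line in open("frametimes.txt")]
# print("95%% of the time the frame rate stayed above %.1f fps" % fps_exceeded_95pct(times))
```

A single slow second (a disk swap, say) barely moves this number, unlike a raw minimum-fps figure, but a card that consistently chokes on heavy scenes gets punished for it.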
TC