Hello guys, I own an 8800GT SLI setup and am decidedly against AFR (i.e. SLI). I first noticed something like this when I got my second 8800GT. Previously, with one 8800GT, I was able to play Crysis at 16x10 with everything on High without too many problems, aside from some occasional slowdowns (especially in the later stages of the game). When I got my second 8800GT, I benchmarked and got amazing results: with SLI I could push most of the settings to Very High while getting the FPS I used to get from a single 8800GT on High. But when I actually played the game with those mostly Very High settings, it was obviously slower; yet when I checked with FRAPS or r_displayInfo 1, the reported FPS looked perfectly acceptable. In the end I was only able to enable Sunshafts... What a great benefit from the second card.
Then discussions about this started to emerge on various sites, and I knew at once what I was up against.
I started a thread about this issue on Donanimhaber (a famous Turkish hardware news site), and I'd like to share the calculations I did for that thread here as well.
I don't need to repeat why unsynced frames suck; others here have explained that perfectly.
First, some results from Lost Planet:
http://img111.imageshack.us/img111/6594/lpzr1.jpg
It's obvious that every third frame jumps from around 50 FPS to around 150 FPS. I don't think I need to tell you that in this situation your eye perceives the fluidity of a 50 FPS system, plus some stuttering (caused by those super-fast third frames). However, the reported frame rate will be 1000*(63-48)/(981-753) ≈ 66 FPS, i.e. 15 frames rendered over 228 ms of the log.
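To see how misleading that average is, here's a minimal Python sketch; the frametime numbers are made up by me to mimic the pattern in the screenshot, not taken from an actual log:

# Hypothetical frametimes (in ms) mimicking the Lost Planet pattern:
# two "normal" ~50 FPS frames followed by one super-fast ~150 FPS frame.
frametimes_ms = [20.0, 20.0, 6.7] * 10  # 30 frames

# What FRAPS and benchmark tools report: frames rendered / elapsed time.
reported_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
print(f"Reported FPS: {reported_fps:.0f}")  # ~64, though it feels like 50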
So hardware sites will take this number and, I'm sorry to say, stupidly compare it against non-AFR single-GPU numbers and conclude "wow, our FPS increased from a single 8800GT's 40 to 66 when we plugged in another 8800GT!!" Which is obviously nonsense. I can't believe that after all the awareness raised by threads like this in hardware forums, almost no hardware site mentions it in their reviews; including Anandtech, from which I wouldn't normally expect such a thing.
If we take the momentary FPS values in groups of three and, for each group, set every value to the minimum FPS encountered in that group, we can roughly "measure" the sense of fluidity within a game. Applied to the Lost Planet situation I mentioned, the "adjusted" FPS values would be 61/61/61/41/41/41 instead of what you see at the link. This might look very cruel, but in practice it works well: the "real" FPS measured this way doesn't differ from the benchmark-reported FPS in games that have no sync problems with SLI (such as World in Conflict, Assassin's Creed and BioShock). A sketch of the calculation follows below.
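For anyone who wants to reproduce this on their own FRAPS logs, here is a minimal Python sketch of the method; the sample values are hypothetical, chosen only so the clamped result matches the 61/61/61/41/41/41 example above:

def adjusted_fps(momentary_fps, group_size=3):
    # Clamp every value in each group of three to that group's minimum,
    # then average the clamped values: a rough proxy for perceived fluidity.
    adjusted = []
    for i in range(0, len(momentary_fps), group_size):
        group = momentary_fps[i:i + group_size]
        adjusted.extend([min(group)] * len(group))
    return sum(adjusted) / len(adjusted)

# Hypothetical momentary FPS samples with AFR-style spikes:
samples = [61, 150, 70, 41, 145, 55]
print(adjusted_fps(samples))  # clamps to 61/61/61/41/41/41 -> 51.0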
Some fellow geeks from Donanimhaber and I ran this calculation and then took the averages for six games. Here are the results:
Assassin's Creed:
Reported FRAPS FPS: 53
Calculated "Real" FPS: 53

BioShock:
Reported FRAPS FPS: 106
Calculated "Real" FPS: 102

Call of Juarez (<---- Sucks in SLI):
Reported FRAPS FPS: 37
Calculated "Real" FPS: 25

Lost Planet (<---- Sucks in SLI):
Reported FRAPS FPS: 63
Calculated "Real" FPS: 47

World in Conflict:
Reported FRAPS FPS: 42
Calculated "Real" FPS: 41

Crysis (<---- Sucks in SLI):
Reported FRAPS FPS: 38
Calculated "Real" FPS: 27