I'm very interested, as has been said before in this thread, in why the VRAM utilization is so much higher in Crysis 3...
Is the same amount of system RAM installed in both systems? Other than that, the differences are the PCIe configuration, system RAM speed, and number of cores, and those have previously been shown to have little bearing on framerate, if any at all.
So, RussianSensation, I agree with your assessment that this is a very surprising result. I'm wondering whether it holds true in CrossFire for Fury Xs as well, and which of the factors above is causing the discrepancy.
However, if you got a 3-5% increase from each of the three factors above in a sort of perfect-storm case (setting aside the possible increase in RAM capacity), those gains compound to a 9.3-15.7% overall increase in performance, so maybe TW3's result is reasonable based on the combined factors... A 7-10% increase from each factor (which is largely unheard of for any of them individually) would produce the sort of difference seen in Crysis 3.
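Just to make the compounding explicit, here's a quick back-of-the-envelope sketch. The per-factor percentages are only the hypothetical ranges quoted above, not measured values, and it assumes each factor multiplies framerate independently:

```python
# Rough check of how small per-factor gains compound multiplicatively.
# Per-factor percentages are the hypothetical ranges from the post, not measurements.

def compounded_gain(per_factor_gain: float, num_factors: int = 3) -> float:
    """Overall % gain if each factor independently multiplies framerate."""
    return ((1 + per_factor_gain) ** num_factors - 1) * 100

for gain in (0.03, 0.05, 0.07, 0.10):
    print(f"{gain:.0%} from each of 3 factors -> ~{compounded_gain(gain):.1f}% overall")

# 3% from each of 3 factors -> ~9.3% overall
# 5% from each of 3 factors -> ~15.8% overall
# 7% from each of 3 factors -> ~22.5% overall
# 10% from each of 3 factors -> ~33.1% overall
```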
I mention total system RAM in each system because I seem to recall a Tom's Hardware article exploring CrossFire scaling with the 5800 series versus total system RAM. As I recall, the conclusion was that 16GB fared noticeably better than 8GB (or 6GB vs. 12GB, or something along those lines), but I can't for the life of me find the article...
Sidenote: could we get average/min/max framerate figures for each of the 3 games tested? They'd be easier to compare than a video.