Originally posted by: toyota
Originally posted by: OCguy
Speaking as someone with a 2X setup, I find it hard to believe.
The second card in a multi-GPU setup is not nearly as stressed as the card in the primary slot, so it will not draw the same amount of power.
In an X3 setup, you can assume the max power draw would be even lower.
Again, look at the results from Xbit, which clearly show a 4870X2 using double the power of a 4870 under full load. That's not a theory; those are real-world results. Plus, TDP is usually much higher than worst-case actual wattage, so the TDP of the 5870X2 being about twice that of the 5870 makes perfect sense.
But the problem with such a 'test' is that it doesn't measure average load in games.
Peak load in some specific test, or in FurMark, is a poor indicator of actual real-world load.
Be it a CPU or a GPU, just because you can run some specialized code through it that forces out a certain peak wattage number does not mean that number reflects the amount of power the component would draw in real-world scenarios.
For example, it's entirely possible that ATI will go the Intel route and simply cap the peak power draw. Show me a test measuring average gaming load and I'll be convinced.
It is also a shame that all power consumption 'measurements' are apples-to-oranges comparisons.
Let's say we have card A and card B. If card A is twice as fast as card B but uses less power to draw each frame (each equal unit of work), then when both cards are limited to 60 fps (apples to apples), card A would use less power.
However, all 'power consumption tests' focus on making the card do the maximum amount of work, so they would find that card A uses more power. It is hardly a surprise that a card doing twice the work uses more power than a card doing less.
While the number they come up with has some use, it is largely not as relevant as the other number: for GPUs, think of playing games that do not max out the card, or the power-per-point ratio for stuff like F@H.
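To put rough numbers on the card A/B scenario, here is a quick Python sketch. Every figure in it is made up purely for illustration, and the linear power-vs-work scaling it assumes is a simplification:

def power_at_cap(max_fps, max_watts, cap_fps):
    """Rough power draw at a frame cap, assuming draw scales
    linearly with frames rendered (a simplification)."""
    joules_per_frame = max_watts / max_fps  # energy per unit of work
    return joules_per_frame * min(cap_fps, max_fps)

# Assumed numbers: card A does 120 fps flat out at 200 W,
# card B does 60 fps flat out at 150 W.
for name, fps, watts in [("Card A", 120.0, 200.0), ("Card B", 60.0, 150.0)]:
    print(f"{name}: {watts / fps:.2f} J/frame, "
          f"{power_at_cap(fps, watts, 60.0):.0f} W at a 60 fps cap")

With these numbers a max-load test crowns card B the 'efficient' one (150 W vs. 200 W), yet at identical work (a 60 fps cap) card A draws 100 W to card B's 150 W.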
Likewise, the same problem showed up in all the i7 vs. PII reviews: measuring power consumption while encoding a movie and finding that the PII uses less power under load is hardly a very interesting result if it also takes twice as long, because over the entire task it ends up consuming more energy (even when the i7's idle power for the remaining time is included).
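The same arithmetic makes the point; again the figures below are assumed, not measured (say the i7 encodes in 30 minutes at 130 W and then idles at 40 W for the rest of the hour, while the PII needs the full hour at 110 W):

def task_energy_wh(load_watts, load_minutes, idle_watts=0.0, idle_minutes=0.0):
    """Total energy in watt-hours over the same comparison window."""
    return (load_watts * load_minutes + idle_watts * idle_minutes) / 60.0

i7  = task_energy_wh(130, 30, idle_watts=40, idle_minutes=30)  # 85 Wh
pii = task_energy_wh(110, 60)                                  # 110 Wh
print(f"i7: {i7:.0f} Wh vs. PII: {pii:.0f} Wh over the same hour")

The PII 'wins' the watt-meter reading under load (110 W vs. 130 W) but loses on the number that actually matters: total energy for the task.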
So my hope is that AT and other review sites will eventually realize this and give me a power consumption number I can actually use, instead of making me do 30 minutes of math to figure out which is best.