He is using 990MHz @ 1.03V. That's why. Increasing the voltage from 1.175V to 1.256V raises my power consumption by 42-43W alone! Now think about him dropping his GPU clock to 990MHz and his voltage from 1.175V all the way down to 1.03V. That should easily drop power consumption by 40-50W+.
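For a rough sense of why both changes matter, dynamic power scales roughly with frequency times voltage squared (P ∝ f·V²). Here's a back-of-the-envelope sketch; the ~200W baseline is a hypothetical figure for illustration, not a measured value:

```python
# Rule-of-thumb dynamic power scaling: P is roughly proportional to f * V^2.
# Ignores static/leakage power, so treat the result as a ballpark only.
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Estimate power at a new clock/voltage from a known baseline."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical baseline: a card drawing ~200W at 1175MHz core / 1.175V,
# dropped to 990MHz / 1.03V as described above.
estimate = scaled_power(200.0, 1175.0, 1.175, 990.0, 1.03)
print(round(estimate))  # ~129W, i.e. a drop on the order of 70W
```

Even with leakage and board losses muddying the picture, the V² term alone explains why undervolting buys far more than downclocking by itself.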
I tried running the memory at 1375MHz and at 685MHz (the lowest it goes for me in MSI Afterburner's Unofficial overclocking mode 2). There is only a 4-5W difference in power consumption, but performance is slightly worse at 685MHz. Card temperature is unaffected for me whether the memory runs at 685 or 1375MHz.
I wonder if he used ATI Tray Tools to drop the memory to 180MHz, then?
His throughput of 2.9GH/s across 5 cards works out to ~580MHash/s per card, about 15% slower than my card (680MHash/s); put another way, my card is ~17% faster. That difference would come to almost 1.8-2 extra BTC/month, or $14-16. For me, the loss in mining production isn't worth saving 40-50W of power per card, since my monthly electricity cost for ~200W is less than $20/month to begin with. That extra ~17% of output almost pays for my monthly electricity cost to mine.
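The per-card arithmetic above can be checked quickly (note the two percentages use different bases, which is why "faster" and "slower" give different numbers):

```python
# Per-card hashrate from the quoted 5-card total, vs. my single card.
total_ghs, cards = 2.9, 5
his_mhash = total_ghs * 1000 / cards   # GH/s -> MHash/s per card
my_mhash = 680.0

print(his_mhash)                                         # 580.0
print(round((my_mhash - his_mhash) / his_mhash * 100))   # 17 -> my card is ~17% faster
print(round((my_mhash - his_mhash) / my_mhash * 100))    # 15 -> his card is ~15% slower
```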