You have to be very careful here. The reported maximum could be a 100-microsecond blip, depending on the granularity of the measurement.
In the same review, the average power is 112W. AMD could easily counter with a SKU that drops clocks under very heavy loads to maintain a 130W envelope.
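To make the granularity point concrete, here's a rough sketch (the spike width, wattages, and sample rate are made-up illustrative numbers, not figures from the review): a brief spike on top of a steady 112W draw dominates a fine-grained "maximum" reading but barely moves a coarse windowed average.

```python
# Hypothetical trace: 0.1 s of power samples at 1 us resolution,
# steady 112 W with a single 100 us spike to 180 W at the start.
steady_w = 112.0
spike_w = 180.0
n = 100_000                       # 100,000 samples = 0.1 s
samples = [steady_w] * n
for i in range(100):              # the 100 us blip
    samples[i] = spike_w

def windowed_max(trace, window):
    """Max of window-averaged power, i.e. what a meter with this
    averaging window would report as the 'maximum'."""
    return max(
        sum(trace[i:i + window]) / window
        for i in range(0, len(trace), window)
    )

print(windowed_max(samples, 1))        # 1 us granularity: catches the full 180 W blip
print(windowed_max(samples, 10_000))   # 10 ms granularity: blip mostly averaged away
print(sum(samples) / n)                # overall average: barely above 112 W
```

A microsecond-resolution meter reports 180W as the maximum, while a 10 ms window reports roughly 112.7W, which is why the measurement methodology matters before drawing conclusions from a single "maximum" figure.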
If you look at the top of the page, it doesn't appear that these were momentary blips. The description is as follows: "Maximum: Furmark Stability Test at 1280x1024, 0xAA. This results in a very high no-game power-consumption that can typically be reached only with stress-testing applications. The card was left running the stress test until power draw converged to a stable value."
It's worth pointing out that the R9 series rebrands were, in general, more power-hungry than their 7000-series counterparts. For example, the R9 270X's maximum power draw of 172W compares poorly to the 7870's maximum of 144W. (The 7850 was even more frugal, barely breaking the 100W mark even with Furmark.) I don't know what happened here. Did AMD just want their "new" cards to chart a bit higher, so they bumped the clocks even though efficiency went down the drain? Even the Tahiti-based 7950 wasn't that bad in terms of efficiency, maxing out at 179W, well short of its rated 200W TDP. Even Tonga doesn't do that well.
As I said, I agree that AMD could get Pitcairn's TDP down to 110-130W without butchering performance. As noted above, that's pretty much what they did with the 7850, and that was back in 2012. But I still don't think it would be a competitive product in 2015, not when it lacks modern features like FreeSync, Crossfire XDMA, and 4K video decoding.
