Am I reading that right? The GPU drew more power at 1080p than at 4K? Was that because the 4K run was bottlenecked by the limits of the UVD block and so processed fewer frames? (It looks like this is a synthetic test with the decoder running full throttle, not normal playback.)
It's system power, and at 1080p it's putting out 4x as many frames (a 4K frame has four times the pixels of a 1080p frame), so it makes sense that power would actually go up in that case.
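A quick back-of-envelope sketch of that ratio, assuming the decode engine is limited by pixel throughput rather than frame count (the throughput figure below is a placeholder, not from the benchmark):

```python
# 4K UHD has 4x the pixels of 1080p, so a decode test that runs the
# video engine flat out pushes roughly 4x the frames per second at 1080p.
px_1080p = 1920 * 1080          # 2,073,600 pixels per frame
px_4k    = 3840 * 2160          # 8,294,400 pixels per frame

ratio = px_4k / px_1080p        # 4.0

# Hypothetical decoder throughput, purely for illustration.
pixels_per_second = 1_000_000_000

fps_1080p = pixels_per_second / px_1080p
fps_4k    = pixels_per_second / px_4k

print(f"4K/1080p pixel ratio: {ratio:.1f}x")
print(f"~{fps_1080p:.0f} fps at 1080p vs ~{fps_4k:.0f} fps at 4K")
```

So with the decoder saturated either way, the 1080p run keeps the rest of the pipeline (CPU, memory, display output) churning through about four times as many frames, which is where the extra system power goes.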
