This sounds a bit different: the GPU clock varies heavily depending on the CPU load. To be precise, the chip throttles in the combined CPU + GPU stress test but not when only Prime95 is used.
A genuine 15 W CPU can draw 25 W during peaks on the order of a millisecond. Those 25 W peaks won't show up at the mains level even with good measurement tools, because the excess energy over time is small enough that the peak is literally smoothed out by the power supply's time constants.
What Intel did was extend this notion of instantaneous power consumption to periods as long as 30 s or 1 min. To summarize, they assume, and this is correct mathematically speaking, that 25 W during 30 s followed by 0 W for the same period amounts to 12.5 W on average.
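Just to put numbers on that averaging, here is a quick Python sketch. It's a toy sliding-window average with made-up values, not Intel's exact mechanism (which, as far as I know, uses an exponentially weighted moving average with a configurable time constant):

```python
# Toy sliding-window average of a bursty power trace, sampled at 1 Hz.
# Values are invented for illustration only.

def window_average(samples_w, window_s, dt_s=1.0):
    """Average the most recent `window_s` seconds of power samples (in watts)."""
    n = max(1, int(window_s / dt_s))
    recent = samples_w[-n:]
    return sum(recent) / len(recent)

# 30 s at 25 W (turbo burst) followed by 30 s at 0 W (idle)
trace = [25.0] * 30 + [0.0] * 30

print(window_average(trace, window_s=60))       # 12.5 -> the 60 s average stays well under 15 W
print(window_average(trace[:30], window_s=60))  # 25.0 -> but during the burst itself the CPU pulls 25 W
```

Same arithmetic as above: the burst only "disappears" because the window is long enough to dilute it.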
I find this kind of logic somewhat dubious even if it is technically founded; at least it's not a pure scam like the race-to-idle myth.
In that latter case, and excluding trivial and mostly irrelevant edge cases, the math says that optimal perf/Watt is reached when the CPU's consumption equals the consumption of the rest of the system. If the CPU power drain exceeds the rest-of-system power drain, efficiency decreases, because beyond that point the CPU performance delta in percentage terms is smaller than the total system power delta in percentage terms, hence battery life gets shorter for the same amount of work.
This is because CPU power increases roughly as the square of performance, while the race-to-idle myth implicitly assumes a linear law.
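For anyone who wants to check that arithmetic, here is a small Python sketch of the argument. The 5 W rest-of-system figure and the exact square law are assumptions picked for illustration, not measurements:

```python
# Sketch of the perf/Watt argument under an assumed square law:
# CPU power ~ performance^2, i.e. performance ~ sqrt(CPU power).
# The rest-of-system power (screen, RAM, SSD, ...) is a fixed 5 W here.

P_REST_W = 5.0  # assumed constant platform power, invented for the example

def energy_per_unit_work(p_cpu_w):
    perf = p_cpu_w ** 0.5                 # performance ~ sqrt(P_cpu)
    return (p_cpu_w + P_REST_W) / perf    # joules per unit of work (lower is better)

for p in [1, 2.5, 5, 10, 20, 40]:
    print(f"P_cpu = {p:5.1f} W -> energy/work = {energy_per_unit_work(p):.3f}")

# The minimum lands at P_cpu = P_REST_W = 5 W: past that point performance grows
# only as sqrt(P_cpu) while total power grows linearly, so battery life for the
# same amount of work drops. A pure race-to-idle argument would need power to
# grow only linearly with performance, which is not how voltage/frequency
# scaling behaves.
```

Running it shows energy per unit of work falling until the CPU draws about as much as the rest of the system, then rising again, which is the optimum described above.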