
I figured it might be useful to show how TDP doesn't always relate to actual power use, how power draw under OCCT differs from power draw in a real game, and how neither company represents its cards' power use particularly accurately with TDP.
It might also help people work out how much power their card uses (or would use), and see what the real differences are between cards.
All the data comes from Xbitlabs (the spring power consumption roundup plus more recent individual reviews); TDPs are taken from Wikipedia.
Obviously AMD's power draw is typically lower than NVIDIA's, but that's not the point. It's more about how the cards compare, how inaccurate TDP is, and highlighting typical versus worst-case power use.
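
As a rough illustration of the sort of comparison I mean, here's a minimal Python sketch that works out measured draw as a percentage of TDP for both a gaming load and an OCCT load. The card names and wattages below are placeholders to show the calculation, not figures from the Xbitlabs data:

```python
# Placeholder figures for illustration only -- substitute the real
# Xbitlabs measurements and Wikipedia TDPs for the cards you care about.
cards = {
    # name: (TDP in watts, measured gaming watts, measured OCCT watts)
    "Example card A": (250, 210, 280),
    "Example card B": (150, 120, 160),
}

for name, (tdp, game, occt) in cards.items():
    print(f"{name}: TDP {tdp} W, "
          f"game {game} W ({game / tdp:.0%} of TDP), "
          f"OCCT {occt} W ({occt / tdp:.0%} of TDP)")
```

The point of looking at it this way: a card can sit well under its TDP in games yet blow past it under a synthetic load like OCCT, so a single TDP number tells you neither the typical nor the worst case.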
Sources:
http://www.xbitlabs.com/articles/video/display/gpu-power-consumption-2010.html
http://www.xbitlabs.com/articles/video/display/geforce-gtx-580_5.html#sect0 <- GTX580 review, because I know at least some people will comment on its apparently high draw versus the GTX480.