However, a chip can put out more heat than its TDP. LinX is one example of this on CPUs, as is FurMark on GPUs.
Yeah, the problem is that FurMark, OCCT, LinX, and Prime95/Orthos all have nothing to do with real-world usage.
I used to test all my overclocked CPUs with Prime95 and Orthos to monitor maximum CPU temperatures under load. However, I've never encountered a single application that taxed my CPU as much, so now I use them strictly for stability testing, not temperature testing. Instead, I use the most intensive programs that actually run on my system to test maximum temperatures, because that testing applies 100% to how I will use the system. At least for CPUs, those temperatures are generally within ±5°C of the stress-testing results. For GPUs, though, using stress-testing programs to gauge maximum power usage is a complete waste of time.
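If you want to run the same kind of comparison yourself, here's a rough sketch of one way to log the peak CPU temperature while a real workload is running. This isn't any particular tool's method, just a minimal Python example assuming a Linux box with the third-party psutil package installed; sensor names and availability vary by board.

import time
import psutil  # third-party: pip install psutil

def log_max_cpu_temp(duration_s=300, interval_s=1.0):
    """Poll the CPU temperature sensors and return the hottest reading seen."""
    max_temp = float("-inf")
    end = time.time() + duration_s
    while time.time() < end:
        temps = psutil.sensors_temperatures()  # empty dict on unsupported platforms
        for entries in temps.values():
            for entry in entries:
                if entry.current is not None:
                    max_temp = max(max_temp, entry.current)
        time.sleep(interval_s)
    return max_temp

if __name__ == "__main__":
    # Start the real workload (game, encode, render) first, then run this alongside it.
    print("Max CPU temp observed: %.1f C" % log_max_cpu_temp(duration_s=60))

Run it once alongside a stress test and once alongside your heaviest real application, and you can see for yourself how close the two peaks actually are.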
Look at this:
http://www.xbitlabs.com/articles/video/display/gpu-power-consumption-2010_3.html#sect0
5970 - Crysis Warhead - 240 Watts
5970 - TDP - 294 Watts
5970 - OCCT: GPU - 355 Watts
GTX295 - Crysis Warhead - 312 Watts
GTX295 - TDP - 289 Watts
GTX295 - OCCT: GPU - 400 Watts
OCCT puts a completely unrealistic load on all of the components of the graphics card. There is little point in testing the "maximum" power consumption of a video card, since you'll never encounter that for your intended purposes. Also, based on the results above, focusing strictly on TDP numbers is about as useful as car buyers arguing over manufacturers' MPG fuel-economy ratings.
According to these results, the GTX 295 handles a similar gaming load as the 5970, but draws at least 70 Watts more power!
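Just to make the gap explicit, here's a trivial Python snippet crunching the xbitlabs figures quoted above (all values in Watts):

# xbitlabs figures from the link above, in Watts
figures = {
    "HD 5970": {"Crysis Warhead": 240, "TDP": 294, "OCCT": 355},
    "GTX 295": {"Crysis Warhead": 312, "TDP": 289, "OCCT": 400},
}

# In-game gap between the two cards
game_delta = figures["GTX 295"]["Crysis Warhead"] - figures["HD 5970"]["Crysis Warhead"]
print("GTX 295 draws %d W more than the HD 5970 in Crysis Warhead" % game_delta)  # 72 W

# How far each card's game load and OCCT load sit from its TDP
for card, f in figures.items():
    print("%s: OCCT exceeds TDP by %d W, game load is %+d W vs TDP"
          % (card, f["OCCT"] - f["TDP"], f["Crysis Warhead"] - f["TDP"]))

The output shows the 5970 running a game well under its TDP while the GTX 295 runs over it, and both cards blowing far past TDP under OCCT, which is exactly why the TDP spec alone tells you so little about real-world draw.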
Without real-world power consumption figures, it's impossible to say how much power the GTX 480 will consume compared to the 5870.