Why? If there is TDP headroom, the card clocks higher. That can happen at 30 fps, at 60, or at 100. Two applications might both load the GPU to 100% (according to Afterburner, GPU-Z, etc.), but their actual power consumption differs, so app A can clock higher than app B and pick up extra performance.
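For illustration, here's a rough sketch of that boost logic in Python. Everything in it is a hypothetical placeholder (the bin size, base clock, max boost, and power/temperature limits), and the real firmware behaviour is proprietary and more involved; it just shows how two apps at "100% load" can settle at different clocks:

```python
# Illustrative sketch only -- not NVIDIA's actual GPU Boost algorithm.
# All limits and clocks below are hypothetical placeholder values.

BIN_MHZ = 13          # one boost "bin" (typical Kepler step size)
BASE_CLOCK = 1006     # MHz, hypothetical base clock
MAX_BOOST = 1215      # MHz, hypothetical max boost with offset applied
TDP_LIMIT_W = 170     # hypothetical power target
TEMP_LIMIT_C = 70     # clocks back off a bin above roughly this temperature

def next_clock(current_mhz: int, power_w: float, temp_c: float) -> int:
    """Step the clock up or down one bin based on power/thermal headroom."""
    if power_w < TDP_LIMIT_W and temp_c < TEMP_LIMIT_C:
        return min(current_mhz + BIN_MHZ, MAX_BOOST)   # headroom: boost up
    return max(current_mhz - BIN_MHZ, BASE_CLOCK)      # over budget: back off

# Two apps both reporting "100% utilization" but drawing different power
# end up at different clocks: app A has headroom, app B hits the power limit.
print(next_clock(1150, power_w=150, temp_c=65))  # app A -> climbs a bin
print(next_clock(1150, power_w=175, temp_c=72))  # app B -> drops a bin
```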
In practice you don't really see that. You set your offset for a max, and it pretty much runs at that max, or one bin below it if your temperature is in the low 70s C (or it runs slower if you're using vsync or the "fps target" feature and don't need the extra speed).
3DMark 11, FurMark, The Witcher 2, Metro 2033, WoW, etc. all sit at 1215 MHz for me the entire time. The only time it drops below that is when the scene just doesn't need that much GPU power to maintain 60 fps (I use adaptive vsync).
edit: these are all overclocked figures. If I don't overclock, it just sits at 1098 MHz the whole time; that's my card's default max boost.