OK, I ran a quick check using the Folding@Home client. I'm assuming this is the CPU version, not the GPU one; I just grabbed the first download I came across on Google.
Measured with a Kill A Watt on the system currently in my sig (Q9650 @ 4GHz, 3 HDDs, 8GB RAM, 9600GT). I turned on the computer and let it settle for a few minutes before doing anything to it. I had forgotten to turn off the screensaver and display suspend, so the first reading is with the monitors asleep. It's a few watts lower, I'd guess because the GPU actually drops below its normal idle when the displays sleep? That's a guess... Anyways...
Idle (w/monitors asleep) = 147-148W
Idle (w/monitors "on") = 150-151W
F@H (nothing else running) = 177-178W
OCCT = 263W
NOTES:
-All readings were pretty constant. I checked back every 5 or 10 minutes, except for OCCT, which I left running for only about 10 minutes total and checked once.
-F@H used up only about 25% CPU. It sat at 25% from the moment it launched, and every time I checked later it was still there. The slider to control max CPU usage in the configuration was already at its highest. On a quad-core, a steady 25% looks like the client is single-threaded and pegging one core; presumably the SMP client (or running multiple instances) would be the way to load all four, but I haven't tried it.
-The increase from idle wattage to F@H wattage was almost 30W, and the CPU temp went up about 5°C. Not a showstopper I guess, but I probably wouldn't do this long-term, at least not on this machine. The main reasons: electricity is already too expensive here, and that cost covers both the actual wattage the computer pulls and the AC having to run more (really poor insulation, etc.), which adds wattage on top. I already fret about the file server (~60W) and HTPC (~90W) running pretty much 24/7, and even those I occasionally turn off. My main desktop (which I did this test on) only runs when "needed" because of the electric bill and heat. Rough math on the cost is in the script at the bottom of the post. If my income or my living space's efficiency were different, I might change my tune.
-I will let the work units I got finish out and then stop there.
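For anyone who wants to put a dollar figure on the extra draw, here's a quick back-of-the-envelope Python script. Only the wattages come from my readings above; the electricity rate is a placeholder assumption, so plug in your own.

idle_w = 150.5            # measured: idle, monitors on (150-151W)
folding_w = 177.5         # measured: F@H running, nothing else (177-178W)
rate_usd_per_kwh = 0.15   # ASSUMED rate -- substitute your local $/kWh

extra_w = folding_w - idle_w              # ~27W of extra draw
kwh_per_month = extra_w / 1000 * 24 * 30  # folding 24/7 for a month
cost_per_month = kwh_per_month * rate_usd_per_kwh

# NOTE: this ignores the extra AC load I mentioned, which stacks on top.
print(f"Extra draw:   {extra_w:.0f} W")
print(f"Extra energy: {kwh_per_month:.1f} kWh/month")
print(f"Extra cost:   ${cost_per_month:.2f}/month at ${rate_usd_per_kwh}/kWh")

At a ~27W delta that works out to roughly 19-20 kWh a month before the AC penalty, so call it a few dollars a month per machine at typical US rates.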