Originally posted by: redly1
You're talking theoretical maximums, of course.
I've got a P4 3.0GHz machine. I hooked it up to a watt meter and never saw the computer pull over 120W (even under full load running a DC client). Considering that a switching power supply is ~85% efficient at best, the components are only pulling ~100W. And that's in a high-end P4 system with a video card (not onboard) and 2 hard drives running.
Food for thought
No, actually I'm not talking about theoretical maximums. The CPU power was taken from Intel's TDP, the video card power from (near) full load, the optical drive from typical seek/read, and the hard drive from typical read/write/seek. Maximum power spikes much higher on some devices: the CPU to about 75W instantaneously at power-up, and the hard drive to almost 30W during spin-up. I also probably underestimated the motherboard's load (memory, fans, etc.).
Yes, during typical use the power load will be lower. But when playing a game, for example, a 150W PSU will be overloaded with a newer video card, as I described above. That same 150W PSU should be fine with older video cards like the GF4 Ti series and the Radeon 8500/9000/9100/9200 (if the AGP slot is even used), also as I described above.
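To put rough numbers on that, here's a quick back-of-envelope tally. The only figure from a spec sheet is the CPU's 82W TDP; every other per-component wattage is my own ballpark assumption, not a measurement:

```python
# Rough power-budget tally against a 150W PSU rating.
# Only the CPU TDP is a documented figure; the rest are ballpark assumptions.

PSU_RATING_W = 150

base_load_w = {
    "P4 3.0GHz CPU (Intel TDP)": 82,
    "Motherboard, memory, fans": 20,     # assumption
    "Hard drive (read/write/seek)": 10,  # assumption
    "Optical drive (seek/read)": 10,     # assumption
}

video_cards_w = {
    "older AGP card (GF4 Ti class)": 25,   # assumption
    "newer AGP card (near full load)": 60, # assumption
}

for card, card_w in video_cards_w.items():
    total = sum(base_load_w.values()) + card_w
    verdict = "overloaded" if total > PSU_RATING_W else "within rating"
    print(f"{card}: ~{total}W vs. {PSU_RATING_W}W rating -> {verdict}")
```

With those assumptions, the older-card build lands just under 150W while the newer-card build is well over it, which is the point I was making above.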
For your example, I think you need to double-check your watt meter. A P4 3GHz CPU under full load draws 82W (see the datasheet), so the CPU alone would account for nearly all of that ~100W, leaving under 20W for the motherboard, memory, video card, drives, and fans. Even if the hard drive is asleep, it's not plausible for a P4 3.0GHz system under full load to draw only around 100W, especially with an AGP video card, fans, and other components. For typical (low) use it may be that low, but not under 100% load.
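As a sanity check on the watt-meter reading, here's the same arithmetic run backwards through your ~85% efficiency figure. Again, only the CPU's 82W is from the datasheet; the other wattages are assumptions:

```python
# What a full-load system should pull at the wall, assuming ~85% PSU efficiency.
# Only the CPU figure is documented; the rest are assumptions.

PSU_EFFICIENCY = 0.85

component_load_w = {
    "P4 3.0GHz CPU (Intel TDP)": 82,
    "AGP video card": 40,             # assumption
    "Motherboard, memory, fans": 20,  # assumption
    "Hard drive": 10,                 # assumption
}

dc_total = sum(component_load_w.values())
wall_draw = dc_total / PSU_EFFICIENCY
print(f"~{dc_total}W at the components -> ~{wall_draw:.0f}W at the wall")
# ~152W DC -> ~179W at the wall, well above a 120W meter reading.
```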
