Does an oversized power supply waste energy?

Kremlar

Golden Member
Oct 10, 1999
1,426
3
81
If I use a power supply that's overkill for my current components, will the daily energy use be higher than with a lower-rated power supply?

For example, say I only need a 350W or so power supply but purchase a 750W PC Power & Cooling unit. Will it consume more energy than, say, a PC Power & Cooling 500W would have?
 

Beanie46

Senior member
Feb 16, 2009
527
0
0
A power supply only consumes the amount of energy needed to supply what's required of it at any particular moment. So, for example, a 400W and a 1000W power supply, each asked to deliver 325W of DC power to the same exact system, will be delivering the same wattage/amperage at any given moment. Of course, this doesn't address the power supply's efficiency: the more efficient the power supply, the less power the unit has to draw from the socket to deliver the asked-for wattage/amperage.

And with efficiency, the sweet spot for almost all power supplies is around 50-60% of rated output, meaning the power supply is at its highest efficiency when delivering around half its rated capacity, such as 500W from a 1kW unit, 250W from a 500W unit, etc. Take a look at 80plus.org and see how the efficiency curves are given for the tested power supplies. Almost without exception, the efficiency curves peak at around 50-60% of full rated capacity, as is easily seen in 80plus's testing of individual units (and they're all tested on U.S. 115V mains, btw).
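The efficiency point above can be illustrated with a quick calculation. The 400W/1000W units and 325W DC load are from the post; the efficiency figures below are illustrative assumptions, not measured values for any real unit:

```python
def wall_draw(dc_load_w, efficiency):
    """AC watts drawn from the socket to deliver dc_load_w watts of DC power."""
    return dc_load_w / efficiency

# Same 325W DC load; both units deliver identical DC power, but the
# less efficient one pulls more from the wall. Efficiencies are assumed.
print(round(wall_draw(325, 0.85)))  # more efficient unit: ~382W at the wall
print(round(wall_draw(325, 0.78)))  # less efficient unit: ~417W at the wall
```

Either rating delivers the same 325W to the components; only the wall-side draw differs.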
 

Yellowbeard

Golden Member
Sep 9, 2003
1,542
2
0
Additionally, the difference between 80% and 82% efficiency on your power bill at the end of the year is likely going to be pennies.

Keep in mind also that ideally you want to operate your PSU at or about 65% of its maximum rated power output.
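That 65% figure is a guideline, not a hard rule, but it inverts into a rough sizing rule of thumb (a sketch; the 350W example load is just for illustration):

```python
def suggested_psu_rating(typical_load_w, sweet_spot=0.65):
    """PSU rating that puts a typical load near the efficiency sweet spot."""
    return typical_load_w / sweet_spot

print(round(suggested_psu_rating(350)))  # ~538W rating for a 350W typical load
```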
 

Kremlar

Golden Member
Oct 10, 1999
1,426
3
81
Thanks for the replies, guys!

Keep in mind also that ideally you want to operate your PSU at or about 65% of its maximum rated power output.

So, what is the downside to operating it at, say, 30% of its rated capacity?

Any?
 

Yellowbeard

Golden Member
Sep 9, 2003
1,542
2
0
The worst thing I know of is that it will operate at reduced efficiency. It should not affect longevity the way operating at, for example, 90% would.
 

Kremlar

Golden Member
Oct 10, 1999
1,426
3
81
Just wanted to post back some quick findings.

I tested a cheap, new, dual core PC with very basic components with 3 power supplies. I pretty much confirmed everything mentioned here with my Kill-A-Watt.

All numbers below are at idle:

Standard 350W power supply: 51W
PC Power & Cooling 750W: 46W
PC Power & Cooling 370W: 42W

So, here's how it breaks down per my testing and what I've learned:

- Different power supplies have different "efficiencies". Better power supplies draw less power from your electrical panel to deliver the same amount of power to your components. In this case, upgrading to the PC Power & Cooling 370W saves 9W at idle, which doesn't seem like much, but it's nearly 18% of the 51W baseline!

- Power supplies are at their best efficiency when operating at approximately 65% of their rated load. This is why the 750W uses more power than the 370W: the 750W is way overkill for this basic system, so it's not running as efficiently as it could.
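A quick back-of-the-envelope on those readings to put them in yearly terms (the 24/7 idle runtime and $0.12/kWh rate are assumptions, not something I measured):

```python
# Idle wall draw measured with the Kill-A-Watt (numbers from the post above)
readings_w = {
    "Standard 350W": 51,
    "PC Power & Cooling 750W": 46,
    "PC Power & Cooling 370W": 42,
}

HOURS_PER_YEAR = 24 * 365  # assumes the box sits at idle around the clock
RATE_PER_KWH = 0.12        # assumed electricity rate in $/kWh

for name, watts in readings_w.items():
    kwh = watts * HOURS_PER_YEAR / 1000
    print(f"{name}: {kwh:.0f} kWh/yr, ${kwh * RATE_PER_KWH:.2f}/yr at idle")
```

Under those assumptions the 370W unit saves a handful of dollars a year over the standard 350W; real savings depend on your rate and how much the machine actually runs.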

I bought PC Power & Cooling 750Ws for a couple of servers I am assembling. They are probably way overkill for me and not operating as efficiently as they could be, but I'm glad I didn't go with a cheapie.

Thanks guys.