If you have more power than you need, do you still draw the extra power?

Fun Guy

Golden Member
Oct 25, 1999
1,210
5
81
I've been curious about this for a while now - if you have a power supply that has more power than you need (say, 650W, if you only need 450W), do you still draw the extra power? Or just what you use?
 

MotF Bane

No Lifer
Dec 22, 2006
60,801
10
0
I've been curious about this for a while now - if you have a power supply that has more power than you need (say, 650W, if you only need 450W), do you still draw the extra power? Or just what you use?

It will draw the 450W, plus a bit extra for inefficiency (about 20% more at the wall).

 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
If your components are drawing 450W, it's going to draw at least that, plus inefficiencies. And typically power supplies have a sweet spot that they like to run at. If you go under or over that, they are less efficient. So if you have a 650W unit but only have it at 40% load, then it's most likely going to be less efficient than if you were at 80% load.

Those inefficiencies lead to extra power draw.

If you look at some of the reviews here on the main site, you can see how most PSUs scale with load.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Fun Guy, the PSU will deliver only as much power as is necessary to run the system. Components consume a certain amount of power, and the PSU will deliver that, regardless of its maximum rated capacity.

Comparing a 650W and a 450W PSU, efficiency (converting AC power from the wall to DC power for components) will be affected a bit at a given load level. Hence, the amount of power drawn from the wall will be slightly different for units of different capacity, even though the same amount of power is delivered to the components by the unit.

Efficiency is rated at 20%, 50% and 100% load levels (see Understanding the 80Plus certification), such that efficiency at 50% load is optimal, while efficiency at 20% and 100% loads may be up to 4% worse, according to the 80Plus certification. Efficiency below 20% load usually degrades more steeply, e.g. the Corsair AX750 achieves 89% at 20% load, but only 83% at 10% load.
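To make that concrete, here is a minimal sketch of wall draw as a function of load. The 10% and 20% points are the AX750 figures quoted above; the 50% and 100% values and the linear interpolation are assumptions for illustration, not measurements:

```python
# Sketch: wall draw vs. DC load for a hypothetical 750W unit.
# The 10%/20% efficiency points follow the AX750 figures above;
# the rest of the curve is assumed for illustration.
CAPACITY_W = 750
EFF_CURVE = {0.10: 0.83, 0.20: 0.89, 0.50: 0.92, 1.00: 0.89}  # load fraction -> efficiency

def efficiency(dc_load_w):
    """Linearly interpolate efficiency between the known curve points."""
    frac = dc_load_w / CAPACITY_W
    pts = sorted(EFF_CURVE.items())
    if frac <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if frac <= x1:
            return y0 + (y1 - y0) * (frac - x0) / (x1 - x0)
    return pts[-1][1]

for load in (75, 150, 375, 750):
    eff = efficiency(load)
    print(f"{load:4d}W DC -> {eff:.0%} efficient -> {load / eff:.0f}W from the wall")
```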
 

BoomerD

No Lifer
Feb 26, 2006
66,315
14,722
146
Consider your electric outlet in the wall. It's capable of providing 15 amps at 110 volts... does it provide that all the time? No, it only provides as much as is needed at the time. Your power supply is (sort of) just like that. It only provides as much power as is needed at the time. (Of course, because of the inefficiencies in power supplies, it will draw a bit more from the wall than it actually provides to your components. IF your computer actually uses 450 watts, your power supply might pull about 563 watts from the wall outlet, i.e. 450 ÷ 0.80, assuming an 80% efficient unit.)
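A quick check on that arithmetic, since inefficiency means dividing by efficiency rather than adding a percentage on top (the two differ slightly):

```python
dc_load_w = 450         # what the components actually consume
efficiency = 0.80       # assumed 80% efficient unit

wall_draw_w = dc_load_w / efficiency   # 562.5W from the wall
shortcut_w  = dc_load_w * 1.20         # 540W: the "+20%" shortcut, slightly optimistic

print(f"wall draw: {wall_draw_w:.1f}W (shortcut estimate: {shortcut_w:.0f}W)")
```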
 

greenhawk

Platinum Member
Feb 23, 2011
2,007
1
71
Original power supplies used to, IIRC, but that was back when 100W was considered overkill. Since then, all power supplies have been what is called "switched mode", which only takes what it needs to run, as mentioned by the other posters.
 

Fun Guy

Golden Member
Oct 25, 1999
1,210
5
81
So, it sounds like I can safely go a little higher than I need in order to get a higher-quality supply. I was asking because although I will likely only need 400W-450W, the better quality supplies seem to be in the 600W-750W range.

What about all of the 80 Plus ratings (gold, bronze, etc.)? Does that really mean anything? Are there standards they have to pass, or can anyone slap a rating on?
 

PreferLinux

Senior member
Dec 29, 2010
420
0
0
Original power supplies used to, IIRC, but that was back when 100W was considered overkill. Since then, all power supplies have been what is called "switched mode", which only takes what it needs to run, as mentioned by the other posters.
I doubt it. To do that, it would have to be deliberately done by adding an additional load to dump the extra power into. But you are partly correct – they used transformers that consumed a fixed amount of power (small, but significant), plus the amount drawn by the components divided by the efficiency (which was very low).
 

tynopik

Diamond Member
Aug 10, 2004
5,245
500
126
While it's true that power supplies may not be the most efficient outside their target range,

1. overall the difference is minimal
2. the quality/efficiency of higher-end power supplies might (more than) offset that difference

Let's say a high-end PS is 90% efficient at its optimum capacity but only 83% efficient where your load is. That's bad.

BUT if you get a lower-end PS, its PEAK efficiency may only be 78%.

Thus running the high-end PS outside its target zone MAY still be more efficient than getting a smaller PS.

Of course all those numbers were pulled out of thin air, which is why you would have to carefully study the products in question to make an informed decision.
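A minimal sketch of that comparison, using the same made-up efficiency figures and an assumed 300W load:

```python
# Hypothetical comparison using the made-up numbers above.
dc_load_w = 300          # assumed load, for illustration

high_end_eff = 0.83      # high-end unit running outside its sweet spot
low_end_eff  = 0.78      # low-end unit even at its peak

high_end_wall = dc_load_w / high_end_eff   # ~361W
low_end_wall  = dc_load_w / low_end_eff    # ~385W

print(f"high-end: {high_end_wall:.0f}W at the wall, "
      f"low-end: {low_end_wall:.0f}W ({low_end_wall - high_end_wall:.0f}W more)")
```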
 

BoomerD

No Lifer
Feb 26, 2006
66,315
14,722
146
So, it sounds like I can safely go a little higher than I need in order to get a higher-quality supply. I was asking because although I will likely only need 400W-450W, the better quality supplies seem to be in the 600W-750W range.

What about all of the 80 Plus ratings (gold, bronze, etc.)? Does that really mean anything? Are there standards they have to pass, or can anyone slap a rating on?

The 80 Plus ratings are a measure of efficiency.

http://www.plugloadsolutions.com/80PlusPowerSupplies.aspx

http://en.wikipedia.org/wiki/80_PLUS

Not all "80 Plus" power supplies are worth buying, and yes, companies falsely claim their units meet that standard all the time...other companies' units might meet the 80 Plus standard at a low operating temp, but fail them at warmer temperatures...
 

JEDIYoda

Lifer
Jul 13, 2005
33,986
3,321
126
If your components are drawing 450W, it's going to draw at least that, plus inefficiencies. And typically power supplies have a sweet spot that they like to run at. If you go under or over that, they are less efficient. So if you have a 650W unit but only have it at 40% load, then it's most likely going to be less efficient than if you were at 80% load.

Those inefficiencies lead to extra power draw.

If you look at some of the reviews here on the main site, you can see how most PSUs scale with load.

You do not know at all what you are talking about!!
You are spewing misinformation!!
 

Meghan54

Lifer
Oct 18, 2009
11,684
5,228
136
If your components are drawing 450W, it's going to draw at least that, plus inefficiencies. And typically power supplies have a sweet spot that they like to run at. If you go under or over that, they are less efficient. So if you have a 650W unit but only have it at 40% load, then it's most likely going to be less efficient than if you were at 80% load.

Those inefficiencies lead to extra power draw.

If you look at some of the reviews here on the main site, you can see how most PSUs scale with load.


You do not know at all what you are talking about!!
You are spewing misinformation!!




Actually, he's close to the truth, JEDI, and you know it, although he's a tad off on the efficiency curve.

In truth, most if not all current power supplies have their highest efficiency right around 50% load, which mirrors what 80plus.org's testing reflects. 80plus certification at any level (standard, bronze, silver, gold, platinum) is done at 20%, 50%, and 100% loading of a power supply. The efficiency curves produced are always highest at the 50% load level, reflected in the requirements of the various levels, e.g. bronze requires 82%, 85%, and 82% at the aforementioned load levels of 20%, 50%, and 100%. The exact same trend follows with all the other levels (silver, gold, platinum): best efficiency is always at 50% load.
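For reference, the commonly cited 80 Plus thresholds for 115V non-redundant units follow that same 20%/50%/100% pattern; a quick lookup sketch:

```python
# 80 Plus efficiency requirements at 115V (non-redundant units),
# listed as (20% load, 50% load, 100% load).
EIGHTY_PLUS = {
    "Standard": (0.80, 0.80, 0.80),
    "Bronze":   (0.82, 0.85, 0.82),
    "Silver":   (0.85, 0.88, 0.85),
    "Gold":     (0.87, 0.90, 0.87),
    "Platinum": (0.90, 0.92, 0.89),
}

for tier, (e20, e50, e100) in EIGHTY_PLUS.items():
    print(f"{tier:8s}: {e20:.0%} @ 20%, {e50:.0%} @ 50%, {e100:.0%} @ 100%")
```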

So the poster was a tad off on where the highest efficiency point is located, JEDI. It's not like you've never been wrong (a la PCP&C builds their own power supplies......)


And as for his description of power draw... that was pretty much spot on. The power supply will draw as much power from the wall as the computer demands, plus inefficiencies.

So, how was he completely wrong and spewing misinformation?
 

Infrnl

Golden Member
Jan 22, 2007
1,175
0
0
So ideally, for example, if you use 200W, you would want to get a good 400W PSU?
 

Fun Guy

Golden Member
Oct 25, 1999
1,210
5
81
So ideally, for example, if you use 200W, you would want to get a good 400W PSU?
Not really. How often would you be surfing the net (@100W), and how often would you be playing an FPS game (@450W)?

Surfing the net @ 100W - 90% of the time
Gaming @ 450W - 10% of the time
-----------------
Equals ~135W average

But you will still need a 500W PS at least, right?
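The weighted average checks out; a trivial sketch of that duty-cycle math:

```python
# Duty-cycle average of the figures above: 90% browsing, 10% gaming.
profile = [(0.90, 100), (0.10, 450)]   # (fraction of time, watts)

avg_w = sum(frac * watts for frac, watts in profile)
print(f"average draw: {avg_w:.0f}W")   # 135W
```

As BoomerD notes below, though, the PSU still has to be sized for the 450W peak, not the 135W average.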
 

BoomerD

No Lifer
Feb 26, 2006
66,315
14,722
146
Not really. How often would you be surfing the net (@100W), and how often would you be playing an FPS game (@450W)?

Surfing the net @ 100W - 90% of the time
Gaming @ 450W - 10% of the time
-----------------
Equals ~135W average

But you will still need a 500W PS at least, right?

Yeah...you can't average out power demand like that...not to determine what size PSU to buy anyway...:p

If I KNEW my computer was going to draw 450 watts while gaming, I'd get at least a 650 watt PSU.
 

Fun Guy

Golden Member
Oct 25, 1999
1,210
5
81
Yeah...you can't average out power demand like that...not to determine what size PSU to buy anyway...:p

If I KNEW my computer was going to draw 450 watts while gaming, I'd get at least a 650 watt PSU.
Right? I mean the way to do it is not averaging power, but calculating your max power draw and then adding a margin of, let's say, 20% to 25%.

Avg Draw = 150W
Max Draw = 400W
Max Draw + 25% = 500W
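That rule of thumb is easy to turn into a sketch; the 25% margin and the list of retail wattages here are assumptions, not a standard:

```python
# Size a PSU as max draw plus a safety margin, rounded up to a
# common retail wattage. Margin and wattage list are assumptions.
STANDARD_WATTAGES = [450, 500, 550, 650, 750, 850, 1000]

def suggest_psu(max_draw_w, margin=0.25):
    target = max_draw_w * (1 + margin)
    for w in STANDARD_WATTAGES:
        if w >= target:
            return w
    return STANDARD_WATTAGES[-1]

print(suggest_psu(400))   # 400 * 1.25 = 500 -> 500W
```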
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
Right? I mean the way to do it is not averaging power, but calculating your max power draw and then adding a margin of, let's say, 20% to 25%.

Avg Draw = 150W
Max Draw = 400W
Max Draw + 25% = 500W

that's a safe bet, imho. it takes two video cards to actually reach 500 watts. and you need some high end stuff to hit 400, per bench.
 

JEDIYoda

Lifer
Jul 13, 2005
33,986
3,321
126
Actually, he's close to the truth, JEDI, and you know it, although he's a tad off on the efficiency curve.

In truth, most if not all current power supplies have their highest efficiency right around 50% load, which mirrors what 80plus.org's testing reflects. 80plus certification at any level (standard, bronze, silver, gold, platinum) is done at 20%, 50%, and 100% loading of a power supply. The efficiency curves produced are always highest at the 50% load level, reflected in the requirements of the various levels, e.g. bronze requires 82%, 85%, and 82% at the aforementioned load levels of 20%, 50%, and 100%. The exact same trend follows with all the other levels (silver, gold, platinum): best efficiency is always at 50% load.

So the poster was a tad off on where the highest efficiency point is located, JEDI. It's not like you've never been wrong (a la PCP&C builds their own power supplies......)


And as for his description of power draw... that was pretty much spot on. The power supply will draw as much power from the wall as the computer demands, plus inefficiencies.

So, how was he completely wrong and spewing misinformation?

Let's be realistic... the days of having too much power, or the myth of using more power than needed, are long gone.
What we are talking about is mere pennies if you have a power supply that is way larger than needed.

So YES, it is misinformation to make out like it is a huge deal if you purchase an 850 watt PSU to power 250 watts worth of computer... mere pennies.

If it were true, all the major PSU manufacturers would be making smaller PSUs... but it's just not true.

Sure, the math says otherwise, but pennies on the dollar add up to nothing in the scheme of things!!

We can talk efficiency all you want; there are quite a few variables that affect PSU efficiency, and none of them are going to break the bank when you have to pay the electric bill!!
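Whether it really is pennies is easy to estimate; a rough sketch, where the load, hours, efficiencies, and electricity rate are all assumed for illustration:

```python
# Rough annual-cost comparison of two efficiency levels at the same
# DC load. Every input here is an assumption for illustration.
dc_load_w    = 250       # what the computer consumes
hours_per_yr = 8 * 365   # 8 hours a day
rate_per_kwh = 0.12      # dollars per kWh, assumed

def annual_cost(efficiency):
    wall_w = dc_load_w / efficiency
    return wall_w * hours_per_yr / 1000 * rate_per_kwh

delta = annual_cost(0.82) - annual_cost(0.90)
print(f"~${delta:.2f}/year difference between 82% and 90% efficiency")
```

With these numbers it comes to roughly ten dollars a year, so "won't break the bank" holds, though whether it is literally pennies depends on your rate and hours.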
 