How Much PSU Headroom is Enough?

TXjohnny

Junior Member
Dec 11, 2012
I currently have a Seasonic 650W PSU. It has been in my computer for well over three years, and I have no reason to believe it isn't working perfectly. Having upgraded various components over the years, and going by a couple of PSU calculators, my system uses somewhere around 580W. Is the 70W or so of difference enough headroom for my PSU? How much bigger should a PSU's wattage be compared to the computer's usage?

Thanks for your help.
 

TemjinGold

Diamond Member
Dec 16, 2006
PSU calculators are notorious for grossly overestimating. Why don't you list your parts so we can tell you instead?
 

nickbits

Diamond Member
Mar 10, 2008
Get a Kill-A-Watt meter and see how much you are actually using. I really doubt you are using 580W.
 

TXjohnny

Junior Member
Dec 11, 2012
Sure, my system:

i5-2500K OC'd to 4.4 GHz (Arctic Cooling heatsink)
Asus P8Z68-V
16GB of memory (4 sticks of 4GB G.Skill PC3-17000, DDR3-2133)
1 WD 500GB VelociRaptor internal HDD
1 Seagate 7200 RPM 1TB internal HDD
1 WD 500GB external hard drive
Xonar DX sound card
Sony DVD/CD burner
Nvidia GTX 560 Ti
28-inch Asus monitor (1920x1080)
Razer Mamba mouse
MS USB keyboard

Again, thanks for your replies.
 

philipma1957

Golden Member
Jan 8, 2012
Well, the CPU can pull 95 watts under max load.

That card maxed is good for 165-175 watts, so that is 270 max. The Raptor is about 12 watts max, the Seagate 8 watts.

So you are at 290. The WD 500GB is around 7-9 watts; now you are at 297-299.

Round to 300 just in case.

I left out the RAM: 4 sticks at 3 watts is 12 watts.
The DVD burner: 25 watts during a burn.
The Xonar sound card: 10 watts max.
Keyboard: 2-5 watts, call it 5.
Mouse: 2-5 watts, call it 5.
And your cooling fans: 4 at 2 watts, call it 10 rounding up. That's 67 more watts.

So I get 367 watts. Let's say I am under by 10%, so add 36 and you come to 403 watts. Buy a Kill-A-Watt for 20 bucks and max out your system.

If you show 500 at the wall on the Kill-A-Watt and your PSU is 80% efficient, that is 400 watts of DC load (400/500 is 80%), so I really think you should be fine. But a Kill-A-Watt is worth 20 bucks just to be sure.
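philipma1957's running tally can be sketched as a short script. The per-component wattages below are his ballpark estimates from the post, not measured figures:

```python
# Rough peak-draw tally using the per-component estimates from the post above.
# All numbers are ballpark figures, not measurements.
components = {
    "i5-2500K @ 4.4 GHz": 95,
    "GTX 560 Ti": 175,
    "WD VelociRaptor": 12,
    "Seagate 1TB": 8,
    "WD 500GB external": 9,
    "RAM (4 sticks)": 12,
    "DVD burner": 25,
    "Xonar DX": 10,
    "keyboard": 5,
    "mouse": 5,
    "case fans (4)": 10,
}

subtotal = sum(components.values())   # 366 W
estimate = round(subtotal * 1.10)     # add a 10% fudge factor -> 403 W
print(f"subtotal: {subtotal} W, with 10% margin: {estimate} W")
```

Either way, the total lands around 400W, comfortably inside a 650W unit.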
 

Zap

Elite Member
Oct 13, 1999
Your computer probably uses around 300-350W under full load. Next time a Kill-A-Watt goes on sale for $16 shipped, buy one. You'll be amazed and amused.

Anandtech Bench puts a computer with a GTX 560 Ti (quad core CPU, single HDD) at 329W from the wall.

philipma1957 said: that card maxed is good for 165-175 watts

Official specs peg it at 170W max.

More interesting is the 99°C max. :p
 

PC Perv

Member
Nov 6, 2009
500W should be enough for almost all single-GPU configurations.


Posted from Anandtech.com App for Android
 

pauldun170

Diamond Member
Sep 26, 2011
The Kill-A-Watt is great. The first week I had it, I hooked it up to just about everything.
 

riva2model64

Member
Dec 13, 2012
To get the best efficiency from a PSU, 50% usage at typical full load is a good rule of thumb.

Common sense would also dictate that running at less than the full capacity of a power supply will increase its lifespan.

Going by philipma's 400W estimate, that's close enough to the 50% efficiency optimum.

A 'typical' full load is less than a super-unrealistic load (FurMark with an Intel stress test running in the background). Such an unrealistic situation usually does not occur, but even if it does you still have power to spare. Seasonic under-rates their power supplies' capability anyway.

I remember when I got a Kill-A-Watt. My E8400 w/HD4830 took 175W from the wall while playing Crysis. And 220W when running FurMark 0_o
 

Zap

Elite Member
Oct 13, 1999
riva2model64 said: To get best efficiency from psu, 50% usage at typical full load is a good optimal parameter.

No. You want 50% usage at whatever load your computer will spend most of its time at.

For instance, gaming might take 350W on some random imaginary system, but what if the user spends 60% of the time NOT in games? The idle wattage may be 130W, so the ideal PSU for pure efficiency would be 260W, which of course wouldn't work. However, a 400W PSU would put load at around 90% and idle at around 30%, which is a reasonable split.

tl;dr

Unless you are ALWAYS AND ABSOLUTELY 100% of the time running at full load, don't bother buying more wattage than you need.
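Zap's time-weighting argument can be made concrete with a toy comparison. Only the 130W idle / 350W gaming loads and the 60/40 time split come from the post; the efficiency-curve values below are invented for illustration:

```python
# Toy time-weighted average wall draw for the imaginary system above:
# 130 W idle 60% of the time, 350 W gaming 40% of the time.
# Efficiency values are illustrative guesses, not real PSU curves.

def wall_draw(dc_watts: float, efficiency: float) -> float:
    """AC watts pulled from the wall for a given DC load."""
    return dc_watts / efficiency

profile = [(130, 0.6), (350, 0.4)]  # (DC load, fraction of time)

# Hypothetical efficiency at each load point for two PSU sizes.
eff_400w = {130: 0.86, 350: 0.85}   # 130 W is ~33% load, 350 W is ~88% load
eff_750w = {130: 0.80, 350: 0.88}   # 130 W is ~17% load, 350 W is ~47% load

for name, curve in [("400 W PSU", eff_400w), ("750 W PSU", eff_750w)]:
    avg = sum(frac * wall_draw(dc, curve[dc]) for dc, frac in profile)
    print(f"{name}: average wall draw {avg:.1f} W")
```

With these made-up curves the right-sized unit averages about a watt less from the wall: the oversized PSU's edge at full load is outweighed by the hours spent idling at the bottom of its curve.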
 

Fallengod

Diamond Member
Jul 2, 2001
I agree with others. Computer power consumption relative to PSUs is something that is severely overestimated. It's kind of a good thing, so people don't end up with systems running at the exact max of a PSU (nor would you want that even if it were stable), but still...

You should buy yourself a Kill-A-Watt meter and you'll learn a lot about true power consumption. PSU calculators aren't very accurate.

And some Kill-A-Watt meters only cost like $15.
 

GAO

Member
Dec 10, 2009
Zap said:
No. You want 50% usage at whatever load your computer will spend most of its time at.

For instance gaming might take 350W on some random imaginary system, but what if the user spends 60% of the time NOT in games? The idling wattage may be 130W so an ideal PSU for purely efficiency would be 260W, which of course wouldn't work. However, a 400W PSU would put load at around 90% and idle around 30%, which is a reasonable split.

Unless you are ALWAYS AND ABSOLUTELY 100% of the time running at full load, don't bother buying more wattage than you need.

When you are down to idling at 130W, even for a high-wattage CPU, the watts you waste to inefficiency in that region would probably amount to less than $10/year in electrical costs. However, if you game a lot or use lots of power at a moderate frequency and you are inefficient in that region, the electrical cost will be much more. This leads me to conclude that you want your highest efficiency, usually at 50-60% of the rating of the PSU, to be where you will expend your maximum load - not at the low end.

Say you run your computer 12 hours per day, electricity costs $0.11/kWh, and your PSU is 95% efficient at its peak but only 80% near the low and high ends of its range (a 15% difference between peak and worst case).

If you idle at 130W and are only 80% efficient there, the difference is 0.15 * 130 * 12 * 365 / 1000 * 0.11 = $9.4/yr.

At 750W it is 0.15 * 750 * 12 * 365 / 1000 * 0.11 = $54/yr.

You wouldn't run at this fictitious usage, so the efficiency doesn't amount to much unless you are running flat out 24x7 - then you want your load to fall near 60% of the supply's maximum.

So I think it is best to not cut your headroom too close, and using only 60% of your supply only costs you in the initial outlay for the PSU.
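GAO's arithmetic can be reproduced directly; the 15% efficiency penalty, 12 hours/day, and $0.11/kWh rate are his stated assumptions:

```python
# Yearly cost of a hypothetical 15% efficiency shortfall, per GAO's figures:
# 12 hours/day of use at $0.11 per kWh.
HOURS_PER_DAY = 12
RATE_PER_KWH = 0.11
EFF_PENALTY = 0.15  # 95% peak efficiency vs 80% at the extremes

def yearly_waste_cost(load_watts: float) -> float:
    wasted_kwh = EFF_PENALTY * load_watts * HOURS_PER_DAY * 365 / 1000
    return wasted_kwh * RATE_PER_KWH

print(f"130 W idle: ${yearly_waste_cost(130):.2f}/yr")  # ≈ $9.40
print(f"750 W load: ${yearly_waste_cost(750):.2f}/yr")  # ≈ $54.20
```

The 15% swing is a worst-case assumption; a real 80 Plus unit's spread between best and worst points in its rated range is far narrower.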
 

philipma1957

Golden Member
Jan 8, 2012
Zap said:
No. You want 50% usage at whatever load your computer will spend most of its time at.

For instance gaming might take 350W on some random imaginary system, but what if the user spends 60% of the time NOT in games? The idling wattage may be 130W so an ideal PSU for purely efficiency would be 260W, which of course wouldn't work. However, a 400W PSU would put load at around 90% and idle around 30%, which is a reasonable split.

Unless you are ALWAYS AND ABSOLUTELY 100% of the time running at full load, don't bother buying more wattage than you need.


Yeah, for his system a good choice is an Antec EA-430 or an EA-500:



http://www.amazon.com/Antec-EA-430-A...s=antec+ea+430



http://www.newegg.com/Product/Produc...82E16817371063

I am running 7 PCs with 16 GPUs for Bitcoin (nice to be paid to heat my home). They run 24/7; none are really pushed.

The two 3x 7970 systems use Seasonic 1000W Platinums; those systems clock about 550 watts on the Kill-A-Watt. I run all GPUs at a 10% underclock for the best watts-to-hashrate ratio. The 2x 7970 rigs pull about 395 watts on Seasonic X-660s, but it is 24/7, and since the GPUs put out heat, cooler PSU temps are good.
 

Ferzerp

Diamond Member
Oct 12, 1999
Here is how you determine the necessary PSU wattage.

1. Post a question asking for advice about sizing.

2. Take the most combative and derisive post that you see, and divide the suggested value in that post by two ;)
 

Vinwiesel

Member
Jan 26, 2011
I prefer overkill on the supply because that way it stays silent at any load. As long as it is 80 Plus Bronze rated, it will be at least 82% efficient at 20%, 50%, and 100% load, so it won't waste much power from being underutilized.

TXjohnny probably has 350W of headroom in that system, but that certainly isn't a bad thing. Maybe he could have saved 20 bucks and bought a 500W supply, and tolerated a little fan noise when the system was at 100%.

Everyone should have a kill-a-watt. Comes in handy for troubleshooting too, for example if you have a phone that is "charging" but not charging, you can plug the charger into the kill-a-watt and see if it is actually drawing power. Or you can see that your cable box is using 25W of power when on standby.
 

Lonyo

Lifer
Aug 10, 2002
GAO said:
When you are down to idling at 130W, even for a high-wattage CPU, the watts you waste to inefficiency in that region would probably amount to less than $10/year in electrical costs. However, if you game a lot or use lots of power at a moderate frequency and you are inefficient in that region, the electrical cost will be much more. This leads me to conclude that you want your highest efficiency, usually at 50-60% of the rating of the PSU, to be where you will expend your maximum load - not at the low end.

You wouldn't run at this fictitious usage, so the efficiency doesn't amount to much unless you are running flat out 24x7 - then you want your load to fall near 60% of the supply's maximum.

So I think it is best to not cut your headroom too close, and using only 60% of your supply only costs you in the initial outlay for the PSU.

Which is where most of the cost will be.
And no, you don't want highest efficiency at max load when it won't run at max load that often, as was already said.
The difference between 50% and 80% load might be 0.5% to 1% efficiency, and maybe 2% from 20% to 50%.
If you buy a PSU where max load sits at 50%, you're losing 2%; if you buy a PSU where max load sits at 80%, you're losing 1%, but you're also buying a much cheaper PSU and saving actual money.

Given that when it comes time to upgrade, the newer components these days will probably use LESS power, upgrade headroom doesn't really matter except for something like going from a single GPU to dual GPU, in which case you do need headroom.

And idle nowadays is often significantly lower than load, meaning you lose more efficiency because there's a larger difference between load and idle.
My system is something like 80W idle and 300W load. I've got way too much PSU for the system, which puts me in the situation you suggest: max load at 50% of PSU power.
That means my idle is about 15% of total power, which is getting close to seriously inefficient, and that's where my computer spends 80% of its time. So I wasted money overbuying on the PSU, and I lose efficiency unnecessarily.
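As a quick sanity check on Lonyo's percentages: the 600W rating below is inferred from his "max load of 50% PSU power" figure, not stated directly in the post.

```python
# Lonyo's situation: ~80 W idle, ~300 W load, PSU sized so that full load
# is only 50% of its rating (implying a ~600 W unit).
IDLE_W = 80
LOAD_W = 300
psu_rating = LOAD_W / 0.50  # 600 W, inferred

idle_fraction = IDLE_W / psu_rating
print(f"idle sits at {idle_fraction:.0%} of PSU rating")  # ≈ 13%
```

That lands around 13%, roughly in line with the "about 15%" in the post, and well below the 20% point where most efficiency curves start to fall off.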
 

GAO

Member
Dec 10, 2009
The difference in price per hundred watts isn't that much until you get up to around 1kW.

Even with a 10-15% loss in efficiency at an idle of near 100W, the difference in electricity cost would be a blip on your yearly power bill.
 

TemjinGold

Diamond Member
Dec 16, 2006
You're missing the point: Why pay more for a PSU for the privilege of paying more on your power bill?
 

GAO

Member
Dec 10, 2009
TemjinGold said: You're missing the point: Why pay more for a PSU for the privilege of paying more on your power bill?

No, I did not miss the point. I really don't think it matters much.

Depending on your plans, more headroom will give you elbow room for the future. And we are talking about at most a few dollars a year difference in electricity costs, and often 10 or 20 dollars more at purchase for 100 watts more of PSU. In addition, if your high power usage falls in the middle of the range instead of at the top, your PSU will be more efficient - where it counts more than at idle.

A good power supply can last twice as long as the much costlier and shorter-lived investment in CPU/motherboard and GPU. There won't be much evolution of a good, efficient PSU over time - they can't get enough better for it to matter much - while CPUs and GPUs are disposable because performance is always improving. So why not pay a fraction more and consider it an investment?
 

toyota

Lifer
Apr 15, 2001
TemjinGold said: You're missing the point: Why pay more for a PSU for the privilege of paying more on your power bill?

lol, a few cents more, maybe? Even if you monitored it with precise equipment, you would never notice any real-world difference in power consumption. And I believe it's best to have plenty of wiggle room. I had a Neo Eco 520W PSU for my system when I had a GTX 570, and that thing worked like crazy in demanding games even though it was never more than 70% loaded. Going with a larger power supply that does not have to work hard and ramp up the fan has made me much happier. And really, the price difference between a borderline PSU and one with wiggle room can be only 10 or 15 bucks, which means nothing for a component that will be around for years.
 

TemjinGold

Diamond Member
Dec 16, 2006
GAO said:
No, I did not miss the point. I really don't think it matters much.

Depending on your plans, more headroom will give you elbow room for the future. And we are talking about at most a few dollars a year difference in electricity costs, and often 10 or 20 dollars more at purchase for 100 watts more of PSU. In addition, if your high power usage falls in the middle of the range instead of at the top, your PSU will be more efficient - where it counts more than at idle.

A good power supply can last twice as long as the much costlier and shorter-lived investment in CPU/motherboard and GPU. There won't be much evolution of a good, efficient PSU over time - they can't get enough better for it to matter much - while CPUs and GPUs are disposable because performance is always improving. So why not pay a fraction more and consider it an investment?

Because there has yet to be proof anywhere that your investment is actually getting ANY kind of return. We're not talking about lining the max capacity right up to what you will use here. Power supplies degrade marginally per year (this is seriously overblown), but power requirements for newer and newer tech keep dropping WAY faster than any degradation.

Yes, it might be just 10-15 bucks higher (usually more) but you are getting NOTHING out of it other than a "few more pennies a year on your electric bill." A quality right-sized unit will last you just as long as an oversized one. You are paying more for the privilege of paying more. Would you buy elephant insurance for $10-$15? Because the benefit from that amounts to the same.
 

riva2model64

Member
Dec 13, 2012
TemjinGold said:
Because there has yet to be proof anywhere that your investment is actually getting ANY kind of return. We're not talking about lining the max capacity right up to what you will use here. Power supplies degrade marginally per year (this is seriously overblown) but power requirements for newer and newer tech keeps dropping WAY faster than any degradation.

Yes, it might be just 10-15 bucks higher (usually more) but you are getting NOTHING out of it other than a "few more pennies a year on your electric bill." A quality right-sized unit will last you just as long as an oversized one. You are paying more for the privilege of paying more. Would you buy elephant insurance for $10-$15? Because the benefit from that amounts to the same.

It's true that technology is getting more and more power efficient, but it doesn't necessarily mean that your personal power requirements go down.

For example, someone may start a new build with a budget video card, but may want to get a 250W flagship card when they get the money. Or maybe swap in a 130W six- or eight-core processor when prices drop. If a person gets a power supply with extra headroom, they won't have to worry at all, and may actually get better efficiency when they do upgrade.

I personally started my build with an 8800 GT and went to a GTX 470, all on the same PSU.

That said, if you don't transfer your PSU to your new system or anticipate an upgrade, then it is indeed financially wiser to get a quality PSU wattage-matched.

Some larger PSUs are very efficient at lower loads as well, so sometimes it doesn't make a difference. You have to research well and look at efficiency curves.
For example, assuming 24/7 use and 115W average usage, let's assume the wattage-matched PSU is 2% more efficient than one 200W larger. You'd save about $1.80 a year if my math is correct. It's understandable if someone doesn't want to do the research and just gets a larger-than-necessary PSU.

It all boils down to this: if I were building a comp for a customer using it only for office work, I'd save the $10 and get a smaller, efficient psu.

If I were building a system for myself, knowing that I swap parts often and transfer PSU's from one system to another, I am going to spend the money on a beefy PSU.
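riva2model64's $1.80 figure checks out under his simple 2%-of-load approximation. The electricity rate isn't stated in the post, so $0.09/kWh is assumed here to reproduce his number:

```python
# Riva's example: 115 W average draw 24/7, and a right-sized PSU that is
# 2% more efficient than one 200 W larger. The electricity rate is an
# assumption ($0.09/kWh) chosen to match his ~$1.80/year figure.
AVG_WATTS = 115
EFF_GAIN = 0.02
RATE_PER_KWH = 0.09  # assumed, not from the post

extra_kwh_per_year = AVG_WATTS * EFF_GAIN * 24 * 365 / 1000
savings = extra_kwh_per_year * RATE_PER_KWH
print(f"yearly savings: ${savings:.2f}")  # ≈ $1.81
```

Strictly speaking, a 2-point efficiency difference changes wall draw slightly more than 2% of the DC load, but at these wattages the conclusion is the same: the savings are pocket change.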
 

toyota

Lifer
Apr 15, 2001
TemjinGold said:
Because there has yet to be proof anywhere that your investment is actually getting ANY kind of return. We're not talking about lining the max capacity right up to what you will use here. Power supplies degrade marginally per year (this is seriously overblown) but power requirements for newer and newer tech keeps dropping WAY faster than any degradation.

Yes, it might be just 10-15 bucks higher (usually more) but you are getting NOTHING out of it other than a "few more pennies a year on your electric bill." A quality right-sized unit will last you just as long as an oversized one. You are paying more for the privilege of paying more. Would you buy elephant insurance for $10-$15? Because the benefit from that amounts to the same.
AGAIN, I already gave you a clear example where spending a little more is useful. And I hate to tell you, but requirements don't always go down and can easily flip-flop. Telling someone to limit themselves and have a hotter, louder PSU just to save 10-15 bucks is silly.
 

GAO

Member
Dec 10, 2009
TemjinGold said: Yes, it might be just 10-15 bucks higher (usually more) but you are getting NOTHING out of it other than a "few more pennies a year on your electric bill."

Yes, the incremental outlay is not significant at all, and the cost in electricity is pennies - so why do you feel so strongly about this? I think it just doesn't really matter, so I pay a little bit more. I was planning on getting an X-750, which fit my needs for the future with some comfort, but got an X-850 instead because it was on sale for $30 less than the X-750 :D. I am very satisfied because it is very quiet, it is 85% efficient at 100W, and my idle is above that.


If I were a system builder I would have a different perspective.
 