Win $3! Help me prove my roommate wrong!


frostedflakes

Diamond Member
Mar 1, 2005
7,925
1
81
Yeah, 3-4 years ago that efficiency wouldn't have been unheard of, but modern PSUs (especially 80 Plus certified units) are pretty efficient at 20% load. Even for 80 Plus units, though, efficiency can drop off pretty quickly below 20% load. It's definitely something to keep in mind when sizing your PSU, although it's unlikely to make as large a difference as mwmorph suggests.
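To put rough numbers on how efficiency translates into wall draw, here is a minimal Python sketch; the 80% and 70% figures are illustrative assumptions, not measurements from any particular unit.

# Rough wall-draw estimate from a DC load and an assumed efficiency.
# The efficiency values below are illustrative, not measured data.

def wall_draw(dc_load_w, efficiency):
    # Watts pulled from the outlet to deliver dc_load_w to the components.
    return dc_load_w / efficiency

def waste_heat(dc_load_w, efficiency):
    # Watts lost inside the PSU as heat.
    return wall_draw(dc_load_w, efficiency) - dc_load_w

# Example: a 120W idle load at two assumed efficiencies.
for eff in (0.80, 0.70):
    print(f"{eff:.0%}: wall draw {wall_draw(120, eff):.0f} W, waste {waste_heat(120, eff):.0f} W")
# 80%: wall draw 150 W, waste 30 W
# 70%: wall draw 171 W, waste 51 W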
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: Beanie46
Originally posted by: mwmorph


Not at all, unless you aren't looking at the long term or you don't pay electric bills.

A 600W PSU running a 120W load at idle is going to be working at maybe 60% efficiency.

A quality 400W unit will be running at 80%.

That's a 50W idle difference over years; if you have a PC on for work or neffing, say 9 hours a day, that's 657,000 watts wasted over 4 years.

If you're going to build a pc, there shouldn't be any estimating. You should do the research and figure out how much your pc actually needs.

Besides, a 45nm Intel Core 2 Duo (overclocked), 2 HDDs, 2 opticals, a sound card, a single-chip graphics card, and a bevy of fans should never overtax a quality 400 or 450W PSU (like a Corsair).

Buying a 600W+ PSU is not only overkill, it kills your wallet as well.


Do you have any idea what you're talking about? Really?

Just because you have a 600W power supply and it's only outputting 120W doesn't automatically mean it's running at 60% efficiency. In fact, no matter how hard I tried to find a power supply that did run that poorly at a 20% load, I couldn't... and I scoured HardOCP, JonnyGuru, and Hardware Secrets. Even the junkiest, most poorly made power supplies, like a Cool Max or Dynex or HuntKey, all had efficiency ratings of 70% or greater at 20% load.

In all reality, it doesn't matter what the total wattage output rating of the power supply is... its efficiency is NOT directly related to its output... although it is true that max efficiency from a power supply is typically at 50% draw. But you can have a 1000W power supply and have it outputting 200W, or 20% of rated capacity, at 85% efficiency... they exist, and aren't even rare.

True, poorly designed junk power supplies typically have poor efficiency ratings, but to imply that you can only expect 60% efficiency from a 600W power supply just because you're drawing only 20% of rated capacity is idiocy.

If what you imply were true, then the whole 80 Plus rating system would be a sham, because to attain any of the 80 Plus ratings, even the lowest, the power supply has to have a minimum of 80% efficiency at 20%, 50%, and 100% load.

So, please, show me where this information you spewed above comes from... other than being pulled directly out of your rectum.

Thanks for trolling :confused:

80 Plus units are tested at 25°C. Let me know when you find ambient temps in a computer at 25°C. In a real-life situation, even while idle, the average temperature in a case rarely dips below 36-38°C, and efficiency is directly affected by temperature.

Also, saying efficiency is not directly related to output is a complete lie. If efficiency weren't related to output, we wouldn't have efficiency curves.

Also, the vast majority of power supplies are plain 80 Plus; 80 Plus Bronze is uncommon, 80 Plus Silver is rare, and 80 Plus Gold is made by one manufacturer right now IIRC (a Dell OEM).

Geez, you're reacting like I kicked your dog and raped your mother. How about you go sit in your corner and calm down? Maybe take some time to sober up.

I'd be happy to talk to you once you start behaving like a human being and not some rabid animal.
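For what it's worth, the 50W figure in the quoted post follows directly from the two efficiency values it assumes (60% for the 600W unit, 80% for the 400W unit); whether a real 600W unit actually drops to 60% at a 120W load is exactly what is being disputed here. A minimal sketch of that arithmetic, using the claimed efficiencies rather than measured ones:

# Wall draw for a 120W DC load under the two efficiencies claimed above.
# 60% and 80% are the disputed assumptions from the quoted post.
load_w = 120
draw_600w = load_w / 0.60   # 200 W from the wall at the claimed 60%
draw_400w = load_w / 0.80   # 150 W from the wall at the claimed 80%
print(f"extra wall draw at idle: {draw_600w - draw_400w:.0f} W")
# extra wall draw at idle: 50 W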
 

Fayd

Diamond Member
Jun 28, 2001
7,970
2
76
www.manwhoring.com
Originally posted by: mwmorph
*snip -- the full exchange is quoted in its entirety above*

Well, FWIW, 20% load is one of the points power supplies are tested at to determine efficiency. If they don't make 80%+ at 20% load, they don't get the 80 Plus cert.

That said, vastly overspeccing a power supply does the consumer more harm than good, because it means they're paying for that extra wasted wattage (though not quite to the degree you stated).
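As a quick reference, here is a minimal sketch of the base 80 Plus criterion as it is described in this thread (at least 80% efficiency at 20%, 50%, and 100% of rated load); the efficiency numbers passed in are made-up examples.

# Base 80 Plus check as described in this thread:
# at least 80% efficiency at 20%, 50%, and 100% of rated load.
def meets_base_80plus(eff_at_20, eff_at_50, eff_at_100):
    return min(eff_at_20, eff_at_50, eff_at_100) >= 0.80

print(meets_base_80plus(0.82, 0.85, 0.81))  # True
print(meets_base_80plus(0.78, 0.86, 0.82))  # False: fails at 20% load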
 

DayLaPaul

Platinum Member
Apr 6, 2001
2,072
0
76
Originally posted by: masteryoda34
Originally posted by: mwmorph
That's a 50W idle difference over years; if you have a PC on for work or neffing, say 9 hours a day, that's 657,000 watts wasted over 4 years.

Woah there. Watts is a RATE of electricity consumption, not a quantity.

E.g., miles per hour is a rate; miles is a total distance. For electricity, watts are to kilowatt-hours (or joules) as miles per hour is to miles.

Wait, I re-read that last sentence for a light year and still don't understand it. ;)
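To spell out the unit point: watts measure an instantaneous rate of draw, and multiplying by hours of use gives energy in watt-hours. A minimal sketch using the figures from the quoted post (a 50W difference, 9 hours a day, 4 years); the $0.12/kWh electricity price is an illustrative assumption, not a figure from the thread.

# Convert "50W for 9 hours a day over 4 years" into energy and an assumed cost.
power_w = 50
hours = 9 * 365 * 4                  # hours of use over 4 years
energy_kwh = power_w * hours / 1000  # watt-hours -> kilowatt-hours
cost = energy_kwh * 0.12             # assumed $0.12 per kWh, illustrative only
print(f"{energy_kwh:.0f} kWh, about ${cost:.0f} at $0.12/kWh")
# 657 kWh, about $79 at $0.12/kWh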