
Win $3! Help me prove my roommate wrong!

Originally posted by: Newbian
Isn't a 350 watt psu a bit low for a dual core and pci-e?

I've got a quad-core Intel and a fanless 7300 LE PCI-e (the integrated video is horrible quality) with 3 hard drives. When using it to convert video (so 100% CPU, or close) it draws ~125 watts from the wall. A 350 watt should be very doable (I'm using a 400 watt myself).
 
Yeah, I'd have no qualms about running my system off a *quality* 350w PSU (I personally use a Seasonic 300w with an Athlon64 X2 and a 7600GT, but don't overclock). I'd never trust my system to a generic one, though, no matter what power output it's rated for.
 
Originally posted by: Newbian
Isn't a 350 watt psu a bit low for a dual core and pci-e?

No. I have a Pentium D, a low-end PCI-e video card, 4 hard drives and the 4 fans that come in an Antec Nine Hundred running off a 350 watt power supply.
 
Originally posted by: Newbian
Originally posted by: Acanthus
Originally posted by: Newbian
Isn't a 350 watt psu a bit low for a dual core and pci-e?

See sig, and that's with 2 DVD burners and 3 HDs, and a crapton of peripherals.

And?

Yours is a 520 watt isn't it?

It's an overclocked and heavily overvolted quad core with 8GB of memory... 5 large fans and a water pump... etc.

The point being people often overshoot what they actually need by a mile.
 
Originally posted by: Acanthus
Originally posted by: Newbian
Originally posted by: Acanthus
Originally posted by: Newbian
Isn't a 350 watt psu a bit low for a dual core and pci-e?

See sig, and that's with 2 DVD burners and 3 HDs, and a crapton of peripherals.

And?

Yours is a 520 watt isn't it?

It's an overclocked and heavily overvolted quad core with 8GB of memory... 5 large fans and a water pump... etc.

The point being people often overshoot what they actually need by a mile.

Absolutely, people always overdo it with their power supplies. A quality 400W PSU will power almost any single-GPU dual core PC out there.

As for the roommate, does he mean overall? I mean, if he uses his PC 2 hours a day and your MacBook is on 24/7, there's no doubt it uses much more electricity.
 
The difference is negligible. If you want to save power, look at your cooling/heating costs and unplug your fridge while you're at it.
 
Originally posted by: mwmorph
Originally posted by: Acanthus
Originally posted by: Newbian
Originally posted by: Acanthus
Originally posted by: Newbian
Isn't a 350 watt psu a bit low for a dual core and pci-e?

See sig, and that's with 2 DVD burners and 3 HDs, and a crapton of peripherals.

And?

Yours is a 520 watt isn't it?

It's an overclocked and heavily overvolted quad core with 8GB of memory... 5 large fans and a water pump... etc.

The point being people often overshoot what they actually need by a mile.

Absolutely, people always overdo it with their power supplies. A quality 400W PSU will power almost any single-GPU dual core PC out there.

As for the roommate, does he mean overall? I mean, if he uses his PC 2 hours a day and your MacBook is on 24/7, there's no doubt it uses much more electricity.

Better to overdo than to guess what is enough, and be wrong.
 
Keeping your laptop plugged in doesn't mean your P/S brick is doing anything noteworthy. Like most laptops, it goes to sleep or shuts off the screen when the lid is closed. After a while, the battery stops charging and is just being "tended," so to speak.
Also, your roommate's computer isn't drawing 350W. That's the rating for its maximum output. Even if his CPU were pegged at 100% because he had a queued list of videos to encode, he wouldn't be pulling anywhere near his P/S's threshold; otherwise the damn thing would probably crash. A quality 350W gives more than enough headroom for a dual core CPU, an economy GPU, HDs and optical drives.

The expense argument doesn't hold up either; it's an argument for argument's sake, since it will only cost a few bucks a month. If electricity were suddenly a concern, I would look into other appliances: electric stoves, A/Cs, and lighting (any fancy halogen floor lamps?) can be big power draws.
 
According to the nifty little screen on my APC UPS, my computer + 22" LCD is drawing somewhere around 247-301W. I'm neffing, and Folding@Home is running at 80% load on my overclocked Core 2 Quad and full blast on my GTX260.

On the other hand, my Dell Inspiron 9300 (17", GeForce 6800, power hungry as far as laptops are concerned) has a 90W power brick that can power it under full load with no battery in there. Your MacBook is most likely more efficient. Tell your roommate to suck eggs.
 
Originally posted by: finite automaton
Originally posted by: mwmorph
Originally posted by: Acanthus
Originally posted by: Newbian
Originally posted by: Acanthus
Originally posted by: Newbian
Isn't a 350 watt psu a bit low for a dual core and pci-e?

See sig, and that's with 2 DVD burners and 3 HDs, and a crapton of peripherals.

And?

Yours is a 520 watt isn't it?

It's an overclocked and heavily overvolted quad core with 8GB of memory... 5 large fans and a water pump... etc.

The point being people often overshoot what they actually need by a mile.

Absolutely, people always overdo it with their power supplies. A quality 400W PSU will power almost any single-GPU dual core PC out there.

As for the roommate, does he mean overall? I mean, if he uses his PC 2 hours a day and your MacBook is on 24/7, there's no doubt it uses much more electricity.

Better to overdo than to guess what is enough, and be wrong.

Not at all, unless you aren't looking at the long term or you don't pay electric bills.

A 600W PSU running a 120W load at idle is going to be working at maybe 60% efficiency.

A quality 400W will be running at 80%

That's a 50W idle difference over years; if you have a PC on for work or neffing, say 9 hours a day, that's 657,000 watts wasted over 4 years.

If you're going to build a pc, there shouldn't be any estimating. You should do the research and figure out how much your pc actually needs.

Besides, a 45nm Intel Core 2 Duo (overclocked), 2 HDDs, 2 opticals, a sound card, a single-chip graphics card and a bevy of fans should never overtax a quality 400 or 450W PSU (like a Corsair).

Buying a 600W+ PSU is not only overkill, it kills your wallet as well.
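The 657,000 figure in this post can be reproduced with a quick back-of-the-envelope script (the 120W idle load and the 60%/80% efficiency figures are the post's assumptions, not measurements of any specific unit):

```python
# Idle-power waste comparison using the post's assumed numbers.
IDLE_LOAD_W = 120        # assumed DC load at idle
EFF_600W = 0.60          # assumed efficiency of the oversized 600W unit at this load
EFF_400W = 0.80          # assumed efficiency of the quality 400W unit

def wall_draw(load_w, eff):
    """AC watts pulled from the wall to deliver load_w watts of DC power."""
    return load_w / eff

extra_w = wall_draw(IDLE_LOAD_W, EFF_600W) - wall_draw(IDLE_LOAD_W, EFF_400W)
hours = 9 * 365 * 4      # 9 hours a day for 4 years
wasted_kwh = extra_w * hours / 1000

print(f"extra draw: {extra_w:.0f} W")                 # 50 W
print(f"wasted over 4 years: {wasted_kwh:.0f} kWh")   # 657 kWh
```

(Strictly, the waste comes out as 657 kWh of energy, i.e. 657,000 watt-hours, not "watts"; the unit slip gets called out further down the thread.)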
 
OK, so let's assume the laptop draws 60W 24/7. (I know it doesn't actually.)

24 * 30 = 720 hours/month
60W = 0.06kW
720 hours * 0.06kW = 43.2 kW-hrs

Now what do you pay for electricity? US average is $0.11/kW-hr. I'll use this number for now.

43.2 kW-hrs * $0.11/kW-hr = $4.75

This is the most your laptop could possibly cost per month: $4.75. That is assuming an 11c/kW-hr rate for electricity, AND assuming the laptop runs at 100% load 24/7. Your actual cost is going to be much lower, because your laptop probably doesn't run at full load even 10% of the time.

$3 prize???
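The winning arithmetic, as a minimal sketch (the constant 60W draw and $0.11/kWh rate are the worst-case assumptions stated above):

```python
# Worst-case monthly electricity cost for the laptop.
DRAW_W = 60                # assumed flat-out draw, 24/7
HOURS_PER_MONTH = 24 * 30  # 720 hours
RATE = 0.11                # $/kWh, rough US average

energy_kwh = DRAW_W * HOURS_PER_MONTH / 1000   # 43.2 kWh/month
cost = energy_kwh * RATE                       # ~$4.75/month
print(f"{energy_kwh:.1f} kWh/month -> ${cost:.2f}")
```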
 
Originally posted by: mwmorph
That's a 50W idle difference over years; if you have a PC on for work or neffing, say 9 hours a day, that's 657,000 watts wasted over 4 years.

Whoa there. Watts is a RATE of electricity consumption, not a quantity.

e.g. Miles per hour is a rate; miles is a total distance. For electricity, watts is to kilowatt-hours (or joules) as miles per hour is to miles.
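The rate-versus-quantity distinction in a few lines (the 50 W and 9 hours/day figures come from the quoted post):

```python
# Power is a rate (watts); energy is a quantity (watt-hours, or joules).
power_w = 50            # the extra idle draw claimed above -- a rate
hours = 9 * 365 * 4     # 9 h/day over 4 years

energy_wh = power_w * hours      # what the quoted post actually computed
energy_kwh = energy_wh / 1000    # the unit an electric bill uses
energy_j = energy_wh * 3600      # same quantity in SI units (1 Wh = 3600 J)

print(f"{energy_wh:,} Wh = {energy_kwh:.0f} kWh")  # 657,000 Wh = 657 kWh
```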
 
DING DING DING DING! I'm calling it right there! masteryoda34, you win!

Please PM me with your PayPal address to claim your prize!

Everyone else, thanks a ton! I'll update this thread once I stick it to my roommate! Thanks!
 
Originally posted by: mwmorph
Originally posted by: finite automaton
Originally posted by: mwmorph
Originally posted by: Acanthus
Originally posted by: Newbian
Originally posted by: Acanthus
Originally posted by: Newbian
Isn't a 350 watt psu a bit low for a dual core and pci-e?

See sig, and that's with 2 DVD burners and 3 HDs, and a crapton of peripherals.

And?

Yours is a 520 watt isn't it?

It's an overclocked and heavily overvolted quad core with 8GB of memory... 5 large fans and a water pump... etc.

The point being people often overshoot what they actually need by a mile.

Absolutely, people always overdo it with their power supplies. A quality 400W PSU will power almost any single-GPU dual core PC out there.

As for the roommate, does he mean overall? I mean, if he uses his PC 2 hours a day and your MacBook is on 24/7, there's no doubt it uses much more electricity.

Better to overdo than to guess what is enough, and be wrong.

Not at all, unless you aren't looking at the long term or you don't pay electric bills.

A 600W PSU running a 120W load at idle is going to be working at maybe 60% efficiency.

A quality 400W will be running at 80%

That's a 50W idle difference over years; if you have a PC on for work or neffing, say 9 hours a day, that's 657,000 watts wasted over 4 years.

If you're going to build a pc, there shouldn't be any estimating. You should do the research and figure out how much your pc actually needs.

Besides, a 45nm Intel Core 2 Duo (overclocked), 2 HDDs, 2 opticals, a sound card, a single-chip graphics card and a bevy of fans should never overtax a quality 400 or 450W PSU (like a Corsair).

Buying a 600W+ PSU is not only overkill, it kills your wallet as well.

The only problem is you may want to upgrade a few parts over time, and having to pay for 2 PSUs will cost more than getting a decent one the first time.
 
Originally posted by: mwmorph


Not at all, unless you aren't looking at the long term or you don't pay electric bills.

A 600W PSU running a 120W load at idle is going to be working at maybe 60% efficiency.

A quality 400W will be running at 80%

That's a 50W idle difference over years; if you have a PC on for work or neffing, say 9 hours a day, that's 657,000 watts wasted over 4 years.

If you're going to build a pc, there shouldn't be any estimating. You should do the research and figure out how much your pc actually needs.

Besides, a 45nm Intel Core 2 Duo (overclocked), 2 HDDs, 2 opticals, a sound card, a single-chip graphics card and a bevy of fans should never overtax a quality 400 or 450W PSU (like a Corsair).

Buying a 600W+ PSU is not only overkill, it kills your wallet as well.


Do you have any idea what you're talking about? Really?

Just because you have a 600W power supply and it's only outputting 120W doesn't automatically mean it's running at 60% efficiency. In fact, no matter how hard I tried, I couldn't find a power supply that ran that poorly at a 20% load... and I scoured HardOCP, JonnyGuru, and Hardware Secrets. Even the junkiest, most poorly made power supplies, like a Cool Max or Dynex or HuntKey, all had efficiency ratings of 70% or greater at a 20% load.

In all reality, it doesn't matter what the total wattage output rating of the power supply is... its efficiency is NOT directly related to its output... although it is true that a power supply's maximum efficiency typically comes at around a 50% draw. But you can have a 1000W power supply outputting 200W, or 20% of rated capacity, at 85% efficiency... they exist, and aren't even rare.

True, poorly designed junk power supplies typically have poor efficiency ratings, but to imply that you can only expect 60% efficiency from a 600W power supply just because you're drawing only 20% of rated capacity is idiocy.

If what you imply is true, then the whole 80 Plus rating system is a sham, because to attain any of the 80 Plus ratings, even the lowest, the power supply has to have a minimum of 80% efficiency at a 20%, 50% and 100% load.

So, please, show me where this information you spewed above comes from... other than being pulled directly out of your rectum.
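The relationship being argued here is simple: wall draw depends on the unit's efficiency at the given load, not on the wattage printed on the label. A sketch, with illustrative efficiency figures (the 85% and 70% values loosely reflect 80 Plus-class versus junk units, not any specific model):

```python
# Wall draw for a given DC load depends only on efficiency at that load,
# not on the PSU's rated capacity.
def wall_draw_w(dc_load_w, efficiency):
    """AC watts pulled from the wall to supply dc_load_w watts of DC power."""
    return dc_load_w / efficiency

load = 200  # DC watts -- 20% load on a 1000W unit, 50% on a 400W unit

# A good 1000W unit at 85% efficiency pulls less from the wall at this load
# than a junk unit at 70%, regardless of either unit's rating.
print(round(wall_draw_w(load, 0.85)))  # ~235 W from the wall
print(round(wall_draw_w(load, 0.70)))  # ~286 W from the wall
```

For reference, even the baseline 80 Plus certification requires at least 80% efficiency at 20%, 50% and 100% load, which is the point made about the rating system above.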


 