Anybody know how much power 4 computers use?

robisc

Platinum Member
Oct 13, 1999
2,664
0
76
I leave my computers on all the time. I know there are two schools of thought on turning them on and off, but I don't want to get into that here. My question is: does anyone know of any way to figure out how much energy they are using by just sitting there idle? I would like to know what they cost me to operate per day, per year, per kw/h, or whatever. Anyone know anything about this? They are:

333 Cel 235 watt PS
466 Cel 235 watt PS
500 Cel 235 watt PS
700 Athlon 300 watt PS
 

StageLeft

No Lifer
Sep 29, 2000
70,150
5
0
fkloster is right of course...

If you have 4 computers sitting there and you're running them all through one monitor (or the monitor is totally turned off), I can't tell you exactly how much power, other than to say that it's not much!
 

Alphacowboy

Senior member
Oct 11, 1999
482
0
0
Probably no more than a 75-100 watt light bulb apiece in sleep or idle... maybe not even that. With the monitors on it could be considerably more, but I could be wrong. A 235 W power supply is 235 W max, meaning it would pull 235 watts at its peak, if I remember right, so at idle it's not much.
 

robisc

Platinum Member
Oct 13, 1999
2,664
0
76
I guess I shouldn't have said idle, since they aren't idle all the time; they're being used maybe 2-3 hours a day. Otherwise they are idle with the monitors turned off. Now if I could just find out how much power they use, then I could find out how much my utility company charges me per kw/h and do the math, huh? Don't ask why I'm wondering about this, it's not like I'm that tight or anything, but if they are costing me a lot I could turn a couple of them off, like the ones that boot fast, such as my BeOS machine.
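
For what it's worth, the math I'm after would go something like this little Python sketch. The idle wattages and the rate per kWh in it are pure guesses until I actually measure something:

    # Rough yearly cost of leaving the four boxes on 24/7.
    # All idle wattages and the $/kWh rate are guesses, not measurements.
    IDLE_WATTS = {
        "333 Celeron": 50,   # guessed idle draw, NOT the 235 W PSU rating
        "466 Celeron": 55,
        "500 Celeron": 55,
        "700 Athlon":  70,
    }
    RATE_PER_KWH = 0.10      # assumed utility rate, $/kWh

    total_watts = sum(IDLE_WATTS.values())
    kwh_per_year = total_watts / 1000 * 24 * 365
    cost_per_year = kwh_per_year * RATE_PER_KWH
    print(f"Total idle draw: {total_watts} W")
    print(f"Energy per year: {kwh_per_year:.0f} kWh")
    print(f"Cost per year:   ${cost_per_year:.2f}")
    print(f"Cost per day:    ${cost_per_year / 365:.2f}")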
 

samgau

Platinum Member
Oct 11, 1999
2,403
0
0
I would guess not much when totally idle... a wild guess would be about 25 W each, because the only things running would be the CPU and your fans; the HD would be sleeping. I do think the Athlon would consume more than the others, since that CPU has a higher wattage.

just my 2c
 

Alphacowboy

Senior member
Oct 11, 1999
482
0
0
Still, even when not idle, if you aren't burning CDs 24/7 or playing Q3A all day long, it's not that much. I know for a fact that your monitor will pull far more power than your system. For example, I have a CPS900AVR UPS (rated 900 VA / 500 W); it lasts 15-20 minutes with my monitor on, but 50-70 minutes without the monitor, if that helps.
 

jinsonxu

Golden Member
Aug 19, 2000
1,370
0
0
I thought it depends on your PSU? Does the monitor get its power from the case's power supply?
 

jinsonxu

Golden Member
Aug 19, 2000
1,370
0
0
I thought if you have a 300 W supply, the max your computer pulls is 300 watts, meaning 0.3 kilowatts?
 

bigjon

Senior member
Mar 24, 2000
945
0
0
Without the monitor I'd say 20-50 watts (just a guess though). I seem to recall that a P2 CPU uses about 20 watts at 100% utilization. Anyone care to correct me on that?

Think about it though: a laptop, which has pretty much the same hardware (a little more efficient, obviously), can run on a small battery for a couple of hours.

I leave 4 computers running at home 24/7 and my electric bill is not noticeably different from when I had just one.
 

MWink

Diamond Member
Oct 9, 1999
3,642
1
76
My heavily configured Athlon 650@715 system plus a 17" monitor only uses 200 W. With just the computer on, the power usage drops to 120 W.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,389
8,547
126
you could always run around the house, turn everything off but the computers, then go out back and note the electric meter reading and the time. leave for an hour, then note the consumption again. divide the consumption (kWh) by the hours gone to get average kilowatts. that should be a pretty accurate view.
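
in numbers, the trick goes like this (the meter readings here are invented examples):

    # two meter readings plus the elapsed time give you average power.
    reading_start = 41530.2   # kWh on the meter at the start (example)
    reading_end = 41530.5     # kWh one hour later (example)
    hours_elapsed = 1.0

    kwh_used = reading_end - reading_start
    avg_kw = kwh_used / hours_elapsed
    print(f"{kwh_used:.1f} kWh over {hours_elapsed:.0f} h = {avg_kw * 1000:.0f} W average")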

why are your computers doing nothing! crack rc-5 for team anandtech! they need your help to defeat the evil dutch power cows! head on over to the distributed computing forum!
 

Cybordolphin

Platinum Member
Oct 25, 1999
2,813
0
0
Forget the watts... figure out the amps being drawn. An 8 amp draw for 8 hours a day will run you about $20 per month, so figure from that.
 

jinsonxu

Golden Member
Aug 19, 2000
1,370
0
0
Well, I figured that since each kilowatt-hour of electricity is charged at $0.1353 (Singapore dollars), leaving my computer on 24/7 (and assuming it uses ALL of its 350 W power supply rating) costs $1.13652 per day.

It would cost me $35.23 to leave my computer on for a whole month, and $422.79 for a whole year. That's $20.72 in US dollars for a whole month and $248.69 in US dollars for a whole year.
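
Spelled out as a quick Python check (this assumes the full 350 W is drawn around the clock, which as others have noted it won't be):

    # Worst-case figures, assuming the PC really draws the full
    # 350 W PSU rating 24/7 (it won't, per the posts above).
    watts = 350
    rate_per_kwh = 0.1353              # quoted Singapore rate, S$/kWh

    kwh_per_day = watts / 1000 * 24    # 8.4 kWh
    cost_per_day = kwh_per_day * rate_per_kwh
    print(f"Per day:  S${cost_per_day:.5f}")        # S$1.13652, matching the figure above
    print(f"Per year: S${cost_per_day * 365:.2f}")  # S$414.83

(The yearly figure above comes from twelve 31-day months; a straight 365-day year gives S$414.83.)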
 

jinsonxu

Golden Member
Aug 19, 2000
1,370
0
0
Does the monitor get its power from the computer's power supply? Does the 350 W rating of my power supply include the power drawn by the monitor too?
 

wesman2

Member
Sep 15, 2000
116
0
0
It's very difficult to get a system to pull all 350 watts; four or five active drives, half a dozen case fans, and a fully loaded CPU and you'll be on your way. Most systems don't crack the 200 W mark. The reason you want a larger power supply is so it can deliver the watts you do use cleanly. Ever turn a 100-watt stereo receiver to the max? It sounds like crap, but it sounds a lot better at 30 watts than a receiver with a 30-watt max does.
 

Mandrill

Golden Member
Feb 7, 2000
1,009
0
0

Your monitors are where the big power consumption comes in. I don't know offhand how much it would cost you, but probably not much. Try asking over in the Distributed Computing forum. A lot of those guys run multiple systems at home, and I am sure someone there can give you a definite answer.

That is a nice group of PCs there. Ever thought about joining the Team Anandtech distributed computing team? We could use you. Since you leave them on all the time, why not give them something to do? :)



 

Zorba

Lifer
Oct 22, 1999
15,613
11,254
136
Cybordolphin: You are charged by kilowatt/hour; that is why everyone is talking watts and not amps. Watts are power (which is what you pay for); amps are just current (although of course, at a given voltage, more amps means more watts).

jinsonxu: Monitors have their own power source nowadays.
 

4824guy

Diamond Member
Oct 9, 1999
3,102
0
0
Power = current (amps) times supply voltage

To find out accurately how much power is being used, someone will have to measure the current through the PC's power cord while the PC is on, then multiply that by the supply voltage (117 V RMS here in the US).
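
As a quick sketch of that formula (the 1.2 A reading is an invented example, and strictly this gives apparent power; see the power factor caveat in the next post):

    # Converting a measured cord current to watts, per the formula above.
    volts = 117.0    # US mains, RMS
    amps = 1.2       # measured through the power cord (invented example)

    va = volts * amps
    print(f"{amps} A at {volts:.0f} V = {va:.0f} VA (~watts, ignoring power factor)")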
 

bigjon

Senior member
Mar 24, 2000
945
0
0
A monitor has its own power supply and does not draw anything from the computer power supply. A typical 15" monitor draws roughly 100 watts, which is more than most computers take.

P (power) = I (current) * E (voltage) for DC circuits, but there's a fun thing called power factor for AC circuits that screws up the equation a little :p For all practical purposes, though, you could figure out the power by measuring the input current and multiplying by the input voltage. Of course, this is rather hard to do safely ;)

A somewhat easier (though still time-consuming) approach would be to measure each of the DC currents inside the case, convert them to power, and add them all up. This will be close to the total power (though the power supply is not 100% efficient). It would at least be interesting to see how much power the motherboard uses versus the hard disk, floppy drive, and CD-ROM. Has anyone actually done this before?

I'm looking at a hard disk right now that has a rated (max) current draw of 0.4 amps at 12 V and 0.5 amps at 5 V. This means the max power for this drive is 0.4*12 + 0.5*5 = 7.3 watts! The most power is used when the drive is spinning up, not when it is running along steadily.
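
Here's that rail-by-rail bookkeeping as a sketch; the hard disk numbers are from the label above, and the motherboard readings are invented placeholders for whatever you'd measure in your own case:

    # volts * amps per rail, then sum for the DC-side total.
    rails = [
        ("hard disk, 12 V rail",   12.0, 0.4),   # from the drive's label
        ("hard disk, 5 V rail",     5.0, 0.5),
        ("motherboard, 5 V rail",   5.0, 4.0),   # invented example reading
        ("motherboard, 12 V rail", 12.0, 0.5),   # invented example reading
    ]

    total = 0.0
    for name, volts, amps in rails:
        watts = volts * amps
        total += watts
        print(f"{name:24s} {watts:5.1f} W")
    print(f"{'total (DC side)':24s} {total:5.1f} W")
    # Divide by the supply's efficiency (maybe ~70% for these units)
    # to estimate the draw at the wall.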
 

jinsonxu

Golden Member
Aug 19, 2000
1,370
0
0


<< A monitor has its own power supply and does not draw anything from the computer power supply. A typical 15" monitor draws roughly 100 watts, which is more than most computers take. >>



But my Philips 105S plugs into my power supply instead of the wall outlet. I'm aware that most monitors plug into the wall outlet, but mine doesn't. So is it drawing from the PSU?

Also, someone said this

<< Today I fitted an AMD TB 1000 MHz FCPGA processor to a mobo. With 128 MB of RAM and a 32 MB TNT card, the whole mobo consumed 9 amps at 5 V, or 45 watts. I measured it personally with a Hall-effect DC clamp ammeter.

So for those who have been told to get a very large power supply, there is no truth to it. The 5 V standby draw was 0.75 amps at 5 V, so any 200 to 250 watt power supply with a 5 V standby rated at 1 amp will be able to power the system. Granted, there was no HDD or CD-ROM drive, but those are about 10 to 15 watts each. So you can figure out just how much power you need. >>

 

ApacheXMD

Platinum Member
Oct 9, 1999
2,765
0
0
folks, please don't say kilowatt/hour
kilowatt/hour = kilowatt per hour = kilojoules per second per hour = nothing
it's kilowatt hours, as in multiplication

and you're charged by how much energy you use, not how much power you use.
power being energy over time
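
in code, the unit works like this:

    # kWh is kilowatts TIMES hours -- the thing the utility bills for.
    power_kw = 0.2    # a 200 W machine
    hours = 5.0
    energy_kwh = power_kw * hours
    print(f"{power_kw} kW for {hours} h = {energy_kwh} kWh")  # 1.0 kWh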

sorry if i'm just nitpicking
:)

-patchy
 

bigjon

Senior member
Mar 24, 2000
945
0
0
jinsonxu, as far as I know the monitor power socket on the power supply is simply a pass-through of the 115 V AC supply current; the power supply is only concerned with supplying low-voltage DC, so the rated 250 W or so goes to the inside of the PC, while any additional amount is passed through to the monitor. So essentially your monitor is plugged into the wall.
 

jinsonxu

Golden Member
Aug 19, 2000
1,370
0
0
Dug up AMD's tech doc on the Duron and estimated that the power used at 1.55 volts and 900 MHz for my chip is 40.114 watts.

So when my system has its monitor off and its hard disk spun down (how much power does that save?) while crunching RC5, what other components are using power?

The MB, the CPU, the CPU fan, RAM, the network card (think I should switch off my cable modem?), and what else? Does the video card still draw power when the screen is blanked and the monitor is off?

How much wattage would these components use? Apart from the CPU, anyone have any ideas?