Heating your home with a PC? Help me calculate BTU's

joe4324

Senior member
Jun 25, 2001
446
0
0

I've been doing a lot of research lately into sustainable housing (earthen homes, super-high-efficiency insulation, aerated concrete foams and even underground homes). These super-efficient structures can be built to require VERY little heat to maintain proper temperatures; as little as 5,000-10,000 BTU per hour in some of the most efficient designs (assuming 20°F outside temperatures, I believe). I've read that the human body produces between 500 and 800 BTU per hour of heat. (Brings a new meaning to the term "house warming party", eh? :)

Anyway, I would like to know how best to calculate the heat produced by a personal computer. I would like it to be fairly accurate and take into account differences like monitor size and even the megahertz of your CPU.

How would you approach this? And what are some good resources for figuring it out? It might be possible to maintain proper temps inside your home while playing your favorite video game! I hope!!
 

AbsolutDealage

Platinum Member
Dec 20, 2002
2,675
0
0
There are so many variables in this system that the only way to get an accurate idea is with a test computer and a test apparatus. There is virtually no way to predict how much heat a particular hardware setup will create. If you are still interested, try something like this:

Take your computer and put it inside a thermally closed system (try a good-sized styrofoam cooler for a rough idea). Poke a hole just large enough for your cords to pass through and seal up the gap around the cords with some duct tape or something. Put the top on the cooler and seal that off too. Stick a decent thermistor in there and record the temp. Boot your computer up and leave it running for exactly one hour; maybe put it on SETI or something to keep activity up. Record the temperature inside the cooler after the 1-hour run.
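
Off the top of my head, the conversion from the measured temperature rise to heat output would go something like the sketch below. The cooler volume and temperature rise are made-up example numbers, and it ignores the heat soaked up by the PC's own mass and the cooler walls (which in reality is most of it), so treat it as very rough:

    # Rough sketch: heat output from the temperature rise of the air in the cooler.
    # Assumes a sealed cooler of known volume; ignores heat absorbed by the PC
    # itself and the cooler walls, so treat the result as a lower bound at best.
    air_density = 1.2      # kg/m^3, air at room temperature
    air_cp      = 1005.0   # J/(kg*K), specific heat of air
    volume_m3   = 0.05     # cooler air volume, a guess for a good-sized cooler
    delta_t_c   = 20.0     # temperature rise measured over the run, example only
    run_hours   = 1.0

    joules = air_density * volume_m3 * air_cp * delta_t_c  # heat absorbed by the air
    watts  = joules / (run_hours * 3600.0)                  # average power over the run
    btu_hr = watts * 3.413                                  # 1 W = 3.413 BTU/hr
    print(round(watts, 2), "W average ->", round(btu_hr, 2), "BTU/hr")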

It's been way too long since I did any thermo calculations, though, so take those conversion factors with a grain of salt... maybe someone else can sanity-check the formula for going from a delta T in air to BTUs?

Looking around, I found a server power calculator for HP servers that has a BTU spec on it here. It seems to hover around 1000-1500 for a low-end server (which is still putting out way more heat than your average desktop... check out the size of the beast, but it gives you a ballpark figure).
 

joe4324

Senior member
Jun 25, 2001
446
0
0
I've thought about that idea, enclosing the whole system inside a box and seeing how fast it raises the temps. But is there a simpler way? How about figuring out the electricity being used? Wouldn't heat be the only real by-product of all the electricity the computer sucks up? (Which isn't that much, really; I have 2-4 computers running nearly 24/7 and my electric bill was only $60 this month.)

Would this be a simpler approach? I wish I had a kilowatt meter or something to measure how many watt-hours my computer was sucking up every hour.
 

AbsolutDealage

Platinum Member
Dec 20, 2002
2,675
0
0
Originally posted by: joe4324
I've thought about that idea, enclosing the whole system inside a box and seeing how fast it raises the temps. But is there a simpler way? How about figuring out the electricity being used? Wouldn't heat be the only real by-product of all the electricity the computer sucks up? (Which isn't that much, really; I have 2-4 computers running nearly 24/7 and my electric bill was only $60 this month.)


Something tells me that this idea is not going to be nearly as accurate... there are definitely going to be some inaccuracies here. For instance, fans moving is electricity being converted into kinetic energy. Hard drives spinning, CDs reading, floppies seeking, etc. will all suck up energy.


Would this be a simpler approach? I wish I had a kilowatt meter or something to measure how many watt-hours my computer was sucking up every hour.

If you have a beefy multimeter you can just measure the total amperage drawn from your outlet... then it's just math to get the wattage.
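
For instance, a quick sketch (the voltage and current values are just examples, and it ignores power factor, which makes a PC read a bit high):

    # Rough sketch: wall power from a current reading on the AC cord.
    # Ignores power factor, so for a PC power supply this somewhat
    # overestimates the real, heat-producing wattage.
    line_volts = 120.0    # nominal US line voltage (about 230 in Europe)
    amps       = 1.5      # example multimeter reading
    watts      = line_volts * amps
    print(watts, "W apparent draw")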
 

joe4324

Senior member
Jun 25, 2001
446
0
0
I've got a cheap $15 multimeter at home; I don't know if it can handle being hooked up to something drawing that much current. You are correct about the kinetic energy though, it would change the results. If I could get within 10-20% accuracy it would be good enough for me. That would tell me whether having a few computer systems running all the time would be enough to heat the kind of efficient home I'd like to build; that's my ultimate goal. If the math works out, it could change my construction plans.
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Originally posted by: AbsolutDealage
Originally posted by: joe4324
I've thought about that idea, enclosing the whole system inside a box and seeing how fast it raises the temps. But is there a simpler way? How about figuring out the electricity being used? Wouldn't heat be the only real by-product of all the electricity the computer sucks up? (Which isn't that much, really; I have 2-4 computers running nearly 24/7 and my electric bill was only $60 this month.)


Something tells me that this idea is not going to be nearly as accurate... there are definitely going to be some inaccuracies here. For instance, fans moving is electricity being converted into kinetic energy. Hard drives spinning, CDs reading, floppies seeking, etc. will all suck up energy.

Yes, the fan moves air, but the air eventually stops due to friction.... the hard drive has to fight friction... basically all that motion turns into heat anyway.
 

joe4324

Senior member
Jun 25, 2001
446
0
0
I'm going to repost this in General Hardware; I'll probably get a lot of crap posts, but maybe someone will have some more input too. I'm trying to find a refrigerator box and some insulation hehe.
 

f95toli

Golden Member
Nov 21, 2002
1,547
0
0
Why don't you just measure the current? Get a decent multimeter that can measure alternating current; most multimeters that can measure AC are rated for 10 A, so it should be OK. And the result will be very accurate.
Just be careful...

 

joe4324

Senior member
Jun 25, 2001
446
0
0
Where am I measuring the current, at the wall or on the PSU? How would you do it?
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Why don't you just measure the current?

This is fine, if you don't mind working with live cables. However, given that you had to ask, I would advise that you do not follow up on this method.

This does not give an accurate measure of power at all (a PC's switching power supply has a poor power factor, so volts × amps overstates the real power), but it should be good enough for a rough estimate.

The best way really is to use a watt meter. You can get one for less than $30, it's far safer than using a multimeter, and it's hugely more accurate, with more sophisticated features like measuring total energy consumption and electricity cost without you needing to do the calculations yourself.
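
The cost part of that is simple enough to sketch yourself; the wattage, hours, and electric rate below are just assumptions, so check your own bill for the real numbers:

    # Rough sketch of the monthly-cost figure such a meter reports.
    watts          = 200.0    # example steady draw for one always-on PC
    hours_per_day  = 24.0
    days_per_month = 30.0
    rate_per_kwh   = 0.10     # $/kWh, assumption - check your own bill

    kwh_per_month  = watts / 1000.0 * hours_per_day * days_per_month
    cost_per_month = kwh_per_month * rate_per_kwh
    print(round(kwh_per_month, 1), "kWh/month -> $%.2f" % cost_per_month)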
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
ALL energy used by the computer is converted to heat, it's that simple. And any good SmartUPS will tell you how much power your computer is using.

While you could design buildings well insulated enough that one computer per person heats the place (in fact most modern offices are this way), it's not really desirable, because heat leaking out that slowly implies ventilation poor enough that you'd suffocate.
 

ScottMac

Moderator, Networking, Elite Member
Mar 19, 2001
5,471
2
0
BTU / Hr = Watts * 3.413

BTU / minute = Watts * 0.05689

Keep in mind that the "Watts" value is actual watts. If you have a 400 Watt supply, it's not putting out 400 Watts all the time; that would be your peak (or total), assuming you could drive every rail to maximum (and you probably can't).

If you're going to do air conditioning for a room, you can take the circuit breaker amperage (for that space), multiply by the supply voltage, and get a "total possible" value for the room.
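
Plugging in some example numbers (the 250 W draw, 15 A breaker, and 120 V line are just assumptions for illustration):

    # Sketch of the conversions above with example numbers.
    watts = 250.0                               # measured/estimated actual draw
    print(round(watts * 3.413), "BTU/hr")       # about 853 BTU/hr
    print(round(watts * 0.05689, 1), "BTU/min")

    # "Total possible" heat load for the room, from the breaker rating
    breaker_amps = 15.0
    line_volts   = 120.0
    print(round(breaker_amps * line_volts * 3.413), "BTU/hr worst case")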

FWIW

Scott


 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Well, a rough rule of thumb is something like 100 W for a CRT, 30 W for an LCD, and 100-300 W for the computer itself, depending on the hardware. The easiest way might just be to leave the computer off and see how fast your electricity meter is moving, then turn it on, run Prime95, and see how fast it's moving. I would say that 250 W is typical of a computer, so that would be around 850 BTU/hr.
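
If you do go the meter route, older spinning-disk meters have a "Kh" constant printed on the faceplate (watt-hours per disk revolution), so timing one revolution with the PC off and again with it on gives you the difference. A rough sketch, with the Kh value and timings as assumptions only:

    # Sketch: average draw estimated from an electromechanical meter's disk.
    # Kh is the meter constant on the faceplate (watt-hours per revolution);
    # 7.2 is a common value, but check your own meter.
    kh = 7.2
    seconds_per_rev_pc_off = 90.0   # example timing with the computer off
    seconds_per_rev_pc_on  = 60.0   # example timing with the computer on

    def avg_watts(seconds_per_rev):
        return kh * 3600.0 / seconds_per_rev   # Wh/rev * rev/hour = watts

    pc_watts = avg_watts(seconds_per_rev_pc_on) - avg_watts(seconds_per_rev_pc_off)
    print(round(pc_watts), "W, about", round(pc_watts * 3.413), "BTU/hr")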
 

Geniere

Senior member
Sep 3, 2002
336
0
0
Glugglug got it right, almost all of the energy is dissipated by the PC as heat. There are too many variables, but my guess is that a PC would average about 600-700 BTU/hr, and maybe 100-150 BTU/hr for the monitor.

Regards
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
I guarantee those estimates are WAY too high.

Most modern PSUs would die within a week if your actual usage was that high.

An Athlon 3000+ with a GeForce FX will still be under 200W with both CPU and GPU pegged, including fans & drives, and the monitor is less than another 200 (in order to be Energy Star compliant).
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
200 watts would be a 3000+ with a GeForce FX and a moderate-sized RAID array running 3D benchmarks. (Without the RAID it's more like 160W, maybe 175 with lots of fans.)
Far from a typical user's system, and even then only when you keep it heavily loaded.

A normal "high-end" system would be about 100W under load, less than 50 when idle. Not 200, and not the 250 that Shalmanese suggested.
 

joe4324

Senior member
Jun 25, 2001
446
0
0
Originally posted by: glugglug
ALL energy used by the computer is converted to heat, it's that simple. And any good SmartUPS will tell you how much power your computer is using.

While you could design buildings well insulated enough that one computer per person heats the place (in fact most modern offices are this way), it's not really desirable, because heat leaking out that slowly implies ventilation poor enough that you'd suffocate.


Of course proper ventilation is important. Even if a house were 100% airtight it would still take days to use up all the oxygen, but that's not the point; the point is to control the influx of outside air. If I can make all air movement occur through a certain space, I can use a heat exchanger so the pre-heated (or pre-cooled) inside air tempers the incoming outside air. Theoretically a properly designed heat exchanger could reduce the energy required to condition the new air by 50%. I'm sure it's hard to get close to that in practice, but it wouldn't take long to pay for itself if it were implemented properly.
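
A rough sketch of what I mean by tempering the incoming air (the 50% effectiveness is just the theoretical number above; real units and temperatures will differ):

    # Sketch: supply-air temperature out of a simple heat-recovery exchanger.
    def supply_temp_f(outdoor_f, indoor_f, effectiveness=0.5):
        # effectiveness = fraction of the indoor/outdoor difference recovered
        return outdoor_f + effectiveness * (indoor_f - outdoor_f)

    # 20F outside, 70F inside -> incoming air pre-warmed to about 45F
    print(supply_temp_f(20.0, 70.0))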
 

Boutique

Junior Member
Mar 13, 2003
3
0
0
I once borrowed a digital power consumption meter (1 watt resolution) from the power company.

My Athlon XP 1600+ and XP 1700+ machines at that time drew a little over 100W each. Hard disk activity increased the power consumption by about 10W. CPU activity (running a 3D game) also measurably increased the power consumption, I think also by about 10W.

The 21" Hitachi CRT drew about 110W.

Bottom line for me was a total power consumption of 220W at night with those two servers running and the CRT sleeping, and 330W while working with the CRT on.

EDIT: And BTW, the 500MHz Celeron only consumed 50W.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Originally posted by: glugglug
200 watts would be a 3000+ with a GeForce FX and a moderate-sized RAID array running 3D benchmarks. (Without the RAID it's more like 160W, maybe 175 with lots of fans.)
Far from a typical user's system, and even then only when you keep it heavily loaded.

A normal "high-end" system would be about 100W under load, less than 50 when idle. Not 200, and not the 250 that Shalmanese suggested.

I was referring to the combined load, i.e. monitor + tower + speakers, etc.
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
Well, good speakers like Klipsch only draw much of anything when given a signal. Logitechs have a heatsink that stays pretty warm no matter what, though.