How much heat would 20 computers put out?

I need to find specs on how much heat an average Apple, Gateway, and IBM computer puts out. I don't have specific models, so I figure the most common models used in schools would do.

This information is needed to get room temperature averages when multiple computers are in an enclosed area.

Any ideas on how to find this information?

Thanks in advance.

*edited to change title*
 
My home computer heats my single bedroom apartment. I rarely need to turn on the heat. Just curious, why are you looking for that info? 😀
 
With the information you've given, you won't get very far. A good place to start would be finding out the computer specs and the room size.
 
Getting the temperature of a room with multiple computers takes a lot more information than just how much heat each puts out. Though that's obviously required as well.

Now, determining the capacity of a cooling system for that room...you can do with just that.

I'd venture to guess that average idle heat output, with LCD monitors, will be in the range of 200W per machine.
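
For a back-of-the-envelope number, here's a minimal sketch assuming that ~200W-per-machine idle figure and 20 machines (the 1W ≈ 3.412 BTU/hr conversion is just the standard unit change for cooling capacity; all inputs are guesses, not measured specs):

# Rough heat-load sketch: assumes ~200 W per machine (tower + LCD) and 20 machines.
machines = 20
watts_per_machine = 200          # idle estimate from this thread, not a measured spec

total_watts = machines * watts_per_machine
total_btu_per_hr = total_watts * 3.412   # 1 W = 3.412 BTU/hr, the unit HVAC sizing uses

print(f"Total heat load: {total_watts} W (~{total_btu_per_hr:.0f} BTU/hr)")
# -> Total heat load: 4000 W (~13648 BTU/hr), i.e. a bit over a 1-ton cooling load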
 
Originally posted by: jagec
Getting the temperature of a room with multiple computers takes a lot more information than just how much heat each puts out. Though that's obviously required as well.

Now, determining the capacity of a cooling system for that room...you can do with just that.

I'd venture to guess that average idle heat output, with LCD monitors, will be in the range of 200W per machine.
My Athlon XP-M 2600+ (2.0GHz) at 2.4GHz @ 1.65V, 768MB RAM and 5 HDDs draws ~130W while idling and ~168W at 100% load. A 19" LCD draws another 25W.

I'd assume not all of that is output as heat.

I guess that's a fair estimate, though. 150-200W...
 
Long story... my son can't go to class because of the heat due to medical issues. The school won't work with me on a solution. The teacher said the room is never below 82, and they aren't agreeing that the computers are putting out enough heat to cause an issue. (Dr. said 72 degrees.)

Thanks.
 
Originally posted by: Eli
My Athlon XP-M 2600+ (2.0GHz) at 2.4GHz @ 1.65V, 768MB RAM and 5 HDDs draws ~130W while idling and ~168W at 100% load. A 19" LCD draws another 25W.

I'd assume not all of that is output as heat.

I guess that's a fair estimate, though. 150-200W...

100% of the electricity that your devices draw will, eventually, be output as heat.

Thanks for some actual numbers; mine were total conjecture...

Originally posted by: aroundincircles
Long story... my son can't go to class because of the heat due to medical issues. The school won't work with me on a solution. The teacher said the room is never below 82, and they aren't agreeing that the computers are putting out enough heat to cause an issue. (Dr. said 72 degrees.)

Thanks.

Ah...I see. Well, assuming ambient temperature is near ~70 degrees, then unless the room is large and/or has pretty good ventilation, the equivalent of a 4000W heater (more if they use CRTs) could heat it up pretty easily. A 10-degree difference isn't much of a driving force for heat transfer.
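
To put rough numbers on the ventilation point: at steady state, the room air ends up warmer than the supply air by roughly dT = Q / (rho * cp * airflow). Here's a minimal sketch; the 400 CFM airflow is purely an assumed value for illustration, not the actual room:

# Steady-state temperature rise of a ventilated room: dT = Q / (rho * cp * airflow)
heat_load_w = 4000                       # ~20 machines at ~200 W each
airflow_cfm = 400                        # assumed ventilation rate, not measured
airflow_m3s = airflow_cfm * 0.000472     # CFM -> cubic meters per second
rho = 1.2                                # air density, kg/m^3
cp = 1005                                # specific heat of air, J/(kg*K)

dT_c = heat_load_w / (rho * cp * airflow_m3s)
print(f"Steady-state rise: {dT_c:.1f} C ({dT_c * 9 / 5:.1f} F) above the supply air")
# With these assumed numbers the room settles roughly 30 F above the incoming air,
# which is why poor ventilation matters so much more than the exact per-machine wattage.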
 
Originally posted by: Eli

I'd assume not all of that is output as heat.

I guess that's a fair estimate, though. 150-200W...

ALL of the energy usage is output as heat; it's called conservation of energy. What else would the energy be converted to? The fans and drives don't accelerate to infinity; they reach a max speed and level off, so they aren't gaining any KE.

Figure about 125W for new 17-19" CRTs, 150-200W for new 20-inchers, and 300-500W for monitors made before 1996 and the Energy Star initiative (this IS a school we're talking about...).
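
Using those rough figures, here's a quick sketch of how much the monitor type alone changes the per-machine and room totals; the 150W tower figure is an assumed idle estimate in line with the numbers above, not a spec:

# Ballpark per-machine heat by monitor type, using rough figures from this thread.
monitor_watts = {
    "LCD (17-19 in)": 25,        # measured 19" LCD figure above
    "CRT (17-19 in)": 125,       # newer CRT estimate
    "CRT (20 in)": 175,          # midpoint of the 150-200 W range
    "CRT (pre-1996)": 400,       # midpoint of the 300-500 W range
}
tower_watts = 150                # assumed idle draw for a typical school desktop
machines = 20

for monitor, watts in monitor_watts.items():
    per_machine = tower_watts + watts
    print(f"{monitor}: ~{per_machine} W each, ~{machines * per_machine} W for {machines} machines")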
 
Our "bright" district put the computers in the one room that has poor ventilation and no windows. I can't tell you how many pcs have been trashed! 🙂
 
At what power level? Full processor use? Or just running the screen saver? Number crunching using the full CPU time but not the video card? Rendering on the fly that uses both?

Way too many variables to give an easy answer.

ZV
 
Originally posted by: Zenmervolt
At what power level? Full processor use? Or just running the screen saver? Number crunching using the full CPU time but not the video card? Rendering on the fly that uses both?

Way too many variables to give an easy answer.

ZV

If you look at Eli's numbers, it really doesn't make a big difference...especially with a heat transfer problem where additional energy does less and less to increase the temperature as the driving force for heat transfer increases.
 
Originally posted by: glugglug
Originally posted by: Eli

I'd assume not all of that is output as heat.

I guess that's a fair estimate, though. 150-200W...

ALL of the energy usage is output as heat; it's called conservation of energy. What else would the energy be converted to? The fans and drives don't accelerate to infinity; they reach a max speed and level off, so they aren't gaining any KE.

Figure about 125W for new 17-19" CRTs, 150-200W for new 20-inchers, and 300-500W for monitors made before 1996 and the Energy Star initiative (this IS a school we're talking about...).
Err, uhh, err..

Well I understand what you're saying, but am a little confused.

Isn't some of the electrical energy converted into mechanical movement and not heat? In the case of fans and HDDs... ?
 
Originally posted by: Eli
Err, uhh, err..

Well I understand what you're saying, but am a little confused.

Isn't some of the electrical energy converted into mechanical movement and not heat? In the case of fans and HDDs... ?

You're right; the first thing that happens to it is that it gets converted to mechanical movement. But once the hard drives and fans are spun up, friction starts to slow them down, and all the power they take while running is used to keep them up to speed...feed in power at one end to fight friction, get exactly the same amount of power back as heat due to the friction.

Not 100% of the energy taken by a computer goes DIRECTLY to heat, but it all will eventually.
 