Assume my computer with video card running 24/7 pulls 400W at the outlet.
Obviously, calculating the electrical cost of that power draw alone is easy: 400 W / 1000 = 0.4 kW, × $0.10 (price per kWh) = $0.04 per hour, × 24 hours = $0.96 per day, × 365 days = $350.40 per year, or $29.20 per month.
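Here's the same arithmetic as a quick Python sketch, in case I've fat-fingered something (the 400 W draw and $0.10/kWh rate are just my numbers):

```python
# Direct electricity cost of a 400 W load running 24/7 (assumed $0.10/kWh)
power_kw = 400 / 1000                       # 0.4 kW
rate_per_kwh = 0.10                         # $ per kWh
cost_per_hour = power_kw * rate_per_kwh     # $0.04
cost_per_day = cost_per_hour * 24           # $0.96
cost_per_year = cost_per_day * 365          # $350.40
cost_per_month = cost_per_year / 12         # ~$29.20
print(f"${cost_per_hour:.2f}/hr  ${cost_per_day:.2f}/day  "
      f"${cost_per_year:.2f}/yr  ${cost_per_month:.2f}/mo")
```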
But during the summer months, all the heat that's produced has to be dealt with by the air conditioner. Here I'm a little fuzzy. I assume most of the 400 W is being converted to heat. I think my cooling system runs at a COP (coefficient of performance) of about 3.0 (roughly SEER 13), meaning it takes 1 watt of electricity to remove 3 watts of heat from the house.
So to remove the heat the computer produces, it would cost 1/3 of the direct electrical cost: $29.20 / 3 ≈ $9.73 per month, for a total of $38.93 per month during the cooling months.
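And the cooling-season version of the sketch, assuming the full 400 W ends up as heat and a COP of 3.0 for the A/C:

```python
# Cooling-season surcharge: the A/C has to pump the computer's heat back outside.
# Assumes all 400 W becomes heat and the A/C runs at a COP of 3.0.
direct_cost_per_month = 0.4 * 0.10 * 24 * 365 / 12   # ~$29.20, from above
cop = 3.0
ac_cost_per_month = direct_cost_per_month / cop       # ~$9.73
summer_total = direct_cost_per_month + ac_cost_per_month  # ~$38.93
print(f"A/C cost: ${ac_cost_per_month:.2f}/mo, summer total: ${summer_total:.2f}/mo")
```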
Does this look right? Am I missing something or miscalculating anything?