I'd check its power consumption. If you're going to leave it running 24/7, it may be worth tinkering to lower the power draw, or replacing it outright.
For example, switching to something very low-power could pay for itself fairly quickly, depending on your local electricity costs and how much you end up having to spend.
A typical PC will idle at ~100-200W (depending on a lot of variables: power supply efficiency, CPU, video card, amount of memory, chipset, etc.), so pick a reasonable number like 120W as a conservative estimate.
(120W * 24 hours per day * 365 days per year) / 1000 W per kW = 1051 kWh/year
If electricity costs $0.1121/kWh, then that's about $118/year to run (1051 kWh * $0.1121). $0.1121 is the national average for 2008 from the DOE:
http://www.eia.doe.gov/cneaf/e...ty/epm/table5_6_b.html
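If you want to plug in your own numbers, here's that same math as a quick Python sketch (the 120W draw and $0.1121/kWh rate are just the estimates above, not anything special):

    # Rough yearly electricity cost for an always-on machine.
    # 120W idle and $0.1121/kWh are the estimates from above -- use your own.
    IDLE_WATTS = 120
    RATE_PER_KWH = 0.1121  # 2008 DOE national average

    kwh_per_year = IDLE_WATTS * 24 * 365 / 1000  # watt-hours -> kWh
    cost_per_year = kwh_per_year * RATE_PER_KWH
    print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
    # prints: 1051 kWh/year -> $117.84/year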
If you are in California or New England, the cost is even higher; I think my parents in California are paying something like $0.17/kWh.
If you can cut the power utilization in half, you save about $59 per year. I built a computer from a Pentium M 750 ($25 on eBay 8 months ago) and an A-Open i915GA-HFS ($60 on eBay 8 months ago) which idles at 42W and has plenty of performance for a file server. It cost me about as much to upgrade as I will save in a year over the Pentium 4 that I started with.

I'm using mine as an HD DVR, file server, media center, web server, and home VPN server, and it seems to handle the load fine. I plan on running it for a lot longer than a year; my previous 733MHz Pentium 3 based server stayed on for at least 5 years straight.

The costs add up. If you are like me, when you first set it up you think you'll be running it for a year or so... and then 5-6 years later you look at it and think "wow, it's kind of slow". If you can cut the power by 50% over that typical 120W number, that's almost $300 in savings over 5 years.
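And here's the payback math for an upgrade like mine, as a rough sketch (the 42W idle figure and ~$85 in eBay parts are from my build above; your numbers will differ):

    # Payback estimate for swapping in a lower-power box.
    # 120W old / 42W new idle draw and the parts cost are the figures
    # from my build above; the rate is the 2008 DOE national average.
    OLD_WATTS, NEW_WATTS = 120, 42
    RATE_PER_KWH = 0.1121
    UPGRADE_COST = 25 + 60  # Pentium M 750 + A-Open board off eBay

    def annual_cost(watts):
        return watts * 24 * 365 / 1000 * RATE_PER_KWH

    savings = annual_cost(OLD_WATTS) - annual_cost(NEW_WATTS)
    print(f"saves ${savings:.0f}/year; pays for itself in "
          f"{UPGRADE_COST / savings:.1f} years")
    # prints: saves $77/year; pays for itself in 1.1 years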