This should get you pretty close:
System energy consumption: not more than 100W (probably closer to 50W-75W), not counting the monitor. (A 300W power supply does NOT mean the system pulls anywhere close to 300W!) According to Dell lab tests, a "typical" PC consumes 47.3W - although what Dell considers "typical" is anybody's guess.
A PC will pull its max wattage right at power-on and bootup. After looking through several threads that discussed this topic extensively, the consensus was that a modern PC (Athlon, P4) with one hard drive, maybe 2 PCI cards, and a mid-range CPU will pull about 80W-100W when idle. A 17" CRT will pull about 75W more. This is coming from people who like to stick their multimeters everywhere they can...
-ANYWAY-
Long Island Power Authority Consumer rates:
~$0.13 kWh (it varies from $0.12 - $0.13 depending on how much you use every month)
100W x 24 hours = 2.4 kWh/day
2.4 kWh x $0.13 = ($0.31 / day) x 31 days = ($9.67 / month) x 12 = $116.06 / year
100W = $9.67 / month
75W = $7.25 / month
65W = $6.29 / month
50W = $4.84 / month
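If you want to redo the math for your own wattage and rate, the per-month figures above can be reproduced with a few lines of Python (the $0.13/kWh rate and 31-day month are just the assumptions used above - plug in your own):

```python
def monthly_cost(watts, rate_per_kwh=0.13, days=31):
    """Cost of running a device 24/7 for a month.

    watts: steady power draw of the device
    rate_per_kwh: your utility's rate in $/kWh (0.13 assumed, as above)
    days: days in the billing month (31 assumed, as above)
    """
    kwh_per_day = watts / 1000 * 24   # e.g. 100W -> 2.4 kWh/day
    return kwh_per_day * rate_per_kwh * days

for w in (100, 75, 65, 50):
    print(f"{w}W = ${monthly_cost(w):.2f} / month")
```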
If your dad is a reasonable person, just sit down with him when he gets the power bill, show him the electricity rate ($/kWh), and explain that your PC uses about the same as a 100W lightbulb. If he doesn't realize what that adds up to in dollar terms, just do the math for him,
flip him a $10, and tell him to relax! Ten bucks is probably too much, but as long as he understands that, you should have no problems.
Is it worth $10/month to you to keep your Linux box running 24/7?
Useless Fact
I'm working in China right now, and the electricity here costs
~$0.06 / kWh - of course, that only applies when the electricity is WORKING!