I'm not going to bother with numbers now - I'm just not in the mood. But you need to consider the efficiency of the power supply. If your CPU is consuming 40W, then most likely your power supply is drawing on the order of 70W from the wall... Thus you need to nearly double the cost of everything in this thread.
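To put that efficiency adjustment in concrete terms, here's a minimal sketch - the ~57% efficiency is just what the 40W-out / 70W-in figures above imply, and real supplies vary a lot:

```python
# Back-of-envelope: wall draw given component draw and PSU efficiency.
# The 0.57 figure is only what 40 W out / 70 W in implies; older supplies
# run ~60-70% efficient, newer ones 80%+.
def wall_draw(component_watts, psu_efficiency):
    """Watts pulled from the outlet to deliver component_watts to the parts."""
    return component_watts / psu_efficiency

print(round(wall_draw(40, 0.57)))  # ~70 W at the wall
print(round(wall_draw(40, 0.80)))  # ~50 W with a more efficient supply
```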
40W is NOT the idle power of the CPU alone. You can see how much a test 2GHz P4 used while running different programs here (someone did a study on the effect of compiler optimization on power consumption). Idle CPU consumption is only 10.5W. The 40W figure is that plus another 10 or so for the video card, another 10 or so for the HD, 5 for the mobo & RAM, plus some slack.
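Spelled out, the 40W working figure is just the sum of those per-component guesses (the "slack" value below is whatever rounds the total up to 40W - these are estimates, not measurements):

```python
# Rough system-idle power budget behind the 40 W working figure.
idle_watts = {
    "CPU (2GHz P4, idle)": 10.5,
    "video card": 10,
    "hard drive": 10,
    "mobo & RAM": 5,
    "slack": 4.5,   # padding to reach the 40 W round number
}
print(sum(idle_watts.values()))  # 40.0
```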
Plus you have to consider whether your house is running the heater or the AC. With the heater running, the computer's power use ends up as heat, which is a saving on your heating bill - so in cold climates, turning off your computer in winter won't help your bills much. The opposite happens in summer when the AC is running: every watt of heat your computer puts out is another watt the AC has to remove, so in summer you'd have to double the costs.
Any decent AC moves roughly 3 watts of heat per watt of electricity it consumes (the EER rating on the yellow sticker is in BTU/h per watt, so an EER of 10 works out to a COP of about 3, not 10:1). So the additional AC power needed to offset the heat generated by the computer only adds about a third on top of the computer's power consumption - nowhere near doubling it.
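A quick sketch of the summer surcharge under that assumption (a COP of about 3; plug in your own unit's numbers) - in winter the same heat just offsets the furnace instead:

```python
# Extra AC electricity needed to pump the computer's waste heat back outside.
# COP = watts of heat moved per watt of electricity; ~3 assumed here
# (roughly what an EER-10 window unit delivers).
def ac_surcharge(computer_watts, cop=3.0):
    """Additional AC draw, in watts, to remove the computer's heat."""
    return computer_watts / cop

print(round(ac_surcharge(40), 1))  # ~13.3 W extra on top of the 40 W in summer
```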
40W x 24 hours/day x 30 days/month = 28.8 kWh/month. Multiply by your local rate per kWh (actually extremely high because you're in NY) and it comes to $3.74 per month in the city, or $1.25 if it only runs 8 hours/day - that works out to roughly 13 cents/kWh; not sure about the island. Clearly not worth shutting down.
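Same arithmetic as a script, if you want to plug in your own rate - the 13 cents/kWh is just what backs out of the $3.74 figure, not a quoted tariff:

```python
# Monthly cost of running the box, from average draw, hours per day, and rate.
def monthly_cost(avg_watts, hours_per_day, rate_per_kwh, days=30):
    kwh = avg_watts * hours_per_day * days / 1000
    return kwh, kwh * rate_per_kwh

rate = 0.13  # $/kWh, roughly what the $3.74 figure implies
for hours in (24, 8):
    kwh, dollars = monthly_cost(40, hours, rate)
    print(f"{hours:>2} h/day: {kwh:4.1f} kWh/month -> ${dollars:.2f}")
# 24 h/day: 28.8 kWh/month -> $3.74
#  8 h/day:  9.6 kWh/month -> $1.25
```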