This is pretty easy to figure out yourself. Electric companies usually charge for energy usage in kilowatt-hours (kWh). You just need to find out how much you are being charged per kWh (this should be printed on your bill).
Then you need to find out how much power your computer is consuming on average with all connected components. This is the tricky part, because if you use power management of any kind, certain components can drop into an idle state to conserve energy. Remember that a 300W power supply isn't always pushing 300W; that's just its maximum output. You can get a rough, worst-case estimate of how much power your system consumes by adding up the rated power draw of each individual component. Then you can figure out what it costs per unit of time by working out something similar to this:
(total power draw of all connected components when NOT idle, in watts) * (1 kW / 1000 W) * (price per kWh) = COST PER HOUR OF USAGE

Multiply that by the number of hours the machine runs and you get the total cost for that period.
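If it helps, here's a minimal sketch of that arithmetic in Python. All of the component wattages, the $0.12/kWh rate, and the 8 hours a day are made-up example numbers, not anything from a real bill, so plug in your own:

```python
# Rough cost estimate for running a PC, using worst-case (rated) power draw.
# Every number below is a made-up example value -- substitute your own.

component_watts = {
    "CPU": 95,
    "GPU": 150,
    "motherboard": 30,
    "drives": 15,
    "fans": 10,
}

price_per_kwh = 0.12    # dollars per kWh -- check your own bill
hours_per_day = 8       # how long the machine runs each day

total_watts = sum(component_watts.values())
kilowatts = total_watts / 1000       # convert W to kW

cost_per_hour = kilowatts * price_per_kwh
cost_per_day = cost_per_hour * hours_per_day

print(f"Total draw:    {total_watts} W")
print(f"Cost per hour: ${cost_per_hour:.4f}")
print(f"Cost per day:  ${cost_per_day:.2f}")
```

With those example numbers you'd get 300 W total, which works out to about 3.6 cents per hour, or roughly 29 cents for an 8-hour day.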
I'm pretty sure my math is right, but anyone is entitled to correct me if I am wrong.