Originally posted by: jelifah
The power button on my 21" CRT monitor recently broke. After taking it apart I got it to at least stay in the On position. So now, rather than turning the monitor off, I just let it drop into sleep mode.
How much money am I wasting leaving it in sleep mode? Or how many watts am I burning through a day?
There are actually three possible sleep modes. Standby uses roughly two-thirds of normal power, under 100W (160W is the maximum draw of a 21-inch Trinitron); Suspend uses under 15W; and active-off uses under 5W. Most CRTs and LCDs draw about the same in active-off, since only the standby electronics are running, while an LCD at full power naturally uses around half as much as a CRT of equal size.
Here is a description of the modes.
Oh yeah, if you left a 5W device on at all times, that's 3.6 kWh per 30-day month (5W x 720 hours). At my current energy cost of 8.2 cents per kWh, that'd cost me all of 29.52 cents a month. At the full 160W all the time, that's 115.2 kWh per month, or all of $9.45.
And just because it comes up a lot: people wonder how much a PC running 24x7 costs them. If I ran a 300W-drawing monster (PCs generally don't use anywhere near the 500W people assume when they buy such big power supplies), it'd cost me $17.71 a month. Naturally, if you live somewhere electricity runs 25 cents per kWh, it starts to become a significant drain to run multiple PCs 24x7, but a single normal PC with a big monitor, both running 24x7 at full power, would probably be around $50 a month even at that.
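The arithmetic in the last two paragraphs is easy to redo for your own hardware. Here's a minimal sketch; the `monthly_cost` helper name is my own, and the wattages and the 8.2 cents/kWh rate are just the figures from the post:

```python
HOURS_PER_MONTH = 24 * 30  # a 30-day month = 720 hours

def monthly_cost(watts, cents_per_kwh=8.2):
    """Return (kWh used per 30-day month, cost in dollars)
    for a device drawing `watts` continuously."""
    kwh = watts * HOURS_PER_MONTH / 1000   # watt-hours -> kilowatt-hours
    dollars = kwh * cents_per_kwh / 100    # cents -> dollars
    return kwh, dollars

# Active-off monitor: 3.6 kWh, about 30 cents/month
print(monthly_cost(5))
# Monitor at full 160W: 115.2 kWh, about $9.45/month
print(monthly_cost(160))
# 300W PC running 24x7: 216 kWh, about $17.71/month
print(monthly_cost(300))
```

Swap in your own rate (e.g. `cents_per_kwh=25`) to see how quickly the numbers scale at expensive electricity prices.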