30 W * (1 kW / 1000 W) * (12 hours / day) * ($0.26/kWh) * (365 days/year) = $34.164/year in electricity... now multiply by 3 to account for cooling costs = $102.492/year
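For anyone who wants to plug in their own rate or hours, here is a quick Python sketch of that same arithmetic; the 12 hours/day, the $0.26/kWh rate, and especially the 3x cooling multiplier are just my assumptions, not universal figures:

# Quick sketch of the yearly-cost arithmetic above; all inputs are assumptions.
idle_watts = 30.0          # idle draw of the card (W)
hours_per_day = 12.0       # time the PC is powered on each day
price_per_kwh = 0.26       # electricity price in $/kWh (varies a lot by region)
cooling_multiplier = 3.0   # my disputed assumption: total cost = 3x the raw electricity

kwh_per_year = idle_watts / 1000.0 * hours_per_day * 365
base_cost = kwh_per_year * price_per_kwh       # ~$34.16/year
total_cost = base_cost * cooling_multiplier    # ~$102.49/year under the 3x assumption
print(round(base_cost, 2), round(total_cost, 2))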
You would pay an additional $70 a year to cool the heat from your 30 W idle graphics card? How about using a window?
Maybe if you had enough of a breeze you could survive the heat coming off of that power hog.
Because you tell us that it takes $70 a year to cool less than 30 W of heat dissipation, and I honestly have no idea how you got those numbers, because if that were true you'd be paying astronomical sums for cooling. Anyway, cooling costs are completely standard issue with computers; I don't know why you scoff at them so.
My idea of a great PC is one that is totally silent when idle / browsing the Web. Some of you may have really quiet rigs (as do I), but silence is another thing. Mine still makes sounds when I sit and browse (most of my time spent at a PC). The drives vibrate (I can hear that even though they are among the quietest ones available), and I can hear the fans pushing air (Nexus fans - again, there's nothing quieter). You get the idea...
So a PC that automatically shuts down everything not needed when not in use would be one giant step towards that perfect PC - 15-20 W less idle power usage (more if you're on an nVidia card), and no sound at all from the shut-down video card...
I may be in the minority, because I have really good hearing, but one can dream. I always get a chuckle when reading "can barely hear the card above the system fans" - I can hear Nexus fans at 5-6 V.
Because you tell us that it takes $70 a year to cool less than 30 W of heat dissipation, and I honestly have no idea how you got those numbers, because if that were true you'd be paying astronomical sums for cooling.
Which is correct. I was actually being generous, it is likely even more. As for how I got it, I showed my math.
Are you trolling us?
Do you really think it costs anywhere CLOSE to that much?
Do you seriously think it takes 120 W to dissipate 30 W of heat for the 6 months of the year you have the AC on? Because that's what your assumption says.
Correct me if I am wrong, but isn't Optimus entirely software based? No additional hardware is needed, right?
You are using the worst-case scenario. In the winter where I live, it acts as a space heater for 6 months out of the year... the energy doesn't go to waste... Honestly, I would start looking into energy-saving appliances before I would invest in trying to save a few watts on a computer.
If you use heating for 6 months of the year, then multiply the cost by 2.5.
And yes, I was looking at a worst-case scenario... worst case being "anyone living in Hawaii"... BTW, Hawaiians don't need to heat their houses 6 months out of the year...
Maybe you live somewhere where electricity is 7 cents a kWh and you need to heat your house 12 months a year... lucky you. The only benefit to you is the noise reduction.
There is also this incredible thing called 'sleep' that computers have.
I have optimized those to near perfection, please stop making assumptions about me. I would say to play with your power management options before you start spending money in the name of saving money...
This is why I only have it on for 12 hours a day.
I put it to sleep whenever I walk away from it.
I have optimized those to near perfection, please stop making assumptions about me.
The only way I can get it lower is if I switch to a Nehalem for its power savings, or if nVidia gives me Optimus for the desktop (or ATI does something similar).
I agree it would be a nice option, but an idle computer with proper power savings really doesn't consume that much. If you're that worried about a few watts, you had better be unplugging most electronics in your house when not in use, as they can consume small amounts of power even when off. Let's not forget to shut off every light you don't need on, etc.
No it doesn't; it says the AC uses 60 W on average to remove that 30 W of heat.
And no, I am not trolling you... do you believe me to be wrong? I am basing it on having read that it takes 3x as much energy to cool as it does to heat... so:
it takes 3x watts + x watts when it's hot; it takes x watts but saves x watts on heating when it's cold... (not exactly, but close enough).
Now, how much of the year do you run cooling and how much heating? For me it's about 8 months of cooling and 4 of heating in the year... so it comes out to 2/3 * (3x + x) + 1/3 * (x - x) ≈ 2.7x watts... x is 30.
If you do a more normal 6 months of cooling and 6 months of heating, it comes out to 2.5x.
So I guess I was off multiplying it by 3; I should have multiplied by 2.5 or 2.7...
BTW, to claim it takes 120 W to remove 30 W of heat would amount to a 5x multiplier, and I never claimed 5x... And to get the average per-month electricity requirement (which is what you were getting at) to take care of it, you do 2.7 - 1 = 1.7x; if x = 30 W, then it takes 1.7 * 30 W on average per month to get rid of it, which is 51 W... but only because you are averaging -30 W for 4 months of the year and +90 W for 8 months.
Anyway, this is going off on a wild tangent... the point is, total yearly cost is 2.5x to 2.7x the base amount once you account for heating and cooling, if you start from cooling taking 3 times as much energy as the heat it removes.
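If it helps to see the same seasonal model written out, here is a small Python sketch; the 3x cooling factor and the 8/4 month split are just my assumptions from above:

# Sketch of the seasonal model above; the 3x cooling factor and month split are assumptions.
x = 30.0                                 # the card's idle draw (W)
cooling_factor = 3.0                     # assumed: the AC draws 3x the heat it removes
cooling_months, heating_months = 8, 4    # my climate; use 6 and 6 for a "normal" one

hot_months_draw = x + cooling_factor * x   # card plus AC while cooling
cold_months_draw = x - x                   # card's heat offsets the heater while heating
average_draw = (cooling_months * hot_months_draw + heating_months * cold_months_draw) / 12.0
print(average_draw, average_draw / x)      # ~80 W average, i.e. roughly 2.7x the base draw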
I don't recall the source of the claim that it takes 3 times as much energy to remove heat via conventional AC as it does to produce it; it could be wrong, and I would be glad to be shown that it is. But there is no trolling intended here.
You said that on top of the energy cost itself it takes 2x that cost again to cool it off. If you say it costs $34 to run 30 W for a year, and another $67 to cool it for the half of the year it even needs cooling, then you are saying it takes 4x as much to cool it.
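If I'm reading those numbers right, the arithmetic would be: $102.49 - $34.16 ≈ $68 of cooling cost per year, while the card itself costs only about $34 / 2 ≈ $17 over the six months the AC actually runs, and $68 / $17 ≈ 4x.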
I am pretty sure over 100% efficiency violates some of the most fundamental laws of physics, such as conservation of energy.
ACs are usually a lot more than 100% efficient.
I am pretty sure over 100% efficiency violates some of the most fundamental laws of physics. Such as conservation of energy.
Come on taltamir, you've at least got to admit that it's really unlikely that an AC would need 240 W to nullify the heat dissipated by one small light bulb. If that were true, I'd need more than a kW to cool the lighting in my living room alone. I mean, I don't have to make complicated calculations to see that that's just not possible.
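For a sense of scale, here is a minimal Python sketch of what an air conditioner actually draws to pump out a given heat load, assuming a coefficient of performance (COP) of around 3; that COP value is an assumption, not a measurement of anyone's unit:

# Rough sketch: electricity an AC needs to pump out a given heat load.
# The COP of ~3 is an assumed, typical figure; real units vary.
heat_load_watts = 30.0            # heat coming off the idle card
cop = 3.0                         # watts of heat moved per watt of electricity (assumed)
ac_power_watts = heat_load_watts / cop
print(ac_power_watts)             # ~10 W, nowhere near 90-120 W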
Ouch, that must have sucked... And I saw that you replied to my other post, and I will just say this: you are a fanatic about power savings. It is like a friend of mine who calculated how his Kindle would save him money in the end. He justified it with his elaborate calculations, only for it to get stolen a month after he had it... Oops, there goes the equation.
I always take into account the unexpected... even if I calculate "this will save me X," I will be cautious about switching over; I only do so if it's a truly significant amount. And here is why - if, and I truly mean if, this is purely hypothetical, but here goes... what if this new technology that nVidia has suffers from their solder-joint fiasco? So now a motherboard that would otherwise last a solid 3+ years lasts 1 year and has to be replaced... Throws off the equation, doesn't it? And yes, that is hypothetical. But there is a reason why a budget has to have a 'slush' fund... nothing goes as planned.