Optimus, a waste on laptop, badly needed on desktop!


cbn

Lifer
Mar 27, 2009
12,968
221
106
Correct me if I am wrong, but isn't Optimus entirely software-based? No additional hardware is needed, right?
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
I have an Optimus laptop (N61J), and love it - 4 hours of battery life as opposed to 2.5-3 is definitely appreciated, and I can game when I'm stuck in my lab long hours. Driver support has left much to be desired, though.

I'll echo the sentiments of others here... Great for laptops, but I really don't care about adding such a thing to my desktop.
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
30 W * (1 kW / 1,000 W) * (12 hours / 1 day) * ($0.26/kWh) * (365 days/year) = $34.16/year in electricity... now multiply by 3 to account for cooling costs = $102.49/year
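
For what it's worth, the base figure is just arithmetic and it checks out (quick Python check below, using the quoted 26 cents/kWh and 12 idle hours a day); it's the "multiply by 3 for cooling" step that's the problem:

idle_watts = 30
hours_per_day = 12
rate_per_kwh = 0.26                                       # rate quoted above
kwh_per_year = idle_watts / 1000 * hours_per_day * 365    # ~131 kWh/year
print(round(kwh_per_year * rate_per_kwh, 2))              # ~34.16 ($/year), before any cooling multiplier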


You would pay an additional $70 a year to cool the heat from your 30 W-at-idle graphics card? How about using a window?

Maybe if you had enough of a breeze, you could survive the heat coming off of that power hog.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You would pay an additional $70 a year to cool the heat from your 30 W-at-idle graphics card? How about using a window?

Maybe if you had enough of a breeze, you could survive the heat coming off of that power hog.

I can't open a window because it's very hot outside... I cool the inside of my house to be colder than the outside in the summer, and I heat it to be hotter than the outside in the winter...

Venting the house means dying of exposure (or, on nicer days, just being uncomfortable). It also means losing all the heated / cooled air... it costs LOTS of money to open a window...

Anyway, cooling costs are completely standard issue with computers; I don't know why you scoff at them so.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Anyway, cooling costs are completely standard issue with computers; I don't know why you scoff at them so.
Because you tell us that it takes $70 a year to cool less than 30 W of heat dissipation, and honestly I've got no idea how you get those numbers, because if that were true you'd be paying astronomical sums for cooling.
 
Last edited:

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
My idea of a great PC is one that is totally silent when idle / browsing the Web. Some of you may have really quiet rigs (as do I), but silence is another thing. Mine still makes sounds when I sit and browse (most of my time spent at a PC). The drives vibrate (I can hear that even though they are among the quietest ones), I can hear the fans pushing air (Nexus fans - again, there's nothing better for sound). You get the idea...

So a PC that automatically shuts down everything not needed when not in use would be one giant step towards that perfect PC - 15-20W less idle power usage (more if you're on an nVidia card), no sound at all from the shut down video card...

I may be in the minority, cause I have really good hearing, but one can dream. I always get a chuckle when reading "barely can hear the card above system fans" - I can hear Nexus fans @ 5-6V :p
 

pcslookout

Lifer
Mar 18, 2007
11,958
155
106
My idea of a great PC is one that is totally silent when idle / browsing the Web. Some of you may have really quiet rigs (as do I), but silence is another thing. Mine still makes sounds when I sit and browse (most of my time spent at a PC). The drives vibrate (I can hear that even though they are among the quietest ones), I can hear the fans pushing air (Nexus fans - again, there's nothing better for sound). You get the idea...

So a PC that automatically shuts down everything not needed when not in use would be one giant step towards that perfect PC - 15-20W less idle power usage (more if you're on an nVidia card), no sound at all from the shut down video card...

I may be in the minority, cause I have really good hearing, but one can dream. I always get a chuckle when reading "barely can hear the card above system fans" - I can hear Nexus fans @ 5-6V :p

Haha, same here. I have ears like a cat. My computer is far from quiet, though, but even if it were, I would still be able to hear it easily. Having a loud PC, at least right now, does not bother me, mainly because I use headphones, so it doesn't really matter how loud my PC is. Have you ever tried those silent boxes they make for PCs to make them completely noiseless? I believe they exist. I wouldn't use one because I am scared it would overheat my PC.

You're right, though: no matter what kind of fans or hard drives you use, there is no way to make a PC completely silent. If your house is quiet enough and your ears are good enough, you can still hear them.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I thought Nvidia desktop GPUs since the 6800 series had the ability to shut down clusters when they are not needed?
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
I have never been willing to shell out for a laptop, despite many opportunities. If I did get one, though, Optimus would be a selling point for me.

I'm fine with automatic down-clocking in 2-D for my desktop.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
My idea of a great PC is one that is totally silent when idle / browsing the Web. Some of you may have really quiet rigs (as do I), but silence is another thing. Mine still makes sounds when I sit and browse (most of my time spent at a PC). The drives vibrate (I can hear that even though they are among the quietest ones), I can hear the fans pushing air (Nexus fans - again, there's nothing better for sound). You get the idea...

So a PC that automatically shuts down everything not needed when not in use would be one giant step towards that perfect PC - 15-20W less idle power usage (more if you're on an nVidia card), no sound at all from the shut down video card...

I may be in the minority, cause I have really good hearing, but one can dream. I always get a chuckle when reading "barely can hear the card above system fans" - I can hear Nexus fans @ 5-6V :p

I also love silence... the biggest improvement IMAO from my intel SSD is that it is utterly silent... I can distinctly hear my secondary drive revving up and spinning when I access it for some reason or another. (Actually, I can hear the same happening on my file server two rooms over)
The GPU being completely off will help silence the PC even more.

Because you tell us that it takes $70 a year to cool less than 30 W of heat dissipation, and honestly I've got no idea how you get those numbers, because if that were true you'd be paying astronomical sums for cooling.

Which is correct. I was actually being generous, it is likely even more. As for how I got it, I showed my math.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Which is correct. I was actually being generous, it is likely even more. As for how I got it, I showed my math.

Are you trolling us?

Do you really think it costs anywhere CLOSE to that much?

Do you seriously think that it costs 120 W to dissipate 30 W of heat for the 6 months of the year you turn on the AC? Because that's what your assumption says.
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Are you trolling us?

Do you really think it costs anywhere CLOSE to that much?

Do you seriously think that it costs 120 W to dissipate 30 W of heat for the 6 months of the year you turn on the AC? Because that's what your assumption says.

No it doesn't; it says the AC uses 60 W on average to remove 30 W of heat.

And no, I am not trolling you... do you believe me to be wrong? I am basing it on having read that it takes 3x as much energy to cool as it does to heat... so:
it takes 3x watts + x watts when it's hot; when it's cold it takes x watts but saves x watts on heating... (not exactly, but close enough).
Now, how much of the year do you run cooling and how much heating? For me it's about 8 months of cooling and 4 of heating, so it comes out to 2/3 * (3x + x) + 1/3 * (x - x) = 2.7x watts... x is 30.
If you do a more normal 6 months of cooling and 6 months of heating, it comes out to 2.5x.

So I guess I was off in multiplying it by 3; I should have multiplied it by 2.5 or 2.7...
BTW, to claim it takes 120 W to remove 30 W of heat would be a result of 5x, and I never claimed 5x... And to get the average monthly electricity requirement (which is what you were getting at), you do 2.7 - 1 = 1.7x; if x = 30 W, then it takes 1.7 * 30 W on average to get rid of it, which is 51 W... but only because you are averaging -30 W for 4 months of the year and +90 W for 8 months.
Anyway, this is going off on a wild tangent... the point is, total yearly cost is 2.5x to 2.7x the base amount once you account for cooling, if you start from cooling taking 3 times as much energy.

I don't recall the source for the claim that it takes 3 times as much energy to remove heat via conventional AC as it does to produce it; it could be wrong, and I would be glad to be shown that it is. But there is no trolling intended here.
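
Here is that weighting as a quick Python sketch, so you can plug in your own months; the 3:1 cooling figure is the assumption that might be wrong:

def yearly_multiplier(cooling_ratio=3.0, cooling_months=8, heating_months=4):
    # cooling season: the device's own draw (1x) plus the AC work to remove it (cooling_ratio * x)
    # heating season: the device's own draw (1x) minus the heating it displaces (1x)
    cooling = cooling_months / 12 * (1 + cooling_ratio)
    heating = heating_months / 12 * (1 - 1)
    return cooling + heating

print(yearly_multiplier())   # 8 months cooling / 4 heating -> ~2.67, i.e. the ~2.7x above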

Correct me if I am wrong, but isn't Optimus entirely software-based? No additional hardware is needed, right?

Supposedly there is a hardware component... one that has been included in every nVidia GPU for the past few years while they were working on the software.

You are using the worst-case scenario. In the winter, where I live, it acts as a space heater for 6 months out of the year... The energy doesn't go to waste... Honestly, I would start looking into energy-saving appliances before I would invest in trying to save a few watts on a computer.

If you use heating for 6 months of the year, then multiply the cost by 2.5.
And yes, I was looking at a worst-case scenario... worst case being "anyone living in Hawaii"... BTW, Hawaiians don't need to heat their houses 6 months out of the year...
Maybe you live somewhere where electricity is 7 cents a kWh and you need to heat your house 12 months a year... lucky you. The only benefit to you is the noise reduction.
 
Last edited:

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
If you use heating for 6 months of the year, then multiply the cost by 2.5.
And yes, I was looking at a worst-case scenario... worst case being "anyone living in Hawaii"... BTW, Hawaiians don't need to heat their houses 6 months out of the year...
Maybe you live somewhere where electricity is 7 cents a kWh and you need to heat your house 12 months a year... lucky you. The only benefit to you is the noise reduction.

Actually, it isn't luck at all. I chose where I live, and, well, unlucky for those in HI, because most of the nation doesn't pay the crazy premiums that HI does. If you can't afford to live there, then don't.

There is also this incredible thing called 'sleep' that computers have. My idle power consumption goes from ~150 watts to around 5 watts in this mode. I hit a button and in less than 3 seconds I have my desktop back where I need it to be.

I would say to play with your power management options before you start spending money in the name of saving money...
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
There is also this incredible thing called 'sleep' that computers have.

This is why I only counted 12 hours of idle GPU time out of the 24 hours in a day.
I put it to sleep whenever I walk away from it.

I would say to play with your power management options before you start spending money in the name of saving money...
I have optimized those to near perfection; please stop making assumptions about me.
The only way I can get it lower is if I switch to a Nehalem for its power savings, or if nVidia gives me Optimus for the desktop (or ATI does something similar).
 
Last edited:

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
this is why I only have it on for 12 hours out of 24 hours per day.
I put it to sleep whenever I walk away from it.


I have optimized those to near perfection; please stop making assumptions about me.
The only way I can get it lower is if I switch to a Nehalem for its power savings, or if nVidia gives me Optimus for the desktop (or ATI does something similar).

Well, maybe you should just sell your desktop and get a laptop? My Asus UL30VT uses less than 30 watts on load, including with the nVidia GPU enabled... I can play current games on it. That also includes the display. Seems like that would be the perfect machine for you. 30 watts load, 2-3 watts idle and less than .1 watt in sleep mode. Sounds like you have a solution.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Laptops are:
1. Vastly inferior hardware
2. Super uncomfortable keyboard
3. Vastly more expensive hardware
4. Preassembled, I want to build it from parts I bought.
5. Loud and hot if they are any good
6. Have way too few ports
7. Don't fit neatly under my desk
8. Don't typically have DVI ports for a real monitor.

I hate laptops; I don't want to use one for my day-to-day stuff (I have a laptop, BTW, which I carry with me to class / when I travel).

I don't see how buying this: http://www.google.com/products/cata...og_result&ct=result&resnum=4&ved=0CDAQ8wIwAw#
for $720 is supposed to be a "solution"
 
Last edited:

pmv

Lifer
May 30, 2008
13,787
8,685
136
I agree it would be a nice option, but an idle computer with proper power savings really doesn't consume that much. If you're that worried about a few watts, you had better be unplugging most electronics in your house when not in use, as they can consume small amounts of power even when off. Let's not forget to shut off every light you don't need on, etc.

Hmmm, I do actually do just that. In fact the 'standby' power consumed by various gadgets adds up to quite a bit, I find (two PCs and a stereo plugged into the same point with a surge protector use > 40 W even when they are all switched off, if the power-point power-measuring widget is to be believed). I mean, why leave 40 watts going all night or when you are out of the house? Why not just turn it off at the plug? (Or unplug it, if you are in the US where I guess you don't have point switches.)
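
For scale, a rough sketch of what 40 W of round-the-clock standby draw costs per year at the 26 cents/kWh rate quoted earlier in the thread (substitute your own rate):

standby_watts = 40
rate_per_kwh = 0.26                                # rate quoted earlier in the thread; local rates vary
kwh_per_year = standby_watts / 1000 * 24 * 365     # ~350 kWh/year
print(round(kwh_per_year * rate_per_kwh, 2))       # ~91 ($/year)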

I'm not convinced switching to the IGP is as significant for desktops as it is for laptops, though. I tend to switch to a lower-spec and quieter machine for non-demanding uses anyway.

(I prefer 'hibernate' to 'standby')
 
Last edited:

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
No it doesn't; it says the AC uses 60 W on average to remove 30 W of heat.

And no, I am not trolling you... do you believe me to be wrong? I am basing it on having read that it takes 3x as much energy to cool as it does to heat... so:
it takes 3x watts + x watts when it's hot; when it's cold it takes x watts but saves x watts on heating... (not exactly, but close enough).
Now, how much of the year do you run cooling and how much heating? For me it's about 8 months of cooling and 4 of heating, so it comes out to 2/3 * (3x + x) + 1/3 * (x - x) = 2.7x watts... x is 30.
If you do a more normal 6 months of cooling and 6 months of heating, it comes out to 2.5x.

So I guess I was off in multiplying it by 3; I should have multiplied it by 2.5 or 2.7...
BTW, to claim it takes 120 W to remove 30 W of heat would be a result of 5x, and I never claimed 5x... And to get the average monthly electricity requirement (which is what you were getting at), you do 2.7 - 1 = 1.7x; if x = 30 W, then it takes 1.7 * 30 W on average to get rid of it, which is 51 W... but only because you are averaging -30 W for 4 months of the year and +90 W for 8 months.
Anyway, this is going off on a wild tangent... the point is, total yearly cost is 2.5x to 2.7x the base amount once you account for cooling, if you start from cooling taking 3 times as much energy.

I don't recall the source for the claim that it takes 3 times as much energy to remove heat via conventional AC as it does to produce it; it could be wrong, and I would be glad to be shown that it is. But there is no trolling intended here.


If you use heating for 6 months of the year, then multiply the cost by 2.5.
And yes, I was looking at a worst-case scenario... worst case being "anyone living in Hawaii"... BTW, Hawaiians don't need to heat their houses 6 months out of the year...
Maybe you live somewhere where electricity is 7 cents a kWh and you need to heat your house 12 months a year... lucky you. The only benefit to you is the noise reduction.

Your math is so far from being right that it's not funny.

You said that it costs 2x the energy cost, to account for the energy itself and what's needed to cool it off. If you say it costs $34 to run 30 W for a year, and another ~$68 to cool it for the half of the year it even needs cooling, then you are saying it takes 4x the cost to cool it, so 120 W. When you add 2x to the yearly cost, that's adding 4x to the half of the year where you actually have to cool it. See how that doesn't add up?

Also, you're a fool if you believe it costs 3x the electricity to cool a specified amount. It takes 300 W used by an AC to counter 100 W worth of heat? Yeah right. That would mean a house with, say, 250 W worth of lighting and 750 W of mixed electronics/appliances would need 3 kW of cooling year-round for that alone! Not to mention the many, many kW the sun pumps into your house for hours all day long in the summer. The energy use of even a quad-SLI GTX 480 rig pales in comparison.

Whoever or whatever told you that AC requires 3x the energy to remove X heat is g*%&^n retarded. ACs are usually a lot more than 100% efficient, meaning that to cool 30 W, it will need a lot LESS than 30 W to do so, not 3x more.
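
Spelling out what the quoted figures imply, under your own half-a-year-of-cooling assumption (rough sketch using the $34/year and $102/year numbers from the quoted math):

base_cost = 34.16      # yearly cost of running the 30 W itself (quoted figure)
claimed_total = 102.49 # quoted figure after the "x3 for cooling" step
extra_for_cooling = claimed_total - base_cost      # ~$68 attributed to cooling
heat_cost_in_cooling_season = base_cost / 2        # only half the year's 30 W falls in the cooling season
print(extra_for_cooling / heat_cost_in_cooling_season)   # ~4.0 -> ~120 W of AC power per 30 W of heat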
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You said that it costs 2x the energy cost, to account for the energy itself and what's needed to cool it off. If you say it costs $34 to run 30 W for a year, and another ~$68 to cool it for the half of the year it even needs cooling, then you are saying it takes 4x the cost to cool it,

What are you talking about? 1 + 2 = 3; you are saying 1 + 2 = 4... I don't get it.

ACs are usually a lot more than 100% efficient
I am pretty sure over 100% efficiency violates some of the most fundamental laws of physics. Such as conservation of energy.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Come on, taltamir, you've got to at least admit that it's really highly unlikely that you'd need 240 W from an air conditioner to nullify the heat dissipated by one small light bulb. If that were true, I'd need more than a kW to cool the lighting in my living room alone. I mean, I don't have to make complicated calculations to see that that's just not possible :)
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I am pretty sure over 100% efficiency violates some of the most fundamental laws of physics. Such as conservation of energy.

This is true... That is why we have window units for A/C, and why central air has its unit placed outside, or has the heat ducted outside - to vent out the heat that is generated from running it.

However, there is no way that 30 watts of heat takes an extra 100+ watts of energy to counteract.

And, in addition to that - for people who have ALL FOUR SEASONS, this would be a wash... For 6 months it costs less to heat your house; for 6 months it costs more to cool it.

I really enjoy your equations and whatnot, but they just don't reflect real-world scenarios... And the vast majority of the population doesn't live in HI. It is somewhat pointless to bring up figures that would not apply to the vast majority of people in the United States.

And I saw that you replied to my other post, so I will just say this: you are a fanatic about power savings. It is like a friend of mine who calculated how his Kindle would save him money in the end. He justified it with his elaborate calculations, only for it to get stolen 1 month after he had it... Oops, there goes the equation.

And here it is - if, and I truly mean if (this is purely hypothetical, but here goes)... What if this new technology that nVidia has suffers from their solder-joint fiasco? So now you have a motherboard that would otherwise last a solid 3+ years, but instead lasts 1 year and has to be replaced... Throws off the equation, doesn't it? And yes, that is hypothetical. But there is a reason why a budget has to have a 'slush' fund... Nothing goes as planned.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Come on, taltamir, you've got to at least admit that it's really highly unlikely that you'd need 240 W from an air conditioner to nullify the heat dissipated by one small light bulb. If that were true, I'd need more than a kW to cool the lighting in my living room alone. I mean, I don't have to make complicated calculations to see that that's just not possible :)


240 / 3 = 80 watts... not so small a bulb. My CFLs draw 13 watts each.

Let's take a pretty average home... 2,000 kWh a month.
66.7 kWh a day.
2.7 kWh/h = roughly 2,700 watts' worth of stuff on at all times (obviously things are not on at all times, so it's higher during the day and lower at night)...

Assuming 3x cooling for every x dissipated, you take that and divide by 4 for about 675 watts' worth of appliances at any given time... Don't forget you need to account for lower consumption at night, but overall it's not an impossible figure. I know my parents' house and my own: they are closer to 3,000 kWh a month and I am at about 1,000 kWh a month, and the power draw of all the appliances that are running should actually be lower than that. The bulbs are all CFL, the computers all use sleep mode extensively, etc. So a good amount of this is actually the AC removing heat produced by our own bodies rather than... the numbers seem perfectly reasonable to me.
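
Here is the same estimate as a quick sketch; the 2,000 kWh/month "average home" and the 3:1 cooling figure are both assumptions of mine:

monthly_kwh = 2000
avg_watts = monthly_kwh * 1000 / (30 * 24)   # ~2,780 W average continuous draw (the ~2,700 W above, to rounding)
appliance_watts = avg_watts / 4              # with 3 W of AC per 1 W dissipated -> ~695 W (the ~675 W above)
print(round(avg_watts), round(appliance_watts))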

And I saw that you replied to my other post, so I will just say this: you are a fanatic about power savings. It is like a friend of mine who calculated how his Kindle would save him money in the end. He justified it with his elaborate calculations, only for it to get stolen 1 month after he had it... Oops, there goes the equation.
Ouch, that must have sucked...

And here it is - if, and I truly mean if (this is purely hypothetical, but here goes)... What if this new technology that nVidia has suffers from their solder-joint fiasco? So now you have a motherboard that would otherwise last a solid 3+ years, but instead lasts 1 year and has to be replaced... Throws off the equation, doesn't it? And yes, that is hypothetical. But there is a reason why a budget has to have a 'slush' fund... Nothing goes as planned.
I always take the unexpected into account... even if I calculate "this will save me X", I am cautious about making a switch; I only do so if it's a truly large amount.

The thing about half a year of cooling and half a year of heating: it's only a wash if cooling x watts takes exactly x watts, so that using an x-watt device means you spend x watts less on heating in winter...

In such a case you would get (x + x)/2 + (x - x)/2 = 2x/2 + 0/2 = x.
If it takes more than x watts to remove x watts dissipated into your house, then the equation changes... to the point where I calculated total yearly consumption above to be 2.5x (including the cost of consumption, the cost of cooling, and the reduction in heating cost).

That 3x-to-remove-x figure could be wrong, though.
And people who don't live in Hawaii pay less per kWh... although they still pay a decent amount of money...

As for the nVidia soldering issue... it can happen, but choosing a more energy-efficient solution does not make you any more likely to be affected by such an issue.
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
ACs are usually a lot more than 100% efficient
I am pretty sure over 100% efficiency violates some of the most fundamental laws of physics. Such as conservation of energy.

This needs clarifying... AC efficiency is not measured in percent.
A 100% efficient AC unit would consume no energy at all while moving heat against its heat gradient.
A greater-than-100% efficient AC unit would PRODUCE energy while moving heat against its heat gradient...

Both are physically impossible.

Watt is a measurement of power.
Joule and BTU are measurements of energy.
AC efficiency is measured as the energy it moves / (power * time),
or (energy/time)/power.

Example: (BTU/h)/watts, which is known as the "seasonal energy efficiency ratio" (SEER).

This is not a percentage... Hmm, actually I've got some interesting numbers; I can use those to calculate whether or not the 3x figure is correct, and if not, what a correct figure would be. BRB
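
For reference, here is a rough sketch of the kind of check that works out to, assuming a SEER of 13 (a typical modern central unit; no specific rating was given in the thread). SEER is roughly BTU of heat moved per watt-hour of electricity, and 1 W of heat is about 3.412 BTU/h, so the ratio of heat moved to electricity used (the COP) is roughly SEER / 3.412:

seer = 13.0                          # assumed typical central-AC rating, not a number from the thread
btu_per_hour_per_watt = 3.412        # 1 W of heat is about 3.412 BTU/h
cop = seer / btu_per_hour_per_watt   # ~3.8 units of heat moved per unit of electricity used
heat_watts = 30
print(round(heat_watts / cop, 1))    # ~7.9 W of AC power to remove 30 W of heat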