Long-term effects of Nvidia Optimus on the GPU

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
I have been looking for the right laptop to replace my wife's aging Inspiron 1150 from about 5 years ago, and finally decided on a new Dell XPS 17. It has a Core i5 @ 2.53GHz and a GT 435M (GF108) with Optimus.

Assuming that Nvidia has the "bumpgate" issue fully fixed in the 400M series, doesn't Optimus turning the discrete GPU on and off all the time (many heating and cooling cycles during a single session of using the laptop) raise long-term reliability concerns, compared with a traditional machine where the GPU is active the whole time it's on? Even compared with older switchable graphics, which wouldn't cycle on and off nearly as often?
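
The rough model behind my worry is Coffin-Manson-style solder fatigue, where the number of thermal cycles a joint survives drops steeply as the temperature swing grows. A toy sketch in Python (every constant is invented for illustration, nothing measured on a GF108):

    # Coffin-Manson-style scaling: cycles to failure N_f ~ C * dT**(-q).
    # Every constant here is hypothetical, purely to show the shape of the curve.
    C = 1e9   # fitted material constant (invented)
    q = 2.0   # fatigue exponent, often quoted around 2 for solder joints

    def cycles_to_failure(delta_t):
        """Estimated thermal cycles to failure for a swing of delta_t deg C."""
        return C * delta_t ** -q

    # A dGPU that merely idles warm vs. one that cools fully between uses:
    print(cycles_to_failure(20))   # ~2.5e6 cycles for a 20 C swing
    print(cycles_to_failure(50))   # ~4.0e5 cycles for a 50 C swing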

If every other program cycles the machine between the discrete GPU and the IGP, does that raise a concern about the long-term viability of the chip in this solution?

I would apply the same question/concern to any future AMD chips that would have a similar ability (I'm sure AMD will have this capability in the coming quarters).

I'm not really concerned about it for the XPS as her next replacement will likely be much sooner than 5 years, but I just thought it was an interesting thing to consider.

Again, even assuming there is no bumpgate issue, what are people's opinions on this?
 
Last edited:

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
Could you explain your reasoning? I was under the impression that heating/cooling cycles are what's hardest on a chip, compared with staying continuously on or off.

Isn't this why the Xbox 360 gets the RROD? All that expanding/contracting causes some part of it to separate from the mainboard.

Of course, I may be wrong and welcome an explanation to the contrary.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
1. The Xbox suffered from several issues, not the least of which was inadequate cooling and improper design (similar to bumpgate with Nvidia).
2. Heating and cooling is a problem, but so is running current through a microchip at all (electromigration), and Optimus vastly reduces the amount of current running through it (toy sketch at the end of this post).
3. Different portions of chips are completely shut off in current designs anyway, so shutting off the whole chip rather than 90% of it when idle is not so different.
4. Heating/cooling is really a big issue only for improperly bumped chips.
5. Computers should and do handle repeated on/off cycles with no problem; it's a microchip, not a lightbulb.
etc.

If there is a fundamental defect in the design, such as bumpgate or the RROD problem, then Optimus will just cause it to fail a little earlier, but it would have failed regardless. If there isn't such a problem, then the extra wear from Optimus shouldn't really matter much: both the wear from repeated on/off cycles and the wear from running a current at all are negligible compared to the durability of a properly made microchip.
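
To put a toy number on point 2: the usual model is Black's equation, where electromigration MTTF scales as a negative power of current density and exponentially with temperature, so a dGPU that is gated off most of the time gains on both terms. A minimal sketch, with textbook ballpark constants rather than anything NVIDIA-specific:

    import math

    # Black's equation: MTTF = A * J**(-n) * exp(Ea / (k * T)).
    # Constants are textbook ballpark values, not data for any real GPU.
    K_BOLTZMANN = 8.617e-5  # eV/K

    def relative_mttf(j, temp_k, n=2.0, ea=0.7):
        """Electromigration MTTF up to the constant prefactor A."""
        return j ** -n * math.exp(ea / (K_BOLTZMANN * temp_k))

    # Gating the dGPU off most of the time lowers both its average current
    # density and its temperature, and both terms push MTTF up.
    always_on = relative_mttf(j=1.0, temp_k=350)   # warm, always carrying current
    mostly_off = relative_mttf(j=0.3, temp_k=320)  # Optimus: cooler, low duty cycle
    print(mostly_off / always_on)  # roughly 100x in this toy model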
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Usually, the so-called heatsink in a notebook is nothing more than a copper plate connected to a small radiator with a fan blowing at it. Both the CPU and the GPU are attached to this plate, meaning they actually share the heatsink. So the heat cycling won't be increased by Optimus any more than by running a program, since heat generated by the CPU is transferred toward the GPU, and vice versa. As long as the radiator isn't completely blocked by dust/dirt/fur/condoms, death from cooling cycles is probably rarer than breaking the display by opening and closing the lid.
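
You can see the smoothing with a crude lumped model: one shared plate with real thermal mass, a GPU toggling on and off on top of a steady CPU load. Every number below is invented for illustration, not measured on any notebook:

    # Crude lumped thermal model of a shared CPU/GPU cooling plate.
    # All numbers are invented for illustration.
    C_PLATE = 60.0  # J/K, thermal capacitance of the shared copper plate
    R_AMB = 0.5     # K/W, plate-to-ambient resistance through the radiator
    T_AMB = 30.0    # deg C, ambient

    def plate_temps(cpu_w, gpu_w_schedule, dt=1.0):
        """Step the plate temperature through a per-second GPU power schedule."""
        t = T_AMB + cpu_w * R_AMB  # start at the CPU-only steady state
        temps = []
        for gpu_w in gpu_w_schedule:
            power_in = cpu_w + gpu_w
            power_out = (t - T_AMB) / R_AMB
            t += (power_in - power_out) * dt / C_PLATE
            temps.append(t)
        return temps

    # GPU toggling 15 W on/off every 60 s on top of a steady 20 W CPU load:
    schedule = ([15.0] * 60 + [0.0] * 60) * 5
    temps = plate_temps(cpu_w=20.0, gpu_w_schedule=schedule)
    print(max(temps) - min(temps))  # single-digit swing with these toy numbers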

Edit for grammar (wording)
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Usually, the so-called heatsink in a notebook is nothing more than a copper plate connected to a small radiator with a fan blowing at it. Both the CPU and the GPU are attached to this plate, meaning they actually share the heatsink. So the heat cycling won't be increased by Optimus any more than by running a program, since heat generated by the CPU is transferred toward the GPU, and vice versa. As long as the radiator isn't completely blocked by dust/dirt/fur/condoms, death from cooling cycles is probably rarer than breaking the display by opening and closing the lid.

Good point. Optimus actually improves battery life (and, as a direct result, reduces overall heat production) compared to the IGP alone.
Battery life from worst to best: standalone GPU < IGP alone < Optimus (IGP + standalone GPU).
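
To make that ordering concrete with hypothetical wattages (plausible shapes only, nothing measured on real hardware):

    # Hypothetical average power draws against a 56 Wh battery; the
    # ordering mirrors the post's claim, and every wattage is invented.
    BATTERY_WH = 56.0

    draws_w = {
        "standalone GPU always on": 22.0,   # dGPU idles but never powers off
        "IGP alone": 14.0,                  # no dGPU at all
        "Optimus (dGPU gated off)": 13.5,   # dGPU fully powered down at idle
    }

    for config, watts in draws_w.items():
        print(f"{config}: {BATTERY_WH / watts:.1f} h")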
 

tyl998

Senior member
Aug 30, 2010
236
0
0
Usually, the so-called heatsink in a notebook is nothing more than a copper plate connected to a small radiator with a fan blowing at it. Both the CPU and the GPU are attached to this plate, meaning they actually share the heatsink. So the heat cycling won't be increased by Optimus any more than by running a program, since heat generated by the CPU is transferred toward the GPU, and vice versa. As long as the radiator isn't completely blocked by dust/dirt/fur/condoms, death from cooling cycles is probably rarer than breaking the display by opening and closing the lid.

Edit for grammar (wording)
How the heck does a condom get into a laptop to block the radiator? D:
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
1. The Xbox suffered from several issues, not the least of which was inadequate cooling and improper design (similar to bumpgate with Nvidia).
2. Heating and cooling is a problem, but so is running current through a microchip at all (electromigration), and Optimus vastly reduces the amount of current running through it (toy sketch at the end of this post).
3. Different portions of chips are completely shut off in current designs anyway, so shutting off the whole chip rather than 90% of it when idle is not so different.
4. Heating/cooling is really a big issue only for improperly bumped chips.
5. Computers should and do handle repeated on/off cycles with no problem; it's a microchip, not a lightbulb.
etc.

If there is a fundamental defect in the design, such as bumpgate or the RROD problem, then Optimus will just cause it to fail a little earlier, but it would have failed regardless. If there isn't such a problem, then the extra wear from Optimus shouldn't really matter much: both the wear from repeated on/off cycles and the wear from running a current at all are negligible compared to the durability of a properly made microchip.

thanks man!