Higher CPU voltage and lifespan

Abzstrak

Platinum Member
Mar 11, 2000
2,450
0
0
Can anyone explain to me why increasing the voltage would decrease the lifespan of a CPU, assuming the temperatures are kept low?

For example, I have my Barton 2500+ clocked at 2500MHz @ 1.825V (36C idle, 43C under load). I'm catching a lot of flak from people telling me it's not going to last very long that way, but no one has been able to provide a reasonable explanation as to why. I'm looking for a proper (read: scientific) explanation here.

The only risk I'm aware of is electromigration between gates/paths burning new paths and thereby F'ing the chip, but that shouldn't really happen if it's kept cool enough, right? (Within a reasonable voltage range, of course.)
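For reference, electromigration lifetime is commonly modeled with Black's equation, MTTF = A * J^-n * exp(Ea / kT), which is strongly temperature-dependent. A rough sketch below; the prefactor, exponent, and activation energy are illustrative placeholders, not real process data:

```python
import math

def black_mttf(j, t_kelvin, a=1.0, n=2.0, ea_ev=0.7):
    """Black's equation for electromigration MTTF:
    MTTF = A * J^-n * exp(Ea / (k*T)).
    A, n, and Ea here are illustrative placeholders, not real silicon data."""
    k_ev = 8.617e-5  # Boltzmann constant in eV/K
    return a * j ** (-n) * math.exp(ea_ev / (k_ev * t_kelvin))

# Relative lifetime at the same current density: a 43C die vs a hotter 60C die
ratio = black_mttf(1.0, 273.15 + 43) / black_mttf(1.0, 273.15 + 60)
print(f"Keeping the die at 43C instead of 60C -> ~{ratio:.1f}x longer MTTF")
```

Note the exponential temperature term: with these placeholder constants, a modest temperature drop stretches the modeled lifetime severalfold, which is why "keep it cool" does carry real weight. Voltage still enters indirectly, though, since higher voltage raises current density J.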
 

borealiss

Senior member
Jun 23, 2000
913
0
0
You're also increasing the potential across the gate oxide layer, which can lead to faster dielectric breakdown over time. This usually depends on how many angstroms thick the oxide layer is; CPUs are rated for a certain lifetime, often in correspondence with this spec.
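One way to put rough numbers on this is the so-called E-model for time-dependent dielectric breakdown (TDDB), where breakdown time falls off exponentially with the oxide field. The gamma value, oxide thickness, and nominal voltage below are all hypothetical placeholders, purely for illustration:

```python
import math

def tddb_lifetime_ratio(v_new, v_nom, t_ox_nm, gamma=2.0):
    """E-model for time-dependent dielectric breakdown:
    t_BD ~ exp(-gamma * E_ox), with the field E_ox = V / t_ox in MV/cm.
    gamma, t_ox, and the voltages are illustrative placeholders."""
    e_nom = v_nom / (t_ox_nm * 1e-7) / 1e6  # oxide field in MV/cm
    e_new = v_new / (t_ox_nm * 1e-7) / 1e6
    return math.exp(-gamma * (e_new - e_nom))

# e.g. a hypothetical 3 nm oxide, 1.65 V nominal pushed to 1.825 V
r = tddb_lifetime_ratio(1.825, 1.65, 3.0)
print(f"Modeled oxide lifetime shrinks to ~{r:.0%} of nominal")
```

The exact numbers depend entirely on the process parameters, but the shape of the curve is the point: because the dependence is exponential in the field, even a modest overvolt eats into the rated oxide lifetime disproportionately.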
 

GoHAnSoN

Senior member
Mar 21, 2001
732
0
0
By the time your CPU goes down because of OC effects, it's time to buy a newer CPU anyway.
So don't bother. (As long as you don't burn it, that is.)
 

Evadman

Administrator Emeritus
Elite Member
Feb 18, 2001
30,990
5
81
The higher voltage causes electromigration, IIRC. It's been quite a while.
 

maluckey

Platinum Member
Jan 31, 2003
2,933
0
71
In the end, as stated above, it won't matter unless you melt the CPU down. After three years it won't be fast enough to run "new" applications.

My favorite CPU has been OC'ed waaaaay over default for over a year now, and requires the same voltage today as the first day I OC'ed it. It runs at 1.825 volts. I have had other CPUs running continuous high OCs for over two years before I upgraded.

In short, I don't think a high OC will really mean much unless you need to guarantee the CPU will last over three years. I can't say the same for other components, though.
 

Pulsar

Diamond Member
Mar 3, 2003
5,224
306
126
I'm not going to go into the technical side of this, the basics were mentioned above.

I'll keep it very simple. It's luck of the draw on whether your chip will last 6 months or 6 years at a high voltage. Do you have the cash to wager on it? If so, then don't worry about it.
 

uart

Member
May 26, 2000
174
0
0
Originally posted by: GoHAnSoN
By the time your CPU goes down because of OC effects, it's time to buy a newer CPU anyway.
So don't bother. (As long as you don't burn it, that is.)


To some extent that's true. However, if you are interested in system reliability, then it is not possible to separate the statistical "lifetime" from the probability of imminent failure in any given time frame. What I mean is that the "(as long as you don't burn it, that is)" clause is not just a loosely related afterthought; it is at the crux of the issue.

Quite simply, if one reduces the MTTF (mean time to failure) from, say, 25 years down to 6 years by over-volting and over-clocking, then it is somewhat misleading to call this totally irrelevant just because you don't plan to use the CPU for more than 2 or 3 years anyway. The probability of failure within any given timespan is mathematically related to the MTTF, so if you reduce the MTTF you also increase the probability of failure at every point in time.
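That relationship can be made concrete with the simplest reliability model, a constant failure rate (exponential distribution), where P(failure by time t) = 1 - exp(-t / MTTF). Using the 25-year and 6-year figures above over a 3-year ownership window:

```python
import math

def p_fail(t_years, mttf_years):
    """Probability of failure within t, assuming a constant
    failure rate (exponential model): 1 - exp(-t / MTTF)."""
    return 1.0 - math.exp(-t_years / mttf_years)

# Chance the chip dies within a 3-year ownership window
for mttf in (25, 6):
    print(f"MTTF {mttf:2d} yr -> P(fail within 3 yr) = {p_fail(3, mttf):.1%}")
```

So even though you only keep the chip 3 years either way, cutting the MTTF roughly fourfold more than triples your odds of seeing a failure during those 3 years. The constant-failure-rate assumption is a simplification (real wear-out failure rates rise with age, which only strengthens the point).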

IMHO the major determining factor in your choice to overclock or not is really whether you can afford the possible downtime or potential data loss that a failure may bring.


To answer the original question: if you just overclock (without a voltage increase), then I believe this has minimal impact on MTTF, provided you keep the temperature well under control. Increasing the voltage is a bit of a double whammy, however. It increases the power draw by the square of the relative voltage rise, and it also increases the probability of catastrophic failure. Personally I don't like to increase voltages too much if I can avoid it.
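The voltage-squared point follows from the standard dynamic power relation P ~ C * V^2 * f. A quick sketch, assuming (hypothetically) a stock 1.65 V / 1833 MHz Barton pushed to the 1.825 V / 2500 MHz mentioned in the opening post:

```python
def relative_power(v_new, v_nom, f_new, f_nom):
    """Dynamic CPU power scales roughly as C * V^2 * f, so the
    power relative to stock is (V_new/V_nom)^2 * (f_new/f_nom).
    The stock figures used below are an assumption, not vendor data."""
    return (v_new / v_nom) ** 2 * (f_new / f_nom)

# Hypothetical stock 1.65 V / 1833 MHz vs the OP's 1.825 V / 2500 MHz
print(f"~{relative_power(1.825, 1.65, 2500, 1833):.2f}x the stock dynamic power")
```

Note how the voltage term alone contributes more than a 20% increase here before the frequency bump is even counted, and all of that extra power ends up as heat the cooler has to move.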

Also be aware that when you increase the power consumption of a CPU (due to over-frequency and/or over-volting), even if you beef up the cooling system so the thermistor (or otherwise monitored) temperature stays unchanged, the critical internal "transistor junction" temperatures will still be higher, because higher power implies steeper thermal gradients.
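That last point can be sketched with a simple thermal-resistance model, T_junction = T_monitored + P * theta, where theta is the die-to-sensor thermal resistance. The theta and wattage figures below are placeholders chosen only to show the shape of the effect:

```python
def junction_temp(t_sensor_c, power_w, theta_c_per_w=0.3):
    """Simple thermal-resistance model: the junction runs hotter than
    the monitored point by P * theta. theta (C/W) is a placeholder,
    not a measured die-to-thermistor value."""
    return t_sensor_c + power_w * theta_c_per_w

# Same monitored 43 C reading, hypothetical ~68 W stock vs ~110 W overvolted
print(f"Stock:      junction ~{junction_temp(43, 68):.0f} C")
print(f"Overvolted: junction ~{junction_temp(43, 110):.0f} C")
```

The takeaway: an identical reading on the monitoring software does not mean identical silicon temperatures. At higher power the junction sits further above the sensor, so the margin the monitored number suggests is partly an illusion.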