Can anyone explain to me why increasing the voltage would decrease the lifespan of a CPU, assuming the temperatures are kept low?
For example, I have my Barton 2500+ clocked at 2500MHz @ 1.825V (36C idle, 43C under load). I'm catching a lot of flak from people telling me it's not going to last very long that way, but no one has been able to give me a reasonable explanation as to why. I'm looking for a proper (read: scientific) explanation here.
The only risk I'm aware of is electromigration between gates/paths burning new paths and thereby F'ing the chip, but that shouldn't really happen if it's kept cool enough, right? (within a reasonable voltage range, of course)
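In case it helps frame an answer: the usual first-order model people cite for electromigration lifetime is Black's equation, which (as I understand it, with generic textbook parameters rather than anything Barton-specific) looks roughly like

\[
  \mathrm{MTTF} = A \, J^{-n} \exp\!\left(\frac{E_a}{k_B T}\right)
\]

where J is the current density (which rises with voltage), n is an empirical exponent often quoted around 2, E_a is an activation energy for the interconnect metal, k_B is Boltzmann's constant, and T is the temperature. If that model is right, keeping T low helps, but the J^{-n} factor suggests higher voltage/current still cuts into the lifetime even when the chip stays cool. Does that match the actual physics, or am I missing something?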