Does the frequent change of CPU multiplier reduce the chip life span?

GundamF91

Golden Member
May 14, 2001
Most modern CPUs come with ways to "downclock" when CPU utilization is low. This is done by changing the multiplier, so the CPU frequently switches from one multiplier setting to another (within spec). This is different from older CPUs, which were always fixed at a single multiplier.

I'm wondering if this frequent multiplier change would shorten the life span of the processor, i.e. from 10 years to 5 years. This is like how constantly throttling an engine up and down would probably cause more issues than keeping it cruising at a fixed speed.

This is more of a theoretical question, and not about "CPUs will be obsolete in 5 years anyway, so who cares." Any thoughts on this?
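For context on why CPUs downclock at all: CMOS dynamic power scales roughly with voltage squared times frequency, so dropping both the multiplier and the core voltage saves a lot of power and heat. Here's a rough sketch of that relationship; all the numbers (capacitance, voltages, clocks) are illustrative assumptions, not measurements of any real chip:

```python
# Rough sketch of CMOS dynamic power: P ~ C_eff * V^2 * f.
# All numbers below are illustrative assumptions, not measured values.

def dynamic_power(c_eff, voltage, freq_hz):
    """Dynamic switching power in watts (P = C_eff * V^2 * f)."""
    return c_eff * voltage**2 * freq_hz

C_EFF = 1.0e-9  # effective switched capacitance (assumed), farads

full = dynamic_power(C_EFF, 1.35, 3.0e9)  # full multiplier, full Vcore
idle = dynamic_power(C_EFF, 1.10, 1.0e9)  # downclocked + undervolted at idle

print(f"full speed: {full:.2f} W, downclocked: {idle:.2f} W")
print(f"downclocked draws ~{idle / full:.0%} of full-speed dynamic power")
```

With these made-up numbers, the downclocked state burns roughly a fifth of the full-speed dynamic power, which is the whole point of features like Cool'n'Quiet and SpeedStep.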
 

taltamir

Lifer
Mar 21, 2004
In all my years in IT, not even on the oldest machines I have ever worked with, with or without constantly running SETI (100% CPU usage with practically nil usage of other components), have I ever seen a CPU break from wear and tear... (only extreme overheating due to overclocking, or physical impact damage)
And all the machines I run are on 24/7.
So if it DOES shorten the life span, it's more likely from 100 years to 50 years...

The power supply, the hard drives, the fans on the video card and the north bridge and CPU... everything else dies first... and when something dies on a 5+ year old machine, replacements are not worth it and it goes in the trash... (assuming you didn't trash it to begin with for being obsolete)

The car analogy is pointless. This isn't a car, it's a CPU. You are not throttling an engine... a light bulb analogy would have served better, because at least there you are using electricity. But analogies are pointless for this issue anyway.

Besides all of that... I see absolutely no reason why using less of the CPU would cause it to wear out faster; if anything, it's the other way around. There is no physical on/off switch to break, so I would think it will actually increase the lifespan by reducing the electricity and usage the parts of the CPU see.
 

Cogman

Lifer
Sep 19, 2000
CPU != Engine. This will not shorten the life of the CPU at all (if anything, it will increase it). Basically, CnQ is just a few transistor flips, and considering transistors can be flipped billions of times before failing, you have nothing to worry about from 6 or 7 flips a second.

It is not harsh on the CPU or anything like that; it's actually friendly, because it works the CPU less when it is not in use (as opposed to keeping a high clock on the thing while idling).

Overclocking will kill a CPU faster than anything else you could do to it, and even then a stable overclock will not halve the life of a CPU; it will just become unstable at half the CPU's life.
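To put some rough numbers on the "few flips a second" point above: even a decade of multiplier changes at 7 per second adds up to fewer transitions than a 3 GHz clock signal makes in a single second. A quick back-of-the-envelope sketch (the 7/sec and 3 GHz figures come from this thread; everything else is simple arithmetic):

```python
# Sanity check: ten years of multiplier switching vs. one second of clock
# toggling. The switch rate and clock speed are figures from the thread.

SWITCHES_PER_SEC = 7                    # multiplier changes per second
SECONDS_PER_YEAR = 365 * 24 * 3600
YEARS = 10

total_multiplier_flips = SWITCHES_PER_SEC * SECONDS_PER_YEAR * YEARS
clock_edges_per_second = 3_000_000_000  # a 3 GHz clock toggles 3e9 times/s

print(f"multiplier flips in {YEARS} years: {total_multiplier_flips:.2e}")
print(f"clock toggles in ONE second:      {clock_edges_per_second:.2e}")
```

Ten years of multiplier switching (~2.2 billion flips) is less activity than the clock network sees in one second of normal operation, so the switching itself is noise as far as wear goes.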
 

TC91

Golden Member
Jul 9, 2007
I highly doubt the frequent multiplier change is harmful to current CPUs. I like having my CPU throttle its multiplier and voltage at idle (I have SpeedStep and C1E on, since they are stable with my overclock), so my CPU should live longer than if it had 1.5125V going through it all the time.
 

myocardia

Diamond Member
Jun 21, 2003
Originally posted by: GundamF91
This is like how constantly throttling an engine up and down would probably cause more issues than keeping it cruising at a fixed speed.

Do you have your CPU underclocked? If not, it isn't at cruising speed while web browsing with Cool'n'Quiet or SpeedStep disabled, it's at redline. You don't have your car's engine set to idle @ 5,000 RPM, do you? Do you think it would last longer if you did? :D
 

taltamir

Lifer
Mar 21, 2004
myocardia makes a very good analogy... the CPU is normally pushed to the redline by default, because it needs to deliver the best performance at stock speeds for the average consumer... so underclocking it is like taking a car that has been set to idle at 5,000 RPM and telling it to drop to 500 RPM in neutral...
 

jonmcc33

Banned
Feb 24, 2002
Originally posted by: taltamir
The car analogy is pointless. This isn't a car, it's a CPU. You are not throttling an engine... a light bulb analogy would have served better, because at least there you are using electricity. But analogies are pointless for this issue anyway.

Cars wear out because they have moving parts. A CPU should last forever if you don't kill it by overclocking.

 

sonoran

Member
May 9, 2002
An interesting question, but I don't think anyone has an answer. Changing temperatures due to clock speed changes may introduce some sort of thermal stresses inside the chip. It's no secret that different materials expand and contract at different rates with changes in temperature. Will these constant temperature changes, and constant stresses between the various chip materials, eventually lead to some sort of failure inside the chip? About the only answer I can provide to that is that both manufacturers still offer a 3 year warranty - so they must be satisfied that it won't have a significant effect at least over that time period.
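There is actually a standard way reliability engineers model exactly the failure mode sonoran describes: the Coffin-Manson relation, where the number of thermal cycles to failure scales as a power of the temperature swing per cycle. This is a generic model from the reliability literature, not something anyone in this thread measured, and the exponent and temperature swings below are illustrative assumptions only:

```python
# Coffin-Manson style comparison: cycles-to-failure scale as (dT)^-n.
# The exponent n and the temperature swings are illustrative assumptions.

def relative_life(delta_t, delta_t_ref, n=2.0):
    """Cycle life relative to a reference temperature swing."""
    return (delta_t_ref / delta_t) ** n

# Downclocking at idle means shallower idle<->load temperature cycles.
deep    = relative_life(delta_t=40, delta_t_ref=40)  # 40 C swings, baseline
shallow = relative_life(delta_t=20, delta_t_ref=40)  # 20 C swings

print(f"halving the temperature swing -> ~{shallow:.0f}x the cycle life")
```

The interesting wrinkle for this thread: downclocking adds *more* cycles, but it also makes each cycle *shallower* (the idle temperature drops), and under this kind of model the swing depth matters more than the cycle count.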
 

taltamir

Lifer
Mar 21, 2004
The reason they don't offer a 20-year warranty on CPUs (which would be appropriate, considering how long they last) is that they wouldn't be able to find a replacement chip for you after so long and would be forced to give you a better one... And the only reason your CPU would break in even 20 years is physical damage or overclocking... This is more of a "hypothetical" debate, as the author said: not whether you COULD kill a CPU doing this, but rather how many of the unknown years of operation a CPU could last are shaved off by this feature.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Wow, I cannot believe someone is comparing a mechanical part to an electrical part... Two very different things.
 

GundamF91

Golden Member
May 14, 2001
I can see why the comparison between a mechanical part (car engine) and an electrical part (processor) isn't really applicable. That scenario is just what got me thinking about this.

As mentioned above, it would be more applicable to compare a light bulb at a certain voltage vs. a CPU at a certain voltage. Higher voltage gives you a brighter bulb or a faster CPU, and reducing it results in less performance, so it's a trade-off. What I'm wondering is whether this constant reduction and increase in voltage would actually stress the bulb/CPU more, simply because of the change. I know from experience that brightness-adjustable light bulbs don't seem to last as long as standard incandescent bulbs.
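For the bulb side of this comparison, there is a well-known rule of thumb: incandescent lamp life scales very steeply with applied voltage, roughly as (V_rated/V)^k with k around 13 (the exponent varies by source; 13 is a commonly quoted textbook value). A quick sketch, with 120 V assumed as the rated voltage:

```python
# Incandescent bulb rule of thumb: life ~ (V_rated / V)^k, with k ~ 13.
# The exponent is a commonly quoted approximation, not an exact law.

def relative_bulb_life(v, v_rated=120.0, k=13.0):
    """Bulb life relative to running at rated voltage."""
    return (v_rated / v) ** k

dimmed = relative_bulb_life(108.0)  # a 120 V bulb run at 90% voltage
print(f"at 90% voltage the bulb lasts roughly {dimmed:.1f}x as long")
```

So by this rule of thumb, running a bulb dimmed actually extends its life dramatically; the shorter life of adjustable bulbs would have to come from something else, like the dimmer electronics or cheaper construction.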

 

bradley

Diamond Member
Jan 9, 2000
I agree, mechanical and electrical are two totally different beasts. But I believe the mostly negligible detriment of increased switching is more than offset by running your processor at a lower Vcore and multiplier. Generating less heat, and dissipating it better, will in general also increase the longevity of electronic parts.
 

JackMDS

Elite Member
Super Moderator
Oct 25, 1999
20 years of building and maintaining PCs.

Once or twice a CPU died for no apparent cause.

Otherwise, the so-called "life of a CPU," whether OC'd or not, is a few times longer than its functional life.

I.e. (as an example), if OC reduces the life of a CPU from 20 years to 10, functional computing would end the usefulness of the CPU in 3-4 years at best.
 

myocardia

Diamond Member
Jun 21, 2003
Originally posted by: GundamF91
I know from experience that brightness-adjustable light bulbs don't seem to last as long as standard incandescent bulbs.

That's because of the cheap elements they use in those bulbs. I once ran a normal 100-watt incandescent light bulb for well over a year without ever turning it off, using a dimmer switch. Try burning a 100-watt incandescent bulb 24/7 and see how long it lasts without a dimmer (dimmers just reduce the voltage to the bulb).