Is "degradation" real or a myth?

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Has anyone actually destroyed a Sandy Bridge CPU via "degradation"?

A lot of people talk about it all over the place, but I have yet to hear from someone or see direct evidence that it exists.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
It's real. I've had to downclock a couple of systems that ran for years on overclocks. I remember my A64 3000+ had to be dialed down or given more voltage after 3-4 years.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
It's real. I've had to downclock a couple of systems that ran for years on overclocks. I remember my A64 3000+ had to be dialed down or given more voltage after 3-4 years.
Thanks for that. :)
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,256
16,113
136
Yes, it's real, but it usually takes years to see the effects on a max overclocked (mild vcore) system. That is, when you run them 24/7 @ 100% load like I do.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Has anyone actually destroyed a Sandy Bridge CPU via "degradation"?

A lot of people talk about it all over the place, but I have yet to hear from someone or see direct evidence that it exists.

It's real, and it's generally a slow process unless you're reckless. I've killed 2 870s from overvolting, but it took both of them nearly a year and a half to die. I've also had a 300A degrade; that one took 2 years.

In any case, it's one of the risks that comes with OC'ing... generally, by the time my chips had died I was ready to upgrade anyway.
 
Dec 30, 2004
12,553
2
76
It's real. I've had to downclock a couple of systems that ran for years on overclocks. I remember my A64 3000+ had to be dialed down or given more voltage after 3-4 years.

Could be the motherboard getting old.
I had what I thought was degradation: an E2180 (stock 2GHz) running 3.4GHz at 1.5v.
Turned out the board was getting old. The voltage, set at 1.48, would drop to 1.47 or 1.46 under heavy load and started giving me BSODs. Bumped the voltage up to 1.5025 so it would only drop to 1.48v, and it was good to go again for years.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Has anyone actually destroyed a Sandy Bridge CPU via "degradation"?

A lot of people talk about it all over the place, but I have yet to hear from someone or see direct evidence that it exists.

Degradation doesn't equal destruction.

That's the first problem with your OP.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
I believe it. Unlike the human body, where the more abuse you give it the stronger it gets (you know, "no pain no gain"), the more abuse/use you give something mechanical or electrical, the weaker it gets.

Think about the highly clocked Bloomfields people have had fail. Sitting at 4.4GHz for years, then they can't even maintain 4.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Degradation is a process by which something gets destroyed.

Besides, if it's as bad as people say it is, people's CPUs should be blowing up by now.

Yeah... no, you don't seem to understand CPU degradation.

And no one here has said it'll kill your chip or make it blow up.
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,599
1
81
Degradation exists, but most people do not understand what it means.

It means a chip needs more voltage to maintain the same frequencies.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
A friend's 45nm C2D E5200 chip degraded after a few years in an IP35-E, running 3.75Ghz and 1.425v (BIOS). CPU-Z voltage under load was right around 1.4v.

It ran fine for some time, but then my friend stopped running DC for me, so the CPU voltage sat at 1.425v instead of 1.4v most of the time, which is what I think degraded it.

He was also getting BIOS temp warnings after a year or so, so we had to clean out the dust bunnies. The elevated temps might also have had something to do with it.

It degraded again.

Eventually, it was running 3.5Ghz, at something like 1.35v. Then it was replaced by a Q8200.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Degradation exists, but most people do not understand what it means.

It means a chip needs more voltage to maintain the same frequencies.

This has been explained more than a dozen times here recently, yet there's still this taboo with some people thinking it means your machine will spontaneously burst into fire and burn down your house.

If the degradation is sufficiently high, the chip may have problems attaining any kind of overclock and may even be unstable at stock frequencies, requiring an underclock. Of course, since degradation is something gradual and depends on voltage and temperatures, your mileage may vary. Just remember that the higher you overvolt, the higher degradation becomes over time.
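The voltage dependence described above can be made concrete with a rough reliability-engineering sketch. This is only an illustration: the exponential voltage-acceleration model is a common textbook form, and the gamma value below is a made-up placeholder, not a figure for any real Intel or AMD process.

```python
import math

def voltage_acceleration(v_oc, v_stock, gamma=10.0):
    """Relative wear-rate multiplier under the common exponential
    voltage-acceleration model: AF = exp(gamma * (V_oc - V_stock)).
    gamma is a fitted, process-specific constant; 10/V here is a
    placeholder for illustration only."""
    return math.exp(gamma * (v_oc - v_stock))

# Example: stock 1.20v vs an overclock at 1.35v
af = voltage_acceleration(1.35, 1.20)
print(f"wear rate ~{af:.1f}x stock")  # ~4.5x stock wear rate
```

The point the sketch captures is that wear grows multiplicatively with added voltage, which is why a modest overvolt can degrade a chip over years while stock voltage typically outlives the platform.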
 

Matt1970

Lifer
Mar 19, 2007
12,320
3
0
I have seen some old 2.0GHz Celerons that degraded pretty quickly, but that was because we were running them at 3.0 and 3.2. After 6 months they either wouldn't overclock or would do strange stuff like the ethernet not working. I had a 3.4 EE that ran at 3.7 for 2 years and all of a sudden wouldn't even overclock to 3.5 no matter what you did to it. They all went back to stock speeds just fine, though.
 

TidusZ

Golden Member
Nov 13, 2007
1,765
2
81
I had an E4300 @ 3.2 degrade after ~1.5-2 years. Didn't bother testing the highest stable clock though, just put it back at stock.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I assume having my 2500K at 4.4 using 1.33v will have no real impact on its useful lifespan, since nearly 90% of the time it's just at 1.6GHz with an idle voltage of around 1.00v.
 

Endymion FRS

Member
Mar 29, 2012
69
0
66
What effect would overclocking an i5 3570K to 4.2 have if I don't mess with voltage? How much of its lifespan do I kill?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
What effect would overclocking an i5 3570K to 4.2 have if I don't mess with voltage? How much of its lifespan do I kill?
That will have basically no effect on its useful lifespan. Plus, I assume you are still using the power saving features, which means it's running idle clocks and voltage a great deal of the time.
 

MrDudeMan

Lifer
Jan 15, 2001
15,069
94
91
Not speaking for Intel, but speaking as an engineer who designs high speed analog I/O on an Intel CPU, electromigration is not only real but a huge amount of my design effort is spent making sure it isn't a problem. With that said, you are insane if you think an overclocked CPU should always work flawlessly, especially if you increase the voltage.

As for this:

It ran fine for some time, but then my friend stopped running DC for me, so that the CPU voltage went up from 1.4 to 1.425v most of the time, which is what I think degraded it.

I assure you that the last 25mV had very little to do with it. It was the other 300.
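Since electromigration came up: the classic back-of-envelope model for it is Black's equation, MTTF ∝ J⁻ⁿ · exp(Ea/kT). The sketch below is purely illustrative; the n and Ea values are generic textbook figures for interconnect metal, not numbers for any specific Intel process.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def black_mttf_ratio(j1, t1, j2, t2, n=2.0, ea=0.9):
    """Relative median time to failure from Black's equation,
    MTTF proportional to J**-n * exp(Ea / (k*T)).
    n ~ 1-2 and Ea ~ 0.9 eV are generic textbook values for
    interconnect metal, used here for illustration only.
    j1, j2 are relative current densities; t1, t2 are in kelvin."""
    mttf1 = j1 ** -n * math.exp(ea / (K_B * t1))
    mttf2 = j2 ** -n * math.exp(ea / (K_B * t2))
    return mttf1 / mttf2

# 20% more current and 75C vs 60C core temperature:
ratio = black_mttf_ratio(1.0, 333.0, 1.2, 348.0)
print(f"cooler/slower point lasts ~{ratio:.1f}x longer")  # ~5.6x
```

The exponential temperature term is why both the extra voltage (which raises current density) and the elevated temperatures mentioned earlier in the thread compound each other.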
 

Endymion FRS

Member
Mar 29, 2012
69
0
66
That will have basically no effect on its useful lifespan. Plus, I assume you are still using the power saving features, which means it's running idle clocks and voltage a great deal of the time.

So then it would be safe for a 24/7 overclock? Yes, I am using power saving, by the way.