
Is "degradation" real or a myth?

Has anyone actually destroyed a Sandy Bridge CPU via "degradation"?

A lot of people talk about it all over the place, but I have yet to hear from someone or see direct evidence that it exists.
 
It's real. I've had to downclock a couple of systems that ran for years on overclocks. I remember my A64 3000+ had to be dialed down or increased in voltage after a period of 3-4 years.
 
Yes, it's real, but it usually takes years to see the effects on a max-overclocked (mild vcore) system. That is, when you run them 24/7 @ 100% load like I do.
 
Has anyone actually destroyed a Sandy Bridge CPU via "degradation"?

A lot of people talk about it all over the place, but I have yet to hear from someone or see direct evidence that it exists.

It's real, and it's generally a slow process unless you're reckless. I've killed 2 870s from overvolting, but it took both of them nearly a year and a half to die. I've also had a 300A degrade; that one took 2 years.

In any case, it's one of the risks that comes with OC'ing... generally, by the time my chips had died I was ready to upgrade anyway.
 
It's real. I've had to downclock a couple of systems that ran for years on overclocks. I remember my A64 3000+ had to be dialed down or increased in voltage after a period of 3-4 years.

Could be the motherboard getting old.
I had what I thought was degradation: E2180, stock 2GHz, running 3.4GHz at 1.5v.
Turned out the board was getting old. Voltage set at 1.48 would drop to 1.47, then 1.46 under heavy load, and it started giving me BSODs. Bumped the voltage up to 1.5025 and it would only drop to 1.48v. It was good to go again for years.
 
Has anyone actually destroyed a Sandy Bridge CPU via "degradation"?

A lot of people talk about it all over the place, but I have yet to hear from someone or see direct evidence that it exists.

Degradation doesn't equal destruction.

That's the first problem with your OP.
 
I believe it. Unlike the human body, where the more abuse you give it the stronger it gets (you know, "no pain no gain"), the more abuse/use you give something mechanical or electrical, the weaker it gets.

Think about the highly clocked Bloomfields people have had fail. Sitting at 4.4GHz for years, then they can't even maintain 4GHz.
 
Degradation is a process by which something gets destroyed.

Besides, if it's as bad as people say it is, people's CPUs should be blowing up by now.

Yeah... no, you don't seem to understand CPU degradation.

And no one here has said it'll kill your chip or make it blow up.
 
Degradation exists, but most people do not understand what it means.

It means a chip needs more voltage to maintain the same frequencies.
 
A friend's 45nm C2D E5200 chip degraded after a few years in an IP35-E, running 3.75Ghz and 1.425v (BIOS). CPU-Z voltage under load was right around 1.4v.

It ran fine for some time, but then my friend stopped running DC for me, and the CPU voltage went up from 1.4 to 1.425v most of the time, which is what I think degraded it.

He was also getting BIOS temp warnings after a year or so, so we had to clean out the dust bunnies. The elevated temps might also have had something to do with it.

It degraded again.

Eventually, it was running 3.5Ghz, at something like 1.35v. Then it was replaced by a Q8200.
 
Degradation exists, but most people do not understand what it means.

It means a chip needs more voltage to maintain the same frequencies.

This has been explained more than a dozen times here recently, yet there's still this taboo, with some people thinking it means your machine will spontaneously catch fire and burn down your house.

If the degradation is sufficiently high, the chip may have problems attaining any kind of overclock and may even be unstable at stock frequencies, requiring an underclock. Of course, since degradation is something gradual and depends on voltage and temperatures, your mileage may vary. Just remember that the higher you overvolt, the higher degradation becomes over time.
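The "higher overvolt, higher degradation" relationship described above is often modeled with an exponential voltage-acceleration factor. A minimal sketch; the exponent `GAMMA` and the stock Vcore are assumed illustrative constants, not figures for any real process:

```python
import math

GAMMA = 10.0     # assumed voltage-acceleration exponent (1/V); illustrative only
V_STOCK = 1.10   # assumed stock Vcore in volts; illustrative only

def rel_wear_rate(vcore: float) -> float:
    """Wear-out rate relative to stock under a simple exponential
    voltage-acceleration model: rate ~ exp(GAMMA * (V - V_stock))."""
    return math.exp(GAMMA * (vcore - V_STOCK))

for v in (1.10, 1.20, 1.30, 1.40, 1.50):
    print(f"{v:.2f} V -> ~{rel_wear_rate(v):.1f}x stock wear rate")
```

The shape of the curve, not the exact numbers, is the point: each extra 100mV multiplies the wear rate rather than adding to it, which is why the same clocks at lower voltage age a chip much more slowly.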
 
I have seen some old Celeron 2.0GHz chips that degraded pretty quickly, but that was because we were running them at 3.0 and 3.2. After 6 months they either wouldn't overclock or would do strange stuff like the ethernet not working. I had a 3.4 EE that ran at 3.7 for 2 years and all of a sudden wouldn't even overclock to 3.5 no matter what you did to it. They all went back to stock speeds just fine, though.
 
I had an E4300 @ 3.2 degrade after ~1.5-2 years; didn't bother testing the highest stable clock, though, just put it at stock.
 
I assume having my 2500K at 4.4 using 1.33 will have no real impact on its useful lifespan, since nearly 90% of the time it's just at 1.6 with an idle voltage of around 1.00.
 
What effect would overclocking an i5 3570K to 4.2 have if I don't mess with voltage? How much of its lifespan do I kill?
That will have basically no effect on its useful lifespan. Plus, I assume you are still using power-saving features, which means it's running idle clocks and voltage a great deal of the time.
 
Not speaking for Intel, but speaking as an engineer who designs high speed analog I/O on an Intel CPU, electromigration is not only real but a huge amount of my design effort is spent making sure it isn't a problem. With that said, you are insane if you think an overclocked CPU should always work flawlessly, especially if you increase the voltage.

As for this:

It ran fine for some time, but then my friend stopped running DC for me, and the CPU voltage went up from 1.4 to 1.425v most of the time, which is what I think degraded it.

I assure you that the last 25mV had very little to do with it. It was the other 300.
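The engineer's point about electromigration can be made concrete with Black's equation, MTTF ∝ J^-n · exp(Ea/kT). A hedged sketch; `n`, `Ea`, and both operating points are generic textbook-style values, not data for this chip or any Intel process:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def em_mttf_rel(j_rel: float, temp_c: float, n: float = 2.0, ea_ev: float = 0.9) -> float:
    """Relative electromigration lifetime per Black's equation.
    j_rel is current density relative to stock (higher voltage pushes
    more current); n and ea_ev are assumed, typical fitting values."""
    return j_rel ** (-n) * math.exp(ea_ev / (K_B * (temp_c + 273.15)))

stock = em_mttf_rel(1.0, 60.0)        # assumed stock operating point
overclocked = em_mttf_rel(1.3, 80.0)  # ~30% more current, 20 C hotter (assumed)
print(f"overclocked lifetime ~{overclocked / stock:.2f}x of stock")
```

On this kind of model, a further 25mV on top of a 300mV overvolt barely moves the current density, which is exactly the point: the bulk of the overvolt does the bulk of the damage.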
 
That will have basically no effect on its useful lifespan. Plus, I assume you are still using power-saving features, which means it's running idle clocks and voltage a great deal of the time.

So then it would be safe for a 24/7 overclock? Yes, I am using power saving, by the way.
 