"Burning in"

Yossarian

Lifer
Dec 26, 2000
18,010
1
81
Anyone have links describing the benefits and methods? I know it involves running the CPU at max capacity for a long time to supposedly strengthen the electrical pathways (or something like that); I'd just like to find out some details.

Thanks in advance.
 

Renob

Diamond Member
Jun 18, 2000
7,596
1
81
I think burn-in is BS. I read an article put out by AMD and they said it does not help in getting a better overclock.
 

Rastus

Diamond Member
Oct 10, 1999
4,704
3
0
Just the fact that you are here probably means your CPU gets quite a workout on a regular basis. God only knows, all mine do. ;)
 

Keeksy

Member
Dec 25, 2000
67
0
0
Burn-in is generally used to see if your system is still stable after overclocking or installing new hardware. If your system is still up after a 24-hour, hi-res Quake loop, I'd say your new configuration works fine. Burning in simply tests your system for stability; if it isn't stable, you need to fix something.
 

Stephen24

Senior member
Jul 21, 2000
430
0
0
If burning in does not work, then why have so many people been able to clock the CPU higher after a burn-in?
 

Renob

Diamond Member
Jun 18, 2000
7,596
1
81
Well, if I remember right, the test AMD did was under very controlled conditions with 1,500 CPUs, and they said it did not help... now, people could be changing BIOS settings or things like that that could affect their overclocking...
 

compuwiz1

Admin Emeritus Elite Member
Oct 9, 1999
27,112
930
126
Burn-in is crap, even confirmed by an Intel engineer.

Faith, the right conditions and "Karma" may have more to do with it than anything else. :)

I'm going on close to 2,000 overclocked CPUs that I have pretested for resale. If anything, the result may get worse after 12 hours or so. ;)
 

larrymoencurly

Senior member
Oct 10, 1999
598
0
0
Didn't a design engineer from Intel mention a few months ago here that nothing makes a chip better, and that running it only makes it worse? He not only relied on his own knowledge but also researched the literature and asked the other experts at Intel.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
I think what some people have experienced, which led to the burn-in theory, is this: when you first apply heatsink compound and mate the heatsink to the CPU, you end up with many air pockets trapped between the two surfaces. After a period of time, the expansion/contraction of many heat-up/cool-down cycles, plus the continual tension of the heatsink/fan retention device, forces these air pockets out and at the same time helps to evenly distribute the heatsink compound. I think the heatsink, as well as the CPU, may also "take" to the shape of the surface it has been mounted to, so that they become more efficient at transferring heat.
I would have to agree with the experts that nothing happens within the CPU to promote overclocking, but I believe that something happens externally, because I have experienced it myself, and others swear that after a short period their CPUs can make that "next" step up as well. We all know that cooling is the key to successful overclocking.
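To put rough numbers on this, here's a quick back-of-envelope sketch in Python. It is entirely illustrative - the power and resistance figures are plausible guesses, not measurements of any real CPU or heatsink:

```python
# Back-of-envelope check on the thermal-paste theory. Every figure here is
# an illustrative guess, not a measurement. Die temperature is roughly
# ambient plus power times the total thermal resistance, and the paste
# layer is one resistance in that stack.

power_w = 50.0          # CPU power dissipation, watts
t_ambient_c = 30.0      # case air temperature, degrees C
r_junction_case = 0.4   # die-to-package thermal resistance, C/W
r_heatsink_air = 0.5    # heatsink-to-air thermal resistance, C/W

for r_paste, label in [(0.30, "fresh paste with trapped air pockets"),
                       (0.15, "paste settled after many thermal cycles")]:
    t_die = t_ambient_c + power_w * (r_junction_case + r_paste + r_heatsink_air)
    print(f"{label}: die at {t_die:.1f} C")
```

With these made-up numbers, halving the paste layer's resistance drops the die from 90.0 C to 82.5 C - a difference of the size that could plausibly account for the "next step up" people report.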
 

BlueScreenVW

Senior member
Sep 10, 2000
509
0
0
I agree with all the gentlemen above (Killrose has a very good point): burn-in is mainly for testing the CPU, and any possible speed gain is most likely not due to improved conductivity (transistor switching efficiency, etc.). But for the sake of truth I'd like to point out that there is a possibility that electron/ion migration due to heating/voltage might affect overclockability. Not a very big possibility, but still a real one. Almost all significant changes occur as rare statistical events at the microscopic level (individual transistors), not at the macroscopic level, and they are generally for the worse. Yet you could see an improvement with burn-in, just as you could do brain surgery with a shotgun - it's not likely, but still not impossible.

:) :) :)
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
I'm the Intel engineer that has been mentioned previously, and I have spent some time researching the issue. I did a full paper search of the IEEE archives, pulled any that seemed relevant, read them, and discovered that no one has ever written a paper showing that silicon performance improves with time. I spent time talking to the process 'gurus' at Intel. I then requested and received the burn-in frequency reports for several lots of the P54CS cC0 processor - a sample size of literally tens of thousands of processors. The burn-in reports showed that statistically, CPUs get significantly slower after going through burn-in at Intel. Admittedly, this was a 0.35um process and we are now running about half that, but I have no reason to suspect that 0.18um CPUs will behave differently (since I have spent the last couple of years designing one).

Here's a quote from me in one of the many earlier threads about this subject.

Intel (and AMD - and every chip manufacturer that I've ever heard of) performs burn-in at the packaging facility after packaging. Burn-in involves elevated voltages and elevated temperatures (hence the 'burn-in') for a period of time substantially less than two weeks. The idea behind this is to kill marginal parts before they hit the street. So we are intentionally trying to kill any parts which are marginal enough that they would die within a brief period of in-system use by an end consumer.

There are graphs of the number of unit failures (y-axis) versus time (x-axis), and these show a "U" characteristic. Initially a high number of parts can be expected to fail; this number falls off to a low value for a long period of time (years), and then the number of failures increases again near the end of life. So we elevate voltages and temperatures to accelerate the degradation of the processor to the point where we are selling units that are in the region of the graph where the number of failures is low.
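To make that "U" shape (the classic reliability "bathtub curve") concrete, here is a toy Python model. It is purely illustrative - the parameters are invented, not Intel data - and it combines the three standard components: a falling infant-mortality hazard, a constant random-failure rate, and a rising wear-out hazard, with the first and last modeled as Weibull hazards:

```python
# Toy bathtub-curve model. All parameters are made up for illustration;
# this is not Intel data. The failure rate is the sum of three standard
# reliability components, with the falling and rising terms modeled as
# Weibull hazards: h(t) = (beta / eta) * (t / eta) ** (beta - 1).

def weibull_hazard(t, beta, eta):
    """Instantaneous failure rate of a Weibull distribution at time t (hours)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def bathtub_hazard(t):
    infant = weibull_hazard(t, beta=0.5, eta=2_000)       # beta < 1: rate falls over time
    random = 1e-6                                         # constant background failure rate
    wear_out = weibull_hazard(t, beta=4.0, eta=150_000)   # beta > 1: rate rises near end of life
    return infant + random + wear_out

if __name__ == "__main__":
    # Failure rate drops steeply at first, flattens out, then climbs again:
    for hours in (1, 10, 100, 1_000, 10_000, 100_000, 200_000):
        print(f"{hours:>7} h: {bathtub_hazard(hours):.2e} failures/hour")
```

Running this shows the rate dropping steeply over the first thousand hours, bottoming out, then climbing again near end of life. Burn-in at elevated voltage and temperature effectively fast-forwards each part through the steep left side of the curve, so that shipped units sit in the flat bottom.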

End users do not need to do this. Not only will you reduce the expected life of your processor dramatically, but older processors are slower than newer ones. This is a fact - statistically, all degradation mechanisms in silicon slow down transistors. There are people who claim that a form of "burn-in" enabled them to overclock farther, and I'm not calling these people liars, but based on my decade's worth of experience designing and testing microprocessors for Intel, HP and SGS-Thomson, I can say with assurance that whatever is going on has nothing to do with the silicon. It may be a chemical reaction with the package/thermal grease, or it may be mechanically based (i.e., in the fan), but it is definitely not silicon related. Statistically speaking, silicon only gets slower with time.

Having said this, it is worth doing a stress test of your system after you have assembled it. Run a program that stresses the system in many ways (HD, memory, processor, CDROM, sound, etc.) to check that the components work well together, so that if there is a problem you find it quickly, while it's still easy to do returns. I would recommend running a variety of demanding tasks straight for 24 to 72 hours to expose any problems with the system as quickly as possible.
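For what it's worth, here is the skeleton of such a test in Python - my own minimal sketch, not a real torture test like Prime95 or a Quake loop, and the workload and duration are arbitrary:

```python
# Minimal stability-test sketch in the spirit of the advice above: load every
# core with a computation whose answer is known in advance and flag any
# mismatch. A toy example - not a substitute for a real torture test
# (Prime95, memtest, etc.); the workload and duration are arbitrary.

import hashlib
import multiprocessing as mp
import time

RUN_SECONDS = 60  # raise this toward 24-72 hours for a serious soak test

def worker(worker_id, stop_time, errors):
    reference = hashlib.sha256(b"stability" * 1024).hexdigest()  # known-good answer
    loops = 0
    while time.time() < stop_time:
        if hashlib.sha256(b"stability" * 1024).hexdigest() != reference:
            errors.put((worker_id, loops))  # any flipped bit lands here
        loops += 1

if __name__ == "__main__":
    stop = time.time() + RUN_SECONDS
    errors = mp.Queue()
    procs = [mp.Process(target=worker, args=(i, stop, errors))
             for i in range(mp.cpu_count())]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    if errors.empty():
        print("No computation errors detected - looks stable under load.")
    else:
        while not errors.empty():
            wid, loop = errors.get()
            print(f"Computation error on worker {wid} at loop {loop}")
```

The idea generalizes: any computation with a known-good answer will do, and the longer the machine runs error-free at full load, the more confidence you can have in the configuration.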

Patrick Mahoney
IPF Microprocessor Circuit Design
Intel Corp
pmahoney@mipos2.intel.com
 

BlueScreenVW

Senior member
Sep 10, 2000
509
0
0
Thanks for the very thorough investigation, pm. The only thing missing now is an AMD engineer appearing on stage to contradict your results... Naaah, just kidding! :)
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Nah, the engineers at Intel and AMD generally get along pretty well. Or at least I get along with the half dozen or so guys that I know that work at AMD.

We used to have the annual "Intel vs. AMD Paintball Tournament" (they probably call it the "AMD vs. Intel Paintball Tournament"). I'm not sure if they still do it, but we did it for a few years, from 1994 to 1996. I moved out of Silicon Valley back in 1997; I don't remember it happening that year and wouldn't know if it still happens. AMD won twice, Intel once. They had one guy who should have been a professional soldier instead of a CPU designer. :)

I used to go rock climbing with a mixed group of AMD, Intel, 3Dfx and nVidia engineers. Never had a fist-fight (or a rope that suspiciously got cut) that I can recall. Of course, we tended to avoid confrontational topics. :)
 

Wiz

Diamond Member
Feb 5, 2000
6,459
16
81
Entropy sucks, but it cannot be ignored. Any time someone tries to tell me things improve with age, I think they might know what they are talking about in the very short term, but in the long term it just can't be true. For example, a fine French red wine can be very good after ten to fifteen years. However, you will be hard pressed to find one that is 30 years old and still good - too much can go wrong in that amount of time.
I speak from experience, having kept a 1975 bottle of Rothschild until New Year's Eve last year and opened it to find it not merely bad, but very, very bad. I likely would have greatly enjoyed it 5 or 10 years ago. Of course, that's a single person's experience and not statistically significant!
Anyway, entropy has its way with everything; things don't improve with age. "Burn-in" will not make your chips run faster and better. Welcome to reality. Accidents happen, and sometimes we view them as something else, like time and pressure making the paste between your heatsink and CPU spread out more evenly or thin out, and then finding your chip will go faster. With me, it was replacing my old 16-bit SoundBlaster with a PCI sound card and finding out it was keeping my system from going as fast as possible. I don't know why; I only know I got another 12% speed with the PCI card that I couldn't get before. My system improved with time, but I did have to provide more input to get that improvement.
Sounds kind of like one of those laws of physics, doesn't it? The second law of thermodynamics, isn't it? (It's been quite a while since high school for me.)