Originally posted by: Viditor
Originally posted by: Absolute0
Yeah, it's pretty obvious that a CPU wears out eventually, like a light bulb...
It's just that it's usually so many years away that it isn't much of an argument about anything. Unless you keep 10-year-old computers around and running!
That's a fairly common mistake because it's based on history (older CPUs lasted many years longer). The problem is that the older CPUs also ran MUCH slower and generated far less heat. This is why their lifespan was so much longer than that of today's CPUs.
Another point (this is from personal experience on 2 systems)... as time goes on, the temp of the CPU will increase from heat and use. For example, when I first bought my 3200+ system, I ran a program called "Toast" (an excellent stress program) on it for an hour. The highest the temp reached was 44°C. I tried it again on the same system several months later, after making sure there wasn't any dust at all inside (I do this regularly), and the temp is closer to 50°C now. The ambient temps are identical, as it's climate-controlled in here...
It would be interesting to see if any of you have tried the same test on some of your more heavily used systems. BTW, this particular system has never been overclocked.
I don't know what you do to your systems... but CPUs definitely don't get hotter with time. I've been running systems 24/7 for a while now; I monitor the temps in SmartGuardian constantly and record all the overclocking progress. I have a database of overclocking screenshots that's over 4 gigabytes and spans many months of overclocking on many CPUs.
Heat doesn't increase with time; that's absolutely ludicrous. If it did, then after a full year you'd be running hotter, and a two-year-old computer would be about to fry itself? No, it's simply a matter of cleaning, contact, and ambient temperatures.
Let me use my 3800+ X2 as an example. I ran it on excellent watercooling, naked core, 1.6V, 2.87 GHz. I ran it there 24/7 for months, and through those months the load temp was consistently ~38°C. Of course it depended on ambient temps, but it's not like the temp was increasing. Why should it? The idea that a CPU starts producing more heat as it gets older is unfounded and ridiculous. I might accept that it could happen, but at a rate so slow that the effects over a single year aren't measurable.
I have a better idea: temps go DOWN with time. This comes from the burn-in of the paste used, like AS5 or Arctic Ceramique. Anyone who uses those will attest to a burned-in final temp about 2°C lower than the fresh install. Of course, it takes about a week of running at full load before you see this. The TIM between the CPU and the IHS also has a chance to burn in and settle.
*And even if you claim there was no dust inside... after a few months there WILL be dust; it's just that there isn't enough of it that you can see it and readily remove it. I run an open-case system and change stuff around a lot, and I've got watercooling... dust isn't a problem for me.
**There is also a distinct possibility that your sensor is off, or that the contact somehow weakened over the months. While I do not doubt that you saw 44°C and later saw 50°C, I would propose that SOMETHING ELSE happened, rather than the CPU starting to draw and dissipate more energy.
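If anyone wants to repeat the "stress it for an hour, note the peak temp, try again months later" comparison in a repeatable way, here's a minimal sketch of a peak-temperature logger. It assumes a Linux box exposing sensors through the hwmon sysfs interface (/sys/class/hwmon); the Windows tools mentioned in this thread (Toast, SmartGuardian) would need their own logging, so treat this purely as an illustration of the method.

```python
#!/usr/bin/env python3
"""Sketch: record the peak CPU temperature during a stress run so that
results taken months apart (under the same ambient temp and the same
stress program) can be compared. Assumes the Linux hwmon sysfs interface."""

import glob
import time


def read_temps_celsius():
    """Read every hwmon temperature input (values are in millidegrees C)."""
    temps = {}
    for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
        try:
            with open(path) as f:
                temps[path] = int(f.read().strip()) / 1000.0
        except (OSError, ValueError):
            continue  # sensor may vanish or return junk; skip it
    return temps


def log_peaks(duration_s=3600, interval_s=5):
    """Sample for duration_s seconds and keep the hottest reading per sensor."""
    peaks = {}
    end = time.time() + duration_s
    while time.time() < end:
        for sensor, temp in read_temps_celsius().items():
            peaks[sensor] = max(peaks.get(sensor, temp), temp)
        time.sleep(interval_s)
    return peaks


if __name__ == "__main__":
    # Start your stress program separately, then run this for an hour
    # and note the peaks alongside the ambient temperature.
    for sensor, peak in sorted(log_peaks().items()):
        print(f"{sensor}: peak {peak:.1f} °C")
```

The point of logging it this way is that the stress program, run length, and ambient temp stay identical between runs, so if the peak does drift upward over the months, the suspects are dust, paste, mounting pressure, or the sensor itself, not the CPU "producing more heat" on its own.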