Dual core on a budget! Who needs AMD?

Page 3

DrMrLordX

Lifer
Apr 27, 2000
22,896
12,957
136
Do you guys remember when Prescott was released? I remember there was a small dust-up related to Prescott's heat output regarding the guaranteed lifespan of the CPU.

I believe earlier NetBurst incarnations either had no lifespan guarantee from Intel, or were rated to last about 5-10 years of operation at stock speeds with proper (stock) cooling and operating conditions.

Prescott, which was operating at speeds similar to Northwood but at much higher temperatures, was only guaranteed to last 3 years at stock speeds.

Heat kills. Voltage kills, too.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,222
16,101
136
Originally posted by: tanishalfelven
what I'd like to see is a stock-to-stock comparison.

Stock to stock what? Performance? A 3800+ X2 will wipe the floor with an 805. Heat? The 3800+ X2 at stock has no heat to speak of, while the 805 still puts out waves of heat.
 

ooeric

Senior member
Apr 8, 2006
414
0
0
Originally posted by: DrMrLordX
Do you guys remember when Prescott was released? I remember there was a small dust-up related to Prescott's heat output regarding the guaranteed lifespan of the CPU.

I believe earlier NetBurst incarnations either had no lifespan guarantee from Intel, or were rated to last about 5-10 years of operation at stock speeds with proper (stock) cooling and operating conditions.

Prescott, which was operating at speeds similar to Northwood but at much higher temperatures, was only guaranteed to last 3 years at stock speeds.

Heat kills. Voltage kills, too.


That's true about your last line, though it's 50/50: you can have hardcore cooling modders with pots that cool to -150C pumping 1.8v into a chip, and it still dies. So CPU death swings both ways (god bless mobos with working CPU temp cutoff switches).

But regardless of that, I believe all CPUs are durable for years to come with maintenance, like a car: blow off the dust, get some new paste on, and we're all good.
 

Absolute0

Senior member
Nov 9, 2005
714
21
81
Originally posted by: Viditor
Originally posted by: Absolute0
Yeah it's pretty obvious that a CPU wears out eventually, like a light bulb...

Just that it's usually so many years away that it isn't much of an argument about anything. Unless you keep 10 year old computers around running! :p

That's a fairly common mistake because it's based on history (older CPUs lasted many years longer). The problem is that the older CPUs also ran MUCH slower and generated far less heat. This is why their lifespan was soooo much longer than today's CPUs.

Another point (this from personal experience on 2 systems)...as time goes on, the temp of the CPU will increase from heat and use. For example, when I first bought my 3200+ system, I used a program called "Toast" (excellent stress program) on it for an hour. The highest that the temp went was 44 degrees. I tried it again on the same system (several months later) after making sure there wasn't any dust at all inside (I do this regularly) and the temp is closer to 50 now. The ambient temps are identical as it's climate controlled in here...

It would be interesting for me to see if any of you have tried to do the same on some of your more used systems. BTW, this particular system has never been overclocked.

I don't know what you do to your systems... but CPUs definitely don't get hotter with time. I've been running systems 24/7 for a while now; I monitor the temps in SmartGuardian constantly and record all the overclocking progress. I have a database of overclocking screenshots that's over 4 gigabytes and spans many months of overclocking on many CPUs.

Heat doesn't increase with time; that's absolutely ludicrous. If it did, after a full year you'd be running hotter, and a two-year-old computer would be about to fry itself? No, it's simply a matter of cleaning, contact, and ambient temperatures.

Let me use my 3800+ X2 as an example. I ran it on excellent watercooling, naked core, 1.6v, 2.87GHz, 24/7 for months. Through those months, the load temp was consistently ~38C. Of course it depended on ambient temps, but it's not like the temp was increasing. WHY should it? The idea that a CPU starts producing more heat as it gets older is unfounded and ridiculous. I might accept that it possibly happens, but at a rate so slow that we cannot measure the effect over a year.

I have a better idea: temps go DOWN with time. This comes from a burn-in of the paste used, like AS5 or Arctic Ceramique. Anyone who uses those will attest to a burned-in final temp about 2C less than the fresh install. Of course, it takes about a week running full load before you see this. The TIM used between the CPU and the IHS also has a chance to burn in and settle.


*And even if you claim there was no dust inside... after a few months there WILL be dust, just not so much that you can see and readily remove it. I run an open-case system and change stuff up a lot, and I've got watercooling... dust isn't a problem for me.
**There is also a distinct possibility that your sensor is off, or that contact somehow weakened over the months. While I do not doubt that you saw 44C, and then later saw 50C, I would propose that SOMETHING ELSE happened other than the CPU starting to suck up and release more energy.
 

stevty2889

Diamond Member
Dec 13, 2003
7,036
8
81
Originally posted by: Absolute0
Originally posted by: Viditor
Originally posted by: Absolute0
Yeah it's pretty obvious that a CPU wears out eventually, like a light bulb...

Just that it's usually so many years away that it isn't much of an argument about anything. Unless you keep 10 year old computers around running! :p

That's a fairly common mistake because it's based on history (older CPUs lasted many years longer). The problem is that the older CPUs also ran MUCH slower and generated far less heat. This is why their lifespan was soooo much longer than today's CPUs.

Another point (this from personal experience on 2 systems)...as time goes on, the temp of the CPU will increase from heat and use. For example, when I first bought my 3200+ system, I used a program called "Toast" (excellent stress program) on it for an hour. The highest that the temp went was 44 degrees. I tried it again on the same system (several months later) after making sure there wasn't any dust at all inside (I do this regularly) and the temp is closer to 50 now. The ambient temps are identical as it's climate controlled in here...

It would be interesting for me to see if any of you have tried to do the same on some of your more used systems. BTW, this particular system has never been overclocked.

I don't know what you do to your systems... but CPUs definitely don't get hotter with time. I've been running systems 24/7 for a while now; I monitor the temps in SmartGuardian constantly and record all the overclocking progress.

Tell that to my 3.06GHz Northwood. For the first year and a half it never broke 50C. I upgraded and didn't use it for about 3 months. Started using it again (on the same motherboard) and temps started rising; even with an XP-120 the temps were getting up to around 68C. I cleaned and re-applied the thermal paste over and over with no change, tried 3 other heatsinks, and the temps were even worse; stuck it on another motherboard, and temps were still the same. I got the temps back down to around 63C by removing the IHS, but they never got back down to where they were. The gate dielectric is a very thin layer, and over time heat and current are going to break it down, allowing more current to leak through and causing higher temperatures. At stock speeds with reasonable cooling, they could last 10 years, but they will break down eventually.
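
That breakdown-and-leakage argument can be sketched as a simple feedback loop: more leakage means more heat, and leakage itself grows with temperature. A toy model (purely illustrative constants, not Northwood measurements) in Python:

Code:
import math

# Toy model: die temperature settles where
#   T = T_ambient + R_th * (P_dynamic + P_leak(T)),
# with leakage assumed to grow exponentially with temperature.
# r_th, k and the wattages are made-up illustrative values.
def settle_temp(p_dyn_w, p_leak_base_w, r_th=0.3, t_amb=25.0, k=0.03, iters=50):
    t = t_amb
    for _ in range(iters):  # simple fixed-point iteration
        p_leak = p_leak_base_w * math.exp(k * (t - t_amb))
        t = t_amb + r_th * (p_dyn_w + p_leak)
    return t

print(round(settle_temp(60, 5), 1))   # modest leakage: settles around 46C
print(round(settle_temp(60, 15), 1))  # degraded dielectric, triple the leakage: around 54C

With the same cooler and the same dynamic power, tripling the leakage pushes the settling temperature up by several degrees, which is the shape of behavior described above.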
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Thermal leakage may get worse faster on CPUs that are *not* on 24x7. Think basic thermal expansion. A CPU that sits at 58C at all times is going to be stressed FAR less than one cycling 25C->50C->25C at least once a day, simply because the CPU is composed of different materials with different thermal expansion characteristics.
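
A crude way to put numbers on that is a Coffin-Manson-style estimate, where the number of thermal cycles to failure falls off as a power of the temperature swing. The constant and exponent below are purely illustrative, not real package or solder-joint data:

Code:
# Coffin-Manson sketch: cycles to failure N_f = C / (dT ** n).
# C and n are arbitrary illustrative values, not measured data.
def cycles_to_failure(delta_t_c, c=1.0e7, n=2.0):
    return c / (delta_t_c ** n)

print(cycles_to_failure(5.0))    # near-steady chip with a small daily swing
print(cycles_to_failure(25.0))   # chip cycling 25C -> 50C -> 25C every day

With these toy numbers the cycled chip hits its fatigue budget 25 times sooner than the near-steady one; the absolute values are meaningless, but the scaling is the point.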
 

Absolute0

Senior member
Nov 9, 2005
714
21
81
Originally posted by: stevty2889
Originally posted by: Absolute0
Originally posted by: Viditor
Originally posted by: Absolute0
Yeah it's pretty obvious that a CPU wears out eventually, like a light bulb...

Just that it's usually so many years away that it isn't much of an argument about anything. Unless you keep 10 year old computers around running! :p

That's a fairly common mistake because it's based on history (older CPUs lasted many years longer). The problem is that the older CPUs also ran MUCH slower and generated far less heat. This is why their lifespan was soooo much longer than today's CPUs.

Another point (this from personal experience on 2 systems)...as time goes on, the temp of the CPU will increase from heat and use. For example, when I first bought my 3200+ system, I used a program called "Toast" (excellent stress program) on it for an hour. The highest that the temp went was 44 degrees. I tried it again on the same system (several months later) after making sure there wasn't any dust at all inside (I do this regularly) and the temp is closer to 50 now. The ambient temps are identical as it's climate controlled in here...

It would be interesting for me to see if any of you have tried to do the same on some of your more used systems. BTW, this particular system has never been overclocked.

I don't know what you do to your systems... but CPUs definitely don't get hotter with time. I've been running systems 24/7 for a while now; I monitor the temps in SmartGuardian constantly and record all the overclocking progress.

Tell that to my 3.06GHz Northwood. For the first year and a half it never broke 50C. I upgraded and didn't use it for about 3 months. Started using it again (on the same motherboard) and temps started rising; even with an XP-120 the temps were getting up to around 68C. I cleaned and re-applied the thermal paste over and over with no change, tried 3 other heatsinks, and the temps were even worse; stuck it on another motherboard, and temps were still the same. I got the temps back down to around 63C by removing the IHS, but they never got back down to where they were. The gate dielectric is a very thin layer, and over time heat and current are going to break it down, allowing more current to leak through and causing higher temperatures. At stock speeds with reasonable cooling, they could last 10 years, but they will break down eventually.

More likely that the thermal sensor is wrong.

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Markfw900
Originally posted by: tanishalfelven
what I'd like to see is a stock-to-stock comparison.

Stock to stock what? Performance? A 3800+ X2 will wipe the floor with an 805. Heat? The 3800+ X2 at stock has no heat to speak of, while the 805 still puts out waves of heat.

That would be like comparing a BMW 320 to a Corvette. You can't compare two products for stock performance when one costs twice as much as the other.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Absolute0
More likely that the thermal sensor is wrong.

Actually, my guess as to the reason that you have not experienced this is that you run them 24/7. Thermal fatigue is caused by a change in temp (delta T)...the greater the change, the greater the fatigue.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Absolute0
Interesting, I will accept it as a possibility. Any articles on this?

Go to the link I posted on page 2 as a start...from the link:
"the magnitude of delta T is directly proportional to failure rate"

Edit: If you think about it, it really does make sense. When you heat then cool any metal, that metal develops "metal fatigue" and its nature changes (it often becomes more brittle). As an example, this was a major problem with the Concorde, and it cost them a fortune in constant testing and replacement parts.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: RussianSensation
Originally posted by: Markfw900
Originally posted by: tanishalfelven
what I'd like to see is a stock-to-stock comparison.

Stock to stock what? Performance? A 3800+ X2 will wipe the floor with an 805. Heat? The 3800+ X2 at stock has no heat to speak of, while the 805 still puts out waves of heat.

That would be like comparing a BMW 320 to a Corvette. You can't compare two products for stock performance when one costs twice as much as the other.

Well, it's more like comparing a Corvette to a Ford Pinto IMHO, but your reasoning is sound. :)
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: dexvx
System Power Draw tests:

http://techreport.com/reviews/2006q2/core-duo/index.x?pg=15

FYI, the Yonah board is $99 @ ZZF (sold out). The XE965 draws about the same power as the FX-60, with the T2600 drawing 100W less than either at load.

Makes sense...I suspect that the X2 Turion 64 will draw less than the Turion because of the new SiGe design, which puts it about the same as the Core Duo according to that chart.
One thing to remember, though, is that Intel left 64-bit out of Yonah specifically for thermal/power reasons, so I wonder if Merom will be that much easier on power.

Makes you wonder just how efficient the Turion X2 at 65nm with the upcoming Advanced Power Management circuits will be...?
 

dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
They demoed a few laptops with the Turion X2, but it's all speculation at this time.

Guesstimate: the AM2 3800+ 38W should be a decent measure. Assuming you bin the Turions, it can be around the 30W TDP rating, which is actually less than the ML-44 but more than the MT-40. I believe the Turion X2 will not be available higher than 2.0GHz, whereas the Yonah is commercially available at 2.133GHz at this time.

Combined with the more power-efficient DDR2, it should save a few watts at full load (I don't really care about idle power atm).


Anyways... I'm thinking the 805 shouldn't consume more power at load than the FX-60 on the charts.

----

In any event my point is, with the $199 P-D 920 + ECS 945 mobo combo or your choice of a $99 Asus-powered Viiv Yonah board + a cheap Core Duo (less than $300), there's no real point to getting an 805. On one hand, you won't save initial money compared to the 920, and on the other hand, the "long term" power savings from a Viiv + Core Duo is readily apparent, even more so than with a 3800+/NF4 combo (Core Duo + Viiv easily has a 50W load advantage, or 100+W on the 805).
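
To put a rough dollar figure on that long-term power difference, take the claimed 100W load advantage and assume (illustrative numbers, not figures from the thread) 8 hours of full load per day and $0.10/kWh:

Code:
# Back-of-the-envelope yearly cost of a 100W load-power difference.
watts_saved = 100        # claimed load advantage of Core Duo + Viiv over the 805
hours_per_day = 8        # assumed time at full load each day
price_per_kwh = 0.10     # assumed electricity price in dollars

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
print(round(kwh_per_year), round(kwh_per_year * price_per_kwh))  # ~292 kWh, ~$29/year

Call it roughly $25-$30 a year under those assumptions, before counting anything spent on extra cooling.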
 

Absolute0

Senior member
Nov 9, 2005
714
21
81
Originally posted by: Viditor
Originally posted by: Absolute0
Interesting, I will accept it as a possibility. Any articles on this?

Go to the link I posted on page 2 as a start...from the link:
"the magnitude of delta T is directly proportional to failure rate"

Edit: If you think about it, it really does make sense. When you heat then cool any metal, that metal develops "metal fatigue" and its nature changes (it often becomes more brittle). As an example, this was a major problem with the Concorde, and it cost them a fortune in constant testing and replacement parts.

Alright, I read the article.

I already knew that increasing T will decrease CPU life. It seems a valid point to me that increasing dT will decrease CPU life as well. However, I have yet to see how any of this plays into the CPU producing more heat.
 

d3lt4

Senior member
Jan 5, 2006
848
0
76
I need AMD. :roll:
The 805 looks really good for that price, but it takes a blamed lot of watts, and just think of the heat. Other than that, it looks like a really good chip for a budget dual core.
Opty for single core still rules!
 

dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
Btw, the 915 and 925 should be coming out as well, starting around $130/$150.

The 915 has no EIST, so that kind of sucks.
The 925 has EIST.

Both lack virtualization technology, but I'm guessing the 925's power consumption will be similar to the 965XE family (e.g. less than the 955XE, which the 915 seems to be based off of).
 

d3lt4

Senior member
Jan 5, 2006
848
0
76
Originally posted by: dexvx
Btw, the 915 and 925 should be coming out as well, starting around $130/$150.

The 915 has no EIST, so that kind of sucks.
The 925 has EIST.

Both lack virtualization technology, but I'm guessing the 925's power consumption will be similar to the 965XE family (e.g. less than the 955XE, which the 915 seems to be based off of).


Yea, I saw that too. Intel will be doing some really good things for consumers in the next few months. Can't wait to see prices drop. :)