Originally posted by: cmdrdredd
Originally posted by: stev
Originally posted by: cmdrdredd
Originally posted by: stev
Originally posted by: cmdrdredd
Originally posted by: stev
Yes, that would be interesting. I don't know jack about overclocking, but others around here do. I have yet to see solid facts on the effects of overclocking on the longevity of a CPU (links, anyone?). Most people do it for a short-term gaming performance increase. But hey, if it's on a CPU you may end up replacing anyway, why not? I'm interested to know how it goes.
It's likely that your CPU will last longer than you'll keep it in an overclocked situation, provided you have adequate cooling to compensate for the extra heat generated by higher frequencies, and by higher voltages if you alter them.
And that's pretty much what everyone says. But when I ask to what extent small, medium, or large amounts of overclocking will affect a CPU's lifespan (or the lifespan of other hardware), no one has any proof. There are tons of benchmarking datasets but no data on CPU lifespan versus level of overclock, which leads me to believe overclocking is purely a short-term user's game and of no interest to me on a system where reliability is much more important than extra performance.
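For what it's worth, there is a standard way engineers reason about this even without field data: semiconductor reliability models like Black's equation, where mean time to failure scales roughly as MTTF ∝ J⁻ⁿ · exp(Ea/kT). Here's a rough back-of-envelope sketch of the relative lifespan hit from raising die temperature and voltage. All the numbers (activation energy, current exponent, the simplification that current density scales with voltage) are illustrative assumptions, not measurements of any real CPU:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K


def relative_mttf(t_stock_c, t_oc_c, v_stock, v_oc,
                  activation_energy_ev=0.7, current_exponent=2.0):
    """Estimate MTTF(overclocked) / MTTF(stock) using Black's
    equation, MTTF ~ J^-n * exp(Ea / kT).

    Simplifying assumptions (illustrative only):
      - current density J scales linearly with core voltage
      - Ea = 0.7 eV and n = 2 are typical textbook values for
        electromigration, not figures for any specific chip
    """
    t_stock_k = t_stock_c + 273.15  # convert die temps to Kelvin
    t_oc_k = t_oc_c + 273.15

    # Thermal term: hotter die -> exponentially shorter life
    thermal = math.exp(
        activation_energy_ev / (K_BOLTZMANN_EV * t_oc_k)
        - activation_energy_ev / (K_BOLTZMANN_EV * t_stock_k)
    )
    # Current-density term: higher voltage -> shorter life
    current = (v_stock / v_oc) ** current_exponent
    return thermal * current


# Hypothetical example: 45C -> 60C load temp, 1.30V -> 1.40V vcore
ratio = relative_mttf(45, 60, 1.30, 1.40)
```

The point the model makes is qualitative, not quantitative: lifespan degrades smoothly and exponentially with temperature, so a well-cooled overclock may still outlive the useful life of the machine, which is essentially the claim being made in this thread.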
If you actually plan to overclock a system and are worried about fault tolerance, 100% 24/7 uptime, etc., you're going to be called an idiot.
Precisely why I wouldn't do it. For me, any performance gain wouldn't be worth the possible loss in reliability. Still, the topic of overclocking and the fact that no one ever talks about any downsides is interesting to me.
Interesting how? We upgrade BEFORE the CPU fails. We use the old systems to run servers or DC boxes after we're done with them in our main rig.
I have a P4 2.6 @ 3.4GHz that has been running just about 24/7 for going on six years now.
If you don't think that's enough...I can't help you