My philosophy is to run it at the speed you need. So if 3GHz is good enough for me for now, I'll OC to 3GHz. Then a year from now, when my games demand more power, I'll crank it up to 3.3GHz with more vCore. That will probably wear out the CPU faster, but I'm getting more use out of it and delaying buying the next big thing. And I can see that I'd need to hit 3.5GHz in 2009 if I want to keep the E4500. By then it wouldn't matter much if I'd shortened the CPU's life from 20 years to 5, because I wouldn't need it for that long anyway.
But for now, as long as the power is enough for what I do (I don't encode much video), I'm happy with a moderate overclock rather than an extreme one, because stability is far more important, and I don't want to accidentally fry something and suffer the data loss that comes with it. That's one aspect of OC people don't talk about. I only have this one PC, and it's my mission-critical machine, so I can't afford to crank it sky-high just to have it crash when I need to finish my paper.
If you have a test rig dedicated to OC, then go all out; it won't hurt anything but your wallet. But then again, if you have a dedicated OC test rig, what are you using it for other than OC itself? Wouldn't the additional cost offset any value you got from buying a slower processor instead of bleeding edge?
In that case, having two PCs would end up costing more than what you really need. That defeats the performance:value ratio and the whole idea of getting the best bang for the buck, which is what OC was all about. Now it's more like the muscle car scene: people cranking it up just so they can brag about how much HP they have under the hood. It's just a number on a synthetic benchmark, and they don't run the computer for what they actually need it for.
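To put a rough number on that bang-for-the-buck argument, here's a quick sketch of cost per GHz. All the prices and clocks below are hypothetical placeholders (not real street prices for any specific chip), just to show why an overclocked mid-range part wins the ratio while a second dedicated OC rig loses it:

```python
def cost_per_ghz(price_usd, clock_ghz):
    """Dollars paid per GHz of clock speed -- lower is better value."""
    return price_usd / clock_ghz

# Hypothetical mid-range chip, at stock vs. after an overclock
mid_stock = cost_per_ghz(130, 2.2)   # roughly 59 $/GHz
mid_oc    = cost_per_ghz(130, 3.0)   # roughly 43 $/GHz -- OC improves the ratio

# Hypothetical high-end chip at stock, same final clock
high_stock = cost_per_ghz(270, 3.0)  # 90 $/GHz

# Add a second PC (say $400) just for OC testing, and the value is gone:
two_pc = cost_per_ghz(130 + 400, 3.0)  # worse than just buying high-end
```

The overclock is what makes the cheap chip a bargain; the moment you buy extra hardware to support the hobby, the ratio flips and you'd have been better off buying the faster chip outright.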