I have talked to people on madonion.com and hardforums.com, and I have seen web sites, such as this one, that claim you can save money by overclocking.
The site pcstats.com says:
<< Let's be honest, we all want more than what we paid for. Overclocking is the most common method of getting more for your hardware dollars. Sure 800Mhz is darn fast, but what if it was able to do 1Ghz or more? Coincidentally, overclocking can save you some money too! >>
Yet I believe they forgot to take into account the cost of operating the equipment that enables you to overclock.
Let's say I purchased a 1.3GHz Pentium 4 CPU from Google Gear for $165 and an 80 watt peltier from 3DCool.com for $25, for a total of $190. After that I put together my system, and with the peltier I manage to overclock it to 1.5GHz, a gain of 200MHz. According to Anandtech's weekly CPU price guide, a 1.5GHz chip costs $229, so I saved $39: take the price of the 1.5GHz chip and subtract what I paid for the 1.3GHz chip plus the cost of the peltier. So it is: $229 - ($165 + $25) = $39.
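If you want to check that arithmetic yourself, here is a quick sketch of it in Python; the prices are just the ones quoted above, so swap in whatever you actually paid.

    # Savings from overclocking the 1.3GHz chip instead of buying the 1.5GHz chip outright.
    price_1_3ghz = 165   # what I paid for the 1.3GHz Pentium 4
    price_peltier = 25   # the 80 watt peltier from 3DCool.com
    price_1_5ghz = 229   # Anandtech's price for a 1.5GHz chip

    savings = price_1_5ghz - (price_1_3ghz + price_peltier)
    print(savings)  # 39 dollars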
Electricity in my area, San Diego, California, costs about 8 cents per kilowatt hour. This figure comes from my local electric company's web site:
Historical Price of Electric Commodity
So if you do the math, multiplying .08 kilowatt hours (the electricity the 80 watt peltier uses each hour) by 8 cents (the price per kilowatt hour in my area), it costs about .64 cents an hour to run the peltier. So it is: (80/1000) * 8 = .64.
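Here is that conversion as a short Python sketch, just to make the watts-to-kilowatt-hours step explicit; the 80 watts and 8 cents are my numbers from above.

    # Hourly cost of running the 80 watt peltier at 8 cents per kilowatt hour.
    peltier_watts = 80
    cents_per_kwh = 8

    cost_per_hour_cents = (peltier_watts / 1000) * cents_per_kwh  # kilowatts drawn times price per kWh
    print(cost_per_hour_cents)  # 0.64 cents an hour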
At .64 cents an hour, after I had the computer on for 6093 hours the money I saved by overclocking would be completely gone, and any time after 6093 hours I would actually start to lose money. I got this by dividing the amount of money saved, $39, by the .64 cents an hour it costs to operate the peltier, which gives the number of hours until the savings are used up. So it is: 39 / .0064 = 6093. Or you can check it the other way by multiplying: .0064 * 6093 = 39. Since I leave my computer on pretty much 24/7 for cracking purposes, after about 8 months, or roughly 254 days, the money I saved would be gone, and any amount of time after that I would actually start to lose money. Not to mention that here in California the price of electricity can skyrocket at the drop of a hat, and I could end up paying a lot more for that peltier than it is worth.
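And here is the break-even calculation as a sketch, again using only the numbers above.

    # Hours (and days) of 24/7 use before the 0.64 cents an hour eats the $39 saved.
    savings_dollars = 39
    cost_per_hour_dollars = 0.64 / 100   # 0.64 cents expressed in dollars

    break_even_hours = savings_dollars / cost_per_hour_dollars
    print(break_even_hours)       # 6093.75 hours
    print(break_even_hours / 24)  # about 253.9 days, a bit over 8 months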
In conclusion, I really don't see the point of overclocking in an economic sense. While this is only one particular overclocking setup, the fundamental problem I see with overclocking is that the equipment that enables you to overclock incurs a continuous cost due to the extra electricity it requires. So after enough time the money you saved by overclocking will be gone, and after that point you will actually start to lose money by overclocking.
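For anyone who wants to run the numbers on their own setup, the calculation generalizes easily; the sketch below is just my own illustration, with parameter names I made up for it.

    # General break-even: hours of continuous use before the extra power draw wipes out the savings.
    def break_even_hours(savings_dollars, extra_watts, cents_per_kwh):
        cost_per_hour_dollars = (extra_watts / 1000) * cents_per_kwh / 100
        return savings_dollars / cost_per_hour_dollars

    # My example from above: $39 saved, an 80 watt peltier, 8 cents per kilowatt hour.
    print(break_even_hours(39, 80, 8))  # roughly 6094 hours, or about 254 days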