Originally posted by: BooGiMaN
first you teach those little buggers to swim.....
Originally posted by: MrBond
You can calibrate your thermometers, but it's a difficult process. You need several solutions of known temperatures that fall within the range of your thermometer. You record the thermometer's reading against the known temperature (e.g., the liquid is 65°C but the thermometer reads 70°C), and that's one point on your graph. Repeat this for at least 3 points (4+ is better) and you get a calibration curve, which you can then use to tell the real temperature of your freezer.
Like I said, the process sucks, but that's how you calibrate them. It's not usually something the average person can do unless you have access to at least one known-accurate thermometer that you can check your standard solutions with.
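A minimal sketch of that multi-point calibration, assuming the thermometer responds roughly linearly so a straight-line fit stands in for drawing the curve by hand; the reference temperatures, the readings, and the use of `numpy.polyfit` are illustrative assumptions, not anything from the thread:

```python
import numpy as np

# Hypothetical (known solution temperature, thermometer reading) pairs in °C.
known_temps = np.array([0.0, 25.0, 65.0, 100.0])
readings    = np.array([3.0, 27.5, 70.0, 104.0])

# Fit reading -> true temperature so future readings can be corrected.
slope, intercept = np.polyfit(readings, known_temps, 1)

def true_temp(reading):
    """Convert a raw thermometer reading to the calibrated temperature."""
    return slope * reading + intercept

print(true_temp(-15.0))  # e.g. correct a raw freezer reading
```

With 4+ points you can also see whether the points actually fall on a straight line; if they don't, the thermometer isn't linear and a simple fit like this won't be enough.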
Originally posted by: edprush
Originally posted by: MrBond
It's not usually something the average person can do unless you have access to at least one known-accurate thermometer that you can check your standard solutions with.
How the heck do you find an accurate thermometer? Just because one costs $90 doesn't mean it's any more accurate than a $20 one.
I don't think anyone knows the real temperature.
Originally posted by: Babbles
Why do you care if it is accurate to better than ±3° or so? Seems as if you are trying to overly complicate something.
Listen to DrPizza. Make a batch of ice/water. Make certain there are roughly equal portions of ice and water, and that there aren't big pockets of just water or just ice. Stir it and wait a few minutes for it to reach ~32°F (forgetting to wait would be the biggest source of error). Then put in each thermometer. Again, wait for them all to settle and give consistent readings. Then you'll know how far off each thermometer is. Add or subtract the error from each reading.
Originally posted by: DrPizza
Yes, it's easy to do.
Get some ice water. It's 32 degrees (F). Compare to the temperature on your thermometers. Add or subtract the necessary number of degrees.
Since you're working with refrigerator temperatures, the fact that the thermometer's markings might not advance exactly 1 degree for every 1-degree change in actual temperature is most likely insignificant. That is, if the thermometer reads a 20-degree change when the actual temperature changes by 21 or 19 degrees, it only amounts to about a 1/2-degree difference.
For best results, use distilled water when making your ice, and use distilled water for the ice water. If you have a physical constants book at your disposal, you can correct for your altitude/atmospheric pressure, etc. But, again, that's pretty much insignificant for your purposes.
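A minimal sketch of the single-point ice-water correction DrPizza and dullard describe: treat the stirred slush as 32°F, note what each thermometer reads, and subtract that offset from later readings. The thermometer names and readings here are made-up examples:

```python
ICE_POINT_F = 32.0

# What each thermometer showed in the ice bath -- hypothetical numbers.
ice_bath_readings = {"fridge": 34.0, "freezer": 30.5}

# Error = reading - true temperature; subtract it from future readings.
offsets = {name: r - ICE_POINT_F for name, r in ice_bath_readings.items()}

def corrected(name, reading):
    """Apply the ice-point offset for the named thermometer."""
    return reading - offsets[name]

print(corrected("fridge", 40.0))   # -> 38.0
print(corrected("freezer", 5.0))   # -> 6.5
```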
Originally posted by: dullard
Then put in each thermometer. Again, wait for them all to settle and give consistent readings. Then you'll know how far off each thermometer is. Add or subtract the error from each reading.
That's why you might have to do it at several points like I suggested earlier. That's really not necessary for the average person, but it can be done for highly sensitive cases. Since you're not in need of super-accurate results, you can probably get away with using ice water and boiling water, remembering that water boils at 212°F only at sea level and correcting accordingly. As long as the thermometer varies linearly, you'll be fine with two points.
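A minimal sketch of that two-point (ice plus boiling water) calibration, assuming a linear thermometer; the altitude, the readings, and the rough rule of thumb of about 1°F lower boiling point per 500 ft of elevation are illustrative assumptions, so use a physical constants table if you need better:

```python
ICE_POINT_F = 32.0

def boiling_point_f(altitude_ft):
    # Rough approximation; a constants table gives a more accurate value.
    return 212.0 - altitude_ft / 500.0

altitude_ft  = 1000.0   # assumed elevation
ice_reading  = 34.0     # what the thermometer showed in ice water
boil_reading = 208.0    # what it showed in boiling water

true_boil = boiling_point_f(altitude_ft)   # 210°F at 1000 ft

# Two points define the linear correction: true = slope * reading + intercept.
slope = (true_boil - ICE_POINT_F) / (boil_reading - ice_reading)
intercept = ICE_POINT_F - slope * ice_reading

def true_temp(reading):
    return slope * reading + intercept

print(true_temp(0.0))   # calibrated temperature for a 0°F freezer reading
```

Note that edprush's point below still applies: a thermometer that's 3° off at 32° isn't necessarily 3° off elsewhere, which is exactly what the second calibration point is there to catch.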
Originally posted by: edprush
Originally posted by: dullard
Then put in each thermometer. Again, wait for them all to settle and give consistent readings. Then you'll know how far off each thermometer is. Add or subtract the error from each reading.
That was very good information about the various locations in the refrigerator differing in temperature. Thanks.
But on another note, just because the temperature is 3° off at 32° doesn't mean it is 3° off at 100°... especially with those coil thermometers.