- Apr 2, 2001
- 5,661
- 5
- 81
It amazes me that after years of having on-die temperature diodes, we're still guessing how hot the core of a CPU is. Granted, there is some method to the madness that is CPU temperature readings, but it's not consistent across motherboards. It seems to me that the standard for reading the value off the CPU's temperature diode should be passed on to motherboard manufacturers so that they can calibrate their sensors accordingly. Wouldn't the value be consistent at least within a chip generation? The boards have no problem running x86 code - something far more complex than taking the value reported by the diode and turning it into a proper temperature reading.
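For reference, the math behind that conversion really is simple. A typical on-die thermal diode is driven at two known currents, and the difference in forward voltage depends only on absolute temperature, the current ratio, and the diode's ideality factor. Here's a minimal sketch of that textbook relation; the function name, the default current ratio of 10, and the ideality factor of 1.0 are illustrative assumptions, not values from any particular sensor chip:

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
Q_ELECTRON = 1.602177e-19   # electron charge, C

def diode_temp_celsius(delta_v, current_ratio=10.0, ideality=1.0):
    """Estimate die temperature from a dual-current thermal diode.

    delta_v: Vbe(N*I) - Vbe(I) in volts, i.e. the change in the diode's
    forward voltage when the drive current is stepped up by a factor of
    current_ratio (N). From the diode equation:
        delta_v = ideality * (k*T/q) * ln(N)
    so solving for T gives the absolute temperature directly.
    """
    t_kelvin = (Q_ELECTRON * delta_v) / (
        ideality * K_BOLTZMANN * math.log(current_ratio)
    )
    return t_kelvin - 273.15
```

The only real calibration unknown is the ideality factor, which varies slightly per process (and per chip), so at worst readings should disagree by a degree or two, not ten.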
Instead, what we find are motherboards that read the same chip, with the same cooling and the same ambient temperature, as having temperatures that differ by up to 10C! Is there some technical roadblock here? Is the true temperature unknown because there's no benchmark to base the reading on? Are manufacturers trying to edge each other out by claiming that their board runs your chip cooler? Are manufacturers just too incompetent or lazy to provide an accurate solution? I'm guessing the latter two aren't true, but a technical roadblock is possible. What do you think?