imported_wired247
Golden Member
- Jan 18, 2008
- 1,184
- 0
- 0
Originally posted by: Idontcare
Originally posted by: wired247
Originally posted by: CatchPhrase
I deeply apologize that I care about my components getting damaged.
Originally posted by: wired247
I draw the line where the manufacturer tells me to do so.
It is usually available on the website or in the documentation.
People are overly paranoid about temps, with the mentality that "what is too hot for me is too hot for my processor!"
This is a silly notion, but a good number of people will assume that 25C must be superior to 50C in terms of performance.
The point is, the manufacturer's knowledge about appropriate temps > your knowledge.
People think "50C feels hot to the touch, so it must be bad for a processor" or "80C is really cooking,"
when those very same temps are thoroughly tested and appropriate for various CPUs/GPUs.
What is hot for you != what is hot for your processor
I seriously doubt Intel knows what is best for my Western Digital hard drives.
Are you going to grow up anytime soon or do we get to continue to enjoy your perception that your opinion > anyone else's? You continue to press this angle of attack that your opinion is superior to all others by way of assuming everyone else's ability to have an opinion is subject to your assessment and approval. Totally FTL dude.
Please stop trying to start a flame war with me.
If you don't like what I have to say, you are free to ignore me, but your posts are seriously bordering on harassment.
Also, the topic at hand was about CPUs/GPUs; I don't know why people put words in my mouth, since I never mentioned "your" WD HDs. (Straw man argument...)
Now let me make something clear about temperature versus heat.
If you do not change any variables (take, for example, a non-OC'ed processor under 100% load reading 50C),
then regardless of what temperature your processor core reads, it is putting out the same amount of heat.
All of the heat is coming into your case. The only difference between high temperatures and low temperatures (with all else remaining constant) is the efficiency of your heatsink/fan combo.
So, it matters very little to your hard drives whether your CPU reads "32C" or "50C" if your processor is in fact putting out the same amount of heat.
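To put that in concrete terms, here is a minimal sketch using a simple lumped steady-state model: core temperature is roughly ambient plus power dissipated times the cooler's thermal resistance. The 65 W figure and the thermal-resistance values below are made-up illustrative numbers, not from any datasheet:

```python
# Rough steady-state thermal model (illustrative values, not from any spec sheet):
# core temp = ambient temp + power dissipated * thermal resistance of the cooler
def core_temp_c(power_w, r_theta_c_per_w, ambient_c=25.0):
    return ambient_c + power_w * r_theta_c_per_w

power = 65.0  # watts the CPU dumps into the case either way (hypothetical)

stock_cooler = core_temp_c(power, r_theta_c_per_w=0.40)  # weaker heatsink
big_cooler   = core_temp_c(power, r_theta_c_per_w=0.12)  # stronger heatsink

# The same 65 W of heat enters the case in both setups; only the
# core reading differs, because only the cooler's efficiency differs.
print(stock_cooler)  # 51.0
print(big_cooler)    # 32.8
```

The point the numbers make: swapping heatsinks changes the reading from ~51C to ~33C, but your hard drives still see the same 65 W of heat dumped into the case.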
Now, I think lower temperatures are fantastic. They give your CPU/GPU more headroom in case dust, ambient temperatures, or failing fan speeds become an issue. They also may extend the life of your processor.
However, when judging whether a CPU/GPU is "too hot" or "well within limits" what reference *should* one use, if not the manufacturer's?