- Jan 3, 2008
- 517
- 0
- 0
I heard that 80C is the limit for a GPU and 60C is the limit for a CPU.
When do you start seeing artifacts?
Originally posted by: wired247
I draw the line where the manufacturer tells me to do so. It is usually available on the website or in the documentation.
People are overly paranoid about temps, with the mentality that "what is too hot for me is too hot for my processor!" This is a silly notion, but a good number of people will assume that 25C must be superior to 50C in terms of performance.

I deeply apologize that I care about my components getting damaged.
Originally posted by: wired247
Originally posted by: CatchPhrase
I deeply apologize that I care about my components getting damaged.
The point is, the manufacturer's knowledge about appropriate temps > your knowledge. People think "50C feels hot to the touch, it must be bad for a processor" or "80C is really cooking," when those very same temps are 110% tested and appropriate for various CPUs/GPUs. What is hot for you != what is hot for your processor.
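The "draw the line where the manufacturer does" advice above boils down to a simple comparison: look up the documented thermal limit for your exact part and flag anything that meets or exceeds it. A minimal sketch, assuming made-up placeholder limits and sample readings (the numbers below are illustrative only, not real specs; substitute the maximums from your vendor's documentation):

```python
# Placeholder limits -- NOT real specs. Look up your exact part number
# in the manufacturer's documentation and substitute its documented max.
SPEC_LIMITS_C = {"cpu": 73.2, "gpu": 105.0}

def over_spec(component: str, reading_c: float) -> bool:
    """True when a reading meets or exceeds the documented limit."""
    return reading_c >= SPEC_LIMITS_C[component]

# Sample readings, stand-ins for whatever your monitoring tool reports.
readings = {"cpu": 61.0, "gpu": 82.0}
for part, temp in readings.items():
    flag = "over spec" if over_spec(part, temp) else "within spec"
    print(f"{part}: {temp:.0f}C ({flag})")
```

With the placeholder limits above, both sample readings come back within spec, which is the thread's point: a temperature that feels alarming to a person can still sit comfortably inside the manufacturer's envelope.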
Originally posted by: Idontcare
I seriously doubt Intel knows what is best for my Western Digital hard drives.
Are you going to grow up anytime soon or do we get to continue to enjoy your perception that your opinion > anyone else's? You continue to press this angle of attack that your opinion is superior to all others by way of assuming everyone else's ability to have an opinion is subject to your assessment and approval. Totally FTL dude.
Originally posted by: Binky
Originally posted by: Idontcare
I seriously doubt Intel knows what is best for my Western Digital hard drives.
Are you going to grow up anytime soon or do we get to continue to enjoy your perception that your opinion > anyone else's? You continue to press this angle of attack that your opinion is superior to all others by way of assuming everyone else's ability to have an opinion is subject to your assessment and approval. Totally FTL dude.
Keep it civil. I tend to agree with wired247, and I don't see anything inflammatory in his/her post. Many people worry too much about temps. Your post seems a bit angry.
Nothing was said about HD temps. If you want info there, read the Google study on HD temps and HD lifespans.
Originally posted by: Idontcare
Oh really? You don't find statements where it's bolded that others are "too paranoid" and that they have "silly notions" a bit inflammatory?