First off, I want to thank graysky for taking the time to write up a great OCing guide. I've not OC'd in a very long time (not since the first Celeron hit the market), so I have to admit I was a little nervous.
But everything went very smoothly. My Q6600 (G0) OC'd nicely to 3GHz per core (9 x 333). In fact, it's been pretty stable so far and I'm really impressed with everything. I've had one or two lock-ups in Crysis and Hellgate, but at this point it's hard to tell whether that's software-related or because of the OC.
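Just to show the arithmetic behind the 9 x 333 figure, here's a quick throwaway Python sketch (the multiplier is the Q6600 stock value; the FSB is simply what I set in the BIOS):

# Core clock = multiplier x FSB
multiplier = 9          # Q6600 stock multiplier
fsb_mhz = 333           # FSB I set in the BIOS (stock is 266)
core_clock_mhz = multiplier * fsb_mhz
print(f"Core clock: {core_clock_mhz} MHz (~{core_clock_mhz / 1000:.2f} GHz)")
# Prints: Core clock: 2997 MHz (~3.00 GHz)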
Anyway, the reason I created this thread was to find out about core temps. I noticed last night, after playing Hellgate for a few hours, that my core temps were (in Celsius):
49
48
44
43
Is that fine? I imagine CPU-Z or my mobo would sound some kind of alarm if the temps got too high, but I want to make sure because the CPU was expensive.
Also, I wanted to ask: is there any anecdotal or scientific evidence that overclocking shortens the life of CPUs/GPUs? It makes sense that it would, but I'm wondering if anyone can speak to this at all.
Can someone recommend a good program to monitor GPU temps? I poked around the nVidia control panel but couldn't find the temp. The older control panel was easier to navigate, and I know it included a temp gauge.
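In the meantime, in case it helps anyone else, I've been thinking about just polling the driver from a small script instead. This is only a rough sketch, and it assumes the nvidia-smi tool ships with the driver and supports the temperature query:

import subprocess
import time

# Rough sketch: print the GPU core temperature once a minute via nvidia-smi.
# Assumes nvidia-smi is on the PATH and the driver exposes temperature.gpu.
while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(f"GPU temp: {result.stdout.strip()} C")
    time.sleep(60)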
			