Just wondering if ATi ever acknowledged that their cards run (too) hot and will be addressing that with the R600 series. I'm curious to know if the reason the X1900 series ran so hot was because they HAD to clock them that high to compete with the 7900 series. I'm guessing if they could have gotten away with lower clocks that ran cooler they would have.
I'm curious how much temperature improvement will be seen on the smaller die process.
The 8800 series is 90nm and seems to run fairly warm. Anything above 66C is downright HOT, imho, and guaranteed to shorten the long-term lifespan of the product; the X1900s that ran up toward 80C and higher started showing artifacting and early deaths, which demonstrates that point rather well.
I'd like to see the 80nm (maybe 65nm) R600 be powerful enough hardware-wise to not necessitate the clockspeeds that push it above 65C.
Considering the 7900 series ran relatively cool (my 7900GT KO doesn't pass 60C even under intense load), I see no reason why DX10 cards, with their much more powerful stream processing, GDDR4, and whatever else, should run any hotter while outperforming the DX9-gen cards on a smaller die.