Originally posted by: legoman666
Dissipation?
Originally posted by: Zstream
As the topic suggests, does anyone know of a review done with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review done with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).
So the GTS keeps your room hot, but the 3870 doesn't?
Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).
The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU stays on the GPU itself and only a small part is transferred to the ambient air. Even if these two suppositions are right, I doubt you can really tell the difference in the temperature of the air inside your room. We are talking about a heat flow of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates to even one degree Celsius of room temperature.
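For what it's worth, here is a rough back-of-the-envelope Python sketch of that last claim. The room size, air properties, and the wall-loss figure are all assumed numbers for illustration, not measurements of anyone's actual room.

```
# Rough numbers behind "an extra 20-40 W won't raise the room temperature by a degree".
# Every figure here is an assumption for illustration, not a measurement.

extra_power_w = 30.0           # assumed extra heat from the GTS vs. the 3870, in watts
room_volume_m3 = 4 * 4 * 2.5   # assumed small room: 4 m x 4 m x 2.5 m
air_density = 1.2              # kg/m^3
air_cp = 1005.0                # J/(kg*K), specific heat of air

# If the room lost no heat at all, how fast would the air warm up?
air_mass_kg = room_volume_m3 * air_density
rate_k_per_hour = extra_power_w / (air_mass_kg * air_cp) * 3600
print(f"Sealed, lossless room: ~{rate_k_per_hour:.1f} C per hour")

# Real rooms leak heat through walls, windows and ventilation.
# Assume the room sheds roughly 50 W per degree of indoor-outdoor difference.
room_loss_w_per_k = 50.0
steady_rise = extra_power_w / room_loss_w_per_k
print(f"Steady-state rise from the extra {extra_power_w:.0f} W: ~{steady_rise:.1f} C")
```

With these assumed numbers the steady-state difference stays around half a degree, which is roughly the claim above; only a perfectly sealed room would keep warming up indefinitely.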
Originally posted by: legoman666
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review done with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).
So the GTS keeps your room hot, but the 3870 doesn't?
Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).
The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU stays on the GPU itself and only a small part is transferred to the ambient air. Even if these two suppositions are right, I doubt you can really tell the difference in the temperature of the air inside your room. We are talking about a heat flow of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates to even one degree Celsius of room temperature.
Uh no. ALL of the heat is transferred to the air no matter what. Where do you think it goes if not into the air?
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review done with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).
So the GTS keeps your room hot, but the 3870 doesn't?
Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).
The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU stays on the GPU itself and only a small part is transferred to the ambient air. Even if these two suppositions are right, I doubt you can really tell the difference in the temperature of the air inside your room. We are talking about a heat flow of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates to even one degree Celsius of room temperature.
Wrong, the cooler on my HIS Turbo is much better. It keeps my system temp down by 7-12C, sometimes even 14C, compared to my 8800 GTS system.
Originally posted by: error8
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review done with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).
So the GTS keeps your room hot, but the 3870 doesn't?
Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).
The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU stays on the GPU itself and only a small part is transferred to the ambient air. Even if these two suppositions are right, I doubt you can really tell the difference in the temperature of the air inside your room. We are talking about a heat flow of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates to even one degree Celsius of room temperature.
Wrong, the cooler on my HIS Turbo is much better. It keeps my system temp down by 7-12C, sometimes even 14C, compared to my 8800 GTS system.
You are thinking that the hotter the GPU is, the hotter the room is. Well no, it's not like that. The more efficient the cooler is at heat dissipation, the hotter the air around it will be. So if the 3870 stays cooler, it means more heat is being taken away from the GPU and transferred to the ambient air. Theoretically, the 3870 should be the one that heats the room more than the GTS, because the Nvidia card seems to hold the heat on itself "better" than the ATI.
It's hard to make a comparison between the two, though, since they have different TDPs.
Originally posted by: legoman666
Originally posted by: error8
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review done with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).
So the GTS keeps your room hot, but the 3870 doesn't?
Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).
The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU stays on the GPU itself and only a small part is transferred to the ambient air. Even if these two suppositions are right, I doubt you can really tell the difference in the temperature of the air inside your room. We are talking about a heat flow of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates to even one degree Celsius of room temperature.
Wrong, the cooler on my HIS Turbo is much better. It keeps my system temp down by 7-12C, sometimes even 14C, compared to my 8800 GTS system.
You are thinking that the hotter the GPU is, the hotter the room is. Well no, it's not like that. The more efficient the cooler is at heat dissipation, the hotter the air around it will be. So if the 3870 stays cooler, it means more heat is being taken away from the GPU and transferred to the ambient air. Theoretically, the 3870 should be the one that heats the room more than the GTS, because the Nvidia card seems to hold the heat on itself "better" than the ATI.
It's hard to make a comparison between the two, though, since they have different TDPs.
No. No. No. The hotter the cooler is, the more inefficient it is at removing heat. Heat does not get trapped in a GPU. You need to take a heat transfer class or something.
A GPU with a TDP of 100W running at 90C and the same GPU with a different cooler running at 65C are still putting the same amount of heat into the air.
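To put some numbers on that, here is a minimal steady-state Python sketch. The 100 W figure and the two thermal resistances are made-up values for illustration, not specs for either card.

```
# Steady state: the cooler dumps exactly as much heat into the air as the GPU
# produces, no matter how hot the die runs. The die temperature only reflects
# the cooler's thermal resistance. All numbers are assumed for illustration.

gpu_power_w = 100.0   # assumed heat output under load, roughly the TDP
t_ambient_c = 25.0    # assumed case air temperature

coolers = {
    "weak cooler":   0.65,   # assumed thermal resistance, degrees C per watt
    "strong cooler": 0.40,
}

for name, r_theta in coolers.items():
    die_temp_c = t_ambient_c + gpu_power_w * r_theta
    # Conservation of energy: at equilibrium every watt ends up in the air.
    print(f"{name}: die at {die_temp_c:.0f} C, {gpu_power_w:.0f} W into the room")
```

With these assumed resistances the weak cooler lands near 90C and the strong one near 65C, yet both dump the same 100 W into the room air.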
Originally posted by: error8
Originally posted by: legoman666
Originally posted by: error8
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review done with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).
So the GTS keeps your room hot, but the 3870 doesn't?
Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).
The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU stays on the GPU itself and only a small part is transferred to the ambient air. Even if these two suppositions are right, I doubt you can really tell the difference in the temperature of the air inside your room. We are talking about a heat flow of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates to even one degree Celsius of room temperature.
Wrong, the cooler on my HIS Turbo is much better. It keeps my system temp down by 7-12C, sometimes even 14C, compared to my 8800 GTS system.
You are thinking that the hotter the GPU is, the hotter the room is. Well no, it's not like that. The more efficient the cooler is at heat dissipation, the hotter the air around it will be. So if the 3870 stays cooler, it means more heat is being taken away from the GPU and transferred to the ambient air. Theoretically, the 3870 should be the one that heats the room more than the GTS, because the Nvidia card seems to hold the heat on itself "better" than the ATI.
It's hard to make a comparison between the two, though, since they have different TDPs.
No. No. No. The hotter the cooler is, the more inefficient it is at removing heat. Heat does not get trapped in a GPU. You need to take a heat transfer class or something.
A GPU with a TDP of 100W running at 90C and the same GPU with a different cooler running at 65C are still putting the same amount of heat into the air.
Yes, you're right and I'm stupid.
If the cooler reaches the same temperature as the chip, it stops being effective at transferring heat away from the chip. So a cooler that keeps the card at 90C is transferring heat to the ambient air too slowly, so it heats up and in turn keeps the GPU hot.
I have to read more books, damn it.
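That is the same idea in transient form: the die warms up until it sits far enough above ambient that the cooler sheds the full input power. Here's a small lumped-capacitance Python sketch; the heat capacity and thermal resistance are invented values for illustration, not figures for any real card.

```
# Lumped warm-up sketch: the die heats until (T - T_ambient) / R_theta equals
# the input power, i.e. until the cooler is shedding everything the GPU makes.
# Heat capacity and thermal resistance are invented values for illustration.

power_w = 100.0               # assumed GPU heat output
t_ambient_c = 25.0            # assumed case air temperature
r_theta = 0.65                # assumed cooler thermal resistance, C per W (settles near 90 C)
heat_capacity_j_per_k = 60.0  # assumed lumped heat capacity of die + heatsink

temp_c = t_ambient_c
dt = 1.0                      # time step, seconds
for second in range(1, 601):
    shed_w = (temp_c - t_ambient_c) / r_theta      # what the cooler removes right now
    temp_c += (power_w - shed_w) * dt / heat_capacity_j_per_k
    if second % 120 == 0:
        print(f"{second:3d} s: die {temp_c:5.1f} C, shedding {shed_w:5.1f} W")
```

A better cooler (smaller r_theta) settles at a lower die temperature, but at equilibrium both would be pushing the same 100 W into the room air.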