Heat dissipation review?

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).
 

IlllI

Diamond Member
Feb 12, 2002
4,927
11
81
Well... wouldn't it be safe to say that how much power a video card uses is related to how much heat it puts out?

 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.
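The back-of-the-envelope claim above can be sanity-checked with a short sketch. All numbers below are illustrative assumptions (room size, air properties), not measurements, and heat escaping through walls and ventilation is ignored, so the result is an upper bound:

```python
# Rough upper bound on how fast an extra ~30 W (midpoint of the 20-40 W
# estimate above) would warm the air in a sealed room. Assumed, not
# measured: room size and air properties; wall/ventilation losses ignored.

AIR_DENSITY = 1.2   # kg/m^3, air at roughly room temperature
AIR_CP = 1005.0     # J/(kg*K), specific heat capacity of air

def room_warming_rate(extra_watts, room_volume_m3):
    """Degrees Celsius per hour gained by the room air, losses ignored."""
    air_mass_kg = AIR_DENSITY * room_volume_m3
    joules_per_hour = extra_watts * 3600.0
    return joules_per_hour / (air_mass_kg * AIR_CP)

# A modest ~30 m^3 room (about 3.5 m x 3.5 m x 2.5 m):
print(f"{room_warming_rate(30.0, 30.0):.1f} C per hour")  # -> 3.0 C per hour
```

In a real room the walls constantly leak heat, so the steady temperature rise is much smaller than this no-loss bound; whether an extra 30 W is noticeable depends heavily on the room.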
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.

Wrong, the cooler on my HIS Turbo is much better. It keeps my system temperature down by 7-12 C, sometimes even 14 C, compared to my 8800 GTS system.
 

legoman666

Diamond Member
Dec 18, 2003
3,628
1
0
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.

Uh no. ALL of the heat is transferred to the air no matter what. Where do you think it goes if not into the air?
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: legoman666
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.

Uh no. ALL of the heat is transferred to the air no matter what. Where do you think it goes if not into the air?

That is why some graphics cards run at 90 C. Not all the heat goes into the air; some of it remains trapped in the graphics chip.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.

Wrong, the cooler on my HIS Turbo is much better. It keeps my system temperature down by 7-12 C, sometimes even 14 C, compared to my 8800 GTS system.

You are thinking that the hotter the GPU is, the hotter the room is. Well, no, it's not like that. The more efficient the cooler is at heat dissipation, the hotter the air around it will be. So if the 3870 stays cooler, it means that more heat is taken away from the GPU and transferred to the ambient air. Theoretically, the 3870 should be the one that heats the room more than the GTS, because the Nvidia card seems to hold the heat on itself "better" than the ATI.

It's hard though to make a comparison between those two, since they have different TDPs.
 

legoman666

Diamond Member
Dec 18, 2003
3,628
1
0
Originally posted by: error8
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.

Wrong, the cooler on my HIS Turbo is much better. It keeps my system temperature down by 7-12 C, sometimes even 14 C, compared to my 8800 GTS system.

You are thinking that the hotter the GPU is, the hotter the room is. Well, no, it's not like that. The more efficient the cooler is at heat dissipation, the hotter the air around it will be. So if the 3870 stays cooler, it means that more heat is taken away from the GPU and transferred to the ambient air. Theoretically, the 3870 should be the one that heats the room more than the GTS, because the Nvidia card seems to hold the heat on itself "better" than the ATI.

It's hard though to make a comparison between those two, since they have different TDPs.

No. No. No. The hotter the cooler is, the more inefficient it is at removing heat. Heat does not get trapped in a GPU. You need to take a heat transfer class or something.

A GPU with a TDP of 100 W running at 90 C, or the same GPU with a different cooler running at 65 C, is still putting the same amount of heat into the air.
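The steady-state point above can be sketched in a few lines. The thermal-resistance figures and the simple T = T_ambient + P x R_th model are illustrative assumptions, not specs for any real card or cooler:

```python
# At steady state a chip dumps exactly its power draw into the air,
# whatever the cooler. The cooler's thermal resistance (R_th, in C per
# watt) only sets how hot the die runs. Values below are made up.

AMBIENT_C = 25.0

def gpu_temp(power_w, r_th_c_per_w):
    """Steady-state die temperature: T = T_ambient + P * R_th."""
    return AMBIENT_C + power_w * r_th_c_per_w

def heat_into_room_w(power_w):
    """Heat flowing into the room at steady state equals power draw."""
    return power_w

TDP = 100.0  # W
print(gpu_temp(TDP, 0.65))    # weaker cooler  -> 90.0 (C)
print(gpu_temp(TDP, 0.40))    # better cooler  -> 65.0 (C)
print(heat_into_room_w(TDP))  # -> 100.0 W in both cases
```

A better cooler lowers the die temperature, but the wattage dumped into the room stays the same; only lowering the card's power draw changes that.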
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: legoman666
Originally posted by: error8
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.

Wrong, the cooler on my HIS Turbo is much better. It keeps my system temperature down by 7-12 C, sometimes even 14 C, compared to my 8800 GTS system.

You are thinking that the hotter the GPU is, the hotter the room is. Well, no, it's not like that. The more efficient the cooler is at heat dissipation, the hotter the air around it will be. So if the 3870 stays cooler, it means that more heat is taken away from the GPU and transferred to the ambient air. Theoretically, the 3870 should be the one that heats the room more than the GTS, because the Nvidia card seems to hold the heat on itself "better" than the ATI.

It's hard though to make a comparison between those two, since they have different TDPs.

No. No. No. The hotter the cooler is, the more inefficient it is at removing heat. Heat does not get trapped in a GPU. You need to take a heat transfer class or something.

A GPU with a TDP of 100 W running at 90 C, or the same GPU with a different cooler running at 65 C, is still putting the same amount of heat into the air.

Yes, you're right and I'm stupid. :(
If the cooler reaches the same temperature as the chip, it stops being effective at transferring heat away from the chip. So a cooler that keeps the card at 90 C is transferring heat to the ambient air too slowly; it heats up and, in turn, keeps the GPU hot.


 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: error8
Originally posted by: legoman666
Originally posted by: error8
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.

Wrong, the cooler on my HIS Turbo is much better. It keeps my system temperature down by 7-12 C, sometimes even 14 C, compared to my 8800 GTS system.

You are thinking that the hotter the GPU is, the hotter the room is. Well, no, it's not like that. The more efficient the cooler is at heat dissipation, the hotter the air around it will be. So if the 3870 stays cooler, it means that more heat is taken away from the GPU and transferred to the ambient air. Theoretically, the 3870 should be the one that heats the room more than the GTS, because the Nvidia card seems to hold the heat on itself "better" than the ATI.

It's hard though to make a comparison between those two, since they have different TDPs.

No. No. No. The hotter the cooler is, the more inefficient it is at removing heat. Heat does not get trapped in a GPU. You need to take a heat transfer class or something.

A GPU with a TDP of 100 W running at 90 C, or the same GPU with a different cooler running at 65 C, is still putting the same amount of heat into the air.

Yes, you're right and I'm stupid. :(
If the cooler reaches the same temperature as the chip, it stops being effective at transferring heat away from the chip. So a cooler that keeps the card at 90 C is transferring heat to the ambient air too slowly; it heats up and, in turn, keeps the GPU hot.

I have to read more books, damn it.

 

legoman666

Diamond Member
Dec 18, 2003
3,628
1
0
Originally posted by: error8
Originally posted by: legoman666
Originally posted by: error8
Originally posted by: Zstream
Originally posted by: error8
Originally posted by: Zstream
As the topic suggests, does anyone know of a review with heat dissipation in mind? I have a 3870 and an 8800 GTS, with the GTS pushing the limits on heat (it keeps my room super hot).

So the GTS keeps your room hot, but the 3870 doesn't?

Maybe the 8800 GTS has a more efficient cooler, so that most of the heat generated by the GPU is transferred to the surrounding air (if you have the stock GTS).

The stock-cooled ATI seems to have a weaker cooler, so a big part of the heat from the GPU is kept on the GPU itself and only a small part is transferred to the ambient air. Even if both of these suppositions are right, I doubt you can really tell the difference in the air temperature of your room. We are talking about a heat flux of only 20-40 W that the GTS probably generates over and above the ATI card, and I don't think that translates into even one degree Celsius of room temperature.

Wrong, the cooler on my HIS Turbo is much better. It keeps my system temperature down by 7-12 C, sometimes even 14 C, compared to my 8800 GTS system.

You are thinking that the hotter the GPU is, the hotter the room is. Well, no, it's not like that. The more efficient the cooler is at heat dissipation, the hotter the air around it will be. So if the 3870 stays cooler, it means that more heat is taken away from the GPU and transferred to the ambient air. Theoretically, the 3870 should be the one that heats the room more than the GTS, because the Nvidia card seems to hold the heat on itself "better" than the ATI.

It's hard though to make a comparison between those two, since they have different TDPs.

No. No. No. The hotter the cooler is, the more inefficient it is at removing heat. Heat does not get trapped in a GPU. You need to take a heat transfer class or something.

A GPU with a TDP of 100 W running at 90 C, or the same GPU with a different cooler running at 65 C, is still putting the same amount of heat into the air.

Yes, you're right and I'm stupid. :(
If the cooler reaches the same temperature as the chip, it stops being effective at transferring heat away from the chip. So a cooler that keeps the card at 90 C is transferring heat to the ambient air too slowly; it heats up and, in turn, keeps the GPU hot.

Exactly, the rate of heat transfer is related to the temperature difference between the heatsink and the ambient air. The greater the difference, the faster heat transfers from the sink to the air.
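That relationship is Newton's law of cooling, and it can be sketched in code. The thermal-resistance value below is hypothetical:

```python
# Heat flow from heatsink to air scales with the temperature gap:
# Q = (T_sink - T_ambient) / R_th. The R_th value here is made up.

def heat_flow_w(sink_temp_c, ambient_c, r_th_c_per_w):
    """Watts moved from the heatsink into the air for a given gap."""
    return (sink_temp_c - ambient_c) / r_th_c_per_w

# Same heatsink (R_th = 0.5 C/W): a widening gap means more heat flow.
for sink in (35.0, 55.0, 90.0):
    print(sink, heat_flow_w(sink, 25.0, 0.5))  # 20.0, 60.0, 130.0 W

# The sink warms until its outflow matches the chip's power output;
# that balance point is the steady-state temperature.
```

This is why a card that idles at a low temperature sheds little heat, while the same card under load runs hotter and sheds correspondingly more.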