thilanliyan
Lifer
- Jun 21, 2005
Originally posted by: chizow
Look at the close-up that you linked...it's quite clear the majority of that heatsink is just housing for the 160mm fan.
Look at the cutout where the fan goes; those look like circular fins, and there are also fins in the exhaust portion. Heatpipes alone do not provide adequate cooling...in fact they provide almost none. They're there to move the heat to a more efficient cooling surface, which would be the fins.
Originally posted by: chizow
But since we have the professor here, maybe you can explain to Thilan why a heatsink dominated by a 160mm fan is not going to provide the full surface area for heat dissipation.
Here's a link to power consumption numbers (these are the original revisions of the consoles I think):
http://www.hardcoreware.net/reviews/review-356-2.htm
The PS3 consumes similar (actually a bit more) power compared to a 360. If you're asserting that the PS3 cooler is actually not that substantial, then the PS3 would suffer from overheating and reliability issues as well. In fact, in newer revisions of the PS3 some cooling capacity was actually taken away because it wasn't needed. Compare this to the 360, where cooling capacity had to be added even with the die shrinks...showing that the cooler MS decided to use was inadequate from the beginning.
Also, there are plenty of power consumption numbers for the 360, I've already provided some in this thread.
I said separately for the GPU and CPU.
I've read the entire article; I guess you didn't read the whole thing or didn't fully understand it:
Microsoft does not design GPUs. ATI does. ATI designed a GPU for Microsoft that spec'd for high lead eutectics (<--***THIS IS FALSE***), so again, it falls back on ATI's faulty fireball of a design. You can say it was Microsoft's fault for using eutectic bumps but again, ATI designs GPUs, Microsoft does not. Expecting Microsoft to respin a piece of silicon they didn't design in the first place is a bit of a joke.
Did you read the part I quoted? Here I'll quote it again since you seem to not understand who was responsible for the high lead eutectics:
"Although ATI made the switch to eutectic bumps with its GPUs in 2005, Microsoft was in charge of manufacturing the Xenos GPU and it was still built with high-lead bumps, just like the failed NVIDIA GPUs."
That can't be very difficult to understand. So if it was ATI's call they would have used eutectic bumps, but Microsoft in all their wisdom decided to use the high-lead bumps...so whose fault is that?
Originally posted by: ArchAngel777
Thilan,
I wouldn't waste your time. I have already seen several of the NV marketers say they would take the opinion of a user over scientific laws. In fact, I think you recall that thread. Clearly, most of these guys don't even understand heat output, heat transfer, and heat dissipation, otherwise they would not babble in ignorance over the subject.
Yeah that was pretty funny actually...it was like a kid covering their ears and yelling "la la la la la la la" just because they didn't want to believe you.
Originally posted by: cmdrdredd
Doesn't matter, hotter = hotter regardless of where the heat goes :roll:
Of course it's better that your 280 runs cooler (and I have to say some of the NV cards have very nice cooling solutions)...that just means it either has a better-designed cooler, or one card is set up for silence rather than ultimate cooling performance.
BUT, as ArchAngel said, there's a difference between the temperature of the card and the actual heat output. What he was referring to was where someone said their room was heated more by a 4870x2 (or was it 4870 x3...I can't remember) than by a GTX 260 SLI (or tri-SLI...again I can't remember)...which is false, as there were links showing a GTX 260 setup consumed more power at load, all of which is dumped into the case/room as heat.
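To put that temperature-vs-heat-output distinction in numbers: at steady state essentially all the electrical power a card draws ends up as heat in the case/room, while the temperature the card *reports* depends on its cooler's thermal resistance. Here's a minimal sketch (all power and thermal-resistance figures below are made-up illustrations, not measurements of any real card):

```python
def gpu_temp(power_w, ambient_c=25.0, r_theta_c_per_w=0.25):
    """Steady-state GPU temperature = ambient + power * thermal resistance.

    r_theta_c_per_w is the cooler's effective thermal resistance in degC/W
    (lower = better cooler). Values here are illustrative only.
    """
    return ambient_c + power_w * r_theta_c_per_w

# Card A: draws MORE power but has a beefier (lower-resistance) cooler
temp_a = gpu_temp(power_w=220, r_theta_c_per_w=0.20)  # 25 + 44 = 69 C

# Card B: draws less power but has a weaker cooler
temp_b = gpu_temp(power_w=180, r_theta_c_per_w=0.30)  # 25 + 54 = 79 C

# Card A reports a LOWER temperature yet dumps MORE heat into the room
# (220 W vs 180 W). Reported temp says nothing about total heat output.
print(temp_a, temp_b)
```

So "my card runs cooler" and "my card heats the room less" are independent claims: the first is about the cooler, the second is about power draw.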