
Poll: How far can hardware be pushed? Is it a question of heat?

Deanodarlo

Senior member
Just thought I'd start a discussion about hardware: will heat ultimately be the deciding factor in how fast computers develop, especially processors?

As MHz counts double every year, heat production is becoming a major issue. Have AMD and Intel shot themselves in the foot by developing their products TOO quickly?

As processors head above the 2GHz mark, solutions to combat heat are going to be harder and more costly to find. Processors have already shot up in power way ahead of the average user's demands, causing relatively new and powerful processors (e.g. 600MHz Durons) to look obsolete - all within a matter of months!

For example:

Most users are happy with 600MHz PCs that can run any business application, surf the web, play games and run DVD drives.

Those that crave power of course help keep the market flowing, although even they must get sick of upgrading every month or so.

Many people who upgrade once or twice a year (including many businesses) will miss out on a particular processor altogether - one that they may well have purchased if the market had been fed MHz more slowly. Is the current rate of development out of control, and will it ultimately hurt the developers?

The present batch of hardware is getting ridiculous - memory with heat sinks, graphics cards with fans, processors requiring ever-greater cooling solutions, system cases with three or more fans fitted, the list goes on.

How long will it be before we can heat our rooms with PCs alone?

A simple fact is faster means hotter - it's a law of the computer chip. Perhaps AMD and Intel's war will end sooner rather than later, all because they've released chips at an alarming rate... I wonder?
 
AMD will be using a new silicon structure which will let chips run cooler.
At the moment hardware is speeding ahead of software. At least that way it's nice for programmers to know they have headroom for whatever they'll dream up.
 
Dean, you bring up a good point, but I'm not sure I follow your argument (similar to Fk's post). Are you saying that designs are too fast for their own good, or that heat will be the eventual limiter? It sounds like you are saying faster processors are pointless (since we have fast enough computers already), but also that faster processors won't be possible because heat will stop forward progress.



<< A simple fact is faster means hotter - its a law of a computer chip. >>



I disagree... it's not a simple fact. Today's 733MHz Coppermines use less power than the older 500MHz Katmais. In fact, the relationship between power dissipation, performance and silicon process generation is anything but simple.
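To make that concrete, CMOS dynamic power scales roughly as P ≈ a·C·V²·f, so a newer chip at a higher clock can still dissipate less if its core voltage and switched capacitance drop with the process shrink. The sketch below uses that textbook formula; the capacitance and voltage figures are illustrative assumptions chosen to be in the right ballpark for a 0.25µm Katmai-class and 0.18µm Coppermine-class part, not measured values.

```python
# Rough illustration of CMOS dynamic power: P ~ a * C * V^2 * f.
# Parameter values are assumptions for illustration, not datasheet numbers.

def dynamic_power(c_eff_nf, v_core, f_mhz, activity=1.0):
    """Dynamic power in watts, given effective switched capacitance (nF),
    core voltage (V), clock frequency (MHz), and an activity factor."""
    return activity * (c_eff_nf * 1e-9) * v_core**2 * (f_mhz * 1e6)

# Assumed, illustrative parameters:
katmai_like = dynamic_power(c_eff_nf=14, v_core=2.0, f_mhz=500)       # 0.25um-class
coppermine_like = dynamic_power(c_eff_nf=10, v_core=1.65, f_mhz=733)  # 0.18um-class

print(f"Katmai-like (500 MHz):     {katmai_like:.1f} W")
print(f"Coppermine-like (733 MHz): {coppermine_like:.1f} W")
```

Despite a roughly 47% higher clock, the lower voltage (which enters squared) and smaller switched capacitance can leave the newer chip dissipating less power overall, which is why "faster means hotter" doesn't hold across process generations.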

In my opinion, power dissipation is a simple result of having a lot more transistors while not actually making individual chip designers responsible for their own power budget. In current and previous high performance CPU designs, heat was a non-issue. No one cared. Now designs are pushing the cooling capability of conventional (fan and heatsink) cooling techniques and designers are starting to take the issue seriously. I don't think that heat will be the limiter to higher performance designs. Architectures like the Intel XScale (StrongARM) and Transmeta Crusoe show that performance and low power are not necessarily opposites - if the designers consider low power to be important. This hasn't been the case in the past, but will be in the future. Similar to other previously unimportant/inconsequential issues on chips such as wire inductance, wire resistance and cross-coupling noise, heat is becoming an important concern for future designs. But I don't think it will limit overall computer performance.



Patrick Mahoney
Microprocessor Design Engineer
Intel Corp.
 
I agree with pm; you can't say that the companies shouldn't develop better processors because they aren't needed. They will be needed eventually, and it is better to have them early than to have them late. Also, I don't think heat will limit hardware. Solutions to the heat problem will be developed as the chip companies run into more and more heat problems and realize they need a solution.
 
Thanks pm for filling me in - I'm no expert in the computer field and I can see now that I was talking from a position of ignorance.

I respect you for setting me straight on the issue of future chip design - this is how inexperienced people like me learn.
 


<< Thanks pm for filling me in - I'm no expert in the computer field and I can see now that I was talking from a position of ignorance. I respect you for setting me straight on the issue of future chip design - this is how inexperienced people like me learn. >>


I apologize for coming on harshly in my reply. I didn't intend to. I was just typing and I didn't realize how much of a slam it came out as until I read your reply. Like I said, you brought up a good point, Dean. Heat is a definite concern. I just think, like so many other things that have started as non-issues and are now design considerations, that future designs will have strongly enforced power envelopes to design within and that this will take care of the problem.
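A "strongly enforced power envelope" in practice just means each block's power estimate gets summed and checked against a chip-level limit during design. Here is a toy sketch of that bookkeeping; the block names, wattages, and the 30 W envelope are all made-up numbers for illustration, not any real design's figures.

```python
# Toy sketch of a per-block power budget check, as a design flow might
# enforce it. All names and numbers below are invented for illustration.
BUDGET_W = 30.0
block_power = {"core": 14.2, "caches": 6.8, "fpu": 4.1, "io": 2.5}

total = sum(block_power.values())
for name, watts in sorted(block_power.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} {watts:5.1f} W")
status = "within" if total <= BUDGET_W else "OVER"
print(f"total    {total:5.1f} W  ({status} {BUDGET_W} W envelope)")
```

Once a check like this is a hard sign-off criterion rather than an afterthought, designers trade logic for power the same way they already trade it for area or timing.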

In fact, between my posting and my reading yours, an informal conversation started outside my cube about power dissipation (I'd say it was a "heated discussion", but then people would start throwing stuff at me 😛 ). It's on engineers' minds nowadays.
 