
New 5.2GHz Chip by IBM

Well, light, for one, is one conversion that energy can make. (Which eventually ends up as heat... but whatever, I'm talking about the CPU.)

1) Light is a form of radiative heat. (Spectroscopy was the topic of the experimental work in my dissertation)

While small, there is some electric energy changed into kinetic energy (think electron motion). And then there are the "things I don't understand but believe exist" possibilities for energy conversion; I'm sure something is going on at the quantum level to disperse energy in a non-heat fashion 😀. All added up, they don't compare to the amount of energy that is converted directly into heat.

2) Now you changed the entire definition of the thermodynamic system involved here.

Yes, to be sure, if you partition the system down to just the CPU, or an even smaller subset of the micro-world, you could argue that some of the energy is temporarily converted into potential energy (capacitance in the system) as well as "work" (kinetic energy)... but you haven't balanced the energy equation until you account for what happens to that potential and kinetic energy as you run out the time dimension.

At the end of the day if the power meter registers you using 1800W of power then you - all summed up - added 1800W of heat to your surroundings.
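To put toy numbers on that bookkeeping (the split between paths below is purely illustrative, not a measurement):

```python
# Toy energy balance for a CPU drawing 1800 W. The fractions are made up;
# the point is that every path thermalizes eventually, so power in equals
# heat out once you run out the time dimension.
power_in_w = 1800.0

# Hypothetical split of where the energy momentarily goes before it all
# ends up as heat in the surroundings.
paths = {
    "direct resistive (Joule) heating": 0.97,
    "radiated light, absorbed by the room": 0.02,
    "briefly stored in capacitance / electron motion": 0.01,
}

heat_out_w = sum(power_in_w * frac for frac in paths.values())
print(f"Power in: {power_in_w:.0f} W, heat out: {heat_out_w:.0f} W")
```

However you slice the fractions, as long as they sum to one, the meter reading and the heat delivered to the surroundings match.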
 
Work performed. Switching a transistor is work. Why do people always forget this? Plus electrons in and electrons out.

There is no "switch" in a xtor, no PV work being done, the concept of work done in a xtor is entirely a man-made perception.

A xtor is a resistor, a variable resistance resistor but one nonetheless.
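If a xtor is just a variable resistor, then its dissipation is plain Joule heating, P = V²/R. A quick sketch with made-up values:

```python
# A transistor treated as a variable resistor: whatever the channel
# resistance happens to be, the dissipated power P = V^2 / R is all heat.
def dissipation_w(v_volts: float, r_ohms: float) -> float:
    """Joule heating in a resistor: P = V^2 / R."""
    return v_volts ** 2 / r_ohms

v = 1.2  # illustrative core voltage in volts
for r in (10.0, 100.0, 1000.0):  # channel resistance as the "switch" varies
    print(f"R = {r:6.1f} ohm -> P = {dissipation_w(v, r) * 1000:7.3f} mW")
```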
 
1) Light is a form of radiative heat. (Spectroscopy was the topic of the experimental work in my dissertation)



2) Now you changed the entire definition of the thermodynamic system involved here.

Yes, to be sure, if you partition the system down to just the CPU, or an even smaller subset of the micro-world, you could argue that some of the energy is temporarily converted into potential energy (capacitance in the system) as well as "work" (kinetic energy)... but you haven't balanced the energy equation until you account for what happens to that potential and kinetic energy as you run out the time dimension.

At the end of the day if the power meter registers you using 1800W of power then you - all summed up - added 1800W of heat to your surroundings.

🙂 Well, when you put it THAT way. I've never really considered light to be heat, but I guess it is really.
 
Earth... While no heater is 100% efficient, they are all somewhere in the neighborhood of 99.999% efficient. Heck, a 5W incandescent light bulb is 95% efficient at producing heat, and that isn't even what it was made for!


I know, but it's fun dicking with people who state absolutes.
 
There is no "switch" in a xtor, no PV work being done, the concept of work done in a xtor is entirely a man-made perception.

A xtor is a resistor, a variable resistance resistor but one nonetheless.


Right, the atoms just move without a nudge. I'll have to rewrite the physics laws for that one.
 
No, no, and no. 100% of the energy is converted to heat. There are no moving parts, emission of light, or magnetic forces. I guess at 1800W it could get hot enough to glow, so micro amounts of atomic excitement could happen, but nothing measurable.
 
No, no, and no. 100% of the energy is converted to heat. There are no moving parts, emission of light, or magnetic forces. I guess at 1800W it could get hot enough to glow, so micro amounts of atomic excitement could happen, but nothing measurable.

change in current = magnetic forces being emitted... Just sayin.
 
Unless something moves the sum of all forces (i.e. magnetic) is zero.

A change in current causes a magnetic field to be created. Thus, a change in magnetic field would cause a change in current on any metal thing. So when you turn on your computer, the screw that holds the motherboard in will get a (albeit tiny) current induced on it. The magnetic force may be zero, but the energy from creating and destroying the magnetic field will not be. (in other words, SOME energy is lost from the establishment of a magnetic field).

Now, that energy lost pretty much gets directly converted to heat at the screw, but it isn't on the CPU 😀.

Again, for all intents and purposes, a CPU that consumes 1800W of energy puts out 1800W of heat at the CPU; it is pretty pedantic to argue otherwise.
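For anyone curious how big the magnetic-field side channel even is, the standard stored-field formula E = ½LI², with made-up parasitic values, gives:

```python
# Energy parked in a magnetic field: E = 1/2 * L * I^2. Building and
# collapsing that field every cycle shuffles energy around, but it still
# ends up as heat somewhere. All values below are illustrative guesses.
def field_energy_j(l_henries: float, i_amps: float) -> float:
    return 0.5 * l_henries * i_amps ** 2

l_parasitic = 1e-9  # 1 nH of hypothetical parasitic inductance
i_peak = 2.0        # amps, illustrative
f_switch = 5.2e9    # switching at the chip's 5.2 GHz

per_cycle_j = field_energy_j(l_parasitic, i_peak)
print(f"Per cycle: {per_cycle_j:.1e} J; at 5.2 GHz: {per_cycle_j * f_switch:.1f} W")
```

Even with generous numbers it is a small slice of 1800W, and it thermalizes anyway.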
 
The magnetic force may be zero, but the energy from creating and destroying the magnetic field will not be. (in other words, SOME energy is lost from the establishment of a magnetic field).

No! Energy is never lost. Electrons in the medium always have a magnetic field. When they propagate their forces are applied to other electrons in the surrounding mediums. Their only effect would be to excite those particles into heat.
 
Guys, this has (almost) nothing to do with the 5.2GHz chip; you guys are off-topic. Let's get back on topic before I have to lock this, please. 20 posts talking about the heat of light bulbs and such.
 
No! Energy is never lost. Electrons in the medium always have a magnetic field. When they propagate their forces are applied to other electrons in the surrounding mediums. Their only effect would be to excite those particles into heat.

Lost from the system of just the CPU (maybe transmitted is a better term? Radiated?). The screw heats up ever so slightly, but it is outside of the system of just the CPU. Thus the energy put into the CPU system leaves it by a method other than direct transition into heat.
 
Guys, this has (almost) nothing to do with the 5.2GHz chip; you guys are off-topic. Let's get back on topic before I have to lock this, please. 20 posts talking about the heat of light bulbs and such.

😛 But lightbulb talk is fun! Ok, ok I'll get back on topic.

1800W is extreme for a normal home user. Even if the CPU was affordable, I can't see the cooling being affordable.

I wonder what primary OS IBM uses with these machines.
 
I run my entire office (wife's computer, mine, AC, lights, TV, etc.) from a 15 amp breaker. I guess I won't be buying one of these. 🙂
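The breaker math works out almost exactly (assuming typical North American residential wiring at 120V):

```python
# Maximum continuous draw on a 15 A breaker at an assumed 120 V
# North American residential circuit: P = I * V.
amps = 15
volts = 120
max_watts = amps * volts
print(f"{amps} A x {volts} V = {max_watts} W")  # prints "15 A x 120 V = 1800 W"
```

So the chip alone would eat the entire circuit, never mind the wife's computer, AC, lights, and TV.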
 
I agree, but I'd take a bit more power usage for a performance bump.

Look at all the people in this forum that complain about their i7's; those hunks of junk idle at damn near 100F as it is. Nobody should have to put up with that just to browse the internet, listen to music, watch video, etc.

Heat is viewed as a problem in the computing industry for good reason: part longevity, cooling costs, noise, and energy consumption costs.
 
Look at all the people in this forum that complain about their i7's; those hunks of junk idle at damn near 100F as it is. Nobody should have to put up with that just to browse the internet, listen to music, watch video, etc.

Heat is viewed as a problem in the computing industry for good reason: part longevity, cooling costs, noise, and energy consumption costs.

100F? That is quite cool, really (37C for everyone else). Even laptops will run at that temperature.
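For the unit conversion, the standard formula:

```python
# Fahrenheit to Celsius: C = (F - 32) * 5/9.
def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(f"100 F = {f_to_c(100):.1f} C")  # prints "100 F = 37.8 C"
```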

As for the tasks you listed. It all depends. The codec used in your video, for example, could very well be computationally complex to decode.
 
Look at all the people in this forum that complain about their i7's; those hunks of junk idle at damn near 100F as it is. Nobody should have to put up with that just to browse the internet, listen to music, watch video, etc.

Heat is viewed as a problem in the computing industry for good reason: part longevity, cooling costs, noise, and energy consumption costs.


If the government were not involved in the energy efficiency of appliances, etc., you wouldn't be seeing such a concern. Also, since the advent of lead-free solder, heat has become a more obsessed-over thing than it once was.

I remember when CPUs had passive air coolers and they were hot enough to cook eggs on.

But that's for another day.
 
No, no, and no. 100% of the energy is converted to heat. There are no moving parts, emission of light, or magnetic forces. I guess at 1800W it could get hot enough to glow, so micro amounts of atomic excitement could happen, but nothing measurable.


No, no, no, no, no. Heat radiation is a byproduct of energy saturation.

WTF are they teaching in physics these days?
 
Look at all the people in this forum that complain about their i7's; those hunks of junk idle at damn near 100F as it is. Nobody should have to put up with that just to browse the internet, listen to music, watch video, etc.

Heat is viewed as a problem in the computing industry for good reason: part longevity, cooling costs, noise, and energy consumption costs.

HUH?

37C is pretty cool... oh the humanity!!11!!
 
If the government were not involved in the energy efficiency of appliances, etc., you wouldn't be seeing such a concern. Also, since the advent of lead-free solder, heat has become a more obsessed-over thing than it once was.

I remember when CPUs had passive air coolers and they were hot enough to cook eggs on.

But that's for another day.

Shens... the way you get on you can't be more than 16... a fleabag of the CPU subforums if you will...

At least Fleabag had the sense to not call out people. Keep it to PMs, please.
-ViRGE
 