Satellites in space: how do the electronics inside them keep cool?

At work we had a short discussion about how that works.
I was thinking about it, and although it is very cold in space, it is a vacuum, so convection and conduction are not going to work. It must be radiation, i.e. emitting IR. But how do they do that?
They must convert heat to IR radiation as efficiently as possible.

Also, I figured that to keep the satellite cool against radiation from the sun, they use IR-reflecting material, which must be the gold foil the satellite builders are using.

And how do the solar panels cope with the radiation from the sun?
The solar panels are much more efficient in space than in the atmosphere on Earth, at least that is what I read. I imagine this is because there is no atmosphere to absorb certain EM wavelengths radiated by the sun. But how do the solar panels in space cope with the IR spectrum? Do the solar panels become very hot?
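As a rough sanity check on that last question (my own back-of-the-envelope, not something from the thread): an idealized flat panel in full sunlight at Earth's distance settles at the temperature where absorbed sunlight equals radiated IR. Assuming absorptivity and emissivity of 1 and emission from both faces:

```python
# Equilibrium temperature of an idealized flat plate facing the Sun
# at 1 AU: absorbed solar power = radiated IR power (Stefan-Boltzmann).
# Assumptions (mine): absorptivity a = emissivity e = 1, the plate
# emits from both faces, and no heat is conducted elsewhere.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1361.0         # solar constant at 1 AU, W/m^2
a = 1.0            # solar absorptivity
e = 1.0            # IR emissivity

# a*S*A = e*SIGMA*T**4 * 2*A  ->  T = (a*S / (2*e*SIGMA))**0.25
T = (a * S / (2 * e * SIGMA)) ** 0.25
print(T, T - 273.15)   # ~331 K, about 58 C: hot, but survivable
```

So yes, a panel in full sun runs quite warm, and photovoltaic efficiency drops as temperature rises, which is one reason designers care about the absorptivity/emissivity ratio of spacecraft surfaces.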
 
Maybe in the future, we will use LEDs to turn heat into IR radiation. :)

http://physicsworld.com/cws/article/news/2012/mar/08/led-converts-heat-into-light

A light-emitting diode (LED) that emits more light energy than it consumes in electrical energy has been unveiled by researchers in the US. The device – which has a conventional efficiency of greater than 200% – behaves as a kind of optical heat pump that converts lattice vibrations into infrared photons, cooling its surroundings in the process. The possibility of such a device was first predicted in 1957, but a practical version had proved impossible to create until now. Potential applications of the phenomenon include energy-efficient lighting and cryogenic refrigeration.

Cool LEDs

The energy of photons emitted by an LED is dictated by the band gap of the semiconductor used – the energy required to make an electron–hole pair. When an electron and hole recombine in a radiative process, a photon carries away the extra energy. The voltage across the LED creates the electron–hole pairs but its value does not affect the photon energy, since the semiconductor's band gap is a permanent feature of the material.

However, it is possible for the individual emitted photons to have energies that are different to the band gap. The vast majority of electron–hole recombinations actually result in the production of heat, which is absorbed by the semiconductor in the form of quantized lattice vibrations called phonons. These vibrations create a heat reservoir that can then boost the energy of photons produced by radiative recombination. In 1957 Jan Tauc at the Institute of Technical Physics in Prague pointed out that, since this provided a mechanism for radiation to remove heat from a semiconductor lattice, there was no barrier in principle to an LED being more than 100% efficient, in which case it would actually cool its surroundings.
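A quick way to see the energy balance (my own sketch; apart from the 70 µV bias quoted below, the numbers are illustrative assumptions, not from the article): each injected electron–hole pair costs the electrical energy qV, but each emitted photon carries roughly the band-gap energy, so the wall-plug efficiency is the photon yield times the ratio of the two.

```python
# Energy balance for an LED biased at qV far below the photon energy.
# Each injected carrier pair costs q*V of electrical energy; each
# emitted photon carries ~E_photon, with the shortfall supplied by
# lattice heat (phonons). E_photon and y below are my illustrative
# assumptions, not values from the article.

V = 70e-6        # bias voltage, V (from the article)
E_photon = 0.5   # photon energy, eV (assumed mid-IR LED)
y = 3.3e-4       # fraction of carriers that yield a photon (assumed,
                 # consistent with the article's "less than 0.1%")

# Efficiency = (photons * E_photon) / (carriers * q*V).
# With both energies in eV, q*V is numerically just V.
eta = y * E_photon / V
print(eta)       # ~2.4, i.e. >200% despite the tiny photon yield
```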
Obeys the second law

At first glance this conversion of waste heat to useful photons could appear to violate fundamental laws of thermodynamics, but lead researcher Parthiban Santhanam of the Massachusetts Institute of Technology explains that the process is perfectly consistent with the second law of thermodynamics. "The most counterintuitive aspect of this result is that we don't typically think of light as being a form of heat. Usually we ignore the entropy and think of light as work," he explains. "If the photons didn't have entropy (i.e. if they were a form of work, rather than heat), this would break the second law. Instead, the entropy shows up in the outgoing photons, so the second law is satisfied."

Despite the soundness of the physics, over the past five decades nobody had managed to demonstrate an LED actually cooling its surroundings. One way researchers tried to maximize the number of photons produced was to increase the bias voltage across the LED, but this also increases the heat produced through non-radiative recombinations.

So, Santhanam and colleagues did the exact opposite and reduced the bias voltage to just 70 µV. They also heated the LED to 135 °C to provide more lattice heat. In this regime, less than 0.1% of the electrons passing through the LED produced a photon. However, when the researchers measured the minute infrared output of the device, they found 70 pW being emitted while only 30 pW of electrical power was being consumed, an efficiency of more than 200%. This happens because as the voltage approaches zero, both light output and power dissipation also vanish. However, the power dissipated is proportional to the square of the current, whereas light output is proportional to the current – halving the bias voltage therefore doubles the efficiency.
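Putting rough numbers on that (my own check; the linear-diode approximation for qV ≪ kT is my assumption, the rest is from the article):

```python
# Back-of-the-envelope check of the article's figures, plus the
# scaling argument. Assumption (mine): an ideal diode with qV << kT
# has current approximately linear in V.

k_over_q = 8.617e-5    # Boltzmann constant / electron charge, V/K
T = 135 + 273.15       # diode temperature, K
V = 70e-6              # bias voltage, V

print(V / (k_over_q * T))   # ~0.002, so qV << kT: linear regime

P_elec = 30e-12        # electrical power consumed, W (article)
P_light = 70e-12       # IR power emitted, W (article)
print(P_light / P_elec)     # ~2.33 -> more than 200% efficient

# With I ~ V: P_elec = I*V ~ V**2, while P_light ~ I ~ V, so the
# efficiency P_light/P_elec ~ 1/V, and halving V doubles it.
```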
Important breakthrough

One possible application of the effect is a refrigeration device that removes heat in the form of light. Jukka Tulkki of Aalto University in Finland, an expert in this field, told physicsworld.com, "I think this is a historically important breakthrough…that could eventually lead to more useful and technologically relevant applications." However, he cautions that the cooling power of this particular device is extremely low and not great enough for any practical applications.

Santhanam, meanwhile, believes the principle may find applications in fields other than refrigeration. "My personal opinion is that it's more likely to be useful as a light source," he says. "Refrigerators are mostly useful when they are high power. Light sources, however, are used in all kinds of ways. In particular, light sources used for spectroscopy and communication don't necessarily need to be very bright. They just need to be bright enough to be clearly distinguishable from some background noise."

The research is published in Physical Review Letters.
 

Railgun

In a lot of cases, they may need to keep warm rather than cool. These electronics are not directly exposed to sunlight. There was a recent comment regarding the Philae lander about this very thing.
 

Jeff7

Railgun said:
In a lot of cases, they may need to keep warm rather than cool. These electronics are not directly exposed to sunlight. There was a recent comment regarding the Philae lander about this very thing.
I know the Mars Exploration Rovers had a special "warm box" for the electronics. That keeps them safe from the extreme temperature swings encountered during the day/night and summer/winter cycles on Mars, and just gives a more normal operating environment.


Something to check out would be some of the probes to Venus and Mercury.
(Venus Express and Messenger are two recent ones.)

Messenger's got pretty small solar panels, and it's got a sun shield in the front. I'd guess that if it has a problem with attitude control and tumbles the wrong way, it'd get toasted to death.
Also compare Messenger's solar panels to Rosetta's.

I think that even happened to an orbiter way out at Mars.
Checking...


Yes, here we go.

"The spacecraft reoriented to an angle that exposed one of two batteries carried on the spacecraft to direct sunlight," read a NASA press release describing the most likely cause of the failure.

"This caused the battery to overheat and ultimately led to the depletion of both batteries. Incorrect antenna pointing prevented the orbiter from telling controllers its status, and its programmed safety response did not include making sure the spacecraft orientation was thermally safe."
All the way out at Mars and a component still overheated from improper exposure to sunlight.



Out in space, radiative cooling is all you've got. In some scenarios here on Earth, if you're including radiative effects along with your convection or conduction calculations, your design is probably too sensitive. ;) But it depends greatly on the environment.
In some cases, it can make a big difference.

Like any heat-transfer equation, it works with the temperature difference between one thing and another. But radiation depends on temperature to the fourth power. The fourth powers are multiplied by a very small constant, but when you're up against something raised to the fourth power, it doesn't take long for that difference to start to matter, and that applies whether you're trying to radiate energy into the cold microwave background of space or receiving energy from a very hot sphere of plasma.
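To put numbers on that (my own sketch, not from the post above):

```python
# Net radiative heat flux between a gray surface at T_hot and
# surroundings at T_cold (Stefan-Boltzmann). The constant is tiny,
# but the fourth powers grow fast.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_flux(T_hot, T_cold, emissivity=1.0):
    """Net radiated flux in W/m^2; temperatures in kelvin."""
    return emissivity * SIGMA * (T_hot**4 - T_cold**4)

# A 300 K radiator panel facing deep space (~3 K background):
print(net_flux(300.0, 3.0))     # ~459 W/m^2

# The same panel facing 290 K surroundings on Earth: an order of
# magnitude less, which is why convection usually dominates here.
print(net_flux(300.0, 290.0))   # ~58 W/m^2
```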

One example is why the desert can get so cold on a clear night. Its radiation environment is the atmosphere and the background of space. Both do put out EM radiation, as does anything with a temperature above absolute zero, or in other words, everything. But they're not putting out as much radiation as the stuff in the desert is, so the desert surface can lose energy into the sky and get quite cold.
Same with Mercury. The night side is quite cold. It rotates out of the sunlight with a high temperature, and now you've got a very high ΔT versus the background of space. (That's T_hot⁴ − T_cold⁴, in kelvin.) It's going to start shedding energy straight away.
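For scale (my own figure; the ~700 K dayside temperature is a commonly quoted value, used here just as an illustration):

```python
# A Mercury-like surface that rotates into darkness at ~700 K,
# radiating into the ~3 K cosmic background (illustrative numbers).
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
print(SIGMA * (700.0**4 - 3.0**4))   # ~13.6 kW/m^2: it sheds heat fast
```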
 