Was just noodling about and didn't know the answer to this, nor did a quick Google search provide one.
Most radioactive decay is unaffected by temperature, since it's a nuclear process and the energy of the electrons surrounding the nucleus doesn't really matter.
But in K-capture, an electron is 'absorbed' into the nucleus. Intuitively it seems to me that cooling the atoms down might raise the chance of an electron being captured, since it would sit in a lower energy state and therefore spend more time closer to the nucleus. Conversely, heating them up would push the electrons into higher energy states and, if hot enough, drive them out of the immediate vicinity altogether (though that mostly affects the outer electrons; I assume it's the inner shells that get captured most often). Anyway, the question: can temperature affect the rate of K-capture, and if so, how big an effect are we talking about, how much of a temperature change is needed to see it, etc.?
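As a rough sanity check on the scales involved (just a back-of-envelope sketch, not an answer to the question): since K-capture draws on the innermost electrons, one way to gauge whether heat could plausibly disturb them is to compare the thermal energy kT to the K-shell binding energy. The binding energies below are standard K-edge values for two well-known electron-capture nuclides; everything else is plain arithmetic, and the variable names are mine.

```python
# Compare thermal energy kT to K-shell binding energies, to see how
# hot things would have to get before heat even registers on the
# scale of the inner electrons. This is only a scale comparison,
# not a decay-rate calculation.

K_B_EV_PER_K = 8.617e-5  # Boltzmann constant in eV/K

# Approximate K-shell (1s) binding energies in eV (standard K-edge values)
k_shell_binding = {
    "Be-7":  111.5,   # beryllium K edge ~ 111.5 eV
    "Fe-55": 7112.0,  # iron K edge ~ 7.1 keV
}

for temp_k in (300, 3_000, 30_000, 300_000):
    kt = K_B_EV_PER_K * temp_k
    print(f"T = {temp_k:>7,} K  ->  kT = {kt:10.4f} eV")
    for nuclide, binding_ev in k_shell_binding.items():
        print(f"    {nuclide}: kT / E_K = {kt / binding_ev:.2e}")
```

Even at 300,000 K, kT is only ~26 eV, still well below the K-shell binding energy of the lightest electron-capture nuclide, which at least suggests why ordinary temperature changes shouldn't touch the inner shells much.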
			