I am trying to find a proof linking Claude Shannon's information entropy, defined by
S = −∑ᵢ pᵢ ln pᵢ,
with Boltzmann's statistical thermodynamic entropy
S = k ln W.
After looking through three physical-chemistry textbooks and searching the web, I can't find the conversion. Am I wrong to think they are interchangeable?
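For concreteness, here is the reduction I expect to hold, assuming W equally probable microstates (so pᵢ = 1/W for every i), with Shannon's sum scaled by Boltzmann's constant k:

```latex
\begin{align*}
S &= -k \sum_{i=1}^{W} p_i \ln p_i
    && \text{(Shannon entropy scaled by } k\text{; Gibbs form)} \\
  &= -k \sum_{i=1}^{W} \frac{1}{W} \ln\frac{1}{W}
    && \text{(equal probabilities, } p_i = 1/W\text{)} \\
  &= k \ln W
    && \text{(Boltzmann's formula)}
\end{align*}
```

Is this the correct way to see the correspondence, or does the equivalence only hold in this equal-probability special case?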