Entropy & Information

Analog

Lifer
I was listening to a podcast of Dr. Leonard Susskind on CBC radio, and he said that physicists have made a connection between entropy and information, i.e. the amount of information in a system is what he calls entropy. I always thought (from thermodynamics) that entropy was a measure of the disorder in a system.

I was wondering if anyone could elaborate on this. How are the two related? If anyone wants a link to the podcast, here it is:

http://www.cbc.ca/quirks/archives/07-08/jan05.html
 

firewolfsm

Golden Member
I think what he means is that with less entropy in a system, everything is more ordered and therefore contains less information (you don't need as much information to describe the quantum state of an ordered crystal).
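
Here's a toy way to see that numerically (a rough Python sketch; the symbol strings are just a stand-in for microstates, not real quantum states):

import math
import random
from collections import Counter

def entropy_bits_per_site(config):
    # Shannon entropy of the symbol frequencies, in bits per site
    counts = Counter(config)
    n = len(config)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
ordered = "A" * 1000                                  # a perfect "crystal"
disordered = "".join(random.choices("ABCD", k=1000))  # a random "alloy"

print(entropy_bits_per_site(ordered))     # zero: fully ordered, nothing to specify
print(entropy_bits_per_site(disordered))  # ~2.0: about 2 bits per site to describe

The ordered string can be described in one line ("all A"); the disordered one needs roughly 2 bits for every site.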

Hmm... you could probably extend this and say that entropy changes can't propagate faster than the speed of light, since the information driving them can't.
 

QuixoticOne

Golden Member
Thanks for the link to the podcast.

It is an interesting idea. I've got a book of papers on the topic that may make for interesting further reading:

"Complexity, Entropy, And The Physics Of Information"
Edited by Wojciech H. Zurek.

A proceedings volume in the Santa Fe Institute Studies in the Sciences of Complexity series.

There certainly are some profound cross-disciplinary paradigms that have emerged in relation to these ideas.

e.g.
* Uncertainty / Information / Observability / Energy / Position / Momentum in Quantum Mechanics.

* Entropy / Order in Thermodynamics.

* Information and entropy in the science of Information Theory and communications (the formal parallel is sketched just after this list).

* Symmetry and Group Theory in quantum theory and in particle physics / unified theories.

* Invariants / Conservation laws relevant to classical mechanics, quantum mechanics, information theory, particle physics, thermodynamics, etc.
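
On the information-theory point specifically, the parallel is exact, not just a loose analogy: Gibbs' thermodynamic entropy and Shannon's information entropy are the same functional up to a constant factor,

S = -k_B \sum_i p_i \ln p_i          (Gibbs, thermodynamics)
H = -\sum_i p_i \log_2 p_i           (Shannon, information in bits)

so S = (k_B \ln 2) \, H when both are taken over the same microstate probabilities p_i.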

I suppose it's a question of scientists in different specializations starting to learn to use similar abstractions, adopt a more common vernacular, and apply similar mathematical techniques (information theory, conservation laws, probability and statistics, logic, group theory, invariance, etc.) to express theories in very different mathematical and physical sciences.

In relativistic mechanics you can certainly look at the information flow between systems in terms of space-time, the light cones of events, and so on, and then look at how "reality" evolves as events begin to affect each other.

In chemistry / metallurgy / crystallography / thermodynamics you can start to look at order / disorder, annealing processes, crystallization processes, free energies, "temperature" (effectively a statistic of how energy is distributed), and so on, and realize that you're basically talking about processes mediated by entropy / information in variables of temperature, concentration, position, energy, etc.
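
The annealing idea in particular is easy to toy with (a rough Python sketch of simulated annealing; the energy landscape, step size, and cooling schedule are made-up choices for illustration, not any standard recipe):

import math
import random

random.seed(1)

def energy(x):
    # toy rugged landscape: global minimum near x = 0, many local minima
    return x * x + 4.0 * math.sin(5.0 * x)

x = 10.0          # start far from the minimum
T = 5.0           # initial "temperature"
while T > 1e-3:
    x_new = x + random.gauss(0.0, 0.5)      # propose a random move
    dE = energy(x_new) - energy(x)
    # Metropolis rule: always accept downhill moves;
    # accept uphill moves with probability exp(-dE / T)
    if dE < 0 or random.random() < math.exp(-dE / T):
        x = x_new
    T *= 0.999                               # cool slowly

print(x, energy(x))  # ends near a low-energy (more "ordered") state

High temperature lets the system wander freely (high entropy); slow cooling gradually freezes it into an ordered low-energy state, much as in real annealing.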

One can look at any diffusion or advection process as an information flow of entropy / energy, so the physical flows and the shock waves / gradients of those processes become lines of iso-information / iso-entropy as the information about a change in one variable propagates through the system. A balloon bursting, for example, is all about the transfer of information about the presence/absence of a confining skin inward to the gas molecules as they bounce around in Brownian motion. Then, as that information propagates, the entropy changes follow suit as the pressure / volume / temperature manifestations (basically changes of variable of the same underlying informational flow) ensue.
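
You can watch that happen in a toy diffusion model (a minimal Python sketch; the grid size, mixing rate, and reflecting boundaries are arbitrary choices for illustration): gas starts confined to one end of a box, and the Shannon entropy of the concentration profile climbs as the "news" of the missing wall spreads.

import math

# 1D box of 50 cells; all the gas starts confined to the leftmost 5 cells
conc = [1.0 if i < 5 else 0.0 for i in range(50)]

def profile_entropy(c):
    # Shannon entropy (bits) of the normalized concentration profile
    total = sum(c)
    return -sum((x / total) * math.log2(x / total) for x in c if x > 0)

for step in range(2001):
    if step % 500 == 0:
        print(step, round(profile_entropy(conc), 3))
    # simple explicit diffusion: each cell relaxes toward its neighbors
    new = conc[:]
    for i in range(50):
        left = conc[i - 1] if i > 0 else conc[i]
        right = conc[i + 1] if i < 49 else conc[i]
        new[i] = conc[i] + 0.2 * (left + right - 2 * conc[i])
    conc = new

The entropy starts near log2(5) ≈ 2.3 bits and climbs toward log2(50) ≈ 5.6 bits as the gas spreads evenly through the box.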

In particle physics you can talk of CPT conservation: (C) charge conjugation, (P) parity, and (T) time reversal. In mechanics you talk of conservation of momentum. In relativity you talk of conservation of mass/energy. In quantum mechanics you talk of properties of wavefunctions such as position and momentum. As before, temperature is nothing but a statistic of energy / entropy. Ignoring FTL propagation as unphysical, all of physical reality is basically about the propagation of information between events (quantum, relativistic, or statistical as in thermodynamics) and entropy as it relates to physical fields / energy / temperature / system observables.

It is interesting to look at areas that start to blur the lines between the classical, quantum, and thermodynamic pictures, such as Maxwell's demon, or the way deterministic classical physics blends into the statistical physics of quantum mechanics or macroscopic thermodynamics.

Look into Hamiltonian / Lagrangian mechanics, information theory, and thermodynamics as mathematical forms talking about probability, uncertainty, information, and entropy, and it'll start to blend together whether your physical variables are pressure, density, energy, temperature, electric field, momentum, etc.
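
One concrete bridge between those forms (standard textbook material, sketched in LaTeX): maximize Shannon entropy subject to normalization and a fixed average energy,

\max_{p_i} \; -\sum_i p_i \ln p_i \quad \text{subject to} \quad \sum_i p_i = 1, \;\; \sum_i p_i E_i = \langle E \rangle.

Lagrange multipliers give

p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},

which is exactly the Boltzmann distribution of thermodynamics, with the multiplier \beta playing the role of 1/(k_B T).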

http://en.wikipedia.org/wiki/E...and_information_theory

http://en.wikipedia.org/wiki/Information_Theory
 

firewolfsm

Golden Member
By the way, I really think the path some physicists are taking with analyzing the "nature" of information is a waste of time. We're not getting anywhere with it, and it seems more like philosophy than physics.