Entropy

Date:  Saturday, November 17, 2018 - 10:30

Venue:  Martin Wood Lecture Theatre, Clarendon Laboratory

The practical objective of improving steam engines led to an impressive level of abstract thinking during the development of thermodynamics. In particular, it is remarkable that the significance of entropy, defined in terms of reversible heat flow, was recognised before it had a microscopic interpretation as a measure of the degree of disorder. These lectures will survey some of the ways that the notion of entropy appears in physics, from its origins to current research frontiers.
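
For readers who want the formulas behind this remark, the two definitions in question have standard textbook forms (given here for orientation, not as a statement of how the lectures will present them):

```latex
% Clausius's thermodynamic definition: entropy change from reversible heat flow
\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}

% Boltzmann's later microscopic interpretation: entropy counts the microstates
% W compatible with a given macrostate (the "degree of disorder")
S = k_{\mathrm{B}} \ln W
```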

 

Speakers


Prof Alex Schekochihin

Entropy: Gaining Knowledge by Admitting Ignorance

Podcast | Presentation (PDF)

When dealing with physical systems that contain many degrees of freedom, a researcher's most consequential realisation is of the enormous amount of detailed information about them that she does not have, and has no hope of obtaining. It turns out that this vast ignorance is not a curse but a blessing: by admitting ignorance and constructing a systematic way of making fair predictions about the system that rely only on the information that one has and on nothing else, one can get surprisingly far in describing the natural world. In an approach anticipated by Boltzmann and Gibbs and given mathematical foundation by Shannon, entropy emerges as a mathematical measure of our uncertainty about large systems and, paradoxically, a way to describe their likely behaviour—and even, some argue, the ultimate fate of the Universe. Alex Schekochihin will admit ignorance and attempt to impart some knowledge.
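
As a small numerical illustration of Shannon's measure of uncertainty (a generic sketch for orientation, not material from the lecture itself), the entropy of a discrete probability distribution can be computed directly:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits.

    Terms with p = 0 contribute nothing (the limit of p*log p as p -> 0 is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))      # 1.0

# A heavily biased coin carries far less uncertainty.
print(shannon_entropy([0.99, 0.01]))    # ~0.081 bits
```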

 

Prof John Chalker

Entropy: Two Short Stories

Podcast | Presentation (PDF)

Thermodynamics and statistical mechanics give us two alternative ways of thinking about entropy: in terms of heat flow, or in terms of the number of micro-states available to a system. John Chalker will describe a physical setting to illustrate each of these. By applying thermodynamics in a realm far beyond its origins, we can use the notion of an ideal heat engine to find the temperature of a black hole. And by applying combinatorial mathematics to hydrogen bonding, we can find the entropy of ice.
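
For a feel of the numbers behind these two stories, here is a minimal back-of-the-envelope sketch using the standard formulas (Hawking's temperature of a Schwarzschild black hole and Pauling's estimate of the residual entropy of ice); it is an illustration, not the argument of the lecture:

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J s
c    = 2.99792458e8      # speed of light, m/s
G    = 6.67430e-11       # gravitational constant, m^3 / (kg s^2)
k_B  = 1.380649e-23      # Boltzmann constant, J/K
R    = 8.314462618       # gas constant, J/(mol K)

def hawking_temperature(mass_kg):
    """Hawking temperature of a Schwarzschild black hole: T = hbar c^3 / (8 pi G M k_B)."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

# A solar-mass black hole is extraordinarily cold: roughly 6e-8 K.
M_sun = 1.989e30  # kg
print(f"T_Hawking(solar mass) ~ {hawking_temperature(M_sun):.2e} K")

# Pauling's estimate for ice: each water molecule has on average 3/2 allowed
# hydrogen-bond configurations, giving a residual entropy of R ln(3/2) per mole,
# close to the measured value of about 3.4 J/(mol K).
print(f"Pauling residual entropy of ice ~ {R * math.log(1.5):.2f} J/(mol K)")
```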

 

Prof Siddharth Parameswaran

Entropy from Entanglement

Podcast | Presentation (PDF)

The usual picture of entropy in statistical mechanics is that it quantifies our degree of ignorance about a system. Recent advances in cooling and trapping atoms allow the preparation of quantum systems with many interacting particles isolated from any external environment. Textbook discussions of entropy, which invoke the presence of a “large” environment that brings the system to thermal equilibrium at a fixed temperature, cannot apply to such systems. Sid Parameswaran will explain how the “entropy” of subsystems of such isolated quantum systems arises from quantum entanglement between different parts of the system, and how their approach to thermal equilibrium is best described as the “scrambling” of quantum information as it is transferred to non-local degrees of freedom.
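
As a minimal illustration of entanglement entropy (a generic two-qubit example, not taken from the talk), one can trace out half of a Bell pair and compute the von Neumann entropy of the reduced state:

```python
import numpy as np

# Bell state |psi> = (|00> + |11>) / sqrt(2), written on the 4-dimensional
# two-qubit Hilbert space with basis ordering |00>, |01>, |10>, |11>.
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Density matrix of the full (pure) state: rho = |psi><psi|.
rho = np.outer(psi, psi.conj())

# Partial trace over qubit B: reshape to indices (a, b, a', b') and sum over b = b'.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# Von Neumann entropy S = -Tr(rho_A ln rho_A), via the eigenvalues of rho_A.
eigvals = np.linalg.eigvalsh(rho_A)
S = -sum(p * np.log(p) for p in eigvals if p > 1e-12)

print(S, np.log(2))  # both ~0.6931: the subsystem looks maximally mixed
```

Although the full two-qubit state is pure (zero total entropy), either qubit on its own is maximally mixed, so its entanglement entropy equals ln 2; this is the sense in which thermal-looking entropy can emerge from entanglement in an isolated quantum system.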