
Complexity, Entropy and the Physics of Information

23 January 2024

Complexity, Entropy and the Physics of Information, edited by Wojciech Zurek, Santa Fe Institute


“This quantum business is so incredibly important and difficult that everyone should busy himself with it,” wrote Einstein in a 1908 letter, cited by John Wheeler at the workshop “Complexity, Entropy and the Physics of Information” held at the Santa Fe Institute in 1989. More than a century after Einstein’s letter, many fundamental questions connecting physics and information remain unanswered.

The book Complexity, Entropy and the Physics of Information consists of 32 essays capturing the talks given at the Santa Fe workshop. Building on the fundamental work of Claude Shannon, the workshop aimed to explore questions at the foundations of quantum theory and quantum information science. Most of the questions raised are still relevant today, as many contributions to this two-volume reprint of the proceedings demonstrate.

The workshop started with Wheeler’s famous talk “It from bit”, in which he aimed to “deduce the quantum from existence”. That idea remains a guiding principle for researchers in the field. Indeed, in a talk at the QTML 2023 conference held at CERN, Max Welling (University of Amsterdam) motivated his recent work on “General Message Belief Propagation” for quantum computations using Wheeler’s principle, linking machine-learning models to thermodynamics.

William Wootters’ contribution, on the other hand, builds on the work of John Bell, who showed that quantum mechanics is inherently non-local, i.e. that correlations between spatially separated systems are stronger than is allowed in any local hidden-variable theory. Wootters, in contrast, focusses on the locality of quantum mechanics, specifically the fact that local measurements on the parts of a system, together with the correlations between those measurements, allow the state of an ensemble to be determined. Benjamin Schumacher, meanwhile, presents his thoughts on the “physics of communication” and discusses the connections between information and entropy. He promotes the idea that “it is not the number of available signals but rather their distinguishability that matters in communication.”
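Schumacher’s point can be made quantitative with a toy calculation. The Python sketch below is my own illustration, not taken from the book; the alphabets and distributions are invented for the example. It computes the mutual information between sender and receiver for two four-signal alphabets: one whose signals are perfectly distinguishable at the receiver, and one whose signals all produce the same output statistics.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(prior, channel):
    """I(X;Y) in bits, for input distribution `prior` and conditional
    output distributions `channel[x] = P(Y | X = x)`."""
    n_out = len(channel[0])
    # Marginal output distribution P(Y)
    p_y = [sum(prior[x] * channel[x][y] for x in range(len(prior)))
           for y in range(n_out)]
    # I(X;Y) = H(Y) - sum_x P(x) H(Y | X = x)
    return shannon_entropy(p_y) - sum(
        prior[x] * shannon_entropy(channel[x]) for x in range(len(prior)))

# Four equiprobable signals the receiver can tell apart perfectly
# carry log2(4) = 2 bits per use ...
distinct = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
print(mutual_information([0.25] * 4, distinct))   # 2.0

# ... while four signals producing identical output statistics carry
# nothing, however many of them the alphabet contains.
identical = [[0.5, 0.5, 0.0, 0.0]] * 4
print(mutual_information([0.25] * 4, identical))  # 0.0
```

The number of signals is the same in both cases; only their distinguishability changes, and with it the information conveyed.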

Wojciech Zurek focusses on the implications of a quantum measurement, which converts a collection of possible alternatives into a definite outcome and thus decreases the statistical entropy. In this regard, he discusses the connections between physical and statistical entropy (Shannon entropy) and the algorithmic information content of the data (Kolmogorov complexity). Applications to non-equilibrium systems highlight the fundamental cost of information erasure first identified by Landauer in 1961.
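For context, a standard back-of-the-envelope figure (not quoted from the book): Landauer’s bound sets the minimum heat dissipated when one bit is erased at temperature $T$,

$$
E_{\min} = k_{\mathrm B} T \ln 2 \approx \left(1.38\times10^{-23}\ \mathrm{J\,K^{-1}}\right)\times\left(300\ \mathrm{K}\right)\times 0.693 \approx 3\times10^{-21}\ \mathrm{J},
$$

or roughly 0.02 eV per erased bit at room temperature.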

Charles Bennett asks “what is complexity?” and presents various candidate notions for a “formal measure of complexity” based on computational theory, information theory and thermodynamics. He highlights in particular the notion of “logical depth”: the execution time a universal computer needs to generate the object of interest from a near-incompressible program. The behaviour of complexity measures in dynamical systems exhibiting self-organisation and phase transitions is also discussed.
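A rough way to see why Bennett wants a measure beyond Kolmogorov complexity is sketched below. This is my own toy example: zlib compression and wall-clock generation time are only crude, computable stand-ins for description length and logical depth, both of which are uncomputable in general. It shows that an incompressible random string is no “deeper” than a string of zeros, since both are produced almost instantly.

```python
import time
import zlib
import random

def profile(name, make):
    """Crude proxies: compressed size ~ description length,
    generation time ~ effort needed to produce the object."""
    start = time.perf_counter()
    data = make()
    elapsed = time.perf_counter() - start
    size = len(zlib.compress(data))
    print(f"{name:8s} compressed to ~{size:7d} bytes, generated in {elapsed:.4f} s")

# A megabyte of zeros: short description, short computation -> simple and shallow.
profile("zeros", lambda: bytes(1_000_000))

# A megabyte of random bytes: essentially incompressible (complex), yet still
# produced almost instantly -- high complexity does not imply logical depth.
profile("random", lambda: random.randbytes(1_000_000))
```

Logical depth is designed to single out objects that are neither trivially ordered nor random, but that embody a long computational history.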

A noteworthy collection of essays

Tommaso Toffoli explores whether the principles of mechanics are universal and fundamental because they emerge from an extremely fine-grained underlying structure, in which case they would be of mathematical rather than physical origin. This mode of thought is in line with statistical mechanics, where laws emerge due to collective effects in systems with many elements.

In his contribution, Edwin Jaynes focusses on the meaning of probability in quantum mechanics, which he regards not as a “physically real thing” but as a way to quantify the role of incomplete information and the precision with which a theory is able to predict results. If that uncertainty is infinite, the theory is simply unable to predict the quantity in question; he stresses that this does not mean that the physical quantity itself is infinite.

This is just a snapshot of the many rich contributions. Besides quantum information theory, the book also touches on cosmology, quantum gravity and dynamical systems. An introduction from Seth Lloyd, who attended the Santa Fe workshop, provides valuable context on the significance of the proceedings.

Complexity, Entropy and the Physics of Information forms a noteworthy collection of essays linking information, computation and complexity with physics, and especially with quantum mechanics. As it contains many individual essays grouped thematically, readers can pick out topics according to their own interests. I would recommend this work to anyone interested in the area, especially researchers and students working in quantum physics and computational theory.
