Melting Hadrons, Boiling Quarks: From Hagedorn Temperature to Ultra-Relativistic Heavy-Ion Collisions at CERN. With a Tribute to Rolf Hagedorn
By Johann Rafelski (ed.)
Springer
Also available at the CERN bookshop

The statistical bootstrap model (SBM), the exponential rise of the hadron spectrum, and the existence of a limiting temperature as the ultimate indicator of the end of ordinary hadron physics will always be associated with the name of Rolf Hagedorn. He showed that hadron physics contains its own limit, and we know today that this limit signals quark deconfinement and the start of a new regime of strong-interaction physics.

This book is edited by Johann Rafelski, a long-time collaborator of Hagedorn who took part in many of the early conceptual developments of the SBM. It is perhaps best characterised by pointing out what it is not. It is not a collection of review articles on the physics of the SBM and related topics that could be given to newcomers as an introduction to the field. It is not a collection of reprints summarising Hagedorn’s well-known work on the SBM, and it is not a review of the history of this theory. In fact, aspects of all of the above can be found in this thoughtfully composed volume. However, it goes beyond all of them.

Bringing together earlier articles on Hagedorn’s work, new invited articles by a number of authors, and original work by Hagedorn himself, along with comments and reprinted material by Rafelski, the book clearly gains its value through the unexpected. It provides an English translation of an early overview article by Hagedorn written in German, as well as unpublished material that may be new even to well-informed practitioners in the field. For example, it presents the transcript of the draft minutes of the 1982 meeting of the CERN Scientific Policy Committee (SPC), at which Maurice Jacob, then head of the CERN Theory Division, reported on the 1982 Bielefeld workshop on the planned experimental exploration of ultra-relativistic heavy-ion collisions, setting the scene for the forthcoming experimental programme at CERN’s SPS.

The book is split into three parts.

Part I, "Reminiscences: Rolf Hagedorn and Relativistic Heavy Ion Research", contains a collection of 15 invited articles from colleagues of Hagedorn who witnessed the initial stages of his work, leading to formulation of the SBM theory in the early 1960s, and its decisive contribution in expressing the need for an experimental research programme in the early 1980s: Johann Rafelski, Torleif Ericson, Maurice Jacob, Luigi Sertorio, István Montvay and Tamás Biro, Krzysztof Redlich and Helmut Satz, Gabriele Veneziano, Igor Dremin, Ludwik Turko, Marek Gaździcki and Mark Gorenstein, Grażyna Odyniec, Hans Gutbrod, Berndt Müller, and Emanuele Quercigh. These contributions draw a lively picture of Hagedorn, both as a scientist and as a man, with a wide range of interests spanning high-energy physics to music. They also illustrate the impact of Hagedorn’s work on other areas of physics.

Part II, "The Hagedorn Temperature", contains a collection of original work by Hagedorn. In this section, the scientist’s seminal publication that appeared in 1964 in Nuovo Cimento is deliberately not included; however, publications that emphasise the hurdles that had to be overcome to get to the SBM, and the interpretation Hagedorn offered on his own work in later years, are presented. This is undoubtedly of great interest to those familiar with the physicist’s work but also curious about its creation and growth.

Part III, "Melting Hadrons, Boiling Quarks: Heavy Ion Path to Quark–Gluon Plasma", puts the work of Hagedorn into the context of the discussion of a possible relativistic heavy-ion programme at CERN that took place in the early 1980s. It starts with his thoughts about a possible programme of this kind, presented at the workshop on future relativistic heavy-ion experiments, held at the Gesellschaft fuer Schwerionenforschung (GSI). It also includes the draft minutes of the 1982 CERN SPC meeting, and some early works on strangeness production as an indicator for quark–gluon plasma formation, as put forward after many years by Rafelski.

The book is undoubtedly an ideal companion for all those who wish to recall the birth of one of the central concepts of today’s high-energy physics, and it is a well-deserved tribute to one of the great pioneers of its development.

• Frithjof Karsch, Bielefeld University, Germany.


Unifying Physics of Accelerators, Lasers and Plasmas
By Andrei Seryi
CRC Press

Particle accelerators have led to remarkable discoveries and enabled scientists to develop and test the Standard Model of particle physics. On a different scale, accelerators have many applications in technology, materials science, biology, medicine (including cancer therapy), fusion research, and industry. These machines are used to accelerate electrons, positrons or ions to energies ranging from tens of MeV to tens of GeV. Electron beams are employed to generate intense X-rays, in either synchrotrons or free-electron lasers such as the Linac Coherent Light Source at Stanford or the XFEL in Hamburg, for a range of applications.

Particle accelerators developed over the last century are now approaching the energy frontier. Today, at the terascale, the machines needed are extremely large and costly. The size of a conventional accelerator is determined by the technology used and the final energy required. In conventional accelerators, radiofrequency microwave cavities support the electric fields responsible for accelerating charged particles. Plasma-based particle accelerators, driven by either lasers or particle beams, are showing great promise as future replacements, primarily because of the extremely large accelerating electric fields they can support, which open up the possibility of compact structures. These fields are supported by the collective motion of plasma electrons, forming a space-charge disturbance that moves at a speed slightly below the speed of light in a vacuum. This method is commonly known as plasma wakefield acceleration.
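To give a rough sense of the numbers behind this promise (an illustrative estimate, not taken from the book): the largest field a cold plasma wave can sustain is set by the non-relativistic wave-breaking limit, which depends only on the plasma electron density $n_0$,

\[
E_0 = \frac{m_e\, c\, \omega_p}{e} \simeq 96\,\sqrt{n_0\,[\mathrm{cm^{-3}}]}\ \mathrm{V/m},
\qquad
\omega_p = \sqrt{\frac{n_0\, e^2}{\varepsilon_0\, m_e}},
\]

so a plasma of density $10^{18}\,\mathrm{cm^{-3}}$ can in principle sustain fields of order 100 GV/m, roughly three orders of magnitude beyond the tens of MV/m reached in today’s radiofrequency cavities – hence the prospect of much more compact accelerating structures.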

Plasma-based accelerators are the brainchild of the late John Dawson and colleagues at the University of California, Los Angeles, and are a topic being investigated worldwide with a great deal of success. In the 1980s, John David Lawson asked: "Will they be a serious competitor and displace the conventional ‘dinosaur’ variety?" This is still a valid question: plasma accelerators are already producing bright X-ray sources through betatron radiation at the lower energy scale, and there are plans to create electron beams good enough to drive free-electron lasers and future colliders. The topic and applications of these plasma accelerators have seen rapid progress worldwide in the last few years, with the result that research is no longer limited to plasma physicists, but now involves accelerator and radiation experts in developing the subject.

The book fills a void in the understanding of accelerator physics, radiation physics and plasma accelerators. It is intended to unify the three areas and does an excellent job. It also introduces the reader to the theory of inventive problem solving (TRIZ), proposed by Genrikh Altshuller in the mid 20th century to aid in the development of successful patents. It is argued that plasma accelerators fall within the prescription of TRIZ; however, it could also be argued that knowledge, imagination, creativity and time were all that was needed. The concept of TRIZ is outlined, and it is shown how it can be adopted for scientific and engineering problems.

The book is well organised. First, the fundamental concepts of particle motion in EM fields, common to accelerators and plasmas, are presented. Then, in chapter 3, the basics of synchrotron radiation are introduced. They are discussed again in chapter 7, with a potted history of synchrotrons together with Thomson and Compton scattering. It would make sense to have the history of synchrotrons in the earlier chapter.

The main topic of the book, namely the synergy between accelerators, lasers and plasmas, is covered in chapter 4, where a comparison between particle-beam bunch compression and laser-pulse compression is made. Lasers have the additional advantage that they can be amplified in a non-linear medium using chirped-pulse amplification (CPA). This method, together with optical parametric amplification, can push laser pulses to even higher intensities.

The basics of plasma accelerators are covered in chapter 6, where simple models of these accelerators are described, including laser- and beam-driven wakefield accelerators. However, only lepton wakefield drivers are discussed, not the proton driver used for the AWAKE project at CERN. This chapter also describes general laser–plasma processes, such as laser ionisation, with an update on the progress in increasing laser peak intensity. The application of plasma accelerators as drivers of free-electron lasers is covered in chapter 8, which describes the principles in simple terms, with handy formulae that can be easily used. Proton and ion acceleration are covered in chapter 9, where the reader is introduced to Bragg scattering, the DNA response to radiation and proton-therapy devices, ending with a description of different plasma-acceleration schemes for protons and ions. The basic principles of the laser acceleration of protons and ions by sheaths, radiation pressure and shock waves are briefly covered. The penultimate chapter discusses beam and pulse manipulation, bringing together a fairly comprehensive but brief introduction to some of the issues regarding beam quality: beam stability, cooling and phase transfer, among others. Finally, chapter 11 looks at inventions and innovations in science, describing how using TRIZ could help. There is also a discussion on bridging the gap from initial scientific ideas and experimental verification to commercial applications, the so-called "Valley of Death", something that is not discussed in textbooks but is now more relevant than ever.

This book is, to my knowledge, the first to bridge the three disciplines of accelerators, lasers and plasmas. It fills a gap in the market and helps in developing a better understanding of the concepts used in the quest to build compact accelerators. It is an inspiring read that is suitable for both undergraduate and graduate students, as well as researchers in the field of plasma accelerators. The book concentrates on the principles, rather than being heavy on the mathematics, and I like the fact that the pages have wide margins to take notes.

• Robert Bingham, University of Strathclyde, Glasgow.


Books received

Principles of Radiation Interaction in Matter and Detection (4th edition)
By C Leroy and P G Rancoita
World Scientific
Also available at the CERN bookshop

Based on a series of lectures given to undergraduate and graduate students over several years, this book provides a comprehensive and clear presentation of the physics principles that underlie radiation detection.

To detect particles and radiation, the effects of their interaction with matter as they pass through it have to be studied. The development of increasingly sophisticated and precise detectors has made possible many important discoveries and measurements in particle and nuclear physics.

The book, which has reached its 4th edition thanks to its good reception by readers, is organised into two main parts. The first is dedicated to an extensive treatment of the theories of particle interaction, of the physics and properties of semiconductors, as well as of the displacement damage caused in semiconductors by traversing radiation.

The second part focuses on the techniques used to detect different kinds of particles, and the corresponding detectors. Detailed examples are presented to illustrate the operation of the various types of detectors. Radiation environments in which these interaction mechanisms are expected to take place are also described. The last chapter is dedicated to the application of particle detection to medical imaging. Two appendices and a very rich bibliography complete the volume.

This latest edition of the book has been fully revised, and many sections have been extended to give as complete a treatment as possible of this developing field of study and research. Among other things, this edition provides a treatment of the Coulomb scattering of electrons, protons, light ions and heavy ions on screened nuclear potentials, which allows the corresponding non-ionising energy-loss (NIEL) doses deposited in any material to be derived.


Physics and Mathematical Tools: Methods and Examples
By A Alastuey, M Clusel, M Magro and P Pujol
World Scientific

This volume presents a set of useful mathematical methods and tools that can be used by physicists and engineers for a wide range of applications. It comprises four chapters, each structured in three parts: first, the general characteristics of the methods are described, then a few examples of applications in different fields are given, and finally a number of exercises are proposed and their solutions sketched.

The topics of the chapters are: analytical properties of susceptibilities in linear response theory, static and dynamical Green functions, and the saddle-point method to estimate integrals. The examples and exercises included range from classical mechanics and electromagnetism to quantum mechanics, quantum field theory and statistical physics. In this way, the general mechanisms of each method are seen from different points of view and therefore made clearer.

The authors have chosen to avoid derivations that are too technical, but without sacrificing rigour or omitting the mathematics behind the method applied in each instance. Moreover, three appendices at the end of the book provide a short overview of some important tools, so that the volume can be considered self-contained, at least to a certain extent.

Intended primarily for undergraduate and graduate physics students, the book could also be useful reading for teachers, researchers and engineers.


Goethe’s ‘Exposure of Newton’s Theory’: A Polemic On Newton’s Theory of Light and Colour
By M Duck and M Petry (translators), with an introduction by M Duck
Imperial College Press

Johann Wolfgang von Goethe is undoubtedly famous for his literary work; however, it is not widely known that he was also keenly interested in science and wrote a polemical text on Newton’s theory of light and colours, which he did not accept. He tried to reproduce the experiment that Newton used to demonstrate that light is heterogeneous but, according to what Goethe himself wrote, he could not obtain the same results.

The book provides an English translation of Goethe’s polemic, complemented by an introduction in which a possible explanation of Goethe’s resistance to Newton’s theory is offered. Many suppositions have been put forward: maybe a psychological block prevented him from reasoning clearly, or perhaps he was simply unable to understand Newton’s experiments and reproduce them properly.

In the introduction to this volume, the editor suggests that the reason for Goethe’s stubborn attitude, which led him to preserve his belief that light is immutable and that colours result from the interaction of light and darkness, is theological. Goethe believed in the spiritual nature of light, and he could not conceive of it as being anything other than simple, immutable and unknowable.

This book, addressed to historians of science, philosophers and scientists, will allow the reader to discover Goethe’s polemic against Newton and to obtain new insights into the multifaceted personality of the German poet.


Superconductivity: A New Approach Based on the Bethe–Salpeter Equation in the Mean-Field Approximation
By G P Malik
World Scientific

This specialist book on superconductivity proposes an approach to the topic, based on the Bethe–Salpeter equations, that allows a description of the characteristics of superconductors (SCs) that are considered unconventional.

The basic theory of superconductivity, formulated in 1957 by Bardeen, Cooper and Schrieffer (BCS), which earned its "fathers" a Nobel prize, proves inadequate for describing the behaviour of high-temperature superconductors (HTSCs) – materials that have a critical temperature higher than 30 K. In this monograph, the author shows how a generalisation of the BCS equations enables the superconducting features of non-elemental SCs to be addressed in the same manner as elemental SCs are dealt with in the original theory. This generalisation is achieved by adopting the "language" of the Bethe–Salpeter equation.

It was the author’s intention to give an essential treatment of the topic, without including material that is not strictly necessary, and to keep it reasonably simple and accessible. Nevertheless, quantum field theory (QFT) and its finite-temperature version (FTFT) are used to derive some equations in the text, so a basic knowledge of both is needed to follow the discussion.


The Unknown as an Engine for Science: An Essay on the Definite and the Indefinite
By H J Pirner
Springer

This essay is the result of interdisciplinary research pursued by the author, a theoretical physicist, on the concept of the indefinite and its expression in different fields of human knowledge. Examples are taken from the natural sciences, mathematics, economics, neurophysiology, history, ecology and philosophy.

Physics and mathematics often deal with the indefinite, but they try to reduce it, aiming at a theory that can explain everything or at least allow reliable predictions. Indefiniteness is closely connected to uncertainty, which is a component of many analyses of complex processes, so the concept of the indefinite can also be found in economics and risk assessment.

The author explains how uncertainty is also present in the humanities. For example, historians might have to work with just a few indeterminate sources and connect the dots to reconstruct a story. Uncertainty is also inherent to our memory – we tend to forget, and to lose and confuse details. Psychologists understand that forgetting permits new ideas to form, while strong memories would prevent them from emerging.

The book shows how uncertainty and indefiniteness define the border of our understanding and, at the same time, are engines for research and for continuous attempts to push back that limit.

The first part focuses on information and how it helps to reduce indefiniteness. New elements must be combined with the existing parts of the knowledge system in order to be integrated into it, so that the greatest benefit can be drawn from the new information. The author tries to quantify the value of information on the basis of its ability to reduce uncertainty.

The second part of the book presents a number of methods that can be used to handle indefiniteness, which come from fuzzy logic, decision theory, hermeneutics, and semiotics. An interdisciplinary approach is promoted because it enables bridges to be built between the different fields among which our knowledge is dispersed.