Bookshelf

26 August 2014

• A Brief History of String Theory: From Dual Models to M-Theory
• Statistical Data Analysis for the Physical Sciences
• Books received

A Brief History of String Theory: From Dual Models to M-Theory
By Dean Rickles
Springer
Hardback: £35.99 €32.12 $49.99
E-book: £27.99 €42.79 $39.99
Also available at the CERN bookshop

String theory provides a theoretical framework for unifying particle physics and gravity that is also consistent at the quantum level. Apart from particle physics, it also sheds light on a vast range of problems in physics and mathematics. For example, it helps in understanding certain properties of gauge theories, black holes, the early universe and even heavy-ion physics.

This new book fills a gap by reviewing the 40-year-plus history of the subject, which it divides into four parts, with the main focus on the earlier decades. The reader learns in detail about the work of researchers in the early days, when so-called dual models were investigated with the aim of describing hadron physics. It took ingenious insights to realize that the underlying physical interpretation is in terms of small, oscillating strings. Some of the groundbreaking work took place at CERN – for example, the discovery of the Veneziano amplitude.

The reader obtains a good impression of how it took many years of collective effort and struggle to develop the theory and understand it better, often incrementally, although sometimes the direction of research changed drastically in a serendipitous manner. For example, at some point there was an unexpected shift of interpretation, namely in terms of gravity rather than hadron physics. Supersymmetry was discovered along the way as well, demonstrating that string theory has been the source and inspiration of many ideas in particle physics, gravity and related fields.

The main strength of the book is its extensively and carefully researched history of string theory, rather than profound explanations of the physics (for which enough books are available). It is full of anecdotes, quotations from physicists of the time, and historical facts, to an extent that makes it unique. Despite the author’s avoidance of technicalities, the book seems more suitable for people educated in particle physics than for philosophers, historians and other non-experts.

One caveat, however: the history covered in the book more or less stops around the mid-1990s and, as the author emphasizes, the subject becomes much harder to describe beyond that point without going more deeply into the details. While some of the new and important developments are mentioned briefly in the last chapter – for example, the gauge/gravity correspondence – they do not get the attention they deserve relative to older parts of the history. In other words, while the history up to the mid-1990s is presented quite accurately, the significance of some of its earlier parts is rather overrated in comparison with more recent developments.

In summary, this is a worthwhile and enjoyable book, full of interesting details about the development of one of the main research areas of theoretical physics. It appears to be most useful to scientists educated in related fields, and I would even say that it should be a mandatory read for young colleagues entering research in string theory.

Wolfgang Lerche, CERN.

Statistical Data Analysis for the Physical Sciences
By Adrian Bevan
Cambridge University Press
Hardback: £40 $75
Paperback: £18.99 $31.99
E-book: $26
Also available at the CERN bookshop

The numerous foundational errors and misunderstandings in this book make it inappropriate for use by students or research physicists at any level. There is space here to indicate only a few of the more serious problems.

The fundamental concepts – probability, probability density function (PDF) and likelihood function – are confused throughout. Likelihood is defined as being “proportional to probability”, and both are confused with a PDF in section 3.8(6). Exercise 3.11 invites the reader to “re-express the PDF as a likelihood function”, which is absurd because the two are functions of different kinds of arguments.
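To spell out the distinction being drawn here (a standard textbook formulation, not taken from the book under review): the PDF and the likelihood are the same expression read as functions of different arguments.

```latex
% PDF: a function of the data x, with the parameters \theta held fixed
f(x;\theta), \qquad \int f(x;\theta)\,\mathrm{d}x = 1
% Likelihood: the same expression read as a function of \theta,
% with the observed data held fixed; in general it is not normalized
L(\theta) = f(x_{\mathrm{obs}};\theta), \qquad
\int L(\theta)\,\mathrm{d}\theta \neq 1 \ \text{in general}
```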

Probability and probability density are confused most notably in section 5.5 (χ2 distribution), where the “probability of χ2” is given as the value of the PDF instead of its integral from χ2 to infinity. (The latter quantity is in fact the p value, which is introduced later in section 8.2, but is needed here already.) The student who evaluates the PDFs labelled P(χ2, ν) in figure 5.6 to do exercises 5.10 to 5.12 will get the wrong answers, but the numbers given in table E11 – miraculously – are correct p values. Fortunately the formulas in the book were not used for the tables.
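To make the distinction concrete, here is a minimal sketch (ours, not the book’s) using SciPy; the numerical inputs are purely illustrative.

```python
from scipy.stats import chi2

chi2_obs, ndf = 12.0, 5  # illustrative values, not taken from the book

# The reading criticized above: the PDF evaluated at the observed chi2
pdf_value = chi2.pdf(chi2_obs, ndf)

# The correct quantity: the p value, i.e. the integral of the PDF
# from chi2_obs to infinity (the survival function)
p_value = chi2.sf(chi2_obs, ndf)

print(f"PDF value: {pdf_value:.4f}  p value: {p_value:.4f}")
```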

From the beginning there is confusion about what is Bayesian and what is not. Bayesian probability is defined correctly as a degree of belief, but Bayes’s theorem is introduced in the section entitled “Bayesian probability”, even though it can be used equally well in frequentist statistics, and in fact nearly all of the examples use frequentist probabilities. The different factors in Bayes’s theorem are given Bayesian names (one of which is wrong: the likelihood function is inexplicably called “a priori probability”), but the examples labelled “Bayesian” do not use the theorem in a Bayesian way. Worse, the example 3.7.4, labelled Bayesian, confuses the two arguments of conditional probability throughout, and equation 3.17 is wrong (as can be seen by comparing it with P(A) in section 3.2, which is correct). On the other hand, in section 8.7.1 a similar example – with frequentist probabilities again – is presented clearly and correctly. Example 3.7.5 (also labelled Bayesian) is, as far as I can see, nonsense (what is outcome A?).
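For reference, the standard names of the factors in Bayes’s theorem (our summary of conventional usage, not the book’s) are:

```latex
P(\theta \mid x) = \frac{P(x \mid \theta)\,P(\theta)}{P(x)}
% P(\theta \mid x):  posterior probability of the hypothesis \theta
% P(x \mid \theta):  likelihood (the factor the book mislabels
%                    "a priori probability")
% P(\theta):         prior (a priori) probability
% P(x):              evidence (normalization)
```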

The most serious errors occur in chapter 7 (confidence intervals). Confidence intervals are frequentist by definition, otherwise they should be called credible intervals. But the treatment here is a curious mixture of Bayesian, frequentist and pure invention. The definition of the confidence level (CL) is novel and involves integration under a PDF that could be the Bayesian posterior but in some examples turns out to be a likelihood function. Coverage is then defined in a frequentist-inspired way (invoking repeated experiments), but it is not the correct frequentist definition. The Feldman–Cousins (F–C) frequentist method is presented without having described the more general Neyman construction on which it is based. A good treatment of the Neyman construction would have allowed the reader to understand coverage better, which the book identifies correctly as the most important property of confidence intervals. It is true that for discrete (e.g. Poisson) data, the F–C method in general over-covers, but it should also have been stated that for this case any method (including Bayesian) that covers for all parameter values must over-cover for some. The “coverage” that this book claims to be exact for Bayesian methods is not an accepted definition because it represents subjective belief only and does not have the frequentist properties required by physicists.
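To make the frequentist notion of coverage concrete, here is a minimal sketch (ours, not from the book): draw many pseudo-experiments from a known true value and count how often the quoted interval contains it. The naive n ± √n interval for a Poisson mean is used purely as a stand-in for a real construction such as Feldman–Cousins.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true = 4.0      # assumed true Poisson mean, for illustration only
n_expt = 200_000   # number of repeated pseudo-experiments

n = rng.poisson(mu_true, n_expt)
# Naive 68% interval n +/- sqrt(n); a real analysis would use
# a proper Neyman or Feldman-Cousins construction instead
lo, hi = n - np.sqrt(n), n + np.sqrt(n)

# Frequentist coverage: the fraction of repeated experiments whose
# interval contains the true value; a correct method guarantees this
# is at least the stated CL for every possible value of mu_true
coverage = np.mean((lo <= mu_true) & (mu_true <= hi))
print(f"coverage = {coverage:.3f}  (nominal 0.683)")
```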

Fred James, CERN.

Books received

The Physics of Reality: Space, Time, Matter, Cosmos
By Richard L Amoroso, Louis H Kauffman, Peter Rowlands (ed.)
World Scientific
Hardback: £111
E-book: £83

As the proceedings of the 8th Symposium Honoring Mathematical Physicist Jean-Pierre Vigier, this book introduces a new method in theory formation, completing the tools of epistemology. Like Vigier himself, the Vigier symposia are noted for addressing avant-garde, cutting-edge topics in contemporary physics. In this volume, several important breakthroughs are introduced for the first time. The most interesting is a continuation of Vigier’s pioneering work on tight-bound states in hydrogen. The new experimental protocol described not only promises empirical proof of large-scale extra dimensions, in conjunction with avenues for testing string theory, but also implies the birth of unified field mechanics, ushering in a new age of discovery.

Semiconductor X-Ray Detectors
By B G Lowe and R A Sareen
CRC Press
Hardback: £108

The history and development of Si(Li) X-ray detectors supplies important background for a full understanding of the workings of SDDs, CCDs and compound semiconductor detectors. This book provides an up-to-date review of the principles, practical applications and state of the art of semiconductor X-ray detectors, and describes many facets of X-ray detection and measurement using semiconductors – from manufacture to implementation. The initial chapters present a self-contained summary of the relevant background physics, materials science and engineering aspects. Later chapters compare and contrast the assembly and physical properties of the systems and materials currently employed.

Fission and Properties of Neutron-Rich Nuclei: Proceedings of the Fifth International Conference
By J H Hamilton and A V Ramayya (ed.)
World Scientific
Hardback: £131
E-book: £98

The five-year interval between the international conferences covering fission and properties of neutron-rich nuclei allows significant new results to accumulate. At the latest conference in the series, leaders in theory and experiment presented their latest results in areas such as the synthesis of superheavy elements, recent results and new facilities using radioactive ion beams, the structure of neutron-rich nuclei, the nuclear fission process, fission yields and nuclear astrophysics. The conference brought together more than 100 speakers from the major nuclear laboratories, along with leading researchers from around the world.

One Hundred Physics Visualizations Using MATLAB
By Dan Green
World Scientific
Hardback (with DVD): £48
Paperback (with DVD): £23
E-book: £17

The aim of this book is to provide interactive MATLAB scripts in which the user can vary parameters in a specific problem and immediately see the outcome by way of dynamic “movies” of the response of the system in question. MATLAB tools are used throughout, and the software scripts accompany the text in symbolic mathematics, classical mechanics, electromagnetism, waves and optics, gases and fluid flow, quantum mechanics, special and general relativity, and astrophysics and cosmology. The emphasis is on building intuition by actively varying parameters and watching the subsequent behaviour of the system.

Modern Functional Quantum Field Theory: Summing Feynman Graphs
By Herbert M Fried
World Scientific
Hardback: £65
E-book: £49

These pages offer a simple, analytic, functional approach to non-perturbative QFT, using a frequently overlooked functional representation of Fradkin to calculate explicitly relevant portions of the Schwinger generating functional. In QED, this corresponds to summing all Feynman graphs representing virtual photon exchange between charged particles. It is then possible to see, analytically, the cancellation of an infinite number of perturbative, UV logarithmic divergences, leading to an approximate but reasonable statement of finite-charge renormalization. A similar treatment of QCD is then able to produce a simple, analytic derivation of quark-binding potentials. An extension into the QCD binding of two nucleons to make an effective deuteron presents a simple, analytic derivation of nuclear forces. Finally, a new QED-based solution of vacuum energy is presented as a possible candidate for dark energy.
