The applications of nuclear and particle physics to medicine have seen extraordinary development since the discovery of X-rays by Röntgen at the end of the 19th century. Medical imaging and oncologic therapy with photons and charged particles (specifically hadrons) are currently hot research topics.
This special issue of Modern Physics Letters is dedicated to hadron therapy, the frontier of cancer radiation therapy, and aims to fill a gap in the current literature on medical physics. Through 10 invited review papers, the volume presents the basics of hadron therapy, along with the most recent scientific and technological developments in the field. The first part covers topics such as the history of hadron therapy, radiation biophysics, particle accelerators, dose-delivery systems and treatment planning. In the second part, more specific topics are treated, including dose and beam monitoring, proton computed tomography, ionoacoustics and microdosimetry.
This volume will be very useful to students, researchers approaching medical physics, and scientists interested in this interdisciplinary and fast-moving field.
This book provides an introduction to astrophysics and cosmology for absolute beginners, as well as for any reader looking for a general overview of the subject and an account of its latest developments.
Besides presenting what we know about the history of the universe and the marvellous objects that populate it, the author is interested in explaining how we came to such knowledge. He traces a trajectory through the various theories and the discoveries that defined what we know about our universe, as well as the boundary of what is still to be understood.
The first six chapters deal with the state of the art of our knowledge about the structure of the universe, its origin and evolution, general relativity and the life of stars. The following five address the most important open problems: why there is more matter than antimatter, what dark matter and dark energy are, what there was before the Big Bang, and what the fate of the universe will be.
Written in plain English, without formulas and equations, and characterized by a clear and fluid prose, this book is suitable for a wide range of readers.
This is not another “quantum mechanics for dummies” book, as the author himself states. Nevertheless, it is a text that talks about quantum mechanics but is not meant for experts in the field. It explains complex concepts of theoretical physics almost without bringing up formulas, and makes no reference to a specialist background.
The book focuses on an intriguing issue of present-day physics: nonlocality and the associated phenomenon of entanglement. Thinking in macroscopic terms, we know that what happens here affects only the surrounding environment. But at the microscopic level, where quantum mechanics applies, things work differently. Scientists discovered that, besides the local effects, there are less evident effects that reveal themselves in strange correlations occurring instantaneously between remote locations. Even stronger nonlocal correlations, still consistent with relativity, have been hypothesised, but have not been observed so far.
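For the quantitatively minded, the standard statement behind these claims (our summary, not the book’s banana-based presentation) is the CHSH inequality, which bounds a combination S of the correlations E between measurement settings a, a′ and b, b′ at the two remote locations:

```latex
S = E(a,b) + E(a,b') + E(a',b) - E(a',b'),
\qquad
|S| \le
\begin{cases}
2 & \text{local hidden variables,}\\[2pt]
2\sqrt{2} & \text{quantum mechanics (Tsirelson's bound),}\\[2pt]
4 & \text{no-signalling alone (Popescu--Rohrlich boxes).}
\end{cases}
```

The hypothetical correlations “even stronger” than the quantum ones are those lying between 2√2 and 4: they would respect relativity (no signalling), yet they have never been observed in nature.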
This complex subject is treated by the author using a particular metaphor, which is actually more than just that: he draws a metaphorical world made of magic bananas, and simple actions that can be performed on them. Thanks to this, he is able to explain nonlocality and other difficult physics concepts in a relatively easy and comprehensible way.
Although it requires some general knowledge of mathematics and familiarity with science, this book will be accessible and interesting to a wide range of readers, as well as being an entertaining read.
This book presents the history of particle physics, from the introduction of the concept of particles by the Greek philosophers to the discovery of the final piece of the Standard Model, the Higgs boson, at CERN in 2012. Following the development of this field chronologically, the author gives an overview of the most important notions and theories of particle physics.
The text is divided into seven sections. The first part provides the basic concepts and a summary of the history of physics, arriving at the modern theory of forces, which are the subject of the second part. The book continues with the Higgs boson discovery and a description of some of the experimental apparatus used to study particles (from the LHC at CERN to cosmic-ray and neutrino experiments). The author also provides a brief treatment of general relativity, the Big Bang model and the evolution of the universe, and discusses the future developments of particle physics.
In the main body of the book, the topics are presented in a non-technical fashion, in order to be accessible to non-experts. Nevertheless, a rich appendix provides demonstrations and further details for advanced readers. The text is accompanied by plenty of images, including paintings and photographs of many of the protagonists of particle physics.
By Luca Lista
Springer
Also available at the CERN bookshop
Particle-physics experiments are very expensive, not only in terms of the cost of building accelerators and detectors, but also in the time spent by physicists and engineers designing, building and running them. Since the statistical analysis of the resulting data is relatively inexpensive, it is worth trying to perform it optimally, extracting the maximum information about the topic of interest whilst avoiding claims of more than is justified. Thus, lectures on statistics have become a regular part of graduate courses, and workshops have been devoted to statistical issues in high-energy-physics analysis. This also explains the number of books written by particle physicists on the practical applications of statistics to their field.
This latest book by Lista is based on the lectures that he has given at his home university in Naples, and elsewhere. As part of the Springer series “Lecture Notes in Physics”, it has the attractive feature of being short – a mere 172 pages. The disadvantage is that some of the explanations of statistical concepts would have benefited from a somewhat fuller treatment.
The range of topics covered is remarkably wide. The book starts with definitions of probability, while the final chapter is about discovery criteria and upper limits in searches for new phenomena, and benefits from Lista’s direct involvement in one of the large experiments at CERN’s LHC. It mentions such topics as the Feldman–Cousins method for confidence intervals, the CLs approach for upper limits, and the “look-elsewhere effect”, which is relevant for discovery claims. However, there seems to be no mention of the fact that a motivation for the Feldman–Cousins method was to avoid empty intervals, nor that the CLs method was introduced to protect against excluding the signal-plus-background hypothesis when an analysis has little or no sensitivity to the presence or absence of the signal.
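For readers who have not met it, the standard definition of CLs (ours for illustration, not quoted from the book) makes this protection explicit. With a test statistic Q, in the LEP convention:

```latex
\mathrm{CL}_s \;=\; \frac{\mathrm{CL}_{s+b}}{\mathrm{CL}_b}
\;=\; \frac{P(Q \le Q_{\mathrm{obs}} \mid s+b)}{P(Q \le Q_{\mathrm{obs}} \mid b)}
```

The signal hypothesis is excluded at 95% confidence only if CLs < 0.05; in an analysis with no sensitivity the numerator and denominator nearly coincide, CLs ≈ 1, and no exclusion is claimed.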
The book has no index, nor problems for readers to solve. The latter is unfortunate. In common with learning to swim, play the violin and many other activities, it is virtually impossible to become proficient at statistics by merely reading about it: some practical exercise is also required. However, many worked examples are included.
There are several minor typos that escaped the editorial process. In addition, figure 2.17, in which the uncertainty region for a pair of parameters is compared with the uncertainties on each of them separately, is confusing.
There are places where I disagree with Lista’s emphasis (although statistics is a subject that often does produce interesting discussions). For example, Lista claims it is counter-intuitive that, for a given observed number of events, an experiment with a larger expected number of background events (b) provides a tighter upper limit than one with a smaller background (i.e. a better experiment). However, if there are 10 observed events, it is reasonable that the upper limit on any possible signal is tighter if b = 10 than if b = 0. What is true is that the expected limit is better for the experiment with the smaller background.
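To make the arithmetic concrete, here is a minimal numerical sketch (our illustration, not Lista’s) of the simple frequentist Poisson construction underlying such counting-experiment limits, assuming the expected background b is known exactly:

```python
# 90% CL upper limit s_up on the signal mean s, given n_obs observed
# events over a known background b: solve P(N <= n_obs | s_up + b) = 0.10.
from scipy.optimize import brentq
from scipy.stats import poisson

def upper_limit(n_obs: int, b: float, cl: float = 0.90) -> float:
    """Classical Poisson upper limit on the signal mean (default 90% CL)."""
    f = lambda s: poisson.cdf(n_obs, s + b) - (1.0 - cl)
    return brentq(f, 0.0, 100.0)  # root-find for s in [0, 100]

print(upper_limit(10, b=0.0))   # ~15.4 events: no background
print(upper_limit(10, b=10.0))  # ~5.4 events: larger b, tighter limit
```

Note that for b sufficiently large compared with n_obs this naive construction yields no positive solution at all – the empty-interval problem that, as mentioned above, motivated the Feldman–Cousins method.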
Finally, the last three chapters could be useful to graduate students and postdocs entering the exciting field of searching for signs of new physics in high energy or non-accelerator experiments, provided that they have other resources to expand on some of Lista’s shorter explanations.
By E Gozzi, E Cattaruzza and C Pagani
World Scientific
The path integral formulation of quantum mechanics is one of the basic tools used to construct quantum field theories, especially gauge-invariant theories. It is the bread and butter of modern field theory. Feynman’s original formulation developed and extended work done by Dirac in the early 1930s, and provided an elegant and insightful solution to a generic Schrödinger equation.
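For reference (a textbook formula, not specific to this volume), the central object of that formulation is the propagator, written as a sum over all paths connecting the initial and final points, each weighted by the classical action:

```latex
K(x_f, t_f; x_i, t_i)
= \langle x_f |\, e^{-iH(t_f - t_i)/\hbar} \,| x_i \rangle
= \int_{x(t_i)=x_i}^{x(t_f)=x_f} \mathcal{D}[x(t)]\; e^{iS[x]/\hbar},
\qquad
S[x] = \int_{t_i}^{t_f} L(x, \dot{x})\, \mathrm{d}t
```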
This short book provides a clear, pedagogical and insightful presentation of the subject. The derivations of the basic results are crystal clear, and the applications worked out are rather original. It includes a nice presentation of the WKB approximation within this context, including the Van Vleck and functional determinants, the connection formulae and the semiclassical propagator.
An interesting innovation in this book is that the authors provide a clear presentation of the path integral formulation of the Wigner functions, which are fundamental in the study of quantum statistical mechanics, and – for the first time in an elementary book – of the work of Koopman and von Neumann on classical and statistical mechanics.
The book closes with a well-selected set of appendices, where some further technical details and clarifications are presented. Some of the more mathematical details of the basic derivations can be found there, as well as aspects of operator ordering as seen from the path integral point of view, the formulation in momentum space, the use of Grassmann variables, and so on.
It will be difficult to find a better and more compact introduction to this fundamental subject.
When I meet Jack in his office in Building 2, he has just returned from a “splendid” birthday celebration – a classical-music concert “with a lady conductor”, he is quick to add. It had been organised by members of his town of birth, Bad Kissingen in southern Germany, and was held at the local gymnasium that bears his name. Steinberger’s memories of the town are those of a 13-year-old child in pre-war Germany during the years of Nazi election propaganda. “Hitler was psychopathic when it came to Jews,” he says. “In making me leave, however, he did me a great favour because I had a wonderful education in America.”
Talking to this extraordinary man and physicist – who is too modest to dwell on the 1962 discovery of the muon neutrino that won him, Leon Lederman and Melvin Schwartz the 1988 Nobel Prize in Physics – is like taking a trip back through the history of particle physics. With the help of a scholarship from the University of Chicago, Steinberger completed a first degree in chemistry in 1942. He owes his first contact with physics to Ed Purcell and Julian Schwinger, with whom he worked at the MIT Radiation Laboratory, where he had been assigned a military role in 1941 – the year that Japan attacked the US at Pearl Harbor.
“We were making bombsights for bombers, something that could be mounted on airplanes and could see the ground with radar and so you could find military targets,” he explains. “The bombsight we succeeded in developing had a very limited accuracy and you couldn’t see a military target, but you could see cities.” With a heavy heart, Steinberger adds that the radar system was used in the infamous Dresden bombing. “That was my contribution during the war,” he states flatly.
The Fermi years
When the war ended, Steinberger went back to Chicago with the intention of completing a thesis in theoretical physics. Then he met Enrico Fermi. “Fermi was the biggest luck I had in my life!” he exclaims, with a spark in his striking blue eyes. “He asked me to look into a problem raised by an experiment by Rossi and Sands on stopping cosmic-ray muons, and suggested that I do an experiment instead of waiting for a theoretical topic to surface,” recalls Steinberger. At the time, most experiments required just a handful of Geiger counters and a detector about 20 cm long, he says. “The experiment I wanted to do required 80 of those and was 50 cm long, so it was not trivial to build it.”
It was the time before computers, when vacuum tubes were the height of technology, and Fermi had identified the resources required in the physics department of the University of Chicago. Once the experiment was up and running, however, Fermi suggested it would produce results more quickly if it were located on top of a mountain, where there would be more mesons from cosmic rays. “He found a young driver – I didn’t know how to drive, it was the beginning of cars – who took me to the only mountain in the US with a road to the top,” says Steinberger. “It was almost as high as Mont Blanc, and I could do the experiment faster by being on top of that thing.”
The experiment showed that the energy spectrum of the electron in certain meson decays is continuous. It suggested that the muon undergoes a three-body decay, probably into an electron and two neutrinos, and helped to lay the experimental foundation for the concept of a universal weak interaction. What followed is history, leading to the discovery of the muon neutrino (see “DUMAND and the origins of large neutrino detectors”). “It is likely that we had no prejudice on the question of whether the neutrino in muon decay is the same as the one in beta decay.”
Apart from the discovery of the muon neutrino, Steinberger’s pioneering work in physics spans 40 years of the history of electroweak theory and experiment. At each turn of a decade, Steinberger was the first user of the latest device available to experimentalists, starting with McMillan’s electron synchrotron when it had just been completed in 1949, followed by Columbia’s 380 MeV cyclotron in 1950. In 1954, he published the first bubble-chamber paper with Leitner, Samios and Schwartz, making a substantial contribution to the technique itself and achieving important results on the properties of the new unstable (strange) particles.
Lasting legacy
What brought Steinberger to CERN in 1968 was the availability of Charpak’s wire chamber, which he realised was a much more powerful way to study K0 decays – to which, he says, he had “become addicted”. He then conceived and led the ALEPH experiment at the Large Electron–Positron (LEP) collider. The results of this and the other LEP experiments, he says, “dominated CERN physics, perhaps the world’s, for a dozen or more years, with crucial precise measurements that confirmed the Standard Model of the unified electroweak and strong interactions”.
These days, Jack still comes to CERN with the same curiosity for the field that he always had. He says he is “trying to learn astrophysics, in spite of my mental deficiencies”, and thinks that the most interesting question today is dark matter. “You have a Standard Model which does not predict everything and it does not predict dark matter, but you can conceive of mechanisms for making dark matter in the Standard Model,” he says. “You don’t know if you really understand it, but you can imagine it. And I am not the only one who doesn’t know.”
This essay is the result of interdisciplinary research pursued by the author, a theoretical physicist, on the concept of the indefinite and its expression in different fields of human knowledge. Examples are taken from the natural sciences, mathematics, economics, neurophysiology, history, ecology and philosophy.
Physics and mathematics often deal with the indefinite, but they try to reduce it, aiming at a theory able to explain everything or at least to allow reliable predictions. Indefiniteness is closely connected to uncertainty, which is a component of many analyses of complex processes, so the concept of the indefinite can also be found in economics and risk assessment.
The author explains how uncertainty is present in the humanities. For example, historians might have to work on just a few indeterminate sources and connect the dots to reconstruct a story. Uncertainty is also inherent to our memory – we tend to forget, and lose and confuse details. Psychologists understand that forgetting permits new ideas to form, while strong memories would prevent them from emerging.
The book shows how uncertainty and indefiniteness define the border of our understanding and, at the same time, are engines for research and for continuous attempts to push back that limit.
The first part focuses on information and how it helps to reduce indefiniteness. New elements must be combined with existing parts to be integrated into the knowledge system, so that the maximum benefit can be drawn from the new information. The author tries to quantify the value of information on the basis of its ability to reduce uncertainty.
The second part of the book presents a number of methods that can be used to handle indefiniteness, which come from fuzzy logic, decision theory, hermeneutics, and semiotics. An interdisciplinary approach is promoted because it enables bridges to be built between the different fields among which our knowledge is dispersed.
By M Duck and M Petry (translators), with an introduction by M Duck
Imperial College Press
Johann Wolfgang von Goethe is undoubtedly famous for his literary work; however, it is not widely known that he was also fond of science and wrote a polemical text on Newton’s theory of light and colours, which he did not accept. He tried to reproduce the experiment that Newton used to demonstrate that light is heterogeneous but, according to what Goethe himself wrote, he could not obtain the same results.
The book provides an English translation of Goethe’s polemic, complemented by an introduction in which a possible explanation of Goethe’s resistance to Newton’s theory is offered. Many suppositions have been put forward: maybe a psychological block prevented him from reasoning clearly, or perhaps he was simply unable to understand Newton’s experiments and to reproduce them properly.
In the introduction to this volume, the editor suggests that the reason for Goethe’s stubborn attitude, which made him preserve his belief that light is immutable and that colours result from the interaction of light and darkness, is theological. Goethe believed in the spiritual nature of light, and he could not conceive it as being anything other than simple, immutable and unknowable.
This book, addressed to historians of science, philosophers and scientists, will allow the reader to discover Goethe’s polemic against Newton and to obtain new insights into the multifaceted personality of the German poet.
This specialist book on superconductivity proposes an approach, based on the Bethe–Salpeter equations, that allows a description of the characteristics of superconductors (SCs) considered unconventional.
The basic theory of superconductivity, elaborated in 1957 by Bardeen, Cooper and Schrieffer (BCS), which earned its “fathers” a Nobel prize, proves inadequate for describing the behaviour of high-temperature superconductors (HTSCs) – materials that have a critical temperature higher than 30 K. In this monograph, the author shows how a generalisation of the BCS equations enables the superconducting features of non-elemental SCs to be addressed in the same manner as elemental SCs are dealt with in the original theory. This generalisation is achieved by adopting the “language” of the Bethe–Salpeter equations.
It was the intention of the author to give an essential treatment of the topic, without including material that is not strictly necessary, and to keep it reasonably simple and accessible. Nevertheless, quantum field theory (QFT) and its finite-temperature version (FTFT) are used to derive some of the equations in the text, so a basic knowledge of them is needed to follow the exposition.