By Andreas Recknagel and Volker Schomerus Cambridge University Press
Hardback: £65 $99
Also available as an e-book
Boundary conformal field theory is concerned with a class of 2D quantum field theories, which display a rich mathematical structure and have many applications, ranging from string theory to condensed-matter physics. This comprehensive introduction to the topic reaches from theoretical foundations to recent developments, with an emphasis on the algebraic treatment of string backgrounds.
By Daniel Denegri, Claude Guyot, Andreas Hoecker and Lydia Roos EDP Sciences
Paperback: €34
Also available at the CERN bookshop
The authors, leading figures in the CMS and ATLAS experiments, have succeeded in writing a remarkable book, which I enthusiastically recommend to anyone interested in learning about the recent progress, open questions and future perspectives of high-energy physics. Throughout its 300 pages, it offers a broad coverage of the present status of particle physics, adding a few chronological accounts to place things in an historical context.
Despite being published in a collection that targets the general public, the book delves into several topics to a deep level and will be useful reading for many professional physicists. To accommodate different audiences, the authors have organized the book nicely in two “layers”, the standard flow of chapters being complemented by extra boxes giving “further reading”. Still, the reader is often told that some sections might be left aside in a first reading. It seems to me that this is a well-balanced solution for such a book, although I wonder if most readers from the “general public” would agree with the claim that the text is written in a “simple and pedagogical form”. The first chapter, describing the Standard Model, is particularly demanding and long, but these 40 pages should not deter the reader: the rest of the book provides easier reading.
I was particularly impressed by the care with which the authors prepared many figures, which in some cases include details that I have not seen in previous works of this kind – for example, the presence of gluon lines and quark–antiquark loops inside the cartoon representing the pion, besides the standard valence quarks. Such representations are common for the proton, especially when discussing deep-inelastic scattering measurements, but it is rare to point out that any hadron – including the π or the Υ – should equally be characterized by “parton distribution functions”. The profusion of high-quality figures and photographs contributes significantly to making this book well worth reading.
A few things could be improved in a future edition. For instance, the number of footnotes is excessive. Though meant as asides not worth including in the main body of the text, they end up disrupting the fluidity of the reading, especially when placed in the middle of a sentence. Most footnotes should be integrated into the text, deleted, or moved to the end of the book, so that the reader can ignore them if preferred. While I understand that this book is addressed to a French audience, I would nevertheless recommend “smoothing out” some French-specific choices. For instance, I was pleased to read that Pierre Fayet, in Paris, had an important role in the development of the MSSM extension of the Standard Model, but I was puzzled to see no other name mentioned in the pages devoted to supersymmetry.
Being one of the “LHC adventurers” myself, I read with particular curiosity the chapters devoted to the construction of the LHC accelerator and experiments, which include many interesting details about sociological aspects. I would have liked this part to have been further expanded, especially knowing by personal experience how fascinating it is to listen to Daniel Denegri, when he tells all sorts of anecdotes about physics and physicists.
All in all, this is a highly recommendable book, which provides an interesting guided tour through present-day high-energy physics while, at the same time, offering opportunities for non-French people to learn some French expressions, such as “se faire coiffer au poteau”. Note, however, that the enjoyable reading comes mixed with harder sections, which require extra effort from the reader: this book, like the LHC data, provides “du pain sur la planche”.
By James Binney and David Skinner Oxford University Press
Hardback: £49.99
Paperback: £24.99
Also available as an e-book
The aim of this book is to give students a good understanding of how quantum mechanics describes the material world. It shows that the theory follows naturally from the use of probability amplitudes to derive probabilities. It emphasizes that stationary states are unphysical mathematical abstractions that enable solution of the theory’s governing equation – the time-dependent Schrödinger equation. Every opportunity is taken to illustrate the emergence of the familiar classical, dynamical world through the quantum interference of stationary states.
By Paolo Amore and John Dirk Walecka World Scientific
Paperback: £32
John Dirk Walecka’s Introduction to Modern Physics: Theoretical Foundations, published in 2009, aimed to cover a range of topics in modern physics in sufficient depth that things would “make sense” to students, so that they could achieve an elementary working knowledge of the subjects. To this end, the book contained more than 175 problems. Now, this companion volume provides solutions to those problems.
By James Lequeux, translated from the original Naissance, évolution et mort des étoiles, published by EDP Sciences World Scientific
Paperback: £17
How stars form from interstellar matter, and how they evolve and die, was understood only relatively recently. All of these aspects are covered in this book by Lequeux, who directed the Marseilles observatory from 1983 to 1988 and served for 15 years as chief editor of the European journal Astronomy & Astrophysics. The text is accompanied by many images, while the theory is explained as simply as possible, but without avoiding mathematical or physical developments when they are necessary for a good understanding of what happens in stars.
By Paul Baillon World Scientific
Hardback: £57
Also available at the CERN bookshop
The theory of differential manifolds is a common substratum of much of our current theoretical description of physical phenomena. It has proved to be well adapted to many branches of classical physics – mechanics, electromagnetism, gravitation – for which it has provided a framework for a precise formulation of fundamental laws. Its use in quantum physics has led to spectacular discoveries associated with the unification of electromagnetic, weak and strong interactions. In this connection, manifolds appear not only in the description of the substratum of these phenomena but also in the description of the phenomena themselves, in terms of the so-called gauge theories.
This mathematical theory constitutes an important body of contemporary mathematics. Baillon’s book, which aims at making the subject accessible to a readership rooted in a completely different culture, adopts an unconventional expository style. Instead of appealing to intuition based on mathematically non-rigorous images and analogies – a common practice – it insists on providing complete proofs of most of the elementary mathematical facts on which the theory is grounded.
A substantial part of the book is devoted to a detailed description of the necessary mathematical equipment. Applications culminate in an introduction to some subtleties of the electroweak theory, as well as of general relativity.
By Mark Thomson Cambridge University Press
Hardback: £40 $75
Also available as an e-book, and at the CERN bookshop
Mark Thomson has written a wonderful new introductory textbook on particle physics. As the title suggests, it is modern and up-to-date. It contains several chapters on the latest developments in neutrino physics and B-meson physics, on the LHC and, of course, on the Higgs boson. All the same, as new data pour in, the latter part on the Higgs boson will have to be updated in future editions, of which I expect there to be many.
The book is aimed at students who are already familiar with quantum mechanics and special relativity, but not quantum field theory. Interestingly, although it is written by an experimentalist, I would say that in level this book is most closely comparable to the well-known textbook by Francis Halzen and Alan Martin, both theorists. However, it is an improvement in many ways.
It starts out with an extensive discussion on what can be measured by detectors, as well as the basics of scattering theory, and the Klein–Gordon and Dirac equations. Thomson then guides the reader carefully through pedagogical steps to the computation of matrix elements and cross-sections for scattering processes at fixed-target experiments and colliders. He uses the helicity-eigenstate basis, which helps to make the underlying physics in the reactions more evident. As a theorist, I might have enjoyed an emphasis on two-component fermions, but this might not be so readily digestible for experimentalists.
I found the chapter on flavour SU(3) well written and elucidating. The chapter on neutrino physics discusses the implications of the measurements of θ13 nicely, and presents the MINOS and Sudbury Neutrino Observatory experiments and their relevance to the determination of the neutrino parameters. Regarding neutrino oscillations, Thomson rightly points out the necessity of the wave-packet treatment, but unfortunately gives no reference to a more detailed discussion, such as the paper by Boris Kayser. The gauge principle and spontaneous symmetry breaking are explained in great detail. The emphasis throughout is always on explicit and concrete computations.
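For orientation, the baseline on top of which the wave-packet subtleties sit is the standard plane-wave two-flavour oscillation probability – a textbook result quoted here for the reader's convenience, not taken from Thomson's book specifically:

```latex
P(\nu_\alpha \to \nu_\beta)
= \sin^2 2\theta \,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right)
\approx \sin^2 2\theta \,\sin^2\!\left(1.27\,\frac{\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right),
```

where θ is the mixing angle, Δm² the mass-squared splitting, L the baseline and E the neutrino energy. The wave-packet treatment modifies this picture by introducing a coherence length beyond which the oscillating term is damped.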
The book is well written – it is easy to read, with clear pedagogical lines of reasoning, and the layout is pleasing. There are numerous homework problems at the end of each chapter. My only criticism would be that since Thomson is an experimentalist, I expected a modern version of Don Perkins’ book, with many details on experimental techniques – that is, a different book. However, as I am teaching an introduction to theory this autumn, I will definitely be using this book.
CERN’s accelerator complex is gradually restarting, more than a year after the start of the first long shutdown (LS1). On 4 April, a team switched on the proton source and a week later re-commissioning began on Linac2. Beams are scheduled to be sent to the Proton Synchrotron Booster (PSB) at the end of May and then on to the Proton Synchrotron (PS) by mid-June, reaching the Super Proton Synchrotron (SPS) later this year, with the LHC restart on course for early 2015. One LHC sector has already begun the long cool down towards its operating temperature.
Since LS1 began, the Superconducting Magnets And Circuits Consolidation (SMACC) project has formed a “train” of workers in the LHC tunnel, with “wagons” of teams distributed across the eight sectors. The end of April marked the installation of the last of the 27,000 shunts – low-resistance connections to consolidate the 10,000 “splices” that interconnect the superconducting magnet bus bars and can carry currents up to 13,000 A. The shunts provide an alternative path for a portion of this current in the event that a splice loses its superconducting state, to avoid repeating the incident in September 2008 that delayed the start-up of the LHC (CERN Courier September 2010 p27).
To install a shunt, the SMACC team had to first open the area around the interconnection, slide custom-built metallic bellows out of the way and remove the thermal shielding within. This revealed a series of metallic pipes linking the magnets to each other. One set of these pipes – the “M lines” – then had to be cut open to access the splices between the superconducting cables. The team opened up the last of the M lines in February this year (CERN Courier April 2014 p5), and since then has been continuing to install the shunts, finishing on 30 April.
Now, emphasis is shifting from installation to testing, including short-circuit tests to verify all of the LHC’s hardware components – cables, interlocks, energy extraction systems, power converters that provide current to the superconducting magnets, and the cooling system. The most complicated components are the superconducting circuits, which have a myriad of different failure modes guarded by interlock and control systems. Testing these circuits now dramatically reduces the time spent in the tunnel during the powering tests at low temperatures, which are planned to start in August. Although the magnet circuits themselves cannot be tested at warm temperatures, the teams can verify the power converter and the circuits up to where the cables enter the magnets. These circuits are tested at the highest current the power converters can achieve – higher than the nominal current in the machine – for a few hours to a day. Teams then verify that the warming effect of the current stabilizes at a temperature that does not affect the behaviour of the cables. To do this, they use infrared cameras to check the temperature rise of the different parts of the circuit.
As well as short-circuit tests, electrical quality-assurance testing (ELQA) is ongoing. Sector 6-7 showed no non-conformities and, on 7 May, was the first of the LHC sectors to begin the cool down to 1.8 K. The other seven sectors will follow as the long journey of the SMACC train approaches its end and the teams complete their final tasks.
The CLOUD (Cosmics Leaving OUtdoor Droplets) experiment at CERN, which is studying whether cosmic rays have a climatically significant effect on aerosols and clouds, is also tackling one of the most challenging problems in atmospheric science – understanding how new aerosol particles are formed in the atmosphere and the effect that these particles have on climate. Aerosol particles and clouds have a large net cooling effect on the planet and, according to the Intergovernmental Panel on Climate Change, they represent the largest source of uncertainty in present climate models. A new study published by CLOUD sheds light on the first steps of cloud formation, helping to improve our understanding of the aerosol–cloud–climate connection.
Cloud droplets form on aerosol particles – tiny solid or liquid particles suspended in the atmosphere – above a size of about 50 nm. Aerosol particles are either emitted directly into the atmosphere (like sea-spray particles) or else form by the spontaneous clustering (“nucleation”) of trace atmospheric molecules. Around one half of all cloud seeds are thought to originate from nucleated particles, but the process is poorly understood. Sulphuric acid is thought to play a key role, but previous studies by CLOUD have shown that it cannot form new particles in the lower atmosphere without another ingredient to glue the molecules together and prevent them evaporating. CLOUD recently showed one such ingredient to be a class of vapours known as amines. However, these are only found close to primary sources, such as animal husbandry, so another ingredient must be involved.
Vapours at the level of one part in 10¹² control atmospheric nucleation, so it is a challenge to meet the technological requirements for studies in the laboratory. The chamber used by the CLOUD collaboration has achieved much lower concentrations of contaminants than previous experiments, allowing nucleation to be measured under atmospheric conditions for the first time without the complicating effect of undetected gases. CLOUD uses state-of-the-art instruments to measure these very low concentrations of atmospheric vapours and also to measure the chemistry and growth of newly formed molecular clusters from single molecules up to stable particles. Another unique aspect of the experiment is the capability to measure nucleation arising from ionization by cosmic rays, or from the enhanced ionization provided by a pion beam from CERN’s Proton Synchrotron – or with the effects of all ionization suppressed completely by means of a strong electric clearing field.
For the latest results, CLOUD studied particle nucleation involving the oxidation products of a volatile biogenic vapour known as alpha-pinene, which gives pine forests their familiar smell. This and similar volatile biogenic vapours are oxidized in the atmosphere to produce daughter vapours with extremely low volatility. During the past few years, numerous studies have shown the importance of these oxidized biogenic vapours for growing freshly nucleated particles to sizes where they can seed cloud droplets. However, it was not known if these oxidized biogenic vapours could provide the glue to help the first sulphuric acid molecules stick together to form embryonic particles.
Now, CLOUD has shown that oxidized biogenic vapours do form new particles with sulphuric acid and, moreover, that this process can explain a large fraction of particle formation observed in the lower atmosphere. Ions produced in the atmosphere by galactic cosmic rays are found to provide significant additional stability to the clusters, but only when the concentrations of sulphuric acid and oxidized organic vapours are relatively low.
In addition to the experimental measurements, the CLOUD team reported theoretical and global modelling studies. Quantum chemical calculations confirm the stability of the embryonic clusters of sulphuric acid and oxidized organics. Moreover, a global modelling study that includes the new nucleation mechanism has captured – for the first time – the pronounced seasonal variation of new particle production that is observed in the atmosphere. The modelling results establish that trees play a fundamental role in forming new particles in the atmosphere, which is familiar as “blue haze” when viewing distant mountains.
• The CLOUD collaboration consists of the California Institute of Technology, Carnegie Mellon University, CERN, Finnish Meteorological Institute, Helsinki Institute of Physics, Johann Wolfgang Goethe University Frankfurt, Karlsruhe Institute of Technology, Lebedev Physical Institute, Leibniz Institute for Tropospheric Research, Paul Scherrer Institute, University of Beira Interior, University of Eastern Finland, University of Helsinki, University of Innsbruck, University of Leeds, University of Lisbon, University of Manchester, University of Stockholm and University of Vienna.
The new results shown by the ALICE collaboration at the Quark Matter 2014 conference in Darmstadt focus principally on the most recent collisions at the LHC – those of protons and lead nuclei (pPb) performed towards the end of Run 1 in early 2013. Planned mainly as a control experiment for studies of deconfined matter in lead–lead (PbPb) collisions, the pPb run in fact revealed tantalizing possibilities for studying collective phenomena in small systems, but at exceedingly high energy densities. This subject, already tackled in several theoretical papers, featured at the conference. Having already published interesting results on flow-like correlations measured via the azimuthal emission patterns of particles, ALICE presented a host of results on particle correlations, extending out to mesons containing charm quarks (D mesons).
The initial-state effects (gluon shadowing) are studied in pPb collisions via the nuclear modification factor, which quantifies the measurement in nuclear collisions with respect to that in pp collisions scaled by the corresponding number of binary collisions. Its dependence on transverse momentum (pT) for inclusive charged-particle production, already obtained with the data of the pilot pPb run of 2012, has been extended up to pT of 50 GeV/c (figure 1). The data support the binary-collision scaling up to the highest measured pT and confirm the predictions from saturation/shadowing models.
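For readers unfamiliar with the observable, the nuclear modification factor described above can be written schematically (the notation is the standard one in the heavy-ion literature, not taken from the ALICE paper itself):

```latex
R_{p\mathrm{Pb}}(p_T) \;=\;
\frac{1}{\langle N_{\mathrm{coll}}\rangle}\,
\frac{\mathrm{d}N_{p\mathrm{Pb}}/\mathrm{d}p_T}{\mathrm{d}N_{pp}/\mathrm{d}p_T},
```

where ⟨N_coll⟩ is the average number of binary nucleon–nucleon collisions. A value of R ≈ 1 indicates binary-collision scaling, as observed here at high pT, while R < 1 signals suppression, as seen in PbPb collisions.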
ALICE has obtained results for a variety of hadron species, including charmed and beauty hadrons (directly identified or studied via their decays into electrons or muons), as figure 2 shows. An enhancement at pT values around 5 GeV/c – known from measurements at lower energies as the “Cronin effect” – is found for protons and observed to depend on the “centrality” of the collision. For all hadrons, binary-collision scaling is fulfilled at high pT in pPb collisions. By contrast, in PbPb collisions, the measured suppression of hadron production is an indication of strong interactions within the hot and dense medium (or jet quenching, measured also with reconstructed jets).
Studies of Bose–Einstein pion correlations with two and three pions demonstrate that, in pPb collisions, the size of the system at freeze-out is closer to that in pp collisions than to PbPb, for events with the same particle multiplicities.
ALICE has also found that the ψ(2S) charmonium state is more suppressed in pPb, compared with expectations from the scaling of production in pp collisions, than its lower-mass sibling J/ψ. No convincing theoretical explanation has been put forward, but the ALICE results might imply a final-state effect, possibly owing to deconfined matter in pPb collisions.
Another interesting measurement presented in Darmstadt is the dependence of open-charm hadrons (D mesons) and charmonium production on the multiplicity measured in pPb collisions.
The interpretation of the above results is complicated by the observation in ALICE that the connection between experimental observables and collision geometry (the impact parameter) is much more complex in pPb collisions than in PbPb. A comprehensive study performed with data on charged-particle production illustrates this, and is likely to imply further challenges to theory: not the geometry, but the “event activity” in terms of the hadron multiplicity in a given region of rapidity will need to be properly modelled in theoretical descriptions of pPb collisions.
Lastly, the pPb data allowed ALICE to make a first-ever measurement: that of W-boson production, performed at forward and backward rapidities with triggered data from the muon spectrometer.
The results for PbPb collisions also span a range of observables. A comprehensive study of hadron production was presented, with comparisons across systems – PbPb, pPb and pp – as well as with measurements at Brookhaven’s Relativistic Heavy-Ion Collider (RHIC). Comparison of hadron abundances in PbPb collisions with thermal-model calculations leads to a temperature value lower than that extracted from data from RHIC. Some distinct discrepancies between model and data, like that for protons, have been a hot subject of discussion for some time. Those notwithstanding, the measured hadron abundances are, remarkably, predicted by the thermal model at chemical freeze-out (figure 3). The new ALICE data on (hyper)nuclei extend this conclusion to weakly bound systems, such as the deuteron or the hypertriton.
Insights into collision dynamics are obtained from the analysis of spectra and flow of a variety of hadron species. Among them, the φ meson, containing a strange-antistrange quark pair and with a mass close to that of the proton, plays a prominent role. The data show that it is the mass of the φ meson, and not its quark content, that determines its pT spectrum and flow. The measurement of Υ production in PbPb reveals a rapidity-dependent suppression (figure 4), which finds no explanation in any theoretical model to date. This adds a new facet to the fascinating story of quarkonium in deconfined matter. Here, recent ALICE data on J/ψ agree with the theoretical understanding of charmonium production via (re)generation mechanisms at the chemical freeze-out or during the whole lifetime of the deconfined system.
The new measurement of the nuclear modification factor of electrons from beauty-hadron decays is a first step in ALICE into the sector of b-quark energy loss. ALICE has also performed the first measurement of coherent ψ(2S) photoproduction in ultra-peripheral PbPb collisions. Albeit not directly a “Quark Matter” topic, this is a relevant measurement because it constrains parton shadowing in the colliding nuclei. Another premiere is the first measurement of particle-type jet fragmentation at hadron colliders, achieved in pp collisions in ALICE with pions, kaons, and protons up to pT = 20 GeV/c.
A host of other results in PbPb collisions, covering jets, particle production and flow, and quarkonia – some already submitted for publication – mark the Quark Matter 2014 conference as a milestone in the ALICE physics output.