
Bananaworld: Quantum Mechanics for Primates

By Jeffrey Bub
Oxford University Press


This is not another “quantum mechanics for dummies” book, as the author himself states. Nevertheless, it is a text about quantum mechanics that is not aimed at experts in the field. It explains complex concepts of theoretical physics almost entirely without formulas and assumes no specialist background.

The book focuses on an intriguing issue of present-day physics: nonlocality and the associated phenomenon of entanglement. In macroscopic terms, we are used to the idea that what happens here affects only the surrounding environment. But at the microscopic level where quantum mechanics applies, things work differently. Scientists discovered that in this case, besides the local effects, there are subtler ones that reveal themselves in strange correlations occurring instantaneously between remote locations. Even stronger nonlocal correlations, still consistent with relativity, have been proposed theoretically, but have not been observed so far.

The author treats this complex subject using a particular metaphor, which is actually more than just that: he constructs a metaphorical world of magic bananas, and of simple actions that can be performed on them. Thanks to this device, he is able to explain nonlocality and other difficult physics concepts in a relatively easy and comprehensible way.

Although it requires some general knowledge of mathematics and familiarity with science, this book will be accessible and interesting to a wide range of readers, as well as being an entertaining read.

Particles and the Universe: From the Ionian School to the Higgs Boson and Beyond

By Stephan Narison
World Scientific


This book presents the history of particle physics, from the introduction of the concept of particles by the Greek philosophers to the discovery of the final piece of the Standard Model, the Higgs boson, at CERN in 2012. Following the development of the field chronologically, the author gives an overview of the most important notions and theories of particle physics.

The text is divided into seven sections. The first part provides the basic concepts and a summary of the history of physics, arriving at the modern theory of forces, which are the subject of the second part. The book then moves on to the Higgs boson discovery and a description of some of the experimental apparatus used to study particles (from the LHC at CERN to cosmic-ray and neutrino experiments). The author also provides a brief treatment of general relativity, the Big Bang model and the evolution of the universe, and discusses future developments in particle physics.

In the main body of the book, the topics are presented in a non-technical fashion, in order to be accessible to non-experts. Nevertheless, a rich appendix provides demonstrations and further details for advanced readers. The text is accompanied by plenty of images, including paintings and photographs of many of the protagonists of particle physics.

Statistical Methods for Data Analysis in Particle Physics

By Luca Lista
Springer
Also available at the CERN bookshop


Particle-physics experiments are very expensive, not only in terms of the cost of building accelerators and detectors, but also due to the time spent by physicists and engineers in designing, building and running them. With the statistical analysis of the resulting data being relatively inexpensive, it is worth trying to use it optimally to extract the maximum information about the topic of interest, whilst avoiding claiming more than is justified. Thus, lectures on statistics have become a regular feature of graduate courses, and workshops have been devoted to statistical issues in high-energy-physics analysis. This also explains the number of books written by particle physicists on the practical applications of statistics to their field.

This latest book by Lista is based on the lectures that he has given at his home university in Naples and elsewhere. As part of Springer’s “Lecture Notes in Physics” series, it has the attractive feature of being short – a mere 172 pages. The disadvantage is that some of the explanations of statistical concepts would have benefited from a somewhat fuller treatment.

The range of topics covered is remarkably wide. The book starts with definitions of probability, while the final chapter is about discovery criteria and upper limits in searches for new phenomena, and benefits from Lista’s direct involvement in one of the large experiments at CERN’s LHC. It mentions such topics as the Feldman–Cousins method for confidence intervals, the CLs approach for upper limits, and the “look elsewhere effect”, which is relevant for discovery claims. However, there seems to be no mention of the fact that a motivation for the Feldman–Cousins method was to avoid empty intervals, nor that the CLs method was introduced to protect against excluding the signal-plus-background hypothesis when the analysis has little or no sensitivity to the presence or absence of the signal.

The book has no index, nor problems for readers to solve. The latter is unfortunate. In common with learning to swim, play the violin and many other activities, it is virtually impossible to become proficient at statistics by merely reading about it: some practical exercise is also required. However, many worked examples are included.

There are several minor typos that the editorial system failed to notice; and in addition, figure 2.17, in which the uncertainty region for a pair of parameters is compared to the uncertainties in each of them separately, is confusing.

There are places where I disagree with Lista’s emphasis (although statistics is a subject that often does produce interesting discussions). For example, Lista claims it is counter-intuitive that, for a given observed number of events, an experiment that has a larger than expected number of background events (b) provides a tighter upper limit than one with a smaller background (i.e. a better experiment). However, if there are 10 observed events, it is reasonable that the upper limit on any possible signal is better if b = 10 than if b = 0. What is true is that the expected limit is better for the experiment with smaller backgrounds.
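Lista’s example can be checked with a few lines of code. The sketch below computes the classical (frequentist) Poisson upper limit – the smallest signal s for which the probability of observing at most n_obs events, given an expectation of s + b, falls to 5% – using only the standard library; the function names are mine, not the book’s.

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, b, cl=0.95):
    """Classical upper limit on the signal s: the smallest s for which
    P(N <= n_obs | s + b) drops to 1 - cl."""
    lo, hi = 0.0, 100.0
    for _ in range(60):  # bisection on the monotonically falling tail probability
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid + b) > 1 - cl:
            lo = mid
        else:
            hi = mid
    return hi

# Ten observed events: the limit with b = 10 is tighter than with b = 0,
# and the two differ by exactly b, since only s + b enters the condition.
print(round(upper_limit(10, 0), 2))   # ~17
print(round(upper_limit(10, 10), 2))  # ~7
```

As the review notes, it is the expected limit, averaged over possible outcomes, that favours the low-background experiment; for a fixed observed count, the larger background genuinely leaves less room for a signal.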

Finally, the last three chapters could be useful to graduate students and postdocs entering the exciting field of searching for signs of new physics in high energy or non-accelerator experiments, provided that they have other resources to expand on some of Lista’s shorter explanations.

Path Integrals for Pedestrians

By E Gozzi, E Cattaruzza and C Pagani
World Scientific


The path-integral formulation of quantum mechanics is one of the basic tools used to construct quantum field theories, especially gauge-invariant theories. It is the bread and butter of modern field theory. Feynman’s original formulation developed and extended some of the work of Dirac in the early 1930s, and provided an elegant and insightful solution to a generic Schrödinger equation.

This short book provides a clear, pedagogical and insightful presentation of the subject. The derivations of the basic results are crystal clear, and the worked applications are rather original. It includes a nice presentation of the WKB approximation in this context, including the Van Vleck and functional determinants, the connection formulae and the semiclassical propagator.

An interesting innovation in this book is that the authors give a clear presentation of the path-integral formulation of the Wigner functions, which are fundamental in the study of quantum statistical mechanics, and, for the first time in an elementary book, of the work of Koopman and von Neumann on classical and statistical mechanics.

The book closes with a well-selected set of appendices, where further technical details and clarifications are presented. Some of the more mathematical details of the basic derivations can be found there, as well as aspects of operator ordering as seen from the path-integral formulation, the formulation in momentum space, and the use of Grassmann variables.

It will be difficult to find a better and more compact introduction to this fundamental subject.

Record-breaking production at the LHC

The past few weeks have been a record-breaking period for the LHC, with the machine now delivering long fills with unprecedented luminosity. Following the interruption in late May due to problems with the PS main power supply, on 1 June the operations team established collisions with 2040 bunches for the first time this year. This is the maximum number of bunches achievable with the current limitations from the SPS beam dump, which allows the injection of trains of 72 bunches spaced by 25 ns.

The following week saw the LHC’s previous luminosity record at 6.5 TeV broken by a peak luminosity of just over 8 × 10³³ cm⁻² s⁻¹, representing 80% of the design luminosity. This was followed by a new record for integrated luminosity in a single fill, with 370 pb⁻¹ delivered in just 18 hours of colliding beams. The availability for collisions during this period was a remarkable 75%, more than double the annual average in 2015. Around 2 fb⁻¹ were delivered during one week, breaking the previous record of 1.4 fb⁻¹ established in June 2012.

These records follow the decision taken at the end of May to focus on delivering the highest possible integrated luminosity for the summer conferences. Following a short technical stop that ended on 9 June, the machine was re-validated from a machine-protection perspective for a sustained period of 2040-bunch operation at high luminosity. Afterwards, new records were set immediately, with one fill on 13–14 June producing more than 0.5 fb⁻¹ in around 27 hours, and the following fill recording a peak luminosity of over 9 × 10³³ cm⁻² s⁻¹. The record integrated luminosity delivered in seven days now stands at 2.4 fb⁻¹. Finally, on 26 June, the team hit the LHC design luminosity (10³⁴ cm⁻² s⁻¹) for the first time. With such performance, the operations team hopes to deliver over 10 fb⁻¹ to both ATLAS and CMS before the summer conferences.
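The units involved convert straightforwardly: an inverse femtobarn is 10³⁹ cm⁻², so a constant instantaneous luminosity multiplied by a running time gives the integrated luminosity directly. A minimal sketch of that arithmetic (illustrative numbers only, not LHC fill data, and ignoring the luminosity decay during a real fill):

```python
CM2_PER_INV_FB = 1e-39  # 1 fb = 1e-39 cm^2, so 1 fb^-1 = 1e39 cm^-2

def integrated_fb(lumi_cm2s, hours):
    """Integrated luminosity in fb^-1 for a constant instantaneous
    luminosity (in cm^-2 s^-1) sustained for the given number of hours."""
    return lumi_cm2s * hours * 3600.0 * CM2_PER_INV_FB

# Design luminosity held steady for a full day would deliver:
print(integrated_fb(1e34, 24))  # ~0.86 fb^-1
```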

This is truly a new phase for the LHC and thanks are due to all the teams who have worked tirelessly to make it possible. This year the smaller beam size at the interaction points provides almost double the instantaneous luminosity compared to 2015, yet the machine is behaving impeccably. The stunning and surprising availability is due to a sustained effort over the years by hardware groups such as cryogenics, quench protection, power converters, RF, collimation, injection and others to maximise the reliability of their systems. Of particular note is the major effort co-ordinated by the radiation to electronics team to mitigate the effects of beam-induced radiation on tunnel electronics.

Another case in point concerns cryogenics. With so many bunches circulating, the heat load deposited by the electron cloud on the LHC beam screens in the arcs can reach 150 W per half-cell (a half-cell in an arc includes one quadrupole and three dipole magnets). This is just below the maximum of 160 W that can be sustained by the cryogenics system. With a new cryogenic feed-forward system in place to tune the beam-screen cooling parameters according to the intensity stored in the machine and the beam energy, operation with the high electron cloud currently present in the machine is significantly smoother than in 2015. Of course, the LHC remains a hugely complex machine and the availability is always liable to fluctuations.

CMS highlights from the fourth LHCP conference

The CMS collaboration presented 15 new results at the fourth annual Large Hadron Collider Physics (LHCP) conference on 13–18 June in Lund, Sweden. The results included a mixture of searches for new physics and Standard Model measurements at a centre-of-mass energy of 13 TeV. CMS also summarized its detector and physics-object performance on recently collected 2016 data, demonstrating that the collaboration has emerged from the winter shutdown ready for discovery physics.

The search for new physics in 13 TeV proton collisions continues in earnest, with six new results presented at LHCP. A combined search for high-mass resonances decaying to the Zγ final state, with Z bosons decaying to leptons, in the 8 and 13 TeV data sets yields no significant deviation from background expectations for masses ranging from a few hundred GeV to 2 TeV (EXO-16-021). A similar search in the same channel, but with Z bosons decaying to quarks, produced a similar conclusion (EXO-16-020). CMS has also searched for heavy Z′ bosons that decay preferentially to third-generation fermions, including decays to pairs of top quarks (B2G-15-003) and τ leptons (EXO-16-008), and found no excess above the Standard Model prediction.

The top-quark-pair analysis uses special techniques to search the all-hadronic final state, where the highly boosted top quarks are reconstructed as single jets, while the search in the τ lepton channel is carried out in four final states depending on the decay mode. No significant signals are observed in either search, resulting in the exclusion of Z′ bosons up to a mass of 3.3 (3.8) TeV for widths of 10 (30)% relative to the mass in the top search, and 2.1 TeV in the τ lepton search. Another search using the τ lepton looks for heavy neutrinos from right-handed W bosons and third-generation scalar leptoquarks in events containing jets and two hadronically decaying τ leptons. This is the first such search for heavy neutrinos using τ leptons, and CMS finds the data well described by Standard Model backgrounds.

CMS continues to probe for possible dark-matter candidates, most recently in final states that contain top quarks (EXO-16-017) or photons (EXO-16-014) plus missing energy. The data are consistent with Standard Model backgrounds and limits are placed on model parameters associated with the dark matter and graviton hypotheses. A search for supersymmetric particles in the lepton-plus-jets final state was also presented for the first time (SUS-16-011). This analysis targets so-called compressed spectra in which weakly interacting supersymmetric particles can have similar masses, giving rise to muons and electrons with very low transverse momentum. No significant signals are observed and limits are placed on the masses of top squarks and gluinos under various assumptions about the mass splittings of the intermediate states.

Finally, a search for a heavy vector-like top quark T decaying to a standard top quark and a Higgs boson (B2G-16-005) was presented for the first time at LHCP. For T masses above 1 TeV, the top quark and Higgs boson are highly boosted and their decay products are reconstructed using similar techniques as in B2G-15-003. Here the data are also consistent with background expectations, allowing CMS to set limits on the product of the cross section and branching fraction for T masses in the range 1.0–1.8 TeV.

Several new Standard Model measurements were shown for the first time at LHCP, including the first measurement of the top-quark cross-section at 5 TeV (TOP-16-015) based on data collected during a special proton–proton reference run in 2015 (figure 1). A first measurement by CMS of the WW di-boson cross-section at 13 TeV was also reported (SMP-16-006), where the precision has already reached better than 10%. Finally, three new results on Higgs boson physics were presented for the first time, including the first searches at 13 TeV for vector-boson-fusion Higgs production in the bottom-quark decay channel (HIG-16-003) and a search for Higgs bosons produced in the context of the MSSM that decay via the τ lepton channel (HIG-16-006). A first look at Higgs lepton-flavour-violating decays in the 13 TeV data (HIG-16-005), using the μτ channel, does not confirm the slight (2.4σ) excess observed in Run 1, although more data are needed for a definitive conclusion.

ATLAS clocks rare decay of B mesons into muon pairs


The decays of the B⁰ₛ and B⁰ mesons into muon pairs represent an important test of the Standard Model. Such decays take place through a flavour-changing neutral-current process, which occurs only through loop diagrams and is further suppressed because the two muons are required to have equal helicity in order to conserve angular momentum.

Although the very small values of the predicted branching fractions (3.7 × 10⁻⁹ and 1.1 × 10⁻¹⁰ for the B⁰ₛ and B⁰, respectively) open the possibility of searching for new physics, the decays present a challenge for experimental programs. Physicists have been placing upper limits on these processes for more than 30 years, with the values decreasing by roughly two orders of magnitude every decade.

ATLAS recently presented the result of a study based on data collected during LHC Run 1, complementing the results obtained by CMS and LHCb (CERN Courier September 2013 p19). The new analysis exploits multivariate techniques to reduce background events that could mask the small signal from B-meson decays. A first classifier is used to reduce the background due to muons from uncorrelated decays of B hadrons, while a second classifier is used to reduce the fraction of hadrons wrongly identified as muons. Misidentification contributes to the background due to partially reconstructed decays, and is at the origin of the resonant background due to B⁰ₛ decays into pairs of charged mesons when both are mistaken for muons. ATLAS has achieved values of about 0.10% and 0.05% for the probability of a kaon or a pion, respectively, to be wrongly identified as a muon, pushing the resonant background below the predicted level of the signal.

For the B⁰ₛ meson, the branching fraction measured by ATLAS is B(B⁰ₛ → μ⁺μ⁻) = (0.9 +1.1/−0.8) × 10⁻⁹, with an upper limit of 3.0 × 10⁻⁹ at 95% confidence level. The result agrees, within uncertainties, with those of CMS and LHCb. It is lower than the Standard Model prediction but compatible with it at the level of two standard deviations. For the B⁰, an upper limit B(B⁰ → μ⁺μ⁻) < 4.2 × 10⁻¹⁰ is set at 95% confidence level, which again is compatible with previous evidence and predictions.

The new result constrains new-physics models that predict a significant enhancement of these B decays, such as some with an extended Higgs sector. Deviations in the direction of lower branching fractions will require further clarification with data collected during LHC Run 2.

LHCb updates Bs oscillation measurement


The LHCb experiment has recently made the most precise measurement yet of the asymmetry in oscillations between the matter and antimatter versions of Bs mesons. The measurement exploits the full LHCb data set recorded during Run 1 of the LHC and is consistent with the Standard Model prediction.

Subtle quantum-mechanical effects allow the Bs meson, which contains a strange quark and a beauty antiquark, to spontaneously transform into its own antiparticle, the B̄s, in which the quark–antiquark assignment is reversed. Due to quantum interference effects, in the Standard Model this transition occurs at almost exactly the same rate as the reverse process, with the asymmetry between them predicted to be two parts in a hundred thousand. Finding an asymmetry significantly different from this value would suggest that particle–antiparticle oscillations can be indirectly affected by the presence of heavy new particles, as are predicted in new-physics models.

Many of the oscillations can occur within the finite lifetime of the Bs mesons, and an asymmetry would therefore appear as a difference in the numbers of Bs and B̄s meson decays observed by LHCb. Semi-leptonic decays into a charmed hadron, a muon and a neutrino are particularly well suited, and the LHCb data set contains around two million of them. The challenge is to avoid being fooled by fake sources of asymmetry due to small imperfections in the detector. Novel methods have been developed to control these sources, based on extensive use of the rich samples of signals with charm and charmonium decays.

The final measured asymmetry is 0.45±0.26±0.20%, which is a factor of two more precise than the next-best measurement from the D0 experiment at Fermilab (see figure). The 13 TeV data that are now being recorded will provide an increased rate of Bs and Bd mesons, which will enable LHCb to probe far smaller asymmetries and enhance its sensitivity to possible new physics effects.

ALICE separates hot and cold nuclear effects


In the early stages of a high-energy collision, high-pT partons can be created, before producing sprays of hadrons that are measured experimentally as jets. Not only do high-pT partons carry information about the parton scattering itself, but they also serve as probes for the environment they cross. In nucleus–nucleus collisions, for instance, high-pT partons probe the strongly interacting medium of quarks and gluons (the quark-gluon plasma, QGP). Due to the interactions of these partons with the QGP, particle production is suppressed at large transverse momentum compared to an incoherent superposition of nucleon–nucleon collisions.

Although this observation is one of the key results in heavy-ion collisions, it is a priori not clear to what extent the suppression is caused by hot nuclear-matter effects, such as the jet–medium interaction, and to what extent by cold nuclear-matter effects, such as the presence of the nucleus itself. Unlike in lead–lead collisions, modifications of jet production due to hot nuclear-matter effects are not expected in proton–lead collisions. Therefore, measurements of the nuclear modification of jet spectra in proton–lead collisions can be used to disentangle cold from hot nuclear-matter effects.
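The comparison described here is conventionally quantified with the nuclear modification factor R_pPb: the per-event jet yield in proton–lead collisions divided by the proton–proton yield scaled by the mean number of binary nucleon–nucleon collisions. A minimal sketch of that ratio (hypothetical numbers; the function name is mine):

```python
def r_ppb(yield_ppb, mean_n_coll, yield_pp):
    """Nuclear modification factor: the p-Pb yield divided by the
    binary-collision-scaled pp yield. R_pPb = 1 means no modification."""
    return yield_ppb / (mean_n_coll * yield_pp)

# A p-Pb yield that is exactly <N_coll> times the pp yield is unmodified:
print(r_ppb(7.0, 7.0, 1.0))  # 1.0
```

A value below unity would signal suppression relative to simple binary scaling, which is what the ALICE measurement tests for cold nuclear matter.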

ALICE has recently measured charged jet spectra and their nuclear modification in proton–lead collisions for transverse momenta within 20–120 GeV/c. The main corrections include the subtraction of a mean underlying event density and a statistical treatment of within-event fluctuations, as well as an unfolding of the detector response. One of the challenges in analysing proton–lead collisions is to be able to measure the collision geometry (called the event centrality), and also to evaluate the mean number of binary nucleon–nucleon collisions for different centralities. Several methods for centrality determination were tested in ALICE and the least-biased method was used for this measurement.

The measurement produces a clear result: for the probed acceptance, and within the systematic and statistical uncertainties, all nuclear modification factors are compatible with unity. The charged jet spectra measured in proton–lead collisions at an energy of 5.02 TeV do not show any significant centrality dependence and they scale with the jet spectra in proton–proton collisions at the same energy. Therefore, there is no evidence that high transverse momentum jets are modified by the cold nuclear medium, confirming the conclusion drawn from measurements of the nuclear modification factor for single high-transverse momentum hadrons.

ESO signs largest ever ground-based astronomy contract

The European Extremely Large Telescope (E-ELT) will be the largest optical/near-infrared telescope in the world, boasting a primary mirror 39 m in diameter. Its aim is to measure the properties of the first stars and galaxies and to probe the nature of dark matter and dark energy, in addition to tracking down Earth-like planets.

At a ceremony in Garching bei München, Germany, on 25 May, the European Southern Observatory (ESO) signed a contract with the ACe Consortium for the construction of the dome and telescope structure of the E-ELT. With an approximate value of €400 million, it is the largest contract ever awarded by ESO and the largest contract ever in ground-based astronomy. The occasion also saw the unveiling of the construction design of the E-ELT, which is due to enter operation in 2024.

The construction of the E-ELT dome and telescope structure can now commence, taking telescope engineering into new territory. The contract includes not only the enormous 85 m-diameter rotating dome, with a total mass of around 5000 tonnes, but also the telescope mounting and tube structure, with a total moving mass of more than 3000 tonnes. Both of these structures are by far the largest ever built for an optical/infrared telescope and dwarf all existing ones.

The E-ELT is being built on Cerro Armazones, a 3000 m-high peak about 20 km from ESO’s Paranal Observatory. The access road and leveling of the summit have already been completed and work on the dome is expected to start on site in 2017.
