
Topics

Gravity: Newtonian, Post-Newtonian, Relativistic

By Eric Poisson and Clifford M Will
Cambridge University Press
Hardback: £50 $85
E-book: $68
Also available at the CERN bookshop


I heard good things about this book before I got my hands on it, and turning the pages I recognized a classic. Several random reads of its 788 large, dense pages offered a deeper insight into a novel domain, far away from my daily life where I work with the microscopic and cosmological worlds. On deeper inspection, it was nearly all that I hoped for, with only a couple of areas where I was disappointed.

The foreword points out clearly that the reader should not expect any mention of cosmology. Yet the topic of the book has a clear interface with the expanding universe via its connection to our solar system, the so-called vacuole Einstein–Straus solution. Another topic that receives too brief a treatment for my taste is that of Eddington’s isotropic (Cartesian) co-ordinates. They appear on pages 268–269, and resurface in a minor mention on page 704 before the authors’ parametrized post-Newtonian approach is discussed. While this is in line with the treatment in the earlier book by one of the authors (Theory and Experiment in Gravitational Physics by C M Will, CUP 1993), it seems to me that this area has grown in significance in recent years.

The book is not about special relativity, but it is a topic that must of course appear. However, it is odd that Box 4.1 on pages 191–192 on “Tests of Special Relativity” relies on publications from 1977, 1966, 1941 and 1938. I can feel the pain of colleagues – including friends in particle and nuclear physics – who have worked hard during recent decades to improve limits by many orders of magnitude. And on page 190, I see a blind spot in the history of special relativity – authors, please note. Lorentz failed to write down the transformation named after him by Poincaré, who guessed the solution to the invariance of Maxwell’s equations, a guess that escaped Lorentz. However, Einstein was the first to publish his own brilliant derivation.

We know that no book is perfect and complete, entirely without errors and omissions. So the question to be asked is, how useful is this book to you? To find the answer, I’d recommend reading the highly articulate preface available, for example, under “Front Matter” on the publisher’s website. I quote a few words because I could not say it better: “This book is about approximations to Einstein’s theory of general relativity, and their applications to planetary motion around the Sun, to the timing of binary pulsars, to gravitational waves emitted by binary black holes and to many real-life, astrophysical systems…this book is therefore the physics of weak gravitational fields.”

Personally, I found in the book what I was looking for: the technical detail of the physics of large objects such as planets and stars, which can be as many times larger than the proton as they are smaller than the universe. I could not put the book down, despite its weight (1.88 kg). Some might prefer the Kindle edition, but I would hope for a lighter-weight printed volume. Whichever you choose or is available, in dollars per page this book is a bargain. It is a great read that will enrich any personal library.

Data Analysis in High Energy Physics: A Practical Guide to Statistical Methods

By Olaf Behnke, Kevin Kröninger, Grégory Schott and Thomas Schörner-Sadenius (eds)
Wiley
Paperback: £60 €72
E-book: £48.99 €61.99
Also available at the CERN bookshop


This book is actually 11 books in one, with 16 authors, four of whom are also editors. All are high-energy physicists, including one theorist, and all are experts in their assigned areas of data analysis, so the general level of the book is excellent. In addition, the editors have done a good job putting the 11 chapters together so that they work as a single book, and they have even given it a global index. Still, each chapter has its own author(s) and its own style, and I will comment on the individual contributions that I found most interesting.

Roger Barlow (“Fundamental Concepts”) gives a good introduction to the foundations, but surprisingly he has some trouble with frequentist probability, which is the one that physicists understand best because it is the probability of quantum mechanics. Instead of taking an example from physics, where experiments are repeatable and frequentist probability is applicable, he uses life insurance and finds problems. But his example for Bayes’s theorem works fine with frequentist probabilities, even if they are not from physics.

Olaf Behnke and Lorenzo Moneta (“Parameter Estimation”) have produced a useful practical guide for their chapter. The treatment is remarkably complete and concise. I especially liked figure 2.9, which illustrates the fit of a typical histogram to a single peak, showing the value of chi-square as a function of peak position across the whole range of the abscissa, with a local minimum at every fluctuation in the data.

Luc Demortier (“Interval Estimation”) displays an impressive knowledge of both frequentist and Bayesian methodologies, and is careful to list the good and bad features of both in a level of detail that I have seen nowhere else, and did not expect to find in a “practical guide”. He succeeds in presenting a balanced view overall, even though his personal prior shows through in the first sentence, where the point estimate is intuitively defined as “in some sense the most likely value”, instead of the more tangible “in some sense the value closest to the true value”.

The most remarkable aspect of this book is found in the chapters devoted to topics that are not usually covered in books on statistics. Thus “Classification” (by Helge Voss) is treated separately from “Hypothesis Testing” (by Grégory Schott), describing techniques that are common in data analysis but not used in traditional statistics. In “Unfolding”, Volker Blobel reminds us that statistics is really an inverse problem, although it is not usually treated as such. There are two separate chapters on “Theory Uncertainties” and other “Systematic Uncertainties”, a chapter on “Constrained Fits” and two chapters on “Applications”, some of which duplicate subjects treated elsewhere, but of course from a different point of view. In the concluding chapter, Harrison Prosper, in his inimitable style, takes the reader on “a journey to the field of astronomy”.

In summary, this ambitious project has produced a useful book where experimental physicists will find expert knowledge about a range of topics that are indispensable to their work of data analysis.

The LHC gears up for season 2

With the end of the long shutdown in sight, teams at CERN have continued preparations for the restart of the Large Hadron Collider (LHC) this spring after reaching several important milestones by the end of 2014. Beams came knocking at the LHC’s door for the first time on 22–23 November, when protons from the Super Proton Synchrotron passed into the two LHC injection lines and were stopped by beam dumps just short of entering the accelerator. The LHC operations team used these tests to check the control systems, beam instrumentation and transfer-line alignment. Secondary particles – primarily muons – generated during the dump were in turn used to calibrate the two LHC experiments located close to the transfer lines: ALICE and LHCb.

During the same weekend, the operations team also carried out direct tests of LHC equipment. They checked the timing synchronization between the beam and the LHC injection and extraction systems by pulsing the injection kicker magnets and triggering the beam-dump system at point 6, even without beam in the machine.

Tests of each of the eight LHC sectors continued apace. By the end of November, copper-stabilizer continuity measurements were underway in sector 4-5 and were about to start in sector 3-4. Electrical quality assurance tests were being carried out in sectors 2-3 and 7-8, and powering tests were progressing in sectors 8-1, 1-2, 5-6 and 6-7. Cooling and ventilation teams were also busy carrying out maintenance of the systems at points around the LHC ring.

Meanwhile, the operations team were training the magnets in sector 6-7. The first training quench was performed on 31 October, reaching a current of around 10,000 A, which corresponds to a magnetic field of 6.9 T and a proton beam energy of 5.8 TeV (during Run 1, the LHC ran with proton energies of up to 4 TeV). On 9 December, the team successfully commissioned sector 6-7 to the nominal energy for Run 2 – 6.5 TeV, for proton collisions at 13 TeV. The 154 superconducting dipole magnets that make up this sector were powered to around 11,000 A. This increase in nominal energy was possible thanks to the long shutdown, which began in February 2013 and allowed the consolidation of 1700 magnet interconnections, including more than 10,000 superconducting splices. The magnets in all of the other sectors are undergoing similar training prior to 6.5 TeV operation.

In mid-December, the cryogenics team finished filling the arc sections of the LHC with liquid helium. This marked an important step on the road to cooling the entire accelerator to 1.9 K. During the end-of-year break, the cryogenic system was then set to stand-by, with elements such as stand-alone magnets emptied of liquid helium. These elements were to return to cryogenic conditions in January, to allow the operations team to perform more tests on the road to the LHC’s Run 2.

The four large experiments of the LHC – ALICE, ATLAS, CMS and LHCb – are also undergoing major preparatory work for Run 2, after the long shutdown during which important programmes for maintenance and improvements were achieved. They are now entering their final commissioning phase. Here, members of the ATLAS collaboration are cleaning up the inside of the ATLAS detector prior to closing the cavern in preparation for Run 2.

Pakistan to become associate member state of CERN


On 19 December, CERN’s director-general, Rolf Heuer, and the chairman of the Pakistan Atomic Energy Commission, Ansar Parvez, signed in Islamabad the agreement admitting the Islamic Republic of Pakistan to associate membership of CERN, in the presence of prime minister Nawaz Sharif and diplomatic representatives of CERN member states. This followed approval by CERN Council to proceed towards associate membership for Pakistan during its 172nd session held in September 2014. The agreement is still subject to ratification by the government of Pakistan.

The Islamic Republic of Pakistan and CERN signed a co-operation agreement in 1994. The signature of several protocols followed, and Pakistan contributed to building the CMS and ATLAS experiments. Today, Pakistan contributes to the ALICE, ATLAS and CMS experiments, and operates a Tier-2 computing centre in the Worldwide LHC Computing Grid that helps to process and analyse the massive amounts of data that the experiments generate. Pakistan is also involved in accelerator developments, making it an important partner for CERN.

The associate membership of Pakistan will open a new era of co-operation that will strengthen the long-term partnership between CERN and the Pakistani scientific community. Associate membership will allow Pakistan to participate in the governance of CERN, through attending the meetings of the CERN Council. Moreover, it will allow Pakistani scientists to become CERN staff members, and to participate in CERN’s training and career-development programmes. Finally, it will allow Pakistani industry to bid for CERN contracts, thereby opening up opportunities for industrial collaboration in areas of advanced technology.

CERN-JINR reciprocal observers

During its December meeting, Council also welcomed the Joint Institute for Nuclear Research, JINR, for the first time as an observer to Council, as part of a reciprocal arrangement that also sees CERN becoming an observer at JINR. Founded as an international organization at Dubna near Moscow in 1956, JINR soon forged a close partnership with CERN that saw exchanges of personnel and equipment throughout the cold war and beyond.

LHCf detectors are back in the LHC tunnel

The Large Hadron Collider forward (LHCf) experiment measures neutral particles emitted at around zero degrees in hadron interactions at the LHC. Because these “very forward” particles carry a large fraction of the collision energy, they are important for understanding the development of atmospheric air-shower phenomena produced by high-energy cosmic rays. Two independent detectors, Arm1 and Arm2, are installed in the target neutral absorbers (TANs) at 140 m from interaction point 1 (IP1) in the LHC, where the single beam pipe is split into two narrow pipes.

After a successful physics operation in 2009/2010, the LHCf collaboration removed their detectors from the tunnel in July 2010 to avoid severe radiation damage. The Arm2 detector, in the direction of IP2, came back into the tunnel for data-taking with proton–lead collisions in 2013, while Arm1 was being upgraded to be a radiation-hard detector, using Gd2SiO5 scintillators. After completion of the upgrade for both Arm1 and Arm2, the performance of the detectors was tested at the Super Proton Synchrotron fixed-target beam line in Prévessin in October 2014. Arm1 and Arm2 were then reinstalled in the LHC tunnel on 17 and 24 November, respectively.


The installation went smoothly, thanks to the well-equipped remote-handling system for the TAN instrumentation. During the following days, cabling, commissioning and the geometrical survey of the detectors took place without any serious trouble.

LHCf will recommission its data-acquisition system in early 2015, to be ready for the dedicated operation time in May 2015 when the LHC will provide low-luminosity, low pile-up and high β* (20 m) proton–proton collisions. At √s = 13 TeV, these collisions correspond to interactions in the atmosphere of cosmic rays with energy of 0.9 × 1017 eV. This is the energy at which the origins of the cosmic rays are believed to switch from galactic to extragalactic, and a sudden change of the primary mass is expected. Cosmic-ray physicists expect to confirm this standard scenario of cosmic rays based on the highest-energy LHC data.

Another highlight of the 2015 run will be common data-taking with the ATLAS experiment. LHCf will send trigger signals to ATLAS, and ATLAS will record data after pre-scaling. Based on a preliminary Monte Carlo study using PYTHIA8, which selected events with low central activity in ATLAS, LHCf can select very pure (99%) events produced by diffractive dissociation processes. The identification of the origin of the forward particles will help future developments of hadronic-interaction models.

Narrowing down the ‘stealth stop’ gap with ATLAS

In late 2011, ATLAS launched a dedicated programme targeting searches for the supersymmetric partner of the top quark – the scalar top, or “stop” – which could be pair-produced in high-energy proton–proton collisions. If not much heavier than the top quark, this new particle is expected to play a key role in explaining why the Higgs boson is light.

While earlier supersymmetry (SUSY) searches at the LHC have already set stringent exclusion limits on strongly produced SUSY particles, these generic searches were not very sensitive to the stop. If it exists, the stop could decay in a number of ways, depending on its mass and other SUSY parameters. Most of the searches at the LHC assume that the stop decays to the lightest SUSY particle (LSP) and one or more Standard Model particles. The LSP is typically assumed to be stable and only weakly interacting, making it a viable candidate for dark matter. Events with stop-pair production would therefore feature large missing transverse momentum as the two resulting LSPs escape the detector.

The first set of results from the searches by ATLAS were presented at the International Conference on High-Energy Physics (ICHEP) in 2012. A stop with mass between around 225 and 500 GeV for a nearly massless LSP was excluded for the simplest decay mode. Exclusion limits were also set for more complex stop decays.


These searches revealed a sensitivity gap when the stop is about as heavy as the top quark – a scenario that is particularly interesting and well motivated theoretically. Such a “stealth stop” hides its presence in the data, because it resembles the top quark, which is pair-produced roughly six times more abundantly.

Use of the full LHC Run-1 data set, together with the development of novel analysis techniques, has pushed the stop exclusion in all directions. The figure shows the ATLAS limits as of the ICHEP 2014 conference, in the plane of LSP mass versus stop mass for each of the following stop decays: to an on-shell top quark and the LSP (right-most area); to an off-shell top quark and the LSP (middle area); to a bottom quark, off-shell W boson, and the LSP (left-most grey area); or to a charm quark and the LSP (left-most pink area). The exclusion is achieved by the complementarity of four targeted searches (ATLAS Collaboration 2014a–2014d). The results eliminate a stop of mass between approximately 100 and 700 GeV (lower masses were excluded by data from the Large Electron–Positron collider) for a light LSP. Gaps in the excluded region for intermediate stop masses are reduced but persist, including the prominent region corresponding to the stealth stop.

Standard Model top-quark measurements can be exploited to get a different handle on the potential presence of a stealth stop. The latest ATLAS high-precision top–antitop cross-section measurement, together with a state-of-the-art theoretical prediction, has allowed ATLAS to exclude a stealth stop between the mass of the top quark and 177 GeV, for a stop decaying to a top quark and the LSP.

The measurement of the top–antitop spin correlation adds extra sensitivity because the stop and the top quark differ by half a unit in spin. The latest ATLAS measurement (ATLAS Collaboration 2014e) uses the distribution of the azimuthal angle between the two leptons from the top decays, together with cross-section information, to extend the limit for the stealth stop up to 191 GeV.

The rigorous search programme undertaken by ATLAS has ruled out large parts of interesting regions of the stop model and closed in on a stealth stop. It leaves the door open for discovery of a stop beyond the current mass reach, or in remaining sensitivity gaps, at the higher-energy and higher-luminosity LHC Run 2.

CMS measures the ‘underlying event’ in pp collisions

Ever since the earliest experiments with hadron beams, and subsequently during the era of the hadron colliders heralded by CERN’s Intersecting Storage Rings, it has been clear that hadron collisions are highly complicated processes. Indeed, initially it was far from obvious whether it would be possible to do any detailed studies of elementary particle physics with hadron collisions at all.

The question was whether the physics of “interesting” particle production could be distinguished from that of the “background” contribution in hadron collisions. While the former is typically a single parton–parton scattering process at very high transverse momentum (pT), the latter consists of the remnants of the two protons that did not participate in the hard scatter, including the products of any additional soft, multiple-parton interactions. Present in every proton–proton (pp) collision, this soft-physics component is referred to as the “underlying event”, and its understanding is a crucial factor in increasing the precision of physics measurements at high pT. Now, the CMS collaboration has released its latest analysis of the underlying event data at 2.76 TeV at the LHC.


The measurement builds on experimental techniques developed at Fermilab’s Tevatron and previously at the LHC to probe the physics of the underlying event. The main idea is to measure particle production in the region of phase space orthogonal to the high-pT process – that is, in the transverse plane. CMS has measured both the average charged-particle multiplicity and the pT sum of the charged particles. The scale of the hard parton–parton scattering is defined by the pT of the most energetic jet of the event.
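The transverse-region selection described above can be sketched in a few lines. The region boundaries (60°–120° in azimuth from the leading jet) follow the usual underlying-event convention; the particle list and leading-jet direction in the usage example are purely illustrative, not CMS data.

```python
import math

def delta_phi(phi1, phi2):
    """Azimuthal difference wrapped into [-pi, pi]."""
    return (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi

def transverse_region_observables(leading_jet_phi, particles):
    """Charged multiplicity and summed pT in the 'transverse' region,
    60 deg < |dphi| < 120 deg relative to the leading jet.
    `particles` is a list of (pt, phi) tuples for charged tracks."""
    n_ch, sum_pt = 0, 0.0
    for pt, phi in particles:
        adphi = abs(delta_phi(phi, leading_jet_phi))
        if math.pi / 3 < adphi < 2 * math.pi / 3:
            n_ch += 1
            sum_pt += pt
    return n_ch, sum_pt

# Illustrative event: one track in the transverse region,
# one "toward" the jet, one "away" from it.
tracks = [(1.0, math.pi / 2), (2.0, 0.1), (0.5, math.pi)]
n, s = transverse_region_observables(0.0, tracks)
```

Dividing such sums by the area of the region in azimuth and pseudorapidity gives the densities that are compared between collision energies.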

The measurements are expected to result in more accurate simulations of pp collisions at the LHC. Because the properties of the underlying event cannot be derived from first principles in QCD, Monte Carlo generators employ phenomenological models with several free parameters that need to be “tuned” to reproduce experimental measurements such as the current one from CMS.

An important part of the studies concerns the evolution of the underlying-event properties with collision energy. CMS has therefore presented measurements at centre-of-mass energies of 0.9, 2.76 and 7 TeV. Soon, there will be new data from Run 2 at the LHC. The centre-of-mass energy of 13 TeV will necessitate further measurements, and provide an opportunity to probe the ever-present underlying event in uncharted territory.

LHCb observes two new strange-beauty baryons

The LHCb collaboration has discovered two new particles, the Ξ′b– and Ξ*b–. Predicted to exist by the quark model, they are both baryons containing three quarks, in this case b, s and d. The new particles – which thanks to the heavyweight b quark are more than six times as massive as the proton – join the Ξb–, found several years ago by the D0 and CDF experiments at Fermilab.

The three particles are differentiated by the spin, j, of the sd diquark, and the overall spin-parity, JP, of the baryon; in turn, the relative spins of the quarks affect the masses of the particles. With j = 0 and JP = ½+, the Ξb– is the lightest, and so decays relatively slowly through the weak interaction, leading to its discovery at Fermilab’s Tevatron. The Ξ′b– and Ξ*b– have j = 1, with JP = ½+ and JP = 3/2+, respectively, and should decay either strongly or electromagnetically, depending on their masses.


LHCb analysed proton–proton collision data from the LHC corresponding to an integrated luminosity of 3.0 fb–1, to observe the new particles through their decays to Ξb0π–. A third of the data were collected at a centre-of-mass energy of 7 TeV, the remainder at 8 TeV. Signal candidates were reconstructed in the final state Ξb0π–, where the Ξb0 was identified through its decays Ξb0 → Ξc+π–, Ξc+ → pK–π+.

The figure shows the distribution of δm, defined as the invariant mass of the Ξb0π– pair minus the sum of the π– mass and the measured Ξb0 mass. This definition means that the lightest possible mass for the Ξb0π– pair – the threshold for the decay – is at δm = 0. The two peaks are a clear observation of the Ξ′b– (left) and Ξ*b– (right) baryons above the hatched-red histogram representing the expected background. The Ξ*b– is clearly the more unstable of the two, because its peak is wider. This is consistent with the pattern of masses: the Ξ′b– mass is only just above the energy threshold, so it can decay to Ξb0π–, but only just – its width is consistent with zero, with an upper limit of Γ(Ξ′b–) < 0.08 MeV at 95% confidence level.
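The δm variable is simply an invariant-mass difference, chosen so that the decay threshold sits at zero. A minimal sketch of the computation (the four-vectors and masses below are illustrative placeholders, not LHCb values):

```python
import math

def inv_mass(p1, p2):
    """Invariant mass of the sum of two four-vectors (E, px, py, pz)."""
    E = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

def delta_m(p_xib0, p_pi, m_xib0, m_pi):
    """delta_m = m(Xi_b0 pi) - m(pi) - m(Xi_b0).
    Zero marks the kinematic threshold for the decay."""
    return inv_mass(p_xib0, p_pi) - m_pi - m_xib0
```

For a pair produced exactly at threshold (both particles at rest in the pair rest frame) the invariant mass equals the sum of the two masses, so δm = 0 by construction.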

The results show the extraordinary precision of which LHCb is capable: the mass difference between the Ξ′b– and the Ξb0 is measured with an uncertainty of about 0.02 MeV/c2, less than four-millionths of the Ξb0 mass. By observing these particles and measuring their properties with such accuracy, LHCb is making a stringent test of models of nonperturbative QCD. Theorists will be able to use these measurements as an anchor point for future predictions.

Profiling jets with ALICE

Scaled pT spectra of charged particles in jets for different bins of jet transverse momentum.

 

“Jets are collimated sprays of particles.” This ubiquitous characterization used in many articles in the field of jet physics has once again been confirmed by the ALICE collaboration, in a measurement of the production cross-sections, fragmentation and spatial structure of charged jets reconstructed from charged-particle tracks.

Jets observed in collisions of LHC beams emerge from the violent scattering of quarks and gluons. The highly energetic scattered partons develop a parton shower via sequential gluon splittings, which fragments into the measured hadrons – the constituents of the jet. In heavy-ion collisions, jets are an important diagnostic tool for studying quark–gluon plasma (QGP) at the LHC, where effects arising from the interaction of the scattered partons with the dense produced medium are expected. Indeed, a strong suppression of jet production in lead–lead collisions is observed, along with a modification of the jet-fragment distributions.

The interpretation of these effects requires detailed reference measurements of the jet structure and fragmentation in proton–proton collisions, where no medium is formed. In ALICE, charged jets are reconstructed in the central barrel from tracks measured with the inner tracking system and the time-projection chamber. Full jets contain neutral as well as charged particles measured with the ALICE electromagnetic calorimeter (CERN Courier May 2013 p8), but for this recent study the analysis did not include neutral particles in the jet reconstruction. Jets with transverse momenta (pT) from 20 to 100 GeV/c can be measured and analysed particle by particle. With the detector’s excellent low-momentum tracking capabilities, ALICE is unique in being able to measure constituents down to a pT of 150 MeV/c. Measurements at low jet and constituent pT are crucial for heavy-ion collisions, where gluon radiation induced by the medium is expected to enhance the yield of soft jet particles.


The left-hand part of the figure shows the ratios of cross-sections for jets measured with different choices of the resolution parameter, R. Using a distance measure that combines azimuthal-angle and pseudorapidity differences as Δr² = Δφ² + Δη², the jet pT for a given R is the summed pT of the jet constituents accumulated in a cone of size R. The ratio is a measure of the jet structure, i.e. the angular distribution of jet constituents, and the observed increase of the ratio with jet pT indicates stronger collimation for more energetic jets. The ALICE measurements show that 80% of the energy of the reconstructed jet is typically found within 15° of the jet axis.
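The cone sum underlying such a cross-section ratio can be sketched as follows; the jet axis and constituent list in the usage example are made-up numbers for illustration, not ALICE data.

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Distance in the (eta, phi) plane: dr^2 = dphi^2 + deta^2,
    with the azimuthal difference wrapped into [-pi, pi]."""
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(eta1 - eta2, dphi)

def cone_pt(jet_axis, constituents, R):
    """Summed pT of constituents within distance R of the jet axis.
    jet_axis = (eta, phi); constituents = list of (pt, eta, phi)."""
    jet_eta, jet_phi = jet_axis
    return sum(pt for pt, eta, phi in constituents
               if delta_r(eta, phi, jet_eta, jet_phi) < R)

# Illustrative jet: most of the pT sits close to the axis.
axis = (0.0, 0.0)
parts = [(10.0, 0.1, 0.0), (5.0, 0.3, 0.0), (2.0, 0.5, 0.0)]
ratio = cone_pt(axis, parts, 0.2) / cone_pt(axis, parts, 0.4)
```

The closer this ratio is to one, the more collimated the jet, which is the trend the ALICE data show at higher jet pT.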

The right-hand part of the figure shows the jet-fragmentation distribution of constituent pT in the reduced transverse-momentum variable zch = pT(particle,ch)/pT(jet,ch), which measures the fraction of the total charged-jet pT carried by a given jet constituent. For zch > 0.1, the distributions for different charged-jet pT are consistent with each other. This scaling is broken at the lowest zch, owing to the increase of the multiplicity of soft jet constituents with higher jet pT.

The measurement of jet properties in proton–proton collisions is the first step towards studies of the “quenched” jets in the more complex environment of heavy-ion collisions. These results provide a reference for future measurements of the modification of jet fragmentation and structure in heavy-ion collisions, including studies of identified hadrons in jets using the unique particle-identification capabilities of ALICE at the LHC.
