
Report reveals full reach of LHC programme

Projected uncertainties on the Higgs-boson couplings to SM particles

The High-Luminosity LHC (HL-LHC), scheduled to operate from 2026, will increase the instantaneous luminosity of the LHC by at least a factor of five beyond its initial design luminosity. The analysis of a fraction of the data already delivered by the LHC – a mere 6% of what is expected by the end of HL-LHC in the late-2030s – led to the discovery of the Higgs boson and a diverse set of measurements and searches that have been documented in some 2000 physics papers published by the LHC experiments. “Although the HL-LHC is an approved and funded project, its physics programme evolves with scientific developments and also with the physics programmes planned at future colliders,” says Aleandro Nisati of ATLAS, who is a member of the steering group for a new report quantifying the HL-LHC physics potential.
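To put that 6% figure in perspective, a back-of-the-envelope check against the 3000 fb–1 HL-LHC target quoted later in this article:

```latex
0.06 \times 3000\ \mathrm{fb}^{-1} \approx 180\ \mathrm{fb}^{-1}
```

roughly the integrated luminosity delivered to each of ATLAS and CMS by the end of Run 2.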

The 1000+ page report, published in January, contains input from more than 1000 experts from the experimental and theory communities. It stems from an initial workshop at CERN held in late 2017 (CERN Courier January/February 2018 p44) and also addresses the physics opportunities at a proposed high-energy upgrade (HE-LHC). Working groups have carried out hundreds of projections for physics measurements within the extremely challenging HL-LHC collision environment, taking into account the expected evolution of the theoretical landscape in the years ahead. In addition to their experience with LHC data analysis, the report factors in the improvements expected from the newly upgraded detectors and the likelihood that new analysis techniques will be developed. “A key aspect of this report is the involvement of the whole LHC community, working closely together to ensure optimal scientific progress,” says theorist and steering-group member Michelangelo Mangano.

Physics streams

The physics programme has been distilled into five streams: Standard Model (SM), Higgs, beyond the SM, flavour and QCD matter at high density. The LHC results so far have confirmed the validity of the SM up to unprecedented energy scales and with great precision in the strong, electroweak and flavour sectors. Thanks to a 10-fold larger data set, the HL-LHC will probe the SM with even greater precision, give access to previously unseen rare processes, and extend the experiments’ sensitivity to new physics in direct and indirect searches for processes with low production cross sections and more elusive signatures. The precision of key measurements, such as the couplings of the Higgs boson to SM particles, is expected to reach the percent level, where effects of new physics could appear. The experimental uncertainty on the top-quark mass will be reduced to a few hundred MeV, and vector-boson scattering – recently observed in LHC data – will be studied with an accuracy of a few percent using various diboson processes.

The excavation of new shafts for the HL-LHC

The 2012 discovery of the Higgs boson opened up brand-new studies of its properties, of the SM in general, and of possible physics beyond the SM. Outstanding opportunities have emerged for measurements of fundamental importance at the HL-LHC, such as the first direct constraints on the Higgs boson’s trilinear self-coupling and natural width. The experience of LHC Run 2 has led to an improved understanding of the HL-LHC’s ability to probe Higgs pair production, a key measure of the self-interaction, with a projected combined ATLAS and CMS sensitivity of four standard deviations. In addition to significant improvements in the precision of Higgs-boson measurements (figure 1), the HL-LHC will improve searches for heavier Higgs bosons motivated by theories beyond the SM, and the huge dataset expected will allow very rare exotic decay modes to be probed.

The new report considers a large variety of new-physics models that can be probed at HL-LHC. In addition to searches for new heavy resonances and supersymmetry models, it includes results on dark matter and dark sectors, long-lived particles, leptoquarks, sterile neutrinos, axion-like particles, heavy scalars, vector-like quarks, and more. “Particular attention is placed on the potential opened by the LHC detector upgrades, the assessment of future systematic uncertainties, and new experimental techniques,” says steering-group member Andreas Meyer of CMS. “In addition to extending the present LHC mass and coupling reach by 20–50% for most new-physics scenarios, the HL-LHC will be able to potentially discover, or constrain, new physics that is not in reach of the current LHC dataset.”

Pushing for precision

The flavour-physics programme at the HL-LHC comprises many different probes – the weak decays of beauty, charm, strange and top quarks, as well as of the τ lepton and the Higgs boson – in which the experiments can search for signs of new physics. ATLAS and CMS will push the measurement precision of Higgs couplings and search for rare top decays, while the proposed second phase of the LHCb upgrade will greatly enhance the sensitivity with a range of beauty-, charm-, and strange-hadron probes. “It’s really exciting to see the full potential of the HL-LHC as a facility for precision flavour physics,” says steering-group member Mika Vesterinen of LHCb. “The projected experimental advances are also expected to be accompanied by improvements in theory, enhancing the current mass-reach on new physics by a factor as large as four.”

Finally, the report identifies four major scientific goals for future high-density QCD studies at the LHC, including detailed characterisation of the quark–gluon plasma and its underlying parton dynamics, the development of a unified picture of particle production, and QCD dynamics from small to large systems. To address these goals, high-luminosity lead–lead and proton–lead collision programmes are considered as priorities, while high-luminosity runs with intermediate-mass nuclei such as argon could extend the heavy-ion programme at the LHC into the HL-LHC phase.

High-energy considerations

One of the proposed options for a future collider at CERN is the HE-LHC, which would occupy the same tunnel but be built from advanced high-field dipole magnets that could support roughly double the LHC’s energy. Such a machine would be expected to deliver an integrated proton–proton luminosity of 15,000 fb–1 at a centre-of-mass energy of 27 TeV, increasing the discovery mass-reach beyond anything possible at the HL-LHC. The HE-LHC would provide precision access to rare Higgs-boson (H) production modes, with approximately a 2% uncertainty on the ttH coupling, as well as an unambiguous observation of the HH signal and a precision of about 20% on the trilinear coupling. An HE-LHC would enable a heavy new Z′ gauge boson discovered at the HL-LHC to be studied in detail and, in general, double the discovery reach of the HL-LHC to beyond 10 TeV.
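The energy gain can be sketched with the standard magnetic-rigidity relation for a circular collider (an approximate estimate; ρ ≈ 2.8 km is the approximate LHC dipole bending radius):

```latex
p\,[\mathrm{GeV}] \approx 0.3\, B\,[\mathrm{T}]\; \rho\,[\mathrm{m}]
\quad\Rightarrow\quad
0.3 \times 16\ \mathrm{T} \times 2800\ \mathrm{m} \approx 13.4\ \mathrm{TeV\ per\ beam}
```

With the tunnel, and hence ρ, fixed, roughly doubling the dipole field from 8.33 T to about 16 T doubles the beam energy, giving the quoted 27 TeV centre-of-mass energy.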

The HL/HE-LHC reports were submitted to the European Strategy for Particle Physics Update in December 2018, and are also intended to bring perspective to the physics potential of future projects beyond the LHC. “We now have a better sense of our potential to characterise the Higgs boson, hunt for new particles and make Standard Model measurements that restrict the opportunities for new physics to hide,” says Mangano. “This report has made it clear that these planned 3000 fb–1 of data from HL-LHC, and much more in the case of a future HE-LHC, will play a central role in particle physics for decades to come.”

Mysterious burst confounds astrophysicists

An optical image of the Cow and its host galaxy

On 16 June 2018, a bright burst of light was observed by the Asteroid Terrestrial-impact Last Alert System (ATLAS) telescope in Hawaii, which automatically searches for optical transient events. The event, which received the automated catalogue name “AT2018cow”, immediately attracted a lot of attention and acquired a shorter name: “the Cow”. While transient objects are observed in the sky every day – caused, for example, by nearby asteroids or supernovae – two factors make the Cow intriguing. First, the very short time it took for the event to reach its extreme brightness and fade away again indicates that it was unlike anything observed before. Second, it took place relatively close to Earth, 200 million light-years away in a star-forming arm of a galaxy in the Hercules constellation, making it possible to study the event across a wide range of wavelengths.

Soon after the ATLAS detection, the object was observed by more than 20 different telescopes around the world, revealing it to be 10–100 times brighter than a typical supernova. In addition to optical measurements, the object was observed for several days by space-based X- and gamma-ray telescopes such as NuSTAR, XMM-Newton, INTEGRAL and Swift, which also observed it in the UV energy range, as well as by radio telescopes on Earth. The IceCube observatory in Antarctica also identified two possible neutrinos coming from the Cow, although the detection is still compatible with a background fluctuation. The combination of all the data – demonstrating the power of multi-messenger astronomy – confirmed that this was not an ordinary supernova, but potentially something completely different.

Bright spark

While standard supernovae take several days to reach maximum brightness, the Cow did so in just 1.5 days, after which its brightness also decreased much faster than that of a typical supernova. Another notable feature was the lack of heavy-element decays. Normally, elements such as 56Ni produced during the explosion are the main source of a supernova’s brightness, but the Cow revealed signs only of lighter elements such as hydrogen and helium. Furthering the event’s mystique is the variability of the X-ray emission several days after its discovery, which is a clear sign of an energy source at its centre. Half a year after the discovery, two opposing theories aim to explain these features.
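For reference, the radioactive chain that normally powers a supernova light curve is (standard half-lives, not specific to this event):

```latex
{}^{56}\mathrm{Ni}\ \xrightarrow{\;t_{1/2}\,\approx\,6\ \mathrm{d}\;}\ {}^{56}\mathrm{Co}\ \xrightarrow{\;t_{1/2}\,\approx\,77\ \mathrm{d}\;}\ {}^{56}\mathrm{Fe}
```

These half-lives set the days-to-weeks timescale of a typical light curve, which is why a 1.5-day rise with no heavy-element signatures points to a different central power source.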

The first theory states that an unlucky compact object was destroyed when it came too close to a black hole – a phenomenon called a tidal disruption event. The fast increase in brightness excludes normal stars. On the other hand, a smaller object such as a neutron star cannot explain the hydrogen and helium observed in the remnant, since it consists of neutron matter rather than ordinary elements. The remaining possibility is a white dwarf: the dense star left behind after a normal star has ceased fusion, kept from gravitational collapse into a neutron star or black hole by the electron-degeneracy pressure in its core. The observed emission from the Cow could be explained if a white dwarf was torn apart by tidal forces in the vicinity of a massive black hole. One problem with this theory, however, is the event’s location, since black holes of the size required for such an event are normally not found in the spiral arms of galaxies.

The opposing theory is that the Cow was a special type of supernova in which either a black hole or a quickly rotating, highly magnetic neutron star – a magnetar – is produced. While the bright emission in the optical and UV bands is produced by the supernova-like event, the variable X-ray emission comes from radiating gas falling into the compact object. Normally the debris of a supernova blocks most of this light from reaching us, but the progenitor of the Cow was likely a relatively low-mass star that produced little debris. A hint of its low mass was also found in the X-ray data. If so, these data would constitute the first observation of the birth of a compact object, making them very valuable for further theoretical development. Such magnetar sources could also be responsible for ultra-high-energy cosmic rays as well as high-energy neutrinos, two of which might have been observed already. The debate on the nature of the Cow continues, but the wealth of information gathered so far underlines the growing importance of multi-messenger astronomy.

Probing gauge-boson polarisation

A report from the ATLAS experiment

Precision measurements of diboson processes at the LHC are powerful probes of the gauge structure of the Standard Model at the multi-TeV energy scale. Among the most interesting directions in the diboson physics programme is the study of gauge-boson polarisation. The existence of three polarisation states is predicted by the Standard Model. The transverse polarisation is composed of right- and left-handed states, with spin either parallel or antiparallel to the momentum vector of the boson. The third state, a longitudinally-polarised component, is generated when the W and Z bosons acquire mass through electroweak symmetry breaking, and is therefore under particular scrutiny.

New phenomena can alter the polarisation predicted by the Standard Model due to interference between new-physics amplitudes and diagrams with gauge-boson self-interactions. WZ production, with its clean experimental signature, offers a sensitive way to search for such anomalies by providing a direct probe of the WWZ gauge coupling, due to the s-channel “Z-strahlung” contribution, where the W radiates a Z.

Building on precision WZ measurements previously reported by the ATLAS and CMS collaborations, a recent ATLAS result constitutes the most precise WZ measurement at a centre-of-mass energy of 13 TeV, and provides the first measurement of the polarisation of pair-produced vector bosons in hadron collisions. Based on 36.1 fb–1 of data collected in 2015 and 2016 by the ATLAS detector, and using leptonic decay modes of the gauge bosons to electrons or muons, ATLAS has achieved a precision of 4.5% for the WZ cross section measured in a fiducial phase space closely matching the detector acceptance. The kinematics of WZ events, including the underlying dynamics of accompanying hadronic jets, have been studied in detail by measuring the cross section as a function of several observables.

Two graphs of W bosons in WZ-production events

The polarisation states of the W and Z bosons can be probed through distributions of the angle of the leptons relative to the boson from which they originated (figure 1, left). A binned profile-likelihood fit of templates describing the three helicity states allowed ATLAS to extract the W and Z polarisations in the fiducial measurement region. Because of the incomplete knowledge of the momentum of the neutrino originating from the W-boson decay, it is more difficult to measure the helicity fractions of the W than of the Z. The fraction of longitudinally polarised W bosons in WZ events is found to be 0.26 ± 0.06 (figure 1, right), while the longitudinal fraction of the Z boson is found to be 0.24 ± 0.04. The analysis yields an observed significance of 4.2 standard deviations for the presence of longitudinally polarised W bosons, and 6.5 standard deviations for longitudinally polarised Z bosons.
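The fitted templates follow from the standard helicity decomposition of the lepton decay-angle distribution, written here for the W (θ is the lepton angle in the boson rest frame; sign conventions for the left- and right-handed terms vary between references):

```latex
\frac{1}{\sigma}\frac{\mathrm{d}\sigma}{\mathrm{d}\cos\theta}
= \frac{3}{8}\,f_{\mathrm{L}}\,(1 \mp \cos\theta)^{2}
+ \frac{3}{8}\,f_{\mathrm{R}}\,(1 \pm \cos\theta)^{2}
+ \frac{3}{4}\,f_{0}\,\sin^{2}\theta
```

with f0 + fL + fR = 1, where f0 is the longitudinal fraction extracted by the fit.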

Improved precision

The measurements are dominated by statistical uncertainties, but future datasets will improve precision and allow the collaboration to probe new-physics effects in events where both the Z and the W are longitudinally polarised. The ultimate target is to measure the scattering of longitudinally polarised vector bosons: this would be a direct test of electroweak symmetry breaking.

Charm mixing tests the Standard Model

Constraints on the parameters describing CP violation in charm mixing

A report from the LHCb experiment

The Standard Model (SM) allows neutral flavoured mesons such as the D0 to oscillate into their antiparticles. Having first observed this process in 2012, the LHCb collaboration has recently made some of the world’s most precise measurements of this behaviour, which is potentially sensitive to new physics. The oscillation of the D0 (cū) into its antiparticle, the D̄0 (c̄u), occurs through the exchange of massive virtual particles. These might include as-yet undiscovered particles, so the measurements are sensitive to non-Standard Model dynamics at large energy scales. By examining D0 and D̄0 mesons separately, it is also possible to search for the violation of charge–parity (CP) symmetry in the charm sector. Such effects are predicted to be very small. Therefore, given LHCb’s current level of experimental precision, any sign of CP violation would be a clear indication of physics beyond the Standard Model.


Due to quantum-mechanical mixing between the neutral charm meson’s mass and flavour eigenstates, the probabilities of observing either it or its antiparticle vary as a function of time. This mixing can be described by two parameters, x and y, which relate the properties of the mass eigenstates: x is the normalised difference in mass, and y is the normalised difference in width, or inverse lifetime. The mixing rate is very slow, making these parameters difficult to measure. Isolating the differences between the D0 and D̄0 mesons is an even greater challenge. For these two papers, LHCb was able to achieve small statistical uncertainties thanks to the large samples of charm mesons collected during Run 1, and minimised systematic uncertainties by measuring ratios of yields to cancel detector effects.
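In the usual notation, with m1,2 and Γ1,2 the masses and widths of the two mass eigenstates and Γ their average:

```latex
x = \frac{m_{1} - m_{2}}{\Gamma}, \qquad
y = \frac{\Gamma_{1} - \Gamma_{2}}{2\Gamma}, \qquad
\Gamma = \frac{\Gamma_{1} + \Gamma_{2}}{2}
```

For the D0 system both x and y are below the percent level, which is what makes charm mixing so slow and so hard to pin down.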

In the first paper, LHCb physicists studied the effective lifetime of the mesons. As a consequence of mixing, the effective decay width to CP-even final states, such as K+K– and π+π–, differs from the average width measured in decays such as D0 → K–π+. The parameter yCP, which in the limit of CP symmetry is equal to y, can be deduced from the ratio of decay rates to these two types of final state as a function of time. LHCb measured yCP with the same precision as all previous measurements combined, obtaining a value consistent with the world-average value of y.
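Concretely, yCP is extracted from the ratio of effective lifetimes in the two channels (the standard definition; in the limit of CP symmetry, yCP = y):

```latex
y_{CP} = \frac{\hat{\tau}\,(D^{0}\to K^{-}\pi^{+})}{\hat{\tau}\,(D^{0}\to K^{+}K^{-})} - 1
```

Taking the ratio of the two decay-time distributions is also what cancels most detector effects, as described above.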

In the second analysis, LHCb reconstructed D0 decays into the final state K0S π+π– to measure the parameter x, which had not previously been shown to differ from zero. In this mode, mixing manifests as small variations in the decay rate in different parts of phase space as a function of time. Measuring it requires good control over experimental effects as a function of both phase space and decay time. LHCb achieved this by measuring the ratios of the yields in complementary regions of phase space (mirrored in the Dalitz plane) as a function of time. The measured value of x is the world’s most precise, and in combination with previous measurements there is now evidence that it differs from zero.

As well as the mixing itself, both analyses are also sensitive to mixing-induced CP violation. While CP violation was not observed, the limits on its parameters were greatly improved (figure 1). This is a good example of how different decay modes give complementary information and, when taken together, can have a big impact. LHCb will continue to perform measurements with additional modes and the larger samples collected in Run 2.

CASTOR calorimetry delves into gluon saturation

A report from the CMS experiment

The fundamental structure of nucleons is described by the properties and dynamics of their constituent quarks and gluons, as encoded in QCD. The gluon’s self-interaction complicates this picture considerably. Non-linear recombination reactions, in which two gluons fuse, are predicted to lead to a saturation of the gluon density. This largely unexplored phenomenon is expected to occur when the gluons in a hadron overlap transversally, and is enhanced in heavy nuclei. Gluon saturation may be studied in proton–lead collisions at the LHC in the kinematic region where the gluon density is high and the gluons have sizable transverse dimensions.

Gluon saturation has been a focal point of the heavy-ion community for decades. Precision measurements at HERA, RHIC and, previously, at the LHC agree with the predictions made by saturation models; however, the measurements do not allow an unambiguous determination of whether gluon saturation occurs in nature. This is a strong motivation both for the LHC experiments and for the planned Electron–Ion Collider (CERN Courier October 2018 p31).

The differential jet cross section as a function of jet energy as measured in the CASTOR calorimeter

The CMS collaboration recently submitted a paper on gluon saturation in proton–lead collisions to the Journal of High Energy Physics (JHEP). The collisions used for this analysis occurred in 2013 at a centre-of-mass energy of 5 TeV and were detected using the CMS experiment’s CASTOR calorimeter. This is a very forward calorimeter of CMS, where “forward” refers to regions of the detector close to the beam pipe. Therefore, unlike any other LHC experiment, CMS can measure jets at very forward angles (–6.6 < η < –5.2) and with transverse momenta (pT) as low as 3 GeV. This is the first time that a jet-energy-spectrum measurement from the CASTOR calorimeter has been submitted to a journal.
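Pseudorapidity is the standard beam-angle variable,

```latex
\eta = -\ln\tan\left(\frac{\theta}{2}\right)
```

where θ is the polar angle with respect to the beam axis; η ≈ –6 corresponds to a polar angle of only about 0.3° from the beam direction, illustrating how close to the beam pipe CASTOR sits.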

Forward jets with small pT can target gluons in the high-density regime with large transverse extent, making CASTOR ideal for a study of gluon saturation. Colliding protons with lead ions further enhances the sensitivity of the CASTOR jet spectra to saturation effects, enabling CASTOR to overcome the ambiguity associated with the interpretation of the previous measurements.

The jet-energy spectrum obtained using CASTOR was compared to two saturation models (figure 1, left). These were the “Katie KS” model and predictions from the AAMQS collaboration; the latter are based on the colour-glass-condensate model. In the Katie KS model, the strength of the non-linear gluon recombination reactions can be varied. Upon comparison with the model, it was seen that the linear and non-linear predictions differed by an order of magnitude for the lowest energy bins of the spectrum, which correspond to low-pT jets. Meanwhile, they converged at the highest energies, confirming the high sensitivity of the measurement to gluon saturation. The AAMQS predictions underestimated the data progressively, up to an order of magnitude, in the region most strongly affected by saturation. Overall, neither model described the spectrum correctly.

The spectrum was also compared to two cosmic-ray models (EPOS-LHC and QGSJetII-04) and to the HIJING event generator (figure 1, right). The former models underestimated the data by over two orders of magnitude, while HIJING, which incorporates an implementation of nuclear shadowing, agreed well with the data. Nuclear shadowing is an interference effect between the nucleons of a heavy ion. Like gluon saturation, it is expected to lead to a decrease in the probability for a proton–lead collision to occur; however, further data analysis is required for more definite conclusions on nuclear shadowing.

These results establish CASTOR jets as an experimental reality, and their sensitivity to saturation effects is encouraging for further, more refined CASTOR jet studies.

Colliders join the hunt for dark energy

Dark analysis

It is 20 years since the discovery that the expansion of the universe is accelerating, yet physicists still know precious little about the underlying cause. In a classical universe with no quantum effects, the cosmic acceleration can be explained by a constant that appears in Einstein’s equations of general relativity, albeit one with a vanishingly small value. But clearly our universe obeys quantum mechanics, and the ability of particles to fluctuate in and out of existence at all points in space leads to a prediction for Einstein’s cosmological constant that is 120 orders of magnitude larger than observed. “It implies that at least one, and likely both, of general relativity and quantum mechanics must be fundamentally modified,” says Clare Burrage, a theorist at the University of Nottingham in the UK.
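The size of the mismatch follows from a one-line order-of-magnitude estimate (illustrative only: the observed dark-energy density corresponds to roughly (10⁻³ eV)⁴, while a naive quantum estimate cuts the vacuum energy off at the Planck scale, MPl ≈ 10²⁸ eV):

```latex
\frac{\rho_{\mathrm{vac}}^{\mathrm{theory}}}{\rho_{\Lambda}^{\mathrm{obs}}}
\sim \frac{M_{\mathrm{Pl}}^{4}}{\left(10^{-3}\ \mathrm{eV}\right)^{4}}
\sim \left(10^{31}\right)^{4}
= 10^{124}
```

i.e. roughly the 120 orders of magnitude quoted.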

With no clear alternative theory available, all attempts to explain the cosmic acceleration introduce a new entity called dark energy (DE) that makes up 70% of the total mass-energy content of the universe. It is not clear whether DE is due to a new scalar particle or a modification of gravity, or whether it is constant or dynamic. It’s not even clear whether it interacts with other fundamental particles or not, says Burrage. Since DE affects the expansion of space–time, however, its effects are imprinted on astronomical observables such as the cosmic microwave background and the growth rate of galaxies, and the main approach to detecting DE involves looking for possible deviations from general relativity on cosmological scales.

Unique environment

Collider experiments offer a unique environment in which to search for the direct production of DE particles, since they are sensitive to a multitude of signatures and therefore to a wider array of possible DE interactions with matter. Like other signals of new physics, DE (if accessible at small scales) could manifest itself in high-energy particle collisions either through direct production or via modifications of electroweak observables induced by virtual DE particles.

Last year, the ATLAS collaboration at the LHC carried out a first collider search for light scalar particles that could contribute to the accelerating expansion of the universe. The results demonstrate the ability of collider experiments to access new regions of parameter space and provide complementary information to cosmological probes.

Unlike dark matter, for which many new-physics models exist to guide searches at collider experiments, few frameworks describe the interaction between DE and Standard Model (SM) particles. However, theorists have made progress by allowing the properties of the prospective DE particle, and the strength of the force that it transmits, to vary with the environment. This effective-field-theory approach integrates out the unknown microscopic dynamics of the DE interactions.

The new ATLAS search was motivated by a 2016 model by Philippe Brax of the Université Paris-Saclay, Burrage, Christoph Englert of the University of Glasgow, and Michael Spannowsky of Durham University. The model provides the most general framework for describing DE theories with a scalar field and contains as subsets many well-known specific DE models – such as quintessence, galileon, chameleon and symmetron. It extends the SM lagrangian with a set of higher dimensional operators encoding the different couplings between DE and SM particles. These operators are suppressed by a characteristic energy scale, and the goal of experiments is to pinpoint this energy for the different DE–SM couplings. Two representative operators predict that DE couples preferentially to either very massive particles like the top quark (“conformal” coupling) or to final states with high-momentum transfers, such as those involving high-energy jets (“disformal” coupling).
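Schematically, the two representative couplings take the form below (a sketch rather than the exact lagrangian of Brax et al.; normalisations and field powers differ between operators in the full model, and the scales Mc, Md stand for the characteristic suppression scales the experiments constrain):

```latex
\mathcal{L}_{\mathrm{conformal}} \sim \frac{\partial_{\mu}\phi\,\partial^{\mu}\phi}{M_{c}^{4}}\; T^{\nu}{}_{\nu},
\qquad
\mathcal{L}_{\mathrm{disformal}} \sim \frac{\partial_{\mu}\phi\,\partial_{\nu}\phi}{M_{d}^{4}}\; T^{\mu\nu}
```

Here φ is the DE scalar and Tμν the SM energy–momentum tensor: the trace term weights heavy states such as the top quark, while the disformal contraction favours final states with large momentum transfer.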

Signatures

“In a big class of these operators the DE particle cannot decay inside the detector, therefore leaving a missing-energy signature,” explains Spyridon Argyropoulos of the University of Iowa, who is a member of the ATLAS team that carried out the analysis. “Two possible signatures for the detection of DE are therefore the production of a pair of top–antitop quarks or the production of high-energy jets, associated with large missing energy. Such signatures are similar to the ones expected from the production of supersymmetric top quarks (“stops”), where the missing energy would be due to the neutralinos from the stop decays, or from the production of SM particles in association with dark-matter particles, which also leave a missing-energy signature in the detector.”

The ATLAS analysis, which was based on 13 TeV LHC data corresponding to an integrated luminosity of 36.1 fb–1, re-interprets the result of recent ATLAS searches for stop quarks and dark matter produced in association with jets. No significant excess over the predicted background was observed, setting the most stringent constraints on the suppression scale of conformal and disformal couplings of DE to normal matter in the context of an effective field theory of DE. The results show that the characteristic energy scale must be higher than approximately 300 GeV for the conformal coupling and above 1.2 TeV for the disformal coupling.

The search for DE at colliders is only at the beginning, says Argyropoulos. “The limits on the disformal coupling are several orders of magnitude higher than the limits obtained from other laboratory experiments and cosmological probes, proving that colliders can provide crucial information for understanding the nature of DE. More experimental signatures and more types of coupling between DE and normal matter have to be explored, and more optimal search strategies could be developed.”

With this pioneering interpretation of a collider search in terms of dark-energy models, ATLAS has become the first experiment to probe all forms of matter in the observable universe, opening a new avenue of research at the interface of particle physics and cosmology. A complementary laboratory measurement is also being pursued by CERN’s CAST experiment, which studies a particular incarnation of DE (chameleon) produced via interactions of DE with photons.

But DE is not going to give up its secrets easily, cautions theoretical cosmologist Dragan Huterer at the University of Michigan in the US. “Dark energy is normally considered a very large-scale phenomenon, but you may justifiably ask how the study of small systems in a collider can say anything about DE. Perhaps it can, but in a fairly model-dependent way. If ATLAS finds a signal that departs from the SM prediction it would be very exciting. But linking it firmly to DE would require follow-up work and measurements – all of which would be very exciting to see happen.”

German–Japanese centre to focus on precision physics

On time

On 1 January a new virtual centre devoted to some of the most precise measurements in science was established by researchers in Germany and Japan. The Centre for Time, Constants and Fundamental Symmetries will offer access to ultra-sensitive equipment to allow experimental groups in atomic and nuclear physics, antimatter research, quantum optics and metrology to collaborate closely on fundamental measurements. Three partners – the Max Planck Institutes for nuclear physics (MPI-K) and for quantum optics (MPQ), the National Metrology Institute of Germany (PTB) and RIKEN in Japan – agreed to fund the centre in equal amounts with a total of around €7.5 million for five years, and scientific activities will be coordinated at MPI-K.

A major physics target of the German–Japanese centre is to investigate whether the fundamental constants really are constant, or whether they change in time by tiny amounts. Another goal concerns subtle differences between the properties of matter and antimatter – tests of C, P and T invariance – which have not yet shown up, even though such differences must exist at some level, otherwise the universe would consist of almost pure radiation. Closely related to these tests of fundamental symmetries is the search for physics beyond the Standard Model. The broad research portfolio also includes the development of novel optical clocks based on atoms, nuclei and highly charged ions.

“It is fascinating that nowadays manageable laboratory experiments make it possible to investigate such fundamental questions in physics and cosmology by means of their high precision”, says Klaus Blaum of MPI-K.

Stringent tests of fundamental interactions and symmetries using the protons and antiprotons available at the BASE experiment at CERN are another key aspect of the German–Japanese initiative, explains Stefan Ulmer, co-director of the centre, chief scientist at RIKEN, and spokesperson of the BASE experiment: “This centre will strongly promote fundamental physics in general, in addition to the research goals of BASE. Given this support we are developing new equipment to improve both the precision of the proton-to-antiproton charge-to-mass ratio as well as the proton/antiproton magnetic moment comparison by factors of 10 to 100.”

To reach these goals, the researchers intend to develop novel experimental techniques – such as transportable antiproton traps, sympathetic cooling of antiprotons by laser-cooled beryllium ions, and optical clocks based on highly charged ions and thorium nuclei – which will outperform contemporary methods and enable measurements at even shorter time scales and with improved sensitivity. “The combined precision-physics expertise of the individual groups with their complementary approaches and different methods using traps and lasers has the potential for substantial progress,” says Ulmer. “The low-energy, ultra-high-precision investigations for physics beyond the Standard Model will complement studies in particle physics.”

New measurements shine a light on the proton

A report from the ALICE experiment

The electromagnetic field of the highly charged lead ions in the LHC beams provides a very intense flux of high-energy quasi-real photons that can be used to probe the structure of the proton in lead–proton collisions. The exclusive photoproduction of a J/ψ vector meson is of special interest because it samples the gluon density in the proton. Previous measurements by ALICE have shown that this process can be measured over a wide range of centre-of-mass energies of the photon–proton system (Wγp), enlarging the kinematic reach by more than a factor of two with respect to that of measurements performed at the former HERA collider.

Fig. 1. The measured cross section for exclusive J/ψ photoproduction as a function of Wγp, compared with previous measurements and model calculations.

Recently, the ALICE collaboration has performed a measurement of exclusive photoproduction of J/ψ mesons off protons in proton–lead collisions at a centre-of-mass energy of 5.02 TeV at the LHC using two new configurations. In both cases, the J/ψ meson is reconstructed from its decay into a lepton pair. In the first case, the leptons are measured at mid-rapidity using ALICE’s central-barrel detectors. The excellent particle-identification capabilities of these detectors allow the measurement of both the e⁺e⁻ and μ⁺μ⁻ channels. The second configuration combines a muon measured with the central-barrel detectors with a second muon measured by the muon spectrometer located at forward rapidity. By this clever use of the detector configuration, we were able to significantly extend the coverage of the J/ψ measurement.

The energy of the photon–proton collision, Wγp, is determined by the rapidity of the produced J/ψ with respect to the beam axis – a quantity closely related to the polar angle of emission. Since the directions of the proton and lead beams were inverted halfway through the data-taking period, ALICE covers both backward and forward rapidities using a single-arm spectrometer.

These two configurations, plus the one used previously where both muons were measured in the muon spectrometer, allow ALICE to cover – in a continuous way – the range in Wγp from 20 to 700 GeV. The typical momentum at which the structure of the proton is probed is conventionally given as a fraction of the beam momentum, x, and the new measurements extend over three orders of magnitude in x, from 2 × 10⁻² down to 2 × 10⁻⁵. The measured cross section for this process as a function of Wγp is shown in figure 1 and compared with previous measurements and with models based on different assumptions, such as the validity of DGLAP evolution (JMRT), the vector-dominance model (STARlight), next-to-leading-order BFKL, the colour-glass condensate (CGC) and the inclusion of fluctuating sub-nucleonic degrees of freedom (CCT). The last two models include the phenomenon of saturation, whereby nonlinear effects tame the growth of the gluon density in the proton at small x.
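The correspondence between the quoted Wγp and x ranges can be sketched numerically. The short Python snippet below uses the standard leading-order relations for exclusive vector-meson photoproduction in ultraperipheral collisions, W²γp ≈ √s_NN · M(J/ψ) · e⁻ʸ and x ≈ (M(J/ψ)/Wγp)²; these textbook approximations are an assumption here, not taken from the ALICE report itself:

```python
import math

M_JPSI = 3.097   # J/psi mass in GeV (PDG value)
SQRT_S = 5020.0  # nucleon-nucleon centre-of-mass energy in GeV

def w_gamma_p(y: float) -> float:
    """Photon-proton centre-of-mass energy (GeV) for a J/psi produced
    at rapidity y, using W^2 = sqrt(s_NN) * M_Jpsi * exp(-y)."""
    return math.sqrt(SQRT_S * M_JPSI * math.exp(-y))

def bjorken_x(w: float) -> float:
    """Gluon momentum fraction probed, x = (M_Jpsi / W)^2."""
    return (M_JPSI / w) ** 2
```

With these relations, the quoted endpoints are mutually consistent: Wγp ≈ 20 GeV corresponds to x ≈ 2 × 10⁻² and Wγp ≈ 700 GeV to x ≈ 2 × 10⁻⁵, while a J/ψ at mid-rapidity probes Wγp ≈ 125 GeV.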

The new measurements are compatible with previous HERA data where available, and all models agree reasonably well with the data. Nonetheless, it is seen that at the largest energies, or equivalently the smallest x, some of the models predict a slower growth of the cross section with energy. This is being studied by ALICE with data taken in 2016 in p–Pb collisions at a centre-of-mass energy of 8.16 TeV, allowing exploration of the Wγp energy range up to 1.5 TeV, potentially shedding new light on the question of gluon saturation.

Good strategy demands the right balance

Community call

Strategy provides a basis for prioritising resources in the pursuit of important goals. No strategy would be needed if enough resources were available – we would simply do whatever appears to be necessary.

Elementary particle physics generally requires large and expensive facilities, often on an international scale, which take a long time to develop and are heavy consumers of resources during operations. For this reason, in 2005 the CERN Council initiated a European Strategy for Particle Physics (ESPP), resulting in a document being adopted the following year. The strategy was updated in 2013 and the community is now working towards a second ESPP update (CERN Courier April 2018 p7).

The making of the ESPP has three elements: bottom-up activities driven by the scientific community through document submission and an open symposium (the latter to be held in Spain in May 2019); strategy drafting (to take place in Germany in January 2020) by scientists, who are mostly appointed by CERN member states; and the final discussion and approval by the CERN Council. Therefore, the final product should be an amalgamation of the wishes of the community and the political and financial constraints defined by state authorities. Experience of the previous ESPP update suggests that this is entirely achievable, but not without effort and compromise.

Out of four high-priority items in the current ESPP, which concluded in 2013, three of them are well under way: the full exploitation of the LHC via a luminosity upgrade; R&D and design studies for a future energy-frontier machine at CERN; and establishing a platform at CERN for physicists to develop neutrino detectors for experiments around the world. The remaining item, relating to an initiative of the Japanese particle-physics community to host an international linear collider in Japan, has not made much progress.

In physics, discussions about strategy usually start with a principled statement: “science should drive the strategy”. This is of course correct, but unfortunately not always sufficient in real life, since physics considerations alone do not provide a practical solution most of the time. In this context, it is worth recalling the discussion about long-baseline neutrino experiments that took place during the previous strategy exercises.

Optimal outcome

At the time of the first ESPP almost 15 years ago, so little was known about the neutrino mass-mixing parameters that several ambitious facilities were discussed so as to cover the necessary parameter spaces. Some resources were directed into R&D, but they were most probably too small and not well prioritised. In the meantime, it became clear that a state-of-the-art neutrino beam based on conventional technology would be sufficient for the next necessary step: measuring the neutrino CP-violation parameter and the mass hierarchy. What should be done was therefore clear from a scientific point of view, but there were simply not enough resources in Europe to construct a long-baseline neutrino experiment with a high-performance beamline while fully exploiting the LHC at the same time. The optimal outcome was found by considering global opportunities, and this was one of the key ingredients that drove the strategy.

Tatsuya Nakada

The challenge facing the community now in updating the current ESPP is to steer the field into the mid-2020s and beyond. As such, discussions about the various ideas for the next big machine at CERN will be an important focus, but numerous other projects, including proposals for non-collider experiments, will be jostling for attention. Many brilliant people with many excellent ideas – each with different strengths and weaknesses – are working in our field. The real issue of the strategy update is how we can optimise resources across time and location, possibly exploiting synergies with other scientific fields.

The intention of the strategy is to achieve a scientific goal. We may already disagree about what this goal is, since research is conducted by people with different visions, tastes and habits. But let us at least agree, for now, that it is “to understand the most fundamental laws of nature”. Also, depending on the time scales, the relative importance of elements in the decision-making might change, and factors beyond Europe cannot be neglected. A strategy that cannot be implemented is not useful for anyone, and the key is to make a judgement on the balance among many elements. Lastly, we should not forget that the most exciting scenario for the ESPP update would be the appearance of an unexpected result – then there would be a real paradigm shift in particle physics.

Large Hadron Collider: the experiments strike back

Forging ahead

The features in this first issue of 2019 bring you all the shutdown news from the seven LHC experiments, and what to expect when the souped-up detectors come back online in 2021.

During the next two years of long-shutdown two (LS2), the LHC and its injectors will be tuned up for high-luminosity operations: Linac2 will leave the floor to Linac4 to enable more intense beams; the Proton Synchrotron Booster will be equipped with completely new injection and acceleration systems; and the Super Proton Synchrotron will have new radio-frequency power. The LHC is also being tested for operation at its design energy of 14 TeV, while, in the background, civil-engineering works for the high-luminosity upgrade (HL-LHC), due to enter service in 2026, are proceeding apace.

The past three years of Run 2 at a proton–proton collision energy of 13 TeV have seen the LHC achieve record peak and integrated luminosities, forcing the detectors to operate at their limits. Now, the four main experiments ALICE, ATLAS, CMS and LHCb, and the three smaller experiments LHCf, MoEDAL and TOTEM, are gearing up for the extreme conditions of Run 3 and beyond.

At the limits

Since the beginning of the LHC programme, it has been clear that the original detectors would last for approximately a decade owing to radiation damage. That time has now come. Improvements, repairs and upgrades have taken place in the LHC detectors throughout the past decade, but the most significant activities will take place during LS2 (and LS3, beginning in 2024), capitalising on technology advances and on the ingenuity of thousands of people over several years. Combined, the technical design reports for the LHC experiment upgrades run to some 20 volumes, each containing hundreds of pages.

Wired

For LHCb, the term “upgrade” hardly does it justice, since large sections of the detector are to be completely replaced and a new trigger system is to be installed (LHCb's momentous metamorphosis). ALICE too is undergoing major interventions to its inner detectors during LS2 (ALICE revitalised), and both collaborations are installing new data centres to deal with the higher data rate from future LHC runs. ATLAS and CMS are upgrading numerous aspects of their detectors while at the same time preparing for major installations during LS3 for HL-LHC operations (CMS has high luminosity in sight and ATLAS upgrades in LS2). At the HL-LHC, one year of collisions is equivalent to 10 years of LHC operations in terms of radiation damage. Even more challenging, the HL-LHC will deliver a mean event pileup of up to 200 interactions per beam crossing – 10 times greater than today – requiring completely new trigger systems and other capabilities.

Three smaller experiments at the LHC are also taking advantage of LS2. TOTEM, which comprises two detectors located 220 m either side of CMS to measure elastic proton–proton collisions (see “Forging ahead” image), aims to perform total-cross-section measurements at maximal LHC energies. For this, the collaboration is building a new scintillator detector to be integrated in CMS, in addition to service work on its silicon-strip and spectrometer detectors.

Forward physics

Another “forward” experiment called LHCf, made up of two detectors 140 m either side of ATLAS, uses forward particles produced by the LHC collisions to improve our knowledge of how cosmic-ray showers develop in Earth’s atmosphere. Currently, the LHCf detectors are being prepared for 14 TeV proton–proton operations, higher luminosities and also for the possibility of colliding protons with light nuclei such as oxygen, requiring a completely renewed data-acquisition system. Finally, physicists at MoEDAL, a detector deployed around the same interaction region as LHCb to look for magnetic monopoles and other signs of new physics, are preparing a request to take data during Run 3. For this, among other improvements, a new sub-detector called MAPP will be installed to extend MoEDAL’s physics reach to long-lived and fractionally charged particles.

The seven LHC experiments are also using LS2 to extend and deepen their analyses of the Run-2 data. Depending on what lies there, the collaborations could have more than just shiny new detectors on their hands by the time they come back online in the spring of 2021.
