The large amount of Run-2 data (collected in 2015–2018) allows the LHC experiments to probe previously unexplored rare processes, search for new physics and improve Standard Model measurements. The amount of data collected in Run 2 can be quantified by the integrated luminosity – a number which, when multiplied by the cross section for a process, yields the expected number of interactions of that type – making it a crucial figure. The uncertainty of several ATLAS Run-1 cross-section measurements, particularly of W and Z production, was dominated by the systematic uncertainty on the integrated luminosity. To minimise this uncertainty, ATLAS performs precise absolute and relative calibrations of several luminosity-sensitive detector systems in a three-step procedure.
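In symbols, the expected number of events for a process with cross section σ is

$$N_{\mathrm{exp}} = \sigma \int \mathcal{L}\,\mathrm{d}t,$$

where the time integral of the instantaneous luminosity L is the integrated luminosity.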
The first step is an absolute calibration of the luminosity using a van der Meer beam-separation scan under specialised beam conditions. By displacing the beams horizontally and vertically and scanning them through each other, it is possible to measure the combined size of the colliding proton bunches. Combined with the total number of protons in each colliding bunch, determined from measurements of the beam currents, this yields the absolute luminosity of each colliding bunch pair. Relating this to the mean number of interactions observed in the LUCID-2 detector – a set of photomultiplier tubes located 17 m along the beam pipe on either side of the interaction point, which detect the Cherenkov light of particles produced in the collisions – sets the scale for the absolute luminosity measurement of LUCID-2.
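Schematically, for a single pair of colliding bunches with populations N₁ and N₂ circulating at revolution frequency f_r, the scan determines

$$\mathcal{L} = \frac{f_r\,N_1 N_2}{2\pi\,\Sigma_x \Sigma_y},$$

where Σ_x and Σ_y are the convolved horizontal and vertical beam sizes extracted from the scan curves.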
The second step is to extrapolate this calibration to LHC physics conditions, where the number of interactions increases from fewer than one to around 20–50 interactions per crossing, and the pattern of proton bunches changes from isolated bunches to trains of consecutive bunches with 25 ns spacing. The LUCID-2 response is sensitive to these differences. It is corrected with the help of a track counting algorithm, which relates the number of interactions to the number of tracks reconstructed in ATLAS’s inner detector.
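As a minimal sketch of the idea behind track counting (not the ATLAS algorithm itself): once a calibration fixes the average number of reconstructed tracks produced per interaction, the mean number of interactions per crossing follows by division. All numbers below are hypothetical.

```python
# Hypothetical sketch of luminosity-oriented track counting: given a
# calibrated average track yield per interaction, the mean number of
# interactions per bunch crossing (mu) follows from the observed tracks.
mean_tracks_per_crossing = 520.0   # hypothetical measured average
tracks_per_interaction = 12.8      # hypothetical calibration constant

mu = mean_tracks_per_crossing / tracks_per_interaction
print(f"estimated interactions per crossing: {mu:.1f}")  # ~40.6
```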
The final step is to monitor the stability of the LUCID-2 calibration over time. This is evaluated by comparing the luminosity estimate of LUCID-2 to those from track counting in the inner detector and various ATLAS calorimeters over the course of the data-taking year (figure 1). The agreement between detectors quantifies the stability of the LUCID-2 response.
Using this three-step method and taking into account correlations between years, ATLAS has obtained a preliminary uncertainty on the luminosity estimate for the combined Run-2 data of 1.7%, improving slightly on the Run-1 precisions of 1.8% at 7 TeV and 1.9% at 8 TeV. The full 13 TeV Run-2 data sample corresponds to an integrated luminosity of 139 fb⁻¹ – about 1.1 × 10¹⁶ proton collisions.
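A back-of-envelope check of the quoted collision count, assuming an inelastic pp cross section of roughly 80 mb at 13 TeV (an assumption here, not a number from the text):

```python
# 139 fb^-1 expressed in inverse barns: 1 fb^-1 = 1e15 b^-1
integrated_luminosity = 139e15   # b^-1
sigma_inelastic = 80e-3          # assumed ~80 mb inelastic pp cross section, in barns

n_collisions = integrated_luminosity * sigma_inelastic
print(f"{n_collisions:.2e} collisions")  # ~1.11e+16, matching the figure in the text
```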
One of the most useful ways to understand the properties of the quark–gluon plasma (QGP) formed in relativistic heavy-ion collisions is to study how various probes interact when propagating through it. Heavier quarks, such as charm, can provide unique insights as they are produced early in the collisions, and their interactions with the QGP differ from those of their lighter cousins. One important input to these studies is a detailed understanding of hadronisation, the process by which quarks form experimentally detectable mesons and baryons.
The lightest charm baryon and meson are the Λc⁺ (udc) and the D⁰ (cū). In proton–proton (pp) collisions, charm hadrons are formed by fragmentation, in which charm quarks and antiquarks move away from each other and combine with newly generated quarks. In heavy-ion collisions, hadron production can also occur via “coalescence”, whereby charm quarks combine with other quarks while traversing the QGP. The contribution of coalescence depends strongly on the transverse momentum (pT) of the hadrons, and is expected to be much more significant for charm baryons than for charm mesons, as baryons contain more quarks.
The CMS experiment has recently determined the Λc⁺/D⁰ yield ratio over a broad range of pT using the Λc⁺ → pK⁻π⁺ and D⁰ → K⁻π⁺ decay channels in both pp and lead–lead (PbPb) collisions at a nucleon–nucleon centre-of-mass energy of 5.02 TeV. Comparing the behaviour of the Λc⁺/D⁰ ratio in different collision systems allows physicists to study the relative contributions of fragmentation and coalescence.
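Schematically, such a ratio is built from yields corrected for selection efficiency and decay branching fraction. The sketch below uses hypothetical yields and efficiencies together with approximate PDG branching fractions; it illustrates the bookkeeping only, not the CMS analysis.

```python
# Corrected-yield bookkeeping for a Lambda_c+/D0 ratio (illustrative only).
def corrected_yield(n_raw, efficiency, branching_fraction):
    """Raw signal yield corrected for efficiency and decay branching fraction."""
    return n_raw / (efficiency * branching_fraction)

# Hypothetical raw yields and efficiencies; branching fractions ~ PDG values.
n_lc = corrected_yield(n_raw=2.0e3, efficiency=0.01, branching_fraction=0.0628)  # Lc+ -> pK-pi+
n_d0 = corrected_yield(n_raw=5.0e4, efficiency=0.20, branching_fraction=0.0395)  # D0 -> K-pi+

print(f"Lambda_c+/D0 ratio: {n_lc / n_d0:.2f}")
```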
The measured Λc⁺/D⁰ production cross-section ratio in pp collisions (figure 1) is found to be significantly larger than that calculated in the standard version of the popular Monte Carlo event generator PYTHIA, while a version that includes an improved description of fragmentation (“PYTHIA8+CR”) describes the CMS data better. The data can also be reasonably described by a different model that includes Λc⁺ baryons produced in the decays of excited charm baryons (dashed line). However, an attempt to incorporate the coalescence process characteristic of hadron production in heavy-ion collisions (solid line) fails to reproduce the pp-collision measurements.
The CMS collaboration also measured Λc⁺ production in PbPb collisions. The Λc⁺/D⁰ production ratio for pT > 10 GeV/c is found to be consistent with that from pp collisions. This similarity suggests that the coalescence process does not contribute significantly to charm-hadron production in this pT range in PbPb collisions. These are the first measurements of the ratios at high pT for both the pp and PbPb systems at a nucleon–nucleon centre-of-mass energy of 5.02 TeV.
In late 2018, CMS collected data corresponding to about 10 times more PbPb collisions than were used in the current measurement. These data will shed new light on the interplay between the different processes in charm-quark hadronisation in heavy-ion collisions. In the meantime, the current results highlight the lack of understanding of charm-quark hadronisation in pp collisions, a subject that requires further experimental measurements and theoretical studies.
Neutron stars consist of extremely dense nuclear matter. Their maximum size and mass are determined by their equation of state, which in turn depends on the interaction potentials between nucleons. Due to the high density, not only neutrons but also heavier strange baryons may play a role.
The main experimental information on the interaction potentials between nucleons and strange baryons comes from bubble-chamber scattering experiments with strange-hadron beams undertaken at CERN in the 1960s, and is limited in precision due to the short lifetimes (< 200 ps) of the hadrons. The ALICE collaboration is now using the scattering between particles produced in collisions at the LHC to constrain interaction potentials in a new way. So far, pK⁻, pΛ, pΣ⁰, pΞ⁻ and pΩ⁻ interactions have been investigated. Recent data have already yielded the first evidence for a strong attractive interaction between the proton and the Ξ⁻ baryon.
Strong final-state interactions between pairs of particles make their momenta more parallel to each other in the case of an attractive interaction, and increase the opening angle between them in the case of a repulsive interaction. The attractive potential of the p-Ξ⁻ interaction was observed by measuring the correlation of pairs of protons and Ξ⁻ particles as a function of their relative momentum (the correlation function) and comparing it with theoretical calculations based on different interaction potentials. This technique is referred to as “femtoscopy” since it simultaneously measures the size of the region in which particles are produced and the interaction potential between them.
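A toy illustration of how such a correlation function is constructed: the relative-momentum distribution of pairs from the same event is divided by a reference distribution built by mixing particles across events. Everything below is a simplified one-dimensional stand-in with invented numbers, not the ALICE analysis.

```python
import numpy as np

# Toy femtoscopic correlation function: same-event pair spectrum divided
# by an event-mixed reference. 1D stand-in for the pair relative momentum.
rng = np.random.default_rng(seed=1)
n_events = 20000
bins = np.linspace(0.0, 0.4, 41)                  # "k*" bins, GeV/c

p_proton = rng.normal(0.5, 0.1, n_events)         # hypothetical momenta
p_xi = rng.normal(0.5, 0.1, n_events)

same = np.abs(p_proton - p_xi)                    # same-event pairs
mixed = np.abs(p_proton - rng.permutation(p_xi))  # mixed-event reference

num, _ = np.histogram(same, bins)
den, _ = np.histogram(mixed, bins)
c_kstar = num / np.maximum(den, 1)                # C(k*) ~ 1 means no correlation
print(c_kstar[:5])
```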
Data from proton–lead collisions at a centre-of-mass energy per nucleon pair of 5.02 TeV show that p-Ξ⁻ pairs are produced at very small distances (~1.4 fm); the measured correlation is therefore sensitive to the short-range strong interaction. The measured p-Ξ⁻ correlations were found to be stronger than theoretical correlation functions with only a Coulomb interaction, whereas the prediction obtained by including both the Coulomb and strong interactions (as calculated by the HAL-QCD collaboration) agrees with the data (figure 1).
As a first step towards evaluating the impact of these results on models of neutron-star matter, the HAL-QCD interaction potential was used to compute the single-particle potential of the Ξ⁻ within neutron-rich matter. A slightly repulsive interaction was inferred (of the order of 6 MeV, compared to the 1322 MeV mass of the Ξ⁻), leading to better constraints on the equation of state for dense hadronic systems that contain Ξ⁻ particles. This is an important step towards determining the equation of state for dense and cold nuclear matter with strange hadrons.
New sources of CP violation (CPV) are needed to explain the absence of antimatter in our matter-dominated universe. The LHCb collaboration has reported new results describing CPV in B⁺ → π⁺K⁺K⁻ and B⁺ → π⁺π⁺π⁻ decays. Until very recently, all observations of CPV in B mesons were made in two-body and quasi-two-body decays; however, it has long been conjectured that the complex dynamics of multi-body decays could give rise to other manifestations. For CPV to occur in B decays, competing decay amplitudes with different weak phases (which change sign under CP) and strong phases (which do not) are required. The weak phase differences are tied to fundamental parameters of the Standard Model (SM), but the strong phase differences can arise from loop-diagram contributions, final-state re-scattering effects, and phases associated with intermediate resonant structure.
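Schematically, for two interfering amplitudes with strong phases δ₁, δ₂ and weak phases φ₁, φ₂, the decay-rate asymmetry between B⁺ and B⁻ satisfies

$$A_{CP} \propto \sin(\delta_1-\delta_2)\,\sin(\phi_1-\phi_2),$$

which vanishes unless both a strong-phase and a weak-phase difference are present.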
The three-body B decays under study proceed mainly via various intermediate resonances – effectively, a cascade of two-body decays – but also include contributions from non-resonant three-body interactions. The phase space is two-dimensional (it can be fully described by two kinematic variables) and its size allows a rich tapestry of resonant structures to emerge, bringing quantum-mechanical interference into play. Much as in Young’s double-slit experiment, the total amplitude comprises the sum of all possible decay paths. The interference pattern and its phase variation could contribute to CPV in regions where resonances overlap.
One of the most intriguing LHCb results was the 2014 observation of large CPV effects in certain phase-space regions of B⁺ → π⁺K⁺K⁻ and B⁺ → π⁺π⁺π⁻ decays. In the new analysis, these effects are described with explicit amplitude models for the first time (figure 1). A crucial step in the phenomenological description of these amplitudes is to include unitarity-conserving couplings between final states, most notably ππ and KK. Accounting for these is essential to accurately model the complex S-wave component of the decays, which is the configuration where there is no relative angular momentum between a pair of oppositely-charged final-state particles, and which contains broad resonances that are difficult to model. Three complementary approaches were deployed to describe this component of the B⁺ → π⁺π⁺π⁻ decay: the classical isobar model, which explicitly associates a line shape with a clear physical interpretation to each contribution in the phase space; the K-matrix method, which takes data from scattering experiments as an input; and finally a quasi-model-independent approach, in which the S-wave magnitude and phase are extracted directly from the data.
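A minimal sketch of the isobar approach, with placeholder masses, widths and coefficients (not LHCb's fitted values): the total amplitude is a coherent sum of resonant terms, each carrying a magnitude, a CP-even strong phase and a CP-odd weak phase, and the local CP asymmetry follows by comparing the rate with its CP conjugate.

```python
import numpy as np

# Minimal isobar-model sketch (spin structure ignored; all values placeholders).
def breit_wigner(s, m0, gamma0):
    """Relativistic Breit-Wigner with fixed width (simplified)."""
    return 1.0 / (m0**2 - s - 1j * m0 * gamma0)

def amplitude(s, components, conjugate=False):
    """A = sum_j |c_j| e^{i(delta_j +/- phi_j)} BW_j(s); '-' for the CP conjugate."""
    sign = -1.0 if conjugate else 1.0
    return sum(mag * np.exp(1j * (delta + sign * phi)) * breit_wigner(s, m0, g0)
               for mag, delta, phi, m0, g0 in components)

# (magnitude, strong phase, weak phase, mass, width) per component
comps = [(1.0, 0.0, 0.0, 0.775, 0.149),    # rho(770)-like term
         (0.4, 1.2, 0.35, 1.275, 0.187)]   # f2(1270)-like term

s = 0.77**2                                 # example pi+pi- mass squared, GeV^2
a = abs(amplitude(s, comps))**2
abar = abs(amplitude(s, comps, conjugate=True))**2
acp = (a - abar) / (a + abar)               # local CP asymmetry at this point
print(f"A_CP at s={s:.3f} GeV^2: {acp:+.3f}")
```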
LHCb’s amplitude analyses of these decays are based on data from Run 1 of the LHC and contain several groundbreaking results, including the largest CP asymmetry in a single component of an amplitude analysis, found in the ππ ↔ KK re-scattering amplitude; the first observation of CPV in the interference between intermediate states, seen in the overlap between the dominant spin-1 ρ(770)⁰ resonance and the π⁺π⁻ S-wave; and the first observation of CPV involving a spin-2 resonance of any kind, found in the decay B⁺ → f₂(1270)π⁺. These results provide significant new insights into how CPV in the SM manifests in practice, and motivate further study, particularly of the strong-phase-generating QCD processes that govern CP violation.
On 20 May, 144 years after the signing of the Metre Convention in 1875, the kilogram was given a new definition based on Planck’s constant, h. Long tied to the International Prototype of the Kilogram (IPK) – a platinum–iridium cylinder kept in Paris – the kilogram was the last SI base unit still defined by a human-made artefact rather than by fundamental constants or atomic properties.
The dimensions of h are m² kg s⁻¹. Since the second and the metre are defined in terms of a hyperfine transition in caesium-133 and the speed of light, knowledge of h allows the kilogram to be set without reference to the IPK.
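With h fixed at exactly 6.62607015 × 10⁻³⁴ m² kg s⁻¹ in the revised SI, the kilogram follows directly:

$$1\ \mathrm{kg} = \frac{h}{6.626\,070\,15\times10^{-34}\ \mathrm{m^2\,s^{-1}}}.$$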
Measuring h to a suitably high precision of 10 parts per billion required decades of work by international teams across continents. In 1975 British physicist Bryan Kibble proposed a device, then known as a watt balance and now renamed the Kibble balance in his honour, which linked h to the unit of mass. A coil is placed inside a precisely calibrated magnetic field and a current is driven through it such that the electromagnetic force on the coil counterbalances the force of gravity on a test mass. The experiment is then repeated thousands of times over a period of months in multiple locations. The precision required is such that the strength of the gravitational field, which varies across the laboratory, must be measured before each trial.
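An idealised sketch of the underlying relation, with placeholder numbers: in “weighing” mode the electromagnetic force balances gravity, mg = BIL, while in “velocity” mode the same coil moving at speed v induces a voltage U = BLv. Combining the two eliminates the geometry factor BL, giving m = UI/(gv); since the voltage and current are measured via the Josephson and quantum Hall effects, the mass is ultimately expressed in terms of h. Real experiments involve far more corrections than this.

```python
# Illustrative Kibble-balance arithmetic (idealised; values are placeholders).
U = 1.0        # induced voltage in velocity mode, volts
I = 9.81e-3    # balancing current in weighing mode, amperes
g = 9.81       # local gravitational acceleration, m/s^2
v = 1.0e-3     # coil velocity, m/s

m = (U * I) / (g * v)       # combining mg = BIL and U = BLv
print(f"mass: {m:.3f} kg")  # 1.000 kg with these placeholder numbers
```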
Once the required precision was achieved, the value of h could be fixed and the definitions inverted, removing the kilogram’s dependence on the IPK. Following several years of deliberations, the new definition was formally adopted at the 26th General Conference on Weights and Measures in November last year. The 2019 redefinition of the SI base units came into force in May, and also sees the ampere, kelvin and mole redefined by fixing the numerical values for the elementary electric charge, the Boltzmann constant and the Avogadro constant, respectively.
“The revised SI future-proofs our measurement system so that we are ready for all future technological and scientific advances such as 5G networks, quantum technologies and other innovations that we are yet to imagine,” says Richard Brown, head of metrology at the UK’s National Physical Laboratory.
But the SI changes are controversial in some quarters. While heralding the new definition of the kilogram as “huge progress”, CNRS research director Pierre Fayet warns of possible pitfalls of fixing the value of the elementary charge: the vacuum magnetic permeability (μ₀) then becomes an unfixed parameter to be measured experimentally, with the electrical units becoming dependent on the fine-structure constant. “It appears to me as a conceptual weakness of the new definitions of electrical units, even if it does not have consequences for their practical use,” says Fayet.
One way out of this, he suggests, is to embed the new SI system within a larger framework in which c = ħ = μ₀ = ε₀ = 1, thereby fixing the vacuum magnetic permeability and other characteristics of the vacuum (C. R. Physique 20 33). This would allow all the units to be expressed in terms of the second, with the metre and joule identified as fixed numbers of seconds and reciprocal seconds, respectively. While likely attractive to high-energy physicists, however, Fayet accepts that it may be some time before such a proposal could be accepted.
In the heart of Beirut in a five-storey house owned by the Lebanese national telecommunication company, floors are about to be coated to make them anti-static, walls and ceilings will be insulated, and cabling systems installed so wires don’t become tangled. These and other details are set to be complete by mid-2020, when approximately 3000 processor cores, donated by CERN, will arrive.
The High-Performance Computing for Lebanon (HPC4L) project is part of efforts by Lebanese scientists to boost the nation’s research capabilities. Like many other countries that have been through conflict and seen their highly-skilled graduates leave to seek better opportunities, Lebanon is trying to stem its brain-drain. Though the new facility will not be the only HPC centre in the country, it is different because it involves both public and private institutions and has the full support of the government. “There are a few small-scale HPC facilities in different universities here, but they suffer from being isolated and hence are quickly outdated and underused,” says physicist Haitham Zaraket of Lebanese University in Beirut. “This HPC project puts together the main players in the realm of HPC in Lebanon.”
Having joined the LHC’s CMS experiment in 2016, Lebanese physicists want to develop the new facility into a CMS Tier-2 computing centre. High-speed internet will connect it to universities around the world and HPC4L has a mandate to ensure operation, maintenance, and user-interfacing for smooth and effective running of the facility. “We’ve been working with the government, private and public partners to prepare not just the infrastructure but also the team,” explains HPC4L project coordinator Martin Gastal of CERN. “CERN/CMS’s expertise and knowledge will help set up the facility and train users, but the team in Lebanon will run it themselves.” The Lebanese facility will also be used for computational biology, oil and gas discovery, financial forecasting, genome analysis and the social sciences.
Nepal is another country striving for greater digital storage and computing power. In 2017 Nepal signed a cooperation agreement with CERN. The following year, around 2500 cores from CERN enabled an HPC facility to be established at the government-run IT Park, with experts from Kathmandu University forming its core team. Rajendra Adhikari, project leader of Nepal’s HPC centre, also won an award from NVIDIA of its latest graphics card, worth USD 3000, which he added to the system. Nepal has never had computing on such a scale before, says Adhikari. “With this facility, we can train our students and conduct research that requires high-performance computing and data storage, from climate modelling and earthquake simulations to medical imaging and basic research.”
The Nepal facility is planning to store health data from hospitals, which is often deleted because of lack of storage space, and tests are being carried out to process drone images taken to map topography for hydropower feasibility studies. Even in the initial phases of the new centre, says Adhikari, computing tasks that used to take 45 days can now be processed in just 12 hours.
The SESAME light source in Jordan, which itself received 576 cores from CERN in 2017, is also using its experience to assist neighbouring regions in setting up and maintaining HPC facilities. “High-performance computing is a strong enabler of research capacity building in regions challenged by limited financial resources and talent exodus,” says Gastal. “By supporting the setup of efficient data processing and storage facilities, CERN, together with affiliated institutes, can assist fellow researchers in investing in the scientific potential of their own countries.”
An event held at CERN on 20–21 May revealed 170 projects that have been granted €100,000 of European Union (EU) funding to develop disruptive detection and imaging technologies. The successful projects, drawn from more than 1200 proposals from researchers in scientific and industrial organisations across the world, now have one year to prove the scientific merit and innovation potential of their ideas.
The 170 funded projects are part of the Horizon 2020 ATTRACT project funded by the EU and a consortium of nine partners, including CERN, the European Southern Observatory (ESO), European Synchrotron Radiation Facility (ESRF), European XFEL and Institut Laue-Langevin. The successful projects are grouped into four broad categories: data acquisition systems and computing; front-end and back-end electronics; sensors; and software and integration.
CERN researchers are involved in 19 of the projects, in areas from magnets and cryogenics to electronics and informatics. Several of the selected projects involve the design of sensors or signal-transmission systems that operate at very low temperatures or in the presence of radiation, and many target applications in medical imaging and treatment or in the aerospace sector. Others seek industrial applications, such as 3D printing of systems equipped with sensors, the inspection of operating cryostats or applications in environmental monitoring.
ESO’s astronomical technology and expertise will be applied to an imaging spectrograph suitable for clinical cancer studies and to single-photon visible-light imagers for adaptive optics systems and low-light-level spectroscopic and imaging applications. Among other projects connected with Europe’s major research infrastructures, four projects at the ESRF concern adaptive algebraic speckle tomography for clinical studies of osteoarticular diseases, a novel readout concept for 2D pixelated detectors, the transferral of indium-gallium-nitride epilayers onto substrates for full-spectrum LEDs, and artificial intelligence for the automatic segmentation of volumetric microtomography images.
“170 breakthrough ideas were selected based on a combination of scientific merit, innovation readiness and potential societal impact,” explained Sergio Bertolucci, chair of ATTRACT’s independent research, development and innovation committee. “The idea is to speed up the process of developing breakthrough technologies and applying them to address society’s key challenges.”
The outcomes of the ATTRACT seed-funding will be presented in Brussels in autumn 2020, and the most promising projects will receive further funding.
The open symposium of the European Strategy for Particle Physics (ESPP), which took place in Granada, Spain, on 13–16 May, revealed a vibrant field in flux as it grapples with how to attack the next big questions. Opening the event, the chair of the ESPP strategy secretariat, Halina Abramowicz, remarked: “This is a very strange symposium. Normally we discuss results at conferences, but here we are discussing future results.” More than 10 different future-collider modes were under discussion, and the 130 or so talks and discussion sessions showed that elementary particle physics – in the wake of the discovery of the Higgs boson but so far no evidence of particles beyond the Standard Model (SM) – is transitioning into a new and less well-mapped realm of fundamental exploration.
Plain weird
Theorist Pilar Hernández of the University of Valencia described the SM as plain “weird”. The model’s success in describing elementary particles and their interactions is beyond doubt, but as an all-encompassing theory of nature it falls short. Why are the fermions arranged into three neat families? Why do neutrinos have an almost imperceptibly small mass? Why does the discovered Higgs boson fit the simplest “toy model” of itself? And what lies beneath the SM’s numerous free parameters? Similar puzzles persist about the universe at large: the mechanism of inflation; the matter–antimatter asymmetry; and the nature of dark energy and dark matter.
While initial results from the LHC severely constrain the most natural parameter spaces for new physics, said Hernández, the 10–100 TeV region is an interesting scale to explore. At the same time, she argued, there is a shift to more “bottom-up, rather than top-down” approaches to beyond-SM (BSM) physics. The new quarries include axion-like and long-lived particles, and searches for hidden, dark and feebly-interacting sectors – in addition to studying the Higgs boson, which has deep connections to many puzzles in the SM, with much greater precision. “Particle physics could be heading to crisis or revolution,” said Hernández.
The accelerator, detector and computing technologies needed for future fundamental exploration are varied and challenging. Reviewing Higgs-factory programmes, Vladimir Shiltsev, head of Fermilab’s Accelerator Physics Center, weighed up the pros and cons of linear versus circular machines. The former include the International Linear Collider (ILC) and the Compact Linear Collider (CLIC); the latter, a future circular electron–positron collider at CERN (FCC-ee) and the Circular Electron Positron Collider in China (CEPC). Linear colliders, said Shiltsev, are based on mature designs and organisation, are expandable to higher energies, and draw a wall-plug power similar to that of the LHC. On the other hand, they face challenges including their luminosity and number of interaction points. Circular Higgs factories offer higher luminosity and more interaction points than the linear options but require R&D into high-efficiency RF sources and superconducting cavities, said Shiltsev.
For hadron colliders, the three current options – CERN’s FCC-hh (100 TeV), China’s SppC (75 TeV) and a high-energy LHC (27 TeV) – demand next-generation superconducting dipole magnets. Akira Yamamoto of CERN/KEK said that while a lepton collider could begin construction in the next few years, the dipoles necessary for a hadron collider might take 10 to 15 years of R&D before construction could start.
The symposium also saw much discussion of muon colliders, which would offer a lepton collider at the energy frontier, though it was widely acknowledged that the technology is not yet ready. Concerning more futuristic acceleration technologies based on plasma wakefields, impressive results from facilities such as BELLA at Berkeley and AWAKE at CERN were on show.
Thinking ahead
From colliders to fixed-target to astrophysics experiments, said Francesco Forti of INFN and the University of Pisa, detectors face a huge variety of operating conditions and employ technologies deeply entwined with developments in industry. Another difficulty, he said, is how to handle non-standard physics signals, such as long-lived particles and monopoles. Like accelerators, detectors require long time scales – it was the very early 1990s when the first conceptual design reports for the LHC detectors were written.
In terms of data processing, the challenges ahead are immense, said Simone Campana of CERN and the HEP Software Foundation. The high-luminosity LHC (HL-LHC) presents a particular challenge, but DUNE, FAIR, Belle II and other experiments will also create unprecedented data samples, and there is the need to generate ever more Monte Carlo samples. At the same time, noted Campana, the rate of advance in hardware performance has slowed in recent years, pushing the community towards graphics processing units, high-performance computing and commercial cloud services. Forti and Campana both argued for better career opportunities and greater recognition for physicists who devote their time to detector and computing efforts.
The symposium also showed that the strategic importance of communications, education and outreach is becoming increasingly recognised.
Discussions in Granada revealed a community united in its desire for a post-LHC collider, but not in its choice of that collider’s form. Stimulating some heated exchanges, the ESPP saw proposals for future machines pitted against each other and against expectations from the HL-LHC in terms of their potential physics reach for key targets such as the Higgs boson.
Big questions
Gian Giudice, head of CERN’s Theory Department, said that the remaining BSM-physics space is “huge”, and pointed to four big questions for colliders: to what extent can we tell whether the Higgs is fundamental or composite? Are there new interactions or new particles around or above the electroweak scale? What cases of thermal relic WIMPs are still unprobed and can be fully covered by future collider searches? And to what extent can current or future accelerators probe feebly interacting sectors?
Though colliders dominated discussions, the enormous progress in neutrino physics since the previous ESPP was clear from numerous presentations. The open-symposium audience was reminded that neutrino masses, as established by neutrino oscillations, are the first particle-physics evidence for BSM phenomena. A vibrant programme is under way to fully measure the neutrino mixing matrix, and in particular the neutrino mass ordering and CP-violation phase, while other experiments are probing the neutrino’s absolute mass scale and testing whether neutrinos are Dirac or Majorana particles.
On 17 May in Granada, following the open symposium of the European Strategy for Particle Physics, the first meeting of a new international working group on the International Linear Collider (ILC) took place. The ILC is the most technologically mature of all current future-collider options, and was at the centre of discussions at the previous strategy update in 2013. Although its technology and costs have been revised since then, there is still no firm decision on the project’s location, governance or funding model. The new working group was set up by Japan’s KEK laboratory in response to a recent statement on the ILC from Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) that called for further discussions on these thorny issues. Comprising two members from Europe, two from North America and three from Asia (including Japan), the group will investigate and update several points, including: cost sharing for construction and operation; organisation and governance of the ILC; and the international sharing of the remaining technical preparations. The working group will submit a report to KEK by the end of September 2019 and the final report will be used by MEXT for discussions with other governments.
Around a fifth of the 160 input documents to the ESPP were linked to flavour physics, which is crucial for new-physics searches because it is potentially sensitive to effects at scales as high as 10⁵ TeV, said Antonio Zoccoli of INFN. Summarising dark-matter and dark-sector physics, Shoji Asai of the University of Tokyo said that a shift was taking place from the old view, where dark-matter solutions arose as a byproduct of beyond-SM approaches such as supersymmetry, to a new paradigm where dark matter needs an explanation of its own. Asai called for more coordination and support between accelerator-based, direct-detection and indirect-detection dark-sector searches, as exemplified by the new European Center for Astro-Particle Theory.
Jorgen D’Hondt of Vrije Universiteit Brussel listed the many dedicated experiments in the strong-interaction arena and the open questions, including: how to reach adequate precision of perturbative and non-perturbative QCD predictions at the highest energies? And how to probe the quark–gluon plasma equation of state and establish whether there is a first-order phase transition at high baryon density?
Of all the scientific themes of the week, electroweak physics generated the liveliest discussions, especially concerning how well the Higgs boson’s couplings to fermions, gauge bosons and to itself can be probed at current and future colliders. Summary speaker Beate Heinemann of DESY cautioned that such quantitative estimates are extremely difficult to make, though a few things stand out. One is the impressive estimated performance from the HL-LHC in the next 15 years or so; another is that a long-term physics programme based on successive machines in a 100 km-circumference tunnel offers the largest overall physics reach on the Higgs boson and other key parameters. There is broad agreement, however, that the next major collider immediately after the LHC should collide electrons and positrons to fully explore the Higgs and make precision measurements of other electroweak parameters.
The big picture
The closer involvement of particle physics with astroparticle physics, in particular following the discovery of gravitational waves, was a running theme. It was argued that, in terms of technology, next-generation gravitational-wave detectors such as the Einstein Telescope are essentially “accelerators without beams” and that CERN’s expertise in vacuum and cryogenics would help to make such facilities a reality. Inputs from the astroparticle- and nuclear-physics communities, in addition to dedicated perspectives from Asia and the Americas, brought into sharp focus the global nature of modern high-energy physics and the need for greater coordination at all levels.
The open symposium of the ESPP update was a moment for physicists to take stock of the field’s status and future. The community rose to the occasion, aware that the decisions ahead will impact generations of physicists yet to be born. A week of high-quality presentations and focused discussions proved how far things have moved on since the previous strategy update concluded in 2013. Discussions illuminated both the immensity of efforts to evaluate the physics reach of the HL-LHC and future colliders, and the major task faced by the European Strategy Group (ESG) in plotting a path to the future. It is clear that new thinking, from basic theory to instrumentation, computing, analysis and global organisation, is required to sustain progress in the field.
No decisions were taken in Granada, stresses Abramowicz. “During the open symposium we mainly discussed the science. Now comes the time to assess the capacity of the community to realise the proposed scientific goals,” she says. “The Physics Preparatory Group is preparing the briefing book, which will summarise the scientific aspirations of the community, including the physics case for them.”
The briefing book is expected to be completed in September. The ESG drafting session will take place on 20–24 January 2020 in Bad Honnef, Germany, and the update of the ESPP is due to be completed and approved by CERN Council in May 2020.
Dieter Renker, who made some key contributions to the design and construction of the CMS experiment at the LHC, passed away on 16 March after a short illness. Dieter was born in Bavaria and studied physics in Munich and Berlin. He obtained his PhD from the Ludwig Maximilian University in Munich, based on experiments performed at SIN, now the Paul Scherrer Institute (PSI), in Villigen, Switzerland. In 1982 he joined SIN as a staff physicist, where he remained until his retirement at the end of 2009.
At SIN/PSI he participated in many experiments, providing excellent technical support, as well as designing new beamlines at the accelerator there. His technical aptitude in due course turned to detector development, which led to his greatest achievement. In the early days of CMS there were various ideas for the design of the electromagnetic calorimeter. Among these was the use of lead tungstate crystals, which although having many suitable properties for operation at the LHC, have a relatively small scintillation-light yield. Dieter contributed the key measurements which showed that avalanche photodiodes (APDs), with their key properties of internal gain and insensitivity to shower leakage, could be used to read out the crystals. This led to lead-tungstate crystals being adopted by CMS for the design of the calorimeter. Not only did they provide superb energy resolution for electrons and photons, enabling key discoveries such as the Higgs boson in 2012, but they also enabled a more compact detector with significantly reduced overall cost.
The development of the final APD was carried out over a period of many years by Hamamatsu Photonics (Japan), but under the close guidance of Dieter. Nearly 100 different APD prototypes were tested before the technology was deemed fit to be used in CMS. The size, capacitance, speed and, above all, radiation tolerance were the key parameters that needed to be improved, and the final choice was made very close to the deadline for commencing construction of the calorimeter. A complex multi-step screening process involving gamma irradiation and annealing also needed to be developed to ensure that the APDs installed met the demanding reliability requirements of CMS. To date there has been no recorded failure of any of the 122,000 APDs installed in CMS.
Later, Dieter turned his attention to Geiger-mode APDs, which are now widely used in particle and astroparticle physics, as well as in PET scanners. Together with researchers at ETH Zurich, he started the development of the first camera based on these novel photo sensors for Cherenkov telescopes to measure very high-energy gamma rays from astrophysical sources. This camera was installed at the FACT telescope, located in La Palma, Spain, where the HEGRA experiment had also been operated with Dieter’s active participation. The FACT telescope has now been operating successfully for more than seven years, without any sensor-related problems.
After his retirement Dieter returned to his spiritual home, Munich, where he continued his work at the Technical University.
Dieter was a curious physicist with an exceptional talent for novel detector concepts. He pursued new ideas with a strong focus on achieving his goals. He had a very open mind, and was willing to advise and assist colleagues with great patience and good humour. In his free time his interests included classical music and cooking as well as searching the woods for unusual edible mushrooms. Many colleagues and visitors have fond memories of invitations to his home, embellished with fine cooking.
His sudden illness was a shock to many. Dieter leaves behind his partner, Ulrike.
Nikhef particle physicist and prominent member of the ATLAS experiment at CERN, Olga Igonkina, passed away on 19 May in Amsterdam at the age of 45.
Olya, as she was known to most of us, was born in 1973 in Moscow. Her father was an engineer, her mother a biological scientist. At age 14 she went to a special school for children talented in mathematics and in 1991 started her studies in physics at the Moscow Institute for Physics and Technology. Two years later Olya moved to the ITEP institute to specialise in particle physics, working at the ARGUS experiment and later the HERA-B experiment at DESY.
Olya wrote her dissertation on J/ψ production at HERA-B, with Mikhail Danilov as her supervisor. In 2002 she moved to BaBar at SLAC as a postdoc with the University of Oregon in the group of Jim Brau, where she worked on searches for lepton-flavour-violating tau decays and became convener of the BaBar tau working group. In 2006 she moved to CERN to spearhead Oregon’s new ATLAS group. Her work in ATLAS concentrated on the trigger, where she contributed to many activities with great ideas and enthusiasm, in particular as the trigger-menu coordinator during the startup of the LHC, and later on physics with tau leptons. She began her appointment at Nikhef in 2008 and in 2015 became a professor at Radboud University in Nijmegen.
For her efforts on the ATLAS trigger, Olya was given an ATLAS outstanding achievement award in 2018. Physics-wise, her passion was lepton flavour violation, in particular in tau decays. Intrigued by the hints of lepton-flavour violation in B decays reported by the LHCb experiment and B factories, and always on the lookout for a niche in a large collaboration, in 2018 Olya moved some of her efforts from tau to B physics. She took responsibility for the B-hadron triggers with the aim of collecting an even larger sample of B decays in ATLAS for the final year of Run 2. She was working on preparations for an RK measurement until her very last days.
Besides being a talented scientist, Olya was a dedicated teacher. She supervised an impressive number of PhD students and was very successful in obtaining research grants. She was also very active in outreach activities, with masterclasses and open days at Nikhef, and in community building at ATLAS. Recently she organised the 15th International Workshop on Tau Lepton Physics conference in Amsterdam.
Olya was a passionate physicist who was bursting with ideas. Among several tributes from her colleagues, Olya was described as a future experiment leader. She had a memorably strong work ethos, and until the very last moment refused to let her illness affect her work. She was always cheerful and always positive. Her attitude to work and life will remain a source of inspiration to many of us.
Olya leaves behind her husband, Wouter Hulsbergen of Nikhef, and two children.