Think “neutrino detector” and images of giant installations come to mind, necessary to compensate for the vanishingly small interaction probability of neutrinos with matter. The extreme luminosity of proton-proton collisions at the LHC, however, produces a large neutrino flux in the forward direction, with energies leading to cross-sections high enough for neutrinos to be detected using a much more compact apparatus.
In March, the CERN research board approved the Scattering and Neutrino Detector (SND@LHC) for installation in an unused tunnel that links the LHC to the SPS, 480 m downstream from the ATLAS experiment. Designed to detect neutrinos produced in a hitherto unexplored pseudorapidity range (7.2 < η < 8.6), the experiment will complement and extend the physics reach of the other LHC experiments — in particular FASERν, which was approved last year. Construction of FASERν, which is located in an unused service tunnel on the opposite side of ATLAS along the LHC beamline (covering |η| > 9.1), was completed in March, while installation of SND@LHC is about to begin.
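For readers unfamiliar with the variable, pseudorapidity is a standard function of the polar angle θ to the beamline (the relation below is textbook kinematics, not specific to SND@LHC); it shows that the quoted range corresponds to production angles of roughly 0.4 to 1.5 mrad:

```latex
\eta = -\ln\tan\frac{\theta}{2}
\qquad\Longleftrightarrow\qquad
\theta = 2\arctan e^{-\eta},
\quad\text{so}\quad
7.2 < \eta < 8.6 \;\Rightarrow\; 0.37~\mathrm{mrad} \lesssim \theta \lesssim 1.5~\mathrm{mrad}.
```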
Both experiments will be able to detect neutrinos of all types, with SND@LHC positioned off the beamline to detect neutrinos produced at slightly larger angles. Expected to commence data-taking during LHC Run 3 in spring 2022, these latest additions to the LHC-experiment family are poised to make the first observations of collider neutrinos while opening new searches for feebly interacting particles and other new physics.
Neutrinos galore
SND@LHC will comprise 800 kg of tungsten plates interleaved with emulsion films and electronic tracker planes based on scintillating fibres. The emulsion acts as a vertex detector with micron resolution while the tracker provides a time stamp, the two subdetectors together acting as a sampling electromagnetic calorimeter. The target volume will be immediately followed by planes of scintillating bars interleaved with iron blocks serving as a hadron calorimeter, followed downstream by a muon-identification system.
During its first phase of operation, SND@LHC is expected to collect an integrated luminosity of 150 fb⁻¹, corresponding to more than 1000 high-energy neutrino interactions. Since electron neutrinos and antineutrinos are predominantly produced by charmed-hadron decays in the pseudorapidity range explored, the experiment will enable the gluon parton-density function to be constrained in an unexplored region of very small x. With projected statistical and systematic uncertainties of 30% and 22% in the ratio between νe and ντ, and of about 10% for both uncertainties in the ratio between νe and νμ at high energies, the Run-3 data will also provide unique tests of lepton flavour universality with neutrinos, and have sensitivity in the search for feebly interacting particles via scattering signatures in the detector target.
“The angular range that SND@LHC will cover is currently unexplored,” says SND@LHC spokesperson Giovanni De Lellis. “And because a large fraction of the neutrinos produced in this range come from the decays of particles made of heavy quarks, these neutrinos can be used to study heavy-quark particle production in an angular range that the other LHC experiments can’t access. These measurements are also relevant for the prediction of very high-energy neutrinos produced in cosmic-ray interactions, so the experiment is also acting as a bridge between accelerator and astroparticle physics.”
A FASER first
FASERν is an addition to the Forward Search Experiment (FASER), which was approved in March 2019 to search for light, weakly interacting, long-lived particles at solid angles beyond the reach of conventional collider detectors. Comprising a small and inexpensive stack of emulsion films and tungsten plates measuring 0.25 × 0.25 × 1.35 m and weighing 1.2 tonnes, FASERν is already undergoing tests. Smaller than SND@LHC, the detector is positioned on the beam-collision axis to maximise the neutrino flux, and should detect a total of around 20,000 muon neutrinos, 1300 electron neutrinos and 20 tau neutrinos in an unexplored energy regime at the TeV scale. This will allow measurements of the interaction cross-sections of all neutrino flavours, provide constraints on non-standard neutrino interactions, and improve measurements of proton parton-density functions in certain phase-space regions.
In May, based on an analysis of pilot emulsion data taken in 2018 with a target mass of just 10 kg, the FASERν team reported the first neutrino-interaction candidates: a 2.7σ excess of a neutrino-like signal above muon-induced backgrounds. The result paves the way for high-energy neutrino measurements at the LHC and future colliders, explains FASER co-spokesperson Jamie Boyd: “The final detector should do much better — it will be a hundred times bigger, be exposed to much more luminosity, have muon-identification capability, and be able to link observed neutrino interactions in the emulsion to the FASER spectrometer. It is quite impressive that such a small and simple detector can detect neutrinos, given that usual neutrino detectors have masses measured in kilotons.”
Felix H Boehm, who was William L Valentine Professor of Physics at Caltech, passed away on 25 May at his home in Altadena. He was a pioneer in the study of fundamental laws through nuclear-physics experiments.
Born in Basel, Switzerland, in 1924, Felix studied physics at ETH Zürich, earning a diploma in 1948 and a PhD in 1951 for a measurement of the (p,n) reaction at the ETH cyclotron. In 1951 he moved to the US and joined the group of Chien-Shiung Wu at Columbia University, which was investigating beta decay. He joined Caltech in 1953 and spent the rest of his academic career there.
Felix worked first with Jesse DuMond, who had developed the bent-crystal spectrometer, an instrument with unrivalled energy resolution in gamma-ray spectrometry. He used it to determine nuclear radii by measuring X-ray isotope shifts in atoms. Later, he installed such devices at LAMPF, SREL and PSI to investigate pionic atoms, which led to a precise determination of the strong-interaction shift in pionic hydrogen. At Caltech, he also became interested in parity violation and time-reversal invariance. In 1957, in an experiment performed with Aaldert Wapstra, he demonstrated that electrons in beta decay have a predominantly negative helicity.
In the mid 1970s, discussions with Harald Fritzsch and Peter Minkowski convinced Felix that the study of neutrino masses and mixings might provide answers to fundamental questions. From then on, long before it was fashionable, it became his main field of activity. He first looked at neutrino oscillations and initiated an electron–neutrino disappearance experiment with Rudolf Mössbauer.
Theirs was the first dedicated search for neutrino oscillations, beginning with a short-baseline phase at the ILL reactor in Grenoble. The concept of the experiment was presented at the Neutrino ’79 conference in Bergen, at which the Gargamelle collaboration also reported limits on νμ ↔ νe oscillations. Both talks were relegated to a session on exotic phenomena. The ILL experiment was continued at the Gösgen reactor in Switzerland with a longer baseline. No evidence of oscillations was found and stringent limits in a given parameter space were derived, contradicting positive claims made by others. A larger detector was later built at the Palo Verde nuclear power station in Arizona, where again no evidence for oscillations was found. A logical continuation of the effort initiated by Felix was the KamLAND experiment in Japan, which was exposed to several reactors and eventually, in 2002, observed neutrino oscillations in the disappearance mode at a still-longer baseline.
In parallel, Felix decided to probe neutrino masses by searching for neutrinoless double-beta decay. He led a small collaboration that installed a germanium detector in the Gotthard underground laboratory in Switzerland to probe 76Ge, and then searched for the process using a time-projection chamber (TPC) filled with xenon enriched in 136Xe. The TPC, a novel device at the time, improved the event signature and thus reduced the background, allowing stringent constraints to be placed on the effective neutrino mass. The ongoing EXO experiment can be seen as a continuation of this programme, vastly improving the sensitivity in its first phase (EXO-200 at WIPP, New Mexico) and expected to do even better in the second phase, nEXO.
Felix Boehm had a talent for identifying important issues on the theoretical side, and for selecting the appropriate technical methods on the experimental side. He was always ready to innovate. In particular, he realised very early on the importance of selecting radio-pure materials for low-count-rate, low-background experiments. All those who worked with him appreciated his open mind, his determination, his broad culture and his kindness.
The high-energy and particle physics division of the European Physical Society (EPS-HEPP) has announced the recipients of its 2021 prizes. The five awards will be presented during the EPS-HEP Conference, which begins on 26 July and will take place online.
2021 EPS High Energy and Particle Physics Prize
Torbjörn Sjöstrand (Lund University) and Bryan Webber (Cambridge University) have been announced as the winners of the 2021 EPS-HEPP Prize “for the conception, development and realisation of parton shower Monte Carlo simulations, yielding an accurate description of particle collisions in terms of quantum chromodynamics and electroweak interactions, and thereby enabling the experimental validation of the Standard Model, particle discoveries and searches for new physics.” Both Sjöstrand and Webber were also awarded the 2012 Sakurai Prize for Theoretical Particle Physics by the American Physical Society, along with the late Guido Altarelli.
2021 Giuseppe and Vanna Cocconi Prize
The 2021 Giuseppe and Vanna Cocconi Prize has been awarded to the Borexino Collaboration “for their ground-breaking observation of solar neutrinos from the pp and CNO chains that provided unique and comprehensive tests of the Sun as a nuclear fusion engine.” Gianpaolo Bellini, a former spokesperson of the experiment, commented: “The Cocconi prize awarded to us by EPS is the recognition of a more than 30-year history that began in the late 1980s, when the experiment was conceived in the context of the scientific debate triggered by the then unsolved problem of the solar neutrino, and by the need for studying solar neutrinos from very low energies.”
2021 Gribov Medal
Bernhard Mistlberger (SLAC) has received the 2021 Gribov Medal “for his ground-breaking contributions to multi-loop computations in QCD and to high-precision predictions of Higgs and vector boson production at hadron colliders.” Mistlberger also recently won the $5000 Wu-Ki Tung Award for Early-Career Research on QCD for his work.
2021 Young Experimental Physicist Prize
The 2021 Young Experimental Physicist Prize of the High Energy and Particle Physics Division of the EPS has been awarded to Nathan Jurik (CERN) “for his outstanding contributions to the LHCb experiment, including the discovery of pentaquarks, and the measurements of CP violation and mixing in the B and D meson systems”; and to Ben Nachman (LBNL Berkeley) “for exceptional contributions to the study of QCD jets as a probe of QCD dynamics and as a tool for new physics searches, his innovative application of machine learning for characterising jets, and the development of novel strategies on jet reconstruction and calibration at the ATLAS experiment.”
2021 Outreach Prize
The three winners of the 2021 EPS-HEPP Outreach Prize are: Uta Bilow (TU Dresden) and Kenneth Cecire (University of Notre Dame), “for the long-term coordination and major expansion of the International Particle Physics Masterclasses to include a range of modern methods and exercises, and connecting scientists from all the major LHC and Fermilab experiments to school pupils across the world”; and Sascha Mehlhase (LMU München) “for the design and creation of the ATLAS detector and other interlocking-brick models, creating an international outreach programme that reaches an unusually young audience.” After building the ATLAS detector out of 9500 Lego pieces in 2011, Mehlhase set up the popular “Build Your Own Particle Detector” programme.
Thanks to their large volumes and cost effectiveness, particle-physics experiments rely heavily on gaseous detectors. Unfortunately, environmentally harmful chlorofluorocarbons known as freons play an important role in traditional gas mixtures. To address this issue, more than 200 gas-detector experts participated in a workshop hosted online by CERN on 22 April to study the operational behaviour of novel gases and alternative gas mixtures.
Freon-based gases are essential to many detectors currently used at CERN, especially for tracking and triggering. Examples run from muon systems, ring-imaging Cherenkov (RICH) detectors and time-projection chambers (TPCs) to wire chambers, resistive-plate chambers (RPCs) and micro-pattern gas detectors (MPGDs). While the primary gas in the mixture is typically a noble gas, adding a “quencher” gas helps achieve a stable gas gain, well separated from the noise of the electronics. Large gas molecules such as freons absorb energy in relevant vibrational and rotational modes of excitation, thereby preventing secondary effects such as photon feedback and field emission. Extensive R&D is needed to reach the stringent performance required of each gas mixture.
CERN has developed several strategies to reduce greenhouse gas (GHG) emissions from particle detectors. As demonstrated by the ALICE experiment’s TPC, upgrading gas-recirculation systems can reduce GHGs by almost 100%. When it is not possible to recirculate all of the gas mixture, gas recuperation is an option – for example, the recuperation of CF4 by the CMS experiment’s cathode-strip-chamber (CSC) muon detector and the LHCb experiment’s RICH-2 detector. A complex gas-recuperation system for the C2H2F4 (R134a) in RPC detectors is also under study, and physicists are exploring the use of commonplace gases. In the future, new silicon photomultipliers could reduce chromatic error and increase photon yield, potentially allowing CF4 to be replaced with CO2. Meanwhile, in LHCb’s RICH-1 detector, C4F10 could possibly be replaced with hydrocarbons such as C4H10 if the flammability risk is addressed.
Eco-gases
Finally, alternative “eco-gases” are the subject of intense R&D. Eco-gases have a low global-warming potential because of their very limited stability in the atmosphere, where they react with water or decompose under ultraviolet light. Unfortunately, these conditions are also present in gaseous detectors, potentially leading to detector ageing. Beyond their stability, there is also the challenge of adapting current LHC detectors, given that access is difficult and many components cannot be replaced.
Roberto Guida (CERN), Davide Piccolo (Frascati), Rob Veenhof (Uludağ University) and Piet Verwilligen (Bari) convened workshop sessions at the April event. Groups from Turin, Frascati, Rome, CERN and GSI presented results based on the new hydro-fluoro-olefin (HFO) mixture, with the addition of neutral gases such as helium and CO2 as a way of lowering the high working-point voltage. Despite challenges related to the larger signal charge and streamer probability, encouraging results have been obtained in test beams in the presence of LHC-like background gamma rays. CMS’s CSC detector is an interesting example where HFO could replace CF4; in this case, its decomposition could even be a positive factor, although further studies are needed.
We now need to create a compendium of simulations and measurements for “green” gases, in a similar way to the concerted effort in the 1990s and 2000s that proved indispensable to the design of the LHC detectors. To this end, the INRS-hosted LXCat database enables the sharing and evaluation of data used to model non-equilibrium low-temperature plasmas. Users can upload data on electron- and ion-scattering cross sections and compare “swarm” parameters. The ETH (Zürich), Aachen and HZDR (Dresden) groups presented measurements of transport parameters, opening possibilities for collaboration, while the Bari group sought feedback and collaboration on a proposal to precisely measure transport parameters for green gases in MPGDs using electron and laser beams.
Future challenges will be significant. The volumes of detector systems for the High-Luminosity LHC and the proposed Future Circular Collider, for example, range from 10 to 100 m³, posing a significant environmental threat in the case of leaks. Furthermore, an EU “F-gas” regulation has been in force since 2014, with the aim of reducing sales of fluorinated gases to one-fifth by 2030. Given the environmental impact and the uncertain availability and price of freon-based gases, preparing a mitigation plan for future experiments is of fundamental importance to the high-energy-physics community, and the next generation of detectors must be designed entirely around eco-mixtures. Although obtaining funding for this work can be difficult, for example due to a lack of expected technological breakthroughs in low-energy plasma physics, the workshop showed that a vibrant cadre of physicists is committed to taking the field forward. The next workshop will take place in 2022.
Gravitational waves (GWs) crease and stretch the fabric of spacetime as they ripple out across the universe. As they pass through regions where beams circulate in storage rings, they should therefore cause charged-particle orbits to seem to contract, as they climb new peaks and plumb new troughs, with potentially observable effects.
Proposals in this direction have appeared intermittently over the past 50 years, including during and after the construction of LEP and the LHC. Now that the existence of GWs has been established by the LIGO and VIRGO detectors, and as new, even larger storage rings are being proposed in Europe and China, this question has renewed relevance. We are on the cusp of the era of GW astronomy — a young and dynamic domain of research with much to discover, in which particle accelerators could conceivably play a major role.
From 2 February to 31 March this year, a topical virtual workshop titled “Storage Rings and Gravitational Waves” (SRGW2021) shone light on this tantalising possibility. Organised within the European Union’s Horizon 2020 ARIES project, the meeting brought together more than 100 accelerator experts, particle physicists and members of the gravitational-physics community to explore several intriguing proposals.
Theoretically subtle
GWs are extremely feebly interacting. The cooling and expanding universe should have become “transparent” to them early in its history, long before the timescales probed through other known phenomena. Detecting cosmological backgrounds of GWs would therefore provide us with a picture of the universe at earlier times than we can currently access, prior to photon decoupling and Big-Bang nucleosynthesis. It could also shed light on high-energy phenomena, such as high-temperature phase transitions, inflation and new heavy particles that cannot be directly produced in the laboratory.
In the opening session of the workshop, Jorge Cervantes (ININ Mexico) presented a vivid account of the history of GWs, revealing how subtle they are theoretically. It took about 40 years and a number of conflicting papers to definitively establish their existence. Bangalore S. Sathyaprakash (Penn State and Cardiff) reviewed the main expected sources of GWs: the gravitational collapse of binaries of compact objects such as black holes, neutron stars and white dwarfs; supernovae and other transient phenomena; spinning neutron stars; and stochastic backgrounds with either astrophysical or cosmological origins. The GW frequency range of interest extends from 0.1 nHz to 1 MHz (see figure “Sources and sensitivities”).
Raffaele Flaminio (LAPP Annecy) reviewed the mind-boggling precision of VIRGO and LIGO, which can measure motion 10,000 times smaller than the width of an atomic nucleus. Jörg Wenninger (CERN) reported the similarly impressive sensitivity of LEP and the LHC to small effects, such as tides and earthquakes on the other side of the planet. Famously, LEP’s beam-energy resolution was so precise that it detected a diurnal distortion of the 27 km ring at an amplitude of a single millimetre, and the LHC beam-position-monitor system can achieve measurement resolutions on the average circumference approaching the micrometre scale over time intervals of one hour. While impressive for machines designed with completely different goals in mind, this is still far from the precision achieved by LIGO and VIRGO. However, the sensitivity to GWs can be strongly enhanced by exploiting resonant effects and the long distances travelled by the particles over their storage times. In one hour, protons at the LHC travel around the ring about 40 million times. In principle, the precision of modern accelerator optics could allow storage rings and accelerator technologies to cover a portion of the enormous GW frequency range of interest.
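As a quick sanity check of that figure, a few lines of Python reproduce the 40-million-turns estimate from the nominal 26.7 km circumference, treating the protons as travelling at essentially the speed of light:

```python
# Rough check of the number of LHC turns per hour quoted above.
C = 26_659.0           # nominal LHC circumference, m
c = 299_792_458.0      # speed of light, m/s (LHC protons are ultra-relativistic)

f_rev = c / C                    # revolution frequency, ~11.2 kHz
turns_per_hour = f_rev * 3600.0  # ~4.0e7, i.e. about 40 million

print(f"revolution frequency: {f_rev / 1e3:.2f} kHz")
print(f"turns per hour: {turns_per_hour:.2e}")
```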
Resonant responses
Since the invention of the synchrotron, storage rings have been afflicted by difficult-to-control resonance effects that degrade beam quality. When a new ring is commissioned, accelerator physicists work diligently to “tune” the machine’s parameters to avoid such effects. But could they turn the tables, deliberately enhancing these effects to observe resonances caused by the passage of GWs?
In accelerators and storage rings, charged particles are steered and focused in the two directions transverse to their motion by dipole, quadrupole and higher-order magnets — the “betatron motion” of the beam. The beam is also kept bunched in the longitudinal plane as a result of an energy-dependent path length and oscillating electric fields in radio-frequency (RF) cavities — the “synchrotron motion” of the beam. A gravitational wave can resonantly interact with either the transverse betatron motion of a stored beam, at a frequency of several kHz, or with the longitudinal synchrotron motion at a frequency of tens of hertz.
Katsunobu Oide (KEK and CERN) discussed the transverse betatron resonances that a gravitational wave can excite for a beam circulating in a storage ring. Typical betatron frequencies for the LHC are a few kHz, potentially offering sensitivity to GWs with frequencies of a similar order of magnitude. Starting from a standard 30 km ring, Oide-san proposed special beam-optical insertions with a large beta function, which would serve as “GW antennas” to enhance the resonance strength, resulting in 37.5 km-long optics (see figure “Antenna optics”). Among several parameters, the sensitivity to GWs should depend on the size of the ring. Oide derived a special resonance condition, kGWR ± 2 = Qx, with R the ring radius, kGW the GW wavenumber and Qx the horizontal betatron tune.
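Rendered in standard notation, and using the fact that for ultra-relativistic particles the revolution frequency is f_rev ≈ c/C with circumference C = 2πR, the condition picks out the GW frequencies to which the ring responds (this rewriting is ours, not Oide's full analysis; in practice the beam also responds at the usual betatron sidebands of the revolution frequency):

```latex
k_{\mathrm{GW}}\,R \pm 2 = Q_x
\qquad\Longrightarrow\qquad
f_{\mathrm{GW}} = \frac{c\,k_{\mathrm{GW}}}{2\pi} = \left(Q_x \mp 2\right) f_{\mathrm{rev}} .
```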
Suvrat Rao (Hamburg University) presented an analysis of the longitudinal beam response of the LHC. An impinging GW affects the revolution period, in a similar way to the static gravitational-gradient effect due to the presence of Mont Blanc (which alters the revolution time at the level of 10⁻¹⁶ s) and the diurnal effect of the changing locations of the Sun and Moon (10⁻¹⁸ s) — the latter effect being about six orders of magnitude smaller than the tidal effect on the ring circumference.
The longitudinal beam response to a GW should be enhanced for perturbations close to the synchrotron frequency, which, for the LHC, would be in the range 10 to 60 Hz. Raffaele D’Agnolo (IPhT) estimated the sensitivity to the gravitational strain, h, at the synchrotron frequency, without any backgrounds, as h ~ 10⁻¹³, and listed three possible paths to improving the sensitivity by several further orders of magnitude. Rao also highlighted that storage-ring GW detection could allow for an Earth-based GW observatory sensitive to millihertz GWs, complementing space-based laser interferometers such as LISA, which is planned for launch in 2034. This would improve the sky localisation of GW sources, which is useful for electromagnetic follow-up studies with astronomical telescopes.
Out of the ordinary
More exotic accelerators were also mooted. A “coasting-beam” experiment might have zero restoring voltage and no synchrotron oscillations. Cold “crystalline” beams of stable, ordered 1D, 2D or 3D structures of ions could open up a whole new frequency domain, as the phonon spectrum that could be excited by a GW could easily extend up to the MHz range. Witek Krasny (LPNHE) suggested storing beams of unstable particles in the LHC: their decay times and transition rates could be modified by an incident GW. The stored particles could, for example, include the excited partially stripped heavy ions that are the basis of a “gamma factory”.
Finally on the storage-ring front, Andrey Ivanov (TU Vienna) and co-workers discussed how the circumference of a storage ring, such as the 1.4 km light source SPring-8 in Japan, might shrink under the influence of the relic GW background.
Delegates at SRGW2021 also proposed completely different ways of using accelerator technology to detect GWs. Sebastian Ellis (IPhT) explained how a superconducting RF (SRF) cavity might act as a resonant bar or serve as a Gertsenshtein converter, in both cases converting a graviton into a photon in the presence of a strong background magnetic field and yielding a direct electromagnetic signal — similar to axion searches. Related attempts at GW detection using cavities were pioneered in the 1970s by teams in the Soviet Union and Italy, but RF technology has since made big strides in quality factors, cooling and insulation, and a new series of experiments appears to be well justified.
Another promising approach for GW detection is atomic-beam interferometry. Instead of light interference, as in LIGO and VIRGO, an incident GW would cause interference between carefully prepared beams of cold atoms. This approach is being pursued by the recently approved AION experiment using ultra-cold-strontium atomic clocks over increasingly large path lengths, including the possible use of an LHC access shaft to house a 100-metre device targeting the 0.01 to 1 Hz range. Meanwhile, a space-based version, AEDGE, could be realised with a pair of satellites in medium Earth orbit separated by 4.4 × 10⁷ m.
Storage rings as sources
Extraordinarily, storage rings could act not only as GW detectors, but also as observable sources of GWs. Pisin Chen (NTU Taiwan) discussed how relativistic charged particles executing circular orbital motion can emit gravitational waves in two channels: “gravitational synchrotron radiation” (GSR) emitted directly by the massive particle, and “resonant conversion” in which, via the Gertsenshtein effect, electromagnetic synchrotron radiation (EMSR) is converted into GWs.
John Jowett (GSI, retired from CERN) and Fritz Caspers (also retired from CERN) recalled that GSR from beams at the SPS and other colliders had been discussed at CERN as early as the 1980s. It was realised that these beams would be among the most powerful terrestrial sources of gravitational radiation, although the total radiated power would still be many orders of magnitude lower than from regular synchrotron radiation. The dominant frequency of direct GSR is the revolution frequency, around 10 kHz, while the dominant frequency of resonant EMSR-GSR conversion is a factor γ³ higher, around 10 THz at the LHC, conceivably allowing the observation of gravitons. If all particles and bunches of a beam were to excite the GW coherently, the spacetime-metric perturbation could be as large as hGSR ~ 10⁻¹⁸. Gravitons could also be emitted via “gravitational beamstrahlung” during the collision with an opposing beam, perhaps producing the most prominent GW signal at proposed future lepton colliders. At the LHC, argued Caspers, such signals could be detected by a torsion-balance experiment with a very sensitive, resonant mechanical pickup installed close to the beam in one of the arcs. In a phase-lock mode of operation, an effective resolution bandwidth of millihertz or below could be possible, opening the exciting prospect of detecting synthetic sources of GWs.
Towards an accelerator roadmap
The concluding workshop discussion, moderated by John Ellis (King’s College London), focused on the GW-detection proposals considered closest to implementation: resonant betatron oscillations near 10 kHz; changes in the revolution period using “low-energy” coasting ion beams without a longitudinally focusing RF system; “heterodyne” detection using SRF cavities up to 10 MHz; beam-generated GWs at the LHC; and atomic interferometry. These potential components of a future R&D plan cover significant regions of the enormous GW frequency space.
Apart from an informal meeting at CERN in the 1990s, SRGW2021 was the first workshop to link accelerators and GWs and to bring together the relevant scientific communities. Lively discussions in this emerging field attest to the promise of employing accelerators in a completely different way, to either detect or generate GWs. The subtleties of the particle dynamics when embedded in an oscillating fabric of space and time, and the inherent sensitivity problems in detecting GWs, pose exceptional challenges. The great interest prompted by SRGW2021, and its tantalising preliminary findings, call for more thorough investigations into harnessing future storage rings and accelerator technologies for GW physics.
Open science has become a pillar of the policies of national and international research-funding bodies. The ambition is to increase scientific value by sharing data and transferring knowledge within and across scientific communities. To this end, in 2015 the European Union (EU) launched the European Open Science Cloud (EOSC) to support research based on open-data science.
To help European research infrastructures adapt to this future, in 2019 the domains of astrophysics, nuclear and particle physics joined efforts to create an open scientific analysis infrastructure to support the principles of data “FAIRness” (Findable, Accessible, Interoperable and Reusable) through the EU Horizon 2020 project ESCAPE (European Science Cluster of Astronomy & Particle physics ESFRI research infrastructures). The ESCAPE international consortium brings together ESFRI projects (CTA, ELT, EST, FAIR, HL-LHC, KM3NeT and SKA) and other pan-European research infrastructures (RIs) and organisations (CERN, ESO, JIVE and EGO), linking them to EOSC.
Launched in February 2019, the €16M ESCAPE project recently passed its mid-point, with less than 24 months remaining to complete the work programme. Several milestones have already been achieved, with much more in store.
Swimming in data
ESCAPE has implemented the first functioning pilot “Data Lake” infrastructure, a new model for federated computing and storage to address the exabyte-scale data volumes expected from the next generation of RIs and experiments. The Data Lake consists of several components that work together to provide a unified namespace to users who wish to upload, download or access data. Its architecture is based on existing and proven technologies: the Rucio platform for data management; the CERN-developed File Transfer Service for data movement and transfer; and connections to the heterogeneous storage systems in use across scientific data centres. These components are deployed and integrated in a service that functions seamlessly regardless of which RI the data belong to.
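To give a flavour of the user-facing side, the sketch below shows how a replication request might look through Rucio's Python client; the account name, scope, dataset name and RSE expression are hypothetical placeholders, not ESCAPE's actual configuration:

```python
from rucio.client import Client

# Hypothetical Data Lake usage: ask for two managed copies of a dataset
# across storage elements matching an RSE (Rucio Storage Element)
# expression. Rucio records the rule; FTS then performs the transfers.
client = Client(account="escape_user")  # placeholder account

client.add_replication_rule(
    dids=[{"scope": "escape", "name": "sim.dataset.v1"}],  # placeholder DID
    copies=2,
    rse_expression="ESCAPE_DATALAKE=True",  # placeholder site selector
    lifetime=30 * 24 * 3600,                # keep replicas for 30 days (s)
)
```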
The Data Lake is an evolution of the current Worldwide LHC Computing Grid model, in view of the HL-LHC. For the first time, thanks to ESCAPE, it is the product of a cross-domain, cross-project collaboration, in which scientists from HL-LHC, SKA, CTA, FAIR and others co-develop and co-operate from the beginning. The first data-orchestration tests have been accomplished successfully, and the pilot phase demonstrated a robust architecture that serves the needs and use cases of the participating experiments and facilities. Monitoring and dashboard services have enabled user access and the selection of datasets. A new data challenge, this time also including scientific data-analysis workflows in the Data Lake, is planned for later this year.
ESCAPE is also setting up a sustainable open-access repository for deployment, exposure, preservation and sharing of scientific software and services. It will house software and services for data processing and analysis, as well as test datasets of the partner ESFRI projects, and provide user-support documentation, tutorials, presentations and training.
Open software
The collaborative, open-innovation environment and training actions provided by ESCAPE have already enabled the development of original open-source software. High-performance programming methods and deep-learning approaches have been developed, benchmarked and, in some cases, included in the official analysis pipelines of partner RIs. The definition of data formats has been pursued, as well as the harmonisation of approaches for innovative workflows. A common metadata description of the software packages, a community implementation based on an available standard (CodeMeta) and standard guidelines (including licensing) for the full software-development lifecycle have been put in place to enable interoperability and re-use.
Following the lead of the HEP Software Foundation (HSF), ESCAPE’s community-based foundation embraces a large community. Establishing a cooperative framework with the HSF will enable HSF packages to be added to the ESCAPE catalogue, and efforts to be aligned.
From the user-access point of view, ESCAPE aims to build a prototype “science analysis platform” that supports data discovery and integration, provides access to the repository, enables user-customised processing and workflows, interfaces with the underlying distributed Data Lake and links to existing infrastructures such as the Virtual Observatory. It also enables researchers to participate in large citizen-powered research projects such as Zooniverse. Each ESFRI project customises the analysis platform for its own users on top of common lower-level services that ESCAPE is building, such as JupyterHub, a pre-defined Jupyter notebook environment and a Kubernetes deployment application. First prototypes are under evaluation for SKA, CTA and the Vera C. Rubin Observatory.
In summary, ESCAPE aims to deploy an integrated open “virtual research environment” through its services for multi-probe data research, guaranteeing and boosting scientific results while providing a mechanism for acknowledging and rewarding researchers who commit to open science. In this respect, together with four other thematic clusters (ENVRI-FAIR, EOSC-Life, PANOSC and SSHOC), ESCAPE is a partner in a new EU-funded project, “EOSC Future”, which aims to gather the efforts of more researchers in cross-domain open-data “Test Science Projects” (TSPs). TSPs are collaborative projects, including two named Dark Matter and Extreme Universe, in which data, results and potential discoveries from a wealth of astrophysics, particle-physics and nuclear-physics experiments, combined with theoretical models and interpretations, will increase our understanding of the universe. This requires the engagement of all scientific communities, as already recommended by the 2020 update of the European Strategy for Particle Physics.
Open-data science projects
In particular, the Dark Matter TSP aims to further our understanding of the nature of dark matter by performing new analyses within the experiments involved, and by collecting all the digital objects related to those analyses (data, metadata and software) on a broad open-science platform that will allow the analyses to be reproduced by the entire community wherever possible.
The Extreme Universe TSP, meanwhile, intends to develop a platform to enable multi-messenger/multi-probe astronomy (MMA). Many studies of transient astrophysical phenomena benefit from the combined use of multiple instruments at different wavelengths and with different probe types. Many of these are based on a trigger from one instrument generating follow-ups from others on timescales from seconds to days. Such observations could lead to images of the strong gravitational effects expected near a black hole, for example. Extremely energetic pulsing astrophysical phenomena such as gamma-ray bursts, active galactic nuclei and fast radio bursts are also not yet fully understood. The intention within ESCAPE is to build a platform for MMA science in a way that makes it sustainable.
The idea in both of these TSPs is to exploit, for validation purposes, all the prototype services developed by ESCAPE and the uptake of its virtual research environment. At the same time, the TSPs aim to promote the innovative impact of data analysis in open science, validate the reward scheme acknowledging scientists’ participation, and demonstrate the increased scientific value implied by sharing data. This approach was discussed at the JENAS 2019 workshop and will be linked to two corresponding joint ECFA–NuPECC–APPEC actions (iDMEu and gravitational-wave probes of fundamental physics).
Half-way through, ESCAPE is clearly proving itself a powerful catalyst for making the world’s leading research infrastructures in particle physics and astronomy as open as possible. The next two years will see the consolidation of the cluster programme and the inclusion of further world-class RIs in astrophysics, nuclear and particle physics. Through the TSPs and further science projects, the ESCAPE community will continue to engage in building, within EOSC, the open-science virtual research environment of choice for European researchers. In the longer term, ESCAPE and the other science clusters are exploring how to evolve into sustained “platform infrastructures” federating large domain-based RIs. These platforms would study, define and set up new focus areas around which they engage with the European Commission and national research institutes to take part in the European data strategy at large.
The Higgs boson was hypothesised to explain electroweak symmetry breaking nearly 50 years before its discovery. Its eventual discovery at the LHC took half a century of innovative accelerator and detector development, and extensive data analysis. Today, several outstanding questions in particle physics could be answered by higgsinos – theorised supersymmetric partners of an extended Higgs field. The higgsinos are a triplet of electroweak states, two neutral and one charged. If the lightest neutral state is stable, it can provide an explanation of astronomically observed dark matter. Furthermore, an intimate connection between higgsinos and the Higgs boson could explain why the mass of the Higgs boson is so much smaller than theoretical arguments suggest. While higgsinos may not be much heavier than the Higgs boson, they would be produced more rarely and are significantly more challenging to find, especially if they are the only supersymmetric particles near the electroweak scale.
The ATLAS collaboration recently released a set of results based on the full LHC Run 2 dataset that explore some of the most challenging experimental scenarios involving higgsinos. Each result tests different assumptions. Owing to quantum degeneracy, the higgsinos mix with other supersymmetric electroweak states, the wino and the bino, to form the physical particles that would be observed by the experiment. The mass difference between the lightest neutral and charged states, ∆m, depends on this mixing. Depending on the model assumptions, the phenomenology varies dramatically, requiring different analysis techniques and stimulating the development of new tools.
If ∆m is only a few hundred MeV, the small phase space suppresses the decay from the heavier states to the lightest one. The long-lived charged state flies partway through the inner tracker before decaying, and its short track can be measured. A search targeting this anomalous “disappearing track” signature was performed by exploiting novel requirements on the quality of the signal candidate and the ability of the ATLAS inner detectors to reconstruct short tracks. Finding that the number of short tracks is as expected from background processes alone, this search rules out higgsinos with lifetimes of a fraction of a nanosecond for masses up to 210 GeV.
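The connection between lifetime and track length is simple: a particle with proper lifetime τ and boost βγ flies, on average, a distance L = βγcτ before decaying. A short calculation with illustrative numbers (not the analysis benchmarks) shows why such a particle leaves only a short track:

```python
# Mean flight distance of a long-lived charged particle: L = beta*gamma*c*tau.
c = 2.998e8       # speed of light, m/s
tau = 0.2e-9      # proper lifetime, s ("a fraction of a nanosecond")
beta_gamma = 2.0  # assumed boost, illustrative

L = beta_gamma * c * tau
print(f"mean decay length: {100 * L:.0f} cm")  # ~12 cm: a short inner-tracker track
```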
If higgsinos mix somewhat with other supersymmetric electroweak states, they will decay promptly to the lightest stable higgsino and low-energy Standard Model particles. These soft decay products are extremely challenging to detect at the LHC, and ATLAS has performed several searches for events with two or three leptons to maximise the sensitivity to different values of ∆m. Each search features innovative optimisation and powerful discriminants to reject background. For the first time, ATLAS has performed a statistical combination of these searches, constraining higgsino masses to be larger than 150 GeV for ∆m above 2 GeV.
A final result targets higgsinos in models in which the lightest supersymmetric particle is not stable. In these scenarios, higgsinos may decay to triplets of quarks. A search designed around an adversarial neural network and employing a completely data-driven background estimation technique was developed to distinguish these rare decays from the overwhelming multi-jet background. This search is the first at the LHC to obtain sensitivity to this higgsino model, and rules out scenarios of the pair production of higgsinos with masses between 200 and 320 GeV (figure 1).
Together, these searches set significant constraints on higgsino masses, and for certain parameters provide the first extension of sensitivity since LEP. With the development of new techniques and more data to come, ATLAS will continue to seek higgsinos at higher masses, and to test other theoretical and experimental assumptions.
On 1 May, experimental astroparticle physicist Ignacio Taboada of the Georgia Institute of Technology began his two-year term as spokesperson of the IceCube collaboration. He replaces Darren Grant, who had served as spokesperson for the South Pole neutrino observatory since 2019, during which time the collaboration made the first measurements of tau-neutrino appearance with IceCube DeepCore and reported the first observation of high-energy astrophysical tau neutrinos.
Taboada currently leads a research group at the Center for Relativistic Astrophysics at Georgia Tech, which has made significant contributions to IceCube by searching its data for neutrinos from transient sources, including blazar flares. Among his goals as spokesperson is to help consolidate a potential future of IceCube: IceCube-Gen2, a proposed $350M upgrade that would increase the annual rate of cosmic-neutrino observations by an order of magnitude, while increasing the sensitivity to point sources by a factor of five.
“IceCube was initially conceived to study astrophysical neutrinos and to search for the sources of astrophysical neutrinos. However, the breadth of science that it can do in other areas — glaciology, cosmic rays, PeV gamma ray sources, searches for dark matter, etc. — has allowed IceCube to produce really good scientific results for a decade or longer,” says Taboada. “Because Gen2 is standing on similar premises, I think it has a really bright future.”
Another goal is to make every IceCube member feel welcome, he explains. “There are 350 authors whose names go into papers, but I want to make sure that everybody that is related to IceCube in one way or another feels welcome within IceCube. When I joined the AMANDA collaboration, the predecessor of IceCube, in the late 1990s it was maybe 25 people. Now that it’s a gigantic enterprise, it is very easy, for example, for new PhD students to feel intimidated by professors, the analysis coordinator, the spokesperson. That’s not what I want—what I want is for everybody to feel welcome, because every single one of these people has tremendous potential to contribute to the experiment.”
Theorist Tord Riemann, who made key contributions to e+e– collider phenomenology, left us on 2 April.
Tord was born in 1951 in East Berlin, educated at the Heinrich-Hertz-Gymnasium specialist mathematics school in Berlin and studied physics at Humboldt University in Berlin from 1970. He graduated in 1977 with a doctorate devoted to studies of the lattice approach to quantum field theory. He obtained a research position in the theory group of the Institute of High Energy Physics of the Academy of Sciences of the GDR in Zeuthen (later DESY Zeuthen), and in 1983–1987 worked at JINR, then in the Soviet Union, in the group of Dmitry Bardin.
In 1989/1990 Tord visited the L3 experiment at CERN, starting a fruitful collaboration on the application of the ZFITTER project at the Large Electron–Positron (LEP) collider. In 1991–1992 he was a research associate in the CERN theory division, working out the so-called S-matrix approach to the Z resonance. This was a profound contribution to the field, and a breakthrough for the interpretation of LEP data. Tord was one of the first to realise the great potential of a new e+e– “Tera-Z” factory at the proposed Future Circular Collider, FCC-ee, and led the charge reviving precision calculations for it.
Tord’s scientific fields of interest were broad, and aimed at predicting observables measured at accelerators. His research topics included linear-collider physics; Higgs, WW, ZZ, 2f and 4f production in e+e– scattering; physics at LEP and FCC-ee; methods for the calculation of multi-loop massive Feynman integrals; NNLO Bhabha scattering in QED; higher-order corrections in the electroweak Standard Model and some of its extensions; and electroweak corrections for deep-inelastic scattering at HERA. Apart from ZFITTER, he co-authored several programs, including topfit, GENTLE/4fan, HECTOR, SMATASY, TERAD91, DISEPNC, DISEPCC, DIZET, polHeCTOR and AMBRE.
While remaining an active research scientist throughout his career, Tord will also be warmly remembered as a great mentor to many of us. He was thesis advisor for two diploma and seven PhD students, and was actively engaged in supporting many postdoctoral researchers. He was co-founder and organiser of the biennial workshop series Loops and Legs in Quantum Field Theory and of the biennial DESY school Computer Algebra and Particle Physics.
In 2000, Tord and the ZFITTER collaboration were awarded the First Prize of JINR, and in 2014 the article “The ZFITTER Project” was awarded the JINR prize for the best publication of the year in Physics of Elementary Particles and Nuclei. In 2015 Tord was awarded an Alexander von Humboldt Polish Honorary Research Fellowship.
Tord Riemann cared about high standards in scientific research, including ethical issues. He was a true professional of the field. Despite illness, he continued working until his last day.
Tord was an outstanding scientist, a just person of great honesty, a reliable friend, colleague and family man. We feel a great loss, personally and as a scientific community, and remain thankful for his insights, dedication and all the precious moments we have shared.
After many years of research and development, the ALPHA collaboration has succeeded in laser-cooling antihydrogen – opening the door to considerably more precise measurements of antihydrogen’s internal structure and gravitational interactions. The seminal result, reported on 31 March in Nature, could also lead to the creation of antimatter molecules and the development of antiatom interferometry, explains ALPHA spokesperson Jeffrey Hangst. “This is by far the most difficult experiment we have ever done,” he says. “We’re over the moon. About a decade ago, laser cooling of antimatter was in the realm of science fiction.”
The ALPHA collaboration synthesises antihydrogen from cryogenic plasmas of antiprotons and positrons at CERN’s Antiproton Decelerator (AD), storing the antiatoms in a magnetic trap. Lasers with particular frequencies are then used to measure the antiatoms’ spectral response. Finding any slight difference between spectral transitions in antimatter and matter would challenge charge–parity–time symmetry, and perhaps cast light on the cosmological imbalance of matter and antimatter.
Following the first antihydrogen spectroscopy by ALPHA in 2012, in 2017 the collaboration measured the spectral structure of the antihydrogen 1S–2S transition with an outstanding precision of 2 × 10⁻¹² – marking a milestone in the AD’s scientific programme. The following year, the team determined antihydrogen’s 1S–2P “Lyman-alpha” transition with a precision of a few parts in a hundred million, showing that it agrees with the prediction for the equivalent transition in hydrogen to a precision of 5 × 10⁻⁸. However, to push the precision of spectroscopic measurements further, and to allow future measurements of the behaviour of antihydrogen in Earth’s gravitational field, the kinetic energy of the antiatoms must be lowered.
In their new study, the ALPHA researchers were able to laser-cool a sample of magnetically trapped antihydrogen atoms by repeatedly driving the antiatoms from the 1S to the 2P state using a pulsed laser with a frequency slightly below that of the transition between them. After illuminating the trapped antiatoms for several hours, the researchers observed a more than 10-fold decrease in their median kinetic energy, with many of the antiatoms attaining energies below 1 μeV. Subsequent spectroscopic measurements of the 1S–2S transition revealed that the cooling resulted in a spectral line about four times narrower than that observed without laser cooling – a proof-of-principle of the laser-cooling technique, with further statistics needed to improve the precision of the previous 1S–2S measurement (see figure).
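The mechanism lends itself to a toy simulation. The sketch below is a deliberately crude 1D Monte Carlo of Doppler cooling on the 1S–2P line, assuming an idealised red-detuned beam in which every absorption opposes the atom's motion; the initial velocity spread and scattering rate are made-up numbers, and ALPHA's actual pulse structure, trap fields and detuning are not modelled:

```python
import numpy as np

rng = np.random.default_rng(0)

h = 6.626e-34          # Planck constant, J s
m = 1.67e-27           # (anti)hydrogen mass, kg
lam = 121.6e-9         # Lyman-alpha (1S-2P) wavelength, m
v_rec = h / (lam * m)  # recoil velocity per photon, ~3.3 m/s

def median_ueV(v):
    """Median kinetic energy of the ensemble in micro-eV."""
    return np.median(0.5 * m * v**2) / 1.602e-19 * 1e6

v = rng.normal(0.0, 40.0, size=10_000)  # assumed initial 1D velocities, m/s
print(f"before cooling: {median_ueV(v):.2f} ueV")

for _ in range(500):                     # photon-scattering cycles
    scatter = rng.random(v.size) < 0.2   # toy scattering probability per cycle
    # Red detuning means absorption comes preferentially from the direction
    # opposing the motion (taken as always, here), removing one recoil...
    v[scatter] -= np.sign(v[scatter]) * v_rec
    # ...while spontaneous emission kicks the atom in a random direction.
    v[scatter] += rng.choice([-1.0, 1.0], scatter.sum()) * v_rec

print(f"after cooling:  {median_ueV(v):.2f} ueV")  # ends up well below 1 ueV
```

Even this caricature shows the essential behaviour reported by the collaboration: a median kinetic energy that drops by more than an order of magnitude, settling near the photon-recoil limit.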
“Historically, researchers have struggled to laser-cool normal hydrogen, so this has been a bit of a crazy dream for us for many years,” says Makoto Fujiwara, who proposed the use of a pulsed laser to cool trapped antihydrogen in ALPHA. “Now, we can dream of even crazier things with antimatter.”