Safety is a priority for CERN. It spans all areas of occupational health and safety, including the protection of the environment and the safe operation of facilities. Continuous exchanges with similar research infrastructures on best practices and techniques ensure that CERN maintains the highest standards. From 25 to 28 October, more than 100 people from CERN and research institutes worldwide gathered in the Globe of Science and Innovation at CERN for the International Technical Safety Forum (ITSF). This key conference in matters of health and safety is a forum for exchanging new ideas, processes, procedures and technologies in personnel, environmental and equipment safety among a variety of high-energy physics, synchrotron and other research infrastructures.
It is a pleasure to share new ways of thinking and acting in matters of occupational health & safety and environmental protection
Yves Loertscher
“In its 25-year existence, the Forum has evolved with the times, all the while increasing its attractiveness for experts to share their knowledge, experience and challenges,” says Ralf Trant of the CERN technology department. “The scope has broadened from high-energy physics to a wider range of disciplines and participating institutes, in Europe and beyond with Asian labs joining in addition to American institutes, who have been involved since the beginning.”
Opening the event, Benoît Delille, head of the CERN Health, Safety & Environment (HSE) unit, noted: “For colleagues from different institutes who visit CERN for the first time, it is an occasion for us to share the values on which this Organization is built, that we are proud of, and also how we make them come to life through the prism of Safety.” A first session on environmental protection and sustainability saw CERN share its approach to minimising its environmental footprint in key domains, alongside a presentation from the European Spallation Source (ESS) on environmental management during its post-construction phase. Sessions on continuous improvement in health & safety, fire safety, equipment certification, incidents and lessons learned, risk assessment and technical risks unfolded during the week, ending with new projects and challenges, safety culture and behaviour, and safety training.
“Listening to your colleagues from other research institutes informing about occurred events, lessons learned and recent developments in safety assessment is the pure essence of ITSF,” said Peter Jakobsson, head of environment, safety, health & quality at ESS and member of the ITSF organising committee, who chaired the “Incidents and lessons learned” session. “We openly share information in different subject safety areas such as fire hazards, handling of chemicals and inspection of pressurised equipment. In doing so, we all learn from each other to create a safe work environment for our staff and scientific users: a true sign of the safety culture that we all strive for.”
In addition to a rich programme of presentations, the event featured an interactive fire workshop in which participants shared ongoing projects and challenges related to fire safety in accelerator facilities. CERN also shared its experience of the fire-induced radiological integrated assessment (FIRIA) project, whose objective is to develop a general methodology for assessing the fire-related risks present in CERN’s facilities and to provide a forum to keep experts connected and updated. Participants also enjoyed visits to the installations, complemented by a tour of the CERN safety training centre in Prévessin on the final day.
“This event gave us the possibility to share our knowledge through presentations but also through networking breaks, visits and social events,” said Yves Loertscher, head of the CERN HSE occupational health & safety group and organiser of this year’s ITSF event. “After a break of almost three years owing to the pandemic, it is a pleasure to interact directly with peers again and share new ways of thinking and acting in matters of occupational health & safety and environmental protection”.
On 7 September colleagues and friends of Gabriele Veneziano gathered at CERN for an informal celebration of the renowned theorist’s 80th birthday. While a visitor in the CERN theory division (TH) in 1968, Veneziano wrote the paper “Construction of a crossing-symmetric, Regge-behaved amplitude for linearly rising trajectories”. It was an attempt to explain the strong interaction, but ended up marking the beginning of string theory. During the special TH colloquium, talks by Paolo Di Vecchia (NBI & Nordita), Thibault Damour (IHES) and others explored this and numerous other aspects of Veneziano’s work, much of which was undertaken during his 30-year-long career at CERN. Concluding the day’s proceedings, Veneziano thanked his mentors, CERN TH and chance – “the chance of having lived through one of the most interesting periods in the history of physics… during which, through a wonderful cooperation between theory and experiment, enormous progress has been made in our understanding of nature at its deepest level.”
The second joint ECFA (European Committee for Future Accelerators), NuPECC (Nuclear Physics European Collaboration Committee) and APPEC (AstroParticle Physics European Consortium) symposium, JENAS, was held from 3 to 6 May in Madrid, Spain. Senior and junior members of the astroparticle, nuclear and particle-physics communities presented their challenges and discussed common issues with the goal of achieving a more comprehensive assessment of overlapping research topics. For many of the more than 160 participants, it was their first in-person attendance at a conference after more than two years due to the COVID-19 pandemic.
Focal point
The symposium began with the research highlights and strategies of the three research fields. A major part of this concerned the progress and plans of the six joint projects that have emerged since the first JENAS event in 2019: dark matter (iDMEu initiative); gravitational waves for fundamental physics; machine-learning optimised design of experiments; nuclear physics at the LHC; storage rings to search for charged-particle electric dipole moments; and synergies between the LHC and future electron–ion collider experiments. The discussions on the joint projects were complemented by a poster session where young scientists presented the details of many of these activities.
The goal was achieving a more comprehensive assessment of overlapping research topics
Detector R&D, software and computing, as well as the application of artificial intelligence, are important examples where large synergies between the three fields can be exploited. On detector R&D, there is interest in collaborating on important research topics such as those identified in the 2021 ECFA detector R&D roadmap, in which colleagues from the astroparticle and nuclear-physics communities were involved. Likewise, the challenges of processing and handling large datasets, distributed computing, as well as developing modern analysis methods for complex data analyses involving machine learning, can be addressed together.
Overview talks and round-table discussions related to education, outreach, open science and knowledge transfer allowed participants to emphasise and exchange best practices. In addition, the first results of surveys on diversity and the recognition of individual achievements in large collaborations were presented and discussed. For the latter, a joint APPEC–ECFA–NuPECC working group has presented an aggregation of best practices already in place. A major finding is that many collaborations have already addressed this topic thoroughly. However, they are encouraged to further monitor progress and consider introducing more of the best practices that were identified.
Synergy
One day was dedicated to presentations and closed-session discussions with representatives from both European funding agencies and the European Commission. The aim was to evaluate whether appropriate funding schemes and organisational structures can be established to better exploit the synergies between astroparticle, nuclear and particle physics, and thus enable a more efficient use of resources. The positive and constructive feedback will be taken into account when carrying out the common projects and towards the preparation of the third JENAS event, which is planned to take place in about three years’ time.
Cosmology, along with quantum mechanics, is probably among the most misunderstood physics topics for the layperson. Many misconceptions exist, for instance about whether the universe had a beginning, what cosmic expansion is, or even what exactly is meant by the term “Big Bang”. Will Kinney’s book An Infinity of Worlds: Cosmic Inflation and the Beginning of the Universe clarifies and corrects these misconceptions in the most accessible way.
Kinney’s main aim is to introduce cosmic inflation – a period of exponential expansion conjectured to have taken place in the very early universe – to a general audience. He starts by discussing the Standard Model of cosmology and how we know that it is correct. This is done most successfully and in a very succinct way. In only 24 pages, the book clarifies all the relevant concepts about what it means for the universe to expand, its thermal history and what a modern cosmologist means by the term Big Bang.
The book continues with an accessible discussion about the motivation for inflation. There are plenty of comments about the current evidence for the theory, its testability and future directions, along with discussions about the multiverse, quantum gravity, the anthropic principle and how all these combine together.
A clear understanding
There are two main points that the author manages to successfully induce the reader to reflect on. The first is the extreme success of the cosmic microwave background (CMB) as a tool to understand cosmology: its black-body spectrum established the Big Bang; its analysis demonstrated the flatness of the universe and its dark contents and motivated inflation; its fluctuations play a large part in our understanding of structure formation in the universe; and, together with its polarisation, the CMB photons provide a window into the dynamics of inflation. Kinney notes that there are also plenty of features that have not yet been measured and that are especially important for inflation, such as the B-modes of the CMB and primordial gravitational waves, meaning that CMB-related observations have a long way to go.
The second main point is the importance of a clear understanding of what we know and what we do not know in cosmology. The Big Bang, which is essentially the statement that the universe started as a hot plasma of particles and cooled as it expanded, is a fact. The evidence, which goes well beyond the observation of cosmic expansion, is explained very well in Kinney’s book. Beyond that there are many unknowns. Despite the excellent motivation for and the significant observational successes of inflationary models, they are yet to be experimentally verified. It is probably safe to assume, along with the author, that we will know in the future whether inflation happened or not. Even if we establish that it did and understand its mechanism, it is not clear what we can learn beyond that. Most inflationary models make statements about elements, such as the inflationary multiverse, that in principle cannot be observed.
Steven Weinberg once commented that we did not have to wait to see the dark side of the moon to conclude that it exists. Whether this analogy can be extended successfully to include inflation or string theory is definitely debatable. What is certain, however, is that there will be no shortage of interesting topics and discussions in the years to come about cosmology and fundamental physics in general. Kinney’s book can serve as a useful introduction for the general public, but also for physics students and even physicists working in different fields. As such, this book is a valuable contribution to both science education and dissemination.
The use of deep learning in particle physics has exploded in recent years. Based on INSPIRE HEP’s database, the number of papers in high-energy physics and related fields referring to deep learning and similar topics has grown 10-fold over the last decade. A textbook introducing these concepts to physics students is therefore timely and valuable.
When teaching deep learning to physicists, it can be difficult to strike a balance between theory and practice, physics and programming, and foundations and the state of the art. Born out of a lecture series at RWTH Aachen and Hamburg universities, Deep Learning for Physics Research by Martin Erdmann, Jonas Glombitza, Gregor Kasieczka and Uwe Klemradt does an admirable job of striking this balance.
The book contains 21 chapters split across four parts: deep-learning basics, standard deep neural networks, interpretability and uncertainty quantification, and advanced concepts.
In part one, the authors cover introductory topics including physics data, neural-network building blocks, training and model building. Part two surveys and applies different neural-network structures, including fully connected, convolutional, recurrent and graph neural networks, while also reviewing multi-task learning. Part three covers introspection, interpretability and uncertainty quantification, and revisits different objective functions for a variety of learning tasks. Finally, part four touches on weakly supervised and unsupervised learning methods, generative models, domain adaptation and anomaly detection. Helping to lower the barrier to entry for physics students to use deep learning in their work, the authors contextualise these methods in real physics-research studies, which is an added benefit compared to similar textbooks.
Deep learning borrows many concepts from physics, which can provide a way of connecting similar ideas in the two fields. A nice example explained in the book is the cross-entropy loss function, which has its origins in the definition of entropy according to Gibbs and Boltzmann. Another example that crops up, although rather late in part three, is the connection between the mean-squared-error loss function and the log-likelihood function for a Gaussian probability distribution, which may be more familiar to physics students accustomed to performing maximum likelihood fits.
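To make that last connection explicit (a brief sketch, not taken from the book, with the symbols here chosen purely for illustration): for training pairs $(x_i, y_i)$, a network prediction $f_\theta(x_i)$ and Gaussian noise of fixed width $\sigma$, the log-likelihood is

$$\log\mathcal{L}(\theta) = \sum_{i=1}^{N}\left[-\frac{\big(y_i - f_\theta(x_i)\big)^2}{2\sigma^2} - \log\sqrt{2\pi\sigma^2}\right] = -\frac{1}{2\sigma^2}\sum_{i=1}^{N}\big(y_i - f_\theta(x_i)\big)^2 + \text{const},$$

so maximising the likelihood over $\theta$ is equivalent to minimising the mean-squared error $\tfrac{1}{N}\sum_i\big(y_i - f_\theta(x_i)\big)^2$.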
Hands-on
Accompanying the textbook is a breadth of free, online Jupyter notebooks (executable Python code in an interactive format), available at http://deeplearningphysics.org. These curated notebooks are paired with different chapters and immerse students in hands-on exercises. Both the problem and corresponding solution notebooks are available online, and are accessible to students even without expensive computing hardware, as they can be launched on free cloud services such as Google Colab or Binder. In addition, students who have a CERN account can launch the notebooks on CERN’s service for web-based analysis (SWAN) platform.
Advanced exercises include the training and evaluation of a denoising autoencoder for speckle removal in X-ray images and a Wasserstein generative adversarial network for the generation of cosmic-ray-induced air-shower footprints. What is truly exciting about these exercises is their use of physics research examples, many taken from recent publications. Students can see how close their homework exercises and solutions are to cutting-edge research, which can be highly motivating.
In a book spanning less than 300 pages (excluding references), it is impossible to cover everything, especially as new deep-learning methods are developed almost daily. For a more theoretical understanding of the fundamentals of deep learning, readers are advised to consult the classic Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville, while for more recent deep-learning developments in particle physics they are directed to the article “A Living Review of Machine Learning for Particle Physics” by Matthew Feickert and Benjamin Nachman.
With continued interest in deep learning, coverage of a variety of real physics-research examples and a breadth of accessible, online exercises, Deep Learning for Physics Research is poised to be a standard textbook on the bookshelf of physics students for years to come.
Neutrinos are the least understood of all elementary particles, and the fact that they have mass is a firm indication of physics beyond the Standard Model. Decades of effort have been devoted to exploring the properties of neutrinos, yet many important questions remain. For example, little is known about the absolute mass scale and the neutrino-mass ordering, and we do not yet have a precise measurement of the CP-violating phase in the neutrino-mixing (PMNS) matrix. Furthermore, the nature of neutrino masses, i.e. whether they are Dirac or Majorana, remains unknown.
From 30 July to 6 August the 23rd NuFACT workshop, hosted by the University of Utah, reviewed recent developments in neutrino physics, particle physics and astroparticle physics. The workshop brought together experts from all leading neutrino experiments, along with theorists, with the aim of facilitating new connections between disciplines and between theory and experiment.
Talking points
NuFACT2022 topics were spread into seven working groups: neutrino oscillations; neutrino scattering physics; accelerator physics; muon physics; neutrinos beyond PMNS; detectors; and inclusion, diversity, equity, education and outreach. The latter was newly established at this year’s workshop to become an integral part of the series.
Three mini-workshops took place. One explored plans for the second phase of the European Spallation Source neutrino Super Beam (ESSνSB) project, for which the European Union has recently decided to continue its support for another four years. This second phase will study new components that open additional physics opportunities including muon studies, precise neutrino cross-section measurements and sterile-neutrino searches.
The two-day mini-workshop “Multi-messenger Tomography of the Earth”, involving 22 talks, saw leading neutrino physicists and geoscientists exchange ideas on how Earth’s interior models may impact high-precision measurements of neutrino oscillation parameters. Participants also addressed the potential of using neutrino absorption at high energies (PeV–TeV) and neutrino oscillation at low energies (~GeV) inside Earth to locate the core–mantle boundary, determine the density of the core and mantle, and measure the chemical composition of the core. A third workshop targeted career development, with the aim of improving communication and negotiation skills among early-career scientists.
Progress in using neutrino-oscillation measurements to search for hints of new physics and symmetries in nature was discussed extensively. Central questions to be addressed include: is the neutrino-mixing angle θ23 exactly 45°, which might hint at a new symmetry in nature? Is the PMNS matrix unitary, or could it indicate that there are additional neutrinos or something fundamentally wrong with our understanding of the neutrino sector? Are there more than the three active neutrinos? Do we see indications of CP violation in the neutrino sector, or is it even maximal? Do neutrino-mass eigenstates follow the same “normal” ordering as observed for quarks, for which there is currently a slight preference in the global fit data?
The latest results from leading experiments including IceCube, KM3NeT/ORCA, NOvA, Super-K and T2K were presented. T2K presented a new analysis using the same data runs as last year, but using more data from the near and far detector samples combined with upgraded cross-section and flux models. T2K and NOvA data preferences on δCP and sin²θ23 are broadly compatible, and joint-fit results can be expected for late 2022. For the normal mass ordering, the most probable regions are distinct, with significant contour overlap only at the 1σ level, and no preference on CP violation can be inferred. For the inverted mass ordering, the T2K and NOvA contours overlap and are consistent with maximal CP violation in the neutrino sector.
Particularly competitive results of neutrino oscillation-parameter measurements with neutrino telescopes are available from IceCube–DeepCore and ORCA, and are now approaching the precision of accelerator-based neutrino experiments.
Various theoretical aspects of neutrino physics were covered. The nature of the neutrino mass, either Dirac or Majorana, remains a key focus. Different see-saw mechanism types and their experimental consequences were intensively discussed. In particular, recent progress in Majorana-neutrino tests using both neutrinoless double-beta decay experiments and LHC measurements by the new FASER experiment was reported. Topics connecting neutrino and muon experiments, such as charged-lepton-flavour violation and the application of a possible muon collider to neutrino physics, were extensively addressed. The existence of sterile neutrinos and their properties remain of high importance to the field, and future experimental results, such as those from the short-baseline programme at Fermilab and JSNS2 at J-PARC, are highly anticipated. Alternative explanations for various neutrino anomalies were also discussed, including more general dark-sector searches using neutrino experiments. The low-energy electron excess at MicroBooNE in particular drew attention, with the focus on improved event reconstructions, which may unveil the nature of this anomalous excess. Assuming the existence of one species of sterile neutrino, 3+1 oscillation analyses have been carried out to interpret the anomaly and compare with results from other experiments. Although inconclusive, this anomaly has triggered many interesting ideas that will motivate follow-up studies.
Taking place shortly after the Snowmass Summer Meeting in Seattle (see Charting the future of US particle physics), NuFACT2022 also offered an opportunity to summarise the scientific vision for the future of neutrino physics in the US. The neutrino frontier in Snowmass has 10 topical groups, with physics beyond the Standard Model and neutrinos as messengers emerging as major focuses. Many possible synergies between neutrino physics and other branches of physics were also highlighted.
The International Union of Pure and Applied Physics (IUPAP) is an offspring of the International Research Council, a temporary body created after the First World War to rebuild and promote research across the sciences. IUPAP was established in 1922 with 13 member countries and held its first general assembly in Paris the following year. Originally, neither the International Research Council nor IUPAP included any of the countries of the Central Powers (Germany, Austria–Hungary, Bulgaria and the Ottoman Empire). Many lessons in science diplomacy had to be learned before IUPAP and the other scientific unions became truly international and physicists from all countries could apply to join. Today, with 60 member countries, the union strongly advocates that no scientist shall be excluded from the scientific community as long as their work is based on ethics and the principles of science in its highest ideals – an aspect that certainly will be further elaborated by the working group on ethics established by IUPAP in October last year.
Information exchange
Among IUPAP’s commissions covering all the different disciplines of physics is the Commission on Symbols, Units, Nomenclature, Atomic Masses and Fundamental Constants (C2), formed in 1931. The task of this commission is to promote the exchange of information and views among the members of the international scientific community in the general field of fundamental constants. As an example, the international system of units (SI) was originally recommended by IUPAP in 1960, and C2 has maintained its role in recommending further improvements, including resolutions supporting the choice of constants to define the new SI as well as the decision to proceed with the redefinition of four of the seven units made in May 2019.
From 11 to 13 July, around 250 physicists from some 70 countries gathered to celebrate the 100th birthday of IUPAP at a symposium held at the Abdus Salam International Centre for Theoretical Physics (ICTP) in Trieste, Italy. The symposium was one of the official events of the International Year of Basic Sciences for Sustainable Development, which was officially inaugurated only a few days earlier at the UNESCO headquarters in Paris. About 40% of the participants were physically present, while the rest connected online. Various panels composed of international experts discussed important issues in alignment with IUPAP’s core aims, including: how to support and encourage early-career physicists, how to improve diversity in physics, how to strengthen ties to physicists working in industry, how to improve the quality of physics education, and how to promote physics in less developed countries.
IUPAP continues to promote physics as an essential tool for development and sustainability in the next century
A number of influential scientists, including Giorgio Parisi (La Sapienza) and Laura Greene (Florida State University), described their roles in providing evidence-based advice to their respective governments on science and shared best practices that could be useful across borders. Other prominent speakers included William Phillips (Maryland), who covered the quantum reform of modern metric systems; Donna Strickland (Waterloo), who discussed the physics of high-intensity lasers; and Takaaki Kajita (Tokyo), who presented 100 years of neutrino physics via an online connection with the International Conference on High Energy Physics (ICHEP) in Bologna. Climate scientist Tim Palmer (Oxford) argued that a supercomputing facility modelled on the organisation of CERN would enable a step-change in quantifying climate change, while Stewart Prager (Princeton) outlined a new project sponsored by the American Physical Society to engage physicists in reducing nuclear threat. Dedicated panels discussed the development of physics in Africa and the Middle East, Asia and the Pacific, and Latin America. It is clear that in these regions IUPAP has a large potential to foster further international collaboration.
IUPAP enhances the vital role of young physicists, among others, through the award of early-career scientist prizes. In Trieste, several recent recipients of the prize were invited to present their research. The talks were all striking and left the audience with high hopes for the future of physics. Furthermore, the logistics in the auditorium and the handling of all the questions that came in from online participants were smoothly taken care of by members of the International Association of Physics Students.
The centennial symposium was an opportunity to reflect on IUPAP’s role in promoting international cooperation and to welcome Ukraine as a new member. The decision to admit Ukraine was expedited to send a strong signal of support for the war-torn country – a war that has not spared its scientific institutions and the people who work there, as expressed by the president of the Ukrainian Academy of Sciences, Anatoly Zagorodny, in a powerful online presentation. IUPAP has issued a statement strongly condemning the Russian aggression in Ukraine, while also expressing the principle that no scientist should be excluded from union-sponsored conferences, as long as he or she carries out work not contributing to weapons development. To overcome difficulties related to conferences, IUPAP has put in place an arrangement whereby such scientists can participate using the Union as their affiliation – similar to the model applied for the Olympic Games.
IUPAP has served the physics community for 100 years and has strong ambitions to continue to assist in the worldwide development of physics and to promote physics as an essential tool for development and sustainability in the next century.
During the past several decades of intense experimental and theoretical research, particle physicists have come to rely on the Standard Model to describe phenomena at the smallest scales and highest energies. This highly predictive, relativistic, spontaneously broken gauge theory has pointed the way to a sequence of discoveries, including that of the W and Z bosons, the gluons, and the charm and top quarks. At each point, it gave us an approximate mass scale or energy range to explore, which told us what kind of facilities we needed to build to observe predicted phenomena. Finally, in 2012, its most remarkable prediction – the existence of a Higgs particle associated with an apparently fundamental scalar field responsible for electroweak symmetry breaking – was confirmed. There are, however, big questions in particle physics to which we don’t know the answers.
Every seven to 10 years since 1982, high-energy physicists in the US have undertaken a community planning exercise to identify the most important questions for the following two decades and the facilities, infrastructure and R&D needed to pursue them. For many years these efforts, which are sponsored by the Division of Particles and Fields (DPF) of the American Physical Society (APS) and include scientists from other countries and related fields, concluded with a summer workshop in Snowmass, Colorado. The planning exercise focuses on scientific issues, whereas establishing project priorities is the task of a Particle Physics Project Prioritization Panel “P5”, charged by the US Department of Energy (DOE) and the National Science Foundation (NSF).
The latest study, “Snowmass 2021” (CERN Courier January/February 2022 p43), was meant to conclude in July 2021, but had to be delayed due to the COVID-19 pandemic. Despite the challenges, our community accomplished an amazing amount of work. The final discussions and synthesis of all the white papers, seminars, workshops and other materials took place at the University of Washington in Seattle from 17–26 July 2022. At the end of the meeting, Hitoshi Murayama (UC Berkeley and the University of Tokyo) was named chairperson of the new P5 subpanel, which will take input from Snowmass 2021.
Snowmass in context
The last US community planning exercise was held in 2013. The subsequent P5 report synthesised the questions identified into five physics drivers: use the Higgs boson as a tool for discovery; pursue the physics associated with neutrino mass; identify the new physics of dark matter; understand cosmic acceleration; and explore the unknown. It also made 29 project-oriented recommendations. The two projects assigned the highest priority were participation in the High-Luminosity LHC and the ATLAS and CMS experiments; and the construction of the LBNF/DUNE long-baseline neutrino experiment, which will detect neutrinos produced at Fermilab interacting in massive underground detectors 1300 km away in the Homestake mine in South Dakota.
Nearly a decade since the last Snowmass/P5 exercise, some elements of the recommended experimental programme have taken data and have succeeded in pushing the boundaries of our knowledge. But despite some hints, they have not yet produced a result that points us in a specific direction. Snowmass 2021 reconfirmed the relevance of the physics drivers, and added a proposal for a sixth: flavour physics as a tool for discovery. Specifically, we don’t understand why three generations of matter particles exist, nor the origin of the mass patterns that they exhibit. We do not know why the quark and lepton mixing matrices are so different, or whether CP violation exists in the neutrino sector and how it relates to the observed matter–antimatter asymmetry of the universe. There are, currently, several tantalising hints of new particles and interactions that could explain various anomalies in the weak decays of B mesons and the anomalous magnetic moment of the muon. Depending on what the near future brings, dedicated next-generation flavour experiments are likely to be required and could play a key role in the quest for physics beyond the Standard Model.
Snowmass 2021 was organised into 10 working groups or “frontiers”: accelerator, cosmic, community engagement, computational, energy, instrumentation, neutrinos, rare processes and precision measurements, theory, and underground facilities and infrastructure. Each frontier divided its work into several topical groups, taking into account input from the 2020 update of the European strategy for particle physics and other international studies. More than 500 new white papers were produced. An early-career organisation assisted young physicists in contributing to the Snowmass process and international participation was encouraged, with leaders of international institutes and laboratories including Fabiola Gianotti (CERN), Masanori Yamauchi (KEK) and Yifang Wang (IHEP) giving presentations during special plenary sessions at the Seattle workshop. In describing the US programme, Fermilab director Lia Merminga emphasised the importance of international collaboration, citing the close relationship between the US and CERN.
There was broad agreement that a successful future programme should include a healthy breadth and balance of physics topics, experiment sizes and timescales, supported via a dedicated, robust and ongoing funding process. Completion of existing experiments and execution of the DUNE and HL-LHC programmes are critical for addressing the science drivers in the near term. Strong and continued support for formal theory, phenomenology and computational theory is needed, as are stronger, targeted efforts connecting theory to experiment. Both R&D directed to specific future projects and generic research need to be supported in critical enabling technologies such as accelerators, instrumentation/detectors and computation, and in new ones such as quantum science and machine learning. Finally, a cohesive, strategic approach to promoting diversity, equity and inclusion, and to improving outreach and engagement, is required.
A panoply of ideas was discussed at Snowmass 2021. Here, in the context of the 10 frontiers, we list some of the larger projects and programmes that are proposed to be carried out, or at least started, in the next two decades, and some important conclusions concerning enabling technologies and infrastructure, with the disclaimer that these may change as the final Snowmass frontier reports are written.
The cosmic frontier
The cosmic frontier is focused on understanding how fundamental physics shapes the behaviour of the universe, in particular concerning the nature of dark matter (DM) and dark energy. The space of DM models encompasses a dizzying array of possibilities representing many orders of magnitude in mass and couplings, making the DM programme one of the most interdisciplinary investigations in high-energy and particle physics. The cosmic-frontier DM programme will “delve deep, search wide” by employing a range of direct searches for WIMPs interacting with targets on Earth or produced at accelerators, indirect searches for the products of DM annihilation, and probes based on analyses of cosmic structure. A complementary thrust is building the next generation of cosmological probes. The next big project in this arena is CMB-S4, a system of telescopes to study the cosmic microwave background and address the mystery of cosmic inflation, which is expected to operate through to at least 2036 (CERN Courier March/April 2022 p34). Additional projects that would start after 2029 are Spec-S5 (the follow-on spectroscopic device to DESI), a project to carry out line intensity mapping (LIM), and planning efforts to increase the sensitivity of gravitational-wave detection by at least a factor of 10 (10³ in sensitive volume) beyond what will be achieved by LIGO/Virgo.
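As a rough back-of-the-envelope relation (assuming the quoted factor of 10 refers to the detection reach in distance): the surveyed volume grows as the cube of that reach,

$$V \propto d^{3} \quad\Rightarrow\quad d \to 10\,d \;\;\Rightarrow\;\; V \to 10^{3}\,V,$$

which is why a tenfold gain in sensitivity corresponds to a thousandfold gain in sensitive volume.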
The immediate goal for the energy frontier is to carry out the 2014 P5 recommendations to complete the HL-LHC upgrade and execute its physics programme. A new aspect of the proposed programme is the emergence of a variety of auxiliary experiments, examples of which are FASER (operational) and MATHUSLA (proposed), that can use the existing LHC interaction regions to explore parts of discovery space in the far-forward regions. These are mid-scale detectors in cost and complexity, and provide room for additional innovation at the HL-LHC. The energy frontier supports the construction of a global e+e– Higgs factory as soon as possible. Either a linear collider or a circular collider can provide the necessary sensitivity, and a programme of directed detector and accelerator R&D for a Higgs factory is needed immediately to enable US participation. To ensure the long-term viability of the field, the energy frontier wants to begin accelerator and detector R&D towards a 10 TeV muon collider or a 100 TeV-scale hadron collider, in collaboration with partners worldwide. Finally, the US energy-frontier community has expressed renewed interest and ambition to develop options for an energy-frontier collider that could be sited in the US, specifically either an e+e– Higgs factory or a muon collider, while maintaining its international collaborative partnerships and obligations with, for example, CERN future-collider R&D projects.
The neutrino frontier
What are the neutrino masses? Are neutrinos their own antiparticles? How are the masses ordered? What is the origin of neutrino mass and flavour? Do neutrinos and antineutrinos oscillate differently? And are there new particles and interactions that can be discovered? These are among the fascinating questions elaborated by the neutrino frontier. The DUNE R&D programme, propelled by the development of large-scale liquid-argon detectors in the US and Europe, in particular through the CERN Neutrino Platform, has demonstrated the power and feasibility of this technique. Following the completion of DUNE Phase 1 by 2030, DUNE Phase 2 is the neutrino community’s highest priority project for 2030–2040. The Phase 2 project has three components: a replacement of the Fermilab 8 GeV Booster to deliver 2.4 MW to the DUNE target and possibly to provide beam for other experiments; the construction of an additional 20 kt (fiducial) of far detectors at Homestake; and a fully capable near-detector complex at Fermilab to provide very precise control of the systematic uncertainties for the far-detector measurements, besides carrying out a rich physics programme of its own. DUNE will perform definitive studies of neutrino oscillations, test the three-flavour paradigm, search for new neutrino interactions, resolve the mass-hierarchy question and, hopefully, observe CP violation. There are many other aspects of neutrino physics that merit study, including the absolute mass, the search for neutrinoless double beta decay (which bears on the issue of whether the neutrino is a Dirac or a Majorana fermion), the measurement of cross sections, and the search for sterile neutrinos. Several of these will be part of the US neutrino programme, either based in the US or through collaboration abroad.
Rare processes and precision measurements
The rare processes and precision measurements frontier is currently working on two mid-sized US projects at Fermilab endorsed by P5 in 2014: the Muon g−2 experiment, which has produced exciting results and will continue to take data for at least a few more years; and the Mu2e experiment, which is under construction. The programme also has important investments in flavour physics through support of the Belle II experiment in Japan and LHCb at CERN. Priorities for the next few years are to complete g−2, begin taking data with Mu2e, and continue collaboration on Belle II and LHCb, including participation in future upgrades. Looking ahead, the central themes are to understand quark and lepton flavour and its violation through precision measurements, and to search for dark-matter production in the mass range from sub-MeV to a few GeV in fixed-target proton and electron experiments. There is a proposal to study muon science at an advanced muon facility at Fermilab that would greatly improve the search for lepton-flavour violation in µ → eγ, µN → eN and µ → 3e decays. This would require an intense proton beam with unique characteristics and accumulator rings to manage the production of muon beams with different energies and time profiles.
Theory frontier
Theoretical particle physics seeks to provide a predictive mathematical description of matter, energy, space and time that synthesises our knowledge of the universe, analyses and interprets existing experimental results and motivates future experimental investigation. Theory connects particle physics to other areas (e.g. gravity and cosmology) and extends the boundaries of our understanding (e.g. quantum information). Together, fundamental, phenomenological and computational theory form a vibrant ecosystem whose health is essential to all aspects of the US high-energy physics programme. The theory frontier recommends, among others, invigorated support for a broad programme of research as part of a balanced portfolio and an emphasis on targeted initiatives to connect theory to experiment.
Nearly a decade since the last Snowmass exercise, the recommended experimental programme has succeeded in pushing the boundaries of our knowledge
Accelerator frontier
The accelerator frontier, which has many crossovers with the energy frontier, aims to prepare for the next generations of major accelerator-based particle-physics projects to explore the energy, neutrino and rare-process-and-precision frontiers. In the near term, a multi-MW beam-power upgrade of the Fermilab proton accelerator complex is required for DUNE Phase 2. Studies are required to understand what other requirements the Fermilab accelerator complex needs to meet if the same upgrade is to be used for related rare-decay and precision experiments. In the energy frontier, a global consensus for an e+e– Higgs factory as the next collider has been reaffirmed. While some options (e.g. the International Linear Collider) have mature designs, other options (such as FCC-ee, C3, HELEN and CLIC) require further R&D to understand if they are viable. To explore the energy frontier further, a very high-energy circular hadron collider or a multi-TeV muon collider will be needed, both of which require substantial study to see if construction is feasible in the decade starting 2040 or beyond. It is proposed that the US establish a national integrated R&D programme on future colliders to carry out technology R&D and accelerator design studies for future collider concepts. Since machines of this magnitude will require international collaboration, the US R&D programme must be well aligned and consistent with international efforts. Also under consideration are new acceleration techniques, such as wakefield acceleration and energy-recovery linacs (ERLs), along with proposed R&D programmes that could indicate how they would contribute to the design of future colliders.
Computational frontier
Software and computing are essential to all high-energy physics experiments and many theoretical studies. However, computing has entered a new “post-Moore’s law” phase. Speed-ups in processing now come from the use of heterogeneous resources such as GPUs and FPGAs developed in the commercial sector, with significant implications for the way we develop and maintain software. We are also beginning to rely on community hardware resources such as high-performance computing centres and the cloud rather than dedicated experiment resources. Finally, new machine-learning approaches are changing the way we work. This new computing environment requires new approaches to address the long-term development, maintenance and user support of essential software packages and cross-cutting R&D efforts. Additionally, strong investment in career development for software and computing researchers is needed to ensure future success. The computational frontier therefore recommends the creation of a standing coordinating panel for software and computing under the auspices of the APS DPF, mirroring the Coordinating Panel for Advanced Detectors established in 2012.
Instrumentation frontier
Improved instrumentation is the key to progress in neutrino physics, collider physics and the physics of the cosmic and rare-processes frontiers. Many aspects now at the cutting edge of detector development were hardly present 10 years ago, including quantum sensors, machine learning and precision timing. Funding for instrumentation in the US, however, is actually declining. Key elements of a rejuvenated instrumentation effort include: programmes to develop and maintain a sufficiently large and diverse workforce, including physicists, engineers and technicians at universities and national laboratories; doubling the US detector R&D budget over the next five years and modifying funding models to enable R&D consortia; expanding and sustaining support for innovative detector R&D and establishing a separate review process for such pathfinding endeavours; and developing and maintaining critical facilities, centres and capabilities for sharing knowledge and tools.
Underground experiments address some of the most important areas of particle physics, including the search for dark matter, neutrino physics (including neutrinoless double beta decay and atmospheric neutrinos), cosmic-ray physics and searches for proton decay. The underground frontier concluded that future experiments and their enabling R&D require more space than is currently planned, and proposed a possible addition of underground space at a depth of 4850 feet at SURF/Homestake, as well as possible additional space at a depth of 7400 feet. This would open up space to develop new experiments and would provide the opportunity for SURF to host next-generation dark-matter or neutrinoless double beta decay experiments.
Community engagement
The community engagement frontier concentrated on seven areas: interaction with industry; career pipeline and development; diversity, equity and inclusion; physics education; public education and outreach; public policy and government engagement; and environmental and societal impacts. The inclusion of this broad array of issues as a “frontier” was a novel aspect of Snowmass 2021 and led to the formulation of many proposals for consideration and implementation by the community as a whole. These issues impact the ability of all frontiers to successfully complete their work, and some, such as the need to broaden representation, are highlighted by other frontiers too. While many recommendations apply directly to the DOE and NSF programmes and could be considered by P5, many others are directed to the HEP community as a whole. We in DPF are considering how best to pursue these issues with government agencies, APS and other groups.
The exciting road ahead
Almost three months after the Seattle workshop, the individual frontier reports are now nearly all complete and the process of synthesising the results has begun. One important theme is to stay the course on the programme approved by the last P5, in the hope that the hints and anomalies that have shown up since then will provide some guidance for physics beyond the Standard Model. The second theme is that, in the absence of a specific target, we will have to plan a very diverse programme of experiments, theoretical studies and machine and detector R&D in which we broadly explore the large space of possibilities. In all cases, a global effort will be required, and much thought is being applied to ensuring that the US can play an appropriate role.
A global effort will be required, and much thought is being applied to ensuring that the US can play an appropriate role
We believe that members of the US high-energy physics community left the Seattle workshop with an appreciation of the great opportunities present in each frontier, the interconnections between the frontiers and the connections to programmes in the rest of the world. We hope that our report will help P5 produce recommendations that we can unite behind, as we did in 2014. That has proven to be an effective step in convincing the public and policy makers that we have conducted a rigorous process and achieved a consensus that is worthy of their support.
This year, the International Particle Physics Outreach Group (IPPOG) celebrates its 25th anniversary. Our group is an international collaboration of active particle physicists, communication experts and educators dedicated to disseminating the goals and accomplishments of fundamental scientific research to the public. Our audiences range from young schoolchildren to college graduates and teachers, from the visiting public to heads of state, and we engage them in classrooms, laboratories, experimental halls, music festivals, art exhibitions, office buildings and government offices across the planet. The activities we use to reach these diverse audiences include public lectures, visits, games, exhibits, books, online apps, and pretty much anything that can be used to demonstrate scientific methodology and instil appreciation for fundamental research.
What drives us to commit so much effort to outreach and public engagement when we are already deeply invested in a field that is both time and labour intensive? First of all, we love doing it. Particle physics is an active and exciting field that lies at the forefront of human understanding of the universe. The analysis methods and tools we employ are innovative, our machines and detectors are jaw-dropping in their size and complexity, and our international collaborations are the largest, most diverse ever assembled. It is a privilege to be part of this community and we love sharing that with those around us.
Secondly, we’ve learned that public engagement improves us as scientists. Learning how to distil complex scientific concepts into understandable descriptions, captivating stories and relatable analogies helps us to better comprehend the topics ourselves. It gives us a clearer picture in our own minds of where our work fits into the larger frame of society. It also significantly improves our communication skills, yielding capabilities that help us down the road as we apply for grants and propose new projects or analyses.
Thirdly, we also understand our moral obligation to share the results of our research with society. The endeavour to improve our understanding of the world around us is rooted in millennia of human evolution and culture. It is how we not only improve our own lives, but also how we provide the tools future generations need to survive. In more practical terms, we realise the importance of engaging those who support our research. That includes funding bodies, as well as the voters who select those bodies and prioritise the deployment of resources.
Finally, and equally important, we realise the value to both our own field and society at large of encouraging our youth to pursue careers in science and technology. The next generation of experiments will include machines, detectors and collaborations that are even larger than the ones we have today. Given their projected lifetimes, the grandchildren of today’s students will be among those analysing the data. And we need to train that workforce today.
Birth and evolution
Dedicating time and resources to outreach efforts is not easy. As researchers, our days (and often nights) are taken up by analysis meetings, detector shifts, conference deadlines and institutional obligations. So, when we do make the effort it needs to be done in an effective manner, reaching as large and diverse an audience as possible, with messages that are clear and coherent.
Former CERN Director-General Chris Llewellyn Smith certainly had this in mind when he first proposed the establishment of the European particle physics outreach group (EPOG) in 1997. The group held its first meeting on 19 September that year, under the chairmanship of Frank Close. Its primary objectives were to exchange ideas and best practices in particle-physics education and outreach, to define common goals and activities, and to develop and share materials supporting the efforts.
The original members of EPOG were delegates from the CERN Member States, one each from the four major LHC experiments, one each from the CERN and DESY laboratories, and a chair and deputy chair assigned by the European Committee for Future Accelerators (ECFA) and the high-energy physics branch of the European Physical Society. Over the course of the following 25 years, EPOG has expanded beyond Europe (becoming IPPOG), developed major worldwide programmes, including International Masterclasses in Particle Physics and Global Cosmics, and established itself as an international collaboration, defined and supported by a memorandum of understanding (MoU).
Today, the IPPOG collaboration comprises 39 members (32 countries, six experiments and one international lab) and two associate members (national labs). Each member, by signing the MoU, commits to supporting particle-physics outreach worldwide. Members also provide modest funding, which is used to support IPPOG’s core team, its administration and communication platforms, thus facilitating the development and expansion of its global programmes and activities.
The Masterclasses programme now reaches tens of thousands of students in countries spread around the globe, and is engaging new students and training new physics mentors every year. The Global Cosmics portal, hosted on the IPPOG website, provides access to a wide variety of projects distributing cosmic-ray detectors and/or data into remote classrooms that would not normally have access to particle-physics research. And a modest project budget has helped the IPPOG collaboration to establish a presence at science, music and art festivals around the globe by supporting the efforts of local researchers.
Most recently, IPPOG launched a new website, featuring information about the collaboration, its programmes and activities, and giving access to a growing database of educational resources. The resource database serves teachers and students, as well as our own community, and includes searchable, high-quality materials, project descriptions and references to related resources procured and contributed by our colleagues.
Our projects and activities are reviewed and refined periodically during twice-annual collaboration meetings hosted by member countries and laboratories. They feature hands-on activities and presentations by working groups, members, partners and panels of world-renowned topical experts. We present and publish the progress of our activities each year at major physics and science-education conferences, in parallel sessions, poster sessions and plenary talks, to share developments with the greater community and to invite its contributions.
The challenging road ahead
While these accomplishments are noteworthy and lay a strong basis for the development of particle-physics outreach, they are not enough to face the challenges of tomorrow. Or even today. The world has changed dramatically since the days we first advocated for the construction of our current accelerators and detectors. And we are partly to blame. The invention of the web at CERN more than 30 years ago greatly facilitated access to and propagation of scientific facts and publications. Unfortunately, it also became a tool for the development and even faster dissemination of lies and conspiracy theories.
Effective science education is crucial to stem the tide of disinformation. A student in a Masterclass, for example, learns quickly that truth is found in data: only by selecting events and plotting measurements is she/he able to discover what nature has in store. And it might not agree with her/his original hypothesis. It is what it is. This simple lesson teaches students how to extract signal from background, truth from fiction. Other important lessons include the value of international collaboration, the symmetries and beauty of nature and the applications of our technology to society.
How do we, as scientists, make such opportunities available to a broader audience? First and foremost, we need more of us doing outreach. Many physicists do not make the effort because they perceive it as costly to their career. Taking time away from analysis and publication can be detrimental to our advancement, especially for students and junior faculty, unless there is sufficient support and recognition. Our community needs to recognise that outreach has become a key component of scientific research. It is as essential as hardware, computing and analysis. Without it, we won’t have the support we need to build future facilities. That means senior faculty must value experience in outreach on a par with other qualities when selecting new hires, and their institutions need to support outreach activities.
We also need to increase the diversity of our audience. While particle physics can boast of its international reach, our membership is still quite limited in social, economic and cultural scope. We are sorely missing women, people of diverse ethnicities and minorities. Communication strategies and educational methods can be adopted to address this, but they require resources and dedicated personnel.
That’s what IPPOG is striving for. Our expertise and capabilities increase with membership, which is continually on the rise. This past year we have been in discussion with Mexico, Nepal, Canada and the Baltic States, and more discussions are planned for the near future. Some will sign up, others may need more time, but all are committed to maintaining and improving their investment in science education and public engagement. We invite Courier readers to join us in committing to build a brighter future for our field and our world.
The challenge of casting space–time and gravity in the language of quantum mechanics and unravelling their fundamental structure has occupied some of the best minds in physics for almost a century. Not only is it one of the hardest problems out there – requiring mastery of general relativity, quantum field theory, high-level mathematics and deep conceptual issues – but distinct sub-communities of researchers have developed around different and apparently mutually exclusive approaches.
Historically, this reflected to a large extent the existing subdivision of theoretical physics between the particle-physics community and the much smaller gravitational-physics one, with condensed-matter theorists, at the time, entirely removed from the quantum gravity (QG) problem. Until 30 years ago, the QG landscape roughly featured two main camps, often identified simply with string theory and canonical loop quantum gravity, even if a few more hybrid formalisms already existed. Much progress was achieved in this divided landscape, which somehow kept each camp convinced that it only needed to push its own strategies forward to succeed. At a more sociological level, intertwined with serious scientific disagreements, this even led, in the early 2000s, to what the popular press dubbed the “String Wars”.
A new generation has grown up in a diverse, if conflicting, scientific landscape
Today there is a growing conviction that if we are going to make progress towards this “holy grail” of physics, we need to adopt a more open attitude. We need to pay serious attention to available tools, results and ideas wherever they originated, pursuing unified perspectives when suitable and contrasting them constructively otherwise. In fact, the past 30 years have seen the development of several QG approaches, the birth of new (hybrid) ones, fresh directions and many results. A new generation has grown up in a diverse, if conflicting, scientific landscape. There is now much more emphasis on QG phenomenology and physical aspects, thanks to parallel advances in observational cosmology and astrophysics, alongside the recognition that some mathematical developments naturally cut across specific QG formalisms. There is also much more contact with “outside” communities such as particle physics, cosmology, condensed matter and quantum information, which are not interested in internal QG divisions but only in QG deliverables. Furthermore, the scientific overlaps between QG formalisms are often so strong that the definition of sharp boundaries between them looks artificial.
Introducing the ISQG
The time is ripe to move away from the String Wars towards a “multipolar QG pax”, in which diversity does not mean irreducible conflict and disagreement is turned into a call for better understanding. To this end, last year we created the International Society for Quantum Gravity (ISQG) with a founding committee representing different QG approaches and more than 400 members who do not necessarily agree scientifically, but value intelligent disagreement.
ISQG’s scientific goals are to: promote top-quality research on each QG formalism and each open issue (mathematical, physical and, in particular, conceptual); stimulate cross-fertilisation across formalisms (for example by focusing on shared mathematical ingredients and ideas, or on shared physical issues); prepare for QG observations and tests (developing a common language to interpret experiments with QG implications, and a better understanding of how the predictions of different approaches would differ); and push for new ideas and directions. Its sociological goals are equally important: to foster the recognition that we are a single community with shared interests and goals, to overcome barriers and diffidence among sub-communities, to support young researchers and to promote QG outside the community. A number of initiatives, as well as new funding schemes, are being planned to help achieve these goals.
We envision the main role of the ISQG as sponsoring and supporting the initiatives proposed by its members, in addition to organising its own. This includes a bi-annual conference series to be announced soon, focused workshops and schools, seminar series, career support for young researchers and the preparation of outreach and educational material on quantum gravity.
So far, the ISQG has been well received, with more than 100 participants attending its inaugural workshop in October 2021. Researchers in quantum gravity and related fields are welcome to join the society, contribute to its initiatives and help to create a community that transcends outdated boundaries between different approaches, which only hinder scientific progress. We need all of you!