
Euclid to link the largest and smallest scales

Euclid payload module

Untangling the evolution of the universe, in particular the nature of dark energy and dark matter, is a central challenge of modern physics. An ambitious new mission from the European Space Agency (ESA) called Euclid is preparing to investigate the expansion history of the universe and the growth of cosmic structures over the last 10 billion years, covering the entire period during which dark energy is thought to have played a significant role in the accelerating expansion. The 2-tonne probe, 4.5 m tall and 3.1 m in diameter, is undergoing final tests in Cannes, France, after which it will be shipped to Cape Canaveral in Florida and inserted into the fairing of a SpaceX Falcon 9 rocket, with launch scheduled for July. 

Let there be light

Euclid, which was selected by ESA for implementation in 2012 with a budget of about €600 million, has four main objectives. The first is to investigate whether dark energy is real, or whether the apparent acceleration of the universe is caused by a breakdown of general relativity on the largest scales. Second, if dark energy is real, Euclid will investigate whether it is a constant energy spread across space or a new force of nature that evolves with the expansion of the universe. A third objective is to investigate the nature of dark matter, the mass of neutrinos and whether there exist other, so-far undetected fast-moving particle species, and a fourth is to investigate the statistics and properties of the early universe that seeded large-scale structures. To meet these goals, the six-year Euclid mission will use a three-mirror system to direct light from up to a billion galaxies across more than a third of the sky towards a visible-light imager for photometry and a near-infrared spectrometer and photometer.

So far, the best constraints on the geometry and expansion history of the universe come from cosmic-microwave-background (CMB) surveys. Yet these missions are not the best tracers of the curvature, neutrino masses and expansion history, nor the best tool for identifying possible exotic subcomponents of dark matter. For this, large galaxy-clustering surveys are required. Euclid will use three methods. The first is redshift-space distortions, which combine the recession of galaxies due to the expansion of the universe with their motion towards regions of strong gravitational pull along our line of sight; measuring these apparent deformations in galaxy positions allows the growth rate of structures, and gravity itself, to be investigated. The second is baryon acoustic oscillations (BAOs), which arose when the universe was a plasma of baryons and photons and set a characteristic scale related to the sound horizon at recombination. After recombination, photons decoupled from visible matter while baryons were pulled in by gravity and started to form larger structures, with the BAO scale imprinted in galaxy distributions. BAOs thus serve as a standard ruler with which to trace the expansion rate of the universe. The third method, weak gravitational lensing, occurs when light from a background source is bent around a massive foreground object such as a galaxy cluster, from which the distribution of dark matter can be inferred. 
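
As a compact aside (standard textbook relations, not a description of Euclid’s actual analysis pipeline), the BAO ruler works because the same comoving sound horizon r_d is observed both across and along the line of sight, so the angular and redshift extents of the feature measure the distance–redshift relation and the expansion rate directly:

\[ \Delta\theta(z) \simeq \frac{r_d}{D_M(z)}, \qquad \Delta z(z) \simeq \frac{r_d\, H(z)}{c}, \]

where D_M(z) is the comoving angular-diameter distance to the galaxy sample and H(z) the Hubble rate at its redshift.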

As the breadth and precision of cosmological measurements increase, so do the links with particle physics. CERN and the Euclid Consortium (which consists of more than 2000 scientists from 300 institutes in 13 European countries, the US, Canada and Japan) signed a memorandum of understanding in 2016, after Euclid gained CERN recognised-experiment status in 2015. The collaboration was motivated by technical synergies for the mission’s Science Ground Segment (SGS), which will process about 850 Gbit of compressed data per day – the largest data rate of any ESA mission to date. CERN is contributing critical software tools and related support activities, explains CERN aerospace and environmental-applications coordinator Enrico Chesta: “CernVM-FS, developed by the EP-SFT team to assist high-energy-physics collaborations to deploy software on the distributed computing infrastructure used to run data-processing applications, has been integrated into the Euclid SGS and will be used for continuous software deployment among the nine Euclid science data centres.” 

Competitive survey

Euclid’s main scientific objectives also align closely with CERN’s physics challenges. A 2019 CERN-TH/Euclid workshop identified overlapping areas of interest and options for scientific visitor programmes, with topics of potential interest including N-body CMB simulations, redshift-space distortions with relativistic effects, model selection for modified gravity, and dark-energy and neutrino-mass estimation from cosmic voids. Over the coming years, Euclid will provide researchers with data against which they can test different cosmological models. “Galaxy surveys have been happening for decades and have grown in scale, but we didn’t hear much about it because the CMB was, until now, more accurate,” says theorist Marko Simonović of CERN. “With Euclid there will be a competitive survey that is big enough to be comparable to CMB data. It is exciting to see what Euclid, and other new missions such as DESI, will tell us about cosmology. And maybe we will even discover something new.”

Sharing experience, building connections

Like many physicists, Valeria Pettorino’s fascination with science started when she was a child. Her uncle, a physicist himself, played a major role by sharing his passion for science fiction, strings and extra dimensions. She studied physics and obtained her PhD from the University of Naples in 2005, followed by a postdoc at the University of Torino and then at SISSA in Italy. In 2012 her path took her to the University of Geneva on a Marie Curie Fellowship, where she worked with theorist Martin Kunz from UNIGE/CERN – a mentor and role model ever since. 

Visiting CERN was an invaluable experience that led to lifelong connections. “Meeting people who worked on particle-physics missions always piqued my interest, as they had such interesting stories and experiences to share,” Valeria explains. “I collaborated and worked alongside people from different areas in cosmology and particle physics, and I got the opportunity to connect with scientists working in different experiments.”

After the fellowship, Valeria went to the University of Heidelberg as a research group leader, during which time she was selected for the “Science to Data Science” programme run by the AI software company Pivigo. Working on artificial intelligence and unsupervised learning to analyse healthcare data for a start-up company in London gave her the opportunity to widen her skillset. 

Valeria’s career trajectory turned towards space science in 2017, when she began working on the Euclid mission of the European Space Agency (ESA), due to launch this year, which aims to measure the geometry of the universe to study dark matter and dark energy. Currently co-lead of the Euclid theory science working group, Valeria has held a number of roles in the mission, including deputy manager of the communication group. In 2018 she became the CEA representative for Euclid–France communication and is currently director of research for the CEA astrophysics department/CosmoStat lab. She also worked on data analysis for ESA’s Planck mission from 2009 to 2018. 

Mentoring and networking 

In both research collaborations, Valeria worked on numerous projects that she coordinated from start to finish. While leading teams, she studied management with the goal of enabling everyone to reach their full potential. She also completed training in science diplomacy, which helped her gain valuable transferable skills. “I decided to be proactive in developing my knowledge and started attending webinars, and then training on science diplomacy. I wanted to deepen my understanding of how science can have an impact on the world and society.” In 2022 Valeria was selected to participate in the first Science Diplomacy Immersion Programme organised by the Geneva Science and Diplomacy Anticipator (GESDA), which aims to take advantage of the ecosystem of international organisations in Geneva to anticipate, accelerate and translate emerging scientific themes into concrete actions. 

I wanted to deepen my understanding of how science can have an impact on the world and society

Sharing experience and building connections between people have been recurring themes in Valeria’s career. Nowhere is this better illustrated than in her role, since 2015, as a mentor for the Supernova Foundation – a worldwide mentoring and networking programme for women in physics. “Networking is very important in any career path, and having the opportunity to encounter people from a diverse range of backgrounds allows you to grow your network both personally and professionally. The mentoring programme is open to all career levels. There are no barriers. It is a global network of people from 53 countries and there are approximately 300 women in the programme. I am convinced that it is a growing community that will continue to thrive.” Valeria also acted as a mentor for Femmes & Science (a French initiative of Paris-Saclay University) in 2021–2022, and was recently appointed as one of 100 mentors worldwide for #space4women, an initiative of the United Nations Office for Outer Space Affairs to support women pursuing studies in space science.

A member of the CERN Alumni Network, Valeria thoroughly enjoys staying connected with CERN. “Not only is the CERN Alumni Network excellent for CERN as it brings together a wide range of people from many career paths, but it also provides an opportunity for its members to understand and learn how science can be used outside of academia.”

Sigurd Hofmann 1944–2022

Sigurd Hofmann, an extraordinary scientist, colleague and teacher, passed away on 17 June 2022 at the age of 78. The highlights of his scientific life were the discovery of proton radioactivity in 1981 and the synthesis of six new superheavy chemical elements between 1981 and 1996. 

Sigurd was born on 15 February 1944 in Böhmisch-Kamnitz (Bohemia) and studied physics at TH Darmstadt, where he received his diploma in 1969 and his doctorate in 1974 with Egbert Kankeleit. Afterwards, he joined the GSI Helmholtz Centre for Heavy Ion Research in Darmstadt, where his scientific work occupied him for almost 50 years. Accuracy and scientific exactness were important to him from the beginning. He investigated fusion reactions and radioactive decays in the group of Peter Armbruster and worked with Gottfried Münzenberg. 

Sigurd achieved international fame through the discovery of proton radioactivity from the ground state of ¹⁵¹Lu in 1981, a previously unknown decay mechanism. When analysing the data, he benefited from his pronounced thoroughness and scientific curiosity. At the same time, he began work on the synthesis, unambiguous identification and study of the properties of the heaviest chemical elements, which were to shape his further scientific life. The first highlights were the synthesis of the new elements bohrium (Bh), hassium (Hs) and meitnerium (Mt) between 1981 and 1984, with which GSI entered the international stage of this renowned research field. The semiconductor detectors that Sigurd had developed specifically for these experiments were far ahead of their time, and are now used worldwide in searches for new chemical elements. 

In the early 1990s Sigurd took over the management of the Separator for Heavy Ion Reaction Products (SHIP) group and, after making instrumental improvements to detectors and electronics, crowned his scientific success with the discovery of the elements darmstadtium (Ds), roentgenium (Rg) and copernicium (Cn) between 1994 and 1996. The concept for “SHIP-2000”, a strategy paper developed under his leadership in 1999 for long-term heavy-element research at GSI, is still relevant today. In 2009 he was appointed Helmholtz professor and from then on was able to devote himself entirely to scientific work again. For many years he also maintained an intensive collaboration and scientific exchange with his Russian colleagues in Dubna, where he co-discovered the element flerovium (Fl) in a joint experiment.

For his outstanding research work and findings, Sigurd received a large number of renowned awards and prizes – too many, in fact, to mention. A diligent writer and speaker, he was invited to talk at countless international conferences, and authored a large number of review articles, books and book chapters, as well as many widely cited publications. He also liked to present scientific results at public events, where he was able to paint a thrilling picture of modern physics and of the big questions of cosmology and element synthesis in stars; he could also convey very clearly to the public how atoms can be made “visible”.

Many chapters of Sigurd’s scientific life are recorded in his 2002 book On Beyond Uranium (CRC Press). His modesty and friendly nature were remarkable, and you could always rely on him. His care, accuracy and deliberateness in all work were outstanding, and his persistence was one of the foundations of his ground-breaking scientific achievements. He was always in the office or at an experiment, even late in the evening and at weekends, so you could talk to him at any time and were always rewarded with detailed answers and competent advice.

We are pleased that we were able to work with such an excellent scientist and colleague, as well as an outstanding teacher and a great person, for so many years.

Karel Cornelis 1955–2022

Our dear colleague and friend Karel Cornelis passed away unexpectedly on 20 December 2022.

After finishing his studies in physics at the University of Leuven (Belgium), Karel joined CERN in 1983 as an engineer-in-charge of the Super Proton Synchrotron (SPS), at the time when the machine was operated as a proton–antiproton collider. During his career Karel contributed greatly to the commissioning, performance development and follow-up of the SPS during its various phases as a proton–antiproton collider, LEP injector, high-intensity fixed-target machine and LHC injector of proton and ion beams. He had a profound and extensive knowledge of the machine, from complex beam-dynamics aspects to the engineering details of its various systems, and was the reference whenever new beam requirements or modes of operation were discussed. 

Karel was an extremely competent and rigorous physicist, but also a generous and dedicated mentor who trained generations of control-room technicians, shift leaders and machine physicists and engineers, helping them to grow and take on responsibilities while remaining available to lend a hand when needed. His positive attitude and humour have left a lasting imprint, so much so that “Think like a proton: always positive!” has become the motto of the SPS operation team, and is now visible in the SPS island in the CERN Control Centre.

Karel had the rare gift of explaining complex phenomena with simple but accurate models and clear examples, whether it was accelerator physics and technology, or physics and engineering more generally. He gave a fascinating series of machine shut-down lectures covering the history of the SPS, synchrotron radiation and one of his passions, aviation, with a talk on “Air and the airplanes that fly in it”. 

Karel was a larger-than-life tutor, friend, reference point, expert and father figure to generations of us. He was much missed in the SPS island and beyond following his retirement in September 2019, and will be even more so now.  

New physics in b decays

There are compelling reasons to believe that the Standard Model (SM) of particle physics, while the most successful theory of the fundamental structure of the universe, does not offer a complete picture of reality. However, no new physics beyond the SM has so far been firmly established through direct searches at different energy scales. This motivates indirect searches, performed by precisely examining phenomena sensitive to contributions from possible new particles and comparing their properties with SM expectations. This is conceptually similar to how, decades ago, our understanding of radioactive beta decay allowed the existence and properties of the W boson to be predicted.

New Physics in b decays, by Marina Artuso, Gino Isidori and the late Sheldon Stone, is dedicated to precision measurements in decays of hadrons containing a b quark. Due to their high mass, these hadrons can decay into dozens of different final states, providing numerous ways to challenge our understanding of particle physics. As is usual for indirect searches, the crucial task is to understand and control all SM contributions to these decays. For b-hadron decays, the challenge is to control the effects of the strong interaction, which are difficult to calculate.

Both sides of the coin

The authors committed to a challenging task: providing a snapshot of a field that has developed considerably during the past decade. They highlight key measurements that generated interest in the community, often due to hints of deviations from SM expectations. Some of the reported anomalies have diminished since the book was published, after larger datasets were analysed; others continue to intrigue researchers. This natural scientific progress leads to a better understanding of both the theoretical and experimental sides of the coin. The authors exercise reasonable caution over the significance of the anomalies they present, warning the reader of the look-elsewhere effect, and carefully define the relevant observables. When discussing specific decay modes, they explain why these were chosen over other processes. This pedagogical approach makes the book very useful for early-career researchers diving into the topic. 

The book starts with a theoretical introduction to heavy-quark physics within the SM, mapping out avenues for searches for possible new-physics effects. Key theoretical concepts are introduced, along with the experiments that contributed most significantly to the field. The authors continue with an overview of “traditional” new-physics searches, strongly interleaving them with precision measurements of the free parameters of the SM, such as the couplings between quarks and the W boson. By determining these parameters precisely with several alternative experimental approaches, one hopes to observe discrepancies. An in-depth review of the experimental measurements, including their complications, is confronted with theoretical interpretations. While some of the discrepancies stand out, it is difficult to attribute them to new physics as long as alternative interpretations are not excluded.

New Physics in b Decays

The second half of the book dives into recent anomalies in decays with leptons, and the theoretical models attempting to address them. The authors reflect on theoretical and experimental work of the past decade and outline a number of pathways to follow. A short overview follows of searches for processes that are forbidden or extremely suppressed in the SM, such as lepton-flavour violation. These transitions, if observed, would represent an undeniable signature of new physics, although they only arise in a subset of new-physics scenarios; such searches therefore allow strong limits to be placed on specific hypotheses. The book concludes with the authors’ view of the near future, which is already becoming reality. They expect the ongoing LHCb and Belle II experiments to have a decisive word on the current flavour anomalies, but also to deliver new surprises. They rightly conclude that “It is difficult to make predictions, especially about the future.”

The remarkable feature of this book is that it is written by physicists who actively contributed to the development of numerous theoretical concepts and key experimental measurements in heavy-quark physics over the past decades. Unfortunately, one of the authors, Sheldon Stone, could not see his last book published. Sheldon was the editor of the book B decays, which served as the handbook on heavy-quark physics for decades. One can appreciate the impressive progress in the field by comparing the first edition of B decays in 1992 with New Physics in b decays. In the 1990s, heavy-quark decays were only starting to be probed. Now, they offer a well-oiled tool that can be used for precision tests of the SM and searches for minuscule effects of possible new physics, using decays that happen as rarely as once per billion b-hadron decays.

The key message of this book is that theory and experiment must go hand in hand. Some parameters are difficult to calculate precisely and need to be measured instead, while observables that are theoretically clean are often challenging experimentally. The searches for new physics in b decays therefore focus on processes that are accessible from both the theoretical and experimental points of view. The reach of such searches is constantly being broadened by painstakingly refining calculations and developing clever experimental techniques, with progress achieved through the routine work of hundreds of researchers in several experiments worldwide.

Cosmic rays for cultural heritage

In 1965, three years before being awarded a Nobel prize for his decisive contributions to elementary particle physics, Luis Alvarez proposed using cosmic muons to look inside an Egyptian pyramid. A visit to the Giza pyramid complex a few years earlier had made him ponder why, despite the comparable size of the Great Pyramid of Khufu and the Pyramid of Khafre, the latter was built with a simpler structure – simpler even than the tomb of Khufu’s father Sneferu, under whose reign there had been architectural experimentation and pyramids had grown in complexity. Only one burial chamber is known in the superstructure of Khafre’s pyramid, while the tombs of each of his two predecessors contain two. Alvarez’s doubts were not shared by many archaeologists, and he was certainly aware that the history of architecture is not a continuous process and that family relationships can be complicated; but like many adventurers before him, he was fascinated by the idea that some hidden chambers could still be waiting to be discovered. 

The principles of muon radiography, or “muography”, were already textbook knowledge at that time. Muons are copiously produced in particle cascades originating from naturally occurring interactions between primary cosmic rays and atmospheric nuclei. Most of these cosmogenic muons are energetic enough that, despite their relatively short intrinsic lifetime, relativistic time dilation allows them to survive the journey from the upper atmosphere to Earth’s surface – where their penetrating power makes them a promising tool to probe the depths of very large and dense volumes non-destructively. Thick and dense objects attenuate the cosmic-muon flux significantly by stopping its low-energy component, thus providing a “shadow” analogous to a conventional radiograph. The earliest known attempt to use muon-flux attenuation for practical purposes was the estimation of the overburden of a tunnel in Australia using Geiger counters on a rail, published in 1955 in an engineering journal. This obscure precedent was probably unknown to Alvarez, who didn’t cite it.
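
To get a feel for the orders of magnitude involved, the following minimal sketch (our own back-of-envelope model, not taken from any of the projects described here) estimates the fraction of near-vertical muons that survive a pyramid-sized rock overburden, assuming a textbook power-law muon spectrum and a constant, minimum-ionising energy loss:

# Minimal sketch: fraction of near-vertical cosmic muons surviving an
# overburden. Assumes a differential muon spectrum ~E^-2.7 above 1 GeV
# and a constant energy loss of ~2 MeV per g/cm^2 (both rough textbook
# approximations, not values from the projects described in the text).

GAMMA = 2.7      # approximate differential spectral index of the muon flux
DEDX = 2.0e-3    # mean energy loss in GeV per g/cm^2 (minimum-ionising)

def surviving_fraction(thickness_m: float, density_g_cm3: float) -> float:
    """Fraction of muons with E > 1 GeV that cross the overburden,
    assuming a muon gets through only if its energy exceeds the
    total ionisation loss along the path."""
    opacity = thickness_m * 100.0 * density_g_cm3   # g/cm^2
    e_min = max(1.0, DEDX * opacity)                # GeV needed to cross
    # Integral flux above E scales as E^-(GAMMA - 1), normalised at 1 GeV
    return e_min ** -(GAMMA - 1.0)

# Example: ~100 m of limestone (density ~2.5 g/cm^3), a pyramid-scale overburden
print(f"surviving fraction: {surviving_fraction(100.0, 2.5):.2%}")

With these round numbers, roughly one muon in a thousand gets through 100 m of rock – few, but still plenty given the roughly one muon per cm² per minute arriving at sea level, which is why exposures lasting months are typical.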

Led by Alvarez, the Joint Pyramid Project was officially established in 1966. The detector that the team built and installed in the known large chamber at the bottom of Khafre’s pyramid was based on spark chambers, which were standard equipment for particle-physics experiments at that time. Less common were the computers provided by IBM for Monte Carlo simulations, which played a crucial role in the data interpretation. It took some time for the project to take off: just as the experiment was ready to take data, the Six-Day War broke out, delaying progress by several months until diplomatic relations were restored between Cairo and Washington. All this might sound like a promising subject for a Hollywood blockbuster were it not for its anticlimax: no hidden chamber was found. Alvarez always insisted that there is a difference between not finding what you search for and conclusively excluding its existence, but despite this important distinction, one wonders how much muography’s fame would have benefitted from a discovery. The study, published in Science in 1970, set an example that was followed in subsequent decades by many more interdisciplinary applications.  

The second pyramid to be muographed was in Mexico more than 30 years later, when researchers from the National Autonomous University of Mexico (UNAM) started to search for hidden chambers in the Pyramid of the Sun at Teotihuacan. Built about 1800 years ago by the Teotihuacan civilisation (centuries before the Aztecs, who later named the site), it is the third largest pyramid in the world after Khufu’s and Khafre’s, and its purpose is still a mystery. Although there is no sign that it contains burial chambers, the hypothesis that this monument served as a tomb is not entirely ruled out. After more than a decade of data taking, the UNAM muon detector (composed of six layers of multi-wire chambers occupying a total volume of 1.5 m³) found no hidden chamber. But the researchers did find evidence, reported in 2013, for a very wide low-density volume in the southern side, which is still not understood and has led to speculation that this side of the pyramid might be in danger of collapse.

Big void 

Muography returned to Egypt with the ScanPyramids project, which has been taking data since 2015. The project made headlines in 2017 by revealing an unexpected low-density anomaly in Khufu’s Great Pyramid, tantalisingly similar in size and shape to the Grand Gallery of the same building. Three teams of physicists from Japan and France participated in the endeavour, cross-checking each other by using different detector technologies: nuclear emulsions, plastic scintillators and Micromegas. The latter, being gaseous detectors, had to be located outside the pyramid to comply with safety regulations. Publishing in Nature Physics, all three teams reported a statistically significant excess in muon flux originating from the same 3D position (see “Khufu’s pyramid” figure). 

Khufu’s pyramid

This year, based on a larger data sample, the ScanPyramids team concluded that this “Big Void” is a horizontal corridor about 9 m long with a transverse section of around 2 × 2 m². Confidence in the solidity of these conclusions was provided by a cross-check measurement with ground-penetrating radar and ultrasound, performed by Egyptian and German experts, which took data from 2020 and was published simultaneously. The consistency between the muography data and the conventional methods motivated a visual inspection via an endoscope, confirming the claim. While the purpose of this unexpected feature of the pyramid is not yet known, the work represents the first characterisation of the position and dimensions of a void detected by cosmic-ray muons with a sensitivity of a few centimetres.

New projects exploring the Giza pyramids are now sprouting. A particularly ambitious project by researchers in Egypt, the US and the UK – Exploring the Great Pyramid (EGP) – plans to use movable large-area detectors to perform precise 3D tomography of the pyramid. Thanks to its larger detector surface and some methodological improvements, EGP aims to surpass ScanPyramids’ sensitivity after two years of data taking. Although the project is still at the simulation-studies stage, the detector technology – plastic scintillator bars with a triangular cross section and embedded wavelength-shifting fibres – is already being used by the ongoing MURAVES muography project to scan the interior of the Vesuvius volcano in Italy. The project will also profit from synergy with the upcoming Mu2e experiment at Fermilab, where the very same detectors are used. Finally, proponents of the ScIDEP (Scintillator Imaging Detector for the Egyptian Pyramids) experiment from Egypt, the US and Belgium are giving Khafre’s pyramid a second look, using a high-resolution scintillator-based detector to take data from the same location as Alvarez’s spark chambers.

Muography data in the Xi’an city walls

Pyramids easily make headlines, but there is no scarcity of monuments around the world where muography can play a role. Recently, a Russian team used emulsion detectors to explore the Svyato-Troitsky Danilov Monastery, the main buildings of which have undergone several renovations across the centuries, with the associated documentation lost. The results of their survey, published in 2022, include evidence for two unknown rooms and areas of significantly higher density (possible walls) in the immured parts of certain vaults, and for underground voids speculated to be ancient crypts or air ducts. Muography is also being used to preserve buildings of historical importance. The defensive wall structures of Xi’an, one of the Four Great Ancient Capitals of China, suffered serious damage due to heavy rainfall, but repairs in the 1980s were insufficiently documented, motivating non-destructive techniques to assess their internal status. Taking data from six different locations using a compact and portable muon detector to extract a 3D density map of a rampart, a Chinese team led by Lanzhou University recently reported density anomalies that potentially pose safety hazards (see “Falling walls” figure). 

The many flavours of muography

All the examples described so far are based on the same basic principle as Alvarez’s experiment: the attenuation of the muon flux through dense matter. But there are other ways to use muons as probes. For example, it is possible to exploit their deflection in matter due to Coulomb scattering from nuclei, which offers the possibility of elemental discrimination. Such muon scattering tomography (MST) has been proposed to help preserve the Santa Maria del Fiore cathedral in Florence, whose iconic dome, built between 1420 and 1436 by Filippo Brunelleschi, is cracking under its own weight. Accurate modelling is needed to guide reinforcement efforts, but uncertainties remain about the internal structure of the walls. According to some experts, Brunelleschi might have inserted iron chains inside the masonry of the dome to stabilise it; however, no conclusive evidence has been obtained with traditional remote-sensing methods. Searching for iron within the masonry is therefore the goal of the proposed experiment (see “Preserving a masterpiece” figure), for which a proof-of-principle test on a mock-up wall has already been carried out at Los Alamos.
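
The physics behind this elemental discrimination can be summarised by the standard multiple-scattering formula (the PDG approximation, quoted here for context rather than taken from the Florence proposal): a muon of momentum p and velocity βc traversing a thickness x of material with radiation length X0 picks up an RMS deflection

\[ \theta_0 \simeq \frac{13.6\ \text{MeV}}{\beta c p}\,\sqrt{\frac{x}{X_0}}\left[1 + 0.038\,\ln\frac{x}{X_0}\right]. \]

Because X0 is about 1.8 cm in iron but of order 10 cm in stone or masonry, iron chains embedded in the dome’s walls would scatter muons measurably more strongly than the surrounding material.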

Beyond cultural heritage, muography has also been advocated as a powerful remote-sensing method for a variety of applications in the nuclear sector. It has been used, for example, to assess the damage and the impact of radioactivity at the Fukushima power plant, where four nuclear reactors were damaged in 2011: absorption-based muography was applied to map density differences within the reactors, for example the thickness of the walls, while MST was applied to locate the nuclear fuel. Muography, especially MST, has also allowed the investigation of other extreme systems, including blast furnaces and nuclear-waste barrels. 

Santa Maria del Fiore cathedral

Volcanology is a further important application of muography, where it is used to discover empty magma chambers and voids. As muons are absorbed more strongly by thick and dense objects, such as the rock at the base of a volcano, the attenuation provides key information about a volcano’s inner structure. The density images created via muography can even be fed into machine-learning models to help predict eruptive patterns, and similar methods can be applied in glaciology, as has been done to estimate the topography of mountains hidden by overlying glaciers. Among these projects is Eiger-μ, designed to explore the mechanisms of glacial erosion.

Powerful partnership 

Muography creates bridges across the world between particle physics and cultural-heritage preservation. The ability to perform radiography of a large object from a distance or from pre-existing tunnels is very appealing in situations where invasive excavations are impossible, as is often the case in highly populated urban or severely constrained areas. Geophysical remote-sensing methods are already part of the archaeological toolkit, but in general they are expensive, have a limited resolution and demand strong model assumptions for interpreting the data. Muography is now gaining acceptance in the cultural-heritage preservation world because its data are intrinsically directional and can be easily interpreted in terms of density distributions.

From the pioneering work of Alvarez to the state-of-the-art systems available today, progress in muography has gone hand-in-hand with the development of detectors for particle physics. The ScanPyramids project, for example, uses micropattern gaseous detectors such as those developed within the CERN RD51 collaboration, and nuclear emulsion detectors such as those of the OPERA neutrino experiment, while the upcoming EGP project will benefit from detector technologies developed for the Mu2e experiment at Fermilab. R&D for next-generation muography includes the development of scintillator-based muon detectors, resistive plate chambers, trackers based on multi-wire proportional chambers and more. There are proposals to use microstrip silicon detectors from the CMS experiment and Cherenkov telescopes inspired by the CTA astrophysics project, showing how R&D for fundamental physics continues to drive exotic applications in archaeology and cultural-heritage preservation.

Exploring the origins of matter–antimatter asymmetry

The first edition of the International Workshop on the Origin of Matter–Antimatter Asymmetry (CP2023), hosted by the École de Physique des Houches, took place from 12 to 17 February. Around 50 physicists gathered to discuss the central problem connecting particle physics and cosmology: CP violation. Since one of the very first schools dedicated to time-reversal symmetry, chaired by Wolfgang Pauli in the summer of 1952, research has progressed significantly – especially with Sakharov’s formulation of the three conditions necessary to produce the observed matter–antimatter asymmetry of the universe: baryon-number violation, C and CP violation, and departure from thermal equilibrium.

The workshop programme covered current and future experimental projects to probe the Sakharov conditions: collider measurements of CP violation (LHCb, Belle II, FCC-ee), searches for electric dipole moments (PSI, FNAL), long-baseline neutrino experiments (NOvA, DUNE, T2K, Hyper-Kamiokande, ESSnuSB) and searches for baryon- and lepton-number violating processes such as neutrinoless double beta decay (GERDA, CUORE, CUPID-Mo, KamLAND-Zen, EXO-200) and neutron–antineutron oscillations (ESS). These were put in context with the different theoretical approaches to baryogenesis and leptogenesis.

With the workshop’s aim to provide a discussion forum for junior and senior scientists from various backgrounds, and following the tradition of the École des Houches, a six-hour mini-school took place in parallel with the more specialised talks. A first lecture by Julia Harz (University of Mainz) introduced the hypotheses underlying baryogenesis, and another by Adam Falkowski (IJCLab) described how CP violation is treated in effective field theory. Each lecture provided both a common theoretical background and an opportunity to discuss the fundamental motivations driving experimental searches for new sources of CP violation in particle physics.

In his summary talk, Mikhail Shaposhnikov (EPFL Lausanne) explained that it is not yet possible to identify which mechanism is responsible for the observed baryon asymmetry of the universe. He added that we live in exciting times, and reviewed the vast number of opportunities in experiment and theory that lie ahead.

A bridge between popular and textbook science

Most popular science books are written to reach the largest audience possible, which comes with certain sacrifices. The assumption is that many readers might be deterred by technical topics and language, especially by equations that require higher mathematics. In physics one can therefore usually distinguish textbooks from popular physics books by flicking through the pages and checking for symbols.

The Biggest Ideas in the Universe: space, time, and motion, the first in a three-part series by Sean Carroll, goes against this trend. Written for “…people who have no more mathematical experience than high-school algebra, but are willing to look at an equation and think about what it means”, the book never reaches a point at which things are muddied because the maths becomes too advanced.

Concepts and theories

The first part of the book covers nine topics, including conservation, space–time, geometry, gravity and black holes. Carroll spends the first few chapters introducing the reader to the thought process of a theoretical physicist: how to develop a sense for symmetries, the conservation of charges and expansions in small parameters. The book also gives readers a fast introduction to calculus, using geometric arguments to define derivatives and integrals. By the end of the third chapter, the concepts of differential equations, phase space and the principle of least action have all been introduced.

The central part of the book focusses on geometry. A discussion of the meaning of space and time in physics is followed by the introduction of Minkowski spacetime, with considerable effort given to the philosophical meaning of these concepts. The third part is the most technical: it covers differential geometry and a beautiful derivation of Einstein’s equation of general relativity, and the final chapter uses the Schwarzschild solution to discuss black holes.

The Biggest Ideas in the Universe

It is a welcome development that publishers and authors such as Carroll are confident that books like this will find a sizeable readership (another good, recent example of advanced popular-physics writing is Leonard Susskind’s “The Theoretical Minimum” series). Many topics in physics can only be fully appreciated if the equations are explained and if chapters go beyond the limitations of typical popular science books. Carroll’s writing style and the structure of the book help to make this case: all concepts are carefully introduced, and even though the book is very dense and covers a lot of material, everything is interconnected and readers won’t feel lost. Regular references to the historical steps in discovering theories and concepts loosen up the text – two examples are the correspondence between Leibniz and Clarke about the nature of space, and the interesting discussion of Einstein’s and Hilbert’s different approaches to general relativity. The whole series, the remaining two parts of which will be published soon, is accompanied by freely available recorded lectures that present the topic of every chapter, along with answers to questions on these topics.

It is difficult to find any weaknesses in this book. One minor point concerns the figures, which are often labelled only with symbols that readers unfamiliar with physics notation have to look up in the text; more text in the figures themselves would make them even more accessible. And, strangely, the section introducing entropy is not supported by equations – given the technical detail of all other parts of the book, Carroll could have taken advantage of the mathematical groundwork of the previous chapters here.

I want to emphasise that every topic discussed in The Biggest Ideas in the Universe is well-established physics. There are no flashy but speculative theories, nor an unbalanced focus on the science-fiction ideas often used to attract readers to theoretical physics. The book stands apart from similar titles by offering insights that can only be obtained when the underlying equations are explained and not just mentioned.

Anyone who is interested in fundamental physics is encouraged to read this book – especially young people considering studying physics, who will get an excellent idea of the type of physical arguments they will encounter at university. Those who think their mathematical background isn’t sufficient will likely learn many new things, even though the later chapters are quite technical. And readers at the other end of the spectrum, such as working physicists, will find the philosophical discussions of familiar concepts, and the illuminating arguments included to elicit physical intuition, most useful.

Digging deeper into invisible Higgs-boson decays

ATLAS figure 1

Studies of the Higgs boson by ATLAS and CMS have observed and measured a large spectrum of production and decay mechanisms. Its relatively long lifetime, reflected in a small expected width (4.1 MeV, compared with the GeV-range decay widths of the W and Z bosons), makes the Higgs boson a sensitive probe for small couplings to new states that may measurably distort its branching fractions. The search for invisible or as-yet undetected decay channels is thus highly relevant.

Dark-matter (DM) particles created in LHC collisions would have no measurable interaction with the ATLAS detector and thus would be “invisible”, but could still be detected via the observation of missing transverse momentum in an event, similarly to neutrinos. The Standard Model (SM) predicts the Higgs boson to decay invisibly via H → ZZ* → 4ν in only 0.1% of cases. However, this value could be significantly enhanced if the Higgs boson decays into a pair of (light enough) DM particles. Thus, by constraining the branching fraction of Higgs-boson decays to invisible particles it is possible to constrain DM scenarios and probe other physics beyond the SM (BSM).

The ATLAS collaboration has performed comprehensive searches for invisible decays of the Higgs boson considering all its major production modes: vector-boson fusion with and without additional final-state photons, gluon fusion in association with a jet from initial-state radiation, and associated production with a leptonically decaying Z boson or a top quark–antiquark pair. The results of these searches have now been combined, including inputs from Run 1 and Run 2 analyses. They yield an upper limit of 10.7% on the branching ratio of the Higgs boson to invisible particles at 95% confidence level, with an unprecedented expected sensitivity of 7.7%. The result is used to extract upper limits on the spin-independent DM–nucleon scattering cross-section for DM masses below about 60 GeV in a variety of Higgs-portal models (figure 1). In this range, and for the models considered, invisible Higgs-boson decays are more sensitive than direct DM–nucleon scattering experiments.
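
For a sense of scale, a quick conversion (a standard relation evaluated with round numbers; not a figure quoted by ATLAS): an invisible branching fraction B_inv corresponds to an additional partial width Γ_inv on top of the SM width, so the new limit translates into

\[ B_{\text{inv}} = \frac{\Gamma_{\text{inv}}}{\Gamma_H^{\text{SM}} + \Gamma_{\text{inv}}} \quad\Rightarrow\quad \Gamma_{\text{inv}} < \frac{0.107}{1 - 0.107}\times 4.1\ \text{MeV} \approx 0.5\ \text{MeV}. \]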

ATLAS figure 2

An alternative way to constrain possible undetected decays of the Higgs boson is to measure its total decay width ΓH. Combining the observed value of the width with measurements of the branching fractions to observed decays allows the partial width for decays to new particles to be inferred. Directly measuring ΓH at the LHC is not possible, as it is much smaller than the detector resolution. However, ΓH can be constrained by taking advantage of an unusual feature of the H → ZZ(*) decay channel: the rapid increase in the available phase space as mH approaches the 2mZ threshold counteracts the mass dependence of Higgs-boson production. Furthermore, this far “off-shell” production above 2mZ has a negligible ΓH dependence, unlike “on-shell” production near the Higgs-boson mass of 125 GeV. Comparing the Higgs-boson production rates in these two regions therefore allows an indirect measurement of ΓH. Although some assumptions are required (e.g. that the relation between on-shell and off-shell production is not modified by BSM effects), the measurement is sensitive to the value of ΓH expected in the SM. Recently, ATLAS measured the off-shell production cross-section using both the four-charged-lepton (4ℓ) and two-charged-lepton-plus-two-neutrino (2ℓ2ν) final states, finding evidence for off-shell Higgs-boson production with a significance of 3.3σ (figure 2). By combining the previously measured on-shell Higgs-boson production cross-section with the off-shell one, ΓH was found to be 4.5 +3.3 −2.5 MeV, which agrees with the SM prediction of 4.1 MeV but leaves plenty of room for possible BSM contributions.
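
Schematically, the argument rests on well-known proportionalities (a sketch of the standard logic, with the same caveat as above that BSM effects are assumed not to modify the on-shell/off-shell relation): for gg → H(*) → ZZ,

\[ \sigma_{\text{on-shell}} \propto \frac{g^2_{ggH}\, g^2_{HZZ}}{\Gamma_H}, \qquad \sigma_{\text{off-shell}} \propto g^2_{ggH}\, g^2_{HZZ} \quad\Rightarrow\quad \Gamma_H = \Gamma_H^{\text{SM}}\, \frac{\mu_{\text{off-shell}}}{\mu_{\text{on-shell}}}, \]

where the μ are the measured signal strengths relative to the SM expectation in each region.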

This sensitivity will improve thanks to the new data to be collected in Run 3 of the LHC, which should more than triple the size of the Run 2 dataset.

Design principles of theoretical physics

“Now I know what the atom looks like!” Ernest Rutherford’s simple statement belies the scientific power of reductionism. He had recently discovered that atoms have substructure, notably that they comprise a dense, positively charged nucleus surrounded by a cloud of negatively charged electrons. Zooming forward in time, that nucleus ultimately gave way when protons and neutrons were revealed at its core. A few stubborn decades later they too gave way, our current understanding being that they are composed of quarks and gluons. At each step a new layer of nature is unveiled, sometimes more, sometimes less numerous in “building blocks” than the one prior, but in every case delivering explanations, even derivations, of the properties (in practice, parameters) of the previous layer. This strategy, broadly defined as “build microscopes, find answers”, has been tremendously successful, arguably for millennia.

Natural patterns

While investigating these successively explanatory layers of nature, broad patterns emerge. One of these is known colloquially as “naturalness”. The pattern asserts that, in reversing the direction and going from a microscopic theory, “the UV completion”, to its larger-scale shell, “the IR”, the values of parameters measured in the latter are, essentially, “typical”. Typical, in the sense that they reflect the scales, magnitudes and, perhaps most importantly, the symmetries of the underlying UV completion. As Murray Gell-Mann once said: “everything not forbidden is compulsory”.

So, if some symmetry is broken by a large amount by some interaction in the UV theory, the same symmetry, in whatever guise it may have adopted, will also be broken by a large amount in the IR theory. The only exception is accidental fine-tuning, where large UV breakings can in principle conspire to give contributions to IR breakings that, in practical terms, cancel to a high degree, leaving a much smaller parameter in the IR theory than expected. This is colloquially known as “unnaturalness”.

There are good examples of both instances. There is no symmetry in QCD that could keep a proton light; unsurprisingly, it has a mass of the same order as the dominant mass scale in the theory, the QCD scale: mp ~ ΛQCD. But there is a symmetry in QCD that keeps the pion light. The only parameters in the UV theory that break this symmetry are the light-quark masses, so the pion mass-squared is expected to be around m²π ~ mqΛQCD. Turns out, it is.
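
That scaling is made precise by the Gell-Mann–Oakes–Renner relation, and plugging in round numbers (a standard textbook exercise, not part of the original text) shows how well it works:

\[ m_\pi^2\, f_\pi^2 = (m_u + m_d)\,|\langle \bar{q}q \rangle|, \qquad m_u + m_d \approx 7\ \text{MeV},\quad |\langle \bar{q}q \rangle| \approx (250\ \text{MeV})^3,\quad f_\pi \approx 92\ \text{MeV} \quad\Rightarrow\quad m_\pi \approx 115\ \text{MeV}, \]

within about 20% of the observed 135–140 MeV.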

There are also examples of unnatural parameters. If you measure enough different physical observables, observations that are unlikely on their own become possible in a large ensemble of measurements – a sort of theoretical “look-elsewhere effect”. For example, consider the fact that the Moon almost perfectly obscures the Sun during a solar eclipse. There is no symmetry that requires the angular size of the Moon to almost match that of the Sun for an Earth-based observer. Yet, given many planets and many moons, this will of course happen for some planetary systems.
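
A quick check with round numbers (our own arithmetic) shows how fine the coincidence is:

\[ \theta_{\text{Moon}} \approx \frac{3.47\times 10^{3}\ \text{km}}{3.84\times 10^{5}\ \text{km}} \approx 9.0\ \text{mrad}, \qquad \theta_{\text{Sun}} \approx \frac{1.39\times 10^{6}\ \text{km}}{1.50\times 10^{8}\ \text{km}} \approx 9.3\ \text{mrad} \]

– an accidental match at the few-per-cent level, with no symmetry behind it.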

However, if an observation of a parameter returns an apparently unnatural value, can one be sure that it is accidentally small? In other words, can we be confident we have definitively explored all possible phenomena in nature that can give rise to naturally small parameters? 

From 30 January to 3 February, participants in an informal CERN theory institute, “Exotic Approaches to Naturalness”, sought to answer this question. Drawn from diverse corners of the theorist zoo, more than 130 researchers gathered, both virtually and in person, to discuss questions of naturalness. The invited talks were chosen to expose phenomena, in quantum field theory and beyond, that challenge the naive naturalness paradigm.

Coincidences and correlations

The first day of the workshop considered how apparent numerical coincidences can lead to unexpectedly small parameters in the IR as the result of selection rules that are not immediately manifest as symmetries, known as “natural zeros”. A second set of talks considered how, going beyond quantum field theory, the UV and IR can be unexpectedly correlated, especially in theories containing quantum gravity, and how this correlation can lead to cancellations that are not apparent from a purely quantum-field-theory perspective.

The second day was far-ranging. The first talk unveiled some lower-dimensional theories of the sort one more readily finds in condensed-matter systems, in which “topological” effects lead to constraints on IR parameters. A second discussed how fundamental properties, such as causality, can unexpectedly impose constraints on IR parameters. The last demonstrated how gravitational effective theories, including those describing the gravitational waves emitted in binary black-hole inspirals, have naturalness puzzles of their own.

The ultimate goal is to now go forth and find new angles of attack on the biggest naturalness questions in fundamental physics

Midweek, alongside an inspirational theory colloquium by Nathaniel Craig (UC Santa Barbara), the potential role of cosmology in naturalness was interrogated. An early example made famous by Steven Weinberg concerns the role of the “anthropic principle” in the presently measured value of the cosmological constant. However, since then, particularly in recent years, theorists have found many possible connections and mechanisms linking naturalness questions to our universe and beyond.

The fourth day focussed on the emerging world of generalised and higher-form symmetries, which are new tools in the arsenal of the quantum field theorist. Speakers discussed how small IR parameters that look unnatural from a traditional symmetry perspective may in fact arise naturally as a consequence of these recently uncovered symmetries. The final day studied connections between string theory, the swampland and naturalness, exploring how the space of theories consistent with string theory leads to restricted values of IR parameters, with potential links to naturalness. An eloquent summary was delivered by Tim Cohen (CERN).

Grand slam

In some sense the goal of the workshop was to push back the boundaries by equipping model builders with new and more powerful perspectives and theoretical tools linked to questions of naturalness, broadly defined. The workshop was a grand slam in this respect. However, the ultimate goal is to now go forth and use these new tools to find new angles of attack on the biggest naturalness questions in fundamental physics, relating to the cosmological constant and the Higgs mass.

The Standard Model, despite being an eminently marketable logo for mugs and t-shirts, is incomplete. It breaks down at very short distances and is thus the IR of some more complete, more explanatory UV theory. We don’t know what this UV theory is; however, it apparently makes unnatural predictions for the Higgs mass and the cosmological constant. Perhaps nature isn’t unnatural and generalised symmetries are as-yet hidden from our eyes, or perhaps string theory, quantum gravity or cosmology has a hand in things? It is also possible, of course, that nature has fine-tuned these parameters by accident; however, that would seem – à la Weinberg – to point towards a framework in which such parameters are, in principle, measured in many different universes. All of these possibilities, and more, were discussed and explored to varying degrees.

Perhaps the most radical possibility, the most “exotic approach to naturalness” of all, would be to give up on naturalness altogether. Perhaps, in whatever framework UV completes the Standard Model, parameters such as the Higgs mass are simply incalculable, unpredictable in terms of more fundamental parameters, at any length scale. Shortly before the advent of relativity, quantum mechanics, and all that have followed from them, Lord Kelvin (attribution contested) once declared: “There is nothing new to be discovered in physics now. All that remains is more and more precise measurement”. The breadth of original ideas presented at the “Exotic Approaches to Naturalness” workshop, and the new connections constantly being made between formal theory, cosmology and particle phenomenology, suggest it would be similarly unwise now, as it was then, to make such a wager.
