
Physics is about principles, not particles

Last year marked the 10th anniversary of the discovery of the Higgs particle. Ten years is a short span of time when we consider the profound implications of this discovery. Breakthroughs in science mark a leap in understanding, and their ripples may extend for decades and even centuries. Take Kirchhoff’s blackbody proposal more than 150 years ago: a theoretical construction, an academic exercise that opened the path towards a quantum revolution, the implications of which we are still trying to understand today.

Imagine now the vast network of paths opened by ideas, such as emission theory, that came to nothing despite their originality. Was pursuing these useful, or a waste of resources? Scientists would answer that the spirit of basic research is precisely to follow those paths with unknown destinations; it’s how humanity reached the level of knowledge that sustains modern life. For particle physicists, as long as the aim is to answer nature’s outstanding mysteries, the path is worth following. The Higgs-boson discovery is the latest triumph of this approach and, as for the quantum revolution, we are still working hard to make sense of it.

Particle discoveries are milestones in the history of our field, but they signify something more profound: the realisation of a new principle in nature. Naively, it may seem that the Higgs discovery marked the end of our quest to understand the TeV scale. The opposite is true. The behaviour of the Higgs boson, in the form in which it was initially proposed, does not make sense at a quantum level. As a fundamental scalar, it experiences quantum effects that grow with energy, doggedly pushing its mass towards the Planck scale. The Higgs discovery solidified the idea that gauge symmetries could be hidden, spontaneously broken by the vacuum. But it did not provide an explanation of how this mechanism makes sense with a fundamental scalar sensitive to mysterious phenomena such as quantum gravity.
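
To see the problem concretely (a standard textbook estimate, not part of the original article), consider the top-quark loop, which shifts the Higgs mass-squared by an amount that grows quadratically with the cutoff scale Λ up to which the theory is assumed to hold:

$$\delta m_H^2 \sim -\frac{3 y_t^2}{8\pi^2}\,\Lambda^2,$$

so if Λ lies anywhere near the Planck scale, the bare mass must cancel this contribution to dozens of decimal places to leave the observed 125 GeV.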

Veronica Sanz

Now comes the hard part. Most of the ideas proposed during the past decades to make sense of the Higgs boson – supersymmetry being the most prominent – predicted that it would have an entourage of companion particles with electroweak or even strong couplings. Arguments of naturalness, namely that these companions should be close by to prevent troublesome fine-tunings of nature, led to the expectation that discoveries would follow or even precede that of the Higgs. Ten years on, this wish has not been fulfilled. Instead, we are faced with a cold reality that can lead us to sway between attitudes of nihilism and hubris, especially when it comes to the question of whether particle physics has a future beyond the Higgs. Although these extremes do not apply to everyone, they are understandable reactions to viewing our field next to those with more immediate applications, or to the personal disappointment of a lifelong career devoted to ideas that were not chosen by nature.

Such despondence is not useful. Remember that the no-lose theorem we enjoyed when planning the LHC – the certainty that we would find something new, Higgs boson or not, at the TeV scale – was an exception to the rules of basic research. Currently there is no no-lose theorem for the LHC, or for any future collider. But this is precisely the inherent premise of any exploration worth doing. After the incredible success we have had, we need to refocus and unify our discourse. We face the uncertainty of searching in the dark, with the hope that we will initiate the path to a breakthrough, while aware of the small likelihood that this actually happens.

The no-lose theorem we enjoyed when planning the LHC was an exception to the rules of basic research

Those hopes are shared by wider society, which understands the importance of exploring big questions. From searching for exoplanets that may support life to understanding the human mind, few people assume these paths will lead to immediate results. The challenge for our field is to work out a coherent message that can enthuse people. Without straying far from collider physics, we can observe a different type of conversation going on in the search for dark matter. Here there is no no-lose theorem either, and despite the current exclusion of most vanilla scenarios, there is excitement and cohesion, which are effectively communicated. As for our critics, they should be openly confronted and viewed as an opportunity to build stronger arguments.

We have powerful arguments to keep delving into the smallest scales, with the unknown nature of dark matter, neutrinos and the matter–antimatter asymmetry the best-known examples. As a field, we need to renew the excitement that led us to where we are, from the shock of watching alpha particles bounce back from a thin gold sheet, to building a colossus like the LHC. We should be outspoken about our ambition to know the true face of nature and the profound ideas we explore, and embrace the new path that the Higgs discovery has opened.

Gabriella Pálla 1934–2022

Gabriella Pálla

Gabriella Pálla, who laid the foundations for the participation of Hungarian groups in CERN experiments, passed away on 11 October 2022 at the age of 88.

Gabriella entered Eötvös Loránd University in 1953 and began her career in nuclear physics in 1958 at the KFKI Research Institute for Particle and Nuclear Physics. Her first position was in the atomic physics department under the supervision of Károly Simonyi, working on fast-neutron reactions. In the 1970s she received a Humboldt Research Fellowship and worked at the cyclotron of the University of Hamburg, and later at Jülich. She received her PhD from Eötvös University in 1972 and gained a DSc in 1987 with a dissertation titled “Direct reactions and the collective properties of nuclei”.

In the 1990s Gabriella’s attention turned towards heavy-ion physics. She helped initiate the Buda-TOF project at NA49 and NA61 and later became the Hungarian ALICE representative in the early years of the experiment. She received the Academy Prize in Physics from the Hungarian Academy of Sciences in 1999 and the Simonyi Károly Award in 2010.

Statistics meets gamma-ray astronomy

As a subfield of astroparticle physics, gamma-ray astronomy investigates many questions rooted in particle physics in an astrophysical context. A prominent example is the search for self-annihilating weakly interacting massive particles (WIMPs) in the Milky Way as a signature of dark matter. Another long-standing problem is finding out where in the universe the cosmic-ray particles detected on Earth are accelerated to PeV energies and beyond.

With the imminent commissioning of the Cherenkov Telescope Array (CTA), which will comprise more than 100 telescopes located in the northern and southern hemispheres, gamma-ray astronomy is about to enter a new era. This was taken as an opportunity to discuss the statistical methods used to analyse data from Cherenkov telescopes at a dedicated PHYSTAT workshop hosted by the University of Berlin. More than 300 participants, including several statisticians, registered for PHYSTAT-Gamma from 28 to 30 September to discuss concrete statistical problems, find synergies between fields, and place the methods employed in a broader context.

Three main topics were addressed at the meeting across 13 talks and multiple discussion sessions: the statistical analysis of data from gamma-ray observatories in a multi-wavelength context, connecting statisticians and gamma-ray astronomers, and matching astrophysical sources across different wavelengths. Many concrete physical questions in gamma-ray astronomy must be answered in an astrophysical context that becomes visible only by observing across the electromagnetic spectrum. A mutual understanding of the statistical methods and systematic errors involved is therefore needed. Josh Speagle (University of Toronto) warned of a potential “datapocalypse” in view of the heterogeneity and sheer volume of astronomical data expected soon. Similarities between analyses in X-ray and gamma-ray astronomy gave hope for reducing the data heterogeneity, and further cause for optimism arose from new approaches for combining data from different observatories.

The second day of PHYSTAT-Gamma focused on building connections between statisticians and gamma-ray astronomers. Eric Feigelson (Penn State) gave an overview of astrostatistics, followed by deeper discussions of Bayesian methods in astronomy by Tom Loredo (Cornell) and techniques for fitting astrophysical models to data with bootstrap methods by Jogesh Babu (Penn State). The session concluded with an overview of statistical methods for the analysis of astronomical time series by Jeff Scargle (NASA).

The final day centred on the problem of how to match astrophysical sources across different wavelengths. CTA is expected to detect gamma rays from more than 1000 sources. Identifying the correct counterparts at other wavelengths will be essential for studying the astrophysical context of the gamma-ray emission. Applying Bayesian methods, Tamas Budavari (Johns Hopkins) discussed the current state of the problem from a statistical point of view, followed by in-depth talks and discussions among experts from X-ray, gamma-ray and radio astronomy.

Topics discussed across all sessions included the treatment of systematic errors and the formats for exchanging data between experiments. The definition of data formats in astronomy currently appears to be dominated by technical considerations. However, as Fisher famously showed with the introduction of sufficiency, statistical aspects can help to find useful representations of data and might also be considered in the definition of future data formats.
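
For readers unfamiliar with sufficiency: by the Fisher–Neyman factorisation theorem, a statistic T(x) is sufficient for a parameter θ precisely when the likelihood factorises as

$$p(x \mid \theta) = g\bigl(T(x), \theta\bigr)\, h(x),$$

so that T(x) retains all the information about θ contained in the full dataset x – exactly the kind of lossless summary a well-designed data format could aim to preserve.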

PHYSTAT-Gamma was only a first attempt to discuss statistical aspects of gamma-ray astronomy, and many topics remain. For example, the LHCf experiment at CERN will help to improve predictions of the gamma-ray flux expected from astrophysical hadron colliders and measured by gamma-ray observatories such as CTA; however, modelling uncertainties from particle physics must be treated appropriately to improve the constraints on astrophysical processes. The discussion of this and many further topics is planned for follow-up meetings.

Fundamental symmetries and interactions at PSI


The triennial workshop “Physics of Fundamental Symmetries and Interactions – PSI2022” took place for the sixth time at the Paul Scherrer Institut (PSI) in Switzerland from 17 to 22 October, bringing the worldwide fundamental-symmetries community together. More than 190 participants, including some 70 young scientists, welcomed the close communication of an in-person meeting built around 35 invited and 25 contributed talks.

A central goal of the meeting series is to deepen relations between disciplines and scientists. This year, exceptionally, participants connected with the FIPs workshop at CERN on the second day of the conference, due to the common topics discussed.

With PSI’s leading high-intensity muon and pion beams, many topics in muon physics and lepton-flavour violation were highlighted. These included rare muon decays (μ → eγ, μ → 3e), muon-to-electron conversion, muonic atoms and proton structure, and muon capture. Presentations covered complementary experimental efforts at J-PARC, Fermilab and PSI. The status of the muon g−2 measurement was reviewed from experimental and theoretical perspectives, with lattice-QCD calculations from 2021 and 2022 having intensified discussions around the tension with Standard Model expectations.

Fundamental physics using cold and ultracold neutrons was a second cornerstone of the programme. Searches for a neutron electric dipole moment (EDM) were discussed in contributions by collaborations from TRIUMF, LANL, SNS, ILL and PSI, complemented by presentations on searches for EDMs in atomic and molecular systems. Along with new results from neutron-beta-decay measurements, the puzzle of the neutron lifetime keeps the community busy, with improving “bottle” and “beam” measurements presently differing by more than 5 standard deviations. Several talks highlighted possible explanations via neutron oscillations into sterile or mirror states.

The current status of direct neutrino-mass measurements, and the outlook for pushing sensitivity down into the meV range, was covered together with updates on searches for neutrinoless double-beta decay. An overview of the hunt for the unknown at the dark-matter frontier was presented, together with new limits and plans from various searches. Ultraprecise atomic clocks were discussed, which allow checks of general relativity and the Standard Model, as well as searches beyond established theories. The final session covered the latest results from antiproton and antihydrogen experiments at CERN, demonstrating the outstanding precision achieved in CPT tests with these probes. The workshop was a great success and participants look forward to reconvening at PSI2025.

Higgs hunting in Paris


The 12th Higgs Hunting workshop, which took place in Paris and Orsay from 12 to 14 September, presented an overview of recent and new results in Higgs-boson physics. The results painted an increasingly detailed picture of Higgs-boson properties, thanks to the many analyses now reporting results based on the full LHC Run 2 dataset, with an integrated luminosity of about 140 fb⁻¹. Searches for phenomena beyond the Standard Model (BSM) were also presented.

Highlights included new results from CMS on decays of Higgs bosons to b quarks and to invisible final states, and a new limit from ATLAS on lepton-flavour violating decays of the Higgs boson. Events with two Higgs bosons in the final state were used to set limits on interactions involving three Higgs bosons and between two Higgs bosons and two weak vector bosons. All the results remain compatible with Standard Model expectations, except for a small number of intriguing tensions in some BSM searches, such as small excesses in a search for heavier partners of the Higgs boson decaying to W-boson pairs and in a search for resonances produced alongside a Z boson and decaying to a pair of Higgs bosons. These deviations from theory will be followed up by ATLAS and CMS in further analyses using Run 2 and Run 3 data.

This year’s workshop was special, as the event marked the tenth anniversary of the Higgs-boson discovery in 2012. Two historical talks given by the former ATLAS and CMS spokespersons Peter Jenni (University of Freiburg & CERN) and Jim Virdee (Imperial College) highlighted the long-term efforts that laid the foundation for the discovery.

The workshop also hosted an in-depth discussion on future accelerators and related detector R&D. It focused on future efforts in Europe, the US and Latin America, and featured presentations by Karl Jakobs (University of Freiburg and chair of the European Committee for Future Accelerators), Meenakshi Narain (Brown University and convener of the energy-frontier group of the Snowmass process), Maria-Teresa Dova (National University of La Plata and representative for the Latin American strategy effort) and Emmanuel Perez (CERN), who discussed recent improvements in physics analyses at future colliders.

Theory developments were also extensively covered, in particular progress in higher-order computations, reviewed by Michael Spira (PSI), which highlighted the agreement between experimental results and predictions. A review of recent theory progress towards future colliders was presented by Gauthier Durieux (CERN), while Carlos Wagner (Enrico Fermi Institute & Kavli Institute for Cosmological Physics) discussed the new physics that can be explored via precise measurements of Higgs-boson couplings. Finally, a “vision” presentation by Marcela Carena (Fermilab) highlighted new opportunities for the study of electroweak baryogenesis in relation to Higgs-boson measurements.

Many experimental sessions covered recent results on a wide variety of topics, some of which will be relevant to upcoming Run 3 measurements. These include measurements related to potential CP-violating effects in the Higgs sector, as well as effective field theories (EFTs). The latter allow a general description of deviations from Standard Model predictions in Higgs-boson measurements and beyond, and much improved measurements in this direction are expected in Run 3. The search for Higgs-boson pair production was also an important focus at the Paris meeting. The latest Run 2 analyses showed greatly improved sensitivity compared to earlier rounds, and further improvements are expected in Run 3. While sensitivity to the Standard Model signal is not expected until the High-Luminosity LHC, these searches should set strong constraints on BSM effects in the Higgs sector.

Concluding talks were given by Fabio Maltoni (Louvain) and Giacinto Piacquadio (Stony Brook), and the next Higgs Hunting workshop will be held in Orsay and Paris from 11 to 13 September 2023.

Back to the Swamp

Since its first revolution in the 1980s, string theory has been proposed as a framework to unify all known interactions in nature. As such, it is a perfect candidate to embed the standard models of particle physics and cosmology into a consistent theory of quantum gravity. Over the past decades, the quest to recover both models as low-energy effective field theories (EFTs) of string theory has led to many surprising results, and to the notion of a “landscape” of string solutions reproducing many key features of the universe.


Initially, the vast number of solutions led to the impression that any quantum field theory could be obtained as an EFT of string theory, hindering the predictive power of the theory. In fact, recent developments have shown that quite the opposite is true: many respectable-looking field theories become inconsistent when coupled to quantum gravity and can never be obtained as EFTs of string theory. This set is known as the “swampland” of quantum field theories. The task of the swampland programme is to determine the structure and boundaries of the swampland, and from there extract the predictive power of string theory. Over the past few years, deep connections have emerged between the swampland and a fundamental understanding of open questions in high-energy physics, ranging from the hierarchy of fundamental scales to the origin and fate of the universe.

The workshop Back to the Swamp, held at Instituto de Física Teórica UAM/CSIC in Madrid from 26 to 28 September, gathered leading experts in the field to discuss recent progress in our understanding of the swampland, as well as its implications for particle physics and cosmology. In the spirit of the two previous conferences Vistas over the Swampland and Navigating the Swampland, also hosted at IFT, the meeting featured 22 scientific talks and attracted about 100 participants.

The swampland programme has led to a series of conjectures that have sparked debate about how to connect string theory with the observed universe, especially with models of early-universe cosmology. This was reflected in several talks on the subject, ranging from new scrutiny of current proposals to obtain de Sitter vacua, which might not be consistently constructed in quantum gravity, to new candidates for quintessence models that introduce a scalar field to explain the observed accelerated expansion of the universe, and scenarios where dark matter is composed of primordial black holes. Several talks covered the implications of the programme for particle physics and quantum field theories in general. Topics included axion-based proposals to solve the strong-CP problem from the viewpoint of quantum gravity, how axion physics and approximate symmetries can link swampland ideas with experiment, and how the mathematical concept of “tameness” could describe those quantum field theories that are compatible with quantum gravity. Other contributions reported progress on the proposal to characterise large field distances and field-dependent weak couplings as emergent concepts, general bounds on supersymmetric quantum field theories from the consistency of axionic string worldsheet theories, and several proposals on how dispersive bounds and the bootstrap programme are relevant for swampland ideas. Finally, several talks covered more formal topics, such as a sharpened formulation of the distance conjecture, new tests of the tower weak-gravity conjecture, the discovery of new corners of the string-theory landscape, and arguments in favour of and against Euclidean wormholes.
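
For orientation, the distance conjecture mentioned above is commonly stated as follows (a standard formulation, not specific to the talk): as one traverses a distance Δφ in field space, a tower of states becomes exponentially light,

$$m(\Delta\phi) \sim m_0\, e^{-\lambda\,\Delta\phi/M_{\mathrm{Pl}}}, \qquad \lambda \sim \mathcal{O}(1),$$

signalling the breakdown of the effective field theory at large field excursions.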

The new results demonstrated the intense activity in the field and highlighted several current aspects of the swampland programme. It is clear that the different proposals and conjectures driving the programme have sharpened and become more interconnected. Each year the programme attracts more scientists working in different specialities of string theory, and proposals to connect the swampland with experiment take up a growing fraction of the effort.

Chasing feebly interacting particles at CERN

What is the origin of neutrino masses and oscillations? What is the nature of dark matter? What mechanism generated the matter–antimatter asymmetry? What drove the inflation of our universe, and what explains dark energy? What is the origin of the hierarchy of scales? These are outstanding questions in particle physics that still require an answer.

So far, the experimental effort has been driven by theoretical arguments that favoured the existence of new particles with relatively large couplings to the Standard Model (SM) and masses commensurate with the mass of the Higgs boson. Searching for these particles has been one of the main goals of the physics programme of the LHC. However, several beyond-the-SM theories predict the existence of light (sub-GeV) particles that interact very weakly with SM fields. Such feebly interacting particles (FIPs) can provide elegant explanations for several unresolved problems in modern physics. Furthermore, searching for them requires specific and distinct techniques, creating new experimental challenges and calling for innovative theoretical efforts.

FIPs are currently one of the most debated topics in fundamental physics, and the 2020 update of the European strategy for particle physics recommended them as a compelling field to explore in the next decade. The FIPs 2022 workshop, held at CERN from 17 to 21 October, was the second in a series dedicated to the physics of FIPs. It attracted 320 experts from collider, beam-dump and fixed-target experiments, as well as from the astroparticle, cosmology, axion and dark-matter communities, who gathered to discuss progress in experimental searches and new developments in underlying theoretical models.

The main goal of the workshop was to create a basis for a multi-disciplinary and interconnected approach. The breadth of open questions in particle physics and their deep interconnection requires a diversified research programme with different experimental approaches and techniques, together with a strong and focused theoretical involvement. In particular, FIPs 2022, which is strongly linked with the Physics Beyond Colliders initiative at CERN, aimed at shaping the FIPs programme in Europe. Topics under discussion included the impact that FIPs might have on stellar evolution, ΛCDM cosmological-model parameters, indirect dark-matter detection, neutrino physics, gravitational-wave physics and AMO (atomic, molecular and optical) physics. This is in addition to searches currently performed at colliders and extracted beam lines worldwide.

The main sessions were organised around three main themes: light dark matter in particle and astroparticle physics and cosmology; ultra-light FIPs and their connection with cosmology and astrophysics; and heavy neutral leptons and their connection with neutrino physics. In addition, young researchers in the field presented and discussed their work in the “new ideas” sessions.

FIPs 2022 aimed not only to explore new answers to the unresolved questions in fundamental physics, but also to analyse the technical challenges, infrastructure and collaborative networks required to answer them. Indeed, no single experiment or laboratory would be able by itself to cover the large parameter space in terms of masses and couplings that FIPs models suggest. Synergy and complementarity among a great variety of experimental facilities are therefore paramount, calling for deep collaboration across many laboratories and cross-fertilisation among different communities and experimental techniques. We believe that a network of interconnected laboratories can become a sustainable, flexible and efficient way of addressing particle-physics questions in the new millennium.

The next appointment for the community is the retreat/school “FIPs in the ALPs” to be held in Les Houches from 15 to 19 May 2023, to be followed by the next edition of the FIPs workshop at CERN in autumn 2024.

Remembering the W discovery

A W event recorded by UA1 in 1982

When the W and Z bosons were predicted in the mid-to-late 1960s, their masses were not known. Experimentalists therefore had no idea what energy they needed to produce them. That changed in 1973, when Gargamelle discovered neutral-current neutrino interactions and measured the cross-section ratio between neutral- and charged-current interactions. This ratio provided the first direct determination of the weak mixing angle, which, via the electroweak theory, predicted the W-boson mass to lie between 60 and 80 GeV, and the Z mass between 75 and 95 GeV – at least twice the energy of the leading accelerators of the day. 
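
The logic can be made explicit with the tree-level electroweak relations (a standard estimate, quoted here for context):

$$m_W = \left(\frac{\pi\alpha}{\sqrt{2}\,G_F}\right)^{1/2}\frac{1}{\sin\theta_W} \approx \frac{37.3\ \mathrm{GeV}}{\sin\theta_W}, \qquad m_Z = \frac{m_W}{\cos\theta_W},$$

so values of sin²θW in the broad range allowed by the early neutral-current data translate directly into the quoted mass windows.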

By then, the world’s first hadron collider – the Intersecting Storage Rings (ISR) at CERN – was working well. Kjell Johnsen proposed a new superconducting ISR in the same tunnel, capable of reaching 240 GeV. A study group was formed. Then, in 1976, Carlo Rubbia, David Cline and Peter McIntyre suggested adding an antiproton source to a conventional 400 GeV proton accelerator, either at Fermilab or at CERN, to transform it into a proton–antiproton (pp̄) collider. The problem was that the antiprotons had to be accumulated and cooled if the target luminosity (10²⁹ cm⁻²s⁻¹, providing about one Z event per day) was to be reached. Two methods were proposed: stochastic cooling by Simon van der Meer at CERN and electron cooling by Gersh Budker in Novosibirsk.
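
The quoted rate follows from simple arithmetic (the cross-section value here is an assumed round number, used only to illustrate the conversion): taking an effective cross section times branching fraction of order 0.1 nb = 10⁻³⁴ cm²,

$$N_{\mathrm{per\ day}} = \mathcal{L}\,\sigma B\,t \approx 10^{29}\ \mathrm{cm^{-2}\,s^{-1}} \times 10^{-34}\ \mathrm{cm^{2}} \times 8.6\times10^{4}\ \mathrm{s} \approx 1.$$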

CERN Director-General John Adams wasn’t too happy that, as soon as the SPS had been built, physicists wanted to convert it into a pp̄ collider. But he accepted the suggestion, and the idea of a superconducting ISR was abandoned. Following the Initial Cooling Experiment, which showed that the luminosity target was achievable with stochastic cooling, the SppS was approved in May 1978 and the construction of the Antiproton Accumulator (AA) by van der Meer and collaborators began. Around that time, the design of the UA1 experiment was also approved.

A group of us proposed a second, simpler experiment in another interaction region (UA2), but it was put on hold for financial reasons. Then, at the end of 1978, Sam Ting proposed an experiment for the same place. His idea was to surround the beam with heavy material so that everything would be absorbed except for muons, making it good at identifying Z → μ⁺μ⁻ but far from ideal for W bosons decaying to a muon and a neutrino. In a tense atmosphere, Ting’s proposal was turned down and ours was approved.

First sightings

The first low-intensity pp̄ collisions arrived in late 1981. In December 1982 the luminosity reached a sufficient level, and by the following month UA1 had recorded six W candidates and UA2 four. The background was minimal; there was nothing else we could think of that would produce such events. Carlo presented the UA1 events and Pierre Darriulat the UA2 ones at a workshop in Rome on 12–14 January 1983. On 20 January, Carlo announced the W discovery at a CERN seminar, and the next day I presented the UA2 results, confirming UA1. In UA2 we never discussed priority, because we all knew that it was Carlo who had made the whole project possible.

Luigi Di Lella

The same philosophy guided the discovery of the Z boson. UA2 had recorded a candidate Z → e⁺e⁻ event in December 1982, also presented by Pierre at the Rome workshop. One electron was perfectly clear, whereas the other had produced a shower with many tracks. I had shown the event to Jack Steinberger, who strongly suggested we publish immediately; however, we decided to wait for the first “golden” event with both electrons unambiguously identified. Then, one night in May 1983, UA1 found a Z. As with ours, only one electron satisfied all electron-identification criteria, but Carlo used the event to announce a discovery. The UA1 results (based on four Z → e⁺e⁻ events and one Z → μ⁺μ⁻) were published that July, followed by the UA2 results (based on eight Z → e⁺e⁻ events, including the 1982 one) a month later.

The SppS ran until 1990, when it became clear that Fermilab’s Tevatron was going to put us out of business. In 1984–1985 the energy was increased from 546 to 630 GeV, and in 1986 another ring was added to the AA, increasing the luminosity 10-fold. Following the 1984 Nobel prize to Rubbia and van der Meer, UA1 embarked on an ambitious new electromagnetic calorimeter that never quite worked. UA2 went on to make a precise measurement of the ratio mW/mZ, which, along with the first precise measurement of mZ at LEP, enabled us to determine the W mass with 0.5% precision and, via radiative corrections, to predict the mass of the top quark (160 +50/−60 GeV) several years before the Tevatron discovered it.
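
The sensitivity to the top quark enters through loop corrections to the W/Z mass relation; at leading order (a standard result, quoted here for context) the dominant shift is

$$\Delta\rho \simeq \frac{3\,G_F\,m_t^2}{8\sqrt{2}\,\pi^2},$$

which grows quadratically with mt, so a precise measurement of mW/mZ constrains the top mass even before the top quark is produced directly.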

Times have certainly changed since then, but the powerful interplay between theory, experiment and machine builders remains essential for progress in particle physics. 

Combining quantum with high-energy physics

From 1 to 4 November, the first International Conference on Quantum Technologies for High-Energy Physics (QT4HEP) was held at CERN. With 224 people attending in person and many more following online, the event brought together researchers from academia and industry to discuss recent developments and, in particular, to identify activities within particle physics that can benefit most from the application of quantum technologies.

Opening the event, Joachim Mnich, CERN director for research and computing, noted that CERN is widely recognised, including by its member states, as an important platform for promoting applications of quantum technologies for both particle physics and beyond. “The journey has just begun, and the road is still long,” he said, “but it is certain that deep collaboration between physicists and computing experts will be key in capitalising on the full potential of quantum technologies.”

The conference was organised by the CERN Quantum Technology Initiative (CERN QTI), which was established in 2020, and followed a successful workshop on quantum computing in 2018 that marked the beginning of a range of new investigations into quantum technologies at CERN. CERN QTI covers four main research areas: quantum theory and simulation; quantum sensing, metrology and materials; quantum computing and algorithms; and quantum communication and networks. The first day’s sessions focused on the first two of these areas. Topics covered included the quantum simulation of neutrino oscillations, scaling up atomic interferometers for the detection of dark matter, and the application of quantum traps and clocks to new-physics searches.

Building partnerships

Participants showed an interest in broadening collaborations related to particle physics. Members of the quantum theory and quantum sensing communities discussed ways to identify and promote areas of promise relevant to CERN’s scientific programme. It is clear that many detectors in particle physics can be enhanced – or even made possible – through targeted R&D in quantum technologies. This fits well with ongoing efforts to implement a chapter on quantum technologies in the European Committee for Future Accelerators’ R&D roadmap for detectors, noted Michael Doser, who coordinates the branch of CERN QTI focused on sensing, metrology and materials.

For the theory and simulation branch of CERN QTI, the speakers provided a useful overview of quantum machine learning, quantum simulations of high-energy collider events and neutrino processes, as well as efforts to make quantum-information studies of wormholes testable on a quantum processor. Elina Fuchs, who coordinates this branch of CERN QTI, explained how quantum advantages have been found for toy models of increased physical relevance. Furthermore, she said, developing a dictionary that relates interactions at high energies to lower energies will enhance knowledge about new-physics models learned from quantum-sensing experiments.

The conference demonstrated the clear potential of different quantum technologies to impact upon particle-physics research

The second day’s sessions focused on the remaining two areas, with talks on quantum machine learning, noise gates for quantum computing, the journey towards a quantum internet, and much more. These talks clearly demonstrated the importance of working in interdisciplinary, heterogeneous teams when approaching particle-physics research with quantum-computing techniques. The technical talks also showed how studies of the algorithms are becoming more robust, with a focus on trying to address problems that are as realistic as possible.

A keynote talk from Yasser Omar, president of the Portuguese Quantum Institute, presented the “fleet” of programmes on quantum technologies that has been launched since the EU Quantum Flagship was announced in 2018. In particular, he highlighted QuantERA, a network of 39 funding organisations from 31 countries; QuIC, the European Quantum Industry Consortium; EuroQCI, the European Quantum Communication Infrastructure; EuroQCS, the European Quantum Computing and Simulation Infrastructure; and the many large national quantum initiatives being launched across Europe. The goal, he said, is to make Europe autonomous in quantum technologies, while remaining open to international collaboration. He also highlighted the role of World Quantum Day – founded in 2021 and celebrated each year on 14 April – in raising awareness around the world of quantum science.

Jay Gambetta, vice president of IBM Quantum, gave a fascinating talk on the path to quantum computers that exceed the capabilities of classical computers. “Particle physics is a promising area for looking for near-term quantum advantage,” he said. “Achieving this is going to take both partnership with experts in quantum information science and particle physics, as well as access to tools that will make this possible.”

Industry and impact

The third day’s sessions – organised in collaboration with CERN’s knowledge transfer group – were primarily dedicated to industrial co-development. Many of the extreme requirements faced by quantum technologies are shared with particle physics, such as superconducting materials, ultra-high vacuum, precise timing and much more. For this reason, CERN has built up a wealth of expertise and specific technologies that can directly address challenges in the quantum industry. CERN strives to maximise the impact of its technologies and know-how, easing their transfer to industry and society. One focus is to see which technologies might help to build robust quantum-computing devices. Already, CERN’s White Rabbit technology, which provides sub-nanosecond accuracy and picosecond precision of synchronisation for the LHC accelerator chain, has found its way to the quantum community, noted Han Dols, business development and entrepreneurship section leader.

Several of the day’s talks focused on challenges around trapped ions and control systems. Other topics covered included the potential of quantum computing for drug development, measuring brain function using quantum sensors, and developing specialised instrumentation for quantum computers. Representatives of several start-up companies, as well as of established technology leaders including Intel, Atos and Roche, spoke during the day. The end of the third day was dedicated to crucial education, training and outreach initiatives. Google provided financial support for 11 students to attend the conference, and many students and researchers presented posters.

Marieke Hood, executive director for corporate affairs at the Geneva Science and Diplomacy Anticipator (GESDA) foundation, also gave a timely presentation about the recently announced Open Quantum Institute (OQI). CERN is part of a coalition of science and industry partners proposing the creation of this institute, which will work to ensure that emerging quantum technologies tackle key societal challenges. It was launched at the 2022 GESDA Summit in October, during which CERN Director-General Fabiola Gianotti highlighted the potential of quantum technologies to help achieve key UN Sustainable Development Goals. “The OQI acts at the interface of science and diplomacy,” said Hood. “We’re proud to count CERN as a key partner for OQI; its experience of multinational collaboration will be most useful to help us achieve these ambitions.”

The final day of the conference was dedicated to hands-on workshops with three different quantum-computing providers. In parallel, a two-day meeting of the “Quantum Computing 4HEP” working group, organised by CERN, DESY and the IBM Quantum Network, took place.

Qubit by qubit

Overall, the QT4HEP conference demonstrated the clear potential of different quantum technologies to impact upon particle-physics research. Some of these technologies are here today, while others are still a long way off. Targeted collaboration across disciplines and the academia–industry interface will help ensure that CERN’s research community is ready to make the most of these technologies.

“Widespread quantum computing may not be here yet, but events like this one provide a vital platform for assessing the opportunities this breakthrough technology could deliver for science,” said Enrica Porcari, head of the CERN IT department. “Through this event and the CERN QTI, we are building on CERN’s tradition of bringing communities together for open discussion, exploration, co-design and co-development of new technologies.”

Playing in the sandbox of geometry

Maryna Viazovska

When did you first know you had a passion for pure mathematics? 

I have had a passion for mathematics since my first year in school. At that time I did not realise what “pure mathematics” was, but maths was my favourite subject from a very early age.

What is number theory, in terms that a humble particle physicist can understand?

In fact, “number theory” is not well defined, and any interesting question about numbers, geometric shapes or functions can be seen as a question for a number theorist.

What motivated you to work on sphere-packing? 

I think it is a beautiful problem, something that can be easily explained. Physicists know what a Euclidean space and a sphere are, and everybody knows the problem from stacking oranges or apples. What is a bit harder to explain is that mathematicians are not trying to model a particular physical situation. Mathematicians are not bound to phenomena in nature to justify their work; they just do it. We do not need to model any physical situation, which is a luxury. The work could have an accidental application, but this is not the primary goal. Physicists, especially theorists, are used to working in multi-dimensional spaces. At the same time, these dimensions have a special interpretation in physics.

What fascinates you most about working on theoretical rather than applied mathematics?

My motivation often comes out of curiosity and my belief that the solutions to the problems will become useful at some point in the future. But it is not my job to judge or to define the usefulness. My belief is that the fundamental questions must be answered, so that other people can use this knowledge later. It is important to understand the phenomena in mathematics, and in science in general, and there is always the possibility of discovering something that other people have not yet found. Perhaps that could even lead to new ideas for instruments such as detectors. When I look at physics detectors, for example, it fascinates me how complex these machines are and how many tiny technical solutions must be invented to make it all work.

How did you go about cracking the sphere-packing problem?

I think there was an element of luck in finding the correct idea to solve this problem, because many people had worked on it before. I was fortunate to find the right solution. The initial problem came from geometry, but the final solution came from Fourier analysis, via a method called linear programming.

I think a mathematical reality exists on its own and sometimes it does describe actual physical phenomena

In 2003, mathematicians Henry Cohn and Noam Elkies applied the linear programming method to the sphere-packing problem and numerically obtained a nearly optimal upper bound in dimensions 8 and 24. Their method relied on constructing an auxiliary, “magic”, function. They computed this function numerically but could not find an explicit formula for it. My contribution was to find the explicit formula for the magic function.
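
For context, the Cohn–Elkies bound can be stated as follows (a standard formulation, paraphrased here): if a well-behaved function f on ℝᵈ satisfies f(x) ≤ 0 for |x| ≥ r, while its Fourier transform satisfies f̂(t) ≥ 0 everywhere, with f(0), f̂(0) > 0, then the density of any sphere packing in ℝᵈ obeys

$$\Delta_d \;\le\; \mathrm{vol}\!\left(B^{d}_{r/2}\right)\,\frac{f(0)}{\hat f(0)}.$$

The “magic” functions are those for which this bound is saturated by the known optimal packings in dimensions 8 and 24.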

What applications does your work have, for example in quantum gravity? 

After I solved the sphere-packing problem in dimension 8 in 2016, CERN physicists worked on the relation between two-dimensional conformal field theory and quantum gravity. From what I understand, conformal field theories are mathematically totally different from sphere-packing problems. However, if one wants to optimise certain parameters in the conformal field theory, physicists use a method called “bootstrap”, which is similar to the linear programming that I used. The magic functions I used to solve the sphere-packing problem were independently rediscovered by Thomas Hartman, Dalimil Mazác and Leonardo Rastelli.

Are there applications beyond physics?

One of the founders of modern computer science, Claude Shannon, realised that sphere-packing problems are not only interesting geometric problems that pure mathematicians like me can play with, but also a good model for error-correcting codes, which is why higher-dimensional sphere-packing problems became interesting for mathematicians. A very simplified version of the original model could be the following. An error is introduced during the transmission of a message. Assuming the error is under control, the corrupted message is still close to the original message. The remedy is to select designated versions of the messages, called codewords, which are close to the original messages but at the same time far away from each other, so that they do not mix with one another. In geometric language, this situation is an exact analogue of sphere packing, where each codeword represents the centre of a sphere and the sphere around the centre represents the cloud of possible errors. The spheres will not intersect if their centres are far enough away from each other, which allows us to decode the corrupted message.
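
A minimal sketch of this picture in code (illustrative only; the codewords here are hypothetical, chosen for the example rather than taken from the interview): each codeword is the centre of a Hamming “sphere”, and nearest-codeword decoding succeeds as long as the transmission error keeps the received word inside the sphere around the codeword that was sent.

# Toy illustration of Shannon's sphere-packing picture of error correction.

def hamming(u, v):
    """Hamming distance: the number of coordinates where u and v differ."""
    return sum(a != b for a, b in zip(u, v))

# Three hypothetical 8-bit codewords with minimum pairwise distance 5, so
# "spheres" of radius 2 around them do not overlap and any error touching
# at most 2 bits is correctable.
CODEWORDS = [
    (0, 0, 0, 0, 0, 0, 0, 0),
    (1, 1, 1, 1, 1, 0, 0, 0),
    (0, 0, 0, 1, 1, 1, 1, 1),
]

def decode(received):
    """Nearest-codeword decoding: return the centre of the sphere that
    contains the received word."""
    return min(CODEWORDS, key=lambda c: hamming(c, received))

sent = CODEWORDS[1]
error = (0, 1, 0, 0, 0, 0, 1, 0)   # a 2-bit transmission error
received = tuple(b ^ e for b, e in zip(sent, error))
assert decode(received) == sent    # the corrupted word is decoded correctly

Because the minimum distance between the codewords is 5, the radius-2 error clouds around distinct codewords cannot intersect, which is exactly the non-overlap condition for spheres in the packing analogy.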

Do you view mathematics as a tool, or a deeper property of reality?

Maybe it is a bit idealistic, but I think a mathematical reality exists on its own; sometimes it does describe actual physical phenomena, but it still deserves our attention even if it does not. In our mathematical world, we have chances to realise that something from this abstract world is connected to other fields, such as physics, biology or computer science. Here I think it’s good to know that the laws of this abstract world often provide us with useful gadgets, which can be used later to describe other realities. This whole process is a kind of “spiral of knowledge”, and we are in one of its turns.
