Calculating the curiosity windfall

Recent decades have seen an emphasis on the market and social value of fundamental science. Increasingly, researchers must demonstrate the benefits of their work beyond the generation of pure scientific knowledge and the cultural value of peaceful and open international collaboration.

This timely collection of short essays by leading scientific managers and policymakers, which emerged from a workshop held during Future Circular Collider (FCC) Week 2019, brings the interconnectedness of fundamental science and economics into focus. Its 18 contributions range from procurement to knowledge transfer, and from global-impact assessments to case studies from CERN, SKA, the ESS and ESA, with a foreword by former CERN Director-General Rolf Heuer. As such, it constitutes an important contribution to the literature and a guide for future projects such as a post-LHC collider.

As the number and size of research infrastructures (RIs) have grown over the years, writes CERN’s head of industry, procurement and knowledge transfer, Thierry Lagrange, the will to push the frontier of knowledge has required significant additional public spending linked to the development and upgrade of high-tech instruments, and increased maintenance costs. The socioeconomic returns to society are clear, he says. But these benefits are not generated automatically: they require a thriving ecosystem that transfers knowledge and technologies to society, aided by entities such as CERN’s knowledge transfer group and business incubation centres.

Multi-billion public investments in RIs are justified given their crucial and multifaceted role in society, asserts EIROforum liaison officer at the European Commission, Margarida Ribeiro. She argues that new RIs need to be closely integrated into the European landscape, with plans put in place for international governance structures, adequate long-term funding, closer engagement with industry, and methodologies for assessing RI impact. All contributors acknowledge the importance of this latter point. While physicists would no doubt prefer to go back to the pre-Cold War days of doing science for science’s sake, argues ESS director John Womersley, without the ability to articulate the socioeconomic justifications of fundamental science as a driver of prosperity, jobs, innovation, startups and as solutions to challenges such as climate change and the environment, it is only going to become more difficult for projects to get funding.

A future collider is a case in point. Johannes Gutleber of CERN and the FCC study describes several recent studies seeking to quantify the socioeconomic value of the LHC and its proposed successor, the FCC, with training and industrial innovation emerging as the most important generators of impact. The rising interest in the type of RI benefits that emerge and how they can be maximised and redistributed to society, he writes, is giving rise to a new field of interdisciplinary research, bringing together economists, social scientists, historians and philosophers of science, and policymakers.

Nowhere is this better illustrated than in the ongoing programme led by economists at the University of Milan, described in two chapters by Massimo Florio and Andrea Bastianin. A recent social cost–benefit analysis of the HL-LHC, for example, conservatively estimates that every €1 of costs returns €1.2 to society, while a similar study concerning the FCC estimates the benefit/cost ratio to be even higher, at 1.8. Florio argues that CERN and big science more generally are ideal testing grounds for theoretical and empirical economic models, while demonstrating the positive net impact that large colliders have for society. His 2019 book Investing in Science: Social Cost-Benefit Analysis of Research Infrastructures (MIT Press) explores this point in depth (CERN Courier September 2018 p51), and is another must-read in this growing interdisciplinary area. Completing the series of essays on impact evaluation, Philip Amison of the UK’s Science and Technology Facilities Council reviews the findings of a report published last year capturing the benefits of CERN membership.
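The ratios quoted above come from social cost–benefit analysis: projected streams of costs and benefits are discounted to present value and compared. A toy version of the arithmetic, with invented cash flows and an assumed discount rate, might look like this:

```python
def present_value(flows, rate=0.03):
    """Discount a stream of annual cash flows to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Invented illustration (not the Milan group's data): costs are front-loaded,
# while benefits (training, industrial spillovers, knowledge outputs)
# accrue later. A benefit/cost ratio above 1 indicates a net social gain.
costs = [100, 80, 60, 40, 20]
benefits = [10, 40, 80, 110, 130]
ratio = present_value(benefits) / present_value(costs)
print(round(ratio, 2))
```

A real analysis, such as the HL-LHC study cited here, models each benefit channel and its uncertainty explicitly rather than using point estimates.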

The final part of the volume focuses on the question “Who benefits from such large public investments in science?”, and addresses the contribution of big science to social justice and inequalities. Carsten Welsch of the University of Liverpool/Cockcroft Institute argues that fundamental science should not be considered as a distant activity, illustrating the point convincingly via the approximately 50,000 particle accelerators currently used in industry, medical treatments and research worldwide.

The grand ideas and open questions in particle physics and cosmology already inspire many young people to enter STEM subjects, while technological spin-offs such as medical treatments, big-data handling, and radio-frequency technology are also often communicated. Less well known are the significant but harder-to-quantify economic benefits of big science. This volume is therefore essential reading, not just for government ministers and policymakers, but for physicists and others working in curiosity-driven research who need to convey the immense benefits of their work beyond pure knowledge.

Fermilab strengthens muon g-2 anomaly

Hotly anticipated results from the first run of the muon g-2 experiment at Fermilab were announced today, increasing the tension between measurements and theoretical calculations. The last time this ultra-precise measurement was performed, in a sequence of results at Brookhaven National Laboratory in the late 1990s and early 2000s, it disagreed with the Standard Model (SM) by 3.7σ. After almost eight years of work rebuilding the Brookhaven experiment at Fermilab and analysing its first data, the muon’s anomalous magnetic moment has been measured to be 116 592 040(54) × 10⁻¹¹. The result is in agreement with the Brookhaven measurement and is 3.3σ greater than the SM prediction: 116 591 810(43) × 10⁻¹¹. Combined with the Brookhaven result, the world-average value for the anomalous magnetic moment of the muon is 116 592 061(41) × 10⁻¹¹, representing a 4.2σ departure from the SM.
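The quoted tensions can be checked with a quick error-propagation estimate, under the simplifying assumption that the experimental and theoretical uncertainties are Gaussian and uncorrelated:

```python
import math

def tension(measured, m_err, predicted, p_err):
    """Gaussian tension in standard deviations, assuming uncorrelated errors."""
    return abs(measured - predicted) / math.hypot(m_err, p_err)

# Values from the article, in units of 1e-11
SM = (116_591_810, 43)        # Standard Model prediction
fermilab = (116_592_040, 54)  # Fermilab Run-1 result
world = (116_592_061, 41)     # Fermilab + Brookhaven world average

print(round(tension(*fermilab, *SM), 1))  # → 3.3
print(round(tension(*world, *SM), 1))     # → 4.2
```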

“Today is an extraordinary day, long awaited not only by us but by the whole international physics community,” says Graziano Venanzoni of the INFN, who is co-spokesperson of the Fermilab muon g-2 collaboration. “A large amount of credit goes to our young researchers who, with their talent, ideas and enthusiasm, have allowed us to achieve this incredible result.”

The Fermilab result was unblinded during a Zoom meeting on 25 February in the presence of around 200 collaborators from around the world. “We were all very excited to finally know our result and the meeting was very emotional,” says Venanzoni. The analysis took almost three years from data taking to the release of the result and the collaboration decided to unblind only when all the steps of the analysis were completed and there were no outstanding questions. Venanzoni adds that no further analysis was completed after the unblinding and the results are unchanged.

The previous Brookhaven measurement left physicists pondering whether the presence of unknown particles in loops could be affecting the muon’s behaviour. It was clear that further measurements were needed, but it turned out to be much cheaper to move the apparatus to Fermilab than to build a new, more precise experiment at Brookhaven. So in the summer of 2013, the experiment’s 14 m-diameter, 1.45 T superconducting magnet was transported from Long Island to the suburbs of Chicago. The Fermilab team reassembled the magnet and spent a year “shimming” its field, making it three times more uniform than was achieved at Brookhaven. Along with a new beamline to deliver a purer muon beam, Fermilab’s muon g-2 reincarnation required entirely new instrumentation, detectors and a control room.

When a muon travels through the strong external magnetic field of a storage ring, the direction of its magnetic moment precesses at a rate governed by the field strength and the muon’s g-factor. The Dirac equation predicts that all point-like fermions have a g-factor equal to two, but higher-order loops add an “anomalous” moment, aμ = (g−2)/2, which can be calculated extremely precisely. At Fermilab, muons with an energy of about 3.1 GeV are vertically focused in the storage ring via quadrupoles, and their precession frequency is determined from decays to electrons using 24 electromagnetic calorimeters located along the ring’s inner circumference. The intense polarised muon beam suppresses the pion contamination that challenged the Brookhaven measurement, while new calibration systems and simulations allow better control of systematic uncertainties.
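As a sketch of the scale involved, the anomalous precession frequency – the rate at which the spin turns relative to the momentum – can be estimated as ωa = aμeB/mμ, an expression that holds at the experiment’s “magic” momentum, where electric-field focusing effects cancel:

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge, C
M_MU = 1.883531627e-28      # muon mass, kg
A_MU = 0.00116592           # anomalous moment a_mu = (g-2)/2, from the article
B = 1.45                    # storage-ring field, T (from the article)

# Anomalous precession: spin rotates relative to momentum at
# omega_a = a_mu * e * B / m_mu (valid at the "magic" momentum).
omega_a = A_MU * E_CHARGE * B / M_MU   # rad/s
f_a = omega_a / (2 * math.pi)          # Hz
print(f"{f_a / 1e3:.0f} kHz")          # → 229 kHz
```

It is this roughly 229 kHz frequency, extracted from the arrival times and energies of the decay electrons, that the calorimeters measure.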

The Fermilab muon g-2 collaboration took its first dataset in 2018, with over eight billion muon decays resulting in an overall uncertainty approximately 15% better than Brookhaven’s. Data analysis on the second and third runs is already under way, while a fourth run is ongoing and a fifth is planned. The collaboration is targeting a final precision of around 0.14 ppm – four times better than that of the previous measurement.

“After the 20 years that have passed since the Brookhaven experiment ended, it is so gratifying to finally be resolving this mystery,” said Fermilab’s Chris Polly, a co-spokesperson for the current experiment and a graduate student on the Brookhaven experiment. “So far we have analysed less than 6% of the data that the experiment will eventually collect. Although these first results are telling us that there is an intriguing difference with the Standard Model, we will learn much more in the next couple of years.”

Theory baseline
Developments in the theory community are equally vital. The Fermilab muon g-2 collaboration takes as its theory baseline the value for aμ obtained last year by the Muon g-2 Theory Initiative. Uncertainties in the calculation are dominated by hadronic contributions, in particular a term called the hadronic vacuum polarisation (HVP). The Theory Initiative incorporates the HVP value obtained by well-established “dispersive methods”, which combine fundamental properties of quantum field theory with experimental measurements of low-energy hadronic processes. An alternative approach gaining traction is to calculate the HVP contribution using lattice QCD. In a paper published in Nature today, one group reports lattice calculations of HVP which, if included in the theory result, would significantly reduce the discrepancy between the experimental and theoretical values for aμ. The result is in 2σ tension with the value obtained from the dispersive approach, and is currently dominated by systematic uncertainties stemming from approximations used in the lattice calculations, say Muon g-2 Theory Initiative members.

“This being the first lattice result at sub-percent precision, it is premature to draw firm conclusions from this comparison,” reads a statement from the Muon g-2 Theory Initiative steering committee. “Indeed, given the complexity of the computations, independent results from different lattice groups with commensurate uncertainties are needed to test and check the lattice calculations against each other. Being entirely based on Standard Model theory, once the lattice results are well tested and precise enough, they will play an important role in understanding how new physics enters into the discrepancy.”

High-power linac shows promise for accelerator-driven reactors

Physicists at the Institute of Modern Physics (IMP) in Lanzhou, China, have achieved a significant milestone towards an accelerator-driven sub-critical system – a proposed technology for sustainable fission energy. In February, the institute’s prototype front-end linac for the China Accelerator Driven Subcritical System (C-ADS) reached its design goal with the successful commissioning of a 10 mA, 205 kW continuous-wave (CW) proton beam at an energy of 20 MeV. The result breaks the world record for a high-power CW superconducting linac, says Yuan He, director of IMP’s Linac Center: “This result consists of ten years of hard work by IMP scientists, and brings the realisation of an actual ADS facility one step closer to the world.”
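The headline figure can be sanity-checked from the definition of beam power – current times energy, so that for singly charged particles 1 mA at 1 MeV is 1 kW:

```python
def beam_power_kw(current_ma, energy_mev):
    """CW beam power for singly charged particles: P [kW] = I [mA] * E [MeV]."""
    return current_ma * energy_mev

print(beam_power_kw(10, 20))  # → 200.0 kW at the nominal 10 mA, 20 MeV
# The quoted 205 kW at 10 mA suggests the delivered kinetic energy was
# slightly above the nominal 20 MeV:
print(205 / 10)               # → 20.5 MeV
```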

The ADS concept, which was proposed by Carlo Rubbia at CERN in the late 1990s, offers a potential technology for nuclear-waste transmutation and the development of safe, sustainable nuclear power. The idea is to sustain fission reactions in a subcritical reactor core with neutrons generated by directing a high-energy proton or electron beam, which can be switched on or off at will, at a heavy-metal spallation target. Such a system could run on non-fissile thorium fuel, which is more abundant than uranium and produces less waste. The challenge is to design an accelerator with the required beam power and long-term reliability, for which a superconducting proton linac is a promising candidate.

In 2011, a team at IMP launched a programme to build a superconducting proton linac (CAFe) with an unprecedented 10 mA beam current. It was upgraded in 2018 by replacing the radio-frequency quadrupole and a cryomodule, but the team faced difficulties in reaching the design goals. Challenges including beam-loss control and detection, heavy beam loading and rapid fault recovery were finally overcome in early 2021, enabling the 38 m-long facility to achieve its design performance at the start of the Chinese new year. CAFe’s beam availability during long-term, high-power operation – 12 hours at 174 kW/10 mA and 108 hours at 126 kW/7.3 mA – was measured to be 93–96%, indicating high reliability.

The full C-ADS project is expected to be completed this decade. A similar project called MYRRHA is under way at SCK CEN in Belgium, the front-end linac for which recently entered construction. Other ADS projects are under study in Japan, India and other countries.

“CAFe is the world’s first CW superconducting proton linac stepping into the hundred-kilowatt level,” says He. “The successful operation of the 10 mA beam meets the beam-intensity requirement for an experimental ADS demo facility – a breakthrough for ADS linac development and an outstanding achievement in the accelerator field.”

ANAIS challenges DAMA dark-matter claim

ANAIS shows no modulation

Despite the strong indirect evidence for the existence of dark matter, a plethora of direct searches have not resulted in a positive detection. The sole exception is the famous series of results from the DAMA/NaI experiment at the Gran Sasso underground laboratory in Italy, first reported in the late 1990s, which shows a modulating signal compatible with the Earth moving through a region containing Weakly Interacting Massive Particles (WIMPs). These results were backed up more recently by measurements from the follow-up DAMA/LIBRA detector; combining the data in 2018, the evidence reported for a dark-matter signal is as high as 13σ.

Now, the Annual modulation with NaI Scintillators (ANAIS) collaboration, which aims to directly reproduce the DAMA results using the same detector concept, has published the results from its first three years of operations. The results, which were presented today at the Rencontres de Moriond, show a clear contradiction with DAMA, indicating that we are still no closer to finding dark matter.

The DAMA results are based on searches for an annual modulation in the interaction rate of WIMPs in a detector comprising NaI crystals. First theoretically proposed in 1986 by Andrzej Drukier, Katherine Freese and David Spergel, this modulation is a result of the difference in velocity of the Earth with respect to the dark-matter halo of the galaxy. On 2 June, the velocities of the Earth and the Sun are aligned with respect to the galaxy, whereas half a year later they are oppositely aligned, resulting in an annually modulated WIMP interaction rate in a detector placed on Earth. Although this method has advantages compared to more direct detection methods, it requires that other potential sources of such a seasonal modulation be ruled out. Despite the significant modulation with the correct phase observed by DAMA, its results were not immediately accepted as a clear signal of dark matter due to the remaining possibility of instrumental effects, seasonal background modulation or artifacts from the analysis.
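The expected signature can be sketched as a small cosine modulation on top of a constant rate, peaking on 2 June (roughly day 152 of the year); the rate and amplitude below are illustrative placeholders, not DAMA’s measured values:

```python
import math

def wimp_rate(day, r0=1.0, sm=0.02, t0=152.5, period=365.25):
    """Expected event rate vs day of year: constant term plus annual cosine.
    r0 and sm are illustrative values, not measured ones."""
    return r0 + sm * math.cos(2 * math.pi * (day - t0) / period)

peak = wimp_rate(152.5)                  # 2 June: velocities aligned
trough = wimp_rate(152.5 + 365.25 / 2)   # half a year later
print(round(peak, 3), round(trough, 3))  # peak ≈ 1.02, trough ≈ 0.98
```

An annual-modulation search then amounts to fitting this shape – with the phase t0 fixed or floated – to the time series of the residual event rate.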

Over the years the significance of the DAMA results has continued to increase while other dark-matter searches, in particular with the XENON1T and LUX experiments, found no evidence of WIMPs capable of explaining the DAMA results. The fact that only the final analysis products from DAMA have been made public has also hampered attempts to prove or disprove alternative origins of the modulation. To overcome this, the ANAIS collaboration set out to reproduce the data with an independent detector intentionally similar to DAMA’s, consisting of NaI(Tl) scintillators read out by photomultipliers, located in the Canfranc Underground Laboratory deep beneath the Pyrenees in northern Spain. This approach allows ANAIS to rule out any instrument-induced effects while producing data in a controlled way and studying it in detail with different analysis procedures.

ANAIS and DAMA signals

The first three years of ANAIS data have now been unblinded, and the results were posted on arXiv on 1 March. None of the analysis methods used show any signs of a modulation, with a statistical analysis ruling out the DAMA results at 99% confidence. The results therefore narrow down the possible causes of the modulation observed by DAMA to either differences in the detector compared to ANAIS, or in the analysis method. One specific issue raised by the ANAIS collaboration regards the background-subtraction method. In the DAMA results the mean background rate for each year is subtracted from the raw data for that full year. If the background during a year is not constant, however, this produces an artificial sawtooth shape which, with limited statistics, can be fitted with a sinusoid. This effect was already pointed out in a 2020 publication by a group from INFN, which showed how a slowly increasing background is capable of producing exactly the modulation observed by DAMA. The ANAIS collaboration describes its background in detail, shows that it is indeed not constant, and provides suggestions for a more robust handling of the background.
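The artefact can be reproduced in a few lines: subtracting each year’s mean from a slowly rising background leaves a repeating ramp – a sawtooth with one-year period – that limited statistics can confuse with a genuine annual sinusoid. The rise rate here is invented purely for illustration:

```python
# Simulate a linearly rising background sampled daily over three years,
# then subtract the per-year mean, as in the criticised procedure.
DAYS = 365
slope = 0.001  # illustrative background rise per day (arbitrary units)
background = [slope * t for t in range(3 * DAYS)]

residual = []
for year in range(3):
    chunk = background[year * DAYS:(year + 1) * DAYS]
    mean = sum(chunk) / DAYS
    residual.extend(c - mean for c in chunk)

# Each year's residual ramps from negative to positive, then resets:
# a sawtooth, not a flat line.
print(residual[0] < 0 < residual[DAYS - 1])  # → True
```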

The ANAIS results also agree with the first results published by the COSINE-100 collaboration in 2019 which, again using a NaI-based detector, found no evidence of a yearly modulation. Thanks to the continued experimental efforts of these two groups, and with the ANAIS collaboration planning to make its data public to allow independent analyses, the more than 20-year-old DAMA anomaly looks likely to be settled in the next few years.

New data strengthens RK flavour anomaly

RK 2021

The principle that the charged leptons have identical electroweak interaction strengths is a distinctive feature of the Standard Model (SM). However, this lepton-flavour universality (LFU) is an accidental symmetry in the SM, which may not hold in theories beyond the SM. The LHCb collaboration has used a number of rare decays mediated by flavour-changing neutral currents, where the SM contribution is suppressed, to test for deviations from LFU. During the past few years, these and other measurements, together with results from B-factories, hint at possible departures from the SM.

In a new measurement of an LFU-sensitive parameter, RK, with increased precision and statistical power, reported today at the Rencontres de Moriond, LHCb has strengthened the significance of the flavour anomalies. RK probes the ratio of B-meson decays to muons and electrons: RK = BR(B+→K+μ+μ−)/BR(B+→K+e+e−). Testing LFU in such b→sℓ+ℓ− transitions has the advantage that not only are SM contributions suppressed, but the theoretical predictions are very precise. Therefore, any significant deviation of RK from unity would imply physics beyond the SM.

The experimental challenge lies in the fact that, while electrons and muons interact via the electroweak force in the same way, the small electron mass means that electrons interact with detector material much more than muons do. For example, electrons radiate a significant number of bremsstrahlung photons when traversing the LHCb detector, which degrades reconstruction efficiency and signal resolution compared to muons. The key to controlling this effect is to use the decays J/ψ→e+e− and J/ψ→μ+μ−, which are known to have the same decay probability and can be used to calibrate and test electron reconstruction efficiencies. High-precision tests with the J/ψ are compatible with LFU, which provides a powerful cross-check on the experimental analysis.
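In practice this calibration enters as a double ratio, so that electron and muon reconstruction efficiencies largely cancel: the rare-mode yields are normalised to the corresponding resonant J/ψ modes. The yields below are hypothetical placeholders, not LHCb numbers, chosen only to show the cancellation:

```python
def r_k_double_ratio(n_kmumu, n_kjpsi_mumu, n_kee, n_kjpsi_ee):
    """R_K via the double ratio: each rare-mode yield is divided by the
    corresponding J/psi-mode yield, cancelling electron/muon efficiency
    differences to first order."""
    return (n_kmumu / n_kjpsi_mumu) / (n_kee / n_kjpsi_ee)

# Hypothetical yields (not LHCb's): the muon modes are reconstructed more
# efficiently, but the normalisation removes that from the ratio.
print(round(r_k_double_ratio(3300, 700_000, 1940, 350_000), 2))  # → 0.85
```

With equal yields in every slot the ratio is exactly 1, which is what LFU predicts once efficiencies cancel.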

Previous LHCb measurements of RK and RK* (which probes B0→K*0ℓ+ℓ− decays), in 2019 and 2017 respectively, provided hints of deviations from unity. The latest analysis of RK, which uses the full dataset collected by the experiment in Run 1 and Run 2 of the LHC, represents a substantial improvement in precision on the previous measurement thanks to a doubling of the dataset. The RK ratio is measured to be 0.846 +0.042/−0.039 (stat.) +0.013/−0.012 (syst.), three standard deviations from the SM prediction (see figure). This is the first time that a departure from LFU above this level has been seen in any individual B-meson decay.

Although it is too early to conclude anything definitive at this stage, this deviation is consistent with a pattern of anomalies that have manifested themselves in b→sℓ+ℓ− and similar processes over the course of the past decade. In particular, the strengthening RK anomaly may be considered alongside hints from other measurements of these transitions, including angular asymmetries and decay rates.

The LHCb experiment is well placed to clarify the potential existence of new-physics effects in these decays. Updates on a suite of b→sℓ+ℓ−-related measurements with the full Run 1 and Run 2 dataset are under way. A major upgrade to the detector during the ongoing second long shutdown of the LHC will offer a step change in precision in Run 3 and beyond.

The CERN Quantum Technology Initiative

Quantum technologies have the potential to revolutionise science and society, but are still in their infancy. In recent years, the growing importance and the potential impact of quantum technology development has been highlighted by increasing investments in R&D worldwide in both academia and industry.

Cutting-edge research in quantum systems has been performed at CERN for many years to investigate the many open questions in quantum mechanics and particle physics. Only recently, however, have the different ongoing activities in quantum computing, sensing, communications and theory been brought under a common strategy to assess their potential impact on future CERN experiments.

This webinar, presented by Alberto Di Meglio, will introduce the new CERN Quantum Technology Initiative, give an overview of the Laboratory’s R&D activities and plans in this field, and give examples of the potential impact on research. It will also touch upon the rich international network of activities and how CERN fosters research collaborations.

Alberto Di Meglio is the head of CERN openlab in the IT Department at CERN and co-ordinator of the CERN Quantum Technology Initiative. Alberto is an aerospace engineer (MEng) and electronic engineer (PhD) by education and has extensive experience in the design, development and deployment of distributed computing and data infrastructures and software services for both commercial and research applications.

He joined CERN in 1998 as data centre systems engineer. In 2004, he took part in the early stages of development of the High-Energy Physics Computing Grid. From 2010 to 2013, Alberto was project director of the European Middleware Initiative (EMI), a project responsible for developing and maintaining most of the software services powering the Worldwide LHC Computing Grid.

Since 2013, Alberto has been leading CERN openlab, a long-term initiative to organise public–private collaborative R&D projects between CERN, academia and industry in ICT, computer and data science, covering many aspects of today’s technology, from heterogeneous architecture and distributed computing to AI and quantum technologies.

Roger J N Phillips 1931–2020

R Phillips

The eminent theoretical physicist Roger Julian Noel Phillips died peacefully on 4 September 2020, aged 89, at his home in Abingdon, UK. Roger was educated at Trinity College, Cambridge, where he received his PhD in 1955. His thesis advisor was Paul Dirac. Roger transferred from the Harwell theory group to the Rutherford Appleton Laboratory (RAL) in 1962 where he led the theoretical high-energy physics group to international prominence. He also held visiting appointments at CERN, Berkeley, Madison and Riverside.

Roger was a giant in particle-physics phenomenology and his book Collider Physics (Addison-Wesley, 1987), co-authored with his longstanding collaborator Vernon Barger, remains a classic. In 1990 Roger was awarded the Rutherford Medal and Prize of the UK Institute of Physics. To experimenters, he was one of the rocks upon whom the UK high-energy physics community was built. To theorists, he was renowned for his deep understanding of particle-physics models. A career-long collaboration across the Atlantic with Barger ensued from their sharing an office at CERN in 1967. Their initial focus was the Regge-pole model to describe high-energy scattering of hadrons. Subsequently they inferred the momentum distribution of the light quarks and gluons from deep-inelastic scattering data and made studies to identify the charm-quark signal in a Fermilab neutrino experiment.

In 1980, Phillips and collaborators discovered the resonance in neutrino oscillations when neutrinos propagate long distances through matter. This work is the basis of the ongoing Fermilab long-baseline neutrino programme that will make precision determinations of neutrino masses and mixing. From 1983, Phillips and his collaborators developed pioneering strategies in collider physics for finding the W boson, the top quark and the Higgs boson, and for searches for physics beyond the Standard Model. In an influential 1990 publication, Phillips, Hewett and Barger showed that the decay of a b-quark to an s-quark and a photon is a highly sensitive probe of a charged Higgs boson through its one-loop virtual contribution.

After retiring in 1997, Roger maintained an active interest in particle physics. He struggled with Parkinson’s disease in recent years but continued to live with determination, wit and cheer. He joked that his Parkinson’s tremor made his mouse and keyboard run wild: “I know that an infinite number of random monkeys can eventually write Shakespeare, but I can’t wait that long!” One of his very last whispers to his son David was: “There are symmetries in mathematics which are like aspects of dreaming”. The great things he did with his brain in life will continue after it: he donated it to the Parkinson’s UK Brain Bank.

Roger was highly respected for his intellectual brilliance, physics leadership and immense integrity, but also for his modesty and generosity in going out of his way to help others. He was a delight to work with and an inspiration to all who knew him. He is missed by his many friends around the world.

Odderon discovered

The TOTEM collaboration at the LHC, together with the DØ collaboration at the former Tevatron collider at Fermilab, has announced the discovery of the odderon – an elusive three-gluon state predicted almost 50 years ago. The result was presented in a “discovery talk” on Friday 5 March during the LHC Forward Physics meeting at CERN, and follows the joint publication of a CERN/Fermilab preprint by TOTEM and DØ reporting the observation in December 2020.

“This result probes the deepest features of quantum chromodynamics, notably that gluons interact between themselves and that an odd number of gluons are able to be ‘colourless’, thus shielding the strong interaction,” says TOTEM spokesperson Simone Giani of CERN. “A notable feature of this work is that the results are produced by joining the LHC and Tevatron data at different energies.”

States comprising two, three or more gluons are usually called “glueballs”, and are peculiar objects made only of the carriers of the strong force. The advent of quantum chromodynamics (QCD) led theorists to predict the existence of the odderon in 1973. Proving its existence has been a major experimental challenge, however, requiring detailed measurements of protons as they glance off one another in high-energy collisions.

While most high-energy collisions cause protons to break into their constituent quarks and gluons, roughly 25% are elastic collisions where the protons remain intact but emerge on slightly different paths (deviating by around a millimetre over a distance of 200 m at the LHC). TOTEM measures these small deviations in proton–proton (pp) scattering using two detectors located 220 m on either side of the CMS experiment, while DØ employed a similar setup at the Tevatron proton–antiproton (pp̄) collider.

Pomerons and odderons

At low energies, differences in pp vs pp̄ scattering are due to the exchange of different virtual mesons. At multi-TeV energies, on the other hand, proton interactions are expected to be mediated purely by gluons. In particular, elastic scattering at low-momentum transfer and high energies has long been explained by the exchange of a pomeron – a colour-neutral virtual glueball made up of an even number of gluons.

However, in 2018 TOTEM reported measurements at high energies that could not easily be explained by this traditional picture. Instead, a further QCD object seemed to be at play, supporting models in which a three-gluon compound, or one containing higher odd numbers of gluons, was being exchanged. The discrepancy came to light via measurements of a parameter called ρ, which represents the ratio of the real and imaginary parts of the forward elastic-scattering amplitude when there is minimal gluon exchange between the colliding protons and thus almost no deviation in their trajectories. The results were sufficient to claim evidence for the odderon, although not yet its definitive observation.

The DØ experiment

The new work is based on a model-independent analysis of data at medium-range momentum transfer. The TOTEM and DØ teams compared LHC pp data (recorded at collision energies of 2.76, 7, 8 and 13 TeV and extrapolated to 1.96 TeV) with Tevatron pp̄ data measured at 1.96 TeV. The odderon would be expected to contribute with different signs to pp and pp̄ scattering. Supporting this picture, the two data sets disagree at the 3.4σ level, providing evidence for the t-channel exchange of a colourless, C-odd gluonic compound.

“When combined with the ρ and total cross-section result at 13 TeV, the significance is in the range 5.2–5.7σ and thus constitutes the first experimental observation of the odderon,” said Christophe Royon of the University of Kansas, who presented the results on behalf of DØ and TOTEM last week. “This is a major discovery by CERN/Fermilab.”

In addition to the new TOTEM-DØ model-independent study, several theoretical papers based on data from the ISR, SPS, Tevatron and LHC, and model-dependent inputs, provide additional evidence supporting the conclusion that the odderon exists.

Precision leap for Bs0 fragmentation and decay

How likely is it for a b quark to partner with an s quark rather than a light d or u quark? This question is key to understanding fragmentation and decay following the production of a b quark in proton–proton collisions. In addition, the number of Bs0 mesons produced, each formed by a pair of b and s quarks, is required for measuring its decay probabilities, most notably to final states that are sensitive to physics beyond the Standard Model, such as the Bs0 → μ+μ− decay.

Figure 1

The ratio fs/fd – the ratio of the fragmentation fractions of a b quark into a Bs0 or a B0 meson – is thus a key parameter at the LHC. So far it has been measured with limited precision and has been the dominant systematic uncertainty for most Bs0 branching fractions. Now, however, the LHCb collaboration has, in a recent publication, combined five different analyses that carry information on this parameter. The fs/fd ratio was measured in previous publications through semileptonic decays, hadronic decays with D mesons and hadronic decays with J/ψ mesons in the final state. Some of these measurements are only sensitive to the product of the fragmentation fraction and the branching fractions. The new work analyses these results simultaneously, obtaining a precise measurement of fs/fd as well as branching-fraction measurements of two important decays, Bs0 → Ds−π+ and Bs0 → J/ψ φ. These are golden channels for mixing and CP-violation measurements in the Bs0 sector.
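Schematically, fs/fd is extracted from the relative yields of Bs0 and B0 decays to chosen final states; a textbook sketch of the relation (not LHCb's full likelihood, and folding all efficiency corrections into a single ratio) reads:

$$\frac{N(B^0_s \to f_s)}{N(B^0 \to f_d)} = \frac{f_s}{f_d}\,\frac{\mathcal{B}(B^0_s \to f_s)}{\mathcal{B}(B^0 \to f_d)}\,\frac{\epsilon_{f_s}}{\epsilon_{f_d}}$$

This makes explicit why some measurements constrain only the product of fragmentation fraction and branching fraction: the two factors cannot be disentangled from a single yield ratio.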

Precision leap

The results reduce the uncertainty on fs/fd by roughly a factor of two for collisions at 7 TeV, and a factor of 1.5 for collisions at 13 TeV, yielding a precision of about 3%. They also confirm the dependence of fs/fd on the transverse momentum of the Bs0 meson, and indicate a slight dependence on the centre-of-mass energy of proton–proton collisions (figure 1). The results are used in this work to update the previous branching-fraction measurements of about 50 different Bs0 decay channels, significantly improving their precision, and boosting several searches for new physics.

Tooling up to hunt dark matter

Bullet Cluster

The past century has seen ever stronger links forged between the physics of elementary particles and the universe at large. But the picture remains incomplete. For example, numerous observations indicate that 87% of the matter of the universe is dark, suggesting the existence of a new matter constituent. Given a plethora of dark-matter candidates, numerical tools are essential to advance our understanding. Fostering cooperation in the development of such software, the TOOLS 2020 conference attracted around 200 phenomenologists and experimental physicists for a week-long online workshop in November.

The viable mass range for dark matter spans 90 orders of magnitude, while the uncertainty about its interaction cross section with ordinary matter is even larger (see “Theoretical landscape” figure). Dark matter may be new particles belonging to theories beyond the Standard Model (BSM), an aggregate of new or SM particles, or very heavy objects such as primordial black holes (PBHs). On the latter subject, Jérémy Auffinger (IP2I Lyon) updated TOOLS 2020 delegates on codes for very light PBHs, noting that “BlackHawk” is the first open-source code for Hawking-radiation calculations.
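The physics underlying such codes starts from the Hawking temperature of a black hole, which scales inversely with its mass. A minimal sketch of that formula follows (the 10^12 kg mass is an arbitrary example value for a light PBH, not a number taken from BlackHawk):

```python
import math

# Physical constants (SI units)
HBAR = 1.054571817e-34  # reduced Planck constant, J·s
C = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3/(kg·s^2)
K_B = 1.380649e-23      # Boltzmann constant, J/K

def hawking_temperature(mass_kg: float) -> float:
    """Hawking temperature (kelvin) of a Schwarzschild black hole: T = ħc³/(8πGMk_B)."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

# A hypothetical light PBH of 1e12 kg radiates at roughly 1e11 K,
# hot enough to emit particles far beyond the photon spectrum.
print(f"T(1e12 kg) ~ {hawking_temperature(1e12):.2e} K")
```

The steep 1/M scaling is why only very light PBHs produce observable Hawking radiation today.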

Flourishing models

Weakly interacting massive particles (WIMPs) have enduring popularity as dark-matter candidates, and are amenable to search strategies ranging from colliders to astrophysical observations. In the absence of any clear detection of WIMPs at the electroweak scale, the number of models has flourished. Above the TeV scale, these include general hidden-sector models, FIMPs (feebly interacting massive particles), SIMPs (strongly interacting massive particles), super-heavy and/or composite candidates and PBHs. Below the GeV scale, besides FIMPs, candidates include the QCD axion, more generic ALPs (axion-like particles) and ultra-light bosonic candidates. ALPs received particular attention at TOOLS 2020, and are now being sought in fixed-target experiments across the globe.

For each dark-matter model, astroparticle physicists must compute the theoretical predictions and characteristic signatures of the model and confront those predictions with the experimental bounds to select the model parameter space that is consistent with observations. To this end, the past decade has seen the development of a huge variety of software – a trend mapped and encouraged by the TOOLS conference series, initiated by Fawzi Boudjema (LAPTh Annecy) in 1999, which has brought the community together every couple of years since.

Models connecting dark matter with collider experiments are becoming ever more optimised to the needs of users

Three continuously tested codes currently dominate generic BSM dark-matter model computations. Each allows for the computation of relic density from freeze-out and predictions for direct and indirect detection, often up to next-to-leading corrections. Agreement between them is kept below the per-cent level. “micrOMEGAs” is by far the most used code, and is capable of predicting observables for any generic model of WIMPs, including those with multiple dark-matter candidates. “DarkSUSY” is more oriented towards supersymmetric theories, but it can be used for generic models as the code has a very convenient modular structure. Finally, “MadDM” can compute WIMP observables for any BSM model from MeV to hundreds of TeV. As MadDM is a plugin of MadGraph, it inherits unique features such as its automatic computation of new dark-matter observables, including indirect-detection processes with an arbitrary number of final-state particles and loop-induced processes. This is essential for analysing sharp spectral features in indirect-detection gamma-ray measurements that cannot be mimicked by any known astrophysical background.
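The relic-density calculation these codes perform in full can be caricatured by the standard rule-of-thumb estimate for thermal freeze-out; the sketch below uses the familiar approximation Ωh² ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩, not the complete Boltzmann treatment implemented in micrOMEGAs, DarkSUSY or MadDM:

```python
def relic_density_estimate(sigma_v_cm3_per_s: float) -> float:
    """Rule-of-thumb freeze-out relic abundance Omega*h^2 for a
    thermally averaged annihilation cross section <sigma v>."""
    return 3e-27 / sigma_v_cm3_per_s

# The canonical "WIMP miracle" cross section of ~3e-26 cm^3/s
# lands near the observed dark-matter abundance, Omega*h^2 ~ 0.12.
print(relic_density_estimate(3e-26))  # ≈ 0.1
```

The full codes refine this estimate with temperature-dependent cross sections, co-annihilations and precise thermodynamics, which is where their per-cent-level agreement matters.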

Interaction cross sections versus mass

Both micrOMEGAs and MadDM permit the user to confront theories with recast experimental likelihoods for several direct and indirect detection experiments. Jan Heisig (UCLouvain) reported that this is a work in progress, with many more experimental data sets to be included shortly. Torsten Bringmann (University of Oslo) noted that a strength of DarkSUSY is the modelling of qualitatively different production mechanisms in the early universe. Alongside the standard freeze-out mechanism, several new scenarios can arise, such as freeze-in (FIMP models, as chemical and kinetic equilibrium cannot be achieved), dark freeze-out, reannihilation and “cannibalism”, to name just a few. Freeze-in is now supported by micrOMEGAs.
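All of these production mechanisms trace back to the Boltzmann equation for the dark-matter number density $n$, written here in its standard single-species form (with $H$ the Hubble rate and $n_{\mathrm{eq}}$ the equilibrium density):

$$\frac{\mathrm{d}n}{\mathrm{d}t} + 3Hn = -\langle\sigma v\rangle\left(n^2 - n_{\mathrm{eq}}^2\right)$$

In freeze-out, $n$ tracks $n_{\mathrm{eq}}$ until annihilations decouple; in freeze-in, the abundance starts negligible and the $n_{\mathrm{eq}}^2$ production term slowly populates the dark sector without it ever reaching equilibrium.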

Models connecting dark matter with collider experiments are becoming ever more optimised to the needs of users. For example, micrOMEGAs interfaces with SModelS, which is capable of quickly applying all possible LHC-relevant supersymmetric searches. The software also includes long-lived particles, as commonly found in FIMP models. As MadDM is embedded in MadGraph, noted Benjamin Fuks (LPTHE Paris), tools such as MadAnalysis may be used to recast CMS and ATLAS searches. Celine Degrande (UCLouvain) described another nice tool, FeynRules, which produces model files in both the MadDM and micrOMEGAs formats given the Lagrangian for the BSM model, providing a very useful automatised chain from the model directly to the dark-matter observables, high-energy predictions and comparisons with experimental results. Meanwhile, MadDump expands MadGraph’s predictions and detector simulations from the high-energy collider limits to fixed-target experiments such as NA62. To complete a vibrant landscape of development efforts, Tomas Gonzalo (Monash) presented the GAMBIT collaboration’s work to provide tools for global fits to generic dark-matter models.

A phenomenologist’s dream

Huge efforts are underway to develop a computational platform to study new directions in experimental searches for dark matter, and TOOLS 2020 showed that we are already very close to the phenomenologist’s dream for WIMPs. TOOLS 2020 wasn’t just about dark matter either – it also covered developments in Higgs and flavour physics, precision tests and general fitting, and other tools. Interested parties are welcome to join in the next TOOLS conference due to take place in Annecy in 2022.
