Supersymmetry (SUSY) introduces a new fermion–boson symmetry that gives rise to supersymmetric “partners” of the Standard Model (SM) particles, and “naturally” leads to a light Higgs boson with mass close to that of the W and Z. SUSY partners that are particularly relevant in these “natural SUSY” scenarios are the top and bottom squarks, as well as the SUSY partners of the weak SM bosons, the neutralinos and charginos.
Despite the theory’s many appealing features, searches for SUSY at the LHC and elsewhere have so far yielded only exclusion limits. With LHC Run 2 completed at the end of 2018, the ATLAS experiment has recorded 139 fb⁻¹ of physics-quality proton–proton collisions at a centre-of-mass energy of 13 TeV. Three recent ATLAS SUSY searches highlight the significant increase in sensitivity offered by this dataset.
The first search took advantage of refinements in b-tagging to search for light bottom squarks decaying into bottom quarks, Higgs bosons and the lightest SUSY partner, which is assumed to be invisible and stable (a candidate for dark matter). The data agree with the SM and lead to significantly improved constraints, with bottom squark masses now excluded up to 1.5 TeV.
If the accessible SUSY particles can only be produced via electroweak processes, the resulting low production cross-sections present a challenge. The second search focuses on such electroweak SUSY signatures with two charged leptons and a significant amount of missing momentum carried away by a pair of the lightest SUSY partners. The current search places strong constraints on SUSY models with light charginos and more than doubles the sensitivity of the previous analysis (figure 1).
A third recent analysis considered less conventional signatures. Top squarks – the bosonic SUSY partners of the top quark – may evade detection if they have a long lifetime and decay at macroscopic distances from the collision point. This search looked for SUSY particles decaying to a quark and a muon, targeting primarily long-lived top squarks that decay several millimetres inside the detector volume. The observed results are consistent with the background-only expectation.
These analyses represent just the beginning of a large programme of SUSY searches using the entirety of the Run-2 dataset. With a rich signature space left to explore, there remains plenty of room for discovery in mining the riches from the LHC.
Ever since the 1970s, when the third generation of quarks and leptons began to emerge experimentally, physicists have asked if further generations await discovery. One of the first key results from the Large Electron–Positron Collider 30 years ago provided evidence to the contrary, showing that there are only three generations of neutrinos. The discovery of the Higgs boson in 2012 added a further wrinkle to the story: many theorists believe that the mass of the Higgs boson is unnaturally small if there are additional generations of quarks heavier than the top quark. But a loophole arises if the new heavy quarks do not interact with the Higgs field in the same way as regular quarks. The search for new heavy fourth-generation quarks – denoted T – is therefore the subject of active research at the LHC today.
CMS researchers have recently completed a search for such “vector-like” quarks using a new machine-learning method that exploits special relativity in a novel way. If the new T quarks exist, they are expected to decay to a quark and a W, Z or Higgs boson. As top quarks and W/Z/H bosons decay themselves, production of a T quark–antiquark pair could lead to dozens of different final states. While most previous searches focused on a handful of channels at most, this new analysis is able to search for 126 different possibilities at once.
The key to classifying all the various final states is the ability to identify high-energy top quarks, Higgs bosons, and W and Z bosons that decay into jets of particles recorded by the detector. In the reference frame of the CMS detector, these particles produce wide jets that all look alike, but things look very different in a frame of reference in which the initial particle (a W, Z or H boson, or a top quark) is at rest. For example, in the centre-of-mass frame of a Higgs boson, it would appear as two well-collimated back-to-back jets of particles, whereas in the reference frame of the CMS detector the jets are no longer back-to-back and may indeed be difficult to identify as separate at all. This feature, based on special relativity, tells us how to distinguish “fat” jets originating from different initial particles.
Modern machine-learning techniques were used to train a deep neural-network classification algorithm on simulations of the expected particle decays. Several dozen properties of the jets were calculated in different hypothetical reference frames and fed to the network, which classifies the original fat jets as coming from top quarks, H, W or Z bosons, b quarks, light quarks or gluons. Each event is then classified according to how many jets of each type it contains. The number of observed events in each category was then compared to the predicted background yield: an excess could indicate T-quark pair production.
CMS found no evidence for T-quark pair production in the 2016 data, and has excluded T-quark masses up to 1.4 TeV (figure 1). The collaboration is working on new ideas to improve the classification method and extend the search to higher masses using the four-times-larger 2017–2018 dataset.
The study of lead–ion collisions at the LHC is a window into the quark–gluon plasma (QGP), a hot and dense phase of deconfined quarks and gluons. An important effect in heavy-ion collisions is jet quenching – the suppression of particle production at large transverse momenta (pT) due to energy loss in the QGP. This suppression is quantified by the nuclear-modification factor RAA: the ratio of the particle-production rate in Pb–Pb collisions to that in proton–proton collisions, scaled by the number of binary nucleon–nucleon collisions. A measured nuclear-modification factor of unity would indicate the absence of final-state effects such as jet quenching.
Previous measurements of peripheral collisions revealed less suppression than seen in head-on collisions, but RAA remained significantly below unity. This observation indicates the formation of a dense and strongly interacting system – but it also poses a puzzle. In p–Pb collisions, no suppression has been observed, even though the energy densities are similar to those in peripheral Pb–Pb collisions.
The ALICE collaboration has recently put jet quenching to the test experimentally by performing a rigorous measurement of RAA in narrow centrality bins. The results (figure 1, left) show that the trend of a gradual reduction in the suppression of high-pT particle production as one moves from the most central collisions (corresponding to the 0% centrality percentile) to those with a greater impact parameter does not continue above a centrality of 75%. Instead, the data show a dramatically different behaviour: increasingly strong suppression for the most peripheral collisions. The change at 75% centrality shows that the suppression mechanism for peripheral collisions is fundamentally different from that observed in central collisions, where the suppression can be explained by parton energy loss in the QGP.
In a single Pb–Pb collision several nucleons collide. It has recently been suggested that the alignment of the colliding nucleons plays an important role: if the nucleons are aligned, a single collision produces more particles, which induces a correlation between particle production at low pT, used to determine the centrality, and at high pT, where RAA is measured. The suppression in peripheral events can be modelled with a simple PYTHIA-based model that does not implement jet-quenching effects but incorporates the biases originating from the alignment of the nucleons, yielding qualitative agreement above 75% centrality (figure 1, right).
These results demonstrate that with the correct treatment of biases from the parton–parton interactions the observed suppression in Pb–Pb collisions is consistent with results from p–Pb collisions at similar multiplicities – an important new insight into the nuclear modification factor in small systems.
The LHCb collaboration has discovered a new pentaquark particle, dubbed the Pc(4312)+, decaying to a J/ψ and a proton, with a statistical significance of 7.3 standard deviations. The LHCb data, first presented at Rencontres de Moriond in March, also confirm that the Pc(4450)+ structure previously reported by the collaboration in 2015 has now been resolved into two narrow, overlapping peaks, the Pc(4440)+ and Pc(4457)+, with a statistical significance of 5.4 standard deviations compared to the single-peak hypothesis (figure 1). Together, the results offer rich studies of the strong internal dynamics of exotic hadrons.
In the famous 1964 papers that set out the quark model, Murray Gell-Mann and George Zweig mentioned the possibility of adding a quark–antiquark pair to the minimal meson and baryon states qq̅ and qqq, thereby proposing the new configurations qqq̅q̅ and qqqqq̅. Nearly four decades later, the Belle collaboration discovered the surprisingly narrow X(3872) state with a mass very close to the D0D̅*0 threshold, hinting at a tetraquark structure (cc̅uu̅). A decade after that, Belle discovered narrow Zb0,± states just above the BB̅* and B*B̅* thresholds; this was followed by observations of Zc0,± states just above the equivalent charm thresholds by BES-III and Belle. The existence of charged Zb± and Zc± partners makes the exotic nature of these states clear: they cannot be described as charmonium (cc̅) or bottomonium (bb̅) mesons, which are always neutral, but must instead be a combination such as cc̅ud̅. There is also evidence for broad Zc± states from Belle and LHCb, such as the Zc(4430)±.
A major turning point in exotic baryon spectroscopy was achieved by LHCb in July 2015 when, based on an analysis of Run 1 data, the collaboration reported significant pentaquark structures in the J/ψ−p mass distribution in Λb0→ J/ψpK− decays. A narrow Pc(4450)+ and a broad Pc(4380)+ were reported, both with minimal quark content of cc̅uud (CERN Courier September 2015 p5).
The new results use the data collected at LHCb in Run 1 and Run 2, providing a Λb0 sample nine times larger than that used in the 2015 paper. The new data reproduce the parameters of the Pc(4450)+ and Pc(4380)+ states when analysed in the same way as before. However, the much larger dataset makes a more fine-grained analysis possible, revealing additional peaking structures in the J/ψ–p invariant-mass spectrum that were not visible before. A new narrow peak, with a width comparable to the mass resolution, is observed near 4312 MeV, just below the Σc+D̅0 threshold. The structure seen before at 4450 MeV has been resolved into two narrower peaks, at 4440 and 4457 MeV; the latter is right at the Σc+D̅*0 threshold.
These Pc states join a growing family of narrow exotic hadrons with masses near hadron–hadron thresholds. This is expected in certain models of loosely bound “molecular” states, whose structure resembles the way a proton and a neutron bind to form a deuteron. Other models, such as tightly bound pentaquarks, could also explain the Pc resonances. A more complete understanding will require further experimental and theoretical investigation.
Searching for the decay μ+ → e+γ is like looking for a needle in a haystack the size of the Great Pyramid of Giza. This simile-stretching endeavour is the task of the MEG II experiment at the Paul Scherrer Institute (PSI) in Villigen, Switzerland. MEG II is an upgrade of the previous MEG experiment, which operated from 2008 to 2013. All experimental data so far are consistent with muon decays that conserve lepton flavour by the production of two appropriately flavoured neutrinos. Were MEG II to observe the neutrinoless decay of the muon to a positron and a photon, it would be the first evidence of flavour violation with charged leptons, and unambiguous evidence for new physics.
Lepton-flavour conservation is a mainstay of every introductory particle-physics course, yet it is merely a so-called accidental symmetry of the Standard Model (SM). Unlike gauge symmetries, it arises because only massless left-handed neutrinos are included in the model. The corresponding mass and interaction terms of the Lagrangian can therefore be simultaneously diagonalised, which means that interactions always conserve lepton flavour. This is not the case in the quark sector, and as a result quark flavour is not conserved in weak interactions. Since lepton flavour is not considered to be a fundamental symmetry, most extensions of the SM predict its violation at a level that could be observed by state-of-the-art experiments.
Indeed, an extension of the SM is already required to include the tiny neutrino masses that we infer from neutrino oscillations. In this extension, neutrino oscillations induce charged lepton-flavour-violating processes, but the branching ratio for μ+ → e+γ comes out at only ~10⁻⁵⁴, far beyond experimental reach (see “Charged lepton-flavour violation in the SM” box). A data sample of muons as large as the number of protons in the Earth would not be enough to see such an improbable decay. Charged lepton-flavour violation is therefore a clear signature of new physics with no SM backgrounds.
Finding the needle
The search requires an intense source of muons, and detectors capable of reconstructing the kinematics of the muon’s decay products with high precision. PSI offers the world’s most intense continuous muon beams, delivering up to 10⁸ muons per second. MEG II, like MEG before it, is designed to search for μ+ → e+γ by stopping positive muons in a thin target and looking for positron–photon pairs from muon decays at rest. This method exploits the two-body kinematics of the decay to discriminate signal events from the backgrounds, which are predominantly the radiative muon decay μ+ → e+νeν̅μγ and the accidental time coincidence of a positron and a photon produced in different muon decays.
In the late 1990s, when the first MEG experiment was being designed, theorists argued that the μ+ → e+γ branching ratio could be as high as 10⁻¹² to 10⁻¹⁴, based on supersymmetry arising at the TeV scale. Twenty years later, MEG has excluded branching ratios above 4.2 × 10⁻¹³ (figure 1), and supersymmetric particles remain undiscovered at the LHC. Nevertheless, since charged lepton-flavour-violating processes are sensitive to the virtual exchange of new particles, without requiring their direct production as at the LHC, they can probe new-physics models (supersymmetry, extra dimensions, leptoquarks, multi-Higgs, etc) up to mass scales of thousands of TeV. Such scales are unreachable not only at the LHC, but also at near-future accelerators.
The MEG collaboration therefore decided to upgrade the detectors with the goal of improving the sensitivity of the experiment by a factor of 10. The new experiment, which adopts the same measurement principle, is expected to start taking data at the end of 2019 (figure 2). Photons are reconstructed by a liquid-xenon (LXe) detector technology that was pioneered by the MEG collaboration, achieving an unprecedented ~2% calorimetric resolution at energies as low as 52.8 MeV – the energy of the photon in a μ+ → e+γ decay. The LXe detector provides a high-resolution measurement of the position and timing of the photon conversion, precise to a few millimetres and approximately 70 ps. The positrons are reconstructed in a magnetic spectrometer instrumented with drift chambers for tracking and scintillator bars for timing. A peculiarity of the MEG spectrometer is its non-uniform magnetic field, which falls from 1.2 T at the centre of the detector to 0.5 T at the extremities. The graded field prevents positrons from curling too many times, which avoids pile-up in the detectors, and makes positrons of the same momentum curl with the same radius independent of their emission angle, simplifying the design and operation of the tracking system.
Following a major overhaul begun in 2011, all the detectors have now been upgraded. Silicon photomultipliers custom-modified for sensitivity to the ultraviolet LXe scintillation light have replaced conventional photomultipliers on the inner face of the calorimeter. Small scintillating tiles have replaced the scintillating bars of the positron-timing detector to improve timing and reduce pile-up. The main challenge in upgrading the drift chambers was dealing with high positron rates: the need for high granularity had to be balanced against keeping the total amount of material low, which reduces multiple scattering, the rate of positrons annihilating in the material, and the resulting contributions to the coincident-photon background in the calorimeter. The solution was the use of extremely thin 40 and 50 μm silver-plated aluminium wires, 20 μm gold-plated tungsten wires, and innovative assembly techniques. All the detector resolutions were improved by a factor of around two with respect to the MEG experiment. The MEG II design also includes a new detector to veto photons coming from radiative muon decays, improved calibration tools and new trigger and data-acquisition electronics to cope with the increased number of readout channels. The improved detector performance will allow the muon beam rate to be more than doubled, from 3.3 × 10⁷ to 7 × 10⁷ muons per second.
The detectors were installed and tested in the muon beam in 2018. In 2019 a test of the whole detector will be completed, with the possibility of collecting the first physics data. The experiment is then expected to run for three years, uncovering evidence for the μ+ → e+γ decay if the branching ratio is around 10⁻¹³, or otherwise setting a limit of 6 × 10⁻¹⁴ on the branching ratio.
Charged lepton-flavour violation in the SM – a very small neutrino oscillation experiment
The presence of only massless left-handed neutrinos in the Standard Model (SM) gives rise to the accidental symmetry of lepton-flavour conservation – yet neutrino-oscillation experiments have observed neutrinos changing flavour in transit from sources as far away as the Sun and as near as a nuclear reactor. Such neutral lepton-flavour violation implies that neutrinos have tiny masses and that their flavour eigenstates are distinct from their mass eigenstates. Phases develop between the mass eigenstates as a neutrino travels, and the wavefunction becomes a mixture of the flavour eigenstates, rather than retaining the unique original flavour, as would be the case for truly massless neutrinos.
The effect on charged lepton-flavour violation is subtle and small. In most neutrino oscillation experiments, a neutrino is created in a charged-current interaction and observed in a later interaction via the creation of a charged lepton of the corresponding flavour in the detector.
μ+ → e+γ may proceed in a similar way, except that the same W boson is involved in both the creation and the absorption of the neutrino, which oscillates in between (see figure above).
In this process, the neutrino oscillation ν̅μ→ν̅e has to occur at an energy scale E ~ mW, over an extremely short distance of L ~ 1/mW. Considering only two neutrino species with masses m1 and m2, the probability for the oscillation is proportional to sin²[(m1² – m2²)L/4E]. Hence, the μ → eγ branching ratio is suppressed by the tiny factor ((m1² – m2²)/mW²)² ≲ 10⁻⁴⁹. The exact calculation, including the most recent estimates of the neutrino mixing-matrix elements, gives BR(μ → eγ) ~ 10⁻⁵⁴.
New directions
In the meantime, PSI researchers are investigating the possibility of building new beamlines with 10⁹ or even 10¹⁰ muons per second to allow experimenters to probe even smaller branching ratios. How could a future experiment cope with such high rates? Preliminary studies are investigating a system in which photons are converted into electron–positron pairs and reconstructed in a tracking device. This solution, already exploited by the MEGA experiment at Los Alamos National Laboratory, could also improve the photon resolution.
At the same time, other experiments are searching for charged lepton-flavour violation in other channels. Mu3e, also at PSI, will search for μ+ → e+e+e– decays. The Mu2e and COMET experiments, at Fermilab and J-PARC respectively, will search for muon-to-electron conversion in the field of a nucleus. These processes are complementary to μ+ → e+γ, allowing alternative scenarios to be probed. Meanwhile, collider experiments such as Belle II and LHCb are studying lepton-flavour violation in tau decays. LHCb researchers are also testing lepton universality, which holds that the weak couplings are the same for each lepton flavour (see The flavour of new physics). As theorists often stress, all these analyses are strongly complementary, both with each other and with direct searches for new particles at the LHC.
Ever since the pioneering work of Conversi, Pancini and Piccioni, muons have played a crucial role in the development of particle physics. When I I Rabi exclaimed “who ordered that?”, he surely did not imagine that 80 years later the lightest unstable elementary particle would still be a focus of cutting-edge research.
On 10 April, researchers working on the Event Horizon Telescope – a network of eight radio dishes that forms an Earth-sized interferometer – released the first direct image of a black hole. The landmark result, which shows the radiation emitted by superheated gas orbiting the event horizon of a supermassive black hole in a nearby galaxy, opens a brand-new window on these remarkable objects.
Supermassive black holes (SMBHs) are thought to occupy the centres of most galaxies, including our own, with masses up to billions of solar masses and sizes up to 10 times larger than our solar system. Discovered in the 1960s via radio and optical measurements, their origin, nature and surrounding environments remain important open questions in astrophysics. Spatially resolved images of an SMBH and the potential accretion disk around it provide vital input, but producing such images is extremely challenging.
SMBHs are relatively bright at radio wavelengths. However, since the imaging resolution achievable with a telescope scales with the wavelength (which is long in the radio range) and inversely with the telescope diameter, it is difficult to obtain useful images in the radio region. For example, producing an image with the same resolution as the optical Hubble Space Telescope would require a kilometre-wide telescope, while obtaining a resolution that would allow an SMBH to be imaged would require a telescope diameter of thousands of kilometres. One way around this is to use interferometry to turn many telescope dishes at different locations into one large telescope. Such an interferometer measures the differences in arrival time of a radio wave at different locations on Earth (induced by the difference in travel path), from which it is possible to reconstruct an image on the sky. This requires not only close coordination between many telescopes around the world, but also very precise timing, vast amounts of recorded data and enormous computing power.
Despite the considerable difficulties, the Event Horizon Telescope project used this technique to produce the first image of an SMBH using an observation time of only tens of minutes. The imaged SMBH lies at the centre of the supergiant elliptical galaxy Messier 87, located in the Virgo constellation at a distance of around 50 million light-years. Although relatively close in astronomical terms, its very large mass makes its size on the sky comparable to that of the much lighter SMBH at the centre of our galaxy. Furthermore, its accretion rate (brightness) varies only on longer time scales, making it easier to image. The resulting image (above) shows the clear shadow of the black hole in the centre, surrounded by an asymmetric ring caused by radio waves that are bent around the SMBH by its strong gravitational field. The asymmetry is likely a result of relativistic beaming from the part of the disk of matter that moves towards Earth.
The team compared the image to a range of detailed simulations in which parameters such as the black hole’s mass, spin and orientation were varied, as well as the characteristics of the matter around the SMBH – mainly hot electrons and ions – and the magnetic-field properties. While the image alone does not allow researchers to constrain many of these parameters, combining it with X-ray data taken by the Chandra and NuSTAR telescopes enables a deeper understanding. For example, the combined data constrain the SMBH mass to 6.5 billion solar masses and appear to exclude a non-spinning black hole. Whether the matter orbiting the SMBH rotates in the same direction as the black hole or opposite to it, as well as details of the surrounding environment, will require additional studies. Such studies could also potentially exclude alternative interpretations of this object; currently, exotic objects such as boson stars, gravastars and wormholes cannot be fully excluded.
The work of the Event Horizon Telescope collaboration, which involves more than 200 researchers worldwide, was published in six consecutive papers in The Astrophysical Journal Letters. While more images at shorter wavelengths are foreseen in the future, the collaboration also points out that much can be learned by combining the data with those from other wavelengths, such as gamma rays. Though this first image is groundbreaking, it is likely only the start of a revolution in our understanding of black holes and, with it, the universe.
The LHCb collaboration has released a much anticipated update on its measurement of RK – a ratio that describes how often a B+ meson decays to a charged kaon and either a μ+μ– or an e+e– pair, and therefore provides a powerful test of lepton universality. The more precise measurement, officially revealed at Rencontres de Moriond on 22 March, suggests that the intriguing current picture of flavour anomalies persists.
Since 2013, several results involving the decay of b quarks have hinted at deviations from lepton universality, a tenet of the Standard Model (SM), though none is individually significant enough to constitute evidence of new physics. LHCb has studied a number of ratios comparing b-decays to different leptons and also sees signs that something is amiss in angular distributions of B→K*μ+μ− decays. Data from BaBar and Belle add further intrigue, though with lower statistical significances.
The latest measurement from LHCb is the first lepton-universality test performed using part of the 13 TeV Run 2 dataset (2015–2016) together with the full Run 1 data sample, representing in total an integrated luminosity of 5 fb⁻¹. The blinded analysis was performed in the range 1.1 < q² < 6.0 GeV², where q² is the invariant mass squared of the μ+μ– or e+e– pair. It found RK = 0.846 +0.060/–0.054 (stat) +0.016/–0.014 (syst), the most precise measurement to date. However, having shifted closer to the Standard Model prediction, the value leaves the overall significance unchanged at about 2.5 standard deviations.
“I cannot tell you if lepton-flavour universality is broken or not, so sorry for this!” said Thibaud Humair of Imperial College London, who presented the result on behalf of the LHCb collaboration. “All LHCb results for RK are below SM expectations. Together with b → sμ+μ− results, RK and RK* constitute an interesting pattern of anomalies, but the significance is still low,” he said.
Humair’s talk generated much discussion, with physicists pressing LHCb on potential sources of uncertainty and other possible explanations, such as the dependence of RK on q². Other experiments also showed new measurements of lepton universality and related tests of the Standard Model, such as ATLAS on the branching ratio of Bs→μ+μ− and an update from Belle on both RD(*) and RK*. The current experimental activity in flavour physics was reflected in several talks at Moriond from theorists.
“It’s not a discovery, but something is going on,” says David Straub of TUM Munich, who had spent the previous 24 hours working solidly to update a global likelihood fit of all parameters relevant to the b anomalies with the new LHCb and Belle results. The fit, which involves 265 observables, showed that b → sl+l– observables such as RK continue to show a “large pull” towards new physics. “The popular ‘U1 leptoquark’ is still giving an excellent fit to the data,” says Straub.
Further reduction in the uncertainty on RK can be expected when the data collected by LHCb in 2017 and 2018 are included in a future analysis. Meanwhile, in Japan, the Belle II physics programme has now begun in earnest and the collaboration is expected to bring further statistical power to the b-anomaly question in the near future.
On the morning of 21 March, at the 2019 Rencontres de Moriond in La Thuile, Italy, the LHCb collaboration announced the discovery of charge-parity (CP) violation in the charm system. Met with an impromptu champagne celebration, the result represents a milestone in particle physics and opens a new area of investigation in the charm sector.
CP violation, which results in differences in the properties of matter and antimatter, was first observed in the decays of K mesons (which contain strange quarks) in 1964 by James Cronin and Val Fitch. Even though parity (P) violation had been seen eight years earlier, the discovery that the combined C and P symmetries are not conserved was unexpected. The story deepened in the early 1970s, when, building on the foundations laid by Nicola Cabibbo and others, Makoto Kobayashi and Toshihide Maskawa showed that CP violation could be included naturally in the Standard Model (SM) if at least six different quarks existed in nature. Their fundamental idea – whereby direct CP violation arises if a complex phase appears in the CKM matrix describing quark mixing – was confirmed 30 years later by the discovery of CP violation in B-meson decays by the BaBar and Belle collaborations. Despite decades of searches, however, CP violation in the decays of charmed particles had escaped detection until now.
LHCb physicists used the unprecedented dataset accumulated in 2011–2018 to study the difference in decay rates between D0 and D̅0 mesons (which contain a c quark or antiquark, respectively) decaying into K+K– or π+π– pairs. To distinguish the otherwise identical D0 and D̅0 decays, the collaboration exploited two different classes of decays: those of D*± mesons decaying into a D0 and a charged pion, where the presence of a π+ (π–) tags a D0 (D̅0) meson; and those of B mesons decaying into a D0, a muon and a neutrino, in which the presence of a μ– (μ+) identifies a D0 (D̅0). Counting the number of decays present in the data sample, the final result is ΔACP = (–0.154 ± 0.029)%. At 5.3 standard deviations from zero, it represents the first observation of CP violation in the charm system.
“This is a major result that could be obtained thanks to the very high charm-production cross section at LHC, and to the superb performance of both the LHC machine and the LHCb detector, which provided the largest sample of charm particles ever collected,” says LHCb spokesperson Giovanni Passaleva. “Analysing the tens of millions of D0 mesons needed for such a precise measurement was a remarkable collective effort by the collaboration. The result opens up a new field in particle physics, involving the study of CP-violating effects in the sector of up-type quarks and searches for new-physics effects in a completely new domain.”
CP violation is thought to be an essential ingredient in explaining the observed cosmological matter–antimatter asymmetry, but the level of CP violation in the SM can only account for a fraction of the imbalance. In addition to hunting for novel sources of CP violation, physicists are making precise measurements of known sources to look for deviations that could indicate physics beyond the SM. The SM prediction for the amount of CP violation in charm decays is estimated to be in the range 10⁻⁴–10⁻³ in the decay modes of interest. The new LHCb measurement is consistent with the SM expectation but falls at the upper end of the range, generating much discussion at Moriond 2019. Unusually for particle physics, the experimental measurement is much more precise than the SM prediction, because the lightness of charm quarks means that reliable perturbative-QCD and other approximate calculation techniques are not available. Future theoretical improvements, and more data, will establish whether this seminal LHCb result is consistent with the SM.
“This is an important milestone in the study of CP violation,” Kobayashi, now professor emeritus at KEK in Japan, tells CERN Courier. “I hope that analysis of the results will provide a clue to new physics.”
Neutrinos, discovered in 1956, play an exceptional role in particle and nuclear physics, as well as astrophysics, and their study has led to the award of several Nobel prizes. In recognition of their importance, the first International Conference on the History of the Neutrino took place at the Université Paris Diderot in Paris on 5–7 September 2018.
The purpose of the conference, which drew 120 participants, was to cover the main steps in the history of the neutrino since 1930, when Wolfgang Pauli postulated its existence to explain the continuous energy spectrum of the electrons emitted in beta decay. Specifically, for each topic in neutrino physics, the aim was to pursue an historical approach and follow as closely as possible the discovery or pioneering papers. Speakers were chosen as much as possible for their roles as authors or direct witnesses, or as players in the main events.
The first session, “Invention of a new particle”, started with the prehistory of the neutrino – that is, the establishment of the continuous energy spectrum in beta decay – before moving into the discoveries of the three flavour neutrinos. The second session, “Neutrinos in nature”, was devoted to solar and atmospheric neutrinos, as well as neutrinos from supernovae and Earth. The third session covered neutrinos from reactors and beams including the discovery of neutral-current neutrino interactions, in which the neutrino is not transformed into another particle like a muon or an electron. This discovery was made in 1973 by the Gargamelle bubble chamber team at CERN after a race with the HPWF experiment team at Fermilab.
The major theme of neutrino oscillations from the first theoretical ideas of Bruno Pontecorvo (1957) to the Mikheyev–Smirnov–Wolfenstein effect (1985), which can modify the oscillations when neutrinos travel through matter, was complemented by talks on the discovery of neutrino oscillations by Nobel laureates Takaaki Kajita and Art McDonald. In 1998, the Super-Kamiokande experiment, led by Kajita, observed the oscillation of atmospheric neutrinos, and in 2001 the Sudbury Neutrino Observatory experiment, led by McDonald, observed the oscillation of solar neutrinos.
The role of the neutrino in the Standard Model was discussed, as was its intrinsic nature. Although physicists have observed the rare process of double beta decay with neutrinos in the final state, neutrinoless double beta decay – with no neutrinos produced – has been searched for for more than 30 years, because its observation would prove that the neutrino is Majorana-type (its own antiparticle) rather than Dirac-type.
To complete the panorama, the conference discussed neutrinos as messengers from the wider universe, from the Big Bang to violent phenomena such as gamma-ray bursts or active galactic nuclei. Delegates also discussed wrong hints and tracks, which play a positive role in the development of science, and the peculiar sociological aspects that are common to particle physics and astrophysics.
Following the conference, a website dedicated to the history of this fascinating particle was created: https://neutrino-history.in2p3.fr.
In a workshop held at CERN on 16–17 January, researchers presented the findings of the Physics Beyond Colliders (PBC) initiative, which was launched in 2016 to explore opportunities at CERN for projects complementary to the LHC and future colliders (CERN Courier November 2016 p28). PBC members have weighed up the potential for such experiments to explore open questions in QCD and the existence of physics beyond the Standard Model (BSM), including, in particular, searches for signatures of hidden-sector models in which the conjectured dark matter does not couple directly to Standard Model particles.
The BSM and QCD groups of the PBC initiative have developed detailed studies of CERN’s options and compared them to other worldwide possibilities. The results show the international competitiveness of the PBC options.
The Super Proton Synchrotron (SPS) remains a clear attraction, offering the world’s highest-energy beams to fixed-target experiments in the North Area (see Fixed target, striking physics). The SPS high-intensity muon beam could allow a better understanding of the theoretical prediction of the muon anomalous magnetic moment (the MUonE project), and a significant contribution to the resolution of the proton-radius puzzle by COMPASS (Rp). The NA61 experiment could explore QCD in the interesting region of “criticality”, while upgrades of NA64 and a few months of NA62 operation in beam-dump mode (whereby a target absorbs most of the incident protons and contains most of the particles generated by the primary beam interactions) would explore the hidden-sector parameter space. In the longer term, the KLEVER experiment could probe rare decays of neutral kaons, and NA60 and DIRAC could enhance our understanding of QCD.
A novel North Area proposal is the SPS Beam Dump Facility (BDF). Such a facility could, in the first instance, serve the SHiP experiment, which would perform a comprehensive investigation of the hidden sector with discovery potential in the MeV–GeV mass range, and the TauFV experiment, which would search for forbidden τ decays. The BDF team has made excellent progress with the facility design and is preparing a comprehensive design-study report. Options for more novel exploitation of the SPS have also been considered: proton-driven plasma-wakefield acceleration of electrons for a dark-matter experiment (AWAKE++); the acceleration and slow extraction of electrons to light-dark-matter experiments (eSPS); and the production of well-calibrated neutrinos via a muon decay ring (nuSTORM).
Fixed-target studies at the LHC are also considered within PBC, and these could improve our understanding of QCD in regions where it is relevant for new-physics searches at the high-luminosity LHC upgrade. The LHC could also be supplemented with new experiments to search for long-lived particles, and PBC support for a small experiment called FASER has helped pave the way for its installation in the ongoing long shutdown of CERN’s accelerator complex.
2018 was a notable year for the gamma factory, a novel concept that would use the LHC to produce intense gamma-ray beams for precision measurements and searches (CERN Courier November 2017 p7). The team has already demonstrated the acceleration of partially stripped ions in the LHC, and is now working towards a proof-of-principle experiment in the SPS. Meanwhile, the Electric Dipole Moment (CPEDM) collaboration has continued studies, supported by experiments at the COSY synchrotron in Germany (CERN Courier September 2016 p27), towards a prototype storage ring to measure the proton EDM.
The PBC technology team has also been working to apply CERN’s skills base to novel experiments, for example by exploring synergies across experiments and collaborating on technologies – in particular, concerning light-shining-through-walls experiments and QED vacuum-birefringence measurements.
Finally, some PBC projects are likely to flourish outside CERN: the IAXO axion helioscope, now under consideration at DESY; the proton EDM ring, which could be prototyped at the Jülich laboratory, also in Germany; and the REDTOP experiment devoted to η meson rare decays, for which Fermilab in the US seems better suited.
The PBC groups have submitted their full findings to the European Particle Physics Strategy Update (http://pbc.web.cern.ch/).