On 10 April, researchers working on the Event Horizon Telescope – a network of eight radio dishes that creates an Earth-sized interferometer – released the first direct image of a black hole. The landmark result, which shows the radiation emitted by superheated gas orbiting the event horizon of a supermassive black hole in a nearby galaxy, opens a brand-new window on these incredible objects.
Supermassive black holes (SMBHs) are thought to occupy the centres of most galaxies, including our own, with masses of up to billions of solar masses and sizes up to 10 times larger than our solar system. Discovered in the 1960s via radio and optical measurements, their origin, nature and surrounding environments remain important open questions in astrophysics. Spatially resolved images of an SMBH and any accretion disk around it provide vital input, but producing such images is extremely challenging.
SMBHs are relatively bright at radio wavelengths. However, since the imaging resolution achievable with a telescope scales with the wavelength (which is long in the radio range) and inversely with the telescope diameter, it is difficult to obtain useful images in the radio region. For example, producing an image with the same resolution as the optical Hubble Space Telescope would require a km-wide telescope, while obtaining a resolution that would allow an SMBH to be imaged would require a telescope diameter of thousands of kilometres. One way around this is to use interferometry to turn many telescope dishes at different locations into one large telescope. Such an interferometer measures the differences in arrival time of one radio wave at different locations on Earth (induced by the difference in travel path), from which it is possible to reconstruct an image on the sky. This requires not only close coordination between many telescopes around the world, but also very precise timing, vast amounts of recorded data and enormous computing power.
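The diffraction-limit argument above can be checked with a few lines. The specific numbers below – a 1.3 mm observing wavelength and a target resolution of roughly 25 microarcseconds, the regime relevant for imaging an SMBH shadow – are illustrative assumptions, not values taken from the text:

```python
import math

def required_diameter(wavelength_m, resolution_rad):
    """Dish diameter needed for a given angular resolution,
    from the diffraction limit theta ~ 1.22 * lambda / D."""
    return 1.22 * wavelength_m / resolution_rad

ARCSEC_PER_RAD = 180 / math.pi * 3600  # ~206265

# Illustrative numbers (assumptions): ~1.3 mm wavelength,
# ~25 microarcsecond target resolution.
wavelength = 1.3e-3                  # m
resolution = 25e-6 / ARCSEC_PER_RAD  # rad

d = required_diameter(wavelength, resolution)
print(f"Required diameter: {d / 1e3:.0f} km")  # thousands of kilometres
```

The result comes out at roughly 13,000 km – larger than the Earth, which is why an Earth-sized interferometer is the only practical route.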
Despite the considerable difficulties, the Event Horizon Telescope project used this technique to produce the first image of an SMBH using an observation time of only tens of minutes. The imaged SMBH lies at the centre of the supergiant elliptical galaxy Messier 87, which is located in the Virgo constellation at a distance of around 50 million light years. Although relatively close in astronomical terms, its very large mass makes its size on the sky comparable to that of the much lighter SMBH at the centre of our galaxy. Furthermore, its accretion rate (and hence brightness) varies on longer time scales, making it easier to image. The resulting image (above) shows the clear shadow of the black hole in the centre, surrounded by an asymmetric ring caused by radio waves that are bent around the SMBH by its strong gravitational field. The asymmetry is likely a result of relativistic beaming of the part of the disk of matter that moves towards Earth.
The team compared the image to a range of detailed simulations in which parameters such as the black hole’s mass, spin and orientation were varied. Additionally, the characteristics of the matter around the SMBH, mainly hot electrons and ions, as well as the magnetic-field properties, were varied. While the image alone does not allow researchers to constrain many of these parameters, combining it with X-ray data taken by the Chandra and NuSTAR telescopes enables a deeper understanding. For example, the combined data constrain the SMBH mass to 6.5 billion solar masses and appear to exclude a non-spinning black hole. Whether the matter orbiting the SMBH rotates in the same direction as or opposite to the black hole, as well as details of the environment around it, will require additional studies. Such studies can also potentially exclude alternative interpretations of this object; currently, exotic objects like boson stars, gravastars and wormholes cannot be fully excluded.
The work of the Event Horizon Telescope collaboration, which involves more than 200 researchers worldwide, was published in six consecutive papers in The Astrophysical Journal Letters. While more images at shorter wavelengths are foreseen in the future, the collaboration also points out that much can be learned by combining the data with that from other wavelengths, such as gamma rays. Despite this first image being groundbreaking, it is likely only the start of a revolution in our understanding of black holes and, with it, the universe.
A world record for laser-driven wakefield acceleration has been set by a team at the Berkeley Lab Laser Accelerator (BELLA) Center in the US. Physicists used a novel scheme to channel 850 TW laser pulses through a 20 cm-long plasma, allowing electron beams to be accelerated to an energy of 7.8 GeV – almost double the previous record set by the same group in 2014.
Proposed 40 years ago, plasma-wakefield acceleration can produce gradients hundreds of times higher than those achievable with conventional techniques based on radio-frequency cavities. It is often likened to surfing a wave. Relativistic laser pulses with a duration of the order of the plasma period generate large-amplitude electron plasma waves that displace electrons with respect to the background ions, allowing the plasma waves to accelerate charged particles to relativistic energies. Initial work showed that TeV energies could be reached in just a few hundred metres using multiple laser-plasma accelerator stages, each driven by petawatt laser pulses propagating through a plasma with a density of about 10¹⁷ cm⁻³. However, this requires the focused laser pulses to be guided over distances of tens of centimetres. While a capillary discharge is commonly used to create the necessary plasma channel, achieving a sufficiently deep channel at a plasma density of 10¹⁷ cm⁻³ is challenging.
In the latest BELLA demonstration, the plasma channel produced by the capillary discharge was modified by a nanosecond-long “heater” pulse that confined the focused laser pulses over the 20 cm distance. This allowed for the acceleration of electron beams with quasi-monoenergetic peaks up to 7.8 GeV. “This experiment demonstrates that lasers can be used to accelerate electrons to energies relevant to X-ray free-electron lasers, positron generation, and high-energy collider stages,” says lead author Tony Gonsalves. “However, the beam quality currently available from laser-wakefield accelerators is far from that required by future colliders.”
The quality of the accelerated electron beam is determined by how background plasma electrons are trapped in the accelerating and focusing “bucket” of the plasma wave. Several different methods of initiating electron trapping have been proposed to improve the beam emittance and brightness significantly beyond state-of-the-art particle sources, representing an important area of research. Another challenge, says Gonsalves, is to improve the stability and reproducibility of the accelerated electron beams, which are currently limited by fluctuations in the laser systems caused by air and ground motion.
In addition to laser-driven schemes, particle-driven plasma acceleration holds promise for high-energy physics applications. Experiments using electron-beam drivers are ongoing and planned at various facilities including FLASHForward at DESY and FACET-II at SLAC (CERN Courier January/February 2019 p10). The need for staging multiple plasma accelerators may even be circumvented by using energetic proton beams as drivers. Recent experiments at CERN’s Advanced Wakefield Experiment demonstrated electron acceleration gradients of around 200 MV/m using proton-beam-driven plasma wakefields (CERN Courier October 2018 p7).
Experiments at Berkeley in the next few years will focus on demonstrating the staging of laser-plasma accelerators with multi-GeV energy gains. “The field of plasma wakefield acceleration is picking up speed,” writes Florian Grüner of the University of Hamburg in an accompanying APS Viewpoint article. “If plasma wakefields can have gradients of 1 TV/m, one might imagine that a ‘table-top version of CERN’ is possible.”
The LHCb collaboration has released a much anticipated update on its measurement of RK – a ratio that describes how often a B+ meson decays to a charged kaon and either a μ+μ– or an e+e– pair, and therefore provides a powerful test of lepton universality. The more precise measurement, officially revealed at Rencontres de Moriond on 22 March, suggests that the intriguing current picture of flavour anomalies persists.
Since 2013, several results involving the decay of b quarks have hinted at deviations from lepton universality, a tenet of the Standard Model (SM), though none is individually significant enough to constitute evidence of new physics. LHCb has studied a number of ratios comparing b-decays to different leptons and also sees signs that something is amiss in angular distributions of B→K*μ+μ− decays. Data from BaBar and Belle add further intrigue, though with lower statistical significances.
The latest measurement from LHCb is the first lepton-universality test performed using part of the 13 TeV Run 2 data set (2015–2016) together with the full Run 1 data sample, representing in total an integrated luminosity of 5 fb⁻¹. The blinded analysis was performed in the range 1.1 < q² < 6.0 GeV², where q² is the squared invariant mass of the μ+μ– or e+e– pair. It found RK = 0.846 +0.060/−0.054 (stat) +0.016/−0.014 (syst), the most precise measurement to date. However, having shifted closer to the Standard Model prediction, the value leaves the overall significance unchanged at about 2.5 standard deviations.
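The quoted ~2.5σ can be reproduced by combining the statistical and systematic uncertainties in quadrature and comparing with the Standard Model expectation of RK ≈ 1. A minimal sketch, treating the asymmetric errors naively and using the upward uncertainties since the measured value lies below the prediction:

```python
import math

rk, rk_sm = 0.846, 1.0           # measured value and SM expectation
stat_up, syst_up = 0.060, 0.016  # upward uncertainties (value sits below the SM)

sigma = math.hypot(stat_up, syst_up)  # quadrature sum, ~0.062
pull = (rk_sm - rk) / sigma
print(f"deviation: {pull:.1f} sigma")  # ~2.5 sigma
```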
“I cannot tell you if lepton-flavour universality is broken or not, so sorry for this!” said Thibaud Humair of Imperial College London, who presented the result on behalf of the LHCb collaboration. “All LHCb results for RK are below SM expectations. Together with b → sμ+μ− results, RK and RK* constitute an interesting pattern of anomalies, but the significance is still low,” he said.
Humair’s talk generated much discussion, with physicists pressing LHCb on potential sources of uncertainty and other possible explanations, such as the dependence of RK on q². Other experiments also showed new measurements of lepton universality and other related tests of the Standard Model, such as ATLAS on the branching ratio of Bs→μ+μ− and an update from Belle on both RD(*) and RK*. The current experimental activity in flavour physics was reflected by several talks at Moriond from theorists.
“It’s not a discovery, but something is going on,” says David Straub of TUM Munich, who had spent the previous 24 hours working flat out to update a global likelihood fit of all parameters relevant to the b anomalies with the updated LHCb and Belle results. The fit, which involves 265 observables, showed that b → sl+l– observables such as RK continue to show a “large pull” towards new physics. “The popular ‘U1 leptoquark’ is still giving an excellent fit to the data,” says Straub.
Further reduction in the uncertainty on RK can be expected when the data collected by LHCb in 2017 and 2018 are included in a future analysis. Meanwhile, in Japan, the Belle II physics programme has now begun in earnest and the collaboration is expected to bring further statistical power to the b-anomaly question in the near future.
On 26 February, a new solar power plant powering the SESAME light source in Jordan was officially inaugurated. In addition to being the first synchrotron-light facility in the Middle East region, SESAME is now the world’s first major research infrastructure to be fully powered by renewable energy.
Electricity from the solar power plant will be supplied by an on-grid photovoltaic system constructed 30 km away, and its 6.48 MW power capacity is ample to satisfy SESAME’s needs for several years. “As in the case of all accelerators, SESAME is in dire need of energy, and as the number of its users increases so will its electricity bill,” says SESAME director Khaled Toukan. “Given the very high cost of electricity in Jordan, with this solar power plant the centre becomes sustainable.”
Energy efficiency and other environmental factors are coming under growing scrutiny at large research infrastructures worldwide. The necessary funding for the SESAME installation became available in late 2016, when the Government of Jordan agreed to allocate JD 5 million (US$7.05 million) from funds provided by the European Union (EU) to support the deployment of clean energy sources. The power plant, which uses monocrystalline solar panels, was built by the Jordanian company Kawar Energy, and power transmitted to the grid will be credited to SESAME.
SESAME opened its beamlines to users in July 2018. Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Palestine and Turkey are currently members of SESAME, with 16 further countries – plus CERN and the EU – listed as observers.
The Japanese government has put on hold a decision about hosting the International Linear Collider (ILC), to the disappointment of many hoping for clarity ahead of the update of the European strategy for particle physics. At a meeting in Tokyo on 6–7 March, Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) announced, with input from the Science Council of Japan (SCJ), that it has “not yet reached declaration” for hosting the ILC at this time. A statement from MEXT continued: “The ILC project requires further discussion in formal academic decision-making processes such as the SCJ Master Plan, where it has to be clarified whether the ILC project can gain understanding and support from the domestic academic community… MEXT will continue to discuss the ILC project with other governments while having an interest in the ILC project.”
The keenly awaited announcement was made during the 83rd meeting of the International Committee for Future Accelerators (ICFA) at the University of Tokyo. During a press briefing, ICFA chair Geoffrey Taylor emphasised that colliders are long-term projects. “At the last strategy update in 2013 the ILC was seen as an important development in the field, and we were hoping there would be a definite statement from Japan so that it can be incorporated into the current strategy update,” he said. “We don’t have that positive endorsement, so it will proceed at a slower rate than we hoped. ICFA still supports Japan as hosts of the ILC, and we hope it is built here because Japan has been working hard towards it. If not, we can be sure that there will be somewhere else in the world where the project can be taken up.”
The story of the ILC, an electron–positron collider that would serve as a Higgs factory, goes back more than 15 years. In 2012, physicists in Japan submitted a petition to the Japanese government to host the project. A technical design report was published the following year. In 2017, the original ILC design was revised to reduce its centre-of-mass energy by half, shortening it by around a third and reducing its cost by up to 40%.
Meanwhile, MEXT has been weighing up the ILC project in terms of its scientific significance, technical challenges, cost and other factors. In December 2018, the SCJ submitted a critical report to MEXT highlighting perceived issues with the project, including its cost and international organisation. Asked at the March press briefing why the SCJ should now be expected to change its views on the ILC, KEK director-general Masanori Yamauchi responded: “We can show that we already have solutions for the technical challenges pointed out in the latest SCJ report, and we are going to start making a framework for international cost-sharing.”
Writing in LC NewsLine, Lyn Evans, director of the Linear Collider Collaboration (which coordinates planning and research for the ILC and CERN’s Compact Linear Collider, CLIC), remains upbeat: “We did not get the green light we hoped for. Nevertheless, there was a significant step forward with a strong political statement and, for the first time, a declaration of interest in further discussions by a senior member of the executive. We will continue to push hard.”
Japan’s statement has also been widely interpreted as a polite way for the government to say “no” to the ILC. “The reality is that it is naturally difficult for people outside the machinery of any national administration to understand fully how procedures operate, and this is certainly true of the rest of the world with regard to what is truly happening with ILC in Japan,” says Phil Burrows of the University of Oxford, who is spokesperson for the CLIC accelerator collaboration.
A full spectrum of views was expressed at a meeting of the linear-collider community in Lausanne, Switzerland, on 8–9 April, with around 100 people present. “The global community represented at the Lausanne meeting restated the overwhelming physics case for an electron–positron collider to make precision measurements in the Higgs and top-quark sectors, with superb sensitivity to new physics,” says Burrows. “We are in the remarkable situation that we have not one, but two, mature options for doing this: ILC and CLIC. I hope that the European Strategy Update recommendations will reflect this consensus on the physics case, position Europe to play a leading role, and hence ensure that one of these projects proceeds to realisation.”
On the morning of 21 March, at the 2019 Rencontres de Moriond in La Thuile, Italy, the LHCb collaboration announced the discovery of charge-parity (CP) violation in the charm system. Met with an impromptu champagne celebration, the result represents a milestone in particle physics and opens a new area of investigation in the charm sector.
CP violation, which results in differences in the properties of matter and antimatter, was first observed in the decays of K mesons (which contain strange quarks) in 1964 by James Cronin and Val Fitch. Even though parity (P) violation had been seen eight years earlier, the discovery that the combined C and P symmetries are not conserved was unexpected. The story deepened in the early 1970s, when, building on the foundations laid by Nicola Cabibbo and others, Makoto Kobayashi and Toshihide Maskawa showed that CP violation could be included naturally in the Standard Model (SM) if at least six different quarks existed in nature. Their fundamental idea – whereby direct CP violation arises if a complex phase appears in the CKM matrix describing quark mixing – was confirmed 30 years later by the discovery of CP violation in B-meson decays by the BaBar and Belle collaborations. Despite decades of searches, however, CP violation in the decays of charmed particles had escaped detection.
LHCb physicists used the unprecedented dataset accumulated in 2011–2018 to study the difference in decay rates between D0 and D̅0 mesons (which contain a c quark or antiquark) decaying into K+K– or π+π– pairs. To differentiate between the otherwise identical D0 and D̅0 decays, the collaboration exploited two different classes of decays: those of D*± mesons decaying into a D0 and a charged pion, where the presence of a π+(π–) indicates the presence of a D0(D̅0) meson; and those of B mesons decaying into a D0, a muon and a neutrino, in which the presence of a μ+(μ–) identifies a D0(D̅0). Counting the number of decays present in the data sample, the final result is ΔACP = (−0.154 ± 0.029)%. At 5.3 standard deviations from zero, it represents the first observation of CP violation in the charm system.
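The quoted significance follows directly from the central value and its uncertainty:

```python
delta_acp = -0.154e-2  # ΔACP central value
sigma = 0.029e-2       # total uncertainty

significance = abs(delta_acp) / sigma
print(f"{significance:.1f} standard deviations from zero")  # 5.3
```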
“This is a major result that could be obtained thanks to the very high charm-production cross section at the LHC, and to the superb performance of both the LHC machine and the LHCb detector, which provided the largest sample of charm particles ever collected,” says LHCb spokesperson Giovanni Passaleva. “Analysing the tens of millions of D0 mesons needed for such a precise measurement was a remarkable collective effort by the collaboration. The result opens up a new field in particle physics, involving the study of CP-violating effects in the sector of up-type quarks and searches for new-physics effects in a completely new domain.”
CP violation is thought to be an essential ingredient in explaining the observed cosmological matter–antimatter asymmetry, but the level of CP violation present in the SM can explain only a fraction of the imbalance. In addition to hunting for novel sources of CP violation, physicists are making precise measurements of known sources to look for deviations that could indicate physics beyond the SM. The SM prediction for the amount of CP violation in charm decays is estimated to be in the range 10⁻⁴–10⁻³ in the decay modes of interest. The new LHCb measurement is consistent with the SM expectation but falls at the upper end of the range, generating much discussion at Moriond 2019. Unusually for particle physics, the experimental measurement is much more precise than the SM prediction. This is due to the lightness of charm quarks, which means that reliable perturbative QCD and other approximate calculation techniques are not possible. Future theoretical improvements, and more data, will establish whether the seminal LHCb result is consistent with the SM.
“This is an important milestone in the study of CP violation,” Kobayashi, now professor emeritus at KEK in Japan, tells CERN Courier. “I hope that analysis of the results will provide a clue to new physics.”
Particle physics began more than a century ago with the discoveries of radioactivity, the electron and cosmic rays. Photographic plates, gas-filled counters and scintillating substances were the early tools of the trade. Studying cloud formation in moist air led to the invention of the cloud chamber, which, in 1932, enabled the discovery of the positron. The photographic plate soon morphed into nuclear-emulsion stacks, and the Geiger tube of the Geiger–Marsden–Rutherford experiments developed into the workhorse for cosmic-ray studies. The bubble chamber, invented in 1952, represented the culmination of these “imaging detectors”, using film as the recording medium. Meanwhile, in the 1940s, the advent of photomultipliers had opened the way to crystal-based photon and electron energy measurements and Cherenkov detectors. This was the toolbox of the first half of the 20th century, credited with a number of groundbreaking discoveries that earned the toolmakers and their artisans more than 10 Nobel Prizes.
The invention of the Multi Wire Proportional Chamber (MWPC) by Georges Charpak in 1968 was a game changer, earning him the 1992 Nobel Prize in Physics. Suddenly, experimenters had access to large-area charged-particle detectors with millimetre spatial resolution and staggering MHz-rate capability. Crucially, the emerging integrated-circuit technology could deliver amplifiers small and cheap enough to equip many thousands of proportional wires. This ingenious and deceptively simple detector is relatively easy to construct. The workshops of many university physics departments could master the technology, attracting students and “democratising” particle physics. So compelling was experimentation with MWPCs that within a few years, large detector facilities with tens of thousands of wires were constructed – witness the Split Field Magnet at CERN’s Intersecting Storage Rings (ISR). Its rise to prominence was unstoppable: it became the detector of choice for the Proton Synchrotron, Super Proton Synchrotron (SPS) and ISR programmes. An extension of this technique is the drift chamber, an MWPC-type geometry in which the time difference between the passage of the particle and the onset of the wire signal is recorded, providing a measure of position with 100 µm-level resolution. The MWPC concept lends itself to a multitude of geometries and has found its “purest” application as the readout of time projection chambers (TPCs). Modern derivatives replace the wire planes with metallised foils with holes in a sub-millimetre pattern, amplifying the ionisation signals.
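The drift-chamber position measurement is simply drift velocity multiplied by the measured time. With typical numbers – a drift velocity of ~5 cm/µs and nanosecond-scale timing, both illustrative assumptions rather than values from the text – the 100 µm scale falls out directly:

```python
# Illustrative drift-chamber numbers (assumptions, not from the text):
v_drift = 5e4   # drift velocity: 5 cm/us, expressed in m/s
t_res = 2e-9    # timing resolution: 2 ns

# Position resolution = drift velocity x timing resolution
x_res = v_drift * t_res
print(f"position resolution: {x_res * 1e6:.0f} um")  # ~100 um
```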
The ISR was a hotbed for accelerator and detector inventions. The world’s first proton–proton collider, an audacious project, was clearly ahead of its time and the initial experiments could not fully exploit its discovery potential. It prompted, however, the concept of multi-purpose facilities capable of obtaining “complete” collision information. For the first time, a group developed and used transition-radiation detectors for electron detection and liquid-argon calorimetry. The ISR’s Axial Field Spectrometer (AFS) provided high-quality hadron calorimetry with close to 4π coverage. These technologies are now widely used at accelerators and for non-accelerator experiments. The stringent performance requirements for experiments at the ISR encouraged the detector developers to explore and reach a measurement quality only limited by the laws of detector physics: science-based procedures had replaced the “black magic” of detector construction. With collision rates in the 10 MHz range, these experiments (and the ISR) were forerunners of today’s Large Hadron Collider (LHC) experiments. Of course, the ISR is most famous for its seminal accelerator developments, in particular the invention of stochastic cooling, which was the enabling technology for converting the SPS into a proton–antiproton collider.
The SPS marked another moment of glory for CERN. In 1976 first beams were accelerated to 400 GeV, initiating a diverse physics programme and motivating a host of detector developments. Advances in semiconductor technology led to the silicon-strip detector. With the experiments barely started, Carlo Rubbia and collaborators launched the idea, as ingenious as it was audacious, to convert the SPS into a proton–antiproton collider. The goal was clear: orchestrate quickly and rather cheaply a machine with enough collision energy to produce the putative W and Z bosons. Simon van der Meer’s stochastic-cooling scheme had to deliver the required beam intensity and lifetime, and two experimental teams were charged with the conception and construction of the equally novel detectors. The centrepiece of the UA1 detector was a 6 m-long and 2 m-diameter “electronic bubble chamber”, which adapted the drift-chamber concept to the event topology and collision rate, combined with state-of-the-art electronic readout. The electronic images were of such illuminating quality that “event scanning”, the venerable bubble-chamber technique, was again a key tool in data analysis. The UA2 team pushed calorimetry and silicon detectors to new levels of performance, providing healthy competition and independent discoveries. The discovery of the W and Z bosons was achieved in 1983 and, the following year, Rubbia and van der Meer became Nobel Laureates.
Laying foundations
In 1981, with the approval of the Large Electron Positron (LEP) collider, the community laid the foundation for decades of research at CERN. Mastering the new scale of the accelerator dimension also brought a new approach to managing the larger experimental collaborations and to meeting their more stringent experimental requirements. For the first time, mostly outside collaborators developed and built the experimental apparatus – a non-trivial but necessary success in technology transfer. The detection techniques reached a new state of maturity. Silicon-strip detectors became ubiquitous. Gaseous tracking in a variety of forms, such as TPCs and jet chambers, reached new levels of size and performance. There were also some notable firsts. The DELPHI collaboration developed the Ring Imaging Cherenkov counter, a delicate technology in which the distribution of Cherenkov photons, imaged with mirrors onto photon-sensitive MWPC-type detectors, provides a measure of the particle’s velocity. The L3 collaboration aimed at ultimate-precision energy measurements of muons, photons and electrons, and put its money on a recently discovered scintillating crystal, bismuth germanate. Particle physicists, material scientists and crystallographers from academia and industry transformed this laboratory curiosity into mass-producible technology: ultimately, 12,000 crystals were grown, cut to size as truncated pyramids and assembled into the calorimeter, a pioneering trendsetter.
The ambition, style and success of these large, global collaborations was contagious. It gave the cosmic-ray community a new lease of life. The Pierre Auger Observatory, one of whose initiators was particle physicist and Nobel Laureate James Cronin, explores cosmic rays at extreme energies with close to 2000 detector stations spread over an area of 3000 km2. The IceCube collaboration has instrumented around a cubic kilometre of Antarctic ice to detect neutrinos. One of the most ambitious experiments is the Alpha Magnetic Spectrometer, hosted by the International Space Station – again with a particle physicist and Nobel Prize winner, Samuel Ting, as a prime mover and shaker.
These decade-long efforts in experimentation find their present culmination at the LHC. Experimenters had to innovate on several fronts: all detector systems were designed for, and had to achieve, ultimate performance, limited only by the laws of physics; the detectors must operate at collision rates of a GHz or more, generating some 100 billion particles per second. “Impossible” was many an expert’s verdict in the early 1990s. The successful collaboration with industry giants in the IT and electronics sectors was a life-saver; and achieving all this – fraught with difficulties, technical and sociological – in international collaborations of several thousand scientists and engineers was an immense achievement. All existing detection technologies – ranging from silicon tracking, transition-radiation and RICH detectors, and liquid-argon, scintillator and crystal calorimeters to 10,000 m³-scale muon spectrometers – needed novel ideas, major improvements and daring extrapolations. The success of the LHC experiments is beyond the wildest dreams: hundreds of measurements achieve a precision previously considered possible only at electron–positron colliders. The Higgs boson, discovered in 2012, will be part of the research agenda for most of the 21st century, and CERN is in the starting blocks with ambitious plans.
Sharing with society
Worldwide, more than 30,000 accelerators are in operation. Particle and nuclear physics research uses barely more than 100 of them. Society is the principal client, and many of the accelerator innovations and particle detectors have found their way into industry, biology and health applications. A class of accelerators, to which CERN has contributed significantly, is specifically dedicated to tumour therapy. Particle detectors have made a particular impact on medical imaging, such as positron emission tomography (PET), whose origin dates back to CERN with an MWPC-based detector in the 1970s. Today’s clinical PET scanners use crystals very similar to those used in the discovery of the Higgs boson.
Possibly the most important benefit of particle physics to society is the collaborative approach developed by the community, which underpins the incredible success that has led us to the LHC experiments today. There are no signs that the rate of innovation in detectors and instrumentation is slowing. Currently the LHC experiments are undergoing major upgrades and plans for the next generation of experiments and colliders are already well under way. These collaborations succeed in being united and driven by a common goal, bridging cultural and political divides.
Neutrinos, discovered in 1956, play an exceptional role in particle and nuclear physics, as well as astrophysics, and their study has led to the award of several Nobel prizes. In recognition of their importance, the first International Conference on the History of the Neutrino took place at the Université Paris Diderot in Paris on 5–7 September 2018.
The purpose of the conference, which drew 120 participants, was to cover the main steps in the history of the neutrino since 1930, when Wolfgang Pauli postulated its existence to explain the continuous energy spectrum of the electrons emitted in beta decay. Specifically, for each topic in neutrino physics, the aim was to pursue an historical approach and follow as closely as possible the discovery or pioneering papers. Speakers were chosen as much as possible for their roles as authors or direct witnesses, or as players in the main events.
The first session, “Invention of a new particle”, started with the prehistory of the neutrino – that is, the establishment of the continuous energy spectrum in beta decay – before moving into the discoveries of the three flavour neutrinos. The second session, “Neutrinos in nature”, was devoted to solar and atmospheric neutrinos, as well as neutrinos from supernovae and Earth. The third session covered neutrinos from reactors and beams including the discovery of neutral-current neutrino interactions, in which the neutrino is not transformed into another particle like a muon or an electron. This discovery was made in 1973 by the Gargamelle bubble chamber team at CERN after a race with the HPWF experiment team at Fermilab.
The major theme of neutrino oscillations from the first theoretical ideas of Bruno Pontecorvo (1957) to the Mikheyev–Smirnov–Wolfenstein effect (1985), which can modify the oscillations when neutrinos travel through matter, was complemented by talks on the discovery of neutrino oscillations by Nobel laureates Takaaki Kajita and Art McDonald. In 1998, the Super-Kamiokande experiment, led by Kajita, observed the oscillation of atmospheric neutrinos, and in 2001 the Sudbury Neutrino Observatory experiment, led by McDonald, observed the oscillation of solar neutrinos.
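The oscillation measurements described above are usually interpreted with the standard two-flavour survival probability, which depends on the mixing angle θ, the mass-squared splitting Δm² and the ratio of baseline to energy, L/E. A minimal sketch of this textbook formula follows; the parameter values in the example are illustrative, not taken from the text:

```python
import math

def survival_probability(delta_m2_eV2, sin2_2theta, L_km, E_GeV):
    """Two-flavour neutrino survival probability.

    P(nu -> nu) = 1 - sin^2(2θ) · sin^2(1.27 Δm² L / E),
    with Δm² in eV², L in km and E in GeV (the 1.27 absorbs
    the factors of ħ and c in these units).
    """
    phase = 1.27 * delta_m2_eV2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Illustrative atmospheric-sector values: Δm² ≈ 2.5e-3 eV² with
# near-maximal mixing, for a 1 GeV neutrino crossing the Earth.
p = survival_probability(2.5e-3, 1.0, 12742.0, 1.0)
```

At zero baseline the formula gives P = 1, as it must, and the MSW effect mentioned above modifies these vacuum parameters when neutrinos propagate through matter.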
The role of the neutrino in the Standard Model was discussed, as was its intrinsic nature. Although physicists have observed the rare process of double beta decay with neutrinos in the final state, neutrinoless double beta decay, in which no neutrinos are produced, has been sought for more than 30 years because its observation would prove that the neutrino is Majorana-type (its own antiparticle) and not Dirac-type.
To complete the panorama, the conference discussed neutrinos as messengers from the wider universe, from the Big Bang to violent phenomena such as gamma-ray bursts or active galactic nuclei. Delegates also discussed wrong hints and tracks, which play a positive role in the development of science, and the peculiar sociological aspects that are common to particle physics and astrophysics.
Following the conference, a website dedicated to the history of this fascinating particle was created: https://neutrino-history.in2p3.fr.
In a workshop held at CERN on 16–17 January, researchers presented the findings of the Physics Beyond Colliders (PBC) initiative, which was launched in 2016 to explore the opportunities at CERN via projects complementary to the LHC and future colliders (CERN Courier November 2016 p28). PBC members have weighed up the potential for such experiments to explore open questions in QCD and the existence of physics beyond the Standard Model (BSM), in particular including searches for signatures of hidden-sector models in which the conjectured dark matter does not couple directly to Standard Model particles.
The BSM and QCD groups of the PBC initiative have developed detailed studies of CERN’s options and compared them to other worldwide possibilities. The results show the international competitiveness of the PBC options.
The Super Proton Synchrotron (SPS) remains a clear attraction, offering the world's highest-energy beams to fixed-target experiments in the North Area (see Fixed target, striking physics). The SPS high-intensity muon beam could allow a better understanding of the theoretical prediction of the muon anomalous magnetic moment (MUonE project), and a significant contribution to the resolution of the proton radius puzzle by COMPASS (Rp). The NA61 experiment could explore QCD in the interesting region of "criticality", while upgrades of NA64 and a few months of NA62 operation in beam-dump mode (whereby a target absorbs most of the incident protons and contains most of the particles generated by the primary beam interactions) would explore the hidden-sector parameter space. In the longer term, the KLEVER experiment could probe rare decays of neutral kaons, and NA60 and DIRAC could enhance our understanding of QCD.
A novel North Area proposal is the SPS Beam Dump Facility (BDF). Such a facility could, in the first instance, serve the SHiP experiment, which would perform a comprehensive investigation of the hidden sector with discovery potential in the MeV–GeV mass range, and the TauFV experiment, which would search for forbidden τ decays. The BDF team has made excellent progress with the facility design and is preparing a comprehensive design study report. Options for more novel exploitation of the SPS have also been considered: proton-driven plasma-wakefield acceleration of electrons for a dark-matter experiment (AWAKE++); the acceleration and slow extraction of electrons to light dark-matter experiments (eSPS); and the production of well-calibrated neutrinos via a muon decay ring (nuSTORM).
Fixed-target studies at the LHC are also considered within PBC, and these could improve our understanding of QCD in regions where it is relevant for new-physics searches at the high-luminosity LHC upgrade. The LHC could also be supplemented with new experiments to search for long-lived particles, and PBC support for a small experiment called FASER has helped pave the way for its installation in the ongoing long shutdown of CERN’s accelerator complex.
2018 was a notable year for the gamma factory, a novel concept that would use the LHC to produce intense gamma-ray beams for precision measurements and searches (CERN Courier November 2017 p7). The team has already demonstrated the acceleration of partially stripped ions in the LHC, and is now working towards a proof-of-principle experiment in the SPS. Meanwhile, the charged-particle electric dipole moment (CPEDM) collaboration has continued studies, supported by experiments at the COSY synchrotron in Germany (CERN Courier September 2016 p27), towards a prototype storage ring to measure the proton EDM.
The PBC technology team has also been working to bring CERN's skills base to bear on novel experiments, for example by exploring synergies across experiments and collaboration on technologies – in particular, concerning light-shining-through-walls experiments and QED vacuum-birefringence measurements.
Finally, some PBC projects are likely to flourish outside CERN: the IAXO axion helioscope, now under consideration at DESY; the proton EDM ring, which could be prototyped at the Jülich laboratory, also in Germany; and the REDTOP experiment devoted to η meson rare decays, for which Fermilab in the US seems better suited.
The PBC groups have submitted their full findings to the European Particle Physics Strategy Update (http://pbc.web.cern.ch/).
On 11 December 2018, 25 years after its inaugural meeting, the BaBar collaboration came together at the SLAC National Accelerator Laboratory in California to celebrate its many successes. David Hitlin, BaBar’s first spokesperson, described the inaugural meeting of what was then called the Detector Collaboration for the PEP-II “asymmetric” electron–positron collider, which took place at SLAC at the end of 1993. By May 1994 the collaboration had chosen the name BaBar in recognition of its primary goal to study CP violation in the neutral B-B̅ meson system. Jonathan Dorfan, PEP-II project director, recounted how PEP-II was constructed by SLAC, LBL and LLNL. Less than six years later, PEP-II and the BaBar detector were built and the first collision events were collected on 26 May 1999. Twenty-five years on, and BaBar has now chalked up more than 580 papers on CP violation and many other topics.
The "asymmetric" descriptor of the collider refers to Pier Oddone's concept of using unequal electron and positron beam energies, with the collision energy tuned to 10.58 GeV – the mass of the ϒ(4S) meson, just above the threshold for producing a pair of B mesons. The resulting relativistic boost enabled measurements of the distance between the points where the two mesons decay, which is critical for the study of CP violation. Equally critical was the entanglement of the B meson and anti-B meson produced in the ϒ(4S) decay: tagging the flavour of one meson determined whether it was the B0 or B̅0 that decayed to the CP final state under study.
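The size of the boost follows directly from the beam energies. A back-of-the-envelope sketch is below; the 9.0 GeV/3.1 GeV beam energies and the B-meson lifetime are standard published values, quoted here for illustration rather than taken from the text:

```python
import math

# Nominal PEP-II beam energies in GeV (well-known design values,
# quoted for illustration).
e_minus, e_plus = 9.0, 3.1

# Head-on collision: total energy and net momentum of the Y(4S).
# Beam masses are negligible at these energies.
E_tot = e_minus + e_plus
p_tot = e_minus - e_plus
m_inv = math.sqrt(E_tot**2 - p_tot**2)   # invariant mass, GeV (~10.58)

beta_gamma = p_tot / m_inv               # boost of the Y(4S) system

# Mean separation of the two B decay points along the beam axis:
# Δz ≈ βγ·c·τ_B, with τ_B ≈ 1.5 ps (PDG B0 lifetime).
c = 299792458.0            # m/s
tau_B = 1.519e-12          # s
dz_um = beta_gamma * c * tau_B * 1e6     # microns
```

The boost works out to βγ ≈ 0.56, giving a typical decay-vertex separation of roughly 250 μm – large enough to resolve with a silicon vertex detector, which is exactly what the asymmetric-energy design was for.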
By October 2000 PEP-II had achieved its design luminosity of 3 × 10³³ cm⁻² s⁻¹, and less than a year later BaBar published its observation of CP violation in the B0 meson system based on a sample of 32 × 10⁶ pairs of B0–B̅0 mesons – on the same day that Belle, its competitor at Japan's KEK laboratory, published the same observation. These results led to Makoto Kobayashi and Toshihide Maskawa sharing the 2008 Nobel Prize in Physics. The ultimate luminosity achieved by PEP-II, in 2006, was 1.2 × 10³⁴ cm⁻² s⁻¹. BaBar continued to collect data on or near the ϒ(4S) meson until 2007, and in 2008 collected large samples of ϒ(2S) and ϒ(3S) mesons before PEP-II was shut down. In total, PEP-II produced 471 × 10⁶ B–B̅ pairs for BaBar studies – as well as a myriad of other events for other investigations.
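As a rough consistency check, the quoted pair count relates to integrated luminosity via N = σ · ∫L dt. The sketch below assumes a peak ϒ(4S) production cross-section of about 1.1 nb, a commonly quoted value that does not appear in the text:

```python
# Back-of-the-envelope: integrated luminosity implied by the
# quoted 471 million B-Bbar pairs, assuming sigma(Y(4S)) ~ 1.1 nb.
sigma_nb = 1.1                 # nb; 1 nb = 1e-33 cm^2 (assumed value)
n_pairs = 471e6                # from the text

lumi_cm2 = n_pairs / (sigma_nb * 1e-33)  # cm^-2
lumi_fb = lumi_cm2 / 1e39                # 1 fb^-1 = 1e39 cm^-2
```

This gives an on-resonance integrated luminosity of order 400 fb⁻¹, consistent with a collider that ran for several years at around 10³³–10³⁴ cm⁻² s⁻¹.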
The anniversary event also celebrated technical innovations, including "trickle injection" of beam particles into PEP-II, which provided a nearly 40% increase in integrated luminosity; BaBar's impressive particle identification, made possible by the DIRC detector; and the implementation of a computing model – spurred by PEP-II delivering significantly more than design luminosity – whereby countries provided in-kind computing support via large "Tier-A" centres. This innovation paved the way for CERN's Worldwide LHC Computing Grid.
Notable physics results from BaBar include the first observation in 2007 of D–D̅ mixing, while in 2008 the collaboration discovered the long-sought ηb, the lowest energy particle of the bottomonium family. The team also searched for lepton-flavour violation in tau–lepton decays, publishing in 2010 what remain the most stringent limits on τ → μγ and τ → eγ branching fractions. In 2012, making it onto Physics World's top-ten physics results of the year, the BaBar collaboration made the first direct observation of time-reversal violation by measuring the rates at which the B0 meson changes quantum states. Also published in 2012 was evidence for an excess of B̅ → D(*)τ⁻ν̅τ decays, which challenges lepton universality and is an important part of the current Belle II and LHCb physics programmes. Several years after data-taking ended, it was recognised that BaBar's data could also be mined for evidence of dark-sector objects such as dark photons, leading to the publication of two significant papers in 2014 and 2017. Another highlight, published last year, is a joint BaBar–Belle paper that resolved an ambiguity concerning the quark-mixing unitarity triangle.
Although BaBar stopped collecting data in 2008, this highly collegial team of researchers continues to publish impactful results. Moreover, BaBar alumni continue to bring their experience and expertise to subsequent experiments, ranging from ATLAS, CMS and LHCb at the LHC, Belle II at SuperKEKB, and long-baseline neutrino experiments (T2K, DUNE, HyperK) to dark-matter (LZ, SCDMS) and dark-energy (LSST) experiments in particle astrophysics.