The study of lead–ion collisions at the LHC is a window into the quark–gluon plasma (QGP), a hot and dense phase of deconfined quarks and gluons. An important effect in heavy-ion collisions is jet quenching – the suppression of particle production at large transverse momenta (pT) due to energy loss in the QGP. This suppression is quantified by the nuclear-modification factor RAA, the ratio of the particle-production rate in Pb–Pb collisions to that in proton–proton collisions, scaled by the number of binary nucleon–nucleon collisions. A measured nuclear-modification factor of unity would indicate the absence of final-state effects such as jet quenching.
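In symbols, this definition can be written compactly as follows (⟨Ncoll⟩, the average number of binary nucleon–nucleon collisions, is typically estimated from a Glauber model – a standard detail assumed here rather than stated in the text above):

```latex
R_{\mathrm{AA}}(p_{\mathrm{T}}) =
  \frac{\mathrm{d}N/\mathrm{d}p_{\mathrm{T}}\big|_{\mathrm{Pb\text{-}Pb}}}
       {\langle N_{\mathrm{coll}}\rangle \; \mathrm{d}N/\mathrm{d}p_{\mathrm{T}}\big|_{pp}}
```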
Previous measurements of peripheral collisions revealed less suppression than seen in head-on collisions, but RAA remained significantly below unity. This observation indicates the formation of a dense and strongly interacting system – but it also poses a puzzle. In p–Pb collisions, no suppression has been observed, even though the energy densities are similar to those in peripheral Pb–Pb collisions.
The ALICE collaboration has recently put jet quenching to the test experimentally by performing a rigorous measurement of RAA in narrow centrality bins. The results (figure 1, left) show that the trend of a gradual reduction in the suppression of high-pT particle production as one moves from the most central collisions (corresponding to the 0% centrality percentile) to those with a greater impact parameter does not continue above a centrality of 75%. Instead, the data show a dramatically different behaviour: increasingly strong suppression for the most peripheral collisions. The change at 75% centrality shows that the suppression mechanism for peripheral collisions is fundamentally different from that observed in central collisions, where the suppression can be explained by parton energy loss in the QGP.
In a single Pb–Pb collision several nucleons collide. It has recently been suggested that the alignment of each nucleon collision plays an important role: if the nucleons are aligned, a single collision produces more particles, which results in a correlation between particle production at low pT, which is used to determine the centrality, and at high pT, where RAA is measured. The suppression in the peripheral events can be modelled with a simple PYTHIA-based model that does not implement jet-quenching effects, but incorporates the biases originating from the alignment of the nucleons, yielding qualitative agreement above 75% centrality (figure 1, right).
These results demonstrate that with the correct treatment of biases from the parton–parton interactions the observed suppression in Pb–Pb collisions is consistent with results from p–Pb collisions at similar multiplicities – an important new insight into the nuclear modification factor in small systems.
The LHCb collaboration has discovered a new pentaquark particle, dubbed the Pc(4312)+, decaying to a J/ψ and a proton, with a statistical significance of 7.3 standard deviations. The LHCb data, first presented at Rencontres de Moriond in March, also confirm that the Pc(4450)+ structure previously reported by the collaboration in 2015 has now been resolved into two narrow, overlapping peaks, the Pc(4440)+ and Pc(4457)+, with a statistical significance of 5.4 standard deviations compared to the single-peak hypothesis (figure 1). Together, the results offer rich studies of the strong internal dynamics of exotic hadrons.
In the famous 1964 papers that set out the quark model, Murray Gell-Mann and George Zweig mentioned the possibility of adding a quark–antiquark pair to the minimal meson and baryon states qq̅ and qqq, thereby proposing the new configurations qqq̅q̅ and qqqqq̅. Nearly four decades later, the Belle collaboration discovered the surprisingly narrow X(3872) state with a mass very close to the D0D̅*0 threshold, hinting at a tetraquark structure (cc̅uu̅). A decade after that, Belle discovered narrow Zb0,± states just above the BB̅* and B*B̅* thresholds; this was followed by observations of Zc0,± states just above the equivalent charm thresholds by BES-III and Belle. The existence of charged Zb± and Zc± partners makes the exotic nature of these states clear: they cannot be described as charmonium (cc̅) or bottomonium (bb̅) mesons, which are always neutral, but must instead be a combination such as cc̅ud̅. There is also evidence for broad Zc± states from Belle and LHCb, such as the Zc(4430)±.
A major turning point in exotic baryon spectroscopy was achieved by LHCb in July 2015 when, based on an analysis of Run 1 data, the collaboration reported significant pentaquark structures in the J/ψ–p mass distribution in Λb0 → J/ψpK− decays. A narrow Pc(4450)+ and a broad Pc(4380)+ were reported, both with minimal quark content of cc̅uud (CERN Courier September 2015 p5).
The new results use data collected by LHCb in Run 1 and Run 2, providing a Λb0 sample nine times larger than that used in the 2015 paper. The new data reproduce the parameters of the Pc(4450)+ and Pc(4380)+ states when analysed in the same way as before. However, the much larger dataset makes a more fine-grained analysis possible, revealing additional peaking structures in the J/ψ–p invariant-mass spectrum that were not visible before. A new narrow peak, with a width comparable to the mass resolution, is observed near 4312 MeV, right below the Σc+D̅0 threshold. The structure seen before at 4450 MeV has been resolved into two narrower peaks, at 4440 and 4457 MeV; the latter is right at the Σc+D̅*0 threshold.
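To see how striking the threshold proximity is, one can simply add up approximate hadron masses (a back-of-the-envelope sketch; the values below are rounded PDG-style world averages, not inputs from the LHCb analysis):

```python
# Approximate hadron masses in MeV (rounded PDG world averages; illustrative only)
m_sigma_c_plus = 2452.9   # Sigma_c(2455)+
m_Dbar0        = 1864.8   # anti-D0
m_Dbar_star0   = 2006.9   # anti-D*0

# Two-body thresholds relevant to the narrow Pc peaks
thr_SigmaC_Dbar0     = m_sigma_c_plus + m_Dbar0       # ~4317.7 MeV
thr_SigmaC_DbarStar0 = m_sigma_c_plus + m_Dbar_star0  # ~4459.8 MeV

print(f"Sigma_c+ Dbar0  threshold: {thr_SigmaC_Dbar0:.1f} MeV  (Pc(4312) sits just below)")
print(f"Sigma_c+ Dbar*0 threshold: {thr_SigmaC_DbarStar0:.1f} MeV  (Pc(4457) sits right at it)")
```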
These Pc states join a growing family of narrow exotic hadrons with masses near hadron–hadron thresholds. This is expected in certain models of loosely bound “molecular” states, whose structure resembles the way a proton and neutron bind to form a deuteron. Other models, such as tightly bound pentaquarks, could also explain the Pc resonances. A more complete understanding will require further experimental and theoretical investigation.
Searching for the decay μ+ → e+γ is like looking for a needle in a haystack the size of the Great Pyramid of Giza. This simile-stretching endeavour is the task of the MEG II experiment at the Paul Scherrer Institute (PSI) in Villigen, Switzerland. MEG II is an upgrade of the previous MEG experiment, which operated from 2008 to 2013. All experimental data so far are consistent with muon decays that conserve lepton flavour by the production of two appropriately flavoured neutrinos. Were MEG II to observe the neutrinoless decay of the muon to a positron and a photon, it would be the first evidence of flavour violation with charged leptons, and unambiguous evidence for new physics.
Lepton-flavour conservation is a mainstay of every introductory particle-physics course, yet it is merely a so-called accidental symmetry of the Standard Model (SM). Unlike gauge symmetries, it arises because only massless left-handed neutrinos are included in the model. The corresponding mass and interaction terms of the Lagrangian can therefore be simultaneously diagonalised, which means that interactions always conserve lepton flavour. This is not the case in the quark sector, and as a result quark flavour is not conserved in weak interactions. Since lepton flavour is not considered to be a fundamental symmetry, most extensions of the SM predict its violation at a level that could be observed by state-of-the-art experiments.
Indeed, an extension of the SM is already required to accommodate the tiny neutrino masses that we infer from neutrino oscillations. In this extension, neutrino oscillations induce charged lepton-flavour-violating processes, but the branching ratio for μ+ → e+γ turns out to be only about 10⁻⁵⁴, which cannot be accessed experimentally (see “Charged lepton-flavour violation in the SM” box). A data sample of muons as large as the number of protons in the Earth would not be enough to see such an improbable decay. Charged lepton-flavour violation is therefore a clear signature of new physics with no SM backgrounds.
Finding the needle
The search requires an intense source of muons, and detectors capable of reconstructing the kinematics of the muon’s decay products with high precision. PSI offers the world’s most intense continuous muon beams, delivering up to 10⁸ muons per second. MEG II, like MEG before it, is designed to search for μ+ → e+γ by stopping positive muons in a thin target and looking for positron–photon pairs from muon decays at rest. This method exploits the two-body kinematics of the decay to discriminate signal events from the backgrounds, which are predominantly the radiative muon decay μ+ → e+νeν̅μγ and the accidental time coincidence of a positron and a photon produced by different muon decays.
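Those two-body kinematics pin down the signal completely: for a muon decaying at rest, the positron and photon emerge back to back and in time coincidence, each carrying essentially half the muon rest energy. A one-line check using standard particle masses (the source of the 52.8 MeV figure quoted below):

```latex
E_\gamma = \frac{m_\mu^2 - m_e^2}{2\,m_\mu}
         \approx \frac{105.66^2 - 0.511^2}{2 \times 105.66}\ \mathrm{MeV}
         \approx 52.8\ \mathrm{MeV}
```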
In the late 1990s, when the first MEG experiment was being designed, theorists argued that the μ+ → e+γ branching ratio could be as high as 10⁻¹² to 10⁻¹⁴, based on supersymmetry arising at the TeV scale. Twenty years later, MEG has excluded branching ratios above 4.2 × 10⁻¹³ (figure 1), and supersymmetric particles remain undiscovered at the LHC. Nevertheless, since charged lepton-flavour-violating processes are sensitive to the virtual exchange of new particles, without requiring their direct production as at the LHC, they can probe new-physics models (supersymmetry, extra dimensions, leptoquarks, multi-Higgs, etc) up to mass scales of thousands of TeV. Such scales are out of reach not only at the LHC, but also at near-future accelerators.
The MEG collaboration therefore decided to upgrade the detectors with the goal of improving the sensitivity of the experiment by a factor of 10. The new experiment, which adopts the same measurement principle, is expected to start taking data at the end of 2019 (figure 2). Photons are reconstructed by a liquid-xenon (LXe) detector technology that was pioneered by the MEG collaboration, achieving an unprecedented ~2% calorimetric resolution at energies as low as 52.8 MeV – the energy of the photon in a μ+ → e+γ decay. The LXe detector provides a high-resolution measurement of the position and timing of the photon conversion, precise to a few millimetres and approximately 70 ps. The positrons are reconstructed in a magnetic spectrometer instrumented with drift chambers for tracking and scintillator bars for timing. A peculiarity of the MEG spectrometer is a non-uniform magnetic field, diminishing from 1.2 T at the centre of the detector to 0.5 T at the extremities. The graded field prevents positrons from curling too many times, which avoids pileup in the detectors, and makes positrons of the same momentum curl with the same radius independent of their emission angle, simplifying the design and operation of the tracking system.
Following a major overhaul begun in 2011, all the detectors have now been upgraded. Silicon photomultipliers custom-modified for sensitivity to the ultraviolet LXe scintillation light have replaced conventional photomultipliers on the inner face of the calorimeter. Small scintillating tiles have replaced the scintillating bars of the positron-timing detector to improve timing and reduce pileup. The main challenge when upgrading the drift chambers was dealing with high positron rates: the need for high granularity had to be balanced against keeping the total amount of material low, which reduces multiple scattering, the rate of positrons annihilating in the material, and contributions to the coincident-photon background in the calorimeter. The solution was the use of extremely thin 40 and 50 μm silver-plated aluminium wires, 20 μm gold-plated tungsten wires, and innovative assembly techniques. All the detectors’ resolutions were improved by a factor of around two with respect to the MEG experiment. The MEG II design also includes a new detector to veto photons coming from radiative muon decays, improved calibration tools, and new trigger and data-acquisition electronics to cope with the increased number of readout channels. The improved detector performance will allow the muon beam rate to be more than doubled, from 3.3 × 10⁷ to 7 × 10⁷ muons per second.
The detectors were installed and tested in the muon beam in 2018. In 2019 a test of the whole detector will be completed, with the possibility of collecting the first physics data. The experiment is then expected to run for three years, either uncovering evidence for the μ+ → e+γ decay if its branching ratio is around 10⁻¹³, or setting an upper limit of 6 × 10⁻¹⁴.
Charged lepton-flavour violation in the SM – a very small neutrino oscillation experiment
The presence of only massless left-handed neutrinos in the Standard Model (SM) gives rise to the accidental symmetry of lepton-flavour conservation – yet neutrino-oscillation experiments have observed neutrinos changing flavour in transit from sources as far away as the Sun and as near as a nuclear reactor. Such neutral lepton-flavour violation implies that neutrinos have tiny masses and that their flavour eigenstates are distinct from their mass eigenstates. Phases develop between the mass eigenstates as a neutrino travels, and the wavefunction becomes a mixture of the flavour eigenstates, rather than remaining the unique original flavour, as would be the case for truly massless neutrinos.
The effect on charged lepton-flavour violation is subtle and small. In most neutrino oscillation experiments, a neutrino is created in a charged-current interaction and observed in a later interaction via the creation of a charged lepton of the corresponding flavour in the detector.
μ+ → e+γ may proceed in a similar way, but with the same W boson involved in both the creation and destruction of the neutrino, which oscillates in between (see figure above).
In this process, the neutrino oscillation ν̅μ → ν̅e has to occur at an energy scale E ~ mW, over an extremely short distance of L ~ 1/mW. Considering only two neutrino species with masses m₁ and m₂, the probability for the oscillation is proportional to sin²[(m₁² − m₂²)L/4E]. Hence, the μ → eγ branching ratio is suppressed by the tiny factor ((m₁² − m₂²)/mW²)² ≲ 10⁻⁴⁹. The exact calculation, including the most recent estimates of the neutrino mixing-matrix elements, gives BR(μ → eγ) ~ 10⁻⁵⁴.
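Plugging in representative numbers reproduces the quoted order of magnitude (a sketch assuming the two-flavour simplification above and the atmospheric mass-squared splitting measured by oscillation experiments):

```python
# Order-of-magnitude check of the suppression factor in mu -> e gamma.
dm2 = 2.5e-3   # atmospheric mass-squared splitting, eV^2 (representative value)
m_W = 80.4e9   # W boson mass, eV

suppression = (dm2 / m_W**2) ** 2
print(f"((m1^2 - m2^2)/m_W^2)^2 ~ {suppression:.1e}")   # ~1.5e-49
```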
New directions
In the meantime, PSI researchers are investigating the possibility of building new beamlines delivering 10⁹ or even 10¹⁰ muons per second, to allow experimenters to probe even smaller branching ratios. How could a future experiment cope with such high rates? Preliminary studies are investigating a system in which photons are converted into electron–positron pairs and reconstructed in a tracking device. This solution, already exploited by the MEGA experiment at Los Alamos National Laboratory, could also improve the photon resolution.
At the same time, other experiments are searching for charged lepton-flavour violation in other channels. Mu3e, also at PSI, will search for μ+ → e+e+e– decays. The Mu2e and COMET experiments, at Fermilab and J-PARC, respectively, will search for muon-to-electron conversion in the field of a nucleus. These processes are complementary to μ+ → e+γ, allowing alternative scenarios to be probed. Meanwhile, collider experiments such as Belle II and LHCb are working on studies of lepton-flavour violation in tau decays. LHCb researchers are also testing lepton universality, which holds that the weak couplings are the same for each lepton flavour (see The flavour of new physics). As theorists often stress, all these analyses are strongly complementary, both with each other and with direct searches for new particles at the LHC.
Ever since the pioneering work of Conversi, Pancini and Piccioni, muons have played a crucial role in the development of particle physics. When I I Rabi exclaimed “who ordered that?”, he surely did not imagine that 80 years later the lightest unstable elementary particle would still be a focus of cutting-edge research.
Serbia became the 23rd Member State of CERN on 24 March, following receipt of formal notification from UNESCO. Ever since the early days of CERN (the former Yugoslavia was one of the 12 founding Member States in 1954, until its departure in 1961), the Serbian scientific community has made strong contributions to CERN’s projects, including at the Synchrocyclotron, Proton Synchrotron and Super Proton Synchrotron facilities. In the 1980s and 1990s, physicists from Serbia worked on the DELPHI experiment at CERN’s LEP collider. In 2001, CERN and Serbia concluded an International Cooperation Agreement, leading to Serbia’s participation in the ATLAS and CMS experiments at the LHC, in the Worldwide LHC Computing Grid, and in the ACE and NA61 experiments. Serbia’s main involvement with CERN today is in the ATLAS and CMS experiments, in the ISOLDE facility, and in design studies for future particle colliders – FCC and CLIC – both of which are potential new flagship projects at CERN.
Serbia was an Associate Member in the pre-stage to membership from March 2012. As a Member State, Serbia will have voting rights in the CERN Council, while the new status will also enhance the recruitment opportunities for Serbian nationals at CERN and for Serbian industry to bid for CERN contracts. “Investing in scientific research is important for the development of our economy and CERN is one of the most important scientific institutions today,” says Ana Brnabić, Prime Minister of Serbia. “I am immensely proud that Serbia has become a fully-fledged CERN Member State. This will bring new possibilities for our scientists and industry to work in cooperation with CERN and fellow CERN Member States.”
On 8 April, CERN unveiled plans for a major new facility for scientific education and outreach. Aimed at audiences of all ages, the Science Gateway will include exhibition spaces, hands-on scientific experiments for schoolchildren and students, and a large amphitheatre to host science events for experts and non-experts alike. It is intended to satisfy the curiosity of hundreds of thousands of visitors every year and is core to CERN’s mission to educate and engage the public in science.
“We will be able to share with everybody the fascination of exploring and learning how matter and the universe work, the advanced technologies we need to develop in order to build our ambitious instruments and their impact on society, and how science can influence our daily life,” says CERN director-general, Fabiola Gianotti. “I am deeply grateful to the donors for their crucial support in the fulfilment of this beautiful project.”
The overall cost of the Science Gateway, estimated at 79 million Swiss francs, is entirely funded through donations. Almost three quarters of the cost has already been secured, thanks in particular to a contribution of 45 million Swiss francs from Fiat Chrysler Automobiles. Other donors include a private foundation in Geneva and Loterie Romande, which distributes its profits to public-utility projects. CERN is looking for additional donations to cover the full cost of the project.
The Science Gateway will be hosted in iconic buildings with a 7000 m² footprint, linking CERN’s Meyrin site and the Globe of Science and Innovation. It is being designed by renowned architects Renzo Piano Building Workshop and intends to “celebrate the inventiveness and creativity that characterise the world of research and engineering”. Construction is planned to start in 2020 and be completed in 2022.
On 10 April, researchers working on the Event Horizon Telescope – a network of eight radio dishes that creates an Earth-sized interferometer – released the first direct image of a black hole. The landmark result, which shows the radiation emitted by superheated gas orbiting the event horizon of a supermassive black hole in a nearby galaxy, opens a brand-new window on these incredible objects.
Supermassive black holes (SMBHs) are thought to occupy the centre of most galaxies, including our own, with masses of up to billions of solar masses and sizes up to 10 times larger than our solar system. Discovered in the 1960s via radio and optical measurements, their origin, nature and surrounding environments remain important open issues in astrophysics. Spatially resolved images of an SMBH and the potential accretion disk around it form vital input, but producing such images is extremely challenging.
SMBHs are relatively bright at radio wavelengths. However, since the imaging resolution achievable with a telescope scales with the wavelength (which is long in the radio range) and inversely with the telescope diameter, it is difficult to obtain useful images in the radio region. For example, producing an image with the same resolution as the optical Hubble Space Telescope would require a kilometre-wide radio telescope, while obtaining a resolution that would allow an SMBH to be imaged would require a telescope diameter of thousands of kilometres. One way around this is to use interferometry to turn many telescope dishes at different locations into one large telescope. Such an interferometer measures the differences in arrival time of one radio wave at different locations on Earth (induced by the difference in travel path), from which it is possible to reconstruct an image of the sky. This requires not only close coordination between many telescopes around the world, but also very precise timing, vast amounts of collected data and enormous computing power.
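As a rough sketch of the numbers involved (assuming the EHT’s 1.3 mm observing wavelength and a baseline of roughly one Earth diameter), the diffraction limit θ ≈ λ/D gives:

```python
import math

wavelength = 1.3e-3   # EHT observing wavelength in metres (~230 GHz)
baseline   = 1.27e7   # roughly the diameter of the Earth, in metres

theta_rad = wavelength / baseline                     # diffraction limit, ~1e-10 rad
theta_uas = theta_rad * (180 / math.pi) * 3600 * 1e6  # radians -> microarcseconds
print(f"Angular resolution ~ {theta_uas:.0f} microarcseconds")  # ~21 uas
```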
Despite the considerable difficulties, the Event Horizon Telescope project used this technique to produce the first image of an SMBH using an observation time of only tens of minutes. The imaged SMBH lies at the centre of the supergiant elliptical galaxy Messier 87, which is located in the Virgo constellation at a distance of around 50 million light years. Although relatively close in astronomical terms, its very large mass makes its size on the sky comparable to that of the much lighter SMBH in the centre of our galaxy. Furthermore, its accretion rate (brightness) is variable on longer time scales, making it easier to image. The resulting image (above) shows the clear shadow of the black hole in the centre surrounded by an asymmetric ring caused by radio waves that are bent around the SMBH by its strong gravitational field. The asymmetry is likely a result of relativistic beaming of part of the disk of matter which moves towards Earth.
The team compared the image to a range of detailed simulations in which parameters such as the black hole’s mass, spin and orientation were varied, along with the characteristics of the matter around the SMBH – mainly hot electrons and ions – and the properties of the magnetic field. While the image alone does not allow researchers to constrain many of these parameters, combining it with X-ray data taken by the Chandra and NuSTAR telescopes enables a deeper understanding. For example, the combined data constrain the SMBH mass to 6.5 billion solar masses and appear to exclude a non-spinning black hole. Whether the matter orbiting the SMBH rotates in the same direction as the black hole’s spin or opposite to it, as well as details of the environment around it, will require additional studies. Such studies can also potentially exclude alternative interpretations of this object; currently, exotic objects like boson stars, gravastars and wormholes cannot be fully excluded.
The work of the Event Horizon Telescope collaboration, which involves more than 200 researchers worldwide, was published in six consecutive papers in The Astrophysical Journal Letters. While more images at shorter wavelengths are foreseen in the future, the collaboration also points out that much can be learned by combining the data with those from other wavelengths, such as gamma rays. Groundbreaking as this first image is, it is likely only the start of a revolution in our understanding of black holes and, with it, the universe.
A world record for laser-driven wakefield acceleration has been set by a team at the Berkeley Lab Laser Accelerator (BELLA) Center in the US. Physicists used a novel scheme to channel 850 TW laser pulses through a 20 cm-long plasma, allowing electron beams to be accelerated to an energy of 7.8 GeV – almost double the previous record set by the same group in 2014.
Proposed 40 years ago, plasma-wakefield acceleration can produce gradients hundreds of times higher than those achievable with conventional techniques based on radio-frequency cavities. It is often likened to surfing a wave. Relativistic laser pulses with a duration of the order of the plasma period generate large-amplitude electron plasma waves that displace electrons with respect to the background ions, allowing the plasma waves to accelerate charged particles to relativistic energies. Initial work showed that TeV energies could be reached in just a few hundred metres using multiple laser-plasma accelerator stages, each driven by petawatt laser pulses propagating through a plasma with a density of about 10¹⁷ cm⁻³. However, this requires the focused laser pulses to be guided over distances of tens of centimetres. While a capillary discharge is commonly used to create the necessary plasma channel, achieving a sufficiently deep channel at a plasma density of 10¹⁷ cm⁻³ is challenging.
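To see why such a dilute plasma still supports enormous gradients, one can apply the textbook cold wave-breaking estimate E₀ ≈ 96 √(n₀[cm⁻³]) V/m (a rule-of-thumb scaling, not a figure from the BELLA paper):

```python
import math

n0 = 1e17                  # plasma density in cm^-3, as quoted above
E0 = 96 * math.sqrt(n0)    # cold wave-breaking field in V/m (rule of thumb)
print(f"E0 ~ {E0 / 1e9:.0f} GV/m")  # ~30 GV/m, hundreds of times an RF cavity
```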
In the latest BELLA demonstration, the plasma channel produced by the capillary discharge was modified by a nanosecond-long “heater” pulse that confined the focused laser pulses over the 20 cm distance. This allowed for the acceleration of electron beams with quasi-monoenergetic peaks up to 7.8 GeV. “This experiment demonstrates that lasers can be used to accelerate electrons to energies relevant to X-ray free-electron lasers, positron generation, and high-energy collider stages,” says lead author Tony Gonsalves. “However, the beam quality currently available from laser-wakefield accelerators is far from that required by future colliders.”
The quality of the accelerated electron beam is determined by how background plasma electrons are trapped in the accelerating and focusing “bucket” of the plasma wave. Several different methods of initiating electron trapping have been proposed to improve the beam emittance and brightness significantly beyond state-of-the-art particle sources, representing an important area of research. Another challenge, says Gonsalves, is to improve the stability and reproducibility of the accelerated electron beams, which are currently limited by fluctuations in the laser systems caused by air and ground motion.
In addition to laser-driven schemes, particle-driven plasma acceleration holds promise for high-energy physics applications. Experiments using electron-beam drivers are ongoing and planned at various facilities including FLASHForward at DESY and FACET-II at SLAC (CERN Courier January/February 2019 p10). The need for staging multiple plasma accelerators may even be circumvented by using energetic proton beams as drivers. Recent experiments at CERN’s Advanced Wakefield Experiment demonstrated electron acceleration gradients of around 200 MV/m using proton-beam-driven plasma wakefields (CERN Courier October 2018 p7).
Experiments at Berkeley in the next few years will focus on demonstrating the staging of laser-plasma accelerators with multi-GeV energy gains. “The field of plasma wakefield acceleration is picking up speed,” writes Florian Grüner of the University of Hamburg in an accompanying APS Viewpoint article. “If plasma wakefields can have gradients of 1 TV/m, one might imagine that a ‘table-top version of CERN’ is possible.”
The LHCb collaboration has released a much anticipated update on its measurement of RK – a ratio that describes how often a B+ meson decays to a charged kaon and either a μ+μ– or an e+e– pair, and therefore provides a powerful test of lepton universality. The more precise measurement, officially revealed at Rencontres de Moriond on 22 March, suggests that the intriguing current picture of flavour anomalies persists.
Since 2013, several results involving the decay of b quarks have hinted at deviations from lepton universality, a tenet of the Standard Model (SM), though none is individually significant enough to constitute evidence of new physics. LHCb has studied a number of ratios comparing b-decays to different leptons and also sees signs that something is amiss in angular distributions of B→K*μ+μ− decays. Data from BaBar and Belle add further intrigue, though with lower statistical significances.
The latest measurement from LHCb is the first lepton-universality test performed using part of the 13 TeV Run 2 data set (2015–2016) together with the full Run 1 data sample, representing in total an integrated luminosity of 5 fb⁻¹. The blinded analysis was performed in the range 1.1 < q² < 6.0 GeV², where q² is the square of the invariant mass of the μ+μ– or e+e– pair. It found RK = 0.846 +0.060/−0.054 (stat) +0.016/−0.014 (syst), the most precise measurement to date. However, having shifted closer to the Standard Model prediction, the value leaves the overall significance unchanged at about 2.5 standard deviations.
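A back-of-the-envelope cross-check of that significance (adding the statistical and systematic uncertainties in quadrature and comparing with the SM expectation RK ≈ 1, using the upward errors since the measurement sits below the prediction; the published analysis uses a full likelihood, so this is only indicative):

```python
import math

r_k      = 0.846
sigma_up = math.hypot(0.060, 0.016)   # stat and syst upward errors in quadrature
pull     = (1.0 - r_k) / sigma_up
print(f"Pull from the SM prediction: {pull:.1f} sigma")  # ~2.5 sigma
```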
“I cannot tell you if lepton-flavour universality is broken or not, so sorry for this!” said Thibaud Humair of Imperial College London, who presented the result on behalf of the LHCb collaboration. “All LHCb results for RK are below SM expectations. Together with b → sμ+μ− results, RK and RK* constitute an interesting pattern of anomalies, but the significance is still low,” he said.
Humair’s talk generated much discussion, with physicists pressing LHCb on potential sources of uncertainty and other possible explanations, such as a dependence of RK on q². Other experiments also showed new measurements of lepton universality and related tests of the Standard Model, such as ATLAS on the branching ratio of Bs → μ+μ− and an update from Belle on both RD(*) and RK*. The current experimental activity in flavour physics was reflected by several talks at Moriond from theorists.
“It’s not a discovery, but something is going on,” says David Straub of TUM Munich, who had spent the previous 24 hours working flat out to update a global likelihood fit of all parameters relevant to the b anomalies with the new LHCb and Belle results. The fit, which involves 265 observables, showed that b → sl+l– observables such as RK continue to show a “large pull” towards new physics. “The popular ‘U1 leptoquark’ still gives an excellent fit to the data,” says Straub.
Further reduction in the uncertainty on RK can be expected when the data collected by LHCb in 2017 and 2018 are included in a future analysis. Meanwhile, in Japan, the Belle II physics programme has now begun in earnest and the collaboration is expected to bring further statistical power to the b-anomaly question in the near future.
On 26 February, a new solar power plant powering the SESAME light source in Jordan was officially inaugurated. In addition to being the first synchrotron-light facility in the Middle East region, SESAME is now the world’s first major research infrastructure to be fully powered by renewable energy.
The solar power plant is an on-grid photovoltaic system constructed 30 km away, and its 6.48 MW power capacity is ample to satisfy SESAME’s needs for several years. “As in the case of all accelerators, SESAME is in dire need of energy, and as the number of its users increases so will its electricity bill,” says SESAME director Khaled Toukan. “Given the very high cost of electricity in Jordan, with this solar power plant the centre becomes sustainable.”
Energy efficiency and other environmental factors are coming under growing scrutiny at large research infrastructures worldwide. The necessary funding for the SESAME installation became available in late 2016, when the Government of Jordan agreed to allocate JD 5 million (US$7.05 million) from funds provided by the European Union (EU) to support the deployment of clean-energy sources. The power plant, which uses monocrystalline solar panels, was built by the Jordanian company Kawar Energy, and power transmitted to the grid will be credited to SESAME.
SESAME opened its beamlines to users in July 2018. Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Palestine and Turkey are currently members of SESAME, with 16 further countries – plus CERN and the EU – listed as observers.
The Japanese government has put on hold a decision about hosting the International Linear Collider (ILC), to the disappointment of many hoping for clarity ahead of the update of the European strategy for particle physics. At a meeting in Tokyo on 6–7 March, Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) announced, with input from the Science Council of Japan (SCJ), that it has “not yet reached declaration” for hosting the ILC at this time. A statement from MEXT continued: “The ILC project requires further discussion in formal academic decision-making processes such as the SCJ Master Plan, where it has to be clarified whether the ILC project can gain understanding and support from the domestic academic community… MEXT will continue to discuss the ILC project with other governments while having an interest in the ILC project.”
The keenly awaited announcement was made during the 83rd meeting of the International Committee for Future Accelerators (ICFA) at the University of Tokyo. During a press briefing, ICFA chair Geoffrey Taylor emphasised that colliders are long-term projects. “At the last strategy update in 2013 the ILC was seen as an important development in the field, and we were hoping there would be a definite statement from Japan so that it can be incorporated into the current strategy update,” he said. “We don’t have that positive endorsement, so it will proceed at a slower rate than we hoped. ICFA still supports Japan as hosts of the ILC, and we hope it is built here because Japan has been working hard towards it. If not, we can be sure that there will be somewhere else in the world where the project can be taken up.”
The story of the ILC, an electron–positron collider that would serve as a Higgs factory, goes back more than 15 years. In 2012, physicists in Japan submitted a petition to the Japanese government to host the project. A technical design report was published the following year. In 2017, the original ILC design was revised to reduce its centre-of-mass energy by half, shortening it by around a third and reducing its cost by up to 40%.
Meanwhile, MEXT has been weighing up the ILC project in terms of its scientific significance, technical challenges, cost and other factors. In December 2018, the SCJ submitted a critical report to MEXT highlighting perceived issues with the project, including its cost and international organisation. Asked at the March press briefing why the SCJ should now be expected to change its views on the ILC, KEK director-general Masanori Yamauchi responded: “We can show that we already have solutions for the technical challenges pointed out in the latest SCJ report, and we are going to start making a framework for international cost-sharing.”
Writing in LC NewsLine, Lyn Evans, director of the Linear Collider Collaboration (which coordinates planning and research for the ILC and CERN’s Compact Linear Collider, CLIC), remains upbeat: “We did not get the green light we hoped for. Nevertheless, there was a significant step forward with a strong political statement and, for the first time, a declaration of interest in further discussions by a senior member of the executive. We will continue to push hard.”
Japan’s statement has also been widely interpreted as a polite way for the government to say “no” to the ILC. “The reality is that it is naturally difficult for people outside the machinery of any national administration to understand fully how procedures operate, and this is certainly true of the rest of the world with regard to what is truly happening with ILC in Japan,” says Phil Burrows of the University of Oxford, who is spokesperson for the CLIC accelerator collaboration.
A full spectrum of views was expressed at a meeting of the linear-collider community in Lausanne, Switzerland, on 8–9 April, with around 100 people present. “The global community represented at the Lausanne meeting restated the overwhelming physics case for an electron–positron collider to make precision measurements in the Higgs and top-quark sectors, with superb sensitivity to new physics,” says Burrows. “We are in the remarkable situation that we have not one, but two, mature options for doing this: ILC and CLIC. I hope that the European Strategy Update recommendations will reflect this consensus on the physics case, position Europe to play a leading role, and hence ensure that one of these projects proceeds to realisation.”