The CMS collaboration presented 15 new results at the fourth annual Large Hadron Collider Physics (LHCP) conference on 13–18 June in Lund, Sweden. The results included a mixture of searches for new physics and Standard Model measurements at a centre-of-mass energy of 13 TeV. CMS also summarized its detector and physics-object performance on recently collected 2016 data, demonstrating that the collaboration has emerged from the winter shutdown ready for discovery physics.
The search for new physics in 13 TeV proton collisions continues in earnest, with six new results presented at LHCP. A combined search in the 8 and 13 TeV data sets for high-mass resonances decaying to the Zγ final state, with Z bosons decaying to leptons, yields no significant deviation from background expectations for masses ranging from a few hundred GeV to 2 TeV (EXO-16-021). A search in the same channel, but with Z bosons decaying to quarks, reaches the same conclusion (EXO-16-020). CMS has also searched for heavy Z′ bosons that decay preferentially to third-generation fermions, including decays to pairs of top quarks (B2G-15-003) and τ leptons (EXO-16-008), and found no excess above the Standard Model prediction.
The top quark-pair analysis uses special techniques to search the all-hadronic final state, where the highly boosted top quarks are reconstructed as single jets, while the search in the τ lepton channel is carried out in four final states depending on the decay mode. No significant signals are observed in either search, resulting in the exclusion of Z′ bosons up to a mass of 3.3 (3.8) TeV for widths of 10 (30)% relative to the mass in the top search, and up to 2.1 TeV in the τ lepton search. Another search using the τ lepton looks for heavy neutrinos from right-handed W bosons and third-generation scalar leptoquarks in events containing jets and two hadronically decaying τ leptons. This is the first such search for heavy neutrinos using τ leptons, and CMS finds the data well described by Standard Model backgrounds.
CMS continues to probe for possible dark-matter candidates, most recently in final states that contain top quarks (EXO-16-017) or photons (EXO-16-014) plus missing energy. The data are consistent with Standard Model backgrounds and limits are placed on model parameters associated with the dark matter and graviton hypotheses. A search for supersymmetric particles in the lepton-plus-jets final state was also presented for the first time (SUS-16-011). This analysis targets so-called compressed spectra in which weakly interacting supersymmetric particles can have similar masses, giving rise to muons and electrons with very low transverse momentum. No significant signals are observed and limits are placed on the masses of top squarks and gluinos under various assumptions about the mass splittings of the intermediate states.
Finally, a search for a heavy vector-like top quark T decaying to a standard top quark and a Higgs boson (B2G-16-005) was presented for the first time at LHCP. For T masses above 1 TeV, the top quark and Higgs boson are highly boosted, and their decay products are reconstructed using techniques similar to those in B2G-15-003. Here, too, the data are consistent with background expectations, allowing CMS to set limits on the product of the cross-section and branching fraction for T masses in the range 1.0–1.8 TeV.
Several new Standard Model measurements were shown for the first time at LHCP, including the first measurement of the top-quark cross-section at 5 TeV (TOP-16-015), based on data collected during a special proton–proton reference run in 2015 (figure 1). A first measurement by CMS of the WW di-boson cross-section at 13 TeV was also reported (SMP-16-006), where the precision has already reached better than 10%. Finally, three new results on Higgs-boson physics were presented for the first time, including the first search at 13 TeV for vector-boson-fusion Higgs production in the bottom-quark decay channel (HIG-16-003) and a search for Higgs bosons produced in the context of the minimal supersymmetric Standard Model (MSSM) that decay via the τ lepton channel (HIG-16-006). A first look at Higgs lepton-flavour-violating decays in the 13 TeV data (HIG-16-005), using the μτ channel, does not confirm the slight (2.4σ) excess observed in Run 1, although more data are needed for a definitive conclusion.
The decays of the B0s and B0 mesons into muon pairs represent an important test of the Standard Model. Such decays take place through a flavour-changing neutral-current process, which occurs only through loop diagrams and is further suppressed because the two muons are required to have equal helicity in order to conserve angular momentum.
Although the very small values of the predicted branching fractions (3.7 × 10⁻⁹ and 1.1 × 10⁻¹⁰ for the B0s and B0, respectively) open the possibility to search for new physics, the decays present a challenge for experimental programmes. Physicists have been placing upper limits on these processes for more than 30 years, with the values decreasing by roughly two orders of magnitude every decade.
ATLAS recently presented the result of a study based on data collected during LHC Run 1, complementing the results obtained by CMS and LHCb (CERN Courier September 2013 p19). The new analysis exploits multivariate techniques to reduce the background events that could mask the small signal from B-meson decays. A first classifier reduces the background due to muons from uncorrelated decays of B hadrons, while a second reduces the fraction of hadrons wrongly identified as muons. Misidentification contributes to the background from partially reconstructed decays, and is at the origin of the resonant background from B0s decays into pairs of charged mesons when both are mistaken for muons. ATLAS has achieved misidentification probabilities of about 0.10% for a kaon and 0.05% for a pion to be wrongly identified as a muon, pushing the resonant background below the predicted level of the signal.
For the B0s meson, the branching fraction measured by ATLAS is B(B0s → μ+μ−) = (0.9 +1.1 −0.8) × 10⁻⁹, with an upper limit of 3.0 × 10⁻⁹ at a 95% confidence level. The result agrees, within uncertainties, with those of CMS and LHCb. It is lower than the Standard Model prediction but compatible with it at the level of two standard deviations. For the B0, an upper limit B(B0 → μ+μ−) < 4.2 × 10⁻¹⁰ is set at a confidence level of 95%, which again is compatible with previous evidence and predictions.
The new result constrains models for new physics that predict a significant enhancement of these B decays, such as some with an extended Higgs sector. Deviations in the direction of lower branching fractions require further clarification with data collected during LHC Run 2.
The LHCb experiment has recently made the most precise measurement yet of the asymmetry in oscillations between the matter and antimatter versions of Bs mesons. The measurement exploits the full LHCb data set recorded during Run 1 of the LHC and is consistent with the Standard Model prediction.
Subtle quantum-mechanical effects allow the Bs meson, which contains a strange quark and a beauty antiquark, to transform spontaneously into its own antiparticle, the B̄s, in which the quark–antiquark assignment is reversed. Due to quantum interference effects, in the Standard Model this transition occurs at almost exactly the same rate as the reverse process, with the asymmetry between them predicted to be two parts in a hundred thousand. Finding an asymmetry significantly different from this value would suggest that particle–antiparticle oscillations are indirectly affected by the presence of heavy new particles, such as those predicted in new-physics models.
Many of the oscillations can occur within the finite lifetime of the Bs mesons, and an asymmetry would therefore appear as a difference in the numbers of Bs and B̄s meson decays observed by LHCb. Semileptonic decays into a charmed hadron, a muon and a neutrino are particularly well suited, and the LHCb data set contains around two million of them. The challenge is to avoid being fooled by spurious sources of asymmetry due to small imperfections in the detector. Novel methods have been developed to control these sources, based on extensive use of the rich samples of charm and charmonium decays.
The final measured asymmetry is (0.45 ± 0.26 ± 0.20)%, where the first uncertainty is statistical and the second systematic. This is a factor of two more precise than the next-best measurement, from the D0 experiment at Fermilab (see figure). The 13 TeV data that are now being recorded will provide an increased rate of Bs and Bd mesons, which will enable LHCb to probe far smaller asymmetries and enhance its sensitivity to possible new-physics effects.
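For readers who want to check the quoted precision, the two uncertainties combine in quadrature in the usual way (an illustrative sketch, not LHCb analysis code; the significance estimate assumes uncorrelated Gaussian errors):

```python
import math

def combine(stat: float, syst: float) -> float:
    """Total uncertainty: statistical and systematic parts added in quadrature."""
    return math.sqrt(stat**2 + syst**2)

# LHCb semileptonic asymmetry, all values in percent
a_sl, stat, syst = 0.45, 0.26, 0.20
total = combine(stat, syst)
print(f"a_sl = ({a_sl} ± {total:.2f})%")                 # (0.45 ± 0.33)%
print(f"deviation from zero: {a_sl / total:.1f} sigma")  # about 1.4 sigma
```

The roughly 1.4σ deviation from zero asymmetry is why the result counts as consistent with the tiny Standard Model prediction.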
In the early stages of a high-energy collision, high-pT partons can be created, before producing sprays of hadrons that are measured experimentally as jets. Not only do high-pT partons carry information about the parton scattering itself, but they also serve as probes for the environment they cross. In nucleus–nucleus collisions, for instance, high-pT partons probe the strongly interacting medium of quarks and gluons (the quark-gluon plasma, QGP). Due to the interactions of these partons with the QGP, particle production is suppressed at large transverse momentum compared to an incoherent superposition of nucleon–nucleon collisions.
Although this observation is one of the key results in heavy-ion collisions, it is a priori not clear to what extent the suppression is caused by hot nuclear-matter effects, such as the jet–medium interaction, and to what extent by cold nuclear-matter effects, such as the presence of the nucleus itself. Unlike in lead–lead collisions, modifications of jet production due to hot nuclear-matter effects are not expected in proton–lead collisions. Therefore, measurements of the nuclear modification of jet spectra in proton–lead collisions can be used to disentangle cold from hot nuclear-matter effects.
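The "nuclear modification" referred to here is quantified by the standard nuclear modification factor, sketched below with the conventional symbols (⟨N_coll⟩ is the mean number of binary nucleon–nucleon collisions; this definition follows the community convention rather than anything spelled out in the text above):

```latex
R_{p\mathrm{Pb}}(p_{\mathrm{T}}) =
  \frac{1}{\langle N_{\mathrm{coll}} \rangle}\,
  \frac{\mathrm{d}N^{p\mathrm{Pb}}/\mathrm{d}p_{\mathrm{T}}}
       {\mathrm{d}N^{pp}/\mathrm{d}p_{\mathrm{T}}}
```

A value of unity corresponds to an incoherent superposition of nucleon–nucleon collisions, i.e. no nuclear modification.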
ALICE has recently measured charged jet spectra and their nuclear modification in proton–lead collisions for transverse momenta within 20–120 GeV/c. The main corrections include the subtraction of a mean underlying event density and a statistical treatment of within-event fluctuations, as well as an unfolding of the detector response. One of the challenges in analysing proton–lead collisions is to be able to measure the collision geometry (called the event centrality), and also to evaluate the mean number of binary nucleon–nucleon collisions for different centralities. Several methods for centrality determination were tested in ALICE and the least-biased method was used for this measurement.
The measurement produces a clear result: for the probed acceptance, and within the systematic and statistical uncertainties, all nuclear modification factors are compatible with unity. The charged jet spectra measured in proton–lead collisions at an energy of 5.02 TeV do not show any significant centrality dependence and they scale with the jet spectra in proton–proton collisions at the same energy. Therefore, there is no evidence that high transverse momentum jets are modified by the cold nuclear medium, confirming the conclusion drawn from measurements of the nuclear modification factor for single high-transverse momentum hadrons.
The European Extremely Large Telescope (E-ELT) will be the largest optical/near-infrared telescope in the world, boasting a primary mirror 39 m in diameter. Its aim is to measure the properties of the first stars and galaxies and to probe the nature of dark matter and dark energy, in addition to tracking down Earth-like planets.
At a ceremony in Garching bei München, Germany, on 25 May, the European Southern Observatory (ESO) signed a contract with the ACe Consortium for the construction of the dome and telescope structure of the E-ELT. With an approximate value of €400 million it is the largest contract ever awarded by ESO and the largest contract ever in ground-based astronomy. The occasion also saw the unveiling of the construction design of the E-ELT, which is due to enter operation in 2024.
The construction of the E-ELT dome and telescope structure can now commence, taking telescope engineering into new territory. The contract includes not only the enormous 85 m-diameter rotating dome, with a total mass of around 5000 tonnes, but also the telescope mounting and tube structure, with a total moving mass of more than 3000 tonnes. Both of these structures are by far the largest ever built for an optical/infrared telescope and dwarf all existing ones.
The E-ELT is being built on Cerro Armazones, a 3000 m-high peak about 20 km from ESO’s Paranal Observatory. The access road and leveling of the summit have already been completed and work on the dome is expected to start on site in 2017.
New results reported in two papers in Nature from the CLOUD experiment at CERN imply that the pre-industrial climate may have had brighter and more extensive clouds than previously thought, sharpening our understanding of the impact of human activities on climate. CLOUD (Cosmics Leaving Outdoor Droplets) is designed to understand how aerosol particles form and grow in the atmosphere, and the effect this has on clouds and climate. It comprises a 26 m³ vacuum chamber containing atmospheric particles, into which beams of charged pions are fired from the Proton Synchrotron to mimic the seeding of clouds by galactic cosmic rays.
The increase in aerosols and clouds since pre-industrial times is one of the largest sources of uncertainty in climate change, according to the Intergovernmental Panel on Climate Change. The new CLOUD results show that organic vapours emitted by trees produce abundant aerosol particles in the atmosphere in the absence of sulphuric acid. Previously, it was thought that sulphuric acid – which largely arises from burning fossil fuels – was essential to initiate aerosol particle formation. CLOUD finds that oxidized biogenic vapours dominate particle growth in unpolluted environments, starting just after the first few molecules have stuck together and continuing all the way up to sizes above 50–100 nm, where the particles can seed cloud droplets.
The experiment also finds that ions from galactic cosmic rays enhance the production rate of pure biogenic particles by a factor of 10–100 compared with particles without ions, which suggests that cosmic rays played a more important role in aerosol and cloud formation in pre-industrial times than they do in today’s polluted atmosphere.
CLOUD, which has produced a series of high-impact publications following its first results in 2011, is the first experiment to reach the demanding technological performance and ultralow contaminant levels necessary to be able to measure aerosol nucleation and growth under controlled conditions in the laboratory.
CERN’s pioneering AWAKE facility, which aims to drastically reduce the scale of particle accelerators, received its first beam on 16 June. The milestone signals the next stage of commissioning for the novel experiment, which aims to use plasma wakefields driven by a proton beam to accelerate charged particles to high energies over very short distances. The proton beam had to travel around 800 m before entering a 10 m-long plasma cell, which is empty during the current commissioning phase, and then carry on downstream to several detectors. The test was a success, with protons striking the detector straight away, and the team now plans to finalize installation of the experiment, the laser and the full plasma cell. AWAKE hopes to start collecting physics data by the end of the year.
The origin of some of the heaviest chemical elements is due to rapid neutron capture, but the precise location where this cosmic alchemy takes place has been under debate for several decades. While core-collapse supernovae were thought to be the prime production site, a new study suggests that elements heavier than zinc originate from the merger of two neutron stars. Such a dramatic event would have been responsible for the extreme heavy-element enrichment observed in several stars of an ancient dwarf galaxy called Reticulum II.
Nuclear fusion in the core of massive stars produces elements up to and including iron, which is a stable nucleus with the highest binding energy per nucleon. Building heavier nuclei requires energy to compensate for the loss of nuclear binding and is therefore almost impossible to achieve experimentally. But under certain conditions, stars can produce heavier elements by allowing them to capture protons or neutrons.
Neutron capture, which is unaffected by Coulomb repulsion, occurs either slowly (s) or rapidly (r). Slow neutron captures occur at a pace that allows the nucleus to undergo beta decay prior to a new capture, and therefore to grow following the line of nuclear stability. The r-process, on the other hand, causes a nucleus to accumulate many additional neutrons prior to radioactive decay. The relative abundance of certain elements therefore tells researchers whether nucleosynthesis followed an s- or an r-process. The rare-earth element europium is a typical r-process element, as are gold, lead and uranium.
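The s/r distinction is essentially a race between successive neutron captures and beta decay, which can be caricatured in a few lines (the timescales below are illustrative assumptions, not measured values):

```python
def capture_process(tau_capture_s: float, tau_beta_s: float) -> str:
    """Classify neutron capture as slow (s) or rapid (r) by comparing the mean
    time between captures with the beta-decay lifetime of the nucleus."""
    return "s-process" if tau_capture_s > tau_beta_s else "r-process"

# Sparse neutron flux: captures take years, so the nucleus beta-decays first
print(capture_process(tau_capture_s=3.0e7, tau_beta_s=60.0))   # s-process
# Intense neutron bombardment: many captures occur before any decay
print(capture_process(tau_capture_s=1.0e-3, tau_beta_s=60.0))  # r-process
```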
For the r-process to work, nuclei need to be under heavy neutron bombardment in conditions that are only found in dramatic events such as a core-collapse supernova or in mergers of two neutron stars. The supernova hypothesis has long been the most probable candidate for the r-process, whereas other scenarios involving rarer events, such as encounters between a neutron star and a black hole, have only been considered since the 1970s. One way to distinguish between the two hypotheses is to study low-metallicity galaxies in which the enrichment of heavy elements is low. This enables astrophysicists to determine if the enrichment is a continuous process or the result of rare events, which would result in stronger differences from one galaxy to the other.
Alexander Ji from the Massachusetts Institute of Technology, US, and colleagues were lucky to find extreme relative abundances of r-process elements in stars located in the ultra-faint dwarf galaxy Reticulum II. Although nearby and in orbit around the Milky Way, this galaxy was only recently discovered and found to be among the most metal-poor galaxies known. This means that Reticulum II formed all of its stars within about the first three billion years after the Big Bang, and is therefore only enriched in elements heavier than helium by a few generations of stars.
High-resolution spectroscopic measurements of the nine brightest stars in Reticulum II carried out by the team indicate a very strong excess of europium and barium compared with iron in seven of the stars. These abundances exceed by two to three orders of magnitude those in any other ultra-faint dwarf galaxy, suggesting that a single rare event produced these r-process elements. The results also show that this event could have been a neutron-star merger, but not an ordinary core-collapse supernova. Although it is not possible to conclude that the majority of our gold and uranium comes from neutron-star mergers, the study certainly gives more weight to such a hypothesis in the 60-year debate about the origin of r-process elements.
In July 1956, in a brief paper published in Science, a small team based at the Los Alamos National Laboratory in the US presented results from an experiment at a new, powerful fission reactor at the Savannah River Plant, in South Carolina. The work, they wrote, “verifies the neutrino hypothesis suggested by Pauli”. Clyde Cowan, Fred Reines, Kiko Harrison, Herald Kruse and Austin McGuire had demonstrated for the first time that it was possible to detect neutrinos, setting in motion the new field of neutrino physics. The key ingredients were an intense source and a big detector, with more than a touch of ingenuity and patience.
More than two decades previously, in 1930, Wolfgang Pauli had proposed that the “energy crisis” in nuclear beta decay – presented by the continuous energy spectrum of the emitted electron – would be solved if the decaying nucleus also emitted a second, undetected particle. This would allow the energy released to be shared between three objects, including the recoiling nucleus, and so yield electrons with a range of energies, just as observed. The new particle had to be neutral and have a relatively small mass. Pauli called his proposal “a desperate remedy”, in part because he thought that if such a particle did indeed exist, then it “would probably have long ago been seen”.
Nevertheless, Enrico Fermi took the possibility seriously and based his seminal work on beta decay, published in 1934, on a point-contact interaction in which a neutron decays to a proton, electron and (anti)neutrino: n → p e⁻ ν̄. Soon afterwards, Hans Bethe and Rudolf Peierls calculated the cross-section for the inverse reaction in which a neutrino is absorbed, but when they found a value of about 10⁻⁴⁴ cm², the pair concluded that no one would be able to detect neutrinos (Bethe and Peierls 1934). What they did not count on was the discovery of nuclear fission – which on a macroscopic scale produces copious numbers of neutrinos – or the ingenuity of experimentalists and, later, accelerator physicists.
Notoriously, nuclear fission was first applied in the atomic bombs used towards the end of the Second World War. A few years later, in 1951, Fred Reines, a physicist who had worked on the Manhattan Project at Los Alamos, began to think about how to harness the neutrinos produced during tests of atomic bombs to make a direct detection of the elusive particle. He was soon joined in this strange pursuit by Clyde Cowan, a fellow researcher at Los Alamos, after they were stranded together at Kansas Airport, where the conversation turned to the “supreme challenge” of detecting neutrinos.
Reines had an idea to place a detector close to a bomb-test tower and use the timing of the detonation as a “gate” to minimise background. But what kind of detector? He and Cowan decided on the recently developed medium of liquid scintillator, which could both act as a target for the inverse beta-decay reaction ν̄ p → e⁺ n, and detect the emitted positrons via their annihilation to gamma rays. It was an audacious plan, not only in taking advantage of a bomb test but also in scaling up the use of liquid scintillator, which until then had been used only in quantities of about a litre. Reines and Cowan named it “Project Poltergeist”, to reflect the neutrino’s ghostly nature.
Remarkably, the Los Alamos director gave approval for the experiment. However, in late 1952, Cowan and Reines were urged to reconsider the more practical idea of using antineutrinos from a nuclear reactor. The challenge was to work out how to reduce the backgrounds, because the antineutrino flux from a reactor would be thousands of times smaller than that from a nuclear explosion. Reines and Cowan realised that in addition to looking for positron annihilation, they could also detect the neutrons through neutron capture – a process that is delayed for several microseconds, thanks to the neutron’s random walk through a medium prior to interacting with a nucleus. In particular, the addition of cadmium to the detector would increase the likelihood of capture and lead to the emission of gamma rays. The signature for inverse beta decay would then be a delayed coincidence between two sets of gamma rays: one from the positron’s annihilation and the other from the neutron’s capture.
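The delayed-coincidence idea can be sketched as a simple time-window pairing (a modern toy illustration, not the original electronics; the window boundaries are assumed values):

```python
# Toy delayed-coincidence filter: pair a prompt pulse (positron annihilation)
# with a later pulse (neutron capture on cadmium) inside a microsecond window.
def delayed_coincidences(pulses, min_delay_us=1.0, max_delay_us=25.0):
    """pulses: list of (time_us, kind) with kind 'prompt' or 'capture'.
    Returns (t_prompt, t_capture) pairs consistent with inverse beta decay."""
    pairs = []
    for t_p, kind_p in pulses:
        if kind_p != "prompt":
            continue
        for t_c, kind_c in pulses:
            if kind_c == "capture" and min_delay_us <= t_c - t_p <= max_delay_us:
                pairs.append((t_p, t_c))
    return pairs

events = [(0.0, "prompt"), (5.2, "capture"),      # signal-like delayed pair
          (40.0, "capture"),                      # uncorrelated background
          (100.0, "prompt"), (100.2, "capture")]  # too close in time: rejected
print(delayed_coincidences(events))  # [(0.0, 5.2)]
```

Requiring the delay rejects both simultaneous backgrounds and uncorrelated pulses, which is exactly what made the reactor measurement feasible.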
The detector for Project Poltergeist contained 300 litres of liquid scintillator with added cadmium chloride, viewed by 90 photomultiplier tubes, and was set up in 1953 at a new reactor at the Hanford Engineering Works in Washington State. This initial experiment showed a small increase in delayed coincidences when the reactor was operating compared with the situation when it was turned off, but it was set against a cosmic-ray background that was more than 10 times higher than the expected signal rate (Reines and Cowan 1953).
This tantalising result encouraged a still more determined effort, with a new detector design that was basically a sandwich with three layers of liquid scintillator and two layers of water with added cadmium chloride to act as the target (figure 1). Positrons produced in a neutrino interaction would be detected almost immediately via two back-to-back gamma rays in the adjacent scintillator tanks, which would be followed a few microseconds later by another burst of gamma rays in the same two scintillator tanks, this time from neutron capture.
The second experiment ran at the newly completed Savannah River Plant for a total of 1371 hours in 1956 and, when the reactor was on, it recorded nearly three delayed coincidences per hour (Cowan et al. 1956). After completing many checks, on 14 June 1956 Reines and Cowan sent a jubilant telegram to Pauli in Zurich, informing him that they had “definitely detected neutrinos from fission fragments by observing inverse beta decay of protons”. At the time, Pauli was in fact at a meeting at CERN, to where the telegram was forwarded, and he reportedly interrupted the meeting to read out the good news, later celebrating with a case of champagne (Reines 1979).
The move to accelerators
At the time of the neutrino’s discovery, laboratories such as CERN and Brookhaven were on their way to building proton synchrotrons that would have sufficient energy and intensity to form beams of neutrinos via decays of pions and kaons produced when protons strike a suitable target. The muons produced in the decays could be stopped by large amounts of shielding, allowing only neutrinos to penetrate to experiments beyond. At Brookhaven, this led to the discovery at the Alternating Gradient Synchrotron (AGS) in 1962 that the neutrinos produced in association with electrons (as in beta decay) are different from those produced in association with muons (as in pion decay): a second type of neutrino, the muon neutrino, had been discovered.
In 1963, an ingenious way to produce neutrino beams of greater intensity first came into use at the Proton Synchrotron (PS) at CERN, where Simon van der Meer had described his concept of the neutrino horn a couple of years earlier (van der Meer 1961). Because neutrinos are electrically neutral, they cannot be focused into a beam using magnets, so he devised instead a way to focus the parent pions and kaons using magnetic fields set up by currents circulating in a metallic cone-shaped “horn” (CERN Courier June 2011 p24). The device concentrated neutrinos produced as the charged particles decayed in flight into a beam, and because it could focus either positive or negative particles, it produced an almost pure beam of neutrinos (from positive parents) or antineutrinos (negative parents). A second technical innovation at CERN enabled the horn to become a formidable device: the technique of “fast ejection”, devised by Berend Kuiper and Günther Plass, could direct all of the protons from one cycle of the PS onto the target at the mouth of the horn (Kuiper and Plass 1959). By mid-1963, thanks to these innovations, CERN had what was at the time the world’s most intense neutrino beam.
In the 1970s, the combination of the neutrino beam from the PS and Gargamelle – the large bubble chamber built at the Saclay Laboratory by a team led by André Lagarrigue – led to the discovery of weak neutral currents (CERN Courier September 2009 p25), thereby providing crucial experimental support for the unification of the weak and electromagnetic forces. The neutrino experiments with Gargamelle also produced key evidence about the existence of quarks and, in particular, their fractional charges (CERN Courier April 2014 p24). Then, in 1977, the Super Proton Synchrotron (SPS) became the source of neutrino beams at higher energies, and for the next 21 years a series of experiments in CERN’s West Area used neutrinos in experiments covering a broad range of physics, from neutral currents and the quark structure of matter through quantum chromodynamics to neutrino oscillations (CERN Courier December 1998 p28).
Around that time, physicists at Fermilab were closing in on a third neutrino type. The DONUT experiment (Direct Observation of the NU Tau) detected neutrinos produced at the Tevatron, and in 2000, the collaboration announced the discovery of the tau neutrino. Although experiments at CERN’s Large Electron–Positron collider had already established from precise measurements of the Z boson that there are three light neutrino types, the observation of the tau neutrino completed the leptonic sector of the Standard Model.
Ten years later, CERN was again setting records for neutrino beams with the CERN Neutrinos to Gran Sasso (CNGS) project, which directed an intense beam of muon neutrinos (νμ) to two experiments, ICARUS and OPERA, in the Gran Sasso National Laboratory in Italy, about 730 km away. CNGS followed the same principle as CERN’s early record-breaking beam, this time with protons from the SPS. Following first commissioning in 2006 (CERN Courier November 2006 p20), the facility ran for physics from 2008 to the end of 2012 and achieved a maximum beam power of 480 kW – the most powerful at the time. A total of 18.24 × 10¹⁹ protons were delivered on target, and the OPERA experiment detected 19,500 neutrino events, five of which were identified as tau-neutrino (ντ) interactions, thereby firmly establishing the direct observation of νμ → ντ oscillations (CERN Courier July/August 2015 p6).
A bountiful legacy
Since the first glimpses of antineutrino interactions 60 years ago in reactor experiments, experiments have gone on to detect neutrinos and antineutrinos produced in a variety of ways – both in beams created at particle accelerators and also naturally by reactions in the Sun, interactions of cosmic rays in the Earth’s atmosphere and, most recently, astrophysical processes. We now know that neutrinos exist not only in three flavour eigenstates – electron (νe), muon (νμ) and tau (ντ) – but also in different mass eigenstates (ν1, ν2 and ν3) with very small masses, and that they can oscillate from one flavour to another through quantum-mechanical mixing (see “Japan eyes up its future”).
Reactor experiments – in particular Double Chooz in France, the Daya Bay Reactor Neutrino Experiment in China (figure 2) and the Reactor Experiment for Neutrino Oscillation (RENO) in South Korea – are still as relevant now as they were in Cowan and Reines’ day. Modern nuclear power plants produce about 10²⁰ electron antineutrinos (ν̄e) per second and experiments based on the same liquid-scintillator concept continue to provide essential contributions to neutrino physics by looking for the “disappearance” of the ν̄e.
Sixty years after the first detection of the neutrino, and more than 80 years after the particle was tentatively predicted, experiments with neutrinos continue to have a leading role in particle physics. Today, experimentalists around the world are vying to determine precisely the mixing parameters of the neutrino, including the masses. The measurements may prove to hold the answers to some key questions in the field – ensuring that the “supreme challenge” of creating and detecting neutrinos will remain a worthwhile and exciting pursuit for the foreseeable future.
When CERN was founded in 1954, the neutrino was technically still a figment of theorists’ imaginations. Six decades later, neutrinos have become the most studied of all elementary particles. Several new and upgraded neutrino-beam experiments planned in Japan and the US, in addition to the reactor-based JUNO experiment in China, aim to measure vital parameters such as the ordering of the neutrino masses and potential CP-violating effects in the neutrino sector. In support of this effort, CERN is mounting a significant R&D programme called the CERN Neutrino Platform to strengthen European participation in neutrino physics.
CERN has a long tradition in neutrino physics. It was the study of neutrino beams with the Gargamelle detector at CERN in 1973 that provided the first evidence for the weak neutral current, and in the late 1970s, three experiments – BEBC, CDHS and CHARM – used a beam from the SPS to further unveil the neutrino’s identity. A milestone came in 1989, when precise measurements at the Large Electron–Positron Collider showed that there are three, and only three, types of light neutrinos that couple to the Z boson. This was followed by searches for neutrino oscillations at NOMAD (also known as WA96) and CHORUS (WA95) during the 1990s; oscillations were eventually established by the Super-Kamiokande collaboration in Japan and the Sudbury Neutrino Observatory in Canada. More recently, from 2006 to 2012, CERN sent a muon-neutrino beam to the ICARUS and OPERA detectors at the Gran Sasso National Laboratory, 732 km away in Italy. The main goal was to observe the transformation of muon neutrinos into tau neutrinos, which was confirmed by the OPERA collaboration in 2015.
Following the recommendations of the European Strategy for Particle Physics in 2013, CERN inaugurated the neutrino platform at the end of 2014. Its aim is to provide a focal point for Europe’s contributions to global neutrino research by developing and prototyping the next generation of neutrino detectors. So far, around 50 European institutes have signed up as members of the neutrino platform, which sees CERN shift from its traditional role of providing neutrino beams to one where it shares its expertise in detectors, infrastructure and international collaboration.
“The neutrino platform pulls together a community that is scattered across the world and CERN has committed significant resources to support R&D in all aspects of neutrino research,” says project leader Marzio Nessi. Specifically, he explains, CERN is using the organisational model of the LHC to help in developing an international project on US soil and to contribute to neutrino programmes in Japan and elsewhere. “This is precisely what CERN is about,” says Nessi. “The platform provides a structure at CERN to foster active involvement of Europe and CERN in the US and Japanese facilities.”
In December 2014, CERN and the Italian National Institute for Nuclear Physics (INFN) took delivery of the 760 tonne ICARUS detector, formerly located at Gran Sasso. The detector is currently being refurbished by the neutrino platform’s WA104 team and in 2017 it will be shipped to Fermilab in the US to become part of a dedicated short-baseline neutrino (SBN) programme there. This programme was approved following unexpected results from the LSND experiment at Los Alamos National Laboratory in the 1990s, which hinted at the existence of a fourth – possibly “sterile” – type of neutrino. The result was followed up by the MiniBooNE experiment at Fermilab, which also saw deviations – albeit different again – from the expected signal.
ICARUS will be installed just behind the previous MiniBooNE site, some 600 m downstream from the source of the beam at Fermilab’s Booster ring. It will be the farthest of three detectors in the line of the beam, after the Short Baseline Neutrino Detector (SBND, currently under design) and MiniBooNE’s successor MicroBooNE (already operational). All three detectors employ liquid-argon time projection chambers (LAr-TPCs) to study neutrino oscillations in detail. ICARUS comprises two 270 m³ modules filled with liquid argon: when an energetic charged particle passes through the volume it ionises the liquid, and a uniform electric field causes the electrons to drift towards the end plates, where three layers of parallel wire planes oriented at different angles (together with the drift time) allow researchers to reconstruct a 3D image of the event.
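The geometry behind that last step can be sketched in a few lines. In this illustrative example (not ICARUS reconstruction code; the wire angles, drift velocity and function names are assumptions), each wire plane measures one coordinate u = x·cosθ + y·sinθ in the plane of the wires, and the drift time supplies the third coordinate:

```python
import numpy as np

# Illustrative LAr-TPC reconstruction sketch. The drift velocity below is a
# typical value for liquid argon at ~500 V/cm, not an ICARUS specification.
DRIFT_VELOCITY_MM_PER_US = 1.6

def reconstruct_point(u1, u2, theta1, theta2, drift_time_us):
    """Recover a 3D point from two wire-plane coordinates and the drift time.

    Each plane i measures u_i = x*cos(theta_i) + y*sin(theta_i), so two
    planes at different angles fix (x, y); the drift time fixes z.
    """
    a = np.array([[np.cos(theta1), np.sin(theta1)],
                  [np.cos(theta2), np.sin(theta2)]])
    x, y = np.linalg.solve(a, np.array([u1, u2]))
    z = DRIFT_VELOCITY_MM_PER_US * drift_time_us
    return x, y, z
```

A third plane, as in ICARUS, over-constrains the system and helps resolve ambiguities when several hits arrive at the same drift time.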
The refurbishing campaign at CERN concerns many parts of the ICARUS experiment: the photomultipliers, the read-out electronics, the cathode plane and the argon recirculating system. Moreover, it will benefit from European expertise in automatic event reconstruction and the handling of large data sets. Finally, the unique cryostat in which ICARUS will be placed is also being assembled at CERN. “Improving the performances of a detector already successfully operating in the Gran Sasso underground laboratory is extremely challenging in many respects,” says ICARUS technical co-ordinator Claudio Montanari. “Indeed, in order to make it fully functional to operate on surface, many different aspects including data acquisition, background rejection, timing and event reconstruction needed to be rethought.”
Rapid progress made in understanding neutrino oscillations during the past 15 years has also provided a strong case for long-baseline neutrino programmes. A major new international project called DUNE (Deep Underground Neutrino Experiment), expected to begin operations around 2026 as part of Fermilab’s Long Baseline Neutrino Facility (LBNF), will comprise a near detector and a multikiloton far detector. The far detector will consist of four 10 kt active LAr-TPC modules sited in a 1.5 km-deep cavern at the Sanford lab in South Dakota, 1300 km away, to which neutrino beams with unprecedented intensities will be fired through the Earth from Fermilab. While the three experiments in the SBN programme will look for the disappearance of electron and muon neutrinos to search for sterile neutrinos, they will also serve as a stepping stone to the large LAr modules required by LBNF. The LBNF/DUNE experiment will allow not just the neutrino-mass hierarchy to be determined but also CP violation to be sought in the leptonic sector, which could help to explain the matter–antimatter asymmetry of the universe.
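The choice of a 1300 km baseline can be illustrated with the standard two-flavour approximation for the muon-neutrino survival probability (a simplification of the full three-flavour, matter-effect analysis DUNE will actually use; the parameter values below are assumed, typical global-fit numbers):

```python
import math

# Two-flavour vacuum approximation, P(nu_mu -> nu_mu) = 1 - sin^2(2*theta23)
# * sin^2(1.267 * dm^2 [eV^2] * L [km] / E [GeV]). Illustrative only.
SIN2_2THETA23 = 0.99   # assumed atmospheric mixing amplitude
DM2_31_EV2 = 2.5e-3    # assumed atmospheric mass-squared splitting, eV^2

def survival_probability(energy_gev, baseline_km=1300.0):
    phase = 1.267 * DM2_31_EV2 * baseline_km / energy_gev
    return 1.0 - SIN2_2THETA23 * math.sin(phase) ** 2
```

With these inputs the first oscillation maximum over 1300 km falls near 2.6 GeV, where the survival probability dips close to zero – squarely in the energy range the LBNF beam is designed to cover.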
The CERN Neutrino Platform is building two large-scale prototypes – single-phase and double-phase ProtoDUNE modules – to enable LAr detectors to be scaled up to the multikiloton level. The cryostat for such giant detectors is a particular challenge, and led physicists to explore a novel technological solution inspired by the liquified-natural-gas (LNG) shipping industry. CERN is currently collaborating with French firm Gaztransport & Technigaz, which owns the patent for a membrane-type containment system with two cryogenic liners that support and insulate the liquid cargo. Although this containment system has the advantage of being modular, the challenge in a particle-physics setting is that the cryostats not only have to contain the liquid argon but also all of the detectors and read-out electronics.
Global connection
While the single-phase ProtoDUNE detector uses technology that is very similar to that in ICARUS, a second neutrino-platform project called WA105 aims to prototype the new concept of a “dual-phase” LAr time projection chamber (DLAr-TPC), which is being considered for one or more of the DUNE far-detector 10 kt modules. In a DLAr chamber, a region of gaseous argon resides above the usual liquid phase. Ionisation electrons drift up through the detector volume and are accelerated into the gaseous region near the top of the cryostat by a strong electric field. Here, large electron multipliers amplify the signals, while the anode collects the electrons and provides the spatial read-out. “The ProtoDUNE tests foreseen at the CERN Neutrino Platform represent the culmination of more than a decade of R&D towards the feasibility of very large liquid-argon time projection chambers for next-generation long-baseline experiments,” says André Rubbia, co-spokesperson of the DUNE collaboration.
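The appeal of the dual-phase design is that gas-phase gain can compensate for charge lost during long drifts. A back-of-envelope sketch (all numbers are illustrative assumptions, not WA105 or DUNE design values) makes the trade-off concrete:

```python
import math

# Toy model of the dual-phase signal chain: ionisation electrons are lost
# exponentially during the drift (attachment to impurities, characterised by
# an "electron lifetime"), then amplified in the gas phase before collection.
ELECTRON_LIFETIME_MS = 3.0    # assumed argon purity
DRIFT_SPEED_M_PER_MS = 1.6    # assumed drift velocity
LEM_GAIN = 20.0               # assumed effective gain of the electron multipliers

def collected_signal(n_ionisation_electrons, drift_distance_m):
    """Electrons surviving the drift, amplified at the liquid-gas interface."""
    drift_time_ms = drift_distance_m / DRIFT_SPEED_M_PER_MS
    surviving = n_ionisation_electrons * math.exp(-drift_time_ms / ELECTRON_LIFETIME_MS)
    return surviving * LEM_GAIN
```

In a single-phase detector the attenuation factor applies with no gain to offset it, which is why longer drift distances put a premium on the amplification stage.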
ProtoDUNE and WA105 are planned to be ready for test beam by 2018 at a new EHN1 test facility currently under construction in the north area of CERN’s Prévessin site. Most of the civil-engineering work to extend the EHN1 building is complete and all components are under procurement or installation, with staff expected to move in towards the end of the year. The test facility was financed by CERN, with two beamlines due to be commissioned in late 2017.
As ICARUS prepares for its voyage across the Atlantic, and the detectors for the next generation of US neutrino experiments take shape, the CERN Neutrino Platform is also working on components for Japan’s neutrino programme (see “NOvA releases new bounds on neutrino mixing parameters”). The Baby-MIND collaboration aims to construct a muon spectrometer – a state-of-the-art prototype for a would-be Magnetized Iron Neutrino Detector (MIND) – and characterise it in a charged-particle beam at CERN. The system will be assembled at CERN during the winter and tested in May next year, before being shipped to Japan in the summer of 2017. Once there, it will become part of the WAGASCI experiment, where it will contribute to a better understanding of the systematics for the T2K neutrino and antineutrino oscillation analysis. Baby-MIND was approved by the CERN research board in December last year as a Neutrino Platform experiment. “Other projects for the Japanese neutrino programme are also under discussion,” says Baby-MIND spokesperson Alain Blondel of the University of Geneva.
Finally, in June it was decided that the CERN Neutrino Platform will also involve a neutrino-theory working group to strengthen the connections between CERN and the worldwide community and help to promote research in theoretical neutrino physics at CERN. “Fundamental questions in neutrino physics, such as the existence of leptonic CP violation, the Majorana nature of neutrinos and the origin of neutrino masses and mixings, will be at the centre of research activities,” explains group-convener Pilar Hernández. “The answers to these questions could have essential implications in other areas of high-energy physics, from collider physics to indirect searches, as well as in our understanding of the universe.”
The CERN Neutrino Platform offers a unique opportunity to build a strong European neutrino community, with immediate physics potential coming from the short-baseline experiments at Fermilab in the US and the new near detector at T2K in Japan. The platform is also making a major contribution to the infrastructure of Fermilab’s Long-Baseline Neutrino Facility (LBNF), including the design and construction of a large LBNF cryostat to be placed underground at the Sanford Underground Research Facility, new large detector prototypes and generic R&D on new detectors and data handling. CERN and Europe will therefore participate fully in the construction, commissioning and physics exploitation of the new high-intensity facility. In addition to R&D for the LBNF/DUNE cryostat, the neutrino platform currently has five approved participants:
• WA104, ICARUS far detector for Fermilab’s short-baseline programme;
• WA105, the engineering prototype for a double-phase LAr-TPC;
• PLAFOND, a generic R&D framework;
• ProtoDUNE, the engineering prototype for a single-phase LAr-TPC;
• BabyMIND, a muon spectrometer for the WAGASCI experiment.