Since the first ATLAS results from LHC Run 2 were presented at this summer’s conferences (EPS-HEP 2015 and LHCP 2015) with an amount of data corresponding to an integrated luminosity of approximately 80 pb⁻¹, the LHC has continued to ramp up in luminosity. The maximum instantaneous luminosity for 2015 was 5 × 10³³ cm⁻²s⁻¹, which already approaches the Run 1 record of 7 × 10³³ cm⁻²s⁻¹. ATLAS recorded more than 4 fb⁻¹ in 2015, with different physics analyses using from 3.32 to 3.60 fb⁻¹, depending on the parts of the detector required to be fully operational with good data quality.
The main goal of the early measurements presented this summer was to study in detail the performance of the detector, to characterise the main Standard Model processes at 13 TeV, and to perform the first searches for phenomena beyond the Standard Model at Run 2. These early searches focused on processes such as high-mass quantum and rotating black-hole production in dijet, multijet and lepton-jet event topologies, for which the higher centre-of-mass energy provided an immediate improvement in sensitivity beyond the reach of the Run 1 data.
The recently completed 2015 data set corresponds to more than 30 times the data available in the summer. With these data, the full programme of measurements and searches at Run 2 has started, and the first results were presented by the collaboration at a joint ATLAS and CMS seminar on 15 December 2015 during CERN Council week.
These new results benefitted from the first calibration of electron, muon and jet reconstruction and trigger algorithms, in situ using the data. The new insertable B layer of pixel detectors significantly improves the precision of the track measurements near the interaction region and is therefore crucial for tagging jets containing heavy quarks.
First measurements include the ZZ cross-section and single-top-quark production in the Wt channel at 13 TeV. Top-quark pair production has also been investigated in measurements where the top-quark pair is produced in association with additional jets. These measurements are crucial to provide further checks of the modelling implemented in state-of-the-art generators used to simulate these processes at NLO QCD precision. They can also subsequently be used to further constrain physics beyond the Standard Model that would alter these production modes.
The new data also allowed the first measurements of the Higgs boson production cross-section at 13 TeV, inclusively in the diphoton and ZZ decay channels.
With the increased centre-of-mass energy, and the availability of significantly more data than in the summer, new-particle search results were awaited with much anticipation. A large number of searches for new phenomena motivated by theories beyond the Standard Model were completed, in dijet, multijet, photon–jet, diphoton, dilepton, single-lepton and missing-transverse-energy channels. Searches for vector-boson-pair (VV) and Higgs-plus-vector-boson (VH) topologies with boosted jets have also been completed. Searches for strongly produced supersymmetry (SUSY) using signatures with zero or one lepton, or a Z boson, together with jets and missing transverse energy, as well as topologies with b-jets, have improved sensitivity compared with Run 1. Finally, searches for Higgs bosons from extended electroweak symmetry-breaking sectors have been performed in final states with a pair of tau leptons, and in pairs of vector bosons.
So far, no definitive sign of new physics has been observed in the data, although two excesses have attracted attention. The first, with a significance of 2.2 standard deviations, was seen in the search for SUSY with gluino production followed by decays into a Z boson and missing energy; a 3 standard-deviation excess was observed in this channel in Run 1. The second was seen in the search for diphoton resonances, where a peak appears at 750 GeV with a local significance of 3.6 standard deviations, corresponding to a global significance of 2.0 standard deviations. More data will be needed to probe the nature of these excesses.
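The gap between the local and global significance quoted for the diphoton excess reflects the look-elsewhere effect: many mass hypotheses are scanned, so a fluctuation somewhere in the spectrum is more probable than one at a single, pre-specified mass. A minimal illustration of the conversion, assuming a simple trials-factor approximation (the number of effective trials below is an illustrative assumption, not the value used by ATLAS):

```python
# Illustrative look-elsewhere calculation (not the ATLAS procedure):
# convert a local significance to a global one for an assumed number
# of effectively independent mass hypotheses n_trials.
from scipy.stats import norm

z_local = 3.6                      # local significance (standard deviations)
p_local = norm.sf(z_local)         # one-sided local p-value
n_trials = 150                     # assumed effective number of trials (illustrative)

p_global = 1.0 - (1.0 - p_local) ** n_trials   # probability of a fluctuation anywhere
z_global = norm.isf(p_global)                  # corresponding global significance

print(f"local p = {p_local:.2e}, global p = {p_global:.3f}, "
      f"global significance = {z_global:.1f} sigma")
# With ~150 effective trials, a 3.6 sigma local excess corresponds to
# roughly 2 sigma globally, in line with the quoted numbers.
```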
Limits on a large variety of theories beyond the Standard Model have been derived. The ATLAS experiment is completing its measurements and search programme on the data collected in 2015, and is preparing for the data to come in 2016.
The first phase of collisions after the LHC restart earlier this year provided CMS with data at the novel energy of 13 TeV, enabling CMS to explore uncharted domains of physics. At the end of this exciting year, CMS and ATLAS presented comprehensive overviews of their latest results from analyses performed on the collected data. Here we highlight only a few of the key CMS results – refer to the further reading (below) for more.
Before exploring the “unknown”, CMS first strove to rediscover the “known”, as a means to validate the excellent performance of the detector after emerging from the consolidation and upgrade period of Long Shutdown 1. Convincing performance studies as well as early measurements had already been presented at this year’s summer conferences. Meanwhile, the studies and physics measurements continued as the size of the data sample increased over the course of the autumn. In total, CMS approved 33 new public results for the end-of-year jamboree, capping off a successful period of commissioning, data collection and analysis. In contrast to the studies performed for other Standard Model particles, CMS preferred to remain blinded for studies involving the LHC’s most famous particle, the Higgs boson discovered in 2012, because the collected data sample was not large enough for a Higgs boson signal to be detectable.
However, it was the anticipation of results on searches for new phenomena that filled CERN’s main auditorium beyond capacity. The CMS focus was on searches that would already be sensitive to new physics with the small data sample collected in 2015. Hadron jets play a crucial role in searches for exotic particles such as excited quarks, whose observation would demonstrate that quarks are not elementary particles but rather composite objects, and for heavier cousins of the W boson. These new particles would reveal their presence by transforming into two particle jets (a “dijet”). The highest-mass dijet event observed by CMS is shown in the figure. In this study, CMS searched for bumps in the mass distribution of the dijet system. Seeing no significant excess over the background, a new CMS publication based on the 13 TeV data imposes limits on the masses of these hypothetical particles ranging from 2.6 TeV to 7 TeV, depending on the new-physics model.
CMS also searched for the presence of heavy particles such as a Z′ (Z-prime) boson in the dilepton spectrum, in which unstable exotic particles would transform into pairs of electrons or muons. While CMS observed high-mass events, with dielectrons up to a mass of 2.9 TeV and dimuons up to 2.4 TeV, the data are compatible with the Standard Model and do not provide evidence for new physics.
Finally, CMS observed a slight excess in events with two photons at a diphoton mass around 760 GeV. However, small fluctuations such as this have been observed regularly in the past, including at LHC Run 1, and often disappear as more data is collected. Therefore we are still far from the threshold associated with a new discovery, but the stage is set for great excitement and anticipation in the upcoming 2016 run of the LHC.
Beyond its rich programme in flavour physics based on proton–proton collisions, LHCb opened the door in 2015 to a new domain of physics exploration related to cosmic-ray and heavy-ion physics. Thanks to its forward coverage, the detector has access to a unique kinematic range in colliding-beam physics. In addition, using a system developed for precise luminosity measurements based on the beam-gas imaging method, neon, helium and argon gases have been injected during some periods into the interaction region to exploit the LHC proton and ion beams for fixed-target physics at the highest available energies.
The measurement of proton–helium collisions has been motivated by recent results from AMS and other space detectors, which suggest that the antiproton yield in cosmic rays may exceed the expected value from secondary production in the interstellar medium. The accuracy of such predictions is limited by the poor knowledge of the proton–helium cross-section for proton energies at the TeV scale. By measuring proton–helium collisions, LHCb mimics the conditions for secondary production, and has the potential to help in the interpretation of these exciting results.
In proton–argon collisions, a nucleon–nucleon centre-of-mass energy of 110 GeV is generated, which lies between the energies achieved in experiments at the SPS in the 1980s and 1990s and those probed at RHIC more recently. While the produced energy densities are too low to create quark–gluon plasma (QGP), they allow the study of cold-nuclear-matter (CNM) effects, an understanding of which is crucial for establishing QGP formation.
During the last weeks of the LHC physics programme of 2015, the LHCb collaboration also participated in the heavy-ion run, taking data both in fixed-target mode, recording lead–argon collisions at a centre-of-mass energy of 69 GeV, and in colliding-beam mode, collecting lead–lead collisions at 5 TeV. In both modes, the energy densities are large enough to create a QGP; however, lead–argon collisions have lower multiplicities than lead–lead collisions, and are therefore easier to analyse. The experiment is able to reconstruct lead–lead collisions up to a centrality of about 50%. In fixed-target mode, the rapidity coverage of the LHCb detector in the nucleon–nucleon centre-of-mass frame is about –3 < y < 1; in colliding-beam mode, the range 2 < y < 5 is covered. The experiment has precise tracking, vertexing and calorimetry, and powerful particle identification, over the full detector acceptance.
Comparison of collisions in the various configurations allows QGP effects to be disentangled from CNM effects. The various beam configurations are summarised in the diagram.
The focus of LHCb measurements will be, on the one hand, on hard probes such as open heavy-flavour states and quarkonia, measurements of which can be carried out down to very low pT. On the other hand, open questions in the soft sector of QCD, which cannot be treated perturbatively, can be addressed. LHCb is looking forward to exciting measurements in a variety of beam configurations in the years ahead.
After the restart of the LHC physics programme in June 2015 with world-record proton–proton collisions at √s = 13 TeV, nuclear beams reappeared in the LHC tunnel in November 2015 with subsequent first collisions between ²⁰⁸Pb ions. With unprecedented centre-of-mass energy values of 5.02 TeV in the nucleon–nucleon system, collection of these data marks the beginning of a new chapter in the precision study of properties of hot and dense hadronic matter, and the quest to understand QCD confinement.
The inclusive production of charged hadrons in high-energy nucleus–nucleus reactions is a key observable for characterising the global properties of the collision, in particular when the collision energy increases significantly (here by almost a factor of two with respect to LHC Run 1). Particle production at collider energies originates from the interplay of perturbative (hard) and non-perturbative (soft) QCD processes. Soft scattering processes and parton hadronisation dominate the bulk of particle production at low transverse momenta, and can only be modelled phenomenologically. On the other hand, with an increase in collision energy, the role of hard processes – parton scatterings with large momentum transfer – increases. Such measurements, which contribute essential information to estimate the initial energy density leading to the formation and evolution of the quark–gluon plasma and its relation to the collision geometry, also provide valuable insight into the initial-state partonic structure of the colliding nuclei.
The ALICE experiment has measured the centrality dependence of the inclusive charged-particle density (dNch/dη) at mid-rapidity (|η| < 0.5) in Pb–Pb collisions at √sNN = 5.02 TeV. For an event sample corresponding to the most central 5% of the hadronic cross-section, the pseudorapidity density of primary charged particles at mid-rapidity is 1943 ± 54, which corresponds to 10.2 ± 0.3 per participating nucleon pair. This represents an increase of a factor of about 2.4 relative to p–Pb collisions at the same collision energy, and of about 1.2 relative to central Pb–Pb collisions at 2.76 TeV. Previous measurements were performed by ALICE, ATLAS and CMS at the LHC at √sNN = 2.76 TeV, and also at lower energies, in the range √sNN = 17–200 GeV, by SPS and RHIC experiments. The figure shows a compilation of results on mid-rapidity charged-particle density for the most central nucleus–nucleus collisions and for elementary proton–proton and proton(deuteron)–nucleus collisions. Particle production in nucleus–nucleus collisions increases more rapidly with the centre-of-mass energy (per nucleon pair) than in proton–proton and proton(deuteron)–nucleus collisions, in agreement with expectations from the power-law extrapolation of lower-energy results. The characteristics of the centrality dependence of dNch/dη and the comparison with several phenomenological models are reported in a recent publication by the ALICE collaboration.
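As a back-of-the-envelope check, the per-participant-pair value quoted above simply follows from dividing the measured density by half the number of participating nucleons; inverting it gives the implied ⟨Npart⟩ for the 0–5% class (a sketch only – the Glauber-model value mentioned in the comment is an assumption, not a number quoted here):

```python
# Relation used above: (dNch/deta) / (<Npart>/2) = charged particles per participant pair.
dn_deta = 1943.0          # measured dNch/deta at mid-rapidity, 0-5% central Pb-Pb at 5.02 TeV
per_pair = 10.2           # quoted charged particles per participating nucleon pair

n_part = 2.0 * dn_deta / per_pair
print(f"implied <Npart> ~ {n_part:.0f}")   # ~381, of the order of a typical Glauber-model
                                           # estimate for the 0-5% class (assumed, for illustration)
```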
MicroBooNE, an experiment designed to measure neutrinos and antineutrinos generated by Fermilab’s Booster accelerator (CERN Courier September 2014 p8), has recorded its first neutrino events. MicroBooNE is the first of three neutrino detectors of the lab’s new short-baseline neutrino (SBN) programme, recommended by the 2014 report of the US Particle Physics Project Prioritization Panel (P5). The SBN project will comprise the ICARUS detector (currently being refurbished at CERN) as the far detector, MicroBooNE as the intermediate detector and SBND as the near detector.
Designed to search for sterile neutrinos and other new physics phenomena in low-energy neutrino oscillations, the SBN programme aims to confirm or refute the hints of a fourth type of neutrino first reported by the LSND collaboration at Los Alamos National Laboratory, and resolve the origin of a mysterious low-energy excess of particle events seen by the MiniBooNE experiment, which used the same short-baseline neutrino beam line at Fermilab.
MicroBooNE uses a 10.4 m-long liquid-argon time-projection chamber (TPC) filled with 170 tonnes of liquid argon. The TPC probes neutrino oscillations by reconstructing particle tracks as finely detailed 3D images. When a neutrino hits the nucleus of an argon atom, its collision creates a spray of subatomic particles. Tracking and identifying those particles allows scientists to reveal the type and properties of the neutrino that produced them.
The MicroBooNE time-projection chamber is the largest ever built in the US and is equipped with 8256 delicate gold-plated wires. The three layers of wires capture pictures of particle interactions at different points in space and time. The superb resolution of the time-projection chamber will allow scientists to check whether the excess of MiniBooNE events – recorded with a Cherenkov detector filled with mineral oil – is due to photons or electrons.
MicroBooNE will collect data for several years, and computers will sift through thousands of neutrino interactions recorded every day. It will be the first liquid-argon detector to measure neutrino interactions from a neutrino beam with particle energies of less than 800 MeV.
Construction is under way for the two buildings that will house the other detectors of the SBN programme: the new 260 tonne Short-Baseline Near Detector (110 m from the neutrino production target) and the 760 tonne ICARUS detector (600 m) that took data at the Gran Sasso National Laboratory in Italy from 2009 to 2012. Like MicroBooNE (470 m from the target), they are both liquid-argon TPCs.
The MicroBooNE collaboration comprises 138 scientists from 28 institutions, while more than 200 scientists from 45 institutions are collaborating on the SBN programme. The experience and knowledge they will gain is relevant for the forthcoming Deep Underground Neutrino Experiment (DUNE), which will use four 10,000 tonne liquid-argon TPCs to examine neutrino oscillations over a much longer distance (1300 km) and a much higher and broader energy range (0.5–10 GeV).
The observed spectrum of cosmic rays has several puzzling features. For instance, the Alpha Magnetic Spectrometer (AMS) measured an excess of positrons and antiprotons at around 100 GeV. While it is tantalising to interpret such discrepancies as dark-matter signatures, a new study shows that they could simply be due to the injection of cosmic rays by a nearby supernova that exploded two million years ago.
The composition and spectral properties of cosmic rays are studied with unprecedented accuracy by the AMS-02 experiment (CERN Courier October 2013 p23). Mounted on the International Space Station since May 2011, it has already detected and characterised close to 100 thousand million cosmic rays from outer space. This wealth of data constrains extremely precisely the spectral properties of electrons, positrons, protons and antiprotons, as well as of helium and heavier nuclei.
Rather than disproving earlier results by the Payload for Antimatter Matter Exploration and Light nuclei Astrophysics (PAMELA) (CERN Courier September 2011 p34), the AMS measurements confirm several cosmic-ray “anomalies”. Indeed, the observations do not follow the expected trend for galactic cosmic rays. The main differences are a softer spectral slope of protons as compared with heavier nuclei in the TeV–PeV energy range, and an excess of positrons and antiprotons above ~30 GeV.
According to a small group of astrophysicists, these puzzling discrepancies could simply be due to an additional source of cosmic rays from a nearby supernova. Led by Michael Kachelrieß from the Norwegian University of Science and Technology in Trondheim, the team demonstrates in a recent paper that this is a valid explanation. The researchers obtain a good match with the observations for a source injecting ~10⁴³ J in cosmic rays with energies up to at least 30 TeV. They derive this by using a code that follows the trajectories of individual cosmic rays through the galactic magnetic field (GMF) after an instantaneous injection.
They further find that the supernova should have been roughly aligned with the local GMF direction at a distance of several hundred light-years, and must have exploded about two million years ago. A rough estimate based on the size of the Milky Way and the rate of about two supernovas per century (CERN Courier January/February 2006 p10) shows that there is a good chance that one has indeed exploded at the inferred distance from the Sun during the last few million years. Interestingly, independent evidence for such a nearby supernova explosion had already been derived from a deposition of the iron isotope ⁶⁰Fe in a million-year-old layer of the ocean crust.
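That “good chance” can be illustrated with a crude rate estimate: assuming supernovas occur at about two per century, spread uniformly over a thin Galactic disc, the expected number within roughly a hundred parsecs of the Sun over two million years is of order one (all inputs below are rough, illustrative assumptions):

```python
import math

# Crude estimate of the expected number of supernovae near the Sun.
# All inputs are rough, illustrative assumptions.
rate_per_year = 2.0 / 100.0        # ~2 Galactic supernovae per century
disc_radius_pc = 15000.0           # assumed thin-disc radius (~15 kpc)
nearby_radius_pc = 100.0           # "several hundred light-years" ~ 100 pc
time_span_yr = 2.0e6               # the last two million years

# Fraction of the disc area lying within the nearby radius (thin-disc approximation)
area_fraction = (nearby_radius_pc / disc_radius_pc) ** 2
expected = rate_per_year * time_span_yr * area_fraction
prob_at_least_one = 1.0 - math.exp(-expected)   # Poisson probability

print(f"expected nearby supernovae: {expected:.1f}")
print(f"probability of at least one: {prob_at_least_one:.0%}")
# ~1.8 expected events, i.e. a probability of order 80% - indeed "a good chance".
```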
The authors suggest further possible tests of the nearby-supernova interpretation, such as measuring the beryllium isotopic ratio ¹⁰Be/⁹Be, because ¹⁰Be has a radioactive lifetime of about one million years. If the idea withstands further data from AMS-02, it would dash hopes that the cosmic-ray “anomalies” are a signature of something more fundamental, such as dark-matter annihilation or decay.
Since the establishment of the first hospital-based proton-treatment centres in the 1990s, hadrontherapy has continued to progress in Europe and worldwide. In particular, during the last decade there has been exponential growth in the number of facilities, accompanied by a rapid increase in the number of patients treated, an expanded list of medical indications, and increasing interest in other types of ions, especially carbon. Harnessing the full potential of hadrontherapy requires the expertise and ability of physicists, physicians, radiobiologists, engineers, and information-technology experts, as well as collaboration between academic, research and industrial partners. Thirteen years ago, the necessity to catalyse efforts and co-operation among these disciplines led to the establishment of the European Network for Light Ion Hadrontherapy (ENLIGHT). Its recent annual meeting, held in Cracow in September, offered a comprehensive overview of the current status and challenges of hadrontherapy, as well as stimulating discussion on the future organisation of the community.
Networking is key
ENLIGHT was launched in 2002 (CERN Courier May 2002 p29) with an ambitious, visionary and multifaceted plan to steer European research efforts in using ion beams for radiation therapy. ENLIGHT was envisaged not only as a common multidisciplinary platform, where participants could share knowledge and best practice, but also as a provider of training and education, and as an instrument to lobby for funding in critical research and innovation areas. Over the years, the network has evolved, adapting its structure and goals to emerging scientific needs (CERN Courier June 2006 p27).
ICTR-PHE
The third edition of the International Conference on Translational Research in Radio-Oncology | Physics for Health in Europe will be held in Geneva from 15 to 19 February. This unique conference gathers scientists from a variety of fields, including detector physicists, radiochemists, nuclear-medicine physicians and other physicists, biologists, software developers, accelerator experts and oncologists.
ICTR-PHE is a biennial event, co-organised by CERN, whose main aim is to foster multidisciplinary research by positioning itself at the intersection of physics, medicine and biology. At ICTR-PHE, physicists, engineers and computer scientists share their knowledge and technologies, while doctors and biologists present their needs and vision for the medical tools of the future, thereby triggering breakthrough ideas and technological developments in specific areas.
The high standards set by the ICTR-PHE conferences have attracted not only an impressive scientific community, but also ever-increasing interest and participation from industry. ICTR-PHE 2016 is also an opportunity for companies to exhibit their products and services at the technical exhibition included in the programme.
The annual ENLIGHT meeting has always played a defining role in this evolutionary process. This year, new and long-time members were challenged to an open discussion on the future of the network, after a day and a half of inspiring talks on various aspects of hadrontherapy.
Challenges ahead
Emerging topics in all forms of radiation therapy are the collection, transfer and sharing of medical data, and the implementation of big-data analytics tools to inspect them. These tools will be crucial in implementing decision-support systems, allowing treatment to be tailored to each individual patient. The flow of information in healthcare, and in particular in radiation therapy, is overwhelming not only in terms of data volume but also in terms of the diversity of data types involved. Indeed, experts need to analyse patient and tumour data, as well as complex physical dose arrays, and to correlate these with clinical outcomes that also have genetic determinants.
Hadrontherapy faces a dilemma when it comes to designing clinical trials. From a clinical standpoint, the ever-increasing number of hadrontherapy patients would allow randomised trials to be performed – that is, systematic clinical studies in which patients are treated with comparative methods to determine which is the most effective curative protocol.
However, several considerations add layers of complexity to the clinical-trials landscape: the need to compare standard photon radiotherapy not only with protons but also with carbon ions; the positive results of hadrontherapy treatments for main indications; and the non-negligible fact that most of the patients who contact a hadrontherapy centre are well informed about the technique, and will not accept being treated with conventional radiotherapy. Nevertheless, progress on clinical trials is being made. At the ENLIGHT meeting in Cracow, the two dual-ion (proton and carbon) centres in Europe – HIT, in Heidelberg (Germany) and CNAO, in Pavia (Italy) – presented patient numbers and dose-distribution studies carried out at their facilities. The data were collected mainly in cohort studies carried out within a single institution, and the results often highlighted the need for larger statistics and a unified database. More data from patients treated with carbon ions will soon become available, with the opening in 2016 of the MedAustron hadrontherapy centre in Wiener Neustadt (Austria). Clinical trials are also a major focus outside of Europe: in the US, several randomised and non-randomised trials have been set up to compare protons with photons, and to investigate either the survival improvement (for glioblastoma, non-small cell lung cancer, hepatocellular carcinoma, and oesophageal cancer) or the decrease of adverse effects (low-grade glioma, oropharyngeal cancer, nasopharyngeal cancer, prostate cancer and post-mastectomy radiotherapy in breast cancer). Recently, the National Cancer Institute in the US funded a trial comparing conventional radiation therapy and carbon ions for pancreatic cancer.
Besides clinical trials, personalised treatments are holding centre stage in the scientific debate on hadrontherapy. Technology is not dormant: developments are crucial to reduce the costs, to provide treatments tailored to each specific case, and to reach the necessary level of sophistication in beam delivery to treat complex cases such as tumours inside, or close to, moving organs. In this context, imaging is key. Today, it is becoming obvious that the optimal imaging tool will necessarily have to combine different imaging modalities, for example PET and prompt photons. PET is of course a mainstay for dose imaging, but a well-known issue in its application to in-beam real-time monitoring for hadrontherapy comes from having to allow room for the beam nozzle: partial-ring PET scanners cannot provide full angular sampling, therefore introducing artefacts in the reconstructed images. The time-of-flight (TOF) technique is often used to improve the image-reconstruction process. An innovative concept, called a J-PET scanner, detects back-to-back photons in plastic scintillators, and applies compressive sensing theory to obtain a better signal normalisation, and therefore improve the TOF resolution.
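To see why the timing resolution matters, the coincidence timing uncertainty Δt translates directly into a position uncertainty along the line of response, Δx = cΔt/2. A minimal sketch with illustrative timing values (not J-PET specifications):

```python
# Position uncertainty along a PET line of response from the coincidence
# timing resolution: delta_x = c * delta_t / 2.
C = 2.998e8  # speed of light, m/s

for delta_t_ps in (600.0, 300.0, 100.0):     # illustrative timing resolutions
    delta_x_cm = C * (delta_t_ps * 1e-12) / 2.0 * 100.0
    print(f"{delta_t_ps:>5.0f} ps  ->  {delta_x_cm:4.1f} cm along the line of response")
# 600 ps -> ~9 cm, 300 ps -> ~4.5 cm, 100 ps -> ~1.5 cm: better timing localises the
# annihilation point more tightly and so improves the reconstructed image.
```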
A subject of broad and current interest within the hadrontherapy community is radiobiology. There has been great progress in the comprehension of molecular tumour response to irradiation with both ions and photons, and of the biological consequences of the complex, less repairable DNA damage caused specifically by ions. Understanding the cell signalling mechanisms affected by hadrontherapy will lead to improvements in therapeutic efficacy. A particularly thorny issue is the relative biological effectiveness (RBE) of protons and carbon with respect to photons. More extensive and systematic radiobiology studies with different ions, under standardised dosimetry and laboratory conditions, are needed to clarify this and other open issues: these could be carried out at existing and future beamlines at HIT, CNAO and MedAustron, as well as at the proposed CERN OpenMED facility.
The future of ENLIGHT
Since the annual meeting in Summer 2014, the ENLIGHT community has started to discuss the future of the network, both in terms of structure and scientific priorities. It is clear that the focus of R&D for hadrontherapy has shifted since the birth of ENLIGHT, if only for the simple reason that the number of clinical centres (in particular for protons) has dramatically increased. Also, while technology developments are still needed to ensure optimal and more cost-effective treatment, proton therapy is now solidly in the hands of industry. The advent of single-room facilities will bring proton therapy, albeit with some restrictions, to smaller hospitals and clinical centres.
From a clinical standpoint, the major challenge for ENLIGHT in the coming years will be to catalyse collaborative efforts in defining a road map for randomised trials and in studying the issue of RBE in detail. Concerning technology developments, efforts will continue on quality assurance through imaging and on the design of compact accelerators and gantries for ions heavier than protons. Information technologies will take centre stage, because data sharing, data analytics, and decision support systems will be key topics.
Training will be a major focus in the coming years, as the growing number of facilities requires more and more trained personnel: the aim will be to train professionals who are highly skilled in their speciality but at the same time familiar with the multidisciplinary aspects of hadrontherapy.
Over the years, the ENLIGHT community has shown a remarkable ability to reinvent itself, while maintaining its cornerstones of multidisciplinarity, integration, openness, and attention to future generations. The new list of priorities will allow the network to tackle the latest challenges of a frontier discipline such as hadrontherapy in the most effective way.
In recent years, evidence for the existence of dark matter from astrophysical observations has become indisputable. Although the nature of dark matter remains unknown, many theoretically motivated candidates have been proposed. Among them, the most popular ones are Weakly Interacting Massive Particles (WIMPs) with predicted masses in the range from a few GeV/c² to TeV/c² and with interaction strengths roughly on the weak scale.
WIMPs are being searched for using three complementary techniques: indirectly, by detecting the secondary products of WIMP annihilation or decay in celestial bodies; by producing WIMPs at colliders, foremost the LHC; and by direct detection, by measuring the energy of recoiling nuclei produced by collisions with WIMPs in low-background detectors.
On 11 November 2015, the most sensitive detector for the direct detection of WIMPs, XENON1T, was inaugurated at the Italian Laboratori Nazionali del Gran Sasso (LNGS) – the largest underground laboratory in the world. XENON1T, led by Elena Aprile of Columbia University, was built and is operated by a collaboration of 21 research groups from France, Germany, Italy, Israel, the Netherlands, Portugal, Sweden, Switzerland, the United Arab Emirates and the US. In total, about 130 physicists are involved.
XENON1T is the current culmination of the XENON programme of dark-matter direct-detection experiments. The programme started with the 25 kg XENON10 detector about 10 years ago; its second phase, XENON100 (CERN Courier October 2013 p13), with 161 kg, has been tremendously successful: in the summer of 2012, the XENON collaboration published results from a search for spin-independent WIMP–nucleon interactions that provided the most stringent constraints on WIMP dark matter, until superseded by the LUX experiment (CERN Courier December 2013 p8) with a larger target.
XENON100 has since also provided a series of other important results, such as constraints on the spin-dependent WIMP–nucleon cross-section, constraints on solar axions and galactic axion-like particles and, more recently, searches for annual rate modulations, which exclude WIMP–electron scattering that could have provided a dark-matter explanation of the signal observed by DAMA/LIBRA (CERN Courier November 2015 p10).
Low background is key
The new XENON1T detector has an estimated sensitivity that is a factor of 100 better than XENON100. This will be reached after about two years of data taking. With only one week of data-taking, XENON1T will be able to reach the current LUX limit, opening up a new phase in the search for dark matter in early 2016.
The XENON detectors are dual-phase time-projection chambers (TPCs) filled with liquid xenon (LXe) as the target material. Interactions of particles in the liquefied xenon give rise to prompt scintillation light and ionisation. The ionisation electrons are drifted in a strong electric field and extracted into the gas above the liquid, where a secondary scintillation signal is produced. Both scintillation signals are read out by arrays of photomultiplier tubes (PMTs) placed above and below the target volume. The position of the interaction vertex can be reconstructed in 3D by using the hit pattern on the upper PMT array and the time delay between the prompt and secondary scintillation signals. The position reconstruction allows self-shielding to be exploited, by selecting only events that interact in the inner “fiducial” volume of the detector. Because of their small cross-section, WIMPs will interact only once in the detector, so the background (e.g. from neutrons) can be reduced further by selecting single-scatter interactions. Beta and gamma backgrounds are reduced by selecting events with a ratio of secondary-to-prompt signal that is typical for nuclear recoils.
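The reconstruction chain described above can be sketched in a few lines: the depth follows from the drift time between the prompt (S1) and delayed (S2) signals, and the transverse position from the S2 light pattern on the top PMT array (a simple light-weighted centroid here; the real analysis uses more sophisticated fits, and the drift velocity and fiducial cuts below are assumed, illustrative values):

```python
# Minimal sketch of dual-phase TPC position reconstruction (not XENON analysis code).
# z: from the drift time between the prompt (S1) and delayed (S2) scintillation signals.
# x, y: from the S2 hit pattern on the top PMT array (simple light-weighted centroid).

DRIFT_VELOCITY_MM_PER_US = 2.0   # assumed, typical for LXe at a field of ~1 kV/cm

def reconstruct_vertex(drift_time_us, top_pmt_hits):
    """top_pmt_hits: list of (x_mm, y_mm, photoelectrons) for the top PMT array."""
    z_mm = -DRIFT_VELOCITY_MM_PER_US * drift_time_us   # depth below the liquid surface
    total = sum(pe for _, _, pe in top_pmt_hits)
    x_mm = sum(x * pe for x, _, pe in top_pmt_hits) / total
    y_mm = sum(y * pe for _, y, pe in top_pmt_hits) / total
    return x_mm, y_mm, z_mm

# Self-shielding: keep only vertices well inside the target (illustrative cut values).
def in_fiducial(x_mm, y_mm, z_mm, r_max_mm=400.0, z_min_mm=-900.0, z_max_mm=-100.0):
    return (x_mm**2 + y_mm**2) < r_max_mm**2 and z_min_mm < z_mm < z_max_mm
```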
The XENON1T detector is filled with about 3.5 tonnes of liquid xenon in total. Its TPC – a cylinder 1 m high and 1 m in diameter, laterally defined by highly reflective Teflon – is the largest liquid-xenon TPC ever built. Specially designed copper field-shaping electrodes ensure the uniformity of the drift field for the desired field strength of 1 kV/cm. The TPC’s active volume contains 2 tonnes of LXe viewed by two arrays of 3 inch PMTs – 121 at the bottom immersed in LXe and 127 on the top in the gaseous phase. The xenon gas is liquefied and kept at a temperature of about –95 °C by a system of pulse-tube refrigerators. The xenon gas is stored and can be recovered in liquid phase in a custom-designed stainless-steel sphere that can hold up to 7.6 tonnes of xenon in high-purity conditions. Figure 3 shows the XENON1T detector and service building situated in Hall B at LNGS. Figure 1 shows XENON collaborators active in assembling the TPC in a clean room above ground at LNGS.
The expected WIMP–nucleon interaction rate is less than 10 events in 1 tonne of xenon per year. Background rejection is therefore the key to success for direct-detection experiments. Externally induced backgrounds can be minimised by exploiting the self-shielding capabilities. In addition, the detector is surrounded by a cylindrical water vessel 10 m high and 9.6 m in diameter. It is equipped with PMTs to tag muons that could induce neutrons, with an efficiency of 99.9%.
For a detector the size of XENON1T, radioactive impurities in the detector materials and in the xenon itself become the biggest challenge for background reduction. Extensive radiation-screening campaigns, using some of the world’s most sensitive germanium detectors, have been conducted, and high-purity PMTs have been specially developed by Hamamatsu in co-operation with the collaboration. Contamination of the xenon by radioactive radon (mainly ²²²Rn) and krypton (⁸⁵Kr), which dominate the target-intrinsic background, led to the development of cryogenic-distillation techniques to suppress the abundance of these isotopes to unprecedentedly low levels.
The best scenario
After about two years of data taking, XENON1T will be able to probe spin-independent WIMP–nucleon cross-sections of 1.6 × 10⁻⁴⁷ cm² (at a WIMP mass of 50 GeV/c²), see figure 2. In popular scenarios involving supersymmetry, XENON1T will either discover WIMPs or will exclude most of the theoretically relevant parameter space. Following the inauguration, the first physics run is envisaged to start early this year.
Most of the infrastructure, for example the outer cryostat, the Cherenkov muon veto, the xenon cryogenics, the purification and storage systems and the data-acquisition system, has been dimensioned for a larger experiment, named XENONnT, which is designed to contain more than 7 tonnes of LXe. A new TPC, about 40% larger in diameter and height and equipped with about 400 PMTs, will replace the XENON1T TPC. The goal for XENONnT is to achieve another order of magnitude improvement in sensitivity within a few years of data taking. XENONnT is scheduled to start data taking in 2018.
LHC exceeds design luminosity for heavy ions
The LHC ended 2015 with a heavy-ion run. For the first time, the centre-of-mass energy reached in collisions of lead nuclei was 5.02 TeV per nucleon pair. This energy, unprecedented for lead–lead collisions, is almost twice that of the previous lead–lead run, in 2011, and some 25 times that achieved at RHIC at Brookhaven; this feat extends the study of the quark–gluon plasma to even higher densities and temperatures.
The extensive modifications made to the LHC during its first long shutdown allowed the energy of the proton beams to be increased from 4 TeV in 2012 to 6.5 TeV, enabling proton–proton collisions at a centre-of-mass energy of 13 TeV, in 2015. As usual, a one-month heavy-ion run was scheduled at the end of the year. With lead nuclei colliding, the same fields in the LHC’s bending magnets would have allowed 5.13 TeV per colliding nucleon pair. However, it was decided to forego the last whisker of this increase to match the equivalent energy of the proton–lead collisions that took place in 2013, namely 5.02 TeV. Furthermore, the first week of the run was devoted to colliding protons at 2.51 TeV per beam. This will allow the LHC experiments to make precise comparisons of three different combinations of colliding particles, p–p, p–Pb and Pb–Pb, at the same effective energy of 5.02 TeV. This is crucial to disentangling the ascending complexity of the observed phenomena (CERN Courier March 2014 p17).
The first (and last, until 2018) Pb–Pb operation close to the full energy of the LHC was also the opportunity to finally assess some of its ultimate performance limits as a heavy-ion collider. A carefully targeted set of accelerator-physics studies also had to be scheduled within the tight time frame.
Delivering luminosity
The chain of specialised heavy-ion injectors, comprising the electron cyclotron resonance ion source, Linac3 and the LEIR ring with its elaborate bunch-forming and cooling, was recommissioned to provide intense and dense lead bunches in the weeks preceding the run. Through a series of elaborate RF gymnastics, the PS and SPS assemble these into 24-bunch trains for injection into the LHC. The beam intensity delivered by the injectors is a crucial determinant of the luminosity of the collider.
Planning for the recommissioning of the LHC to run in two different operational conditions after the November technical stop resembled a temporal jigsaw puzzle, with alternating phases of proton and heavy-ion set-up (the latter using proton beams at first) continually readapted to the manifold constraints imposed by other activities in the injector complex, the strictures of machine protection, and the unexpected. For Pb–Pb operation, a new heavy-ion magnetic cycle was implemented in the LHC, including a squeeze to β* = 0.8 m, together with manipulations of the crossing angle and interaction-point position at the ALICE experiment. First test collisions occurred early in the morning of 17 November, some 10 hours after first injection of lead.
The new Pb–Pb energy was almost twice that of the previous Pb–Pb run in 2011, and some 25 times that of RHIC at Brookhaven, extending the study of the quark–gluon plasma to still-higher energy density and temperature. Although the energy per colliding nucleon pair characterises the physical processes, it is worth noting that the total energy packed into a volume on the few-fm scale exceeded 1 PeV for the first time in the laboratory.
After the successful collection of the required number of p–p reference collisions, the Pb–Pb configuration was validated through an extensive series of aperture measurements and collimation-loss maps. Only then could “stable beams” for physics be declared at 10.59 a.m. on 25 November, and spectacular event displays started to flow from the experiments.
In the next few days, the number of colliding bunches in each beam was stepped up to the anticipated value of 426 and the intensity delivered by the injectors was boosted to its highest-ever values. The LHC passed a historic milestone by exceeding the luminosity of 10²⁷ cm⁻² s⁻¹, the value advertised in its official design report in 2004.
This allowed the ALICE experiment to run in its long-awaited saturated mode with the luminosity levelled at this value for the first few hours of each fill.
Soon afterwards, an unexpected bonus came from the SPS injection team, who pulled off the feat of shortening the rise time of the SPS injection kicker array, first to 175 ns then to 150 ns, allowing 474, then 518, bunches to be stored in the LHC. The ATLAS and CMS experiments were able to benefit from luminosities over three times the design value. A small fraction of the luminosity in this run was delivered to the LHCb experiment, a newcomer to Pb–Pb collisions.
Nuclear beam physics
The electromagnetic fields surrounding highly charged ultrarelativistic nuclei are strongly Lorentz-contracted into a flat “pancake”. According to the original insight of Fermi, Weizsäcker and Williams, these fields can be represented as a flash of quasi-real photons. At LHC energies, their spectrum extends up to hundreds of GeV. In a very real sense, the LHC is a photon–photon and photon–nucleus collider (CERN Courier November 2012 p9). The study of such ultraperipheral (or “near-miss”) interactions, in which the two nuclei do not overlap, is an important subfield of the LHC experimental programme, alongside its main focus on the study of truly nuclear collisions.
From the point of view of accelerator physics, the ultraperipheral interactions with their much higher cross-sections loom still larger in importance. They dominate the luminosity “burn-off”, or rate at which particles are removed from colliding beams, leading to short beam and luminosity lifetimes. Furthermore, they do so in a way that is qualitatively different from the spray of a few watts of “luminosity debris” by hadronic interactions. Rather, the removed nuclei are slightly modified in charge and/or mass, and emerge as new, well-focussed, secondary beams. These travel along the interaction region just like the main beam but, as soon as they encounter the bending magnets of the dispersion-suppressor section, their trajectories deviate, as in a spectrometer.
The largest contribution to the burn-off cross-section comes from the so-called bound-free pair-production (BFPP) in which the colliding photons create electron–positron pairs with the electron in a bound-state of one nucleus. A beam of these one-electron ions, carrying a power of some tens of watts, emerges from the interaction point and is eventually lost on the outer side of the beam pipe.
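The order of magnitude of these effects follows directly from the luminosity and the relevant cross-sections. A rough sketch, using approximate, assumed values for the Pb–Pb cross-sections at this energy (none of these numbers is quoted in the text above):

```python
# Rough estimate of the bound-free pair-production (BFPP) secondary-beam power
# and of the luminosity burn-off at one interaction point.
# Cross-sections are approximate, assumed values for Pb-Pb at 5.02 TeV per nucleon pair.
TEV_TO_JOULE = 1.602e-7

lumi = 1.0e27            # cm^-2 s^-1, the design value reached in this run
sigma_bfpp = 280e-24     # cm^2 (~280 b, assumed)
sigma_emd = 220e-24      # cm^2, electromagnetic dissociation (~220 b, assumed)
sigma_had = 8e-24        # cm^2, hadronic interactions (~8 b, assumed)

# Each 208Pb ion carries Z x 6.37 TeV ~ 522 TeV (proton-equivalent energy 6.37 TeV
# for sqrt(s_NN) = 5.02 TeV).
ion_energy_j = 82 * 6.37 * TEV_TO_JOULE

bfpp_rate = lumi * sigma_bfpp              # one-electron ions produced per second
print(f"BFPP secondary-beam power: {bfpp_rate * ion_energy_j:.0f} W")   # ~23 W: "tens of watts"
print(f"burn-off rate: {lumi * (sigma_bfpp + sigma_emd + sigma_had):.2e} ions/s per IP")
# Electromagnetic processes (~500 b in total) dwarf the ~8 b hadronic cross-section,
# which is why they dominate the beam and luminosity lifetimes.
```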
Controlled quench
The LHC operators have become used to holding their breath as the BFPP loss peaks on the beam-loss monitors rise towards the threshold for dumping the beams (figure). There has long been a concern that the energy deposited into superconducting magnet coils may cause them to quench, bringing the run to an immediate halt and imposing a limit on luminosity. In line with recent re-evaluations of the magnet-quench limits, this did not happen during physics operation in 2015 but may happen in future operation at still-higher luminosity. During this run, mitigation strategies to move the losses out of the magnets were successfully implemented. Later, in a special experiment, one of these bumps was removed and the luminosity slowly increased. This led to the first controlled steady-state quench of an LHC dipole magnet with beam, providing long-sought data on their propensity to quench. On the last night of the run, another magnet quench was deliberately induced by exciting the beam to create losses on the primary collimators.
Photonuclear interactions also occur at comparable rates in the collisions and in the interactions with the graphite of the LHC collimator jaws. Nuclei of ²⁰⁷Pb, created by the electromagnetic dissociation of a neutron from the original ²⁰⁸Pb at the primary collimators, were identified as a source of background after traversing more than a quarter of the ring to the tertiary collimators near ALICE.
These, and other phenomena peculiar to heavy-ion operation, must be tackled in the quest for still-higher performance in future years.
ATLAS and CMS upgrade programme moves to the next phase
The High-Luminosity LHC project will also extend the physics programme of the experiments. Key detector components will have to be replaced so that these instruments can handle the pile-up of proton–proton interactions – 140 to 200 on average per bunch crossing. In October 2015, the CERN Resource Review Board confirmed that the collaborations can now prepare Technical Design Reports (TDRs). Passing this first step of the approval process is a great success for the ATLAS and CMS experiments.
At the end of the third operational period in 2023, the LHC will have delivered 300 fb⁻¹, and the final focussing magnets, installed at the collision points in each of the four interaction regions of the LHC, will need to be replaced. By redesigning these magnets and improving the beam optics, the luminosity can be greatly increased. The High-Luminosity LHC (HL-LHC) project aims to deliver 10 times the original design integrated luminosity (number of collisions) of the LHC (CERN Courier December 2015 p7). This will extend the physics programme and open a new window of discovery. But key components of the experiments will also have to be replaced to cope with the pile-up of 140–200 proton–proton interactions occurring, on average, per bunch crossing of the beams. In October 2015, the ATLAS and CMS collaborations met a major milestone in preparing these so-called Phase II detector upgrades for operation at the HL-LHC, when the CERN Resource Review Board agreed that they should proceed to prepare Technical Design Reports.
New physics at the HL-LHC
The headline result of the first operation period of the LHC was the observation of a new boson in 2012. With the present data set, this boson is fully consistent with being the Higgs boson of the Standard Model of particle physics. Its couplings (interaction strengths) with other particles in the dominant decay modes are measured with an uncertainty of 15–30% by each experiment, and scale with mass as predicted (see figure 1). With the full 3000 fb⁻¹ of the HL-LHC, the dominant couplings can be measured with a precision of 2–5%; this potential improvement is also shown in figure 1. What’s more, rare production processes and decay modes can be observed. Of particular interest is to find evidence for the production of a pair of Higgs bosons, which depends on the strength of the interaction between the Higgs bosons themselves. This will be complemented by precise measurements of other Standard Model processes and any deviations from the theoretical predictions will be indirect evidence for a new type of physics.
In parallel, the direct search for physics beyond the Standard Model will continue. The theory of supersymmetry (SUSY) introduces a heavy partner for each ordinary particle. This is very attractive in that it solves the problem of how the Higgs boson can remain relatively light, with a mass of 125 GeV, despite its interactions with heavy particles; in particular, SUSY can cancel the large corrections to the Higgs mass from the 173 GeV top quark. According to SUSY, the contributions from ordinary particles are cancelled by the contributions from the supersymmetric partners. The presence of the lightest SUSY particle can also explain the dark matter in the universe. Figure 2 compares the results achievable at the LHC and HL-LHC in a search for electroweak SUSY particles. Electroweak SUSY production has a relatively low rate, and benefits from the factor 10 increase in luminosity. Particles decaying via a W boson and a Z boson give final states with three leptons and missing transverse momentum.
Other “exotic” models will also be accessible, including those that introduce extra dimensions to explain why gravity is so weak compared with the other fundamental forces.
If a signal for new particles or new interactions begins to emerge – and this might happen in the ongoing second period of LHC operation, which is running at higher energy than the first – the experiments will have to be able to measure the new phenomena precisely at the HL-LHC to distinguish between different theoretical explanations.
Experimental challenges
To achieve the physics goals, ATLAS and CMS must continue to be able to reconstruct all of the final-state particles with high efficiency and low fake rates, and to identify which ones come from the collision of interest and which come from the 140–200 additional events in the same bunch crossing. Along with this greatly increased event complexity, at the HL-LHC the detectors will suffer from unprecedented instantaneous particle flows and integrated radiation doses.
Detailed simulations of these effects were carried out to identify the sub-systems that will either not survive the high luminosity environment or not function efficiently because of the increased data rates. Entirely new tracking systems to measure charged particles will be required at the centre of the detectors, and the energy-measuring calorimeters will also need partial replacement, in the endcap region for CMS and possibly in the more forward region for ATLAS.
The possibility of efficiently selecting good events and the ability to record higher rates of data demand new triggering and data-acquisition capabilities. The main innovation will be to implement tracking information at the hardware level of the trigger decision, to provide sufficient rejection of the background signals. The new tracking devices will use silicon-sensor technology, with strips at the outer radii and pixels closer to the interaction point. The crucial role of the tracker systems in matching signals to the different collisions is illustrated in figure 3, where the event display shows the reconstruction of an interaction producing a pair of top quarks among 200 other collisions. The granularity will be increased by about a factor of five to produce a similar level of occupancies as with the current detectors and operating conditions. With reduced pixel sizes and strip pitches, the detector resolution will be improved. New thinner sensor techniques and deeper submicron technologies for the front-end read-out chips will be used to sustain the high radiation doses. And to further improve the measurements, the quantity and mass of the materials will be substantially reduced by employing lighter mechanical structures and materials, as well as new techniques for the cooling and powering schemes. The forward regions of the experiments suffer most from the high pile-up of collisions, and the tracker coverage will therefore be extended to better match the calorimetry measurements. Associating energy deposits in the calorimeters with the charged tracks over the full coverage will substantially improve jet identification and missing transverse energy measurements. The event display in figure 4 shows the example of a Higgs boson produced by the vector boson fusion (VBF) process and decaying to a pair of τ leptons.
The calorimeters in ATLAS and CMS use different technologies and require different upgrades. ATLAS is considering replacing the liquid-argon forward calorimeter with a similar detector, but with higher granularity. For further mitigation of pile-up effects, a high-granularity timing detector with a precision of a few tens of picoseconds may be added in front of the endcap LAr calorimeters. In CMS, a new high-granularity endcap calorimeter will be implemented. The detector will comprise 40 layers of silicon-pad sensors interleaved with W/Cu and brass or steel absorber to form the electromagnetic and hadronic sections, respectively. The hadronic-energy measurement will be completed with a scintillating tile section similar to the current detector. This high-granularity design introduces shower-pointing ability and high timing precision. Additionally, CMS is investigating the potential benefits of a system that is able to measure precisely the arrival time of minimum ionising particles to further improve the vertex identification for all physics objects.
The muon detectors in ATLAS and CMS are expected to survive the full HL-LHC period; however, new chambers and read-out electronics will be added to improve the trigger capabilities and to increase the robustness of the existing systems. ATLAS will add new resistive plate chambers (RPCs) and small monitored drift-tube chambers to the innermost layer of the barrel. The endcap trigger chambers will be replaced with small-strip thin-gap chambers. CMS will complete the coverage of the current endcap RPCs with high-rate-capability chambers: gas electron multipliers (GEMs) in the front stations and RPCs in the last ones. Both experiments will install a muon tagger to benefit from the extended tracker coverage.
The trigger systems will require increased latency to allow sufficient time for the hardware track reconstruction and will also have larger throughput capability. This will require the replacement of front-end and back-end electronics for several of the calorimeter and/or muon systems that will otherwise not be replaced. Additionally, these upgrades will allow the full granularity of the detector information to be exploited at the first stage of the event selection.
Towards Technical Design Reports
To reach the first milestone in the approval process agreed with the CERN scientific committees, ATLAS and CMS prepared detailed documentation describing the entire Phase II “reference” upgrade scope and the preliminary planning and cost evaluations. This documentation includes scientific motivations for the upgrades, demonstrated through studies of the performance reach for several physics benchmarks and examined in 140 and 200 collision pile-up conditions. The performance degradation with two scenarios of reduced cost, where the upgrades are descoped or downgraded, was also investigated. After reviewing this material, the CERN LHC Committee and the Upgrade Cost Group reported to the CERN Research Board and the Resource Review Board, concluding that: “For both experiments, the reference scenario provides well-performing detectors capable of addressing the physics at the HL-LHC.”
The success of this first step of the approval process was declared, and the ATLAS and CMS collaborations are now eager to proceed with the necessary R&D and detector designs to prepare Technical Design Reports over the next two years.