The Advanced European Infrastructures for Detectors and Accelerators (AIDA-2020) – the largest European-funded project for joint detector development – is making financial support available for small development teams to carry out experiments and tests at one of 10 participating European facilities. The project, which started on 1 May, will run for four years. Its main goal is to bring the community together and push detector technologies beyond current limits by sharing high-quality infrastructures provided by 57 partners from 34 countries, from Europe to Asia.
Building on the experience gained with the original AIDA project (CERN Courier April 2011 p6), the transnational access (TA) activities in AIDA-2020 provide financial support for teams to travel from one facility to another, sharing existing infrastructures for efficient and reliable detector development. The support is organized around three themes, providing access to a range of infrastructures: the Proton Synchrotron and Super Proton Synchrotron test beams, the IRRAD proton facility and the Gamma Irradiation Facility (GIF++) at CERN; the DESY II test beam; the TRIGA reactor at the Jožef Stefan Institute; the Karlsruhe Compact Cyclotron (KAZ); the Centre de Recherches du Cyclotron at the Université catholique de Louvain (UCLouvain); the MC40 Cyclotron at the University of Birmingham; the Rudjer Boskovic Institute Accelerator Facility (RBI-AF); and the electromagnetic compatibility facility (EMClab) at the Instituto Tecnológico de Aragón (ITAINNOVA).
Access to high-energy particle beams (TA1) at CERN and DESY enables the use of test beams free of charge. Here the main goal is to attract more researchers to participate in beam tests, in particular by supporting PhD students and postdoctoral researchers carrying out beam tests of detectors.
With the access to irradiation sources (TA2), the goal is to cover the range of particle sources needed for detector qualification for the High Luminosity LHC (HL-LHC) project. These include proton, neutron and mixed-field sources, as well as gamma irradiation. Through IRRAD, TRIGA, KAZ and MC40, TA2 provides both the extreme fluences of up to 10¹⁷ neq/cm² required for the forward region in HL-LHC experiments, and the lower fluences of 10¹⁵ neq/cm² on 10 cm² objects for the outer layers of trackers. GIF++ covers irradiation of large-scale objects such as muon chambers, while the Heavy Ion Irradiation Facility at UCLouvain is available for single-event-effects tests of electronics.
The third theme provides access to new detector-testing facilities (TA3). Semiconductor detectors will be one of the main challenges at the HL-LHC. Studying their behaviour with micro-ion beams at RBI will enhance the understanding of these detectors. Electromagnetic compatibility is a key issue when detectors have to be integrated in an experiment, and prior tests in a dedicated facility such as the EMClab at ITAINNOVA will make the commissioning of detectors more efficient.
The Japanese/German BASE collaboration at CERN’s Antiproton Decelerator (AD) has compared the charge-to-mass ratios of the antiproton and proton with a fractional precision of 69 parts per trillion (ppt). This high-precision measurement was achieved by comparing the cyclotron frequencies of antiprotons and negatively charged hydrogen ions in a Penning trap. The result is consistent with charge–parity–time-reversal (CPT) invariance, which is one of the cornerstones of the Standard Model of particle physics, and constitutes the most precise test comparing baryons and antibaryons performed to date.
In their experiment, the BASE collaboration has profited from techniques pioneered in the 1990s by the TRAP collaboration at the Low Energy Antiproton Ring at CERN. The advanced cryogenic Penning-trap system used in BASE consists of four traps, two of which were used in this measurement – a measurement trap and a reservoir trap (figure 1). When the experiment receives a pulse of 5.3 MeV antiprotons from the AD, they strike the degrader structure, which is designed to slow them down, and release hydrogen. Negatively charged hydrogen ions (H–) can form in the process, producing a composite cloud with the antiprotons that is shuttled to the reservoir trap. BASE has developed techniques to extract single antiprotons and negative hydrogen ions from this cloud whenever needed. Moreover, the reservoir has a lifetime of more than a year, making the BASE experiment almost independent from AD cycles.
Using this extraction technique, and taking the timing from the AD cycle, BASE prepares a single antiproton in the measurement trap, while an H– ion is held in the downstream park electrode, as shown in figure 1. The cyclotron frequency of the antiproton is then measured in exactly 120 s, which corresponds to one AD cycle. The particles are subsequently exchanged by performing appropriate potential ramps, and the cyclotron frequency of the H– ion is measured. Thus, a single comparison of the charge-to-mass ratios takes only 240 s. This is much faster than in previous experiments, enabling BASE to perform about 6500 ratio comparisons in 35 days of measurement time (figure 2). The result of the ratio comparison is (q/m)p̄/(q/m)p – 1 = 1(69) × 10⁻¹², thus confirming CPT invariance at the ppt level.
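In a Penning trap, the charge-to-mass ratio enters directly through the cyclotron frequency, f_c = |q|B/(2πm), so the ratio of two frequencies measured in the same magnetic field equals the ratio of the two |q|/m values, independent of B. The short sketch below illustrates this cancellation with a toy comparison of an antiproton against a proton; the field value is an arbitrary assumption, and the real BASE measurement compares the antiproton with an H– ion and corrects for the precisely known mass ratio.

```python
import math

def cyclotron_frequency(q, m, B):
    """Cyclotron frequency f_c = |q| B / (2*pi*m) of a charged
    particle in a magnetic field B (SI units throughout)."""
    return abs(q) * B / (2 * math.pi * m)

# Illustrative constants (not BASE's measured inputs):
e = 1.602176634e-19    # elementary charge, C
m_p = 1.67262192e-27   # proton (and antiproton) mass, kg
B = 1.9                # assumed trap field in tesla, for illustration only

f_pbar = cyclotron_frequency(-e, m_p, B)  # antiproton
f_p = cyclotron_frequency(+e, m_p, B)     # proton

# Both particles see the same field, so the frequency ratio reduces
# to the charge-to-mass ratio and B drops out entirely:
ratio = f_pbar / f_p
print(ratio - 1)  # 0 for identical |q|/m
```

Because B cancels in the ratio, the comparison is insensitive to the absolute field calibration; only field drift between the two 120 s measurements matters, which is why the fast particle exchange is so valuable.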
The high sampling rate has also enabled the first high-resolution study of diurnal variations in a baryon/antibaryon comparison, which could be introduced by Lorentz-violating cosmic-background fields. The measurement sets constraints on such variations at the level of less than 720 ppt. In addition, by assuming that CPT invariance holds, the measurement can be interpreted as a test of the weak equivalence principle using baryonic antimatter. If matter respects weak equivalence while antimatter experiences an anomalous coupling to the gravitational field, this gravitational anomaly would contribute to a possible difference in the measured cyclotron frequencies. Thus, by following these assumptions, the result from BASE can be used to set a limit on the gravitational anomaly parameter, αg: |αg – 1| < 8.7 × 10⁻⁷.
The main goal for the BASE experiment, which was approved in June 2013, is to measure the magnetic moment of the antiproton with a precision of parts per billion. Using the double Penning trap system, the collaboration recently performed the most precise measurement of the magnetic moment of the proton.
The year 2015 began for the ATLAS experiment with an intense phase of commissioning using cosmic-ray data and first proton–proton collisions, allowing ATLAS physicists to test the trigger and detector systems as well as to align the tracking devices. Then the collection of physics data in LHC Run 2 started in June, with proton–proton collisions at a centre-of-mass energy of 13 TeV (CERN Courier July/August 2015 p25). Measurements at this new high-energy frontier were among the highlights of the many results presented by the ATLAS collaboration at EPS-HEP 2015.
An important early goal for ATLAS was to record roughly 200 million inelastic proton–proton collisions with a very low level of secondary collisions within the same event (“pile-up”). This data sample allowed ATLAS physicists to perform detailed studies of the tracking system, which features a new detector, the “Insertable B-layer” (IBL). The IBL consists of a layer of millions of tiny silicon pixels mounted in the innermost heart of ATLAS at a distance of 3.3 cm from the proton beam (CERN Courier October 2013 p28). Together with the other tracking layers of the overall detector, the IBL allows ATLAS to measure the origin of charged particles with up to two times better precision than during the previous run. Figure 1 shows the resolution achieved for the longitudinal impact parameter of the beam.
ATLAS exploited the early data sample at 13 TeV for important physics measurements. It allowed the collaboration to characterize inelastic proton–proton collisions in terms of charged-particle production and the structure of the “underlying event” – collision remnants that are not directly related to the colliding partons in the proton. This characterization is important for validating the simulation of the high-luminosity LHC collisions, which contain up to 40 inelastic proton–proton collisions in a given event (one event involves the crossing of two proton bunches with more than 100 billion protons each). Figure 2 shows the evolution of the charged-particle multiplicity with centre-of-mass energy.
ATLAS also measured the angular correlation among pairs of the produced charged particles, confirming the appearance of a so-called “ridge” phenomenon in events with large particle multiplicity at a centre-of-mass energy of 13 TeV. The “ridge” (figure 3) consists of long-range particle–particle correlations not predicted by any of the established theoretical models describing inelastic proton–proton collisions.
After the low-luminosity phase, the LHC operators began to increase the intensity of the beams. By the time of EPS-HEP 2015, ATLAS had recorded a total luminosity of 100 pb⁻¹, of which up to 85 pb⁻¹ could be exploited for physics and performance studies. ATLAS physicists measured the performance of electron, muon and τ-lepton reconstruction, the reconstruction and energy calibration of jets, and the reconstruction of “displaced” decays of long-lived particles, such as weakly decaying hadrons containing a bottom quark. The precision of the position measurements of displaced decay locations (vertices) is significantly improved by the new IBL detector.
ATLAS used these data to classify the production of J/ψ particles at 13 TeV in terms of their immediate (“prompt”) and delayed (“non-prompt”) origin. While non-prompt J/ψ production is believed to be well understood via the decay of b hadrons, prompt production continues to be mysterious in some aspects.
ATLAS also performed a first study of the production of energetic, isolated photons and a first cross-section measurement of inclusive jet production in 13 TeV proton–proton collisions. Both are correctly described by state-of-the-art theory.
The data samples at high collision energy contain copious numbers of Z and W bosons, the mediators of the weak interaction, whose leptonic decays provide a clean signature in the detector that can be exploited for calibration purposes. ATLAS has studied the kinematic properties of these bosons, also in association with jet production. Their abundance in 13 TeV proton–proton collisions is found to be consistent with the expectation from theory. ATLAS has also observed some rare di-boson (ZZ) events, which – with a hundred times more data – should allow the direct detection of Higgs bosons. Figure 4 shows a candidate ZZ event.
In higher-energy proton collisions, the production rate of many heavier particles increases for a given luminosity. The heaviest known particle, the top quark – with a mass approximately 170 times that of a proton – is predominantly produced in pairs at the LHC, and the cross-section for the production of top-quark pairs is expected to increase by a factor of 3.3 at 13 TeV, compared with the 8 TeV collisions of Run 1. ATLAS has performed an early measurement of the top-pair production cross-section in the cleanest channel, where one top quark decays to an electron, an electron-neutrino and a jet containing a b-hadron (“b-jet”), while the other top quark decays to a muon, a muon-neutrino and a b-jet. The small backgrounds from other processes in this channel allow a robust measurement with small systematic uncertainties. The measured cross-section agrees with the predicted increase of a factor of 3.3. The precision of the measurement is limited by the 9% uncertainty in luminosity, which is expected to improve significantly during the year. Figure 5 shows the evolution of the top-pair production cross-section.
Although the available data sample does not yet allow a significant increase in the sensitivity to the most prominent new physics phenomena, ATLAS has exploited the data to perform important early measurements. The excellent detector performance has allowed the confirmation of theoretical expectations with 13 TeV proton–proton collision energies.
The highlight of EPS-HEP 2015 for the CMS collaboration was the publication of the first physics result exploring the new territory at the LHC energy of 13 TeV: the measurement of the charged-hadron multiplicity (dN/dη), where η, the pseudorapidity, is a measure of the direction of the particle track. When protons collide at the LHC, more than one of their constituents (quarks or gluons) can interact with another one, so every collision produces an underlying spray of charged hadrons, such as pions and kaons, and the greater the energy, the higher the number of produced particles. Knowing precisely how many charged hadrons are created at the new collision energy is important for ensuring that the theoretical models used in the simulations employed in the physics analyses describe these underlying processes accurately. The publication from CMS at 13 TeV reports the differential multiplicity distribution for values of |η| < 2, and a measured density for central charged hadrons (with |η| < 0.5) of 5.49±0.01 (stat.)±0.17 (syst.). Figure 1 shows the differential distribution and the energy dependence of the new measurement compared with earlier data at lower energies.
CMS has, in addition, produced a full suite of performance plots covering a range of physics objects and final states, using up to 43 pb⁻¹ of 13 TeV data. Figure 2 shows the dimuon mass spectrum obtained from multiple trigger paths, where several resonances from the ω meson to the Z boson can be seen clearly. The B physics group in CMS has studied this spectrum in detail from the J/ψ to the Υ masses, and also the decay-time distributions for events with J/ψ or B+ mesons. Dedicated performance plots were presented at the conference for various muon, electron and photon kinematic and identification variables, as well as the measured reconstruction and identification efficiencies. The reconstruction of several low-mass states, including Ks, Λ, D0, D*, B+, B0 and Bs, demonstrates the good performance of the CMS tracker. In addition, the position of the beam spot has been measured in all three dimensions. Simulations are already found to reproduce these physics-object data well at this early stage.
The physics groups in CMS have also started to study several processes at 13 TeV in some detail. One highlight is a first search in the dijet invariant-mass spectrum, which so far reaches up to approximately 5 TeV (figure 3). Results of the same analysis on Run 1 data were released only in spring, but CMS is already continuing at 13 TeV the search where it ended at 8 TeV, thus demonstrating the collaboration’s readiness for discovery physics in the new energy regime. The TOP group has studied top–antitop (tt) events in the dilepton and lepton+jets channels, in addition to taking a first look at events consistent with the production of single top quarks.
While eagerly jumping on the new data, CMS continues to produce world-class physics results on the Run 1 data collected at 7 and 8 TeV. The collaboration has recently approved more than 30 new results, which were shown at the conference. These include searches for new physics as well as precision Standard Model measurements. The results presented include measurements of the production of W-boson pairs through the interaction of two photons, the electroweak production of a W boson accompanied by two jets, production rates for particle jets at 2.76 TeV compared with 8 TeV, as well as the production of two photons along with jets.
Discovered more than two decades ago, the top quark continues to play a vital role in physics analyses for both measurements and searches, because it is the heaviest elementary particle known so far. New CMS results with this special type of quark include measurements of the tt production rates in the fully hadronic sample, and a measurement of the tt+bb process as well as the tt production in conjunction with a Z or W boson. In addition, searches for signs of new physics continue, most recently in the process where the top quark decays to a charm quark and a Higgs boson, t → cH, and the Higgs boson decays to photons.
On the Higgs front itself, CMS has performed three new searches for non-Standard Model Higgs bosons containing τ leptons in the decay products, while on the supersymmetry front, analyses have looked for dark-matter candidates and other supersymmetric particles. Heavy-ion results from Run 1, using proton–proton, proton–lead and lead–lead collisions, include Υ polarization as a function of charged-particle multiplicity in proton–proton collisions, Z-boson production, jet-fragmentation functions in proton–lead collisions, and nuclear modification of Υ states in lead–lead collisions.
At EPS-HEP2015, the LHCb collaboration presented the first measurement of the J/ψ production cross-section in proton–proton (pp) collisions at 13 TeV. Using this measurement, they also determined the b-quark cross-section at this new, higher energy.
J/ψ mesons can be produced both “promptly”, in the pp collision, and as a product of decays of B hadrons, dubbed “J/ψ-from-b”. The two components are visible in figure 1, which shows the J/ψ decay-time distribution with respect to the pp collision time. The black points with error bars show the data, the solid red line indicates the best fit to the data, and the prompt J/ψ contribution is shown in blue. The black line indicates the J/ψ-from-b contribution, which falls exponentially with a time constant characteristic of the lifetime of B hadrons.
While the prompt J/ψ cross-section is interesting for constraining QCD models, the J/ψ-from-b cross-section is used to compute the b-quark pair total cross-section. The data at 13 TeV confirm the expected rise of the B-particle production rate by about a factor of two with respect to 7 TeV. This increase will enable LHCb to obtain even more precise, interesting and, hopefully, surprising results in LHC Run 2.
This analysis was the first to benefit from a new scheme for the LHCb software trigger that was introduced for Run 2. Splitting the event selection into two stages, it allows alignment and calibration to be performed in real time after the first stage of the software trigger and then used directly in the second stage. The same alignment and calibration information is propagated to the offline reconstruction, to ensure consistent and high-quality particle-identification information in the trigger and offline. The identical performance of the online and offline reconstruction achieved in this way offers the opportunity to perform physics analyses directly using candidates reconstructed in the trigger – the online reconstruction is used, for example, in the J/ψ cross-section measurement. The storage of only the triggered candidates leads to a reduction in the event size of an order of magnitude, permitting an increased event rate with higher efficiency.
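The flow described above can be sketched schematically: a loose first-stage selection, calibration derived in real time from its output, and a second stage (and the offline reconstruction) reusing that same calibration. All names and cuts below are hypothetical toy stand-ins, not LHCb's actual trigger software.

```python
# Toy sketch of a two-stage software trigger with real-time calibration.
# Event model, cut values and the "scale" constant are all invented
# for illustration; only the two-stage structure mirrors the text.

def stage1_select(events):
    """Loose, inclusive first-stage selection."""
    return [e for e in events if e["pt"] > 0.5]

def calibrate(selected):
    """Derive a calibration constant in real time from stage-1 output."""
    mean_pt = sum(e["pt"] for e in selected) / len(selected)
    return {"scale": 1.0 / mean_pt}

def stage2_select(selected, calib):
    """Tighter second-stage selection using the fresh calibration."""
    return [e for e in selected if e["pt"] * calib["scale"] > 0.8]

events = [{"pt": p} for p in (0.3, 0.9, 1.2, 2.5)]
s1 = stage1_select(events)
calib = calibrate(s1)
s2 = stage2_select(s1, calib)
# The identical calib object would also be handed to the offline
# reconstruction, keeping online and offline selections consistent.
```

The key design point is that the second stage and the offline reconstruction consume the same calibration object, which is what makes trigger-level candidates directly usable in physics analyses.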
LHCb also presented the determination, based on Run 1 data, of the Cabibbo–Kobayashi–Maskawa (CKM) matrix element |Vub|, which describes the transition of a b quark to a u quark. The measurement – published during the conference in Nature Physics – was made by studying the decay of the Λ0b baryon, Λ0b → pμ–νμ (LHCb 2015a). The measurement of decays involving a neutrino is very challenging at a proton collider, and it was quite a surprise that this measurement could be done.
Measurements of |Vub| by previous experiments had returned two sets of inconsistent results, depending on the method used. Inclusive determinations, using all b → ulν transitions where l is either a muon or an electron, give values of |Vub| above 0.004, while exclusive determinations, mainly from B → πlν, yield values around 0.003. This could be explained by a new particle, in addition to the W boson, contributing to the quark transition with a right-handed current. LHCb’s new measurement is of the exclusive category, but is the first to involve a baryon decay and hence a spin-1/2 particle. The result is |Vub| = (3.27±0.15±0.16±0.06) × 10⁻³, where the uncertainties are experimental, related to the theoretical calculation, and related to the value of |Vcb|, respectively. This number agrees with previous exclusive determinations and is inconsistent with the hypothesis of new right-handed currents. It therefore leaves open the puzzle of why the inclusive and exclusive measurements disagree. Further intensive research, at both the experimental and theoretical level, will continue to try to understand this disagreement.
While the above measurement constrains one side of the CKM unitarity triangle, the other (the third being unity) is best constrained by the B-meson oscillation frequency. LHCb presented the most precise measurement to date at the conference, using semileptonic B0 decays. The result of (503.6±2.0±1.3) ns⁻¹ is consistent with, but more precise than, the world average (LHCb 2015b).
In other highlights from Run 1, the collaboration reported new results on long-range correlations in proton–lead collisions. LHCb’s latest measurements show that the so-called “ridges” seen in the most violent collisions span across even larger longitudinal distances, as figure 2 shows at Δφ = 0 below the (truncated) peak at (0,0). This is the first time that the effect has been seen in the forward direction (LHCb 2015c). Moreover, because of its acceptance, the LHCb experiment distinguishes between configurations where the lead-ion enters from the front and those where it is the proton. Somewhat unexpectedly, the ridge is seen in both cases.
PAUCam, the camera for the Physics of the Accelerating Universe (PAU) project, has been successfully installed and commissioned at the William Herschel Telescope (WHT) at the Roque de los Muchachos Observatory on the island of La Palma. Installation took place on 3 June, with commissioning following during the subsequent four nights.
The innovative instrument is designed to measure precisely and efficiently the redshift of galaxies up to large distances – galaxies whose light is only now reaching Earth after starting its journey when the universe was less than half its present size. One of the primary goals of the project is to study how the expansion rate of the universe is increasing under the influence of the mysterious dark energy that makes up nearly 70% of the universe. PAUCam’s competitiveness comes from its ability to obtain redshifts that are more precise than those of current photometric surveys, over a volume with a larger number density of galaxies than in spectroscopic surveys.
The camera is mounted at the prime focus of the 4 m-diameter WHT, which is part of the Isaac Newton Group of Telescopes, operated at present by a consortium between the governments of the Netherlands, Spain and the UK. The location at the prime focus of the WHT imposes severe weight limitations that are in conflict with the complexity and large size of the instrument. The solution was to build the main body of the camera with carbon fibre, with engineering developed in Spain.
Another innovation of the camera is in the technique used to measure redshifts. With PAUCam, the redshift is measured photometrically, where the same object is imaged multiple times with its light passing through filters of different colours. PAUCam uses a set of 40 narrow-band filters, each passing light in a 10 nm-wide band, covering wavelengths in the range 450–850 nm. Another set of filters passes the light in six wider bands, named u, g, r, i, z and Y. The large number of narrow-band filters allows a determination of redshifts with a precision of 0.003(1+z), where z is the redshift parameter, or 10(1+z) Mpc, the characteristic scale of linear growth in the universe.
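The quoted precision scales linearly with redshift, so it is straightforward to evaluate at any distance; the one-line sketch below takes the 0.003 coefficient from the text above and is purely illustrative.

```python
def redshift_precision(z, coeff=0.003):
    """PAUCam's quoted photometric-redshift scatter, sigma_z = 0.003*(1+z)."""
    return coeff * (1.0 + z)

# For a galaxy at z = 1, the expected scatter is 0.003 * 2 = 0.006,
# i.e. twice the scatter expected for a nearby object at z = 0.
print(redshift_precision(1.0))
```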
The photometric technique used in PAUCam contrasts with the spectroscopic technique, where the redshift of an object is determined by analysing its light through a spectrograph. The latter technique allows very precise determination of redshifts, but at the expense of much longer exposure times. Furthermore, only the spectra of previously selected objects are analysed, while the photometric technique determines, in principle, the redshift of all of the objects in the region of the sky being imaged. In the case of PAUCam, the expectation is to measure the redshift of about 50,000 objects every night of observation. The camera covers the entire field of view of the WHT (1 square degree) with a mosaic of 18 Hamamatsu Photonics red-sensitive CCDs, each with 4000 × 2000 pixels.
PAUCam has been designed and built over the past six years by a fruitful collaboration between astronomers, cosmologists and particle physicists in a consortium of Spanish institutions. The idea to build an instrument capable of contributing significantly to cosmological measurements arose in 2007, in the context of the Consolider Ingenio 2010 project financed by the Spanish government. This programme had as its objective the achievement in Spain of highly innovative projects.
The PAUCam team is now being joined by other European groups to conduct a survey, named PAUS, with the objective of scientifically exploiting the capabilities of the camera. Aside from the survey, observation time with PAUCam will be made available to the international scientific community, for astronomical as well as cosmological measurements.
• PAUCam was designed and built by a consortium comprising the Institut de Física d’Altes Energies (IFAE), the Institut de Ciències de l’Espai (ICE-CSIC/IEEC) and the Port d’Informació Científica (PIC), all in Barcelona, and the Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT) and the Instituto de Física Teórica (IFT-UAM/CSIC), both in Madrid.
In June, Fermilab’s Main Injector accelerator sustained a 521 kW proton beam, and set a world record for the production of high-energy neutrinos with a proton accelerator. The 120 GeV proton beam is used to provide high-energy neutrinos or antineutrinos to three experiments at the laboratory: the long-baseline experiments MINOS+ and NOvA (CERN Courier June 2015 p17) and the neutrino-interaction experiment MINERvA (CERN Courier April 2014 p26).
The record surpasses the proton-beam power of more than 400 kW achieved at CERN for the CERN Neutrinos to Gran Sasso (CNGS) beamline, which provided neutrinos for the ICARUS and OPERA long-baseline experiments. The highest beam powers for fixed-target proton beams are achieved with protons in the giga-electron-volt range. Both the Spallation Neutron Source at Oak Ridge National Laboratory and the cyclotron facility at the Paul Scherrer Institute in Switzerland create proton beams with powers in excess of 1 MW. In the 1990s, Los Alamos National Laboratory operated a 0.8 GeV proton beam at about 800 kW for its low-energy neutrino experiment, LSND.
The power of the proton beam is a key element in producing neutrinos at accelerators: the more protons packed in the beam, the higher the number of neutrinos and antineutrinos produced and the better the chance to record neutrino interactions. The protons strike a target to create pions and other short-lived particles; the higher the proton energy, the larger the number of pions produced. Magnetic-focusing horns direct the charged pions into a vacuum pipe that is centred along the desired neutrino-beam direction. As the pions decay, they produce neutrinos and antineutrinos that are boosted in the direction of the original pions.
Since 2011, Fermilab has made significant upgrades to its accelerators and reconfigured the complex to provide the best possible particle beams for neutrino and muon experiments. The next goal for the 3.3 km circumference Main Injector accelerator is to deliver 700 kW in 2016 – double the beam power produced in the Tevatron era.
Fermilab plans to make additional upgrades to its accelerator complex over the next decade. The Proton Improvement Project-II includes the construction of an 800 MeV superconducting linac. Its beam would enable the Main Injector to provide more than 1.2 MW of proton beam power for the international Deep Underground Neutrino Experiment (CERN Courier April 2015 p20).
Fermilab also operates a second neutrino beamline, powered by its 8 GeV booster accelerator. This provides neutrinos for the Short Baseline Neutrino programme, which comprises three neutrino detectors: MicroBooNE (construction complete), ICARUS (upgrades underway at CERN) and the Short Baseline Neutrino Detector (construction to start in 2016).
Based on optical observations, a team of astronomers has, for the first time, demonstrated a link between a very long-lasting gamma-ray burst (GRB) and an unusually bright supernova explosion. The results show that the supernova was not driven by radioactive decay, as expected, but most likely by the spin down of a magnetar, a neutron star with an extremely strong magnetic field.
GRBs have intrigued astronomers since their discovery nearly 50 years ago by US military satellites intended to detect nuclear test explosions conducted by the Soviet Union. Mysterious gamma-ray flashes were detected, not from Earth, but from random directions in the sky. It was only some 30 years later that the detection of their precise locations and the measurement of their redshifts by follow-up observations proved them to be very luminous events from remote galaxies. The further evidence that some of them are associated with supernova explosions settled the issue of their true nature as being a manifestation of the core collapse of a massive star (CERN Courier September 2003 p13).
Astronomers usually distinguish two main classes of GRBs: the short ones that flare up for less than about 2 s and the longer ones. Among the latter, there are a few outstanding bursts lasting more than 10,000 s, which have been proposed to originate in the explosion of giant stars with much larger radii (CERN Courier June 2013 p12). A team led by Jochen Greiner of the Max-Planck-Institut für extraterrestrische Physik in Garching, Germany, has now shown that a supernova explosion is associated with one of these rare ultra-long-duration GRBs, namely GRB 111209A. The supernova’s presence has been derived from observations of the afterglow emission by two telescopes of the European Southern Observatory in Chile: the GROND instrument on the 2.2 m telescope at La Silla and the X-shooter instrument on the Very Large Telescope at Paranal.
The supernova’s spectral and timing properties are both very peculiar. Its luminosity is intermediate between the supernovas usually associated with GRBs and a new class of super-luminous supernovas discovered in 2011. The exceptional luminosity of the latter would be due to energy injection from a rapidly rotating magnetar – a neutron star with a huge magnetic field of up to about 10¹⁰ T. The same process could be at play in the supernova of GRB 111209A. Indeed, the huge amount of nickel needed to explain the observed light curve by radioactive decay of 56Ni is not compatible with the rather featureless spectral shape, which suggests a star of low metallicity. While Greiner and colleagues cannot prove that a magnetar is at the origin of the ultra-long GRB of 9 December 2011, nor the source of the luminous and peculiar supernova they observed, they can rule out alternative possibilities, leaving this as the most likely one.
Magnetars have already been invoked to explain the long-lasting afterglow emission of some GRBs (CERN Courier May 2007 p11). Now they seem to be needed to account for powering the prompt emission of some of these powerful flashes of gamma rays. Their advantage is that they would provide a continuous power supply, from hours to months, by losing rotational energy through magnetic-dipole radiation. The flexibility of the magnetar model fits peculiar GRBs and supernovas well, but what about the more standard GRBs? Could they also be powered by a new-born magnetar rather than by a black hole?
On 21 September 1955, Owen Chamberlain, Emilio Segrè, Clyde Wiegand and Tom Ypsilantis found their first evidence of the antiproton, gathered through measurements of its momentum and its velocity. Working at what was known as the “Rad Lab” at Berkeley, they had set up their experiment at a new accelerator, the Bevatron – a proton synchrotron designed to reach an energy of 6.5 GeV, sufficient to produce an antiproton in a fixed-target experiment (CERN Courier November 2005 p27). Soon after, a related experiment led by Gerson Goldhaber and Edoardo Amaldi found the expected annihilation “stars”, recorded in stacks of nuclear emulsions (figure 1). Forty years later, by combining antiprotons and positrons, an experiment at the Low Energy Antiproton Ring (LEAR) at CERN gathered evidence in September 1995 for the production of the first few atoms of antihydrogen.
Over the decades, antiprotons have become a standard tool for studies in particle physics; the word “antimatter” has entered into mainstream language; and antihydrogen is fast becoming a laboratory for investigations in fundamental physics. At CERN, the Antiproton Decelerator (AD) is now an important facility for studies in fundamental physics at low energies, which complement the investigations at the LHC’s high-energy frontier. This article looks back at some of the highlights in the studies of the antiworld at CERN, and takes a glimpse at what lies in store at the AD.
Back at the Bevatron, the discovery of the antineutron through neutral particle annihilation followed in 1956, setting the scene for studies of real antimatter. Initially, everyone expected perfect symmetry between matter and antimatter through the combination of the operations of charge conjugation (C), parity (P) and time reversal (T). However, following the observation of CP violation in 1964, it was not obvious that nuclear forces were CPT invariant and that antinucleons should bind to build antinuclei. These doubts were laid to rest with the discovery of the antideuteron at CERN by a team led by Antonino Zichichi, and at Brookhaven by a team from Columbia University, including Leon Lederman and Sam Ting (CERN Courier May 2009 p15 and October 2009 p22). A decade later, evidence emerged for antihelium-3 and antitritium in the WA33 experiment at CERN’s Super Proton Synchrotron, following the sighting of a few candidates at the 70 GeV proton synchrotron at the Institute for High Energy Physics near Serpukhov. More recently, the availability of colliding beams of heavy ions has led to the observation of antihelium-4 by the STAR experiment at Brookhaven’s Relativistic Heavy-Ion Collider (CERN Courier June 2011 p8). At CERN, the ALICE experiment at the LHC observes the production of light nuclei and antinuclei with comparable masses and therefore compatible binding energies (figure 2).
Back in 1949, before the discovery of the antiproton, Enrico Fermi and Chen-Ning Yang predicted the existence of bound nucleon–antinucleon states (baryonium), when they noted that certain repulsive forces between two nucleons could become attractive in the nucleon–antinucleon system. Later, quark models based on duality predicted the existence of states made of two quarks and two antiquarks, which should be observed when a proton annihilates with an antiproton. In the 1970s, nuclear-potential models went on to predict a plethora of bound states and resonance excitations around the two-nucleon mass. There were indeed reports of such states, among them narrow states observed in antiproton–proton (p̄p) annihilation at CERN’s Proton Synchrotron (PS) and in measurements of the p̄p cross-section as a function of energy (the S meson with a mass of 1940 MeV).
Baryonium was the main motivation for the construction at CERN of LEAR, which ran for more than a decade from 1982 to 1996 (see box). However, none of the baryonium states were confirmed at LEAR. The S meson was not observed with a sensitivity 10 times below the signal reported earlier in the p̄p total cross-section. Monoenergetic transitions to bound states were also not observed. The death of baryonium was a key topic for the Antiproton 86 Conference in Thessaloniki. What had happened? The high quality of the antiproton beams from LEAR meant that they were free of pion contamination: all of the pions had decayed. The high intensity of antiprotons (10⁶/s, compared with about 10²/s in extracted beams at the PS) and a momentum resolution of 10⁻³–10⁻⁴ were crucial at low energies for antiprotons stopping with very small range-straggling.
The spectroscopy of mesons produced in p̄p annihilation at rest in several experiments at LEAR proved to be much more fruitful. This continued a tradition that had begun in the 1960s with antiprotons annihilating in the 81 cm Hydrogen Bubble Chamber at the PS, leading to the discovery of the E meson (E for Europe, now the η(1440)) and the D meson (now the f1(1285)) in p̄p → (E, D → KK̄π)ππ. The former led to the long-standing controversy about the existence in this mass region of a glueball candidate – a state made only of gluons – which was observed in radiative J/ψ decay at SLAC’s e⁺e⁻ collider, SPEAR. With the start-up of LEAR, the experiments ASTERIX, OBELIX, Crystal Barrel and JETSET took over the baton of meson spectroscopy in p̄p annihilation. ASTERIX discovered a tensor meson – the AX, now the f2(1565) – which was also reported by OBELIX; its structure is still unclear, although it could be the predicted tensor baryonium state.
Crystal Barrel specialized in the detection of multineutral events. The antiprotons were stopped in a liquid-hydrogen target and π⁰ mesons were detected through their γγ decays in a barrel-shaped assembly of 1380 CsI(Tl) crystals. Figure 3 shows the detector together with a Dalitz plot of p̄p annihilation into π⁰π⁰π⁰, measured by the experiment. The non-uniform distribution of events indicates the presence of intermediate resonances that decay into π⁰π⁰, such as the spin-0 mesons f0(980) and f0(1500), and the spin-2 mesons f2(1270) and f2(1565). The f0(1500) is a good candidate for a glueball.
ICE, the AA and LEAR
The construction of LEAR took advantage of the antiproton facility that was built at CERN in 1980 to search for the W and Z bosons at the Super Proton Synchrotron (SPS) operating as a p̄p collider (CERN Courier December 1999 p15). The antiprotons originated when 26 GeV protons from the PS struck a target. Emerging with an average momentum of 3.5 GeV/c, they were collected in the Antiproton Accumulator (AA), and a pure antiproton beam with small transverse dimensions was generated by stochastic cooling. Up to 10¹² antiprotons a day could be generated and stored. The antiprotons were then extracted and injected into the PS. After acceleration to 26 GeV, they were transferred to the SPS where they circulated in the same beam pipe as the protons, but in the opposite direction. After a final acceleration to 270 GeV, the antiprotons and protons were brought into collision.
For injection into LEAR, the 3.5 GeV/c antiprotons from the AA were decelerated in the PS, down to 600 MeV/c. Once stored in LEAR, they were further decelerated to 60 MeV/c and then slowly extracted with a typical intensity of 10⁶/s. LEAR started up in 1982 and saw as many as 16 experiments before being decommissioned in 1996. The LEAR magnet ring lives on in the Low Energy Ion Ring, which forms part of the injection chain for heavy ions into the LHC.
LEAR also benefitted from the Initial Cooling Experiment (ICE), a storage ring designed in the late 1970s to test Simon van der Meer’s idea of stochastic cooling on antiprotons, and later to investigate electron cooling. After essential modifications, the electron cooler from ICE went on to assist in cooling antiprotons at LEAR, and is now serving at CERN’s current antiproton facility, the AD (CERN Courier September 2009 p13). ICE also contributed to measurements on antiprotons when, in August 1978, it successfully stored antiprotons at 2.1 GeV/c – a world first – keeping them circulating for 32 hours. The previous best experimental limit on the antiproton lifetime, from bubble-chamber experiments, was about 10⁻⁴ s; now, it is known to be more than 8 × 10⁵ years.
Fundamental symmetries
The CPT theorem postulates that physical laws remain the same when the combined operation of CPT is performed. CPT invariance follows in quantum field theories from certain assumptions, such as Lorentz invariance and point-like elementary particles. However, CPT violation is possible at very small length scales, and could lead to slight differences between the properties of particles and antiparticles, such as lifetime, inertial mass and magnetic moment.
At LEAR, the TRAP collaboration (PS196) performed a series of pioneering experiments to compare precisely the charge-to-mass ratios of the proton and antiproton, using antiprotons stored in a cold electromagnetic (Penning) trap. The signal from a single stored antiproton could be observed, and antiprotons were stored in the trap for up to two months. By measuring the cyclotron frequency of the orbiting antiprotons with an oscillator and comparing it with the cyclotron frequency of H⁻ ions in the same trap, the team finally achieved a result at the level of 9 × 10⁻¹¹. The experiment used H⁻ ions instead of protons to avoid biases when reversing the signs of the electric and magnetic fields.
Under the assumption of CPT invariance, the violation of CP symmetry first observed in the neutral-kaon system in 1964 implies that T invariance is also violated. However, in 1998 the CPLEAR experiment demonstrated the violation of T in the neutral-kaon system without assuming CPT conservation (CERN Courier March 1999 p21). The K⁰ and K̄⁰ oscillate into one another as a function of time, and T violation implies that, at a given time t, the probability of finding a K⁰ when initially a K̄⁰ was produced is not equal to the probability of finding a K̄⁰ when a K⁰ was produced. CPLEAR established the identity of the initial kaon by measuring the sign of the associated charged kaon in the annihilation p̄p → K⁺K̄⁰π⁻ or K⁻K⁰π⁺; that of the kaon at time t was inferred by detecting the decays K̄⁰ → π⁺e⁻ν̄ and K⁰ → π⁻e⁺ν. Figure 4 shows that a small asymmetry was indeed observed, consistent with expectations from CP violation, assuming CPT invariance.
The CPT theorem also predicts that matter and antimatter should have identical atomic excitation spectra. Antihydrogen – the simplest form of neutral antimatter, consisting of a positron orbiting an antiproton – was observed for the first time in the PS210 experiment at LEAR. The circulating 1.9 GeV/c internal antiproton beam traversed a xenon-cluster jet target, allowing the possibility for an e⁺e⁻ pair to be produced as an antiproton passed through the Coulomb field of a xenon nucleus. The e⁺ could then be captured by the antiproton to form electrically neutral antihydrogen with a momentum of 1.9 GeV/c, which could be detected further downstream through its annihilation into pions and photons. This production process is rather rare, but nonetheless the PS210 collaboration reported evidence for nine antihydrogen atoms, following about two months of data taking in August–September 1995, and only months before LEAR was shut down. The observation of antihydrogen was confirmed two years later at Fermilab’s Antiproton Accumulator, albeit with a much smaller production cross-section.
At the AD
A new chapter in the story of antihydrogen at CERN opened in 2000 with the start-up of the AD, which decelerates antiprotons to 100 MeV/c, before extracting them for experiments on antimatter and atomic physics (CERN Courier November 1999 p17). The PS210 experiment had tried to make antihydrogen in flight, but to study, for example, the spectroscopy of antihydrogen, it is far more convenient to store antihydrogen atoms in electromagnetic traps, just as TRAP had done in its antiproton experiments. This requires antihydrogen to be produced at very low energies, which the AD helps to achieve.
In 2002, the ATHENA and ATRAP experiments at the AD demonstrated the production of large numbers of slow antihydrogen atoms (CERN Courier November 2002 p5 and December 2002 p5). ATHENA used absorbing foils to reduce the energy of the antiprotons from the AD to a few kilo-electron-volts. A small fraction of the antiproton beam was then captured in a Penning trap, while positrons from a radioactive sodium source were stored in a second trap. The antiproton and positron clouds were then transferred to a third trap and made to overlap to produce electrically neutral antihydrogen, which migrated to the cryostat walls and annihilated. The antihydrogen detector contained two layers of silicon microstrips to track the charged pions from the antiproton annihilation; an array of 192 CsI crystals detected and measured the energies of the photons from the positron annihilation (figure 5). About a million antihydrogen atoms were produced during the course of the experiment, corresponding to an average rate of 10 antiatoms per second.
Antihydrogen has a magnetic dipole moment (that of the positron), which means that it can be captured in an inhomogeneous magnetic field. The first attempt to do this was carried out at the AD by the ALPHA experiment, which successfully captured 38 antihydrogen atoms in an octupolar magnetic field (CERN Courier March 2011 p13). The initial antihydrogen storage time of 172 ms was increased later to some 15 minutes, thus paving the way to atomic spectroscopy experiments. A sensitive test of CPT is to induce transitions from singlet to triplet spin states (hyperfine splitting, or HFS) in the antihydrogen atom, and to compare the transition energy with that for hydrogen, which is known with very high precision. ALPHA made the first successful attempts to measure the HFS with microwave radiation, managing to flip the positron spin and to eject 23 antihydrogen atoms from the trap (CERN Courier April 2012 p7).
An alternative approach is to perform a Stern–Gerlach-type experiment with an antihydrogen beam. The ASACUSA experiment has used an anti-Helmholtz coil (cusp trap) to exert forces on the antihydrogen atoms and to select those in a given positron spin state. The polarization can then be flipped with microwaves of the appropriate frequency. In a first successful test, 80 antihydrogen atoms were detected downstream from the production region (CERN Courier March 2014 p5).
The ASACUSA collaboration has also tested CPT, using antiprotons stopped in helium. The antiproton was captured by ejecting one of the two orbiting electrons, the ensuing antiprotonic helium atom being left in a high-level, long-lived atomic state that is amenable to laser excitation. By using two counter-propagating laser beams (to reduce the Doppler broadening caused by thermal motion), the group was able to determine the antiproton-to-electron mass ratio with a precision of 1.3 ppb (CERN Courier September 2011 p7). An earlier comparison of the charge-to-mass ratio between the proton and the antiproton had been performed with a precision of 0.09 ppb by the TRAP collaboration at LEAR, as described above. When the results from ASACUSA and TRAP are combined, the masses and charges of the proton and antiproton are determined to be equal at a level below 0.7 ppb.
CPT also requires the magnetic moment of a particle to be equal to (minus) that of its antiparticle. The BASE experiment now under way at the AD will determine the magnetic moment of the antiproton to 1 ppb by measuring the spin-dependent axial oscillation frequency in a Penning trap subjected to a strong magnetic-field gradient. The experimental approach is similar to the one used to measure the magnetic moment of the proton to a precision of 3 ppb (CERN Courier July/August 2014 p8). The collaboration has already compared the charge-to-mass ratios of the antiproton and proton, with a fractional precision of 6.9 × 10⁻¹¹ (p7).
The weak equivalence principle (WEP), which states that all objects are accelerated in exactly the same way in gravitational fields, has never been tested with antimatter. Attempts using positrons or antiprotons have so far failed, as a result of stray electric or magnetic fields. In contrast, the electrically neutral antihydrogen atom is an ideal probe to test the WEP. The AEgIS collaboration at the AD plans to measure the sagging of an antihydrogen beam over a distance of typically 1 m with a two-grating deflectometer. The displacement of the moiré pattern induced by gravity will be measured with high resolution (around 1 μm) by using nuclear emulsions (figure 6) – the same detection technique that was used to demonstrate the annihilation of the antiproton at the Bevatron, back in 1956.
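As a rough back-of-the-envelope check of why a ~1 μm emulsion resolution suffices, the gravitational sag over the deflectometer can be estimated from s = gt²/2. The beam velocity used below is an illustrative assumption, not a figure quoted in the article:

```python
# Rough estimate of the gravitational sag of an antihydrogen beam over the
# ~1 m AEgIS deflectometer. The beam velocity is an illustrative assumption
# (a few hundred m/s is typical of trap-formed beams), not a quoted number.
g = 9.81     # m/s^2, assuming antihydrogen falls like ordinary matter
L = 1.0      # m, flight path through the two-grating deflectometer
v = 400.0    # m/s, assumed beam velocity

t = L / v                # time of flight: 2.5 ms
sag = 0.5 * g * t**2     # vertical displacement accumulated in free fall

print(f"time of flight: {t*1e3:.1f} ms, sag: {sag*1e6:.0f} um")
```

Under these assumptions the sag comes out at a few tens of micrometres, comfortably above the micrometre-scale resolution of the nuclear emulsions.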
The future is ELENA
Future experiments with antimatter at CERN will benefit from the Extra Low ENergy Antiproton (ELENA) project, which will become operational at the end of 2017. The capture efficiency of antiprotons in experiments at the AD is currently very low (less than 0.1%), because most of them are lost when degrading the 5 MeV beam from the AD to the few kilo-electron-volts required by the confinement voltage of electromagnetic traps. To overcome this, ELENA – a 30 m circumference electron-cooled storage ring that will be located in the AD hall – will decelerate antiprotons down to, typically, 100 keV. Fast extraction (as opposed to the slow extraction that was available at LEAR) is foreseen to supply the trap experiments.
One experiment that will profit from this new facility is GBAR, which also aims to measure the gravitational acceleration of antihydrogen. Positrons will be produced by a 4.3 MeV electron linac and used to create positive antihydrogen ions (i.e. an antiproton with two positrons) that can be transferred to an electromagnetic trap and cooled to 10 mK. After transfer to another trap, where one of the positrons is detached, the antihydrogen will be launched vertically with a mean velocity of about 1 m/s (CERN Courier March 2014 p31).
It is worth recalling that the discovery of the antiproton in Berkeley was based on some 60 antiprotons observed during a seven-hour run. The 1.2 GeV/c beam contained 5 × 10⁴ times more pions than antiprotons. Today, the AD delivers pure beams of some 3 × 10⁷ antiprotons every 100 s at 100 MeV/c, which makes the CERN laboratory unique in the world for antimatter studies. Over the decades, antiproton beams have led to the discovery of new mesons and enabled precise tests of symmetries between matter and antimatter. Now, the properties of hydrogen and antihydrogen are being compared, and accurate tests will be performed with ELENA. The odds of seeing any violation of this exact symmetry are slim, the CPT theorem being a fundamental law of physics. However, experience shows that – as with the surprising discovery of the non-conservation of parity in 1957 and CP violation in 1964 – experiments will, ultimately, have the last word.
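Putting that progress in numbers, a short sketch using only the figures quoted above:

```python
# Comparing antiproton delivery rates: the 1955 Bevatron run versus
# today's AD, using only the figures quoted in the text.
bevatron_rate = 60 / (7 * 3600)   # ~60 antiprotons in a seven-hour run
ad_rate = 3e7 / 100               # ~3e7 antiprotons every 100 s

print(f"Bevatron: {bevatron_rate:.1e} antiprotons/s")
print(f"AD:       {ad_rate:.1e} antiprotons/s")
print(f"gain:     ~{ad_rate / bevatron_rate:.1e}")
```

The gain in rate is roughly eight orders of magnitude, quite apart from the enormous improvement in beam purity.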
Ordinary hadrons are not the only things found in the debris of the “fireballs” produced in heavy-ion collisions at the ALICE experiment at the LHC; there are also loosely bound composite objects such as deuterons and light hypernuclei, together with their antiparticles. Studies show that the measured production of these particles agrees very well with calculations performed using the same method as for hadrons, implying that the production rates of these loosely bound objects are fixed at the phase boundary between the quark–gluon plasma of the fireball and a hadron gas. How is this possible? Thermodynamics provides the answer.
The main goal of the ALICE experiment at the LHC is to produce and study the properties of matter as it existed in the first few microseconds after the Big Bang. Such matter consists of fermions and bosons, the fundamental entities of the Standard Model. Depending on the temperature, T, only particles with masses (in natural units) much less than T are copious. For T < 1 GeV, or about 10¹³ K, these are the u, d and s quarks and the gluons of the strong interactions. In addition, there are of course photons, leptons and neutrinos.
This “cosmic matter” can be produced in the laboratory by collisions at relativistic energies between very heavy atomic nuclei, such as lead at the LHC and gold at Brookhaven’s Relativistic Heavy Ion Collider (RHIC). In these collisions, a fireball is formed at (initial) temperatures up to 600 MeV, with a volume exceeding 1000 fm³ – about the volume of a lead nucleus – and with lifetimes exceeding 10 fm/c, about 3 × 10⁻²³ s. This space–time volume is macroscopic for strong interactions, but charged leptons, photons and neutrinos leave the fireball without interacting and play no part in the following discussion. (However, charged leptons and photons do have a role as penetrating probes of the produced matter.) Such deconfined cosmic matter is referred to as quark–gluon plasma (QGP) because its constituents carry colour and can roam freely within the volume of the fireball. At LHC energies, the QGP comprises, in addition to gluons, essentially equal numbers of quarks and antiquarks, i.e. it carries no net baryon number, as would also have been the case in the early universe.
The produced fireball expands and cools until it reaches the (pseudo-)critical temperature, Tc, of the deconfinement–confinement transition. Solving the strong-interaction equations on a discrete space–time lattice leads, in the most recent predictions, to Tc = 155±9 MeV. The yields of hadrons produced in central lead–lead (Pb–Pb) collisions at LHC energies and measured with the ALICE detector can indeed be quantitatively understood by assuming that they all originate from a thermalized state described with a grand-canonical thermal ensemble at Tchem = 156±2 MeV; the “chemical freeze-out” temperature Tchem is therefore very close to or coincides with Tc (see figure 1). While the overall agreement between data and model predictions is excellent, there is a 2.8σ discrepancy for (anti)protons, which is currently under scrutiny. Because the volume of the fireball is fixed by the number of particles produced, the temperature Tchem is the principal parameter determined in the grand-canonical analysis.
Such Pb–Pb collisions produce not only hadrons in the classical sense but also composite and even fragile objects such as light nuclei (d, t, ³He, ⁴He) and light Λ-hypernuclei, along with their antiparticles. Their measured yields decrease strongly with increasing (anti)baryon number – the penalty factor for each additional (anti)baryon is about 300 – hence (anti)⁴He production is a very rare process. Note that, because the fireball carries no net baryon number, the yields of the produced antiparticles closely coincide with the corresponding particle yields.
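That penalty factor can be understood, to first order, as Boltzmann suppression: in a thermal ensemble at Tchem, each additional (anti)nucleon costs roughly one nucleon rest mass in energy. A minimal sketch (the crude estimate below ignores spin degeneracy and the full phase-space integral, which bring it closer to the measured value):

```python
import math

# Order-of-magnitude estimate of the thermal "penalty factor" for adding
# one (anti)baryon to a produced cluster: in a grand-canonical ensemble,
# yields scale roughly as exp(-m c^2 / T), and each extra nucleon
# costs ~938 MeV in rest-mass energy.
m_N = 938.0      # nucleon mass, MeV
T_chem = 156.0   # chemical freeze-out temperature from the ALICE fit, MeV

penalty = math.exp(m_N / T_chem)
print(f"Boltzmann penalty per extra baryon: ~{penalty:.0f}")
```

The crude factor of roughly 400 has the same order of magnitude as the measured ~300, which is the point of the estimate.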
An interesting question is whether the yields of composite objects can be understood in the same grand-canonical scheme as discussed above, or whether such loosely bound objects follow a different systematics. The deuteron binding energy, for example, is only 2.23 MeV, and the energy needed to separate the Λ hyperon from a hypertriton nucleus – a bound state of a proton, neutron and Λ – is only about 130 keV, which is much smaller than the chemical freeze-out temperature, Tchem = 156 MeV.
Furthermore, the radii of such loosely bound objects are generally very large, even exceeding significantly the range of the nuclear force that binds them. The rms radius of the deuteron is 2.2 fm, for example. Even more dramatically, because of the molecular structure of the hypertriton ((p+n) + Λ), its rms radius, which in this case is the rms separation between the d nucleus and the Λ hyperon, is about 10 fm – that is, larger than the radius of the whole fireball.
Identification is the key
Before answering the question of how such exotic and fragile objects are produced, it is important to discuss how well such rare particles can be measured in the hostile environment of a Pb–Pb collision. In a central Pb–Pb collision at LHC energies, more than 15,000 charged particles are produced and up to 3000 go through the central barrel of the ALICE detector, making the task of tracking and identifying all of the different particle species quite a challenge. With ALICE, the identification of all of the produced particles and, in particular, the measurement of light nuclei and Λ-hypernuclei, is only possible because of the experiment’s excellent tracking and particle-identification capabilities via dE/dx and time-of-flight measurements. This is demonstrated in figure 2, which shows an event display from the ALICE time-projection chamber (TPC) for a central Pb–Pb collision. The highlighted black track corresponds to an identified anti-⁴He track, implying that even such rare particles can be tracked with precision. Figure 3 shows the clean identification achieved for anti-⁴He particles.
At first glance it is surprising that, as figure 1 shows, the measured yields of deuterons and hypertritons and their antiparticles agree very well with the yields calculated using the approach described above for hadrons at the same chemical freeze-out temperature, Tchem = 156 MeV. This implies that the yields of these loosely bound objects are determined at the phase boundary from the QGP to a hadron gas. How is this possible for such loosely bound objects whose sizes are much larger than the inter-particle separation at the time of chemical freeze-out?
To understand this, thermodynamics comes to the rescue. If there are no more inelastic collisions after chemical freeze-out, then the transition from the QGP to hadronic matter is followed by an isentropic expansion (i.e. with no change in entropy). Early studies of nucleus–nucleus collisions at the Berkeley Bevalac already showed that, for systems with isentropic expansion, the entropy per net baryon is proportional to log(d/p), implying that the yield of deuterons and antideuterons is determined by the entropy in the hot phase of the fireball. The same mechanism is at play at LHC energies: in this way, the “snowballs” can survive “hell”, as the experimental data from the ALICE collaboration show.
These facts can be used to search for even more exotic states. ALICE has performed a search for two hypothetical strange dibaryon states. The first one is the H-dibaryon, which is a six-quark bound state of uuddss, first predicted by Robert Jaffe in a “bag-model” calculation in 1977. This early calculation led to a binding energy of 81 MeV. Recent non-perturbative QCD calculations (on the lattice) suggest either a loosely bound state or a resonant state above the ΛΛ threshold. The existence of double-Λ hypernuclei, such as ΛΛ⁴He, reduced the allowed binding energy to a maximum of 7.3 MeV, with the most preferred value around 1 MeV. The second hypothetical bound state investigated by ALICE is a possible Λn bound state.
The two searches are performed in central (0–10%) Pb–Pb collisions at √s_NN = 2.76 TeV in the decay modes H-dibaryon → Λpπ⁻ and Λn → dπ⁻. No signals are observed in either of the measured invariant-mass distributions, therefore setting upper limits for the production yields. These limits are well below the yields predicted using the grand-canonical scenario discussed above with Tchem = 156 MeV (see figure 1). In fact, the upper limit at 99% CL obtained for the Λn bound state is a factor of around 50 below the prediction, whereas for the H-dibaryon the corresponding factor is close to 25. Given the success of the model in predicting deuteron and hypertriton yields, it appears that the existence of such bound states is very unlikely.
With the LHC’s Run 2, which has just started, and much more so with the upgraded ALICE apparatus in LHC Run 3, it is expected that ALICE can measure hypernuclei with still higher masses, such as Λ⁴H and ΛΛ⁴He and the corresponding antiparticles. These would be the highest-mass antihypernuclei ever observed. In addition, the hypertriton measurement will be extended in the three-body decay Λ³H → d + p + π⁻. The much higher statistics expected will also allow more detailed measurements, such as determination of the ⁴He transverse-momentum spectrum. In addition, searches are underway for other hypothetical bound states such as Λnn or other exotic di-baryons.
In summary, the success in describing the production of different hadrons and the yields of loosely bound objects with the same temperature, T, provides strong evidence for isentropic expansion after the transition from the QGP to a hadron gas. This scenario naturally explains the observed yields for loosely bound objects. On the other hand, the upper limits obtained for the H-dibaryon and the Λn bound state are well below the model prediction using the same temperature, T = 156 MeV, casting serious doubts on their existence.
The ALICE data on light (anti)nucleus production in pp, p–Pb and Pb–Pb collisions show that very loosely bound objects are produced with significant yields in all systems, with the thermodynamic limit reached for the Pb–Pb system. The measured yields are expected to increase with beam energy in much the same way as the overall multiplicity density. This implies significant production of antideuterons from high-energy cosmic rays, with potential consequences for dark-matter searches. Their yields can be well predicted within the scenario described here.