First light for the PAU camera

PAUCam, the camera for the Physics of the Accelerating Universe (PAU) project, has been successfully installed and commissioned at the William Herschel Telescope (WHT) at the Roque de los Muchachos Observatory on the island of La Palma. Installation took place on 3 June, with commissioning following during the subsequent four nights.

The innovative instrument is designed to measure precisely and efficiently the redshift of galaxies up to large distances – galaxies whose light is only now reaching Earth after starting its journey when the universe was less than half its present size. One of the primary goals of the project is to study how the expansion rate of the universe is increasing under the influence of the mysterious dark energy that makes up nearly 70% of the energy content of the universe. PAUCam’s competitiveness comes from its ability to obtain redshifts that are more precise than those of current photometric surveys, over a volume with a larger number density of galaxies than in spectroscopic surveys.

The camera is mounted at the prime focus of the 4 m-diameter WHT, which is part of the Isaac Newton Group of Telescopes, operated at present by a consortium between the governments of the Netherlands, Spain and the UK. The location at the prime focus of the WHT imposes severe weight limitations that are in conflict with the complexity and large size of the instrument. The solution was to build the main body of the camera with carbon fibre, with engineering developed in Spain.

Another innovation of the camera is the technique used to measure redshifts. With PAUCam, the redshift is measured photometrically: the same object is imaged multiple times, with its light passing through filters of different colours. PAUCam uses a set of 40 narrow-band filters, each passing light in a 10 nm-wide band, covering wavelengths from 450 to 850 nm. Another set of filters passes the light in six wider bands, named u, g, r, i, z and Y. The large number of narrow-band filters allows a determination of redshifts with a precision of 0.003(1+z), where z is the redshift parameter, or 10(1+z) Mpc, the characteristic scale of linear growth in the universe.
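As an illustration, the quoted precision can be evaluated at a few sample redshifts. This is a sketch only; the function name `sigma_z` is introduced here for illustration and is not part of any PAU software.

```python
# Illustrative sketch: evaluate PAUCam's quoted photometric-redshift
# precision, sigma_z = 0.003 * (1 + z), at a few sample redshifts.
def sigma_z(z, scale=0.003):
    """Redshift uncertainty for the quoted precision 0.003(1+z)."""
    return scale * (1.0 + z)

for z in (0.2, 0.5, 1.0):
    print(f"z = {z:.1f}: sigma_z = {sigma_z(z):.4f}")
```

At z = 1, for example, the target uncertainty is 0.006 in redshift – roughly an order of magnitude better than typical broad-band photometric surveys.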

The photometric technique used in PAUCam contrasts with the spectroscopic technique, where the redshift of an object is determined by analysing its light through a spectrograph. The latter technique allows very precise determination of redshifts, but at the expense of much longer exposure times. Furthermore, only the spectra of previously selected objects are analysed, while the photometric technique determines, in principle, the redshift of all of the objects in the region of the sky being imaged. In the case of PAUCam, the expectation is to measure the redshift of about 50,000 objects every night of observation. The camera covers the entire field of view of the WHT (1 square degree) with a mosaic of 18 Hamamatsu Photonics red-sensitive CCDs, each with 4000 × 2000 pixels.

PAUCam has been designed and built over the past six years by a fruitful collaboration between astronomers, cosmologists and particle physicists in a consortium of Spanish institutions. The idea to build an instrument capable of contributing significantly to cosmological measurements arose in 2007, in the context of the Consolider Ingenio 2010 project financed by the Spanish government. This programme had as its objective the achievement in Spain of highly innovative projects.

The PAUCam team is now being joined by other European groups to conduct a survey, named PAUS, with the objective of scientifically exploiting the capabilities of the camera. Aside from the survey, observation time with PAUCam will be made available to the international scientific community, for astronomical as well as cosmological measurements.

• PAUCam was designed and built by a consortium comprising the Institut de Física d’Altes Energies (IFAE), the Institut de Ciències de l’Espai (ICE-CSIC/IEEC) and the Port d’Informació Científica (PIC), all in Barcelona, and the Centro de Investigaciones Energéticas Medioambientales y Tecnológicas (CIEMAT) and the Instituto de Física Teórica (IFT-UAM/CSIC), both in Madrid.

Fermilab sets neutrino-beam record

In June, Fermilab’s Main Injector accelerator sustained a 521 kW proton beam, setting a world record for the production of high-energy neutrinos with a proton accelerator. The 120 GeV proton beam is used to provide high-energy neutrinos or antineutrinos to three experiments at the laboratory: the long-baseline experiments MINOS+ and NOvA (CERN Courier June 2015 p17) and the neutrino-interaction experiment MINERvA (CERN Courier April 2014 p26).

The record surpasses the proton-beam power of more than 400 kW achieved at CERN for the CERN Neutrinos to Gran Sasso (CNGS) beamline, which provided neutrinos for the ICARUS and OPERA long-baseline experiments. The highest beam powers for fixed-target proton beams are achieved with protons in the giga-electron-volt range. Both the Spallation Neutron Source at Oak Ridge National Laboratory and the cyclotron facility at the Paul Scherrer Institute in Switzerland create proton beams with powers in excess of 1 MW. In the 1990s, Los Alamos National Laboratory operated a 0.8 GeV proton beam at about 800 kW for its low-energy neutrino experiment, LSND.

The power of the proton beam is a key element in producing neutrinos at accelerators: the more protons packed in the beam, the higher the number of neutrinos and antineutrinos produced and the better the chance to record neutrino interactions. The protons strike a target to create pions and other short-lived particles; the higher the proton energy, the larger the number of pions produced. Magnetic-focusing horns direct the charged pions into a vacuum pipe that is centred along the desired neutrino-beam direction. As the pions decay, they produce neutrinos and antineutrinos that are boosted in the direction of the original pions.

Since 2011, Fermilab has made significant upgrades to its accelerators and reconfigured the complex to provide the best possible particle beams for neutrino and muon experiments. The next goal for the 3.3 km circumference Main Injector accelerator is to deliver 700 kW in 2016 – double the beam power produced in the Tevatron era.

Fermilab plans to make additional upgrades to its accelerator complex over the next decade. The Proton Improvement Plan-II includes the construction of an 800 MeV superconducting linac. Its beam would enable the Main Injector to provide more than 1.2 MW of proton-beam power for the international Deep Underground Neutrino Experiment (CERN Courier April 2015 p20).

Fermilab also operates a second neutrino beamline, powered by its 8 GeV booster accelerator. This provides neutrinos for the Short Baseline Neutrino programme, which comprises three neutrino detectors: MicroBooNE (construction complete), ICARUS (upgrades underway at CERN) and the Short Baseline Neutrino Detector (construction to start in 2016).

Do magnetars power hour-long gamma-ray bursts?

Based on optical observations, a team of astronomers has, for the first time, demonstrated a link between a very long-lasting gamma-ray burst (GRB) and an unusually bright supernova explosion. The results show that the supernova was not driven by radioactive decay, as expected, but most likely by the spin down of a magnetar, a neutron star with an extremely strong magnetic field.

GRBs have intrigued astronomers since their discovery nearly 50 years ago by US military satellites intended to detect nuclear test explosions conducted by the Soviet Union. Mysterious gamma-ray flashes were detected, not from Earth, but from random directions in the sky. It was only some 30 years later that the detection of their precise locations and the measurement of their redshifts by follow-up observations proved them to be very luminous events from remote galaxies. The further evidence that some of them are associated with supernova explosions settled the issue of their true nature as being a manifestation of the core collapse of a massive star (CERN Courier September 2003 p13).

Astronomers usually distinguish two main classes of GRBs: the short ones that flare up for less than about 2 s and the longer ones. Among the latter, there are a few outstanding bursts lasting more than 10,000 s, which have been proposed to originate in the explosion of giant stars with much larger radii (CERN Courier June 2013 p12). A team led by Jochen Greiner of the Max-Planck-Institut für extraterrestrische Physik in Garching, Germany, has now shown that a supernova explosion is associated with one of these rare ultra-long-duration GRBs, namely GRB 111209A. The supernova’s presence has been derived from observations of the afterglow emission by two telescopes of the European Southern Observatory in Chile: the GROND instrument on the 2.2 m telescope at La Silla and the X-shooter instrument on the Very Large Telescope at Paranal.

The supernova’s spectral and timing properties are both very peculiar. Its luminosity is intermediate between the supernovas usually associated with GRBs and a new class of super-luminous supernovas discovered in 2011. The exceptional luminosity of the latter would be due to energy injection from a rapidly rotating magnetar – a neutron star with a huge magnetic field of up to about 10¹⁰ T. The same process could be at play in the supernova of GRB 111209A. Indeed, the huge amount of nickel needed to explain the observed light curve by radioactive decay of ⁵⁶Ni is not compatible with the rather featureless spectral shape, which suggests a star of low metallicity. While Greiner and colleagues cannot prove that a magnetar is at the origin of the ultra-long GRB of 9 December 2011, nor the source of the luminous and peculiar supernova they observed, they can rule out alternative possibilities, leaving this as the most likely one.

Magnetars have already been invoked to explain the long-lasting afterglow emission of some GRBs (CERN Courier May 2007 p11). Now they seem to be needed to account for powering the prompt emission of some of these powerful flashes of gamma rays. Their advantage is that they would provide a continuous power supply, from hours to months, by losing rotational energy through magnetic-dipole radiation. The flexibility of the magnetar model fits peculiar GRBs and supernovas well, but what about the more standard GRBs? Could they also be powered by a new-born magnetar rather than by a black hole?

In the steps of the antiproton

On 21 September 1955, Owen Chamberlain, Emilio Segrè, Clyde Wiegand and Tom Ypsilantis found their first evidence of the antiproton, gathered through measurements of its momentum and its velocity. Working at what was known as the “Rad Lab” at Berkeley, they had set up their experiment at a new accelerator, the Bevatron – a proton synchrotron designed to reach an energy of 6.5 GeV, sufficient to produce an antiproton in a fixed-target experiment (CERN Courier November 2005 p27). Soon after, a related experiment led by Gerson Goldhaber and Edoardo Amaldi found the expected annihilation “stars”, recorded in stacks of nuclear emulsions (figure 1). Forty years later, by combining antiprotons and positrons, an experiment at the Low Energy Antiproton Ring (LEAR) at CERN gathered evidence in September 1995 for the production of the first few atoms of antihydrogen.

Over the decades, antiprotons have become a standard tool for studies in particle physics; the word “antimatter” has entered into mainstream language; and antihydrogen is fast becoming a laboratory for investigations in fundamental physics. At CERN, the Antiproton Decelerator (AD) is now an important facility for studies in fundamental physics at low energies, which complement the investigations at the LHC’s high-energy frontier. This article looks back at some of the highlights in the studies of the antiworld at CERN, and takes a glimpse at what lies in store at the AD.

Back at the Bevatron, the discovery of the antineutron through neutral-particle annihilation followed in 1956, setting the scene for studies of real antimatter. Initially, everyone expected perfect symmetry between matter and antimatter through the combination of the operations of charge conjugation (C), parity (P) and time reversal (T). However, following the observation of CP violation in 1964, it was not obvious that nuclear forces were CPT invariant and that antinucleons should bind to build antinuclei. These doubts were laid to rest with the discovery of the antideuteron at CERN by a team led by Antonino Zichichi, and at Brookhaven by a team from Columbia University, including Leon Lederman and Sam Ting (CERN Courier May 2009 p15 and October 2009 p22). A decade later, evidence emerged for antihelium-3 and antitritium in the WA33 experiment at CERN’s Super Proton Synchrotron, following the sighting of a few candidates at the 70 GeV proton synchrotron at the Institute for High Energy Physics near Serpukhov. More recently, the availability of colliding beams of heavy ions has led to the observation of antihelium-4 by the STAR experiment at Brookhaven’s Relativistic Heavy-Ion Collider (CERN Courier June 2011 p8). At CERN, the ALICE experiment at the LHC observes the production of light nuclei and antinuclei with comparable masses and therefore compatible binding energies (figure 2).

Exit baryonium, enter new mesons

Back in 1949, before the discovery of the antiproton, Enrico Fermi and Chen-Ning Yang predicted the existence of bound nucleon–antinucleon states (baryonium), when they noted that certain repulsive forces between two nucleons could become attractive in the nucleon–antinucleon system. Later, quark models based on duality predicted the existence of states made of two quarks and two antiquarks, which should be observed when a proton annihilates with an antiproton. In the 1970s, nuclear-potential models went on to predict a plethora of bound states and resonance excitations around the two-nucleon mass. There were indeed reports of such states, among them narrow states observed in antiproton–proton (p̄p) annihilation at CERN’s Proton Synchrotron (PS) and in measurements of the p̄p cross-section as a function of energy (the S meson with a mass of 1940 MeV).

Baryonium was the main motivation for the construction at CERN of LEAR, which ran for more than a decade from 1982 to 1996 (see box). However, none of the baryonium states were confirmed at LEAR. The S meson was not observed, with a sensitivity 10 times below the signal reported earlier in the p̄p total cross-section. Monoenergetic transitions to bound states were also not observed. The death of baryonium was a key topic for the Antiproton 86 Conference in Thessaloniki. What had happened? The high quality of the antiproton beams from LEAR meant that all of the pions had decayed. The high intensity of antiprotons (10⁶/s, compared with about 10²/s in extracted beams at the PS) and the high momentum resolution of 10⁻³–10⁻⁴ were crucial at low energies for antiprotons stopping with very small range-straggling.

The spectroscopy of mesons produced in p̄p annihilation at rest in several experiments at LEAR proved to be much more fruitful. This continued a tradition that had begun in the 1960s with antiprotons annihilating in the 81 cm Hydrogen Bubble Chamber at the PS, leading to the discovery of the E meson (E for Europe, now the η(1440)) and the D meson (now the f1(1285)) in p̄p → (E, D → KK̄π)ππ. The former led to the long-standing controversy about the existence in this mass region of a glueball candidate – a state made only of gluons – which was observed in radiative J/ψ decay at SLAC’s e⁺e⁻ collider, SPEAR. With the start-up of LEAR, the experiments ASTERIX, OBELIX, Crystal Barrel and JETSET took over the baton of meson spectroscopy in p̄p annihilation. ASTERIX discovered a tensor meson – the AX, now the f2(1565) – which was also reported by OBELIX; its structure is still unclear, although it could be the predicted tensor baryonium state.

Crystal Barrel specialized in the detection of multineutral events. The antiprotons were stopped in a liquid-hydrogen target and π⁰ mesons were detected through their γγ decays in a barrel-shaped assembly of 1380 CsI(Tl) crystals. Figure 3 shows the detector together with a Dalitz plot of p̄p annihilation into π⁰π⁰π⁰, measured by the experiment. The non-uniform distribution of events indicates the presence of intermediate resonances that decay into π⁰π⁰, such as the spin-0 mesons f0(980) and f0(1500), and the spin-2 mesons f2(1270) and f2(1565). The f0(1500) is a good candidate for a glueball.

ICE, the AA and LEAR

The construction of LEAR took advantage of the antiproton facility that was built at CERN in 1980 to search for the W and Z bosons at the Super Proton Synchrotron (SPS) operating as a p̄p collider (CERN Courier December 1999 p15). The antiprotons originated when 26 GeV protons from the PS struck a target. Emerging with an average momentum of 3.5 GeV/c, they were collected in the Antiproton Accumulator (AA), and a pure antiproton beam with small transverse dimensions was generated by stochastic cooling. Up to 10¹² antiprotons a day could be generated and stored. The antiprotons were then extracted and injected into the PS. After acceleration to 26 GeV, they were transferred to the SPS where they circulated in the same beam pipe as the protons, but in the opposite direction. After a final acceleration to 270 GeV, the antiprotons and protons were brought into collision.

For injection into LEAR, the 3.5 GeV/c antiprotons from the AA were decelerated in the PS, down to 600 MeV/c. Once stored in LEAR, they were further decelerated to 60 MeV/c and then slowly extracted with a typical intensity of 10⁶/s. LEAR started up in 1982 and saw as many as 16 experiments before being decommissioned in 1996. The LEAR magnet ring lives on in the Low Energy Ion Ring, which forms part of the injection chain for heavy ions into the LHC.

LEAR also benefitted from the Initial Cooling Experiment (ICE), a storage ring designed in the late 1970s to test Simon van der Meer’s idea of stochastic cooling on antiprotons, and later to investigate electron cooling. After essential modifications, the electron cooler from ICE went on to assist in cooling antiprotons at LEAR, and is now serving at CERN’s current antiproton facility, the AD (CERN Courier September 2009 p13). ICE also contributed to measurements on antiprotons when, in August 1978, it successfully stored antiprotons at 2.1 GeV/c – a world first – keeping them circulating for 32 hours. The previous best experimental measurement of the antiproton lifetime, from bubble-chamber experiments, was about 10⁻⁴ s; now, it is known to be more than 8 × 10⁵ years.

Fundamental symmetries

The CPT theorem postulates that physical laws remain the same when the combined operation of CPT is performed. CPT invariance arises from the assumption in quantum field theories of certain requirements, such as Lorentz invariance and point-like elementary particles. However, CPT violation is possible at very small length scales, and could lead to slight differences between the properties of particles and antiparticles, such as lifetime, inertial mass and magnetic moment.

At LEAR, the TRAP collaboration (PS196) performed a series of pioneering experiments to compare precisely the charge-to-mass ratios of the proton and antiproton, using antiprotons stored in a cold electromagnetic (Penning) trap. The signal from a single stored antiproton could be observed, and antiprotons were stored in the trap for up to two months. By measuring the cyclotron frequency of the orbiting antiprotons with an oscillator and comparing it with the cyclotron frequency of H⁻ ions in the same trap, the team finally achieved a result at the level of 9 × 10⁻¹¹. The experiment used H⁻ ions instead of protons to avoid biases when reversing the signs of the electric and magnetic fields.
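The comparison works because, in a fixed trap field B, the cyclotron frequency depends only on the charge-to-mass ratio. A minimal numerical sketch of that relation follows; the 5.9 T field is an assumed, typical value, and the H⁻ mass simply adds two electron masses to the proton (binding energies neglected) – this is an illustration, not TRAP’s published analysis.

```python
import math

# Sketch: in a Penning trap the cyclotron frequency is
#   f_c = |q| * B / (2 * pi * m),
# so comparing f_c for two species in the same field B compares
# their charge-to-mass ratios directly.
E_CHARGE = 1.602176634e-19   # elementary charge, C
MEV_TO_KG = 1.78266192e-30   # kg per MeV/c^2
B = 5.9                      # tesla; assumed, typical trap field

def cyclotron_freq(q_coulomb, m_mev, b=B):
    """Cyclotron frequency in Hz for charge q (C) and mass m (MeV/c^2)."""
    return abs(q_coulomb) * b / (2 * math.pi * m_mev * MEV_TO_KG)

m_pbar = 938.272                 # antiproton mass, MeV/c^2 (CPT: equals proton)
m_hminus = 938.272 + 2 * 0.511   # H- = proton + two electrons, binding neglected

f_pbar = cyclotron_freq(-E_CHARGE, m_pbar)      # ~90 MHz at this field
f_hminus = cyclotron_freq(-E_CHARGE, m_hminus)
print(f"f_c(pbar) = {f_pbar / 1e6:.3f} MHz")
print(f"f_c(H-)   = {f_hminus / 1e6:.3f} MHz")
# The frequency ratio equals the mass ratio (same charge, same field):
print(f"ratio     = {f_pbar / f_hminus:.8f}")   # ~1.0011 from the two electrons
```

Both species carry the same (negative) charge, which is what lets the trap hold them under identical field polarities – the point of using H⁻ rather than protons.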

Under the assumption of CPT invariance, the violation of CP symmetry first observed in the neutral kaon system in 1964 implies that T invariance is also violated. However, in 1998 the CPLEAR experiment demonstrated the violation of T in the neutral kaon system without assuming CPT conservation (CERN Courier March 1999 p21). The K⁰ and K̄⁰ morph into one another as a function of time, and T violation implies that, at a given time t, the probability of finding a K⁰ when initially a K̄⁰ was produced is not equal to the probability of finding a K̄⁰ when a K⁰ was produced. CPLEAR established the identity of the initial kaon by measuring the sign of the associated charged kaon in the annihilation p̄p → K⁺K̄⁰π⁻ or K⁻K⁰π⁺; that of the kaon at time t was inferred by detecting the decays K⁰ → π⁻e⁺ν and K̄⁰ → π⁺e⁻ν̄. Figure 4 shows that a small asymmetry was indeed observed, consistent with expectations from CP violation, assuming CPT invariance.

The CPT theorem also predicts that matter and antimatter should have identical atomic excitation spectra. Antihydrogen – the simplest form of neutral antimatter, consisting of a positron orbiting an antiproton – was observed for the first time in the PS210 experiment at LEAR. The circulating 1.9 GeV/c internal antiproton beam traversed a xenon-cluster jet target, allowing the possibility for an e⁺e⁻ pair to be produced as an antiproton passed through the Coulomb field of a xenon nucleus. The e⁺ could then be captured by the antiproton to form electrically neutral antihydrogen with a momentum of 1.9 GeV/c, which could be detected further downstream through its annihilation into pions and photons. This production process is rather rare, but nonetheless the PS210 collaboration reported evidence for nine antihydrogen atoms, following about two months of data taking in August–September 1995, and only months before LEAR was shut down. The observation of antihydrogen was confirmed two years later at Fermilab’s Antiproton Accumulator, albeit with a much smaller production cross-section.

At the AD

A new chapter in the story of antihydrogen at CERN opened in 2000 with the start-up of the AD, which decelerates antiprotons to 100 MeV/c, before extracting them for experiments on antimatter and atomic physics (CERN Courier November 1999 p17). The PS210 experiment had tried to make antihydrogen in flight, but to study, for example, the spectroscopy of antihydrogen, it is far more convenient to store antihydrogen atoms in electromagnetic traps, just as TRAP had done in its antiproton experiments. This requires antihydrogen to be produced at very low energies, which the AD helps to achieve.

In 2002, the ATHENA and ATRAP experiments at the AD demonstrated the production of large numbers of slow antihydrogen atoms (CERN Courier November 2002 p5 and December 2002 p5). ATHENA used absorbing foils to reduce the energy of the antiprotons from the AD to a few kilo-electron-volts. A small fraction of the antiproton beam was then captured in a Penning trap, while positrons from a radioactive sodium source were stored in a second trap. The antiproton and positron clouds were then transferred to a third trap and made to overlap to produce electrically neutral antihydrogen, which migrated to the cryostat walls and annihilated. The antihydrogen detector contained two layers of silicon microstrips to track the charged pions from the antiproton annihilation; an array of 192 CsI crystals detected and measured the energies of the photons from the positron annihilation (figure 5). About a million antihydrogen atoms were produced during the course of the experiment, corresponding to an average rate of 10 antiatoms per second.

Antihydrogen has a magnetic dipole moment (that of the positron), which means that it can be captured in an inhomogeneous magnetic field. The first attempt to do this was carried out at the AD by the ALPHA experiment, which successfully captured 38 antihydrogen atoms in an octupolar magnetic field (CERN Courier March 2011 p13). The initial antihydrogen storage time of 172 ms was increased later to some 15 minutes, thus paving the way to atomic spectroscopy experiments. A sensitive test of CPT is to induce transitions from singlet to triplet spin states (hyperfine splitting, or HfS) in the antihydrogen atom, and to compare the transition energy with that for hydrogen, which is known with very high precision. ALPHA made the first successful attempts to measure the HfS with microwave radiation, managing to flip the positron spin and to eject 23 antihydrogen atoms from the trap (CERN Courier April 2012 p7).

An alternative approach is to perform a Stern–Gerlach-type experiment with an antihydrogen beam. The ASACUSA experiment has used an anti-Helmholtz coil (cusp trap) to exert forces on the antihydrogen atoms and to select those in a given positron spin state. The polarization can then be flipped with microwaves of the appropriate frequency. In a first successful test, 80 antihydrogen atoms were detected downstream from the production region (CERN Courier March 2014 p5).

The ASACUSA collaboration has also tested CPT, using antiprotons stopped in helium. The antiproton was captured by ejecting one of the two orbiting electrons, the ensuing antiprotonic helium atom being left in a high-level, long-lived atomic state that is amenable to laser excitation. By using two counter-propagating laser beams (to reduce the Doppler broadening caused by thermal motion), the group was able to determine the antiproton-to-electron mass ratio with a precision of 1.3 ppb (CERN Courier September 2011 p7). An earlier comparison of the charge-to-mass ratio between the proton and the antiproton had been performed with a precision of 0.09 ppb by the TRAP collaboration at LEAR, as described above. When the results from ASACUSA and TRAP are combined, the masses and charges of the proton and antiproton are determined to be equal at a level below 0.7 ppb.

CPT also requires the magnetic moment of a particle to be equal to (minus) that of its antiparticle. The BASE experiment now under way at the AD will determine the magnetic moment of the antiproton to 1 ppb by measuring the spin-dependent axial oscillation frequency in a Penning trap subjected to a strong magnetic-field gradient. The experimental approach is similar to the one used to measure the magnetic moment of the proton to a precision of 3 ppb (CERN Courier July/August 2014 p8). The collaboration has already compared the charge-to-mass ratios of the antiproton and proton, with a fractional precision of 6.9 × 10⁻¹¹ (p7).

The weak equivalence principle (WEP), which states that all objects are accelerated in exactly the same way in gravitational fields, has never been tested with antimatter. Attempts using positrons or antiprotons have so far failed, as a result of stray electric or magnetic fields. In contrast, the electrically neutral antihydrogen atom is an ideal probe to test the WEP. The AEgIS collaboration at the AD plans to measure the sagging of an antihydrogen beam over a distance of typically 1 m with a two-grating deflectometer. The displacement of the moiré pattern induced by gravity will be measured with high resolution (around 1 μm) by using nuclear emulsions (figure 6) – the same detection technique that was used to demonstrate the annihilation of the antiproton at the Bevatron, back in 1956.

The future is ELENA

Future experiments with antimatter at CERN will benefit from the Extra Low ENergy Antiproton (ELENA) project, which will become operational at the end of 2017. The capture efficiency of antiprotons in experiments at the AD is currently very low (less than 0.1%), because most of them are lost when degrading the 5 MeV beam from the AD to the few kilo-electron-volts required by the confinement voltage of electromagnetic traps. To overcome this, ELENA – a 30 m circumference electron-cooled storage ring that will be located in the AD hall – will decelerate antiprotons down to, typically, 100 keV. Fast extraction (as opposed to the slow extraction that was available at LEAR) is foreseen to supply the trap experiments.

One experiment that will profit from this new facility is GBAR, which also aims to measure the gravitational acceleration of antihydrogen. Positrons will be produced by a 4.3 MeV electron linac and used to create positive antihydrogen ions (i.e. an antiproton with two positrons) that can be transferred to an electromagnetic trap and cooled to 10 mK. After transfer to another trap, where one of the positrons is detached, the antihydrogen will be launched vertically with a mean velocity of about 1 m/s (CERN Courier March 2014 p31).

It is worth recalling that the discovery of the antiproton in Berkeley was based on some 60 antiprotons observed during a seven-hour run. The 1.2 GeV/c beam contained 5 × 10⁴ times more pions than antiprotons. Today, the AD delivers pure beams of some 3 × 10⁷ antiprotons every 100 s at 100 MeV/c, which makes the CERN laboratory unique in the world for antimatter studies. Over the decades, antiproton beams have led to the discovery of new mesons and enabled precise tests of symmetries between matter and antimatter. Now, the properties of hydrogen and antihydrogen are being compared, and accurate tests will be performed with ELENA. The odds of seeing any violation of exact symmetry are slim, the CPT theorem being a fundamental law of physics. However, experience shows that – as with the surprising discoveries of the non-conservation of parity in 1957 and of CP violation in 1964 – experiments will, ultimately, have the last word.

ALICE investigates ‘snowballs in hell’

Résumé

ALICE’s baked Alaska

Not only ordinary hadrons emerge from the debris of the “fireballs” produced in heavy-ion collisions at the ALICE experiment at the LHC; loosely bound composite objects, such as deuterons, light hypernuclei and their antiparticles, are found there as well. Studies show that the measured production of these particles agrees very well with results calculated by the same method as for hadrons, implying that the production rates of the loosely bound objects are fixed at the phase boundary between the quark–gluon plasma of the fireball and a hadron gas. How is this possible? Thermodynamics provides the answer.

The main goal of the ALICE experiment at the LHC is to produce and study the properties of matter as it existed in the first few microseconds after the Big Bang. Such matter consists of fermions and bosons, the fundamental entities of the Standard Model. Depending on the temperature, T, only particles with mass much less than T are copious. For T < 1 GeV, or about 10¹³ K, these are the u, d and s quarks and the gluons of the strong interactions. In addition, there are of course photons, leptons and neutrinos.

This “cosmic matter” can be produced in the laboratory by collisions at relativistic energies between very heavy atomic nuclei, such as lead at the LHC and gold at Brookhaven’s Relativistic Heavy Ion Collider (RHIC). In these collisions, a fireball is formed at (initial) temperatures up to 600 MeV, with a volume exceeding 1000 fm³ – about the volume of a lead nucleus – and with lifetimes exceeding 10 fm/c, about 3 × 10⁻²³ s. This space–time volume is macroscopic for strong interactions, but charged leptons, photons and neutrinos leave the fireball without interacting and play no part in the following discussion. (However, charged leptons and photons do have a role as penetrating probes of the produced matter.) Such deconfined cosmic matter is referred to as quark–gluon plasma (QGP) because its constituents carry colour and can roam freely within the volume of the fireball. At LHC energies, the QGP comprises, in addition to gluons, essentially equal numbers of quarks and antiquarks, i.e. it carries no net baryon number, as would also have been the case in the early universe.

The produced fireball expands and cools until it reaches the (pseudo-)critical temperature, Tc, of the deconfinement–confinement transition. Solving the strong-interaction equations on a discrete space–time lattice leads, in the most recent predictions, to Tc = 155±9 MeV. The yields of hadrons produced in central lead–lead (Pb–Pb) collisions at LHC energies and measured with the ALICE detector can indeed be quantitatively understood by assuming that they all originate from a thermalized state described with a grand-canonical thermal ensemble at Tchem = 156±2 MeV; the “chemical freeze-out” temperature Tchem is therefore very close to or coincides with Tc (see figure 1). While the overall agreement between data and model predictions is excellent, there is a 2.8σ discrepancy for (anti)protons, which is currently under scrutiny. Because the volume of the fireball is fixed by the number of particles produced, the temperature Tchem is the principal parameter determined in the grand-canonical analysis.

Such Pb–Pb collisions produce not only hadrons in the classical sense but also composite and even fragile objects such as light nuclei (d, t, ³He, ⁴He) and light Λ-hypernuclei, along with their antiparticles. Their measured yields decrease strongly with increasing (anti)baryon number – the penalty factor for each additional (anti)baryon is about 300 – hence (anti)⁴He production is a very rare process. Note that, because the fireball carries no net baryon number, the yields of the produced antiparticles closely coincide with the corresponding particle yields.
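The order of magnitude of this penalty factor can be understood from the thermal picture: adding one (anti)baryon of mass ~939 MeV to a composite object costs a Boltzmann factor exp(−m/T) at the chemical freeze-out temperature. This is a back-of-envelope sketch, not the full grand-canonical calculation (which also includes degeneracy and phase-space factors), but it lands in the right ballpark:

```python
import math

M_NUCLEON = 939.0  # MeV, approximate nucleon mass
T_CHEM = 156.0     # MeV, chemical freeze-out temperature from the ALICE fit

# Boltzmann suppression for each additional (anti)baryon:
# exp(-939/156) ~ 2.4e-3, i.e. a penalty of a few hundred,
# the same order as the measured factor of about 300.
penalty = math.exp(-M_NUCLEON / T_CHEM)
print(f"suppression per baryon: {penalty:.1e} (1/{1/penalty:.0f})")
```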

An interesting question is whether the yields of composite objects can be understood in the same grand-canonical scheme as discussed above, or whether such loosely bound objects follow a different systematics. The deuteron binding energy, for example, is only 2.23 MeV, and the energy needed to separate the Λ hyperon from a hypertriton nucleus – a bound state of a proton, neutron and Λ – is only about 130 keV, which is much smaller than the chemical freeze-out temperature, Tchem = 156 MeV.

Furthermore, the radii of such loosely bound objects are generally very large, even significantly exceeding the range of the nuclear force that binds them. The rms radius of the deuteron is 2.2 fm, for example. Even more dramatically, because of the molecular structure of the hypertriton ((p+n) + Λ), its rms radius, which in this case is the rms separation between the d nucleus and the Λ hyperon, is about 10 fm – that is, larger than the radius of the whole fireball.

Identification is the key

Before answering the question of how such exotic and fragile objects are produced, it is important to discuss how well such rare particles can be measured in the hostile environment of a Pb–Pb collision. In a central Pb–Pb collision at LHC energies, more than 15,000 charged particles are produced and up to 3000 pass through the central barrel of the ALICE detector, making the task of tracking and identifying all of the different particle species quite a challenge. With ALICE, the identification of all of the produced particles and, in particular, the measurement of light nuclei and Λ-hypernuclei, is only possible because of the experiment’s excellent tracking and particle-identification capabilities via dE/dx and time-of-flight measurements. This is demonstrated in figure 2, which shows an event display from the ALICE time-projection chamber (TPC) for a central Pb–Pb collision. The highlighted black track corresponds to an identified anti-⁴He nucleus, implying that even such rare particles can be tracked with precision. Figure 3 shows the clean identification achieved for anti-⁴He particles.

At first glance it is surprising that, as figure 1 shows, the measured yields of deuterons and hypertritons and their antiparticles agree very well with the yields calculated using the approach described above for hadrons at the same chemical freeze-out temperature, Tchem = 156 MeV. This implies that the yields of these loosely bound objects are determined at the phase boundary from the QGP to a hadron gas. How is this possible for such loosely bound objects whose sizes are much larger than the inter-particle separation at the time of chemical freeze-out?

To understand this, thermodynamics comes to the rescue. If there are no more inelastic collisions after chemical freeze-out, then the transition from the QGP to hadronic matter is followed by an isentropic expansion (i.e. with no change in entropy). Early studies of nucleus–nucleus collisions at the Berkeley Bevalac already showed that, for systems with isentropic expansion, the entropy/net-baryon is proportional to log(d/p), implying that the yield of deuterons and antideuterons is determined by the entropy in the hot phase of the fireball. The same mechanism is at play at LHC energies: in this way, the “snowballs” can survive “hell”, as the experimental data from the ALICE collaboration show.

These facts can be used to search for even more exotic states. ALICE has performed a search for two hypothetical strange dibaryon states. The first one is the H-dibaryon, which is a six-quark bound state of uuddss, first predicted by Robert Jaffe in a “bag-model” calculation in 1977. This early calculation led to a binding energy of 81 MeV. Recent non-perturbative QCD calculations (on the lattice) suggest either a loosely bound state or a resonant state above the ΛΛ threshold. The existence of double-Λ hypernuclei, such as ΛΛ⁴He, reduced the allowed binding energy to a maximum of 7.3 MeV, with the most preferred value around 1 MeV. The second hypothetical bound state investigated by ALICE is a possible Λn bound state.

The two searches are performed in central (0–10%) Pb–Pb collisions at √sNN = 2.76 TeV, in the decay modes H-dibaryon → Λpπ⁻ and, for the Λn bound state, the corresponding antiparticle decay to an antideuteron and a π⁺. No signals are observed in either of the measured invariant-mass distributions, so upper limits are set on the production yields. These limits are well below the yields predicted using the grand-canonical scenario discussed above with Tchem = 156 MeV (see figure 1). In fact, the upper limit at 99% CL obtained for the Λn bound state is a factor of around 50 below the prediction, whereas the upper limit at 99% CL for the H-dibaryon is a factor of close to 25 below the model prediction. Given the success of the model in predicting deuteron and hypertriton yields, it appears that the existence of such bound states is very unlikely.

With the LHC’s Run 2, which has just started, and much more so with the upgraded ALICE apparatus in LHC Run 3, it is expected that ALICE can measure hypernuclei with still higher masses, such as Λ⁴H and ΛΛ⁴He and the corresponding antiparticles. These would be the highest-mass antihypernuclei ever observed. In addition, the hypertriton measurement will be extended to the three-body decay Λ³H → d + p + π⁻. The much higher statistics expected will also allow more detailed measurements, such as determination of the ⁴He transverse-momentum spectrum. In addition, searches are under way for other hypothetical bound states such as Λnn or other exotic dibaryons.

In summary, the success in describing the production of different hadrons and the yields of loosely bound objects with the same temperature, T, provides strong evidence for isentropic expansion after the transition from the QGP to a hadron gas. This scenario naturally explains the observed yields for loosely bound objects. On the other hand, the upper limits obtained for the H-dibaryon and the Λn bound state are well below the model prediction using the same temperature, T = 156 MeV, casting serious doubts on their existence.

The ALICE data on light (anti)nucleus production in pp, p–Pb and Pb–Pb collisions show that very loosely bound objects are produced with significant yields for all systems, with the thermodynamic limit reached for the Pb–Pb system. The measured yields are expected to increase with beam energy in a similar way to the overall multiplicity density. This implies significant production of antideuterons from high-energy cosmic rays, with potential consequences for searches for dark matter. Their yields can be well predicted within the scenario described here.

Massimo Tarenghi: a lifetime in the stars


Massimo Tarenghi fell in love with astronomy at age 14, when his mother took away his stamp collection – on which he spent more time than on his schoolbooks – and gave him a book entitled Le Stelle (The Stars). By age 17, he had built his first telescope and become a well-known amateur astronomer, meriting a photo in the local daily newspaper with the headline “Massimo prefers a bigger telescope to a Ferrari.” Already, his dream was “to work at the largest observatory in the world”. That dream came true, because Massimo went on to build and direct the world’s most powerful optical telescope, the Very Large Telescope (VLT), at the European Southern Observatory (ESO)’s Paranal Observatory in Chile.

“I was born as a guy who likes to do impossible things and I like to do them 110%,” says Massimo, who decided to study physics at the University of Milan in the late 1960s “because [Giuseppe] Occhialini was the best in the world and allowed me to do a thesis in astronomy”. His road to the stars began in 1970, when he gained his PhD with a thesis on the production of gamma rays by Sagittarius A – at the time a mysterious radio source, now known to harbour the supermassive black hole at the centre of the Milky Way. This was at the time of the first X-ray and infrared observations of the centre of the Galaxy, and the first of many examples of far-sighted intuition in Massimo’s career.

Following his PhD, Massimo convinced his colleagues at Milan to support the construction of an infrared telescope on the Gornergrat in the Swiss Alps. He was then sent to the Steward Observatory at the University of Arizona, where he did pioneering work in infrared astronomy. He quickly made himself known with a daring request for telescope time involving all of the instruments. “At that time,” he recalls, “there was a clear separation between astronomy for infrared, spectroscopy or photography, and there were three levels of use of an instrument: astronomer without assistant, assistant astronomer, or general (university) public. I asked for all of the instruments – and in particular for the bolometer, which no one had ever dared ask for!” After a three-hour meeting, his proposal to observe infrared galaxies was judged “very interesting but totally crazy”. So the committee suggested a compromise: 10 of his candidate objects would be observed during the telescope’s spare time and then they could review the request. Massimo accepted, and two weeks later seven of his objects had been found to be infrared emitters. “So they gave me the whole bolometer three months later. I was lucky!” he says with the same enthusiasm as the 28-year-old postdoc he was at the time.

It was, once again, pure intuition. Massimo had chosen his 10 objects based on M82, a galaxy that interacts with its larger neighbour M81. “M82 is a beautiful galaxy with explosions and I thought, when two galaxies interact, they trigger explosions. So I simply had a collection of interacting galaxies and it came out that this is just what they did.” This intuition was to be confirmed by what has become a pillar of astrophysics: when two galaxies interact, the gas inside is compressed, creating a large number of new stars, which produce a large amount of infrared emission as they form.

While still in Arizona, Massimo decided to work on the optical identification of radio galaxies. “At the time, the Hercules cluster was not very well explored, with a redshift of 11,000 km/s. Compared to the well-known Coma cluster, with a redshift of 5000 km/s, it is much further away and very difficult to observe between Arizona’s summer storms,” he explains. “Astronomers came to me saying that the cluster was ‘theirs’, as they had started work on it three years earlier. But they had done no observations. So I offered to collaborate, and we decided that whoever took more galaxies would be the first author on the publication. They found 19; I took all of the rest: 300.” That paper is now a cornerstone in astronomy. “It was the first time we saw clearly the existence of a void in the universe,” he continues. “There was no galaxy between 6000 and 9000 km/s. Today we know this void is a remnant of the Big Bang, part of the structure of the anisotropy of the universe recently observed by the Planck space telescope.”

Breaking records

Dubbing himself “a difficult person,” Massimo broke new ground not just in the way that astronomy is done, but by bringing innovation in the way that telescopes are conceived of and built. Intuition, determination and audacity are indeed the distinctive marks of his 35-year-long career at ESO, which he joined in 1977 as a member of the Science Group, when the organization was still based at CERN (CERN Courier October 2012 p26). He had decided to join ESO to observe with the largest European telescope – the 3.6 m, which was just being inaugurated at ESO’s site at La Silla in Chile.

“I obtained three nights for my cluster of galaxies in the Southern hemisphere,” he recalls, “but I was the second official user, during the first week of observation, and nobody knew if it was going to work. I received those three nights (plus three to compensate in case of problems) because I was the only one in Europe who had experience with big telescopes. I started to complain the first night and obtained the sixth night.” Massimo was then told that there would be another week of tests and he was asked if he would test the telescope, so he took the full week. “My colleagues were jealous but then were surprised that I really tested the telescope to adjust and calibrate it.” After these two weeks, he went back to Geneva and told ESO’s director that astronomers need to be associated with the construction of telescopes from the start. “So I created the role of ‘friend of instrument’ and for each instrument we associated one astronomer in charge,” he explains. The role of instrument scientist, commonplace now at the large telescopes worldwide, is thus Massimo’s invention.

Unsurprisingly, he then became project scientist for ESO’s next telescope, the 2.2 m. Built for the Max Planck Institute, it had been destined for Namibia originally, but with Italy and Switzerland about to join ESO in 1981, the decision was taken to install it at ESO’s site at La Silla. “The Italians are very aggressive astronomers,” Massimo explains, “so we needed to increase telescope time, and I was asked to take the 2.2 m telescope from Heidelberg, put it in place in La Silla, and run the team.” They had to do everything: they had no dome, no foundations, and a budget of only DM 5 million, which was a very limited amount compared with other projects of a similar size. But, as Massimo says, “when you have no money you do great things,” and he had an idea. “I saw a thin aluminium dome on the last page of an amateur astronomers’ magazine, and I asked an engineer in my team to design a scaled-down version of our dome.” With the concrete foundations laid almost manually, and the help of three engineers recruited from Zeiss to build a new electronic system, they succeeded in installing the telescope for a total cost of DM 7 million.

The 2.2 m telescope saw its first light in June 1983, with a record-breaking angular resolution of 0.6 arc seconds. “The reason is simple,” says Massimo. “The dome was so small that we had to move all the electronics underneath, so there was no source of heat coming from the telescope, and that’s how we learnt how to remove heat from the dome.” It was also the first telescope to be operated remotely. “We took the controls from the upper floor to the lower floor, and when we saw that it worked, with the software engineer we decided to do remote control from La Serena. Everybody was laughing, saying it would never work, but there was no reason it should not!” At the time, the connection was through a telephone line – not with optical fibre – and only in one direction. Massimo explains that when they needed somebody to close the dome on the other side, they used the phone to communicate from La Serena to La Silla. On the first occasion, he recalls, “I forgot the guy was still there. All night he was waiting for my call, and he waited five, six hours before he decided to call me, asking whether he could…go to the toilet!”

In 1983, Massimo was asked to be project manager for the New Technology Telescope (NTT), a 3.6 m optical telescope that saw first light six years later. With a record-breaking resolution of 0.33 arc seconds, it produced sharper images of stellar objects than ever obtained with a ground-based telescope. The NTT was the prototype for a new type of telescope that would make the VLT possible. The main revolutionary feature was the application of active optics, in which a thin and flexible primary mirror is kept in its correct shape by a support system that responds to continual real-time analysis of a stellar image. It was ESO’s Ray Wilson who invented the system, but Massimo was involved from early on, and his former institute in Milan built the first test bench with which the system was shown to work in the early 1980s. The thin-mirror technology allowed by active optics was the breakthrough that enabled the construction of the next generation of much larger telescopes, in particular the VLT, built on a second ESO site in Chile, on the mountain of Cerro Paranal in the Atacama desert.

The VLT was proposed in 1986 and approved in 1987. Massimo was given the responsibility to build it in March 1988 by ESO’s director-general Harry van der Laan, and was later fully supported by the following director-general, Riccardo Giacconi. He was part of the team that decided to go from 4 m to 8 m mirrors that could be combined as an astronomical interferometer – a technique that was still in its early days. With four fixed 8.2 m-diameter Unit Telescopes (UTs) and four 1.8 m-diameter movable Auxiliary Telescopes (ATs), the VLT is today the most advanced optical observatory in the world. The UTs work either individually or in a combined mode using interferometry, while the ATs are entirely dedicated to interferometry. “I had under me 250 technicians. It was the craziest project I ever managed,” Massimo remembers, “and I learnt a lesson: if you want to work in the biggest observatory in the world you have to build it!”

Building the Paranal Observatory was not just a scientific experience for Massimo: he is also at the origin of the award-winning futuristic Residencia, chosen as a set for the James Bond film Quantum of Solace. “I wanted something that could make astronomers at Paranal become human again after 13 or 14 hours of observation, to experience the pleasure of water, and of green, red and all the colours missing in the desert,” he explains. “This is the dream we recreated in this place, water in the desert, for the people working at the most advanced telescope in the world.”

After the VLT, Massimo went on to direct another “crazy” astronomy project, the Atacama Large Millimeter/submillimeter Array (ALMA), the first truly global collaboration in astronomy (CERN Courier October 2007 p23). He also conducted the exploration work for the site of the next-generation facility, the European Extremely Large Telescope (E-ELT), with a record-breaking 39 m-diameter mirror. Construction work started at the Cerro Armazones, the chosen site for E-ELT, in March 2014. Massimo, who celebrated his 70th birthday at the end of July, officially retired from ESO in 2013, but he has not stopped working for European astronomy. He still commutes between the ESO sites in Chile, Santiago and Munich, supporting ESO’s public-relations activities in Chile – and spending endless nights photographing the unique sky above the Atacama desert.

The Beauty of Physics: Patterns, Principles, and Perspectives

By A R P Rau
Oxford University Press
Hardback: £25
Also available as an e-book

The selection of topics in this book reflects the author’s four-decade career in research physics and his resultant perspective on the subject. While aimed primarily at physicists, including junior students, it also addresses other readers who are willing to think with symbols and simple algebra in understanding the physical world. Each chapter, on themes such as dimensions, transformations, symmetries, or maps, begins with simple examples accessible to all, while connecting them later to more sophisticated realizations in more advanced topics of physics.

Crackle and Fizz: Essential Communication and Pitching Skills for Scientists

By Caroline van den Brul
Imperial College Press
Hardback: £35
Paperback: £15
E-book: £11


The introduction of Crackle and Fizz sets out a trope that may sound familiar: a decade-old social faux pas between scientists and journalists at a dinner party, where a speed-dating format for presenting science was met with ire, derision and a generally miserable evening. The claim is made that this could have been a chance to start over, to reframe science communication and realign the expectations of those involved. But to frame it so misses out on the past few decades of development in the science-communication field, which is now reaching a reflective maturity across academia, industry and the media. Unfortunately, the same erasure is a leitmotif in many of the chapters that follow.

Caroline van den Brul’s credentials are impressive, with years at the helm of BBC productions and engagement workshops. This history forms the backbone of the book, setting an anecdote-per-chapter rate that reads more like an autobiography than an attempt to impart lessons or experience to the reader. The remaining space is given over to consideration of narrative devices useful in contextualizing topics and engagement from a practitioner’s perspective. However, these are only superficially explored and offer little variation. After many pages promoting the importance of clarity, the titular “Crackle” is eventually revealed in the final chapter to be a (somewhat forced) acronym that summarizes and distils all preceding guidance. Had this been the starting point from which each aspect was explored in depth, the tone and flow of the book might have made for a more compelling read. Used as the conclusion, it feels condescendingly simplified. It’s a shame that, considering van den Brul’s history, the final chapter is the main one worth reading.

Overall, the book feels less like the anticipated dive into years of experience, and more like a pre-lunch conference workshop. If you are in the first stages of incorporating engagement and communication into your current practice, working through each chapter’s closing questions could be of some use. Or, should you feel like refreshing your current framework, they might give you a moment’s pause and adjustment, but no more than any other evaluation.

A Chorus of Bells and Other Scientific Inquiries

By Jeremy Bernstein
World Scientific
Hardback: £25
E-book: £19


In this volume of essays, written across a decade, Bernstein covers a breadth of subject matter. The first part, on the foundations of quantum theory, reflects the author’s conversations with the late John Bell, who persuaded him that there is still no satisfactory interpretation of the theory. The second part deals with nuclear weapons, and includes an essay on the creation of the modern gas centrifuge by German prisoners of war in the Soviet Union. Two shorter sections follow: the first on financial engineering, with a profile of Louis Bachelier, the French mathematician who created the subject at the beginning of the 20th century; the second and final part is on the Higgs boson and its role in generating mass.

To Explain the World: The Discovery of Modern Science

By Steven Weinberg
Harper Collins/Allen Lane
Hardback: £20 $28.99
Also available at the CERN bookshop


Steven Weinberg’s most recent effort is neither a treatise on the history of science nor a philosophical essay. The author presents instead his own panoramic view of the meandering roads leading to the Newtonian synthesis between terrestrial and celestial physics, rightfully considered as the beginning of a qualitatively new era in the development of basic science.

The first and second parts of the book deal, respectively, with Greek physics and astronomy. The remaining two parts are dedicated to the Middle Ages and to the scientific revolution of Copernicus, Galileo and Newton. The aim is to distil those elements that are germane to the development of modern science. The style is more persuasive than assertive: excerpts of philosophers, poets and historians are abundantly quoted and reproduced, with the aim of corroborating the specific viewpoints conveyed in the text. A similar strategy is employed when dealing with the scientific concepts involved in the discussion. More than a third of the 416 pages of the book contain a series of 35 “technical notes” – a quick reminder of a variety of geometric, physical and astronomical themes (the Thales theorem, the careful explanation of epicycles for inner and outer planets, the theory of rainbows and various other topics relevant to the main discussion of the text).

Passing before you through the pages, you will see not only Plato and Aristotle, but also Omar Khayyam, Albertus Magnus, Robert Grosseteste and many other progenitors of modern scientists. Nearly 2000 years separate the natural philosophy of the “Timaeus” from the birth of the scientific method. Many elements contributed serendipitously to the evolution leading from Plato to Galileo and Newton: the development of algebra and geometry, the divorce between science and religion, and an improved attitude of abstract thinkers towards technology. All of these aspects have certainly been important for the tortuous emergence of modern science. But are they sufficient to explain it? Scientists, historians and laymen will be able to draw their own lessons from the past as presented here, and this is just one of the intriguing aspects of this interdisciplinary book.

After reading this book quietly, you might be led to conclude that good scientific ideas and daring conjectures take a long time to mature. It has been an essential feature of scientific progress to understand which problems are ripe to study and which are not. No one could have made progress in understanding the nature of the electron before the advent of quantum mechanics. The plans for tomorrow require not only boldness and imagination, but also a certain realism that can be trained by looking at the lessons of the past. Today’s most interesting questions may not be scientifically answerable tomorrow, and lasting progress does not come by looking along a single line of sight, but all around, where there are mature phenomena to be scrutinized. This seems to be true for science as a whole, and in particular for physics.
