

Heavy ions in Annecy


Since the early 1980s, the Quark Matter conferences have been the most important venue for showing new results in the field of high-energy heavy-ion collisions. The 22nd in the series, Quark Matter 2011, took place in Annecy on 22–29 May and attracted a record 800 participants. Scheduled originally for 2010, it had been postponed to take place six months after the start of the LHC heavy-ion programme. It was hence – after Nordkirchen in 1987 and Stony Brook in 2001 – the third Quark Matter conference to feature results from a new accelerator.

The natural focus of the conference was on the first results from the approximately 10⁸ lead–lead (Pb–Pb) collisions that each of the three experiments participating in the LHC heavy-ion programme – ALICE, ATLAS and CMS – has recorded at the current maximum centre-of-mass energy of 2.76 TeV per nucleon–nucleon collision. In addition, the latest results from the PHENIX and STAR experiments at Brookhaven's Relativistic Heavy Ion Collider (RHIC) and its recent beam-energy-scan programme featured prominently, as did data from the Super Proton Synchrotron (SPS) experiments. The conference aimed at a synthesis in the understanding of heavy-ion data over two orders of magnitude in centre-of-mass energy.

The meeting also covered a range of theoretical highlights in heavy-ion phenomenology and field theory at finite temperature and/or density. And although, as one speaker put it, the wealth of first LHC data contributed much to the spirit that “the future is now”, there were sessions on future projects, including the programme of the approved experiment NA61/SHINE at the SPS, plans for upgrades to RHIC, experiments at the Facility for Antiproton and Ion Research under construction in Darmstadt, a plan for a heavy-ion programme at the Nuclotron-based Ion Collider facility in Dubna, as well as detailed studies for an electron–ion programme at a future electron–proton/electron–ion collider, e-RHIC, at Brookhaven, or LHeC at CERN.

Following a long-standing tradition, the conference was preceded by a “student day” featuring a set of introductory lectures catering for the particular needs of graduate students and young postdocs, who represented a third of the conference participants. The official conference inauguration was held on the morning of 22 May in the theatre at Annecy, the Centre Bonlieu, with welcome speeches from CERN’s director-general, Rolf Heuer, the director of the Institut National de Physique Nucléaire et de Physique des Particules (IN2P3), Jacques Martino, and the president of the French National Assembly, Bernard Accoyer. The same morning session featured an LHC status report by Steve Myers of CERN and a theoretical overview by Krishna Rajagopal of Massachusetts Institute of Technology.

Quark Matter 2011 also continued the tradition of scheduling summary talks from all of the major experiments in the introductory session. By the time the 800 participants walked from the Centre Bonlieu along the Lake of Annecy to the Imperial Palace business centre – the site of the afternoon parallel sessions – for a late lunch on the first day, they had listened to experimental summaries by Jürgen Schukraft for ALICE, Bolek Wyslouch for CMS, Peter Steinberg for ATLAS, Hiroshi Masui for STAR and Stefan Bathe for PHENIX. These 25-minute previews set the scene for the detailed discussions of the entire week.

This short report cannot summarize all of the interesting experimental and theoretical developments; examples from three particular areas must therefore suffice to illustrate the breadth of the discussion, the richness of the new results and their implications.

The importance of flow

Heavy-ion collisions at all centre-of-mass energies have long been known to display remarkable features of collectivity. In particular, in semicentral heavy-ion collisions at ultra-relativistic energies, approximately twice as many hadrons above pT = 2 GeV/c are produced parallel to the reaction plane as orthogonal to it, giving rise to a characteristic second harmonic v2 in the azimuthal distribution of particle production. Only a month after the end of the first LHC heavy-ion run, the ALICE collaboration announced in December 2010 that this elliptic flow, v2, persists unattenuated from RHIC to LHC energies. The bulk of the up to 1600 charged hadrons produced per unit rapidity in a central Pb–Pb collision at the LHC seems to emerge from the same flow field. Moreover, the strength of this flow field at RHIC and at the LHC is consistent with predictions from fluid-dynamic simulations, in which it emerges from a partonic state of matter with negligible dissipative properties. Indeed, one of the main motivations for a detailed flow phenomenology at RHIC and at the LHC is that flow measurements constrain dissipative QCD transport coefficients that are accessible to first-principle calculations in quantum field theory, thus providing one of the most robust links between fundamental properties of hot QCD matter and heavy-ion phenomenology.
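For orientation – this standard definition was implicit in the discussions summarized here – the flow coefficients vn are the Fourier coefficients of the azimuthal distribution of produced hadrons:

\[
\frac{\mathrm{d}N}{\mathrm{d}\phi} \;\propto\; 1 + 2\sum_{n\geq 1} v_n \cos\!\big[n\,(\phi - \Psi_n)\big],
\]

where φ is the azimuthal angle of a hadron and Ψn is the orientation of the nth-harmonic symmetry plane; v2 is the elliptic flow discussed above.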


Quark Matter 2011 marks a revolution in the dynamical understanding of flow phenomena in heavy-ion collisions. Until recently, flow phenomenology was based on a simplified event-averaged picture according to which a finite impact parameter collision defines an almond-shaped nuclear overlap region; the collective dynamics then translates the initial spatial asymmetries of this event-averaged overlap into the azimuthal asymmetries of the measured particle-momentum spectra. As a consequence, the symmetries of measured momentum distributions were assumed to reflect the symmetries of event-averaged initial conditions. However, over the past year it has become clear – in an intense interplay of theory and experiment – that there are significant fluctuations in the sampling of the almond-shaped nuclear overlap region on an event-by-event basis. The eventwise propagation of these fluctuations to the final hadron spectra results in characteristic odd flow harmonics, v1, v3, v5, which would be forbidden by the symmetries of an event-averaged spatial distribution at mid-rapidity.

In Annecy, the three LHC experiments and the two at RHIC all showed for the first time flow analyses at mid-rapidity that were not limited to the even flow harmonics v2 and v4; in addition, they indicated sizeable values for the odd harmonics that unambiguously characterize initial-state fluctuations (figure 1). This “Annecy spectrum” of flow harmonics was the subject of two lively plenary debates. The discussion showed that there is already an understanding – both qualitatively and on the basis of first model simulations – of how the characteristic dependence on centrality of the relative size of the measured flow coefficients reflects the interplay between event-by-event initial-state fluctuations and event-averaged collective dynamics.


Several participants remarked on the similarity of this picture with modern cosmology, where the mode distribution of fluctuations of the cosmic microwave background also gives access to the material properties of the physical system under study. The counterpart in heavy-ion collisions may be dubbed “vniscometry”. Indeed, since uncertainties in the initial conditions of heavy-ion collisions were the main bottleneck in using data so far for precise determinations of QCD transport coefficients such as viscosity, the measurement of flow coefficients that are linked unambiguously to fluctuations in the initial state has a strong potential to constrain further the understanding of flow phenomena and the properties of hot strongly interacting matter to which they are sensitive.

Quark Matter 2011 also featured major qualitative advances in the understanding of high-momentum-transfer processes embedded in hot QCD matter. One of the most important early discoveries of the RHIC heavy-ion programme was that hadronic spectra are generically suppressed at high transverse momentum, by up to a factor of 5 in the most central collisions. With the much higher rate of hard processes at the tera-electron-volt scale, the first data from ALICE and CMS have already extended knowledge of this nuclear modification of single inclusive hadron spectra up to pT = 100 GeV/c. In the range below 20 GeV/c, these data show a suppression that is slightly stronger than, but qualitatively consistent with, the suppression observed at RHIC. Moreover, the increased accuracy of the LHC data allows, for the first time, the identification of a clear transverse-momentum dependence of the suppression, which decreases from a factor of around 7 at pT = 6–7 GeV/c to a factor of about 2 at pT = 100 GeV/c, thus adding significant new information.
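For reference, the degree of suppression is conventionally quantified by the nuclear modification factor, which compares the heavy-ion yield with the proton–proton yield scaled by the average number of binary nucleon–nucleon collisions:

\[
R_{AA}(p_T) \;=\; \frac{1}{\langle N_{\mathrm{coll}}\rangle}\,
\frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}{\mathrm{d}N_{pp}/\mathrm{d}p_T},
\]

so a suppression by a factor of 7 corresponds to RAA ≈ 0.14, while an unmodified yield gives RAA = 1.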

Another important constraint on understanding high-pT hadron production in dense QCD matter was established by the CMS collaboration with the first preliminary data on Z-boson production in heavy-ion collisions and on isolated photon production at pT up to 100 GeV/c. In contrast to all of the measured hadron spectra, the rate of these electroweakly interacting probes is unmodified in heavy-ion collisions (figure 2). The combination of these data gives strong support to models of parton energy loss in which the rate of hard partonic processes is equivalent to that in proton–proton collisions but the produced partons lose energy in the surrounding dense medium.

The next challenge in understanding high-momentum-transfer processes in heavy-ion collisions is to develop a common dynamical framework for the suppression patterns of single inclusive hadron spectra and the medium-induced modifications of reconstructed jets. Already in November 2010, the ATLAS and CMS collaborations reported that di-jet events in heavy-ion collisions show a strong energy asymmetry, consistent with the picture that one of the recoiling jets retains a much lower fraction of its energy within its conical catchment area as a result of medium-induced out-of-cone radiation. At Quark Matter 2011, CMS followed up on these first jet-quenching measurements by showing the first characteristics of the jet fragmentation pattern. Remarkably, these first findings are consistent with a certain self-similarity, according to which jets whose energy was degraded by the medium go on to fragment in the vacuum in a similar fashion to jets of lower energy.

This was the first Quark Matter conference in which data on the nuclear modification factor were discussed in the same session as data on reconstructed jets. All of the speakers agreed in the plenary debate that there will be much more to come. On the experimental side, advances are expected from the increased statistics of future runs, complementary analyses of the intra-jet structure and spectra for identified particles, as well as from a proton–nucleus run at the LHC, which would allow the dominant jet-quenching effect to be disentangled from possibly confounding phenomena. On the theoretical side, speakers emphasized the need to improve the existing Monte-Carlo tools for jet quenching with the aim of constraining quantitatively how properties of the hot and dense QCD matter produced in heavy-ion collisions are reflected in the modifications of hard processes.

Another highlight of the conference was provided by the first measurements of bottomonium in heavy-ion collisions, reported by the STAR collaboration for gold–gold (Au–Au) collisions at RHIC and the CMS collaboration for Pb–Pb collisions at the LHC. The charmonium and bottomonium families represent a well defined set of Bohr radii that are commensurate with the typical thermal length scales expected in dense QCD matter. On general grounds, it has long been conjectured that, depending on the temperature of the produced matter, the higher excited states of the quarkonium families should melt while the deeper-bound ground states may survive in the dense medium.

While the theoretical formulation of this picture is complicated by confounding factors related to the limited understanding of the quarkonium formation process in the vacuum and possible new medium-specific formation mechanisms via coalescence, the CMS collaboration presented preliminary data on the Υ family that qualitatively support this idea (figure 3). In particular, CMS has established, within statistical uncertainties, the absence of the higher excited states of the Υ family in the di-muon invariant-mass spectrum, while the Υ(1S) ground state is clearly observed. The rate of this ground state is reduced by around 40% (suppression factor RAA = 0.6) in comparison with the yield in proton–proton collisions, consistent with the picture that the feed-down from excited states into the 1S state is cut off in dense QCD matter. STAR also reported a comparable yield. Clearly, this field is now eagerly awaiting LHC operation at higher luminosity to gain a clearer view of the conjectured hierarchy of quarkonium suppression in heavy-ion collisions.

In addition to the scientific programme, Quark Matter 2011 was accompanied by an effort to reach out to the general public. The week before the conference, the well known French science columnist Marie-Odile Monchicourt chaired a public debate between Michel Spiro, president of CERN Council, and Etienne Klein, director of the Laboratoire de Recherche sur les Sciences de la Matière at Saclay and professor of philosophy of science at the Ecole Centrale de Paris, attracting an audience of around 400 from the Annecy area. During the Quark Matter conference, physicists and the general public attended a performance by the actor Alain Carré and the world-famous Annecy-based pianist François-René Duchâble that merged classical music, literature and artistically transformed pictures from CERN. On another evening, the company Les Salons de Genève performed the play The Physicists, by the Swiss writer Friedrich Dürrenmatt, in Annecy's theatre. While the conference reached out successfully to the general public, participants had more trouble reaching the outside world themselves, because the wireless network in the conference centre turned out to be dysfunctional. However, the highlights were sufficiently numerous to reduce this to a footnote. As one senior member of the community put it during the conference dinner: "It was definitively the best conference since the invention of the internet."

• For the full programme and videos of Quark Matter 2011, see http://qm2011.in2p3.fr.

ISOLDE explores the Island of Inversion


The nuclear shell model, one of the cornerstones in describing nuclear structure, was invented independently by Maria Goeppert-Mayer and Hans Jensen in 1949, who both received the Nobel prize in 1963. In the model, nuclei with “magic” numbers of protons or neutrons exhibit highly symmetric spherical configurations similar to the electron cloud in noble gases or the carbon atoms in C60 molecules (“buckyballs”). The traditional “magic” numbers in nuclear physics – 2, 8, 20, 28, 50, 82 (and 126 for neutrons) – are well established for stable nuclei. These emerged from a purely phenomenological approach, but modern nuclear theory can trace the magic numbers down to nucleon–nucleon forces derived from low-energy QCD.

Many current studies of nuclear structure with exotic radioactive nuclei focus on the question of whether these magic numbers persist or are altered in moving away from the "valley of stability", where the numbers of protons (Z) and neutrons (N) combine to give the most stable nuclides. By challenging the predictive power of nuclear theory, such studies aim to lead the way towards a universal description of nuclear structure. Predictions for nuclei that lie beyond the reach of experiments are also important, for example for the understanding of nucleosynthesis in exploding stars, which is at the origin of the chemical elements in the universe.

Such changes have already been observed experimentally in exotic nuclei. For example, the stable isotope 16O (Z=N=8) is an exemplary doubly magic nucleus. However, far from stability, 24O (the oxygen isotope that has the most neutrons while still being bound) behaves in a similar fashion, indicating that locally a new magic number, N=16, appears. Other examples are the disappearance of the magic number N=8 in 12Be and evidence for a new magic number, N=34, in 54Ca.


Anomalies in nuclear structure in the region around N=20 have been known experimentally since 1975, when mass measurements of exotic sodium isotopes at CERN's Proton Synchrotron revealed a tighter binding than expected. This was followed by the discovery, in studies of magnesium isotopes, that the energy of the first excited state populated by the decay of sodium drops from 1482.8 keV in 30Mg to 885.3 keV in 32Mg (N=20) – the opposite of what is expected on approaching a magic number. These features were then attributed to an unexpected onset of deformation, with the nuclei most likely having the shape of a rugby ball rather than being spherical.

Further evidence for this interpretation came from studies of electromagnetic transition strengths and ground-state properties, some of them performed at CERN's world-leading Isotope Separation On-Line (ISOL) facility, ISOLDE. In terms of the nuclear shell model, the nucleon–nucleon forces are believed to change the ordering of some single-particle orbitals, sometimes so drastically that orbitals are lowered even across a closed shell (in this case N=20). The neutron-rich nuclei in the N=20 region, whose ground-state configuration includes valence neutrons that occupy such "intruder" orbitals, form what is known as the "Island of Inversion".

Shape coexistence

In 30Mg (N=18), all of the experimental and theoretical work points to the coexistence of a spherical ground state with spin (J) and parity (P), JP = 0+, together with a deformed excited 0+ state at 1788.2 keV, which has a wave function with a strong intruder contribution. The latter has been identified at ISOLDE by measuring the conversion electrons of the characteristic electric monopole (E0) transition between the two 0+ states, as in this particular case the emission of a gamma ray is forbidden by angular-momentum conservation. In 32Mg – in agreement with theory – all of the data indicate that inversion has taken place, so that the energetically favoured intruder configuration dominates the deformed ground state. Consequently, a near-spherical excited 0+ state, the analogue of the ground state in 30Mg, is expected, as illustrated in figure 1. Despite numerous attempts, however, this state has never been observed experimentally – until now.


An experiment led by the Technische Universität München and the Katholieke Universiteit Leuven, involving 39 experimenters from 14 institutes in 9 countries, has at last discovered the excited 0+ state in 32Mg (Wimmer et al. 2010). The experiment was performed in October 2008 at ISOLDE, CERN's facility for fundamental and applied research with radioactive ions. Operating since 1967, ISOLDE has produced more than 700 isotopes of almost 70 elements as low-energy beams (60 keV). Since 2001, nearly 80 isotopes of elements from lithium to radium have been post-accelerated by the Radioactive Beam Experiment (REX) to energies of up to 3 MeV/u, enabling the study of nuclear reactions.

The key idea was that the addition of two neutrons to the spherical ground state of 30Mg should populate either the deformed ground state of 32Mg or the spherical excited 0+ state, depending on which orbital the additional neutrons occupy. Experimentally, this was achieved by a two-neutron transfer reaction in inverse kinematics. A beam of 30Mg impinged on a tritium (t) target from which the two neutrons were transferred to form a 32Mg nucleus in a (t,p) reaction.

The radioactive 30Mg (T1/2 = 335 ms) was produced by the 1.4 GeV proton beam from the PS Booster impinging on a thick uranium-carbide production target. The magnesium atoms were selectively ionized by the Resonant Ionisation Laser Ion Source and the singly charged ions were mass-separated to obtain a pure 30Mg beam. The energy of the ions was then boosted by the REX-ISOLDE facility to 1.83 MeV/u, with a final intensity of around 10⁴ particles per second.


The experimental set-up consisted of MINIBALL – a high-resolution gamma-ray spectrometer with 24 segmented high-purity germanium detectors – in combination with the newly built Transfer reactions at REX (T-REX) array, the key detector for this experiment. T-REX is a 4π array with 58% coverage in solid angle. It consists of a box-like "barrel" of quadratic silicon-strip detectors together with an annular double-sided segmented silicon-strip detector, the "CD"; in both cases there are two layers of silicon detectors (figure 2). Energy and position are measured for charged target-like particles – protons, deuterons and tritons; these are identified by measuring the energy loss in a thin detector, which is characteristic of a species at a given energy, and the remaining kinetic energy in a thick detector that stops the particles (the ΔE–E method). The compact set-up was housed in a cylindrical vacuum chamber with a diameter of 12 cm to fit inside MINIBALL.
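The ΔE–E identification exploits the fact that, for non-relativistic particles, the Bethe–Bloch energy loss in the thin detector falls roughly as the inverse of the kinetic energy. In simplified, textbook form (given for illustration, not as a description of the actual T-REX calibration):

\[
\Delta E \;\propto\; \frac{m z^{2}}{E}
\qquad\Longrightarrow\qquad
\Delta E \times E \;\approx\; \mathrm{const}\times m z^{2},
\]

so protons, deuterons and tritons (same charge z, different mass m) fall on distinct hyperbola-like bands in the ΔE versus E plane.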

One of the experimental challenges was to make a thin radioactive tritium target, small enough to fit in the centre of T-REX. The technical solution was found in the form of a tritium-loaded titanium foil. The measured energies and angles of the protons emitted from the (t,p) reaction enabled the reconstruction of the excitation energy of the 32Mg nucleus. The angular distributions of the protons allowed the determination of the transferred orbital angular momentum ΔL, from which the spins and parities of the populated states could be deduced (figure 3).
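Schematically, this reconstruction is a missing-mass measurement. With the beam four-momentum known and the proton measured in T-REX, energy–momentum conservation in 30Mg + t → 32Mg + p gives (a textbook two-body relation, written here for illustration rather than in the collaboration's own notation):

\[
E_x \;=\; \sqrt{\big(E_{\mathrm{beam}} + m_t - E_p\big)^{2} - \big|\vec{p}_{\mathrm{beam}} - \vec{p}_p\big|^{2}}\;-\;M(^{32}\mathrm{Mg}),
\]

where the energies are total energies including rest masses, m_t is the mass of the target triton at rest and M(32Mg) is the ground-state mass of 32Mg.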

The shape of the measured angular distribution is characteristic of ΔL=0 and firmly establishes the 0+ assignment for the excited state, just as for the ground state (figure 3). An excitation energy of 1058 keV and a lower limit of 10 ns for its lifetime have been deduced. The population cross-sections for both states and results from recent knockout reactions contribute to a consistent picture of a deformed ground state and a spherical excited 0+ state.

Such a low-lying – lower than predicted by any calculation – and long-lived 0+ state poses a challenge to modern theory. An experimental challenge also remains: to determine the lifetime of the excited state as well as the strength of the E0 transition between the two 0+ states. However, bringing all of the existing pieces of the puzzle together is already enabling a deeper insight into the physics relevant for the formation of the Island of Inversion.

The fascinating phenomenon of different nuclear shapes coexisting at similar energies – the difference is less than 1 per cent of the total binding energy – is also present in other regions of the nuclear chart, the most prominent example being the triple shape coexistence in the neutron-deficient 186Pb isotope. Transfer reactions at REX-ISOLDE are currently limited by the available beam energy to nuclei with mass number A lower than 80, but the upgrade of the facility to HIE-ISOLDE is already on the horizon. This includes an incremental increase of beam energy to 10 MeV/u and will become available in 2015. In particular, at these energies one- and two-nucleon-transfer reactions with heavy radioactive-ion beams will become feasible, opening up a whole new field for studies of single-particle aspects, shape coexistence and the role of pairing interactions. The future of nuclear structure studies with radioactive ion beams at CERN looks bright.

PAMELA’s quest for answers to cosmic questions


PAMELA – the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics – was launched into space on 15 June 2006 aboard a Soyuz rocket from Baikonur in Kazakhstan. Since then, it has been orbiting the Earth, installed on the upward side of the Resurs-DK1 satellite at an altitude that varies between 350 km and 610 km. On board are different types of detector (figure 1), comprising: a magnetic spectrometer, based on a neodymium-iron-boron permanent magnet and a precision tracking system; a sampling imaging calorimeter, in which pairs of orthogonal millistrip silicon sensor planes are interleaved with tungsten absorber plates; a precise time-of-flight system, using plastic scintillation detectors; an anticoincidence system; and a neutron detector.

The experimental apparatus was designed to provide precise measurements of the particle and nuclei fluxes in the cosmic radiation over a wide energy range. It is sensitive to antiprotons between 80 MeV and 190 GeV, positrons between 50 MeV and 270 GeV, electrons up to 600 GeV, protons up to 1 TeV and nuclei up to a few hundred giga-electron-volts. In addition, in the search for antinuclei, PAMELA has a sensitivity of about 10⁻⁷ in the antihelium-to-helium ratio.

The experiment’s scientific objectives are ambitious and aim to clarify some of the trickiest questions of modern physics: the origin of cosmic rays, their energy spectrum, their antimatter components and particles possibly originating in the annihilation of dark matter particles. With data accumulated over several years, the mission, which is scheduled to finish at the end of the year, is now providing new insights into some of these questions and more.

Cosmic revelations

In 2009, the PAMELA collaboration published its measurement of an anomalous positron abundance in cosmic rays with energies between 1.5 and 100 GeV. By contrast, as figure 2 shows, the antiproton flux they observe agrees with standard secondary antiproton production in the Galaxy (Adriani et al. 2010). These results were followed more recently by the publication of precision measurements of the proton and helium spectra in the rigidity range 1 GV to 1.2 TV. The proton and helium spectra show different shapes and, moreover, cannot be described by a single power law, as would be expected from previous observations and from the theoretical models adopted so far. Also, while the spectra of protons and helium gradually soften in the rigidity range 30–230 GV, they both show a hardening at 230–240 GV. Previous experiments did not have the statistical and systematic precision to show this behaviour, although an indirect indication was derived by comparing the results from a range of balloon-borne experiments (JACEE, CREAM and BESS) as well as from the first trial flight of the Alpha Magnetic Spectrometer in 1998.
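For orientation, the rigidity used here is the particle momentum per unit charge, and a single power law means a spectrum with one spectral index γ over the whole range (standard definitions, given here for reference):

\[
R \;=\; \frac{p\,c}{Z e},
\qquad
\frac{\mathrm{d}\Phi}{\mathrm{d}R} \;\propto\; R^{-\gamma},
\]

so the hardening observed at 230–240 GV corresponds to a decrease of the spectral index γ above that rigidity.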


So far, supernovae have been considered to be the sites of cosmic-ray acceleration. However, the discrepancies found by PAMELA in the proton and helium spectra have prompted a re-evaluation of the processes that underlie the acceleration, as well as the propagation of cosmic rays. Similar conclusions were drawn from PAMELA’s results on the positron abundance. Theoretical explanations of these data invoke more complex processes of acceleration and propagation, as well as possible contributions from new astrophysical sources, such as pulsars or more exotic ones, such as dark matter.

Conventional diffusive propagation models can, on the other hand, be used to interpret recently published PAMELA data on the electron component of the cosmic radiation. Precision measurements of the electron flux provide information regarding the origin and propagation of cosmic rays in the Galaxy that is not accessible through the study of the cosmic-ray nuclear components, because of their differing energy-loss processes. PAMELA collected data between July 2006 and January 2010, selecting electrons in the energy interval 1–625 GeV. This is the largest energy range covered so far in any measurement of cosmic-ray electrons, and the first time that electrons above 50 GeV have been identified in cosmic rays.


The collaboration derived the electron spectrum in two independent ways – using either the calorimeter or the tracking information – and the two sets of measurements show good agreement within the statistical errors. Figure 3 shows a typical electron event with a track and energy deposited in the calorimeter. The electron spectrum, although in substantial agreement with the results of other recent experiments, in particular the balloon-borne Advanced Thin Ionization Calorimeter (ATIC) and the Fermi Gamma-Ray Space Telescope, appears softer than the combined e– + e+ spectra that they measure. This difference is within the systematic uncertainties between the various measurements, but it is also consistent with a positron component that increases with energy.

Solar events

PAMELA has also measured solar-particle events and their temporal evolution during the major solar emissions of 13–14 December 2006 (figure 4). This was the first direct measurement by a single instrument of proton and helium nuclei of solar origin in a large energy range between 100 MeV/n and 3 GeV/n (Adriani et al. 2011b). The data show a spectral behaviour that is different from those derived from the neutron monitor network, with no satisfactory analytical description fitting the measured spectra. This implies the presence of complex, concurrent acceleration and propagation processes at the Sun and in interplanetary space. Modelling the solar-particle events is also relevant for future manned missions to the Moon and Mars.


Over the past five years, PAMELA has continuously monitored solar activity during an unusually long-lasting solar minimum, followed by – as of the end of December 2009 – a slow increase of activity, probably marking the beginning of the new solar cycle. This particularly favourable situation is providing the collaboration with an excellent opportunity to study heliospheric effects and underlines the major role that the experiment has in providing unique information about the nature of the cosmic rays at the scale of giga-electron-volts in the heliosphere. By combining data from PAMELA and the ULYSSES space mission, the PAMELA collaboration has also performed a new evaluation of the spatial dependence of cosmic-ray intensities in the heliosphere, with an accurate measurement of the radial and longitudinal gradients (De Simone et al. 2011).


Many new results from PAMELA were presented recently at the 2011 European Physical Society Conference on High-Energy Physics in Grenoble on 21–27 July and at the International Cosmic Ray Conference in Beijing on 11–18 August, as well as at other conferences. These results concern mainly new data on the electron/positron ratio, the absolute flux of positrons up to 100 GeV, fluxes and ratios of light nuclei, the abundance of hydrogen and helium isotopes, as well as new limits on the antihelium-to-helium ratio. The new results confirm earlier findings and also extend the energy range and precision of the data. One interesting feature concerns the change in slope of the positron flux above 20 GeV, as shown in figure 5, which also includes the electron spectrum. Exclusion limits on the existence of new sorts of matter, such as strangelets, are also in the pipeline. The latest interesting PAMELA result concerns the discovery of a radiation belt around the Earth containing trapped antiprotons (Adriani et al. 2011c).

Although all the instrumentation aboard PAMELA is working well, the mission is expected to finish at the end of this year. The collaboration will then continue to work for another two years to analyse all of the data collected and improve the statistics.

• PAMELA was constructed by the WiZard collaboration, which was originally formed around Robert Golden, who first observed antiprotons in space. There are now 14 institutions involved. Italian INFN groups in Bari, Florence, Frascati, Naples, Rome Tor Vergata and Trieste, and groups from CNR, Florence and the Moscow Engineering and Physics Institute form the core. They are joined by groups from The Royal Institute of Technology (KTH) in Sweden, Siegen University in Germany and Russian groups from the Lebedev Institute, Moscow, and the Ioffe Institute, St Petersburg.

LEAP 2011 casts light on antiproton physics


Low-energy antiproton physics is an interdisciplinary field that spans particle, nuclear, atomic and applied physics, as well as astrophysics. It confronts directly the relationship between matter and antimatter, in particular CPT symmetry, one of the foundations of the theory of particle physics. CPT is so fundamental that its violation would require a complete rewriting of particle-physics textbooks. Precision studies with antiprotons may also shed light on the question of why the universe is made almost exclusively of matter but not antimatter. Recent months have witnessed dramatic breakthroughs in the field at CERN's Antiproton Decelerator (AD), including the trapping of antihydrogen atoms and developments towards an antihydrogen beam. Satellite and balloon experiments are searching for cosmic antimatter, the results of which could have profound implications for cosmology. Antiprotons are also being used to study the properties and structures of atoms, nuclei and hadrons, for which the start of the Facility for Antiproton and Ion Research (FAIR) in Darmstadt will usher in a new era.

Dialogue across disciplines

It was against this stimulating backdrop that LEAP 2011 – the 10th International Conference on Low Energy Antiproton Physics – took place at TRIUMF in Vancouver on 27 April – 1 May. The conference was organized and supported by the Canadian institutions involved in the ALPHA experiment at the AD (the universities of British Columbia, Calgary, Simon Fraser and York, and TRIUMF), with additional support from the Canadian Institute of Nuclear Physics, and was chaired by Makoto Fujiwara of TRIUMF/Calgary, with Mary Alberg of Seattle as co-chair. LEAP 2011 was the first of the series to be held in North America; the conferences have traditionally been held in Europe, with the exception of Yokohama in 2003. It attracted nearly 100 participants and featured more than 60 invited plenary speakers, with an emphasis on promoting young researchers. Several review talks by senior physicists facilitated dialogue across the disciplines. In addition, a dozen posters were presented and presenters were allowed a two-minute talk to advertise their work at a plenary session, a format that worked quite effectively. This report presents some of the highlights of a packed programme.

The conference began with a session on antihydrogen physics, with reports on the recent trapping of antihydrogen by the ALPHA experiment and the ASACUSA collaboration’s developments towards an antihydrogen beam, both at the AD. The two results were together voted the number one physics breakthrough for 2010 by Physics World. Key techniques that enabled ALPHA’s trapping of antihydrogen are evaporative cooling and autoresonant excitation of antiproton plasmas. The conference heard how the collaboration’s work has led to the successful confinement of antihydrogen for 1000 s. The next major goal for ALPHA is to perform microwave spectroscopy on trapped antihydrogen. ASACUSA also has plans to use microwave spectroscopy to measure ground-state hyperfine splitting with an antihydrogen beam.

The ATRAP collaboration, again at the AD, presented new results on adiabatic cooling of antiprotons, with up to 3 × 10⁶ antiprotons cooled to 3.5 K, and described the first demonstration of centrifugal separation of antiprotons and electrons, suggesting a new method for isolating low-energy antiprotons. The team also has a scheme for improved antihydrogen production via interactions with positronium atoms, created in the interactions of excited caesium atoms with positrons. Other talks described new possibilities for antimatter gravity experiments with antihydrogen at the AD: AEGIS, already under preparation, and the proposed GBAR.

Ion traps with single-particle sensitivity are another powerful tool. A team from Heidelberg and Mainz has recently observed a single proton spin-flip, a result that paves the path for the comparison of the magnetic moments of protons and antiprotons. At TRIUMF, an ion trap system, TITAN, is being used at the ISAC facility for precision studies of radioactive nuclei.

Talks on applications and new techniques with antiprotons included the ACE experiment at the AD, which is studying the possible use of antiprotons for cancer therapy, and developments towards spin-polarized antiprotons. The session on atomic physics also covered some novel techniques that have possible applications to antihydrogen. One proposal concerns a new pulsed Sisyphus scheme for (anti)hydrogen laser cooling. Another involves using an atomic coil-gun, which can stop beams of paramagnetic species, to trap hydrogen isotopes, followed by single-photon cooling techniques. A Lyman-α laser for antihydrogen cooling is being developed at Mainz.

The positron, or anti-electron, is the other ingredient in antihydrogen atoms. A review of positron accumulation techniques was given by Clifford Surko of the University of California, San Diego – the inventor of the Surko trap now used by many of the antihydrogen experiments. The ATRAP and Swansea University groups reported studies using variations of the Surko trap. Measurement of the hyperfine splitting in positronium could provide precision tests of QED. One experiment on positronium atoms at the University of Tokyo has made the first direct measurement of this splitting, employing a novel sub-THz source, while another aims at precise measurements via the Zeeman effect.


This year marks the 20th anniversary of the discovery of long-lived antiprotonic helium at KEK. Studies of such exotic atoms and fundamental symmetries are an important part of antiproton physics. ASACUSA has made recent progress on precision studies on antiprotonic helium and on microwave measurements of antiprotonic 3He atoms. Recent but still controversial results on muonic hydrogen spectroscopy at the Paul Scherrer Institute indicate a much smaller size for the proton radius than is generally accepted. Hadronic and radioactive atoms were featured in review talks at the conference, focusing on pionic and kaonic atoms, as well as on the fundamental symmetries programme at TRIUMF. The final results of the TWIST experiment at TRIUMF, a precision measurement of muon decay parameters, have greatly reduced systematic uncertainties, providing improved limits for constraining extensions to the Standard Model.

An important pillar of antiproton physics is hadron and QCD physics at “low energy”, ranging from stopped antiprotons to a beam of 15 GeV. At the lower energy end, ASACUSA is studying antiproton in-flight annihilation on nuclei. Following hints from an experiment at KEK, an experiment in a low-momentum antiproton beam at the Japan Proton Accelerator Research Complex (J-PARC) will search for a φ-meson–nucleus bound state using antiproton annihilation on nuclei. Also at J-PARC, a study of double anti-kaonic nuclear clusters in antiproton–3He annihilation has been proposed. Further into the future, the research programme for the major PANDA detector at FAIR, which is expected to start running in 2018, encompasses a breadth of physics that includes searches for exotic states and studies of double Λ hypernuclei. Back to the present, hot news from the Brookhaven National Laboratory concerned the discovery of the anti-alpha nucleus, the heaviest anti-nucleus observed.

The theory talks at the conference covered topics ranging from atomic collisions to cosmology. There were reviews on atomic collision physics with antiprotons and on interactions of antihydrogen with ordinary matter atoms. Calculations of gravitational effects on the interaction between antihydrogen and a solid surface suggest that the antiatoms would settle in long-lived quantum states, the study of which could provide a new way to measure the gravitational force on antihydrogen. Theoretical ideas based on the so-called Standard Model Extension, an effective theory that incorporates CPT and Lorentz violation, could offer the opportunity for probing Planck-scale physics as well as antimatter gravity in antihydrogen experiments. On the hadron physics side, antiproton–proton and antiproton–nucleus collisions provide ways to test theories of strangeness production, the latter offering a window onto the behaviour of strange particles in the nuclear medium that complements heavy-ion studies. In cosmology, baryon asymmetry – or the dominance of matter over antimatter – is a long-standing puzzle, as is the nature of dark matter. Could hidden antibaryons be the dark matter? Such a possibility could explain the two mysteries in one go.

LEAP 2011 featured two dedicated sessions on the universe. In the first, CERN's John Ellis discussed the nature of dark matter and its connection to low-energy hadron physics, and William Unruh, from the University of British Columbia, reported on fascinating experimental work that confirms aspects of Hawking radiation in an analogue system, confirming his own theoretical prediction from some 30 years ago. The second of the sessions focused on experimental searches for antimatter in the universe – a hot topic as the conference was held not long before the launch into space of the Alpha Magnetic Spectrometer. The latest results from the PAMELA detector, which has been in space since 2006, continue to show an anomaly in the positron flux at high energies (PAMELA's quest for answers to cosmic questions). BESS-Polar II, the second flight of the Balloon-borne Experiment with a Superconducting Spectrometer (BESS) over Antarctica, has a new measurement of the antiproton spectrum based on 24.5 days in which 4.7 × 10⁹ cosmic-ray events were collected, yielding a sensitivity complementary to satellite experiments. The proposed General Antiparticle Spectrometer (GAPS) would be a balloon experiment to search for anti-deuterons from dark-matter annihilations using exotic atom techniques.

Looking to the future, the construction of FAIR at Darmstadt will allow for a dedicated Facility for Low-energy Antiproton and Ion Research (FLAIR), while Fermilab has a proposal to use its Antiproton Source – the world’s most intense – for low-energy experiments once the Tevatron programme comes to an end later this year. Finally the conference returned to the AD, when the proposal for the Extra Low ENergy Antiproton ring (ELENA) was described by Walter Oelert, from the Jülich Research Centre, whose experiment at CERN observed the first antihydrogen atoms in 1996. The conference ended with his remarks on the prospects for antiproton physics. Just a few weeks after the conference, CERN Council approved the construction of ELENA, which will provide significantly enhanced opportunities for antiproton physics at CERN in the coming decade (ELENA prepares a bright future for antimatter research).

This successful conference was capped off by a social programme that included a dinner cruise in Vancouver's spectacular English Bay, and a well-attended public lecture by John Ellis at the University of British Columbia. The future of low-energy antiproton physics appears bright. The next LEAP meeting is planned for Uppsala in 2013, chaired by Tord Johansson.

• For full details of the speakers and many of the presentations, see http://leap2011.triumf.ca. The proceedings will be published in Hyperfine Interactions.

Citizen cyberscience: the new age of the amateur


The world of journalism has been turned upside-down in recent years by social media technologies that allow a wider range of people to take part in gathering, filtering and distributing news. Although some professional journalists resisted this trend at first, most now appreciate the likes of Facebook, Twitter and blogs in expanding the sources of news and opinion and accelerating dissemination: the audience has become part of the show.

Could the internet one day wreak the same sort of social change on the world of science, breaking down the distinction between amateur and professional? In the world of high-energy physics, that might seem unlikely. What amateur can really contribute something substantial to, say, the analysis of LHC data? Yet in many fields of science, the scope for amateur contributions is growing fast.

Modern astronomy, for example, has a long tradition of inspired amateur contributions, such as spotting comets or supernovae. Now, the internet has broadened the range of tasks that amateurs can tackle. For example, the project GalaxyZoo, led by researchers at the University of Oxford, invites volunteers to participate in web-based classification of galaxy images. Such pattern recognition is a task where the human mind still tends to outperform computer algorithms.

Not only can astronomers attract hundreds of thousands of free and eager assistants this way, but occasionally those helpers can themselves make interesting discoveries. This was the case for a Dutch school teacher, Hanny van Arkel, who spotted a strange object in one of the GalaxyZoo images that had stumped even the professional astronomers. It now bears the name “Hanny’s Voorwerp”, the second word meaning “object” in Dutch.

GalaxyZoo is just one of many volunteer-based projects making waves in astronomy. Projects such as Stardust@home, Planet Hunters, Solar Watch and MilkyWay@home all contribute to cutting-edge research. The Einstein@home project uses volunteer computing power to search for – among other things – pulsar signals in radio-astronomy data. Run by researchers at the Max Planck Institute for Gravitational Physics, the project published its first discoveries in Science last year, acknowledging the names of the volunteers whose computers had made each discovery.

Crowdsourcing research

However, it is in fields outside those traditionally accessible to amateurs where some of the most impressive results of citizen-powered science are beginning to be felt. Consider the computer game FoldIt, where players compete to fold protein molecules into their lowest energy configuration. Humans routinely outperform computers at this task, because the human mind is uniquely adept at such spatial puzzles; and teenagers typically out-compete trained biochemists. What the scientists behind the FoldIt project, based at the University of Washington, have also discovered is that the players were spontaneously collaborating to explore new folding strategies – a possibility the researchers had not anticipated. In other words, the amateur protein folders were initiating their own research programme.


Could high-energy physics also benefit from this type of approach? Peter Skands, a theorist at CERN, thinks so. He has been working with colleagues on a project about fitting models to LHC data, where delicate tuning of the model parameters by eye can help the physicists achieve the best overall fit. Experience with a high-school intern convinced Skands that even people not versed in the gory details of LHC physics could solve this highly visual problem efficiently.

Volunteers can already contribute their processor time to another project that Skands is involved in – simulating collisions in the LHC for the recently launched LHC@Home 2.0 project, where 200 volunteers have already simulated more than 5 billion collision events. Such volunteer computing projects, like Einstein@Home, are not as passive as they might appear. Many of the volunteers have spent countless hours helping developers in the early alpha-test stages of the project by providing detailed bug reports. Message boards and a credit system for the amount of processing completed – features provided by an open-source platform called BOINC – add elements of social networking and gaming to the project.

The LHC@Home 2.0 project also relies on CernVM, a virtual machine technology developed at CERN that enables complex simulation code to run easily on the diverse platforms provided by volunteers. Running fully fledged physics simulations for the LHC on home computers – a prospect that seemed technically impossible when the first LHC@home project was introduced in 2004 to simulate proton-beam stability in the LHC ring – now has the potential to expand significantly the computing resources for the LHC experiments. Projects like LHC@home typically draw tens of thousands of volunteers and their computers, a significant fraction of the estimated 250,000 processor cores currently supporting the four LHC experiments.

A humanitarian angle

LHC@home 2.0 is an example of a project that has benefited from the support of the Citizen Cyberscience Centre (CCC), which was set up in 2009 as a partnership between CERN, the UN Institute for Training and Research (UNITAR) and the University of Geneva. A major objective of the CCC is to promote volunteer computing and volunteer thinking for researchers in developing regions, because this approach effectively provides huge resources to scientists at next to no cost. Such resources can also be used to tackle pressing humanitarian and development challenges.

One example is the project Computing for Clean Water, led by researchers at Tsinghua University in Beijing. The project was initiated by the CCC with the sponsorship of a philanthropic programme run by IBM, called World Community Grid. The goal is to simulate how water flows through carbon nanotubes and explore the use of arrays of nanotubes for low-cost water filtration and desalination. The simulations would require thousands of years on a typical university computing cluster but can be done in just months using volunteer-computing resources aggregated through World Community Grid.

Another example is volunteer mapping for UNOSAT, the operational satellite-applications programme of UNITAR, which is based at CERN. Although a range of crowd-based mapping techniques are available these days, the use of satellite images to assess accurately the extent of damage in regions devastated by war or natural disasters is not trivial, even for experts. However, rapid and accurate assessment is vital for humanitarian purposes, both in estimating reconstruction costs and in the rapid mobilization of the international community and NGOs.


With the help of researchers at the University of Geneva and HP Labs in Palo Alto, UNOSAT is testing new approaches in crowdsourcing damage assessment by volunteers. These involve using statistical approaches to improve accuracy, as well as models inspired by economics where volunteers can vote on the quality of others’ results.

There are hundreds of citizen-cyberscience projects engaging millions of volunteers, but the vast majority supports researchers in industrialized countries. A large part of the CCC's activities therefore involves raising awareness in developing regions. With the support of the Shuttleworth Foundation in South Africa, the CCC has been organizing a series of "hackfests": two-day events where scientists, software developers and citizen enthusiasts meet to build prototypes of new citizen-based projects, which the scientists can then go on to refine. Hackfests have already taken place in Beijing, Taipei, Rio de Janeiro and Berlin, with more planned this year in South Africa and India.

The topics covered to date include: using mobile-phone Bluetooth signals as a proxy for bacteria, to track how airborne bacterial diseases such as tuberculosis spread in buildings; monitoring earthquakes using the motion sensors built into laptop computers; and digitizing tables of economics data from government archives. Because the "end-users" – the citizen volunteers themselves – participate in the events, there is a healthy focus on making projects as accessible and attractive as possible, so that even more volunteers sign up and stay active.

At such events, when asked what sort of rewards the most engaged volunteers might appreciate for their online efforts, one striking response – echoed on several occasions – is the opportunity to make suggestions to the scientists about the course of their future research. In other words, there is a desire on the part of volunteers to be involved more actively in the process that defines what science gets done. The volunteers who propose this are quite humble in their expectations – they understand that not every idea they have will be useful or feasible. Whether scientists will reject this sort of offer of advice as unwanted interference, or embrace the potentially much larger brainpower that informed amateurs could provide, remains to be seen. But the sentiment is clear: in science, as in journalism, the audience wants to be part of the show.

Hadrons in Munich: from light mesons to heavy ions


Hadron 2011, the 14th International Conference on Hadron Spectroscopy, was the latest in a long series that started in 1985 in Maryland. Originally conceived as a conference on light meson spectroscopy, it now covers all aspects of hadron physics, although spectroscopy and hadron production are still the topics that characterize the meeting. This year, 37 plenary talks, 128 presentations in parallel sessions and 37 posters offered ample possibilities to find out about the latest developments and results, from hypernuclear physics to meson and quarkonium spectroscopy, and from nucleon structure and the meson-baryon interaction to heavy-ion physics.

The conference began by looking at issues related to light mesons, with a summary of recent theoretical progress and experimental tests in chiral dynamics and low-energy ππ-scattering phenomena. There were new results on light-meson spectroscopy from the BESIII experiment in Beijing and COMPASS at CERN. While COMPASS impressively confirmed previous findings on the π1(1600), an exotic meson seen in high-energy diffraction, new structures have been observed in radiative J/Ψ decays that point towards new and narrow meson states between 1.8 and 2.5 GeV/c², the details and nature of which have still to be unravelled.

Size and structure

Even after many years of precision experiments, the size of the proton is still a hot topic. New findings in laser spectroscopy of muonic hydrogen, which give the proton radius as more than 5σ smaller than previously determined, have opened the hunt for new explanations, although theory cannot offer effects large enough to solve the puzzle.


Research into nucleon structure has in recent years shifted towards spin degrees of freedom. After precision measurements of the helicity contribution of quarks in polarized nucleons, COMPASS has also set new limits on spin effects resulting from polarized gluons. These findings are confirmed by spin experiments at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven. With this, the focus now turns towards transverse-spin degrees of freedom (transversity). A non-collinear treatment of partons inside the nucleon offers a large number of new observables, which can be linked to quark angular momenta. Both COMPASS and RHIC have new physics programmes on transverse polarization effects, and measurements of Drell-Yan processes using polarized targets are also on the way. Hopes are high that the unexpected single-spin asymmetries that have been observed in pion production at RHIC may finally be understood.

On the low-Q² side, major efforts at various laboratories – such as Bonn, Mainz and Jefferson Lab – are providing real or virtual photon beams. These allow a coherent set of (double-)polarized scattering and production experiments, including those with many-body final states. Using the complete set of polarized measurements, the puzzle of baryon resonances – their identification and quantum numbers – now seems to be within reach via new and sophisticated partial-wave analyses.

Quarkonium spectroscopy and the hunt for further quarkonium-like states that seem not to fit the quark–antiquark (qq̄) picture of the meson have been, and still are, highlights in hadron physics. Precision experiments finally allowed the BELLE and BaBar experiments at KEK and SLAC, respectively, to observe missing quarkonium states such as hb(1P) and hb(2P), as well as ηc(1S) and ηc(2S). More precise determinations of masses and widths, as well as unexpected decay patterns, were also revealed by BESIII, which has observed about 10⁹ J/Ψ decays. The puzzle of the mass and width of the D(Ds) meson states is on the way to being settled, with their spin assignments being resolved. The conference also heard about the remarkable progress in achieving a comprehensive and unified theoretical description of quarkonium properties at zero and finite temperature in an effective field-theory framework.

The biggest puzzle currently in hadron physics concerns the large number of exotic quarkonium-like states with narrow widths and high excitation energies, as compared with the open-flavour meson channel. New work was reported on the X(3872) and other, partly new states. Theoretical investigations offer a rich choice of possibilities. The X(3872) has a good chance of just being the radial excitation of the χc state, but there is also a beautiful effective field-theory description in the molecular-interpretation case. However, further stunning observations were reported from the beauty sector. Two charged quarkonium-like states found by BELLE lie close in mass to the open b-threshold and have been dubbed Zb, in analogy to the charm sector.

Lattice calculations have shown huge progress, with new algorithms allowing the extraction of excited baryon- and meson-state energies. A report from the Flavianet Lattice Averaging Group presented lattice results for kaon and pion physics with the aim of making them accessible to the community. There are also new calculations of hadron structure, of baryon and meson form factors, and of the g–2 factor.

First and impressive results were reported from all of the LHC experiments. In particular, CMS and LHCb – offering the best mass resolutions – have confirmed the potential of hadron machines in this field. In addition to the usual quarkonium states, exotic states have also been observed and the elusive Bc mesons have already been seen. At this stage, the focus is on the production cross-sections of heavy quarkonia, which can now be understood at LHC energies, assuming colour-octet contributions and next-to-leading-order (NLO) processes to be relevant. The descriptions follow the data up to transverse momenta as high as 20 GeV/c. One of the uncertainties comes from unknown polarization effects that influence acceptance calculations. On the theoretical side, huge progress has been achieved with the full NLO calculation of the J/Ψ cross-section in non-relativistic QCD (NRQCD) and a combined global analysis of data from all existing experiments that hints at the universality of the long-distance NRQCD matrix elements.

Hadron machines are unique in the production of b-baryons and Fermilab’s Tevatron has so far been leading this field. The CDF collaboration reported on recent progress with the observation of excited Σb states and a radially excited Λc. CDF and DØ also presented new precision measurements of the mass and width of other charmed baryons.


A thermal medium of the type generated in heavy-ion collisions at the LHC can modify hadron properties, especially in the case of quarkonia. The theory of such modifications was reviewed and the first results from lead–lead collisions at the LHC were presented. Results from ATLAS and CMS show the striking effects of jet quenching and also the melting of the excited Υ states as compared with their ground-state partner. At lower energies, mass shifts and absorption cross-sections of vector mesons have been studied in the medium. Mass shifts – a long-standing issue, where many predictions have stimulated experimental efforts – have not been observed, but small effects on the width of ω mesons in nuclei have been reported by the HADES experiment at GSI, Darmstadt.

The recent and impressive progress in light-meson and quarkonium spectroscopy is in good part the result of high-luminosity experiments, which offer 10–100 times the statistical samples of their predecessors. In heavy-meson physics, long the domain of lepton colliders, the LHC experiments are now starting to compete in an impressive way, using their low-luminosity data from 2010 to catch up with the Tevatron experiments. An interesting future lies ahead, with even further increases in luminosity and precision offered by future experiments such as BELLE II, the SuperB facility and the PANDA experiment at the Facility for Antiproton and Ion Research.

Two impressive summary talks concluded the conference. Stefano Bianco of Frascati/INFN reviewed the experimental situation, a challenging task in view of the large number of new results presented. On the theoretical side, Chris Quigg of Fermilab gave an inspiring outlook on hadron physics. He recognized the enormous diversity and reach of experimental programmes, which offer insights from unexpected quarters, while remarkable progress has been achieved in theory with the emergence of lattice QCD. However, many puzzles remain, leaving ample opportunities and much work to do, as there are still “simple” questions that the field cannot answer.

Participants enjoyed the coffee breaks in the sunny and secluded courtyard of the Künstlerhaus, a building erected more than 100 years ago as a place for artists to meet and enjoy social events. Long and intense discussions around the poster session also offered vital scientific exchange, making this event a pleasant end to the day. Long hours of sitting were compensated for on Wednesday afternoon by a bicycle tour through the old town of Munich and the English Garden, with refreshing drinks in the beer garden. Last but not least, the conference enjoyed a guest talk on neutrino physics by Thierry Lasserre of Saclay, who discussed mass determination from flavour oscillations and reported fresh results from T2K on hints of νμ→νe oscillation.

Maîtriser le nucléaire : Que sait-on et que peut-on faire après Fukushima ?

By Jean Louis Basdevant

Editions Eyrolles

Paperback: €17.50


Jean Louis Basdevant, a former professor at the Ecole Polytechnique, where he gave excellent courses, and the author of many pedagogical books, has pulled off a tour de force in writing a book on the problems of nuclear power in less than a month following the Fukushima disaster. It is a highly pedagogical book, which begins with a history of radioactivity, followed by an exposition of the basics of nuclear physics and then a description of the benefits and dangers of radioactivity, whose effects he quantifies. Next comes a description of fission and of nuclear-energy production, present and future (if people so choose!), including the proposal for accelerator-driven fission such as the one put forward by Carlo Rubbia. Then come the accidents: Lucens (negligible), Three Mile Island, Chernobyl (including a clarification of the doses received in France, which were in fact small) and Fukushima, with an analysis of the errors, and even the faults, that led to these catastrophes.

The book then turns to fusion, by magnetic and also by inertial confinement. The prognosis is not very optimistic: the ITER schedule keeps slipping, and ITER will not produce electrical energy. All of this is followed by some data on energy.

It moves on to nuclear and thermonuclear weapons and how they work, as well as the dreadful neutron bombs, the fight against proliferation and the dangers of terrorism, given the ease of building improvised bombs, and also conventional bombs containing radioactive materials.

Finally, the last chapter is entitled « que penser et que faire après Fukushima » ("what to think and what to do after Fukushima"). The author limits himself to providing elements of an answer, without explicitly taking sides. Decision-makers should certainly read this book in order to form a serious opinion, instead of giving in to uncontrolled emotional reactions. I warmly recommend reading it.

Although it is in French, I also recommend this book to English speakers: the French is simple and easy to understand.

Numerical Relativity. Solving Einstein’s Equations on the Computer

By Thomas W Baumgarte and Stuart L Shapiro

Cambridge University Press

Hardback: £55 $90

E-book: $72


Symmetries are a powerful tool for solving specific problems in all areas of physics. However, there are situations where both exact and approximate symmetries are lacking and, therefore, it is necessary to employ numerical methods. This, in essence, is the main motivation invoked for the use of large-scale simulations in relativistic systems where gravity plays a key role, such as black-hole formation, rotating stars, binary neutron-star evolution and even binary black-hole evolution.

Numerical Relativity by Thomas Baumgarte and Stuart Shapiro is an interesting and valuable contribution to the literature on this subject. Both authors are well known in the field. Shapiro, together with Saul Teukolsky, wrote a monograph on a related subject – Black Holes, White Dwarfs and Neutron Stars (John Wiley & Sons 1983) – that is familiar to students and researchers. The careful reader will recognize various similarities in the overall style of the presentation, with systematic attention to the details of the mathematical apparatus. In Numerical Relativity, 18 chapters are supplemented by a rich appendix. The first part could be used by students and practitioners for tutorials on the Arnowitt–Deser–Misner (ADM) formalism and, ultimately, on the correct formulation of the Cauchy problem in general relativity.

It seems that the authors implicitly suggest that the future of numerical relativity is closely linked to our experimental ability to observe directly general relativistic effects at work. While astrophysics and gravitational waves have so far provided a rich arena for the applications, the intrinsic difficulties in detecting high-frequency gravitational waves with wide-band interferometers, such as LIGO and VIRGO, might suggest new cosmological applications of numerical techniques in the years to come. This book will take you into an exciting world populated by binary neutron stars and binary black holes.

Still, the achievements of numerical relativity (as well as those of all of the other areas of physics where large-scale computer simulations are extensively used) cannot be reduced simply to the quest for the most efficient algorithm. At the end of nearly 700 pages, the reader is led to reflect: is it wise to commit the research programme of young students and postdocs solely to the development of a complex code? After all, the lack of symmetry in a problem might just reflect the inability of physicists to see the right symmetries for the problem. A balanced perspective for potential readers can be summarized in the words of Victor Weisskopf, speaking about the proliferation of numerical methods in all areas of physics: “[…] We should not be content with computer data. It is important to find more direct insights into what a theory says, even if such insights are insufficient to yield the numerical results obtained by computers” (Joy of Insight: Passions of a Physicist, Basic Books, 1991).

LHC achieves 2011 data milestone


In June the LHC made good the promise of delivering an integrated luminosity of 1 fb⁻¹ to the general-purpose detectors, ATLAS and CMS. This was the target for 2011 and it was achieved a little before the middle of the year. At the same time, making use of a technique known as “luminosity levelling”, the LHCb experiment had already recorded around 0.36 fb⁻¹, well on the way to achieving its 1 fb⁻¹ by the end of the year.

Reaching the luminosity milestone was the result of a programme of steady increments in the number of bunches of protons injected into the LHC, with 144 bunches added per beam at each step. With each increment, the LHC provides three long “fills” of stable beams before the next step. By the end of May, the number of bunches had reached 1092 per beam, providing a peak luminosity of 1.25 × 10³³ cm⁻²s⁻¹ and a total energy per beam of some 70 MJ. A few long-lived fills soon yielded more than 40 pb⁻¹ at a time for the general-purpose detectors – nearly as much as the LHC delivered in all of 2010 – allowing the 2011 milestone to be reached by 17 June.

The step to 1236 bunches per beam followed on 24 June, with the successful increment only four days later to 1380 bunches per beam – the maximum for the current bunch spacing of 50 ns. Running during these last few days of June included one epic fill that was 19 hours long and delivered an integrated luminosity of 60 pb⁻¹.
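These numbers hang together in a simple back-of-envelope estimate: integrating a peak luminosity of order 10³³ cm⁻²s⁻¹ over a 19-hour fill, with some allowance for the decay of the beams, gives a few tens of inverse picobarns. The Python sketch below makes this explicit; the assumed luminosity lifetime of 15 hours is an illustrative guess, not a measured machine parameter.

```python
# Back-of-envelope estimate of the integrated luminosity of one long LHC fill,
# assuming an exponential luminosity decay L(t) = L0 * exp(-t / tau).
# The luminosity lifetime tau is an illustrative assumption, not a measured value.
import math

L0 = 1.25e33          # peak luminosity quoted in the text (cm^-2 s^-1)
tau = 15 * 3600.0     # assumed luminosity lifetime of 15 h (s)
t_fill = 19 * 3600.0  # length of the fill (s)

# Integral of L0 * exp(-t/tau) from 0 to t_fill
lumi_cm2 = L0 * tau * (1.0 - math.exp(-t_fill / tau))

# 1 pb = 1e-36 cm^2, so 1 pb^-1 corresponds to 1e36 cm^-2
lumi_pb = lumi_cm2 / 1e36
print(f"Integrated luminosity: about {lumi_pb:.0f} pb^-1")  # roughly 50 pb^-1
```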

At the same time, the technique of luminosity levelling has been employed to deliver a peak luminosity to the LHCb experiment of about 3 × 10³² cm⁻²s⁻¹. If the beams were allowed to collide head-on in the LHCb detector, this figure would be exceeded, so the beams are initially separated by about 15 μm in the vertical plane. Then, as the beam intensity decays during a fill, this separation is gently reduced to keep the luminosity constant at the acceptable maximum. The LHCb experiment has more specialized physics goals than ATLAS and CMS, and was designed to run at lower luminosity and low multiplicity, processing just one proton–proton interaction per bunch crossing. The decision to increase the bunch intensity in the LHC before increasing the number of bunches, as well as the excellent performance of the detectors, has inspired the collaboration to run with as many as six interactions per crossing. The successful implementation of luminosity levelling means that the physics output of LHCb can be maximized while staying within the limits of peak luminosity that the detector can handle.
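For Gaussian beams, a vertical offset d reduces the head-on luminosity by a geometric factor exp(–d²/4σ²), where σ is the transverse beam size at the interaction point, so levelling amounts to shrinking d as the bunch intensity decays. The Python sketch below illustrates the idea; the beam size, initial head-on luminosity and luminosity lifetime are placeholder values chosen only so that the initial separation comes out near the 15 μm quoted above, not actual 2011 machine parameters.

```python
# Toy model of luminosity levelling for round Gaussian beams.
# A vertical separation d reduces the head-on luminosity by exp(-d^2 / (4 sigma^2));
# the separation is recomputed as the head-on luminosity decays, so that the
# delivered luminosity stays at the target value for as long as possible.
# All numerical inputs are illustrative assumptions, not LHC machine parameters.
import math

sigma = 10e-6        # assumed transverse beam size at the interaction point (m)
L_target = 3e32      # levelled luminosity quoted in the text (cm^-2 s^-1)
L_headon_0 = 5.4e32  # assumed initial head-on luminosity (cm^-2 s^-1)
tau = 20 * 3600.0    # assumed head-on luminosity lifetime (s)

def separation_for(L_headon):
    """Vertical separation giving L_target, or zero once head-on falls below it."""
    if L_headon <= L_target:
        return 0.0
    return 2.0 * sigma * math.sqrt(math.log(L_headon / L_target))

for hour in range(0, 13, 3):
    L_headon = L_headon_0 * math.exp(-hour * 3600.0 / tau)
    d = separation_for(L_headon)
    print(f"t = {hour:2d} h: head-on L = {L_headon:.2e} cm^-2 s^-1, "
          f"separation = {d * 1e6:4.1f} um")
```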


As the total beam intensity of the LHC has been pushed up, the operators have encountered various problems, such as the “unidentified falling objects” (UFOs). These are thought to be dust particles falling through the beam, causing localized beam loss. The losses can push nearby beam-loss monitors over the threshold to dump the beam. This is more of an annoyance than a danger for the LHC, but it does reduce the operational efficiency.

A period of machine development began on 29 June, in which the operators made several investigations for further improvements in the LHC’s performance, including the next steps towards higher beam intensities. One test involved the successful injection of trains of 24 bunches with 25 ns spacing, with up to 216 bunches injected. In other tests, bunches at 50 ns spacing were filled to twice the nominal intensity, with individual bunches at 2.7 × 10¹¹ protons per bunch, the highest intensity achieved. These studies thus offer different paths to higher luminosities in the LHC.

Meanwhile, with the bumper crop of data already in hand, the LHC experiments are now working hard to get results ready for the main summer physics conferences: the European Physical Society’s High Energy Physics conference, being held in Grenoble on 21–27 July, and the Lepton-Photon conference, this year hosted by the Tata Institute in Mumbai on 22–27 August.

• For regular updates on the LHC, see the CERN Bulletin: http://cdsweb.cern.ch/journal/CERNBulletin/.

ALPHA traps antihydrogen for minutes

In November of 2010, the ALPHA collaboration at CERN’s Antiproton Decelerator (AD) grabbed the world’s headlines by trapping a handful of atoms of antihydrogen (CERN Courier January/February 2011 p7). The result demonstrated that it was, indeed, possible to produce trappable antihydrogen atoms. Now, the ALPHA team has shown that it can hold on to the trapped antiatoms for up to 1000 seconds and has succeeded in measuring the energy distribution of the trapped antihydrogen (ALPHA collaboration 2011).

Antihydrogen has been produced at CERN since 2002 by allowing antiprotons from the AD to mix with positrons in a Penning trap consisting of a strong solenoid magnet and a set of hollow, cylindrical electrodes for manipulating the particles. However, being neutral, the antiatoms are not confined by the fields of the Penning trap and annihilate in the apparatus. It has taken eight years to learn how to trap the antihydrogen, mainly because of the weakness of the magnetic dipole interaction that holds the antiatoms. The antihydrogen must be produced with a kinetic energy, in temperature units, of less than 0.5 K; otherwise it will escape ALPHA’s “magnetic bottle”. By contrast, the plasma of antiprotons used to synthesize the antihydrogen begins its time in ALPHA with an energy of up to 4 keV (about 50 million K).
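The conversion between these energy and temperature figures is simply E = k_B T; the short Python check below (not part of the ALPHA analysis, purely a consistency check of the quoted numbers) reproduces the orders of magnitude.

```python
# Converting between kinetic energy and "temperature units" via E = k_B * T,
# purely to check the figures quoted in the text.
k_B = 8.617e-5  # Boltzmann constant in eV/K

print(f"0.5 K trap depth  -> {0.5 * k_B * 1e6:.0f} micro-eV")    # about 43 micro-eV
print(f"4 keV antiprotons -> {4e3 / (k_B * 1e6):.0f} million K")  # about 46 million K
```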

The ALPHA antiatom trap consists of a transverse octupole magnet and two short solenoid or “mirror” coils – all fabricated at the Brookhaven National Laboratory (figure 1). This configuration produces a magnetic minimum at the centre of the device (CERN Courier March 2011 p13). Antihydrogen forms at the magnetic minimum and cannot escape if its energy is below 0.5 K. To see if there is any antihydrogen in the trap, the team rapidly shuts down the magnets (9 ms time constant). Any escaping antiatoms are revealed by their annihilation, which is registered in a three-layer, silicon vertex detector. In 2010, antiatoms were trapped for 172 ms, the minimum time necessary to make certain that no bare antiprotons remained in the trap, and the experiment detected 38 events consistent with the release of trapped antihydrogen.

The ALPHA team has subsequently worked to improve the trapping techniques, succeeding in particular in increasing by a factor of five the number of antiatoms trapped in each attempt; the total number trapped has now risen to 309. The improvements include the addition of evaporative antiproton cooling and optimization of the autoresonant mixing that helps to produce the coldest-possible antiatoms. The team then made measurements in which they increased the time in the trap from 0.4 to 2000 s, yielding 112 detected annihilations in 201 attempts (figure 2). The probability that the detected events are background from cosmic rays is less than 10⁻¹⁵ (8σ) at 100 s, and 4 × 10⁻³ (2.6σ) at 2000 s. Calculations indicate that most of these trapped antiatoms reach the ground state – which is crucial for future studies with laser and microwave spectroscopy.
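The quoted significances follow from the background probabilities under the usual one-sided Gaussian convention, as the sketch below shows; it is a cross-check of the arithmetic only, not the collaboration’s own statistical procedure.

```python
# Convert the quoted background probabilities into one-sided Gaussian significances.
from scipy.stats import norm

for p in (1e-15, 4e-3):
    z = norm.isf(p)  # z such that the one-sided tail probability P(X > z) equals p
    print(f"p = {p:.0e}  ->  {z:.2f} sigma")  # about 7.94 sigma and 2.65 sigma
```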

The distributions in space and time of the annihilations of the escaping antiatoms are already providing information about their energy distribution in the trap. This can be compared with a theoretical model of how the team thinks the antihydrogen is being produced in the first place.

The long storage time implies that the team can begin almost immediately to look for resonant interactions with antihydrogen – even if only one or two atoms occupy the trap at any given time. For example, resonant microwaves will flip the spin of the positron in the trap, causing a trapped atom to become untrapped and annihilate. The ALPHA collaboration hopes to begin studies with microwaves in 2011, aiming for the first resonant interaction of an antiatom with electromagnetic radiation. In the longer term, the ALPHA2 device will allow laser interaction with the trapped antiatoms in 2012 – the first step in what the team hopes will be a steady stream of laser experiments with ever-increasing precision.
