
GSI team first to trap superheavy element


An international team of researchers at GSI Darmstadt has successfully contained atoms of nobelium (atomic number 102) in an ion trap. This is the first time that a superheavy element has been trapped. It allowed the team to measure the mass of three isotopes of the element with unprecedented accuracy.

The measurements took place in the SHIPTRAP facility at GSI, which combines an ion trap with the Separator for Heavy Ion reaction Products (SHIP) – a velocity filter that has already been used in the discovery of six superheavy elements at GSI. SHIPTRAP consists of a stopping cell, an RFQ buncher and a double Penning trap system inside a 7 T superconducting magnet. The cell of high-purity helium stops and thermalizes radioactive nuclei, which SHIP delivers at energies of a few 100 keV/u. The stopped ions are extracted into the RFQ structure where they are cooled, accumulated, and bunched. The ions then enter the first Penning trap, where they are selected according to mass by a buffer-gas cooling technique with resolving power of about 50,000. Finally, a purified sample of ions is injected into the second Penning trap where their mass is determined precisely via their cyclotron frequency.
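The final step rests on the relation ν_c = qB/2πm: in a fixed magnetic field, an ion's mass follows from the ratio of its cyclotron frequency to that of a reference ion of well-known mass, which cancels the need to know B exactly. A minimal sketch of the arithmetic (the frequencies and reference mass below are invented for illustration, not SHIPTRAP data):

```python
def mass_from_frequency_ratio(nu_ref, nu_ion, m_ref, q_ref=1, q_ion=1):
    """Mass of the ion of interest from a cyclotron-frequency comparison.

    nu_c = q*B / (2*pi*m)  =>  m_ion = (nu_ref / nu_ion) * (q_ion / q_ref) * m_ref
    """
    return (nu_ref / nu_ion) * (q_ion / q_ref) * m_ref

# Illustrative numbers only: a singly charged mass-252 ion measured against a
# reference of exactly 252 u would give a frequency ratio very close to 1.
u = 931494.0  # one atomic mass unit in keV/c^2 (rounded)
m_ref = 252.0 * u
m_ion = mass_from_frequency_ratio(nu_ref=426000.0, nu_ion=425990.0, m_ref=m_ref)
print(m_ion / u)  # slightly above 252 u: the lower frequency means a heavier ion
```

A frequency ratio measured to a few parts in 10⁸ translates directly into a mass known to the same relative precision.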


The nobelium ions were produced in fusion reactions of a beam of ⁴⁸Ca and a target of lead foil (²⁰⁶–²⁰⁸Pb). They were then separated from the beam in the SHIP velocity filter, passing at a rate of less than one ion per second (in the case of ²⁵²No) into the stopping cell. The decelerated ions were extracted into the RFQ within a few milliseconds and then injected in pulses into SHIPTRAP’s double Penning trap system.

By directly comparing the cyclotron frequency of the nobelium ions in SHIPTRAP with the frequency of precisely known reference ions, the research team was able to determine the masses of the nobelium isotopes ²⁵²–²⁵⁴No to uncertainties of about 10 keV/c² – a relative precision of 0.05 ppm (Block et al. 2010). ²⁵⁴No is now the heaviest radionuclide to have its mass measured directly and ²⁵²No is the lowest-production-rate radionuclide whose mass has been measured with a Penning trap.
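The quoted precision can be checked with one line of arithmetic (the mass is rounded to A × u, so this is only a consistency check, not a re-derivation of the measurement):

```python
u_keV = 931494.0        # one atomic mass unit in keV/c^2 (rounded)
m_254No = 254 * u_keV   # rough mass of 254No in keV/c^2
sigma_keV = 10.0        # quoted uncertainty, keV/c^2

rel_ppm = sigma_keV / m_254No * 1e6
print(f"{rel_ppm:.3f} ppm")  # ~0.04 ppm, consistent with the quoted 0.05 ppm
```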

These mass values provide new, accurate reference points in the region of superheavy elements. The technique also holds promise for identifying elements on the way to the predicted “island of stability”. One of the next goals of SHIPTRAP is to extend these accurate mass measurements to the transactinide region, starting with long-lived rutherfordium isotopes that terminate decay chains originating from Z = 116.

• Element 112, first observed at GSI in 1996, now officially carries the name copernicium and the chemical symbol Cn, after approval by the International Union of Pure and Applied Chemistry (IUPAC). The name honours scientist and astronomer Nicolaus Copernicus. The discoverers had suggested Cp as the symbol, but as this abbreviation has other scientific meanings, they agreed with IUPAC on Cn. Copernicium is the heaviest element officially recognized by IUPAC.

STAR finds heaviest antinucleus

Studies of high-energy collisions of gold ions by the STAR collaboration at the Relativistic Heavy Ion Collider (RHIC), Brookhaven, have revealed evidence of the most massive antinucleus to date. The new antinucleus is an antihypertriton – a negatively charged state containing an antiproton, an antineutron and an anti-Λ. It is also the first antinucleus containing a strange antiquark.

The new state is, in effect, an antitriton in which an anti-Λ replaces one of the antineutrons. The STAR team identified it via its decay into antihelium-3 and a positive pion. Altogether, in an analysis of about a hundred million collisions, they found 70 ± 17 antihypertritons and 157 ± 30 hypertritons (consisting of pnΛ).

In heavy-ion collisions only a tiny fraction of the emitted fragments are light nuclei, but these states are of fundamental interest. The STAR team finds that the measured yields of hypertritons (antihypertritons) and helium-3 (antihelium-3) are similar. This suggests an equilibrium in the populations of up, down, and strange quarks and antiquarks, contrary to what is observed at lower collision energies.

Super-Kamiokande sees first T2K event


The international Tokai-to-Kamioka (T2K) collaboration announced the first detection of a long-distance neutrino in the Super-Kamiokande detector on 24 February. The neutrino had travelled 295 km under the Earth’s surface from the beamline at the Japan Proton Accelerator Research Complex (J-PARC) in Tokai, north of Tokyo, to the gigantic Super-Kamiokande underground detector in an old mine near the west coast of Japan.

The T2K experiment uses a high-intensity proton beam at J-PARC in Tokai to generate neutrinos that travel to the 50 kt water Cherenkov detector, Super-Kamiokande. The experiment follows in the footsteps of KEK-to-Kamioka (K2K), which generated muon neutrinos at the 12 GeV proton synchrotron at KEK. With the beam generated at the J-PARC facility, T2K will have a muon-neutrino beam 100 times more intense than in K2K.

The experiment has been built to make high-precision measurements of known neutrino oscillations, and to look for the so-far unobserved type of oscillation that would cause a small fraction of the muon-neutrinos produced at J-PARC to become electron-neutrinos by the time they reach Super-Kamiokande.
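The probabilities that T2K measures follow the standard two-flavour oscillation formula. A sketch with illustrative parameter values (the mass splitting and beam energy below are typical textbook numbers quoted for illustration, not T2K results):

```python
import math

def p_two_flavour(sin2_2theta, dm2_eV2, L_km, E_GeV):
    """Two-flavour oscillation probability:
    P = sin^2(2*theta) * sin^2(1.267 * dm^2 [eV^2] * L [km] / E [GeV])
    """
    return sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# T2K's 295 km baseline with an off-axis beam energy near 0.6 GeV sits close
# to the first oscillation maximum for dm^2 of roughly 2.4e-3 eV^2.
p = p_two_flavour(sin2_2theta=1.0, dm2_eV2=2.4e-3, L_km=295.0, E_GeV=0.6)
print(p)  # close to 1: most muon neutrinos change flavour over this baseline
```

The choice of baseline and energy is thus not incidental: it maximizes the oscillation signal in the muon-neutrino disappearance channel.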

CDF and DØ joint paper puts a further squeeze on the Higgs


Almost a decade after the experiments at CERN’s Large Electron–Positron (LEP) collider set a lower limit of 114.4 GeV/c² on the mass of the Higgs boson, the two experiments at Fermilab’s Tevatron, CDF and DØ, have been able to reduce further the allowed mass range for the missing particle in their first joint Run II publication.

In the proton–antiproton collisions observed at the Tevatron, the Higgs boson could be produced in the fusion of two gluons. If its mass is more than 140 GeV/c², it will usually decay into a pair of W bosons. The decay of the W bosons into a charged lepton (electron or muon) and a neutrino leads to three different signatures in the detectors: two electrons, two muons or an electron and a muon, in addition to “missing energy” from the undetected neutrinos. This is the key to reducing the background from the jets, which are copiously produced in hadronic collisions.

The Higgs boson is a scalar particle, i.e. it carries no spin. This fundamental property helps to distinguish the decays of the Higgs to two W bosons from other events that contain pairs of W bosons. The two charged leptons from the W boson decays in Higgs events are more likely to be close together than back-to-back in the detector. As a final step in seeking the Higgs, artificial neural networks are trained to distinguish a Higgs signal from background using a large number of kinematic variables.

Both Tevatron experiments have their best sensitivity at a Higgs mass of about 165 GeV/c², i.e. just around the combined mass of the two W bosons. With about 5 fb⁻¹ of collision data analysed, each experiment alone does not yet have sensitivity to exclude a Higgs boson if it is produced at the rate predicted by the Standard Model. Putting their data together, CDF and DØ can double the number of collisions used, breaking the “Standard Model barrier” for the first time since LEP.

Together, the experiments would expect about 70 Higgs events for a mass of around 165 GeV/c² but their combined data are consistent with the assumption that no Higgs events have been produced. This observation is translated into a limit that excludes a Higgs boson in the mass range 162–166 GeV/c² at the 95% confidence level.
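The logic of such an exclusion can be illustrated with a toy Poisson counting experiment (the real CDF/DØ combination uses full likelihoods over many channels; the background and observed counts below are invented):

```python
import math

def poisson_cdf(n_obs, mean):
    """P(N <= n_obs) for a Poisson distribution with the given mean."""
    return sum(mean ** k * math.exp(-mean) / math.factorial(k)
               for k in range(n_obs + 1))

# Toy: suppose background alone predicts 100 events, a Standard-Model Higgs
# would add 70 signal events, and 100 events are observed.
bkg, sig, observed = 100, 70, 100

# Probability of seeing this few events if the signal were really there:
p_sig_plus_bkg = poisson_cdf(observed, bkg + sig)
print(p_sig_plus_bkg)  # very small: signal+background is strongly disfavoured
if p_sig_plus_bkg < 0.05:
    print("excluded at >95% confidence (in this simple counting picture)")
```

When the data are consistent with background alone, the signal-plus-background hypothesis can be rejected at the 95% level, which is the structure of the 162–166 GeV/c² exclusion.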

The paper describing the combination is the first joint publication of the two collaborations using data from Run II of the Tevatron, which started in 2001. The publication, with 1042 authors, will appear in Physical Review Letters together with the individual results in separate letters. The data used represent about half the number of collisions that will eventually be recorded by CDF and DØ. This will give them the opportunity to increase significantly the sensitivity of Higgs searches in the future.

Together with the precision electroweak data that favour a low-mass Higgs boson, these new results indicate that the most likely mass for the Higgs boson – if it exists – is somewhere between the LEP and Tevatron limits of 114 and 162 GeV/c².

Collaboration: the key to unlocking exotic nuclei


The most sophisticated physics experiments today take place at large facilities – accelerators – which in turn need financial investment on a scale that no single country, however highly developed, can provide. Such investigations can be carried out only through collaboration between scientific centres in several countries, each bringing financial and intellectual contributions to the development of the cutting-edge facilities that make it possible to penetrate deeper into the secrets of matter. The LHC at CERN, for example, is the result of contributions from 20 member states and many other countries worldwide. Similarly, the Joint Institute for Nuclear Research (JINR) in Dubna, with 18 member states, is home to several accelerators. This internationally based research leads to new information not only in physics but also in other fields, such as astronomy, condensed-matter physics and modern technology. The methods used are also of great importance for interdisciplinary fields, such as nanotechnology, medicine and microelectronics.

The physics of nuclei in exotic states is one of the most important and rapidly developing areas of nuclear physics. Researchers can now produce nuclei in extreme states: nuclei with high angular momentum (rapidly rotating nuclei); nuclei with high excitation energy (“hot” nuclei); highly deformed nuclei (with unusual super- and hyper-deformed shapes); nuclei with extremely large numbers of neutrons or protons (neutron-rich and proton-rich nuclei); and superheavy nuclei with a proton number, Z, above 110. Investigations of nuclear matter in such extreme states provide important information about the properties of the microcosm and make it possible to model a variety of processes taking place in the universe.

These studies, which rely on collaborative effort between countries, were the subject of the international symposium EXON 2009, held in the Black Sea resort of Sochi, Russia, on 28 September – 2 October. The symposium was organized by the four largest centres involved in the investigation of exotic nuclear states: JINR in Russia; the Grand Accélérateur National d’Ions Lourds (GANIL) in France; the GSI Helmholtzzentrum für Schwerionenforschung in Germany; and the research centre RIKEN in Japan. Some 140 scientists from institutes in 24 countries attended: about 40 participants came from JINR and 16 from other institutes across Russia, while the largest delegations from outside Russia came from Germany (20 participants), France (16), Japan (12) and the US (8).

EXON 2009 was the fifth in this series of symposia on exotic nuclei, all of which have been held in Russia, the first one in 1991. All have been of interest not only for the organizers, but also for participants from other research centres. In addition to the discussions of scientific problems and the collaboration necessary to address them, participants have the opportunity to become acquainted with some of the most remarkable places in Russia, while the local research authorities and universities find out about the latest results in nuclear physics and the possibilities of applications in interdisciplinary fields of science and technology.

The scientific programme included invited talks about pressing problems in the physics of exotic nuclei as well as about new projects for large accelerator complexes and experimental facilities. The main discussions about the properties of nuclei at the limits of nucleon stability took place on the first day, with reports on newly observed unusual states at high values of the neutron-to-proton ratio. The topics included: the change of the “accepted” magic numbers when approaching the limit of neutron stability; the coexistence in the same nucleus of two or more types of deformation; and the increase of nuclear stability resulting from deformation, which is important for understanding the stability of pure neutron matter. As UNESCO had declared 2009 the International Year of Astronomy, one talk was dedicated to investigations in this field. Shigeru Kubono from the University of Tokyo discussed the possibilities of studying important astrophysical problems with the use of radioactive secondary beams.

Superheavy elements

In addition to the talks on light exotic nuclei, other reports covered the results of the latest experiments on the synthesis and properties of superheavy elements. Joint experiments by JINR’s Flerov Laboratory of Nuclear Reactions (FLNR), GSI and the Paul Scherrer Institute have found interesting results on the chemical identification of elements 112 and 114 at the FLNR U400 cyclotron, as Heinz Gäggeler of PSI described. Speakers from different countries reported on a range of investigations of the properties of the superheavy elements using different methods. These reports underlined the importance of the investigations of superheavy elements that are carried out in Dubna by existing collaborations. One striking example is the experiment aimed at the synthesis of element 117 that is currently being performed at the U400 cyclotron by a large group of physicists and chemists under the guidance of Yuri Oganessian and Sergei Dmitriev, in collaboration with scientists from different laboratories in the US, who provided the target material of ²⁴⁹Bk. In addition, theoretical presentations included predictions of possible reactions for synthesizing superheavy elements and of their chemical properties.

A second day was dedicated to reports on the current and future heavy-ion and radioactive beam accelerator complexes in different scientific centres. The four laboratories that co-organized the symposium are currently creating a new generation of accelerators that will make it possible to improve considerably the work on the synthesis and studies of the properties of new exotic nuclei. There were detailed talks on the SPIRAL project at GANIL, the RI Beam Factory at RIKEN, the Facility for Antiproton and Ion Research (FAIR) at GSI and the DRIBs project at JINR. In his talk, Mikhail Itkis, JINR’s vice-director, presented plans for the development of the institute’s accelerator facilities, including the new complex, NICA. Georg Bollen of Michigan State University reported on the project for the Facility for Rare Isotope Beams (FRIB), now funded and to be built at the university. In this way, more centres are joining the group of institutes that are developing a new generation of accelerator complexes.

There were also presentations about other facilities for the production of radioactive beams, including ALTO in Orsay, EXCYT in Catania, RIBRAS at the University of São Paulo and the radioactive beams project at the Cyclotron Institute of Texas A&M University. The discussions around these talks showed that beams of radioactive nuclei are fundamental to investigations of the properties of nuclear matter in extreme states.

Round-table discussions also took place during the symposium to consider the results obtained in joint work and possible future collaborations. Bollen, a leader of the FRIB project, suggested including Michigan State University as a co-organizer of the next symposium, EXON 2012, which could take place in the city of Vladivostok in the Russian Federation.

• For full details about the scientific programme and speakers, see http://exon2009.jinr.ru/. There were about 80 talks in total and some 40 posters shown, all of which will be published in conference proceedings by the American Institute of Physics.

Cosmic rays, climate and the origin of life

In his pioneering work on supersaturated vapours, which began in the 1890s, C T R Wilson found that droplets condensed on the ionization trails left by charged particles. This led to many positive advances, among the most significant of which was the use of “cloud chambers” in ushering in the field of elementary particle physics. Just over 50 years ago, Edward Ney at the University of Minnesota suggested that cosmic rays might have an influence on the climate (Ney 1959). He proposed that ions from cosmic rays act as condensation centres for cloud droplets. It is immediately obvious that this is not the phenomenon that Wilson discovered. To make ionization trails visible, cloud chambers need very clean conditions and supersaturation at a level about four times greater than saturation. By contrast, it is rare to find such clean conditions in the atmosphere and supersaturation levels are almost never more than 1% above saturation.

More recently, in 1997 Henrik Svensmark and Eigil Friis-Christensen reported a link between clouds and elementary particles (specifically cosmic rays) that has been used to claim that changes in cosmic-ray intensity cause changes in cloud cover and could affect global warming (Svensmark 2007). In this article we describe work that examines this claim critically. We also touch on the fascinating topic of lightning initiated by cosmic-ray showers and its possible role in the origin and evolution of life.

Cosmic rays and cloud cover


Figure 1 shows the basic evidence used to claim a causal correlation between cosmic-ray intensity (measured by the neutron monitor operated by the University of Chicago at Climax, Colorado) and low cloud cover (at altitudes below 3.2 km as measured by satellites). The data in the figure are for the period 1985 to 2008, which covers two solar cycles. In the 22nd cycle (1985–1996) the correlation between cosmic-ray intensity and low cloud cover was strong and this was the origin of the claim. However, the next (23rd) solar cycle has now passed (1996–2007) and the correlation is much more difficult to see. This suggests that the 1985–1996 observation might have been “accidental” and the effect of something completely different (such as temperature). Nevertheless, we will put this to one side and consider whether the apparent correlation is causal.
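The strength of such a claimed correlation is usually quantified by a correlation coefficient. A toy illustration of how a strong coefficient can arise over one cycle (the series below are invented, not the Climax or satellite data):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented 11-point "solar cycle" series: one tracks the other closely over
# this single window, giving a coefficient near 1 even if the underlying
# connection is indirect (e.g. both driven by a third variable).
cosmic = [0, 1, 2, 3, 4, 5, 4, 3, 2, 1, 0]
cloud = [0.1, 0.9, 2.1, 3.2, 3.8, 5.1, 4.2, 2.8, 2.2, 1.1, -0.1]
print(round(pearson_r(cosmic, cloud), 3))  # close to 1 for these toy series
```

This is exactly why the failure of the correlation to persist into the 23rd cycle matters: a high coefficient over one window establishes association, not causation.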


A test of the causal hypothesis is to examine the correlation as a function of geomagnetic latitude. The 11-year cosmic-ray variation becomes bigger at higher magnetic latitudes because of the effect of the Earth’s magnetic field. Fewer low-energy cosmic rays enter the Earth’s atmosphere near the magnetic equator than near the poles. This effect is measured by the vertical rigidity cut-off (VRCO) – the minimum rigidity for a primary cosmic ray to reach the Earth’s atmosphere – which is computed from the local value of the planet’s magnetic field. Our analysis looked at the differences between the low cloud cover at solar minima in 1985 and 1996 and that at solar maximum in 1990 at different VRCO (Sloan and Wolfendale 2008). These were then compared with the changes in the cosmic-ray rate as measured from neutron monitors located around the world (figure 2). If the dip in the low cloud cover observed in 1990 was caused by the decrease in ionization from cosmic rays then all of the points in figure 2 would follow the line of the cosmic-ray variation, marked NM. They do not.
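The latitude dependence of the cutoff can be estimated with the classic Störmer dipole approximation (a sketch only; the VRCO values used in the analysis come from detailed geomagnetic field models):

```python
import math

def stormer_vrco(geomag_latitude_deg):
    """Approximate vertical rigidity cut-off (in GV) for a dipole field:
    R_c ~ 14.9 * cos^4(geomagnetic latitude).
    """
    lam = math.radians(geomag_latitude_deg)
    return 14.9 * math.cos(lam) ** 4

for lat in (0, 30, 50, 70):
    print(lat, round(stormer_vrco(lat), 2))
# The cutoff falls steeply away from the magnetic equator, so high-latitude
# neutron monitors see a larger share of the low-energy, solar-modulated flux.
```

This steep cos⁴ dependence is what makes geomagnetic latitude such a sharp discriminator: if ionization drove cloud cover, the cloud variation would have to follow the same latitude profile as the cosmic-ray variation.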

Cosmic rays are not the only source of ionization in the atmosphere. We have looked for changes in cloud cover associated with a variety of other sources. The ionization released from nuclear weapon tests in the atmosphere was one example that we examined. At large distances from the test centre, radiation levels are high but other effects of the blast are negligible. For example, measurements showed that the Bravo test (the largest of the US tests), which exploded a 15-megatonne device at Bikini Atoll on 1 March 1954, produced radiation levels of 100 R/h at a distance of 480 km from the explosion. This corresponds to 5 × 10⁷ ion pairs/cm³, i.e. seven orders of magnitude more ionization than that produced by cosmic rays. However, no effects on cloud cover were observed. This shows that the efficiency for conversion of ions to cloud droplets must be low. Similarly, we examined radon concentrations in various parts of the world to see if high-radon regions had more cloud cover than their neighbours with low-radon concentrations. We also examined the ionization released in the Chernobyl disaster in 1986. Again, we did not find any significant effects of ionization on cloud cover.
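The quoted ionization figure follows from a standard dosimetry conversion, read here as a production rate per second (the roentgen-to-ion-pair factor below is approximate, so only order-of-magnitude agreement is claimed):

```python
IP_PER_R = 2.1e9           # ion pairs created per cm^3 of air per roentgen (approx.)
dose_rate_R_per_h = 100.0  # radiation level quoted for the Bravo test fallout

# Convert the hourly dose rate into an ionization production rate per second.
q = dose_rate_R_per_h * IP_PER_R / 3600.0
print(f"{q:.1e} ion pairs/cm^3/s")  # a few times 10^7, the order quoted above
```

Cosmic rays at ground level produce only of order a few ion pairs per cm³ per second, which is where the "seven orders of magnitude" comparison comes from.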

Recently, Svensmark’s group examined the so-called Forbush decreases in cosmic-ray intensity, which are caused by solar coronal mass ejections. The group found that the six strongest (over the past 20 years) are followed by significant drops in low cloud cover and in other indicators of atmospheric water content. We have examined the evidence in detail and concluded that it is not only statistically weak but that it also needs unphysically long periods (6–9 days) for the change in cosmic-ray flux to manifest itself as changes in cloud cover or the cloud water content.

The correlation between low cloud cover and cosmic rays in figure 1 is presumably therefore not causal because we have found that ionization is not efficient at yielding cloud cover. A more likely cause relates to solar irradiance, not least because the change in energy content of solar irradiance is about 10⁸ times that of cosmic rays. In this context, Mirela Voiculescu of Dunarea de Jos University in Romania and colleagues showed correlations between low cloud cover and either the cosmic-ray rate or the solar irradiance in limited geographical areas (Voiculescu et al. 2006). Such areas cover less than 20% of the area of the globe. A close examination of these geographical areas reveals that only the correlation between the solar irradiance and the cloud cover is seen in both solar cycles. By contrast, any correlation that there is with cosmic rays does not appear in both cycles.

Variation in solar irradiance over the 11-year solar cycle is a much more plausible cause of any correlation with cloud cover than cosmic rays; indeed, Joanna Haigh of Imperial College London has modelled such an effect (Haigh 1996). A comparison of the long-term variation of the global average surface temperature with the long-term solar activity shows that less than 14% of the observed global warming in the past 50 years comes from variations in the solar activity.

This is not to say that ionization has no effect on climate at all. There may well be an interesting effect involving the terrestrial electrical circuit, which seems to be affected by cosmic rays. No doubt the CLOUD experiment underway at CERN will throw further light on this problem and tell us just how large such an effect may be.

Lightning and the origin of life

One fall-out of the work described above has been interest in the role of cosmic rays in a particularly dramatic cloud effect: lightning. Alexander Gurevich and Kirill Zybin of the P N Lebedev Physical Institute in Moscow suggested in 2002 that extensive air showers (EAS) created by cosmic rays play a key role in initiating the leader strokes in lightning. This has been confirmed in more recent observations at the Lebedev Institute’s Tien-Shan Mountain Cosmic Ray Station by Gurevich and colleagues.

This phenomenon has a possible relevance to the origin of life on Earth. The current favourite models for this origin are either on comets in outer space, as Fred Hoyle and Chandra Wickramasinghe suggested, or in the black smokers or alkaline vents that result from volcanic activity in the deep oceans. However, another possibility follows from the famous early experiments of Stanley Miller and Harold Urey, in which they passed a spark through a mixture of liquids (water, methane, ammonia, etc) – the “prebiotic soup”. This resulted in the appearance of the basic building blocks of life, such as amino acids and RNA monomers. One problem, however, was that the available spark energy, from lightning, was thought to be inadequate.


This is where the long-term variability of EAS rates may have relevance. We have shown that there should have been periods during which the EAS rate was higher by orders of magnitude than at present (Erlykin and Wolfendale 2001). Our theory is based on the statistical nature of supernova explosions, which are thought to be the originators of high-energy cosmic rays. Figure 3 shows how, from time to time, periods of high cosmic-ray intensity lasting tens of thousands of years will occur, as a nearby supernova explodes. This will lead to high lightning rates. One of these, occurring at around 4 Gyr before the present (a not unlikely occurrence), could have led to the formation of the building blocks of life via the Miller–Urey mechanism. Life could then have evolved from such a start.
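The "not unlikely" claim can be made quantitative with a simple Poisson estimate (every number below is an illustrative assumption, not a result from the cited work):

```python
import math

# Toy numbers: a Galactic supernova rate of ~2 per century, with an assumed
# (invented) small fraction f of events near enough to boost the local
# cosmic-ray intensity substantially.
rate_per_year = 0.02
f_nearby = 1e-5
T_years = 4e9  # roughly the interval back to ~4 Gyr before the present

expected = rate_per_year * f_nearby * T_years   # mean number of nearby events
p_at_least_one = 1.0 - math.exp(-expected)      # Poisson chance of >= 1 event
print(expected, p_at_least_one)
# With these assumptions many such episodes are expected over 4 Gyr, so at
# least one is essentially certain.
```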


Perhaps less speculative is the role of NOx (NO + N₂O) generated by lightning strokes. It seems that nearly 20% of the contemporary concentration of NOx is produced by lightning, and its rate of production would certainly have varied considerably. NOx is poisonous to mammals but promotes growth in plants; thus an effect on the evolution of species, both positive and negative, is likely.

In conclusion, the interaction of cosmic rays with the Earth’s atmosphere is a topic of considerable interest. Although it is unlikely that cosmic rays are a significant contributor to global warming, their contribution to the pool of aerosol-cloud condensation nuclei could be non-negligible; the CLOUD experiment has a big role to play in elucidating the interesting science involved. On a wider canvas it would not be surprising if electrical effects in the atmosphere, initiated by cosmic rays, played a role in the evolution of the Earth’s inhabitants.

• The authors are grateful to the John Taylor Foundation for supporting this work.

Lepton Photon goes back to its roots in Hamburg


The first time that the International Symposium on Lepton and Photon Interactions at High Energies took place in Hamburg, in 1965, it was in its earlier incarnation, which referred to electrons rather than to all leptons. The DESY electron synchrotron had started up the previous year, so the young laboratory was an obvious host for a conference that was relatively specialized. Since then, high-energy electrons have revealed the reality of quarks and the complex nature of the proton; muons have provided signatures of new states of matter, from charmonium to quark–gluon plasma; neutrinos from beyond the Earth have given glimpses of physics beyond the Standard Model; and photons have begun to offer a new view of the high-energy universe. “Lepton Photon” has thus grown to encompass all of particle physics and the 24th symposium, held in Hamburg during DESY’s 50th anniversary year, was no exception.

Within its standard format of invited plenary sessions only, Lepton Photon 2009 presented a clear and concise overview of particle physics today. Expectations for the future formed a recurrent theme, not only in view of the imminent start-up of the LHC but also looking to upgrades, new experiments and facilities to push frontiers in energy and luminosity. This report will focus mainly on recent results presented at the conference in topics varying from QCD and heavy ions to neutrinos and dark matter.

When the conference was originally planned, it seemed likely that it would be dominated by the first collisions at the LHC. That news now falls to the summer conferences of 2010, but the LHC still loomed large in the presentations at Lepton Photon 2009. The first scientific session heard the latest news about the steady progress towards the restart, following the incident of September 2008. The four major experiments – ALICE, ATLAS, CMS and LHCb – took advantage of the prolonged shutdown to complete installation work, implement improvements and make thorough tests with cosmic rays: efforts that led to a highly successful restart in November and December last year.

HERA’s harvest


The harvest of data from HERA – the world’s only electron–proton collider, which ran from 1992 until 2007 – continues to paint a remarkably clear picture of the internal workings of the proton within the context of QCD, the theory of the strong force. The precision that comes from combining HERA-I data (1996–2000) from the H1 and ZEUS experiments yields impressively accurate distribution functions for the gluons and the quark–antiquark sea in the latest QCD analysis at next-to-leading order (NLO), especially at low values of the momentum fraction. Both H1 and ZEUS have made the first measurements of the structure function, FL, at low x and there are also new results from ZEUS with improved precision at high values of momentum-transfer-squared (high Q2).

The HERMES collaboration at HERA took a different approach by observing the collisions of the electron beam with a gas target. The analysis of kaon production from deuterons indicates that the density of strange quarks – present in the quark–antiquark sea in protons and neutrons – varies differently with x than does that of the sea of lighter quarks. H1, meanwhile, has new measurements for charm and bottom quarks, which agree with QCD analyses.

The main aim of HERMES was to learn more about contributions to the nucleon’s spin, the goal also of COMPASS at CERN (using muons), fixed-target experiments with electrons at Jefferson Lab and polarized proton–proton collisions at RHIC at Brookhaven. Results from these studies have fed the first global NLO QCD analyses of both polarized deep-inelastic lepton–nucleon and proton–proton scattering. The results reinforce the puzzling discovery that the quarks and antiquarks contribute only 25–35% of the nucleon’s spin. They also indicate a large negative contribution from the strange quark at low x, with small contributions so far from the gluon, derived for the first time from the polarized proton–proton data, but subject to large uncertainties.

A fuller view


While parton distribution functions (PDFs) give a picture of the momentum fraction carried by the constituents in a nucleon, generalized PDFs (GPDFs) give a fuller view that includes information on longitudinal and transverse momentum, which should allow the contribution of orbital angular momentum to the nucleon’s spin to be derived. GPDFs can be extracted from measurements of deeply virtual Compton scattering. The e1-dvcs experiment with the CLAS detector at Jefferson Lab has made an extensive set of high-quality measurements of the beam-spin asymmetry, which will constrain the GPDFs. Also at Jefferson Lab and elsewhere, experiments have studied transversity, which gives a measure of helicity-flip. Last summer the HERMES collaboration reported clear evidence for a non-zero “Sivers effect” in semi-inclusive deep-inelastic scattering from a transversely polarized target, which suggests a non-zero orbital angular momentum for the quarks in a nucleon. A recent fit to data from both HERMES and COMPASS to determine the Sivers function indicates that the orbital angular momentum is mainly from the valence quarks.

Fermilab’s Tevatron continues its Run II, which began in 2001, with proton–antiproton collisions at a total energy of 1.96 TeV. Here the study of jets of particles reveals the hard scattering of the quarks. The DØ collaboration has now measured the angular distribution of pairs of jets (dijets) at 1.96 TeV collision energy, for dijet masses ranging from 0.25 TeV to more than 1.1 TeV – in effect, the first “Rutherford” experiment to go above 1 TeV, a century after Hans Geiger and Ernest Marsden published their results on alpha-particle scattering, which gave the first evidence for Rutherford scattering. This sets the most stringent limits to date on the scale for quark structure, Λ > 2.9 TeV, and also on the scale of extra dimensions.

The large amounts of data accumulated in Run II provide a major test-bed for QCD and an important hunting ground for new particles and new physics. By the time of the conference, the collider had delivered 7 fb–1 and the collaborations had analysed 2.7 fb–1. The CDF and DØ experiments have high-precision results that agree well with NLO perturbative QCD for inclusive jets and dijets, setting limits on new particles with masses up to more than 1.2 TeV. By contrast, there are discrepancies that are still to be understood in the production of isolated photons.

CClep4_02_10

While results such as these from HERA and the Tevatron continue to consolidate QCD, there has also been impressive theoretical progress in making more precise predictions, in particular in higher-order calculations in readiness for the LHC. Leading-order calculations are already automated and are beginning to include more final-state particles, as in 2→6 body. There are important breakthroughs at NLO, with the first calculation of a 2→4 body cross-section, qq̄→tt̄bb̄, in 2008 and developments in automation, for example in calculating W+3 jets, in 2009. These are important for estimating backgrounds to searches at the LHC. At next-to-NLO, there is progress in calculations on processes that will provide “standard candles” at the LHC.

At the same time, lattice QCD is moving from simulation to the calculation of real physical quantities, a quarter of a century after its invention. Improved algorithms with light quarks have led to new results on the hadron spectrum, with masses agreeing well with experiment. Contributions to flavour physics are also progressing with improved inputs for the Cabibbo–Kobayashi–Maskawa (CKM) matrix. A steady increase in computing power, to the petaflop scale, should lead to further improvements through simulations with smaller spacings (from 0.1 to 0.05 fm) on larger volumes (3–6 fm scale), which will be better suited for studies of QCD in hot, dense matter.

The ultimate test for QCD lies arguably in the hot and dense matter that forms in relativistic heavy-ion collisions and in determining its equation of state and bulk thermodynamic properties. Lattice QCD provides access to this extreme state through simulation, while RHIC at Brookhaven has been the main scene for such studies since 2000. The elliptic flow observed at RHIC is consistent with a phase transition – and this is what recent lattice-QCD simulations also clearly indicate. It is also consistent with the formation of an almost perfect fluid, with a ratio of shear viscosity to entropy density almost 10 times lower than that of superfluid helium. Intriguing puzzles remain, however. The BRAHMS and PHOBOS experiments at RHIC shut down in 2006, but STAR and PHENIX are being upgraded. The LHC will also target hot QCD matter and perhaps observe the kinds of shockwave described in hydrodynamical calculations of a fluid-like medium.

Making inroads

CClep5_02_10

In the electroweak sector, the Tevatron continues to make inroads into areas that were out of reach for experiments at the Large Electron Positron (LEP) collider and the SLAC Linear Collider, in particular by measuring the W boson and the top quark as never before. Fourteen years after the discovery of the top quark, through tt̄ pair-production, both CDF and DØ finally observed the electroweak production of single top quarks in 2009. The combined results presented at the conference yield σt = 2.76 +0.58 –0.47 pb; they also allow a measurement of the CKM matrix element, |Vtb| = 0.91 ± 0.08. Together, the experiments now know the top-quark mass to 0.7%, with a combined measurement of 173.1 ± 0.6 (stat.) ± 1.1 (syst.) GeV. Other results include a new world average for the mass of the W boson of 80399 ± 23 MeV, incorporating Tevatron data that give an average of 80420 ± 31 MeV.

The Tevatron experiments also continue to chip away at channels that are difficult to pull out of the data, but which will be important in searches for the Higgs boson at the LHC. For example, both DØ and CDF have now observed the production and decay of a pair of Z bosons to four leptons – the smallest cross-section of diboson states in the Standard Model – at significances of 5.3σ and 5.7σ, respectively.

CClep6_02_10

For real progress, the Standard Model is still screaming out for hard evidence for (or against) the Higgs boson. Direct searches at the Tevatron now exclude a Standard Model Higgs with a mass in the range 160–170 GeV (at 95% CL), while precision measurements, including the Tevatron’s masses for the W boson and top quark, push the mass below 163 GeV. By 2011, or soon after, the Tevatron should provide sufficient luminosity to exclude the Standard Model Higgs directly – or provide evidence for it.

In addition to squeezing the Higgs, CDF and DØ continue to search for new phenomena, but so far without success. At the same time, a variety of experiments are putting pressure on the Standard Model, searching for cracks that might lead to new physics. The Standard Model, meanwhile, remains so impervious that even effects not much bigger than 2σ attract attention: at HERA, combined data from H1 and ZEUS show a slight excess (2.6σ) in high-momentum-transfer e+p interactions with multilepton final states.

Low-energy experiments also offer a route to new physics, for example through measurements of finite electric dipole moments (EDMs) and rare muon decays. Here, the experiment at the University of Washington in Seattle delivered an important result in 2009, with a new limit on the EDM of 199Hg of <3.1 × 10–29 e cm (at 95% CL) – a factor-of-seven improvement on the previous upper limit. The collaboration has further improvements in the pipeline, which should increase the experimental sensitivity by a factor of 3 to 5. In the search for rare muon decays, the MEG experiment at PSI found a preliminary result for the branching ratio of μ+ → e+γ of <3.0 × 10–11 from data collected in 2008.

Flavour physics offers a different line of attack, in particular through the CKM matrix, which links the different quark flavours. Testing the unitarity of the matrix ultimately tests the integrity of the Standard Model. New measurements of nuclear β-decays and from the KLOE, CLEO-c, Belle and BaBar experiments, as well as from CDF and DØ, continue to probe the matrix with increasing precision, with the result that the magnitudes of the matrix elements agree well with unitarity, although there are some small (up to 2σ) inconsistencies in results from different analyses. The global fit to the unitarity triangle is also good, with the angles summing to (185 ± 13)°, although again there is some tension concerning sin2β at the 2σ level. High luminosity at the B factories at KEK and SLAC is making possible an impressive series of measurements of rare B decays with the potential to expose new physics. The decay B→τν, for example, which puts constraints on a charged Higgs particle, disagrees with the CKM fit at the 2.4σ level.
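The unitarity-triangle test amounts to checking the standard CKM relation, whose three complex terms form a closed triangle, so its angles must sum to 180° if the Standard Model is complete – which is why the measured (185 ± 13)° is a meaningful consistency check. In conventional notation:

```latex
V_{ud}V_{ub}^{*} \,+\, V_{cd}V_{cb}^{*} \,+\, V_{td}V_{tb}^{*} \;=\; 0,
\qquad
\alpha + \beta + \gamma \;=\; 180^{\circ}.
```

Any statistically significant deviation of the angle sum from 180°, or a failure of the triangle to close, would signal physics beyond the Standard Model.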

Neutrinos have been the only particles that have so far provided a playground outside the Standard Model, with the discovery of neutrino oscillations – and hence neutrino mass – in atmospheric and solar neutrinos some 10 years ago. Since then various experiments have pinned down oscillation parameters to the level of a few per cent, with different types of experiment being suited to different parameters. For example, experiments with solar electron-neutrinos and reactor electron-antineutrinos give access to the mass-squared difference Δm²21 and the mixing angle θ12. The reactor experiment KamLAND finds Δm²21 = 7.58 × 10–5 eV2 (to a level of 2.7%) and tan2θ12 = 0.56 (to ~25%), compared with the global result from the solar-neutrino experiments of Δm²21 = 4.90 × 10–5 eV2 (~34%) and tan2θ12 = 0.437 (~10%). The Borexino experiment is now producing interesting results for solar neutrinos, over an energy range that includes electron-neutrinos from 7Be and the carbon-nitrogen-oxygen cycle as well as from 8B.
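For orientation, parameters such as these enter the familiar two-flavour vacuum survival probability, P = 1 − sin²2θ · sin²(1.27 Δm²[eV²] L[km]/E[GeV]). The sketch below evaluates it with the KamLAND-like numbers quoted above; the function name and the ~180 km baseline are illustrative assumptions, and the real analyses use full three-flavour fits with matter effects.

```python
import math

def survival_probability(delta_m2_ev2, sin2_2theta, baseline_km, energy_gev):
    """Two-flavour vacuum survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
    phase = 1.27 * delta_m2_ev2 * baseline_km / energy_gev
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Illustrative inputs: the tan^2(theta_12) quoted in the text, converted to
# sin^2(2*theta), an assumed ~180 km reactor baseline and a 3 MeV antineutrino.
tan2_theta = 0.56
sin2_2theta = 4.0 * tan2_theta / (1.0 + tan2_theta) ** 2   # ~0.92
p = survival_probability(7.58e-5, sin2_2theta, 180.0, 0.003)
```

The rapid oscillation of the phase with L/E is what lets reactor and solar experiments pin down Δm²21 while remaining insensitive to the larger atmospheric mass splitting.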

Using the muon-neutrino beams at the Neutrinos at the Main Injector facility at Fermilab, the MINOS experiment has measured the disappearance of muon-neutrinos, observing 848 events against an expectation of 1060 ± 60 for no oscillations and disfavouring other theoretical possibilities at a level of 6σ. MiniBooNE is, by contrast, investigating oscillations at lower energies with neutrinos from the Fermilab Booster neutrino beam. Set up to investigate the excess of electron-antineutrino events seen in a muon-antineutrino beam by the LSND experiment at Los Alamos, MiniBooNE finds no significant excess across an energy range of 200–1250 MeV, but the results are as yet inconclusive regarding oscillations with Δm2 at the 1 eV2 scale suggested by the LSND result. Intriguingly, however, MiniBooNE does continue to observe an excess of electron-like events in the muon-neutrino beam, in the energy region between 200 and 475 MeV, as first reported in 2007.

Neutrinos from outer space have the potential to provide a new view of the cosmos, but their sources continue to elude discovery. There is more success with charged cosmic rays, where the Pierre Auger Observatory is making headway in the study of ultra-high-energy cosmic rays, with as many as 58 events above 55 EeV (55 × 1018 eV). The latest results confirm the extragalactic origin of these ultra-high-energy particles and their anisotropic distribution and underpin the collaboration’s enthusiasm for an Auger North array in the Northern Hemisphere to complement the existing Auger South array in Argentina.

The greatest success in pinning down sources comes from the cosmic gamma-ray experiments, with Cherenkov arrays such as HESS and MAGIC complemented by the new spacecraft Fermi and AGILE. At very high energies the number of identified sources has risen from 12 in 2003 to an impressive 96 in 2009, which includes new categories such as starburst galaxies (2) and Wolf-Rayet objects (3) as well as the more familiar active galactic nuclei (24) and pulsar wind nebulae (23). The Fermi-LAT collaboration has also significantly increased the number of identified sources of high-energy gamma rays, finding 205 with a significance level of more than 10σ.

CClep7_02_10

Cosmic radiation is also offering a tantalizing window on dark matter to complement the direct laboratory-based searches for dark-matter particles. The direct searches have seen much progress in looking for the hypothesized axions and weakly interacting massive particles, but confirmed detection remains elusive. Similarly, cosmic rays provide conflicting and unconfirmed evidence. The Fermi-LAT collaboration and the PAMELA experiment find excesses of electrons and positrons, respectively, which could indicate dark matter but are probably effects from nearby pulsars.

From QCD to dark matter, Lepton Photon 2009 took a sweeping view across the whole range of particle physics today. While the Standard Model stands firm, many questions remain unanswered. In the closing talk, Guido Altarelli raised the spectre of the anthropic solution – perhaps we live in a universe that is very unlikely but that allows our existence – although he swiftly added that he did not consider this appropriate. In his view, supersymmetry remains the best solution to difficulties such as the hierarchy problem and, if this is the case, the LHC should find the light supersymmetric particles. The LHC is thus heavily charged with the expectations of the worldwide particle-physics community. If all goes well, results from the new collider should indeed dominate the next meeting in the Lepton Photon series, which is to be held in Mumbai in 2011.

The fascinating world of strange exotic atoms

CCeds1_01_10

The field of exotic atoms has a long history and it is currently experiencing a renaissance, from both the experimental and theoretical points of view. On the experimental side, new hadronic beams are either already available, with kaons at the DAΦNE facility at Frascati, or will soon become available with the start-up of the Japan Proton Accelerator Research Complex (J-PARC). New detectors, with improved performance in energy resolution, stability, efficiency, trigger capability etc, are also starting to operate. On the theoretical side the field has advanced significantly through recent developments in chiral effective-field theories and their applications to hadron–nuclear systems. In light of these developments it was appropriate for the international workshop “Hadronic atoms and nuclei – solved puzzles, open problems and future challenges in theory and experiment” to address these topics on 12–16 October 2009, at the European Centre for Theoretical Studies in Nuclear Physics and related areas, ECT*, Trento.

Unique methods

So what are hadronic atoms and why is there a growing interest in studying them? An exotic hadronic atom is formed whenever a hadron (pion, kaon, antiproton) from a beam enters a target, is stopped inside and replaces an orbiting electron. Such an exotic atom is usually formed in a highly excited state; a process of de-excitation through the respective atomic levels then follows. The X-ray transitions to the lowest orbits (1s) are affected by the presence of the strong interaction between the nucleus and the hadron, which shifts the 1s level with respect to the value calculated on a purely electromagnetic basis and limits the lifetime (increases the width) of the level. Extracting these quantities via the measurement of the X-ray transitions provides fundamental information on the low-energy hadron–hadron and hadron–nuclear interactions, which is impossible to obtain by any other method. Quantities such as kaon–nucleon scattering lengths, for example, turn out to be directly accessible by measuring the properties of exotic atoms. These are key quantities for dealing in a unique way with important aspects of low-energy QCD in the strangeness sector, such as chiral-symmetry breaking.
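To set the scale of the quantities involved, the purely electromagnetic 1s level can be estimated from the Bohr formula with the hadron–nucleus reduced mass. The sketch below does this for kaonic hydrogen; it is a back-of-the-envelope estimate under a point-Coulomb assumption, not the precise QED computation the experiments actually compare against, and the variable names are my own.

```python
# Bohr-model estimate of the electromagnetic levels of kaonic hydrogen,
# i.e. a K- orbiting a proton in place of the electron. The measured
# strong-interaction shift of the 1s level is defined relative to such a
# purely electromagnetic value (computed far more precisely in practice).
ALPHA = 1.0 / 137.035999   # fine-structure constant
M_KAON = 493.677e6         # K- mass, eV/c^2
M_PROTON = 938.272e6       # proton mass, eV/c^2

# Reduced mass of the K-p system, eV/c^2 (about 324 MeV/c^2): the kaon is
# roughly 1000 times heavier than the electron, hence keV-scale X-rays.
mu = M_KAON * M_PROTON / (M_KAON + M_PROTON)

e_1s = -mu * ALPHA**2 / 2.0       # 1s binding energy, eV (about -8.6 keV)
e_kalpha = (e_1s / 4.0) - e_1s    # 2p -> 1s X-ray energy, eV (about 6.5 keV)
```

The ~6.5 keV scale of the 2p→1s transition is why the measurements described below sit in the X-ray regime; the strong-interaction shift and width appear as a displacement and broadening of this line.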

The DAΦNE Exotic Atoms Research (DEAR) experiment has measured kaonic hydrogen with unprecedented precision, which led to a lively debate at the workshop on the procedure for extracting the kaon–proton scattering length as well as its compatibility with existing kaon–nucleon scattering data. The SIDDHARTA collaboration, also at DAΦNE, presented the results of an even more precise measurement performed in 2009 on kaonic hydrogen, which will be complemented with an exploratory measurement of kaonic deuterium. The E570 experiment at KEK and SIDDHARTA have both measured kaonic helium and found that there is agreement with theory, thereby solving the “kaonic helium puzzle” – a long-standing discrepancy between measured and theoretical values for the 2p level in 4He. The new E17 experiment planned at J-PARC will in the near future measure the X-ray spectrum of kaonic 3He with the highest precision. With other experiments already in the pipeline at existing and/or future machines at GSI, J-PARC and DAΦNE, the future of hadronic atoms will extend its horizons both in terms of precision as well as in dealing with new types of exotic atoms not previously measured, such as kaonic deuterium or sigmonic atoms (where a sigma replaces an electron).

CCeds2_01_10

Another hot topic that was intensively discussed at the workshop concerns the recent studies of antikaon-mediated bound nuclear systems. Theory originally suggested that the (strongly attractive) isospin I=0 K̄N interaction in few-body nuclear systems could favour the formation of discrete, narrow K̄-nuclear bound states with large binding energies (100 MeV or even more). However, recent work suggests that such deeply bound kaonic nuclear states do not exist: antikaon–nuclear systems might be only weakly bound and short-lived. There are different interpretations of the existing experimental results, based, for example, on the interaction of negative kaons with two or more nucleons. This topic is related to a new puzzle in the physics of kaon–nucleon interactions – the nature of the Λ(1405): does it have a single- or double-pole structure? There were long discussions about this at the workshop.

New frameworks

All of these topics have important consequences in astrophysics, for example, in the physics of neutron stars. The workshop reviews of experimental results covered experiments at KEK, Brookhaven and Dubna, as well as FINUDA at DAΦNE, the FOPI detector at GSI, OBELIX at the former Low-Energy Antiproton Ring at CERN, and the DISTO detector at the former Saturne laboratory in France. There was also a critical review of current theories and models. Discussions about future perspectives centred on an integrated strategy in which complementary facilities should bring together the various pieces of the overall puzzle. Among these are experiments proposed at J-PARC (E15, E17), GSI (upgrades of the FOPI and HADES detectors) and DAΦNE (AMADEUS), together with the possibility of using antiprotons to create single- and double-strangeness nuclei at CERN, J-PARC or the Facility for Low-energy Antiproton and Ion Research at GSI.

The workshop proved that the field of hadronic atoms and kaonic nuclei is active. While some puzzles, such as those concerning kaonic hydrogen and kaonic helium, are now solved thanks to the newer experiments (E570 at KEK, DEAR and SIDDHARTA at DAΦNE), many problems remain unresolved, or “open”. The workshop formulated and targeted important questions that still need experimental results and deeper theoretical understanding. There are many future challenges in both the experimental and theoretical sectors, which were formulated within a single framework for the first time.

There was also a round-table discussion, led by Avraham Gal of the Hebrew University of Jerusalem, which dealt with the search for K̄-nuclear bound states. This proved extremely useful because it established common ground on what information (i.e. experimental results) could bring light to the field in future. This is important because new experiments are about to start, including AMADEUS, E15 and the upgrades to HADES and FOPI.

The five-day workshop also included a visit to the Fondazione Bruno Kessler (FBK) centre for scientific and technological research. This gave the opportunity for the FBK to demonstrate its capacity to perform research in the field of frontier detectors for future experiments and to establish contacts with experiments that are potentially interested in such developments. In addition, there were presentations of the EU Seventh Framework Programmes (FP7), with Carlo Guaraldo of LNF-INFN Frascati describing the HadronPhysics2 project. In particular, experimentalists and theoreticians came together in a session dedicated to the LEANNIS Network in HadronPhysics2 FP7 – a network that focuses on low-energy antikaon–nucleon and nucleus interactions – in which topics and perspectives in the field were presented and discussed.

One important success of the workshop was that young people made up around half of the participants and that researchers from many countries took part, including Israel and Iran. This made it an occasion for not only scientific exchanges but for cultural and social ones as well, proving once again that scientists are part of society, with an important role.

Combined HERA data set the scene for the LHC

CCher1_01_10

High-energy physics experiments address fundamental questions using large facilities and complex detectors, which often use innovative detection techniques. It is usual to build and operate more than one such detector at the same accelerator – to confront, compare and eventually merge the measurements. Combining measurements made by similar detectors becomes feasible and ultimately mandatory when these detectors are well understood and tested with many physics analyses. This step was achieved recently by the H1 and ZEUS experiments, which took data at DESY’s HERA collider from 1992 until 2007.

HERA was the only electron–proton collider ever built, providing collisions between electrons or positrons of 27.5 GeV and protons of up to 920 GeV to give a centre-of-mass energy of 320 GeV. The data collected at HERA are unique and have led to precise measurements of the proton structure, in particular in the region of low Bjorken-x, below 0.01, where no other measurement exists. At HERA, the point-like electron probes the gluon-fabric of the proton down to scales as small as 1/1000 of the proton’s radius. These measurements provide a clean testing ground for the Standard Model. Furthermore, searches for new physics signals at HERA are complementary to searches made at other colliders.

So far, H1 and ZEUS have published individual measurements investigating a plethora of different processes in more than 400 scientific articles. Now, however, three joint publications have been submitted to the Journal of High Energy Physics for the first time. These combinations of the H1 and ZEUS data into coherent analyses represent a new paradigm in this field of research.

Universal structure functions

Combining data sets improves individual measurements because the amount of information increases. The theory of statistics states that uncertainties diminish by a factor of √2 when the amount of data is doubled. When systematic uncertainties are taken into account, however, the effects are more subtle: there is no gain for errors correlated between the experiments. Typical examples of this kind are theoretical calculations that are needed to extract the experimental results. Uncertainties that are fully uncorrelated (not only between the experiments but also from one measurement point to the next) behave like statistical errors and are reduced by the magical factor of √2. Finally, the most interesting case comes from errors that are correlated within each experiment, but uncorrelated between experiments. One example is the energy scale of the calorimetric measurements: the technologies of the calorimeters in the two experiments are different and they are calibrated using independent procedures. Hence the respective errors are independent between the experiments but are nevertheless correlated from one measurement to the next within each experiment. These uncertainties are reduced by more than the usual factor of √2. This can basically be seen as an effect of cross-calibrating the detectors with respect to each other using a large number of independent measurements.
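The arithmetic can be sketched for the simplest case of two equally precise measurements. This is a toy illustration only – the function name is my own, and the actual H1+ZEUS combination fits many correlated systematic sources simultaneously, which is what produces the better-than-√2 cross-calibration gain described above.

```python
import math

def combine_two(x1, x2, sigma_uncorr, sigma_corr):
    """Average two equally precise measurements of the same quantity.
    sigma_uncorr: error independent between the two experiments,
                  which averages down by the factor sqrt(2);
    sigma_corr:   error fully shared by both, which does not average down."""
    mean = 0.5 * (x1 + x2)
    err_uncorr = sigma_uncorr / math.sqrt(2.0)   # the "magical" factor
    return mean, math.hypot(err_uncorr, sigma_corr)

# Two hypothetical measurements with a 4% uncorrelated error each and a
# 1% error shared between them:
mean, err = combine_two(1.02, 0.98, sigma_uncorr=0.04, sigma_corr=0.01)
# mean = 1.00; err = sqrt(0.04**2 / 2 + 0.01**2) = 0.03
```

The uncorrelated 4% shrinks to about 2.8%, while the shared 1% survives untouched: exactly the pattern of gains and non-gains described in the text.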

CCher2_01_10

The paper on the measurement of inclusive deep-inelastic scattering cross-sections, submitted for publication by the H1 and ZEUS collaborations, combines 1402 individual measurements from 14 publications into 741 cross-section measurements of unprecedented precision. All of the available data on neutral- and charged-current interactions taken during the first phase of HERA running, from 1992 to 2000, are used. The data cover virtualities, Q2, of the exchanged bosons from 0.2 GeV2 up to the highest values reachable at HERA, around 30,000 GeV2, and values of Bjorken-x as small as 0.2 × 10–6. These data extend into the electroweak regime from regions where perturbative QCD has never been tested. At small values of Bjorken-x, x < 10–2, no other measurements exist. In this region the gain from the combination process is impressive: the individual measurements are dominated by systematic errors, which are drastically reduced, down to as little as 1%. These cross-sections depend on the universal proton structure functions, F2, xF3 and FL, which encapsulate the parton content of the proton. The structure function F2 dominates over most of the phase space, except at high Q2, where parity-violating weak effects lead to a non-zero contribution from xF3, and at large y, where the longitudinal part of the cross-section arising from gluon radiation leads to a sizeable FL.
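In standard notation these contributions combine in the neutral-current reduced cross-section for e±p scattering; this is the conventional form, with inelasticity y and Y± = 1 ± (1 − y)2, and the combination paper's exact conventions may differ in detail:

```latex
\sigma_{r,\mathrm{NC}}^{\pm}(x,Q^2) \;=\; F_2 \,\mp\, \frac{Y_-}{Y_+}\,xF_3 \,-\, \frac{y^2}{Y_+}\,F_L,
\qquad Y_{\pm} \,=\, 1 \pm (1-y)^2 .
```

At moderate y both the xF3 and FL terms are suppressed by their kinematic prefactors, which is why F2 dominates over most of the phase space.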

Figure 1 shows parts of the universal structure function, F2, as a function of the variable x, for various values of the photon virtuality, Q2. The rise of F2 towards low x, discovered in the first years of HERA, is confirmed with a precision approaching 1%. As Q2 grows, F2 becomes steeper towards low x, reflecting the contributions to the quark component from gluons fluctuating into quark–antiquark pairs. This rise is a fundamental discovery and reveals the role of gluons in binding nuclear matter. It is possible to decompose this structure function into one part that arises from “hard scattering” (so-called coefficient functions) and another, non-perturbative part, which reflects the partonic content of the proton. Using the new data, the collaborations have extracted a new set of parton-distribution functions (HERAPDF 1.0), shown on the left in figure 2 for a high photon virtuality, Q2 = 10,000 GeV2. This partonic content is universal and can be used to make predictions for other processes involving protons – for example, cross-sections in proton–proton collisions at the LHC.

One example within the Standard Model is the production of single weak bosons at the LHC. This process can be regarded as a “standard candle” and can even be used to determine the luminosity of the collider, because the measurement can be made with great accuracy. The precision of the corresponding theoretical predictions is dominated by the uncertainties in the proton's parton distributions, which in turn come from the measurements at HERA (figure 2, right).

Ultimately, the H1 and ZEUS measurements provide the standard candle against which any new phenomenon at the LHC in the mass range of up to a few hundred giga-electron-volts will have to be compared. The new physics may well be in this range – in which case a precise knowledge of the production cross-section would be crucial in order to explore the properties of these new particles.

New physics arises in many theoretical extensions of the Standard Model. According to these extensions, a peak in a mass spectrum or a deviation in a certain variable should be observable. However, new physics can also manifest itself beyond the “standard predictions” and show up as spectacular events in regions of phase space where, according to the Standard Model, only a few events should be seen. Events with energetic isolated leptons are an example of such a golden channel. Experimentally, they provide a clean signature; theoretically, they benefit from robust predictions.

The experiments at HERA reported the observation of events with isolated leptons (electrons or muons) and missing transverse momentum as early as 10 years ago. In the Standard Model this topology is explained by the production of a W boson, which decays to an energetic charged lepton and a neutrino. The neutrino escapes undetected, leading to “missing” momentum. The observation of such a rare process (typically one such event is recorded in 10 million other events) is a challenge and requires the full experimental information of the multilayer/multipurpose H1 and ZEUS detectors. Some of the observed events also contain a prominent hadronic jet – which makes them unlikely as W candidates because any hadronic recoil would typically be produced at low transverse momentum.

CCher3_01_10

H1 observed a discrepancy with the Standard Model amounting to as much as three standard deviations, but with no effect seen in ZEUS. To clarify this point, the collaborations undertook a joint analysis effort. They investigated carefully all of the differences and studied all of the systematic effects. The individual results stand up to this scrutiny. By interpreting the difference as a statistical fluctuation, the two experiments can perform a common analysis. This leads to a decrease in the significance of the observed excess for events with large hadronic transverse momentum to below two standard deviations; it also improves significantly the measurement of the W cross-section (figure 3). Thus, this measurement becomes an important confirmation of the weak sector of the Standard Model in a unique configuration.

CCher4_01_10

The third joint paper deals with events with more than one charged lepton that are dominantly produced by photon–photon collisions, the photons originating from the colliding electrons and protons. In individual analyses, H1 and ZEUS found a few hundred events containing several leptons, both electrons and muons, at high transverse momentum, including some events where the scalar sum of the lepton momenta exceeds 100 GeV. In a combined analysis, seven events are observed in this region in positron–proton collisions for an expected number of 1.9 ± 0.2, while no such event is observed in electron–proton collisions for a similar expectation (figure 4). The observation of the excess in positron–proton collisions is still compatible with the Standard Model and is interpreted as a statistical fluctuation. However, this observation stimulates discussion because it is also possible to attribute the excess to a bilepton resonance, such as a doubly charged Higgs boson, H++, produced in electroweak interactions.

These combined measurements from H1 and ZEUS are the first in a series of legacy results from the unique electron–proton collider, HERA. More than 20 years after the start of the facility and two years after the end of the data-taking, the harvest is in its best phase. This is also good for the LHC.

The Blois Workshop comes to CERN

CCeds1_01_10

The 13th International Conference on Elastic and Diffractive Scattering – the “Blois Workshop” – dates back to 1985, when the first meeting was held in the picturesque, old French town of Blois, famous for the 14th-century Royal Château de Blois. The conference series continues to focus on progress towards understanding the physics of hadronic interactions at high energy. A major strength of the meetings is the way in which they facilitate detailed discussion between theorists and experimentalists, thereby motivating new ways of formulating theoretical approaches and confronting them with experimental measurements – past, present and future.

More than 100 participants from 18 countries attended the latest meeting in the series, held at CERN on 29 June – 3 July 2009. The relatively informal manner of the 70 talks encouraged discussion. Appropriately, given the imminent start-up of the LHC, the following topics featured prominently: the total proton–proton (pp) cross-section; elastic pp scattering; inelastic diffractive scattering in electron–proton (ep), pp and heavy-ion collisions; central exclusive production; photon-induced processes; forward physics and low-x QCD; and cosmic-ray physics.

Theoretical developments

On the theoretical side, important aspects of soft diffraction were nicely introduced by Alexei Kaidalov of the Institute of Theoretical and Experimental Physics (ITEP) in Moscow, who emphasized factorization effects and unitarization in the framework of Reggeon calculus. Although everyone anticipates that the total pp cross-section will continue to rise with increasing energy – following the pioneering prediction of H Cheng and T T Wu in 1970 – a number of contributions made distinct predictions for its value at LHC energies – typically ranging between 90 mb and 140 mb, with surprising predictions as high as 250 mb. Several other features of elastic scattering at LHC energies were also considered within the framework of different models that were successful at lower energies. André Martin of CERN, with his long-established theoretical rigour, reported on a new limit for the inelastic cross-section.

The central production of various exclusive final states with one or two “leading protons” – Higgs production at the LHC, in particular – was also a source of much debate. This subject challenges different approaches in QCD, notably the “gluon ladder”, and how these approaches relate to the long-standing theoretical construct, the Pomeron. Douglas Ross of Southampton University presented an interesting treatment of the Balitsky–Fadin–Kuraev–Lipatov (BFKL) kernel of such a ladder, based on the extraction of the low-x gluon distribution in experiments at the HERA ep collider. The issue of the “rapidity-gap survival probability” as an explanation for substantial factorization-breaking in inelastic diffraction in hadron–hadron collisions (as opposed to ep collisions) continues to challenge theory and is important when developing models for central Higgs production. Mark Strikman of Penn State University presented a notable proposal of a new sum rule.

The workshop devoted a full day to contributions dealing with the physics of QCD at various extremes, such as at the lowest parton fractional momenta (low-x QCD) and at the highest densities achievable, e.g. in heavy-ion collisions. Emil Avsar and Tuomas Lappi of CEA/Saclay and Francesco Hautmann of Oxford University reviewed the physics of gluon saturation and possible modifications of the standard QCD evolution equations at tiny values of Bjorken-x. A second topic, summarized by Raphael Granier de Cassagnac of the Laboratoire Leprince-Ringuet, Gines Martinez of SUBATECH, and Jean-Yves Ollitrault of Saclay, concerned studies of the collective behaviour of a multiparton system in a hot, dense state such as a quark–gluon plasma. Various other talks covered the latest experimental and theoretical developments in each of these two active research areas of the strong interaction, all with prospects at the LHC very much in mind.

Experimental highlights

Presentations on experimental developments highlighted the challenge of diffractive physics and the way that it relies on a particularly close symbiosis of measurement and theory. The phenomenology of elastic pp scattering, based on long-standing measurements at the Intersecting Storage Rings at CERN and later experiments at CERN and Fermilab, continues within either “classic Regge” or “geometrical” approaches. The latter is now beginning to produce a “transverse” view of the proton’s structure, as Richard Luddy of Connecticut University explained. Such understanding will be testable in the near future in deep-exclusive lepton-scattering experiments, for example in COMPASS at CERN where, as Oleg Selyugin of JINR described, such measurements may be interpreted in terms of generalized parton distributions.

As at all meetings since EDS returned to Blois in 1995, there were reports from the experiments at HERA on the status of the deep-inelastic structure of the diffractive interaction, this time by Henri Kowalski of DESY and Alexander Proskuryakov of Moscow State University. The impressive precision of the data reveals beautiful features that demonstrate the quark and gluon components of the t-channel (i.e. the leading) exchange mechanism. Put differently, the data are sensitive to the parton structure of the proton’s diffractive interaction. Results on the scale dependence of these leading exchanges, measured at HERA in exclusive meson production, now provide precise data with which QCD theory has to be reconciled, as Pierre Marage of the Université Libre de Bruxelles explained.

The main experimental highlights came, arguably, from the CDF experiment at Fermilab’s Tevatron, with measurements of central exclusive two-photon production (pp → ppγγ) and di-jet production (pp → pp+2 jets), presented by James Pinfold of Alberta University, Christina Mesropian and Konstantin Goulianos of Rockefeller University, and Michael Albrow of Fermilab. Both processes are important precursors for the exclusive Higgs search at the LHC; the agreement of predictions made prior to the measurements with the data is an important milestone in the preparation for exclusive Higgs hunting – appropriately christened “Higgs with no mess” by the experimentalists concerned.

A session dedicated to ultrahigh-energy cosmic-ray observations underlined their complementarity to collider measurements for understanding hadronic interactions, as Jörg Hörandel of Radboud University, Nijmegen, explained. Alessia Tricomi of INFN/Catania University pointed out that forward experiments, in particular, can contribute valuable data to the development of models of air showers.

With its first data, the LHC will already provide new measurements that are crucial to this active field.

Other topics at the meeting included photon-induced processes from the BaBar and Belle experiments, with reviews of relevant heavy-ion results from RHIC at Brookhaven and prospects for the LHC. Looking further into the future, Paul Newman of Birmingham University reported on possibilities for ep and electron–ion interactions at an LHeC.

Given the venue of EDS ’09, perhaps the most appropriate session was the one concerned with new experiments. Taking advantage of the unique breadth of expertise present at an EDS meeting, a panel discussion took place between representatives of theory and experiments, moderated by Karsten Eggert of Case Western Reserve University and CERN. It provided the opportunity to exchange ideas about which measurements to carry out first at the LHC, how to create synergies between different experiments, and what future upgrade possibilities exist for the forward proton detectors. Several new ideas for possible measurements at the LHC were proposed and discussed. The meeting ended with a strong sense of anticipation, given the imminent diffractive data at a new energy scale from the first run of the LHC.
