Super-Kamiokande gets full refit

Operation of the Super-Kamiokande II (SK-II) detector in Japan was terminated last October, after three years of running, to allow a full restoration of the detector. Precision studies of neutrinos will resume next June.

The SK detector consists of a cylindrical tank containing 50,000 tonnes of pure water viewed by about 11,000 photomultipliers (PMTs) of 50 cm diameter. The water tank is 40 m in height and 40 m in diameter, and is located 1000 m underground. Neutrinos interact with the water and give rise to Cherenkov light, which provides information about the neutrino energy, direction and type or flavour. In 1998, the collaboration announced that neutrinos change flavour – oscillate – which is possible only if the particles have mass. The evidence came from observing neutrinos created by cosmic-ray interactions in the atmosphere. This was followed in 2001 by evidence for the oscillations of solar neutrinos in the combined data from SK and the Sudbury Neutrino Observatory. More recently, the KEK-to-Kamioka (K2K) experiment, using a man-made neutrino beam sent from KEK to the SK detector, has confirmed the oscillations observed in atmospheric neutrinos.

Several thousand PMTs in the detector were destroyed in November 2001, when the shock wave from the implosion of one PMT at the bottom of the tank triggered a chain reaction of implosions in more than half the PMTs (see CERN Courier January/February 2002 p6). In 2002, the detector was partially reconstructed using about 5000 PMTs encased in plastic covers to avoid a similar accident. This partial reconstruction was completed in only a year in order to continue the K2K experiment. After three years of operation as SK-II, with half the original density of PMTs, the long-awaited full reconstruction of the detector has now begun. Next June, the detector’s third phase, SK-III, will start to take data again.

The discovery of neutrino oscillations has opened up a new window of research with a variety of subjects for SK to tackle. An experiment using an intense neutrino beam from Tokai – Tokai-to-Kamioka (T2K) – is expected to start in 2009. The beam will be produced by a 50 GeV proton synchrotron being constructed at the Japan Proton Accelerator Research Complex in Tokai (see CERN Courier November 2004 p41). SK-III will be the far detector at a distance of 295 km from the beam-production point. The T2K experiment will determine neutrino oscillation parameters precisely and search for effects of the neutrino mixing angle, θ13, which is so far unobserved.
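
As a rough illustration of why this baseline suits the measurement, the two-flavour oscillation formula fixes the neutrino energy of the first oscillation maximum once the baseline is chosen. A minimal sketch in Python, assuming an atmospheric mass splitting Δm² ≈ 2.5 × 10⁻³ eV² typical of fits at the time (an assumed input, not a number from this article):

```python
import math

# Two-flavour muon-neutrino survival probability:
#   P(nu_mu -> nu_mu) = 1 - sin^2(2*theta23) * sin^2(1.267 * dm2 * L / E)
# with dm2 in eV^2, L in km and E in GeV.
def survival_prob(E_GeV, L_km=295.0, dm2=2.5e-3, sin2_2theta=1.0):
    phase = 1.267 * dm2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# First oscillation maximum: phase = pi/2  ->  E = 1.267 * dm2 * L / (pi/2).
E_peak = 1.267 * 2.5e-3 * 295.0 / (math.pi / 2)
print(f"first oscillation maximum: E = {E_peak:.2f} GeV")           # ~0.6 GeV
print(f"survival probability there: {survival_prob(E_peak):.2f}")   # ~0 for maximal mixing
```

For L = 295 km this puts the first maximum near 0.6 GeV, which is why a beam tuned to sub-GeV energies is the natural choice for the experiment.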

A longer exposure to atmospheric neutrinos will be important in searching for a resonant matter effect in the Earth and may help to resolve the octant ambiguity in the mixing angle θ23. At the lower energies of solar neutrinos, an up-turn in the spectrum is expected as direct evidence for large-mixing-angle solutions and will provide precise oscillation parameters. The higher statistics from several years of exposure should allow this measurement.

SK could also detect several thousand neutrino interactions from a galactic supernova. Such a large number of events would reveal details of the supernova explosion mechanism, as well as information on the properties of neutrinos. The positive identification of electron-antineutrinos in SK could also be possible in future. Neutrons emitted in antineutrino interactions could be detected through the 2.2 MeV gamma-rays emitted by neutron capture on protons and through interactions with gadolinium dissolved in the pure water.

Lastly, the detection of nucleon decay as predicted by grand unified theories has always been one of the primary topics for SK. Sensitivity to the decay mode p → e⁺ + π⁰ will soon reach the level corresponding to a lifetime of 10³⁴ years. Decay modes favoured by supersymmetry, which include K mesons in the final state, will become interesting with a longer exposure in SK-III, and the collaboration hopes to observe the first indication of nucleon decay in the near future.
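
The quoted sensitivity can be checked on the back of an envelope: a background-free partial-lifetime limit scales with the number of protons being watched and the live time. A sketch assuming SK's 22.5 kton fiducial mass, an illustrative 40% signal efficiency and a background-free 90% CL limit of 2.44 events (these specific inputs are assumptions, not numbers from the article):

```python
# Order-of-magnitude sensitivity for p -> e+ pi0 in a water Cherenkov detector.
AVOGADRO = 6.022e23
fiducial_grams = 22.5e9            # 22.5 kton fiducial mass (assumed)
protons_per_h2o = 10               # 2 free + 8 bound protons per molecule
n_protons = fiducial_grams / 18.0 * AVOGADRO * protons_per_h2o

efficiency = 0.40                  # assumed signal efficiency
n_limit = 2.44                     # 90% CL limit for zero events, no background

def lifetime_limit(exposure_years):
    """Partial-lifetime limit tau/B (years) after a given live time."""
    return n_protons * exposure_years * efficiency / n_limit

print(f"protons in fiducial volume: {n_protons:.2e}")                 # ~7.5e33
print(f"limit after 10 live years:  {lifetime_limit(10):.1e} years")  # ~1e34
```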

Belle achieves new luminosity record

By mid-afternoon on 22 November, the Belle experiment at KEK had accumulated an integrated luminosity of 500 fb⁻¹ of electron-positron collision data. This total marks a milestone in the progress of the KEKB accelerator and the Belle experiment, which began operation in 1999. It is equivalent to 5 × 10⁴¹ crossings of electrons and positrons per square centimetre. More than 500 million pairs of B and B̄ mesons have been generated in the collisions.
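
The two quoted figures are related by a unit conversion, and the B-pair count follows from one multiplication by the Υ(4S) production cross-section, taken here to be about 1.1 nb (an assumed textbook value, not a number from the article):

```python
# 1 fb^-1 = 1e39 cm^-2, so 500 fb^-1 is 5e41 crossings per square centimetre
# of cross-section; multiplying by sigma gives the number of events.
L_int_cm2 = 500.0 * 1e39             # integrated luminosity in cm^-2
sigma_bb_cm2 = 1.1e-33               # ~1.1 nb in cm^2 (1 nb = 1e-33 cm^2)

n_bb_pairs = L_int_cm2 * sigma_bb_cm2
print(f"integrated luminosity: {L_int_cm2:.1e} cm^-2")   # 5.0e+41
print(f"B-Bbar pairs produced: {n_bb_pairs:.1e}")        # ~5.5e+08
```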

The original challenge for KEKB was to achieve 100 fb⁻¹ in three years. The total of 500 fb⁻¹ in 6.5 years surpasses this goal, and the group now aims to set even higher records with various upgrades to the machine.

Superconducting RF technology forum unites research and industry

Representatives from European research and industry have established the European Industry Forum for Accelerators with Superconducting RF Technology, EIFast. More than 30 companies and institutes from nine countries sent a total of 64 participants to a meeting to found the forum at DESY on 27 October. They agreed on the forum’s statutes and elected the members of the coordination board.

The proposal to create the forum resulted from the considerable industrial interest triggered by several large accelerator projects based on superconducting RF (SCRF) technology, in particular the approved X-ray free-electron laser, XFEL, and the planned International Linear Collider. Both projects use SCRF technology, which has been substantially advanced during the past decade by the TESLA Technology Collaboration. In addition, the TESLA test facility at DESY, built with involvement from European companies, has added to the solid base of expertise in SCRF accelerators in European industry.

Against this background, it was concluded that a forum would further strengthen the excellent position of European science and industry in SCRF technology. Moreover, similar bodies have been established in both the US and Japan.

Members of European research centres and industrial companies decided to found EIFast at a meeting at DESY in April 2005. Its scope includes all systems and components needed for an SCRF accelerator, including supplies and services. Acting as a common voice for European research and industry, the forum will now try to promote the realization of SCRF projects in a coherent way.

The forum aims to bring together research institutes that work in SCRF technology, or are interested in becoming involved, with industrial companies interested in supplying products to projects based on the technology. The main tasks of the forum include generating support for projects at the political level in Europe, ensuring a flow of up-to-date information about projects between institutes and companies, promoting the involvement of industry in projects at an early stage, and helping members gain access to information channels and decision-makers that would otherwise be difficult to reach.

HERA hits record annual luminosity

In its 2005 run, DESY’s HERA collider achieved the largest integrated luminosity it has ever produced in one year. Colliding 27.5 GeV electrons with 920 GeV protons, HERA delivered a total of 213 pb⁻¹ to the experiments H1 and ZEUS in 318 days of running. Compared with the positron-proton luminosity production of 2004, the integrated luminosity and the average luminosity were increased by factors of 2.2 and 1.5, respectively. The peak luminosity reached 5.1 × 10³¹ cm⁻² s⁻¹ – the design luminosity for the upgraded HERA collider.
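
A quick consistency check, using only the numbers quoted above, shows how far the time-averaged output of a collider sits below its peak:

```python
# Convert the integrated luminosity into a time-averaged rate and compare
# with the quoted peak value.
L_int = 213.0 * 1e36                  # 213 pb^-1 in cm^-2 (1 pb^-1 = 1e36 cm^-2)
seconds = 318 * 86400                 # 318 days of running

L_avg = L_int / seconds
L_peak = 5.1e31                       # cm^-2 s^-1, quoted peak

print(f"average luminosity: {L_avg:.1e} cm^-2 s^-1")   # ~7.8e30
print(f"fraction of peak:   {L_avg / L_peak:.0%}")     # ~15%
# The gap reflects time spent filling, ramping and tuning, and the decay of
# the luminosity during each fill, rather than running at peak all year.
```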

This success is particularly remarkable since, compared with running with positrons, additional complications were expected for electron-proton collisions, owing to increased synchrotron radiation in the interaction regions and degradation of the electron-beam lifetime. The synchrotron-radiation problems were successfully avoided by improved beam control. Problems with the electron-beam lifetime proved rare and were not relevant for luminosity production. Nevertheless, the electron-beam current reached only 90% of the positron currents of 2004.

The proton intensity improved slightly in 2005 due to improved beam transfer from the injector, while the specific luminosity (luminosity/current) increased considerably beyond the design value owing to the additional focusing of the electron beam by the beam-beam force. The large beam-beam forces made longitudinal electron-spin polarization more difficult: the average peak polarization decreased from 50% in 2004 to 45% in 2005. All in all, however, the operating efficiency improved noticeably compared with the previous running.

Operation at HERA is scheduled to resume at the end of January 2006. Various improvements added during the shutdown – such as refurbished normal-conducting magnet coils, enhanced RF interlocks, and active damper and feedback systems – will further improve the availability, peak luminosity and background conditions for the run during 2006.

Auger observatory celebrates progress

On 10 November, the Pierre Auger Observatory (PAO) began a major two-day celebration at its headquarters in Malargüe, Argentina, to mark the progress of the observatory and the presentation of the first physics results at the International Cosmic Ray Conference in the summer of 2005. One of several experiments connecting particle astrophysics and accelerator-based physics, the PAO studies extensive air showers created by primary cosmic rays with energies greater than 10¹⁸ eV. With more than 1000 of the 1600 surface detectors and 18 of the 24 fluorescence detectors currently installed and operating, the observatory will eventually cover 3000 km² of the expansive Pampa Amarilla.

Over 175 visitors from the 15 collaborating countries attended the celebration, with guests including heads of collaborating institutions, representatives from supporting funding agencies, delegates from Argentinian embassies, local and provincial authorities, plus press and media teams. On the first day, experiment heads Jim Cronin, Alan Watson and Paul Mantsch presented the history and status of the observatory to the assembled visitors in Malargüe’s Convention Center. This was followed by a ceremony on the Auger campus to unveil a commemorative monument made of glass and stone. Ceremony speakers included Malargüe’s mayor and the governor of Mendoza Province. Guests then retired to a traditional asado that featured local cuisine and entertainment by folk musicians and tango dancers. On the second day, attendees toured the vast observatory site, including surface detectors on the pampa and one of the remote fluorescence detector buildings.

As part of the celebration, the collaboration sponsored a science fair in the observatory’s Assembly Building, organized by four local science teachers for teachers and students from high schools in Mendoza Province. Twenty-nine school groups, many travelling long distances to reach Malargüe, presented research projects on topics in physics, chemistry or technology. A team of PAO physicists judged the displays and awarded prizes to the most outstanding young scientists. A new high school in Malargüe, for which Cronin secured partial funding from the Grainger Foundation in the US, is expected to open in March 2006.

INTEGRAL reveals Milky Way’s supernova rate

One supernova explosion every 50 years in our galaxy: that is the rate that a European team has determined from observations with ESA’s INTEGRAL gamma-ray satellite. This figure is based on the amount of gamma-ray radiation emitted by radioactive aluminium produced in core-collapse supernovae.

With a lifetime of about a million years, radioactive aluminium (²⁶Al) is an ideal tracer of ongoing nucleosynthesis in the galaxy. The decay of ²⁶Al emits a gamma-ray line at an energy of 1.809 MeV. NASA’s Compton Gamma-Ray Observatory found in the 1990s that this characteristic emission is distributed along the plane of the Milky Way, as expected if ²⁶Al is mainly produced by massive stars throughout the galaxy. It remained unclear, however, whether the dominant emission towards the centre of the galaxy was due to relatively nearby star-forming regions on the line of sight or to the central region itself.

It is this uncertainty that has now been lifted thanks to INTEGRAL’s very high spectral resolution. The peak of the emission from ²⁶Al was found to be shifted towards higher energies east of the galactic centre and towards lower energies on the west side. These observations are consistent with the expected Doppler shift due to the rotation of the galaxy. They show that the ²⁶Al emission does follow the global galactic rotation and hence comes from the inner part of the galaxy rather than from foreground regions.

The team, led by Roland Diehl of the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, used these line-shift measurements to constrain the best model for the spatial distribution of ²⁶Al in the galaxy. This distribution was then used to convert the observed gamma-ray flux into the total mass of ²⁶Al in the Milky Way, which was found to be about three times the mass of the sun. Finally, using results from theoretical nucleosynthesis models for a typical stellar population, the team could estimate the current star-formation rate in the galaxy to be about 7.5 stars a year, corresponding to a rate of core-collapse supernovae of 1.9 (±1.1) events a century.
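
The final step follows from a steady-state argument: if ²⁶Al is produced continuously and decays with mean lifetime τ, the equilibrium mass equals the production rate times τ. The ²⁶Al yield per supernova used below (about 1.4 × 10⁻⁴ solar masses) is an assumed, model-dependent figure, inserted only to reproduce the logic:

```python
# Steady state: M = production_rate * tau  =>  production_rate = M / tau.
# Dividing by an assumed average 26Al yield per core-collapse supernova
# turns the production rate into a supernova rate.
M_al26 = 2.8           # solar masses of 26Al in the galaxy (quoted ~3)
tau = 1.04e6           # years, mean lifetime of 26Al
yield_per_sn = 1.4e-4  # solar masses of 26Al per supernova (assumption)

production_rate = M_al26 / tau                # solar masses per year
sn_rate = production_rate / yield_per_sn      # supernovae per year
print(f"26Al production rate: {production_rate:.1e} Msun/yr")
print(f"supernova rate: {sn_rate * 100:.1f} per century")   # ~1.9
```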

The rate of two supernovae a century in the Milky Way is consistent with the rate derived from supernovae detected in other galaxies, but is several times the rate deduced from historic observations of supernova explosions during the past 2000 years. Only eight such events have been recorded – in 185, 386, 393, 1006, 1054, 1181, 1572 and 1604 – mainly by Chinese astronomers. The last two were observed in detail by the famous European astronomers Tycho Brahe (1546-1601) and Johannes Kepler (1571-1630). Excluding the supernova of 1987 in the Large Magellanic Cloud, no supernova has been seen in the galaxy during the past 400 years. Some events may have gone unnoticed because they were distant and heavily obscured by interstellar dust, but the next supernova in the galaxy will certainly not remain hidden now that we can observe the sky as never before throughout the whole electromagnetic spectrum, and even with neutrinos.

Further reading

R Diehl et al. 2006 Nature 439 45.

Shifts in the gamma-ray line from ²⁶Al caused by the Doppler effect along the plane of the galaxy, owing to galactic rotation. The broad distribution is from a three-dimensional model of the spatial distribution of ²⁶Al – based on free electrons in the interstellar medium – that matches the line shifts measured by INTEGRAL (error bars). (Courtesy MPE.)

Quarks matter in Budapest

The Quark Matter conferences have historically been the most important venues for showing new results in high-energy heavy-ion collisions. The 18th in the series, Quark Matter 2005, held in Budapest in August 2005, attracted more than 600 participants from 31 countries on five continents; more than a third were junior participants, reflecting the momentum of the field. The major focus of the conference was the presentation of the new data from Brookhaven National Laboratory’s Relativistic Heavy Ion Collider (RHIC) together with the synthesis of an understanding of heavy-ion data from experiments at CERN’s Super Proton Synchrotron (SPS), including new data from the NA60 experiment. The meeting also covered a broad range of theoretical highlights in heavy-ion phenomenology, field theory at finite temperature and/or density, and related areas of astrophysics and plasma physics.

After an opening talk by Norbert Kroó, vice-president of the Hungarian Academy of Sciences, the scientific programme of the conference began with a talk by Roy Glauber, who was soon to share the 2005 Nobel prize in physics. Glauber’s calculations in the 1960s laid the foundation for the determination of centrality in high-energy heavy-ion collisions – a measure of how close to head-on they are – which is now one of the most elementary and widely used tools of heavy-ion physics. In his talk “Diffraction theory, quantum optics and heavy ions”, he discussed the concept of coherence in quantum optics and heavy-ion collisions and presented a new generalization of the Glauber-Gribov model. Further talks in the introductory session were given by Luciano Maiani, former director-general of CERN, who reassessed the main conclusions of the SPS fixed-target programme, and by József Zimányi, of the KFKI Research Institute for Particle and Nuclear Physics in Budapest, who gave an account of the evolution of the concept of quark matter.
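
For readers unfamiliar with the technique, the sketch below shows the core of a Monte Carlo Glauber calculation: scatter the nucleons of two nuclei at a given impact parameter and count the “wounded” nucleons, N_part, one of the standard centrality measures. It uses a uniform hard-sphere nucleus instead of the realistic Woods-Saxon density, so it illustrates the idea rather than reproducing any published numbers:

```python
import math, random

A = 197                    # nucleons in gold
R = 6.38                   # nuclear radius in fm (hard-sphere simplification)
sigma_nn = 4.2             # ~42 mb inelastic NN cross-section, in fm^2
d_max = math.sqrt(sigma_nn / math.pi)   # transverse interaction distance

def sample_nucleus(x_offset):
    """Nucleon positions in a hard-sphere nucleus, projected onto the
    transverse plane and shifted by half the impact parameter."""
    nucleons = []
    while len(nucleons) < A:
        x, y, z = (random.uniform(-R, R) for _ in range(3))
        if x * x + y * y + z * z <= R * R:
            nucleons.append((x + x_offset, y))
    return nucleons

def n_participants(b):
    """Participants: nucleons passing within d_max of any nucleon in the
    other nucleus, for impact parameter b (fm)."""
    na, nb = sample_nucleus(-b / 2), sample_nucleus(+b / 2)
    def wounded(ns, others):
        return sum(1 for (x, y) in ns
                   if any((x - u) ** 2 + (y - v) ** 2 <= d_max ** 2
                          for (u, v) in others))
    return wounded(na, nb) + wounded(nb, na)

for b in (0.0, 6.0, 12.0):
    print(f"b = {b:4.1f} fm  ->  N_part ~ {n_participants(b)}")
```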

It has become a tradition of the Quark Matter conferences to follow the introductory session with summary talks from all the experiments. Thus, the first day sets the scene for the discussions of the rest of the week. This short report cannot summarize all the interesting novel experimental and theoretical developments, but aims to illustrate the richness of these discussions with a few of the many highlights.

One of the main discoveries of the fixed-target heavy-ion programme at the SPS five years ago was the strong suppression of the J/ψ yield with increasing centrality of the collision, which probes the deconfinement phase transition. Another discovery concerned the significant enhancement of low-mass dileptons, which indicates in-medium modification of vector mesons and possibly provides information about the restoration of chiral symmetry. These major discoveries by the NA50 and CERES experiments at the SPS also raised a significant set of more detailed questions, which were recognized as central to understanding the dynamical origins of the observed effects.

In particular, the dimuon invariant-mass spectrum of NA50 showed an enhancement below the J/ψ peak, which different theoretical groups ascribed either to a dramatic enhancement of the charm cross-section in the medium, or to significant thermal radiation. Having implemented a telescope of silicon pixel detectors with improved pointing resolution, NA60 was able to report in Budapest that data taken in the 2003 indium-indium run allow them to rule out conclusively an increased charm cross-section as the source for the dimuon excess. The data are, however, consistent with the exciting possibility of a significant thermal contribution. In addition, for more than a decade, there has been a theoretical debate on whether the embedding of ρ mesons in dense quantum chromodynamic (QCD) matter leads to a shift in the ρ mass, or to a density-dependent broadening, both scenarios being consistent with the original CERES dielectron data. NA60 now concludes, from data taken in the indium-indium run, that the shifting-mass scenario is not consistent with their data, which instead support a broadening induced in the medium (see figure 1). NA60 also presented their first indium-indium measurements of J/ψ suppression as a function of centrality. These confirm the strong anomalous suppression seen by NA50 in central lead-lead collisions at the SPS.

The SPS experiments NA49, CERES, NA50 and NA57 also showed new results from their continuing data analysis. In addition to earlier high transverse-momentum (pT) measurements from CERES and WA98, this year NA49 and NA57 showed new results that were extensively compared with the results of the experiments at RHIC.

The central topic of this Quark Matter conference was without doubt the full harvest of the high-luminosity gold-gold run at RHIC in 2004, from which data analyses were shown for the first time. Equally important were results from the successful copper-copper run in the first half of 2005, which had been analysed in time for the conference in a global effort by the participating institutions of the four RHIC experiments. With an integrated luminosity for 200 GeV gold-gold collisions of almost 4 nb⁻¹, this run increased statistics by more than a factor of 10, and made much-wanted information accessible for the first time. One of the most important early discoveries of the heavy-ion experiments at RHIC was the strong suppression of hadronic spectra by up to a factor of five in the most central collisions. This so-called “jet-quenching effect” supports the picture that the matter created in heavy-ion collisions is of extreme density and thus very opaque to hard partons.

Results from the PHENIX experiment at RHIC now indicate that even neutral pions of pT = 20 GeV show this dramatic energy degradation (figure 2). Moreover, the increased luminosity allowed the STAR experiment to study the recoil of hadron trigger particles up to 15 GeV, and for sufficiently high transverse momenta, this recoil is for the first time observed to punch through the soft background. However, compared with reference data from proton-proton collisions, the particle yield of the recoil is strongly reduced, consistent again with the picture of a medium that is dense and very opaque to partonic projectiles. In further support, PHENIX also reported that high-pT photons are not suppressed (figure 2), and that photons at intermediate transverse momenta show an excess, which may be attributed to thermal radiation from the hot and dense matter.
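
The suppression discussed here is conventionally expressed through the nuclear modification factor R_AA, the heavy-ion yield divided by the proton-proton yield scaled by the number of binary collisions. A minimal sketch with invented numbers, not PHENIX data:

```python
# R_AA(pT) = (dN_AA/dpT) / (<N_coll> * dN_pp/dpT); R_AA = 1 means the
# nucleus-nucleus collision behaves like a superposition of pp collisions,
# while the central gold-gold data discussed here sit around 0.2.
def r_aa(yield_aa, yield_pp, n_coll):
    """Nuclear modification factor from per-event yields in the same pT bin."""
    return yield_aa / (n_coll * yield_pp)

# Illustrative only: a yield suppressed by a factor of five in a central
# bin with ~1000 binary collisions on average.
print(f"R_AA = {r_aa(yield_aa=2.0e-4, yield_pp=1.0e-6, n_coll=1000):.1f}")  # 0.2
```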

Another important piece in the puzzle of reconstructing the properties of the produced matter came from the first measurements of high-pT single-electron spectra. These spectra are thought to be dominated by the semi-leptonic decays of D- and B-mesons, thus giving for the first time experimental access to the propagation of heavy quarks in dense QCD matter. Data from STAR and PHENIX reveal a medium-induced suppression of electrons, which is of similar size to that of light-flavoured hadrons. There were many parallel talks, by both experimentalists and theorists, which contrasted these data with the theoretical expectation that massive quarks should lose less energy in the medium than massless quarks or gluons due to the so-called “dead-cone effect” in QCD. While a final assessment is still awaited, there was widespread agreement that these data will help significantly in refining our understanding of the interaction between hard probes and the medium, which is much needed for a better characterization of the dense QCD matter produced in nucleus-nucleus collisions.

Another much-awaited result that gave rise to a great deal of discussion was the first statistically significant J/ψ measurement at RHIC. This was presented by the PHENIX collaboration and showed a similar pattern and strength of suppression to that observed in lead-lead and indium-indium collisions at the SPS. This result was of particular interest also to lattice-QCD theorists, who now find that the dissociation of the directly produced J/ψ in a deconfined medium sets in at much higher energy densities than previously expected.

The bulk properties of dense QCD matter reveal themselves not only in the modification of hard processes by the medium, but also in the collective motion of soft particle production and its hadro-chemical composition. One of the main discoveries of the first years of RHIC running was the unprecedentedly large collective-flow signals, measured in the asymmetries of particle production with respect to the reaction plane. Remarkably, the measured mass-dependence of the transverse radial and elliptic flow supports the assumption that different particle species emerge from a common flow field. Flow measurements at intermediate transverse momenta follow constituent-quark counting rules and are consistent with quark coalescence as a medium-dependent hadronization scenario (figure 3). Moreover, to the surprise of many, the hydrodynamic description of the collision in terms of an adiabatically expanding, perfect fluid of vanishing viscosity and heat conductivity appears, at RHIC energies, to be satisfactory for the first time.
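
A sketch of the two measurements mentioned here, with a toy azimuthal sample and an invented quark-level flow curve standing in for the real data:

```python
import math, random

def v2_estimate(phis, psi_rp=0.0):
    """Elliptic flow v2 = <cos 2(phi - Psi_RP)> over a sample of azimuths."""
    return sum(math.cos(2 * (phi - psi_rp)) for phi in phis) / len(phis)

# Toy sample drawn from dN/dphi ~ 1 + 2*v2*cos(2*phi) by accept-reject,
# to show the estimator recovers the input value of 0.06.
random.seed(1)
v2_true, phis = 0.06, []
while len(phis) < 20000:
    phi = random.uniform(-math.pi, math.pi)
    if random.uniform(0, 1 + 2 * v2_true) <= 1 + 2 * v2_true * math.cos(2 * phi):
        phis.append(phi)
print(f"reconstructed v2: {v2_estimate(phis):.3f}")   # ~0.06

# Constituent-quark scaling: if hadrons coalesce from n flowing quarks,
# v2_hadron(pT) ~ n * v2_quark(pT/n), so v2/n versus pT/n should collapse
# mesons (n=2) and baryons (n=3) onto one curve. Invented quark curve:
def v2_quark(pt):
    return 0.08 * math.tanh(pt / 0.8)

for n, name in ((2, "meson"), (3, "baryon")):
    pt = 2.4                            # GeV, hadron transverse momentum
    v2_hadron = n * v2_quark(pt / n)
    print(f"{name}: v2/n = {v2_hadron / n:.3f} at pT/n = {pt / n:.1f} GeV")
```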

Much of the discussion at QM ’05 focused on the emerging picture of the matter produced in heavy-ion collisions at RHIC, which, far from being a weakly interacting gas of quarks and gluons, shows features of a strongly coupled partonic system indicative of a perfect liquid. This liquid includes not only the light and strange quarks; the first preliminary data on the elliptic flow of charmed hadrons from the PHENIX collaboration indicate that even charm quarks participate in the collective expansion of this new form of matter.

The conference saw a lively theoretical discussion about the dynamic mechanisms underlying a possible rapid thermalization. Emphasis was given in particular to the relationship to thermalization processes in Abelian plasmas, to formal analogies with the thermal properties of black holes, and to the possibility that plasma instabilities accelerate equilibration. The intellectual richness of the field was further illustrated by exciting reports from string theory, where theorists have succeeded for the first time in calculating the viscosity to entropy density ratio in the physically relevant, strong-coupling limit of a certain class of thermal non-Abelian gauge theories. The fact that this ratio is found to be very small indicates a non-dissipative behaviour. It raises the exciting possibility that the non-dissipative character of an almost perfect liquid, which may be created in gold-gold collisions at RHIC, could be understood from first-principles calculations in QCD.
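
The string-theory result referred to is the Kovtun-Son-Starinets value, obtained for the strong-coupling limit of gauge theories with gravity duals:

```latex
% Shear viscosity to entropy density at strong coupling (Kovtun-Son-Starinets),
% with \hbar and k_B restored for comparison with ordinary fluids:
\frac{\eta}{s} = \frac{1}{4\pi}\,\frac{\hbar}{k_{B}}
```

Ordinary fluids such as water lie well above this value, which is why matter approaching it is described as an almost perfect liquid.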

From the point of view of heavy-ion phenomenology, the central question of whether more direct signals of negligible viscosity can be established led to another highlight of the conference. The widely discussed idea was that if dissipation is negligible, then energy, deposited by a jet in dense QCD matter, must propagate in a characteristic Mach cone, determined by the velocity of sound in the quark-gluon plasma. Reports about back-to-back particle correlations from PHENIX, which may show such a Mach-cone-like structure, were hotly debated amongst theorists and experimentalists alike (see figure 4). Most importantly, these discussions showed that heavy-ion physics at collider energies has a large set of novel tools available for the controlled experimentation with hot and dense QCD matter, and that the field is moving towards characterizing specific properties of this matter, including its speed of sound, equation of state, and its transport coefficients such as heat conductivity and viscosity.
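
The kinematics behind the Mach-cone idea fit in a few lines. The speed of sound used below is the ideal ultra-relativistic-gas value, an assumption standing in for the true, unknown value in the plasma:

```python
import math

# A parton moving through the medium faster than the speed of sound c_s
# excites a conical shock; associated particles emerge at an angle theta_M
# from the jet axis with cos(theta_M) = c_s / v_jet.
c_s = 1 / math.sqrt(3)     # speed of sound of an ideal relativistic gas (units of c)
v_jet = 1.0                # ultra-relativistic parton, v ~ c

theta_m = math.acos(c_s / v_jet)
print(f"Mach emission angle: {theta_m:.2f} rad")                  # ~0.96 rad
print(f"away-side structure expected near pi +/- {theta_m:.2f} in dphi")
```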

Past, present and future

The Quark Matter conferences not only highlight the experimental harvest of the recent past and the latest news from theory, they are also the arena for assessing perspectives for the future. The first heavy-ion beam at the Large Hadron Collider (LHC) at CERN is expected in 2008, and heavy-ion researchers are now well prepared for the jump in centre-of-mass energy by a factor of 30 above RHIC. Most importantly, the fact that dramatic medium-sensitive effects persist unweakened at RHIC up to the highest measured transverse momentum strongly supports the expectation that the new kinematic regime accessible at the LHC will provide many qualitatively novel tools for the study of ultra-dense QCD matter.

The LHC will not be the only big player in the field of heavy-ion physics in the next decade. At Brookhaven, the STAR and PHENIX collaborations are lining up for several important detector upgrades, which will significantly enhance their abilities to characterize specific properties of the matter created in heavy-ion collisions. Moreover, Brookhaven envisages a luminosity upgrade of RHIC, which will open yet another class of novel opportunities. Finally, the newly approved Facility for Antiproton and Ion Research at the GSI Darmstadt is preparing for the start of a versatile heavy-ion programme in the next decade. Plenary talks provided overviews of the status and possibilities of these three programmes. The field is now eagerly awaiting its future, the next slice of which will be served at the 19th Quark Matter conference in Shanghai in November 2006.

Particles in Portugal: new high-energy physics results

The 2005 European Physical Society (EPS) Conference on High Energy Physics (HEP) took place in Lisbon on 21-27 July at the Cultural Centre of Belém, beautifully situated on the right bank of the Tagus river, 10 km west of downtown Lisbon. Held in alternate years, the EPS HEP conference starts with three days of parallel talks, followed by a day off, and then three days of plenary sessions. The format thus differs from that of the Lepton-Photon conferences, which are organized in the same year, and allows the participation of more “grass-roots” and young speakers.

In 2005 a total of 17 sessions yielded a wealth of detailed results from both experiment and theory, including new results from astroparticle physics. One of the highlights was provided by Barry Barish, newly appointed director of the Global Design Effort for the International Linear Collider (ILC). The EPS and the European Committee for Future Accelerators organized a particularly popular “Lab directors’ session”, which presented status and future plans.

The opening ceremony was honoured by the presence of Mariano Gago, Portuguese Minister for Science, Technology and Universities, who as an experimental high-energy physicist, was also a member of the local organizing committee. As usual, the plenary sessions started with the prize awards. The EPS 2005 High Energy Particle Physics Prize was presented jointly to Heinrich Wahl of CERN and to the NA31 collaboration, with other prizes awarded to Mathieu de Naurois, Matias Zaldarriaga, Dave Barney and Peter Kalmus (CERN Courier, September 2005, p43). The next highlight was the invited talk by David Gross of Santa Barbara/KITP, Nobel Laureate in 2004 and EPS Prize winner in 2003. He checked off the list of predictions he had made in the summary talk of the 1993 Cornell Lepton-Photon conference, the majority of which had been confirmed.

Sijbrand de Jong of Nijmegen/NIKHEF and Tim Greenshaw of Liverpool started the main business of the plenary session with talks on tests of the electroweak and quantum chromodynamic sectors of the Standard Model, respectively. The new (lower) mass for the top quark from Fermilab, of 172.7±2.9 GeV, as presented by Koji Sata of Tsukuba in the parallel sessions, implies an upper limit on the Higgs mass of 219 GeV at 95% confidence level. Greenshaw discussed how HERA continues to play a major role in precision studies in quantum chromodynamics (QCD) of the proton, now mapped down to 10⁻¹⁸ m, or a thousandth of its radius. Such results will be very valuable for the analysis of data from the Large Hadron Collider (LHC). New results on the spin structure of the proton were also reported.

Riccardo Rattazzi of CERN and Pisa then talked on physics beyond the Standard Model and was followed by Fermilab’s Chris Quigg, who reviewed hadronic physics and exotics. Rattazzi presented an interesting “LEP paradox”: the hierarchy problem, with a presumed light Higgs particle, requires new physics at a low scale, whereas there are no signs of it in the data from CERN’s Large Electron-Positron collider. He also reviewed the anthropic approach to the hierarchy problem: we inhabit one of very many possible universes. This many-vacua hypothesis is also referred to as “the landscape”, and might have implications for supersymmetry. Quigg reviewed several new states discovered by the CLEO collaboration at Cornell and at the B-factories, and reminded us that the pentaquark states are still controversial.

Near- and more-distant-future possibilities were reviewed by Günther Dissertori of ETH Zurich in his talk on “LHC Expectations (Machine, Detectors and Physics)” and by Klaus Desch of Freiburg in “Physics and Experiments – Linear Collider”. Dissertori gave an overview of all the complex instrumentation in the process of being completed for both the LHC and its four major detectors. The first beams are planned for the summer of 2007, with a pilot proton run scheduled for November 2007. All detectors are expected to be ready to exploit LHC collisions starting on “Day 1”. Desch presented the ILC project and highlights of the precision measurements it will provide in electroweak physics, in particular, in the Higgs sector.

More theoretical considerations were offered by CERN’s Gabriele Veneziano and Yaron Oz of Tel Aviv, who spoke on cosmology (including neutrino mass limits) and string theory, respectively. Veneziano reviewed current understanding, according to which the total energy content of the universe is split into 5% baryons, 25% dark matter and 70% dark energy. The question of what dark energy is was compared with the problem that faced Max Planck when he realized that the total power emitted by a classical black body is infinite. Interesting speculations on alternative interpretations of cosmic acceleration were also discussed. Precision measurements in cosmology have an impact on high-energy physics: they provide an upper bound on neutrino masses, indicate preferred regions in the parameter space of minimal supergravity grand unification, and suggest self-interacting dark matter. Oz reviewed the beauties of strings and their two major challenges: to explain the big bang singularity, and the structure and parameters of the Standard Model. So far, neither is explained, but the consistencies are impressive.

The recently discovered connection between string theory and QCD was described by SLAC’s Lance Dixon. An important problem being solved is how to optimize the calculation of multiparticle processes (which might be backgrounds to new physics processes). By ingeniously exploiting the symmetries of the theory, one is able to go beyond the method of Feynman diagrams in terms of efficiency. Roughly speaking, this amounts to first representing four-vectors by spinors, and then Fourier-transforming the left-handed but not the right-handed spinors.

Getting results

Christine Davies of Glasgow presented new results on non-perturbative field theory, in particular in lattice QCD (LQCD). She reported on the very impressive recent advances in LQCD, where high-precision unquenched results are now available to confront the physics of the Cabibbo-Kobayashi-Maskawa (CKM) matrix with only 10% errors on the decay matrix elements. This has been made possible by breakthroughs in the theoretical understanding of the approximations, together with faster computers.

Josh Klein of Austin and Federico Sanchez of Barcelona reviewed neutrino physics results and prospects, respectively. Neutrino physics has become precision physics, and now oscillations, rather than just flux reductions, are beginning to emerge in data from the KamLAND and Super-Kamiokande II experiments in Japan. Sanchez discussed rich plans for the future, with two main questions to tackle. Is the neutrino mass of Majorana or Dirac origin? How can the small angle θ13 and the CP-violating phase δ be constrained, or preferably, measured? The plans include the Karlsruhe Tritium Neutrino experiment to study tritium decay, and the GERDA experiment in the Gran Sasso National Laboratory (LNGS), the Neutrino Mediterranean Observatory and the Enriched Xenon Observatory, all of which will look for neutrinoless double beta decay. The Main Injector Neutrino Oscillation Search, the Oscillation Project with Emulsion Tracking Apparatus (OPERA) in the LNGS, and the Tokai to Kamioka (T2K) long-baseline neutrino experiments will all study the phenomena of “atmospheric” neutrino oscillations under controlled conditions, and the Double CHOOZ experiment will further bound the small values of θ13. A new idea is to exploit beams of unstable nuclei, which would provide monochromatic neutrinos. Meanwhile, the CERN Neutrinos to Gran Sasso project will start taking data in 2006, with a neutrino beam from CERN to the OPERA detector.

Flavour physics was the topic for both Gustavo Branco of Centro de Física Teórica das Partículas in Lisbon, in “Flavour Physics – Theory (Leptons and Quarks)”, and Marie-Hélène Schune of LAL/Orsay, who talked about CP violation and heavy flavours. At the B-factories, the Belle detector is collecting a lot of luminosity, and after a long shutdown, BaBar is back in operation. Many detailed results on CP violation in B-decays were presented at the meeting. The BaBar and Belle results on β or φ1 are now in agreement, and the CKM mechanism works very well, leaving little room for new physics, although the precision is also steadily improving.

Looking to the skies

Astrophysics was covered by three speakers, with Thomas Lohse of Berlin talking about cosmic rays (gammas, hadrons, neutrinos), Alessandro Bettini of Padova presenting dark-matter searches, and Yanbei Chen from the Max Planck Institute for Gravitational Physics reviewing work on gravitational waves. What and where are the sources of high-energy cosmic rays? How do they work? Are the particles accelerated, or are they decay products of new physics at large mass scales? The Pierre Auger Observatory is beginning to collect data in the region of the Greisen-Zatsepin-Kuzmin cut-off, while neutrino detectors search for “coincidences” (repeated events from the same direction).

The HESS telescopes and other detectors have discovered tera-electron-volt gamma rays from the sky! The origin is unknown, but they are correlated with X-ray intensities. The galactic centre is one such tera-electron-volt gamma-ray point source. It has also been discovered that supernova shells accelerate particles (electrons or hadrons?) up to at least 100 TeV. The searches for weakly interacting massive particles, on the other hand, remain inconclusive. Other experiments are still unable to confirm or refute the annual modulation seen by the DAMA project at the LNGS.

A major instrument in the search for gravitational waves is the Laser Interferometer Gravitational-Wave Observatory, a ground-based laser interferometer that is sensitive in the region from 10 Hz to 10 kHz. The sources include pulsars, and one hopes to detect a signal after the planned upgrade. The Laser Interferometer Space Antenna will be launched in 2015 and will be sensitive to lower frequencies, in the range 0.01 mHz to 0.1 Hz, as might come from super-massive black-hole binaries.

Paula Bordalo of the Laboratório de Instrumentação e Física Experimental de Partículas in Lisbon presented an experimental overview of ultra-relativistic heavy-ion physics. Photon probes are important for the study of the new state of matter observed, as they do not interact strongly and carry information about the early stage of the collision. There is also a related virtual-photon or dilepton signal that shows some interesting features. The new state being explored is possibly a colour glass condensate, and behaves more like a low-viscosity liquid than a gas.

Alexander Skrinsky of the Budker Institute of Nuclear Physics reviewed the status and prospects of accelerators for high-energy physics, covering machines in operation as well as new facilities under construction or planned. Superconductivity is widely used and is being further developed for accelerating structures and for magnets. One important line of development is oriented towards higher luminosity and higher quality beams, including longitudinal polarization and monochromization techniques. There are studies aiming at shorter and more intense bunches, suppression of instabilities involving fast digital bunch-to-bunch feedbacks and minimization of electron-cloud effects. Rapid progress is being made on energy-recovery linacs, recyclers and free-electron lasers, which are being studied for future synchrotron light sources. Higher power proton beams and megawatt targets are being developed and several promising options for neutrino factories are under study. Plasma wake-field acceleration appears to be still in an early stage of development, although it has the potential to achieve very high acceleration gradients.

Grid references

Turning to computing, DESY’s Mathias Kasemann described the status of the Grid projects in high-energy physics. The big experiments running today – CDF, D0, BaBar and ZEUS – are already using distributed computing resources and are migrating their software and production systems to the existing Grid tools. The LHC experiments are building a vast hierarchical computing system with well defined computing models. The LHC Computing Grid (LCG) collaboration has been set up to provide the resources for this huge and complex project. The LCG system is being developed with connections to the Enabling Grids for E-science (EGEE) project and the Nordic Data Grid Facility in Europe and to the Open Science Grid in the US. Basic Grid services have been defined and first implementations are already available and tested. Kasemann’s personal prediction was that the data analysis of the LHC experiments will not be late because of problems in Grid computing.

On the detector front, CERN’s Fabio Sauli reported on new developments presented at the conference. Interesting progress has been achieved in fabricating the radiation-hard solid-state detectors needed for the LHC and other high-radiation-level applications. One way is through material engineering: choosing materials that are radiation resistant, such as oxygenated silicon, silicon processed with the Czochralski method, or thin epitaxial detectors. Other solutions have been developed by device engineering, and these include pixel detectors, monolithic active pixels and three-dimensional silicon structures. For high-rate tracking and triggering, gas micropattern detectors, such as gas-electron multipliers, have provided versatile solutions in several experiments. For calorimetry, new materials like lead tungstate crystals have been adopted in LHC experiments. New scintillation materials with high light yield, fast decay time and high density have also been tested.

Boris Kayser from Fermilab closed the conference with an eloquent summary. On the day off, various excursions to charming medieval villages and ancient monasteries all converged on the city of Mafra, where the conference participants met Portuguese students and teachers in a baroque palace dating from 1717. There was also a visit to a precious library created by Franciscan friars, with 36,000 prize volumes (the “arXiv” of its time!) and where bats control the insect numbers (visitors were told). Gaspar Barreira and his colleagues handled the local organization masterfully, and the many excellent fish restaurants nearby provided a relaxed setting for informal discussions.

• The next EPS-HEP conference, in July 2007, will take place in Manchester, UK.

CERN’s low-energy frontier

The energies attained at CERN and other particle physics laboratories are useful not only for probing nature’s deepest layers; they also enable the study of matter in the relatively low energy range up to a few million electron-volts. This range is typical of supernovae and X-ray bursters, and is also relevant for most nuclear-structure phenomena. Experimentalists at CERN have exploited these lower energies for many years, and the present status of their achievements and the prospects for further studies were the subject of the recent Nuclear Physics and Astrophysics at CERN (NuPAC) meeting held on 10-12 October 2005.

These activities are concentrated at CERN around the Isotope Separator On-Line (ISOLDE) and Neutron Time-of-Flight (n_TOF) facilities. Both come under the ISOLDE and Neutron Time-of-Flight Experiments Committee (INTC), which has been asked by the CERN management to review the scientific case for the two facilities. NuPAC is one step in this review process.

In many ways, nuclear-structure physics is experiencing a renaissance. Some of the “basic truths” about nuclear structure, believed to be universal only 20 years ago, are now known to be approximations that hold for stable and close-to-stable nuclei, but that cannot be used further away from stability. For example, we are used to thinking in terms of nuclear shells based on unassailable “magic” numbers. However, it is possible to move far enough away from the stable nuclei for the balance between the number of neutrons and protons in a nucleus to be so disturbed that the magic numbers can and do change. Reaching the regions where this happens and performing detailed studies of how and why the changes occur are important tasks for nuclear physicists. So far “erosions” of the magic numbers N = 8, 20 and 28 are known, and it seems that they are at least partly replaced by N = 6 and 16, although our present understanding is not complete.

Another change when we move far away from stable nuclei is that continuum states need to be taken into account much more directly because binding energies become low (turning to zero at the neutron and proton driplines, where nuclei are so saturated with an excess of neutrons or protons that they “drip” the relevant nucleon). Experiments can now cross the proton dripline for many elements; and even the neutron dripline, which is further away from stability than the proton one, has been reached and partially crossed for the light elements up to about neon. The structure and dynamics of loosely bound nuclei show new phenomena, such as the spatially extended halo and the “pygmy” resonances at low-excitation energies. This is a challenging area for both experimentalists and theoreticians.

Two of the sessions at NuPAC were dedicated to the evolution of nuclear structure towards and at the driplines. Several theoretical talks outlined how far recent developments have taken us in descriptions of nuclear structure and reaction theory for loosely bound systems, of the evolution of shell structure and nuclear shape as the proton and neutron numbers change, and of the very complex problem the fission process presents. The experimental talks gave examples of the widely different techniques that are used today and planned for the near future.

Measuring the nucleus

Using the low-energy ISOLDE beams, properties such as mass, radius and magnetic moment can be measured relatively directly and in a model-independent way for “long-lived” states (that is, with half-lives longer than a few milliseconds). Decay experiments also make use of these beams and provide information about many aspects of nuclear structure. For the past four years it has also been possible to perform reaction experiments through post-acceleration in the Radioactive Beam EXperiment (REX-ISOLDE) accelerator. Most of these experiments have used the Miniball gamma-ray detector array.

Speakers at the meeting stressed the importance of the planned energy upgrade of REX-ISOLDE to at least 5.5 MeV per nucleon. This will enable reaction experiments to be performed with all of the more than 800 nuclei that ISOLDE can now produce. Participants also strongly supported the continuation of the beam-development programme that is ISOLDE’s hallmark. On the “wish list” are beams of even more kinds of nuclei, as well as improved quality (intensity, isotopic purity, phase-space extent) for existing beams.

A reliable knowledge of nuclear structure is one of the basic requirements for properly understanding how energy is produced in stars, and thereby how stars evolve. The session dedicated to these questions presented various aspects of the problems as seen by astronomers, theoreticians and nuclear experimentalists. Half of the heavy elements are made in what is known as the s-process – slow neutron capture that takes place in massive stars during the later stages of their evolution. Experimental data are still needed as input for a complete understanding, in particular of the weak s-process component (nuclei below mass number 90). One of the main lines of the future n_TOF physics programme is to measure these neutron-capture processes with sufficient resolution. Explosive astrophysical events – such as supernovae, novae and X-ray bursters – quickly drive nuclei far from the region of stability, and data collected at ISOLDE can benefit in several ways the theoretical modelling of these events.

Further uses

Experiments in nuclear physics dominated the early stages of the investigation into weak interactions. Particle physics took over that role many decades ago, but nuclei still provide important information through precision experiments that constrain, at low energy, the phenomena seen more directly at higher energy. A short session at NuPAC gave two examples: on the one hand, nuclear measurements are needed to improve further the unitarity test of the Cabibbo-Kobayashi-Maskawa quark-mixing matrix; on the other, precision measurements of beta-decays in ion and/or atom traps are sensitive to new interactions. These experiments typically run for up to a decade to reach the required low level of systematic uncertainties.
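
As a concrete illustration of the unitarity test, with approximate mid-2000s values inserted purely for the arithmetic (they are not quoted in the article):

```python
# First-row CKM unitarity: |Vud|^2 + |Vus|^2 + |Vub|^2 should equal 1.
# Vud is driven largely by nuclear superallowed beta decays.
v_ud, v_us, v_ub = 0.9738, 0.2257, 0.0036   # illustrative values

row_sum = v_ud**2 + v_us**2 + v_ub**2
print(f"|Vud|^2 + |Vus|^2 + |Vub|^2 = {row_sum:.4f}")   # ~0.9992
# Consistent with unity at this precision; shrinking the uncertainties is
# exactly what the precision nuclear measurements described above are for.
```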

A session was devoted to presentations of applications of nuclear physics. Basic data on neutron-capture cross-sections on many nuclei are indispensable to enable further developments of nuclear technologies – for example, the accelerator-driven systems for transmutation or the thorium cycle with its potential for a significant reduction of the amount of radiotoxic waste. As several speakers outlined, it is an important part of the present and future n_TOF programme to provide these data. The application of radioactive nuclei in solid-state physics and life science has been an important facet of the ISOLDE programme for many years, and some of the highlights were presented. A possible future use of radioactive ions as probes of nanostructures was outlined; this again requires isotopically pure beams of high beam-optical quality. Also discussed was the use of radioisotopes in nuclear medicine, where progress in biomedicine combined with the introduction of new high-purity radioisotopes opens new possibilities for diagnosis and therapy.

The last session was devoted to the proposed upgrades of the ISOLDE and n_TOF facilities, and of the proton injectors on which they depend. The ISOLDE community is proposing an upgrade project, HIE (High Intensity and Energy)-ISOLDE, that includes increasing the REX energy to 5.5 MeV/u in 2009, with the goal of reaching 10 MeV/u in 2011. Furthermore, the beam quality will be improved with the help of, for example, new ion sources, an upgraded laser ion source with a trap close to the target, low-energy beam coolers and charge breeders. The target and ion source development programme would be boosted to keep the leading edge in this key field.

The n_TOF community is proposing to restart the facility (after refurbishing the target) and to use a different moderator to increase the neutron flux at low energy. It is envisaged that at a later date the n_TOF facility will have a new, shorter TOF tube with a target area that is fully equipped to handle radioactive sources. In principle, such an arrangement could enable sources collected at ISOLDE to be used at n_TOF.

The faster cycling of the Proton Synchrotron (PS) Booster could in the short term provide ISOLDE with more protons. CERN’s Accelerator Beams Department is developing a scheme that will permit the Booster to cycle at 900 ms without any additional risk for the aging PS magnets. Further in the future, Linac 4 will make even more protons available for both ISOLDE and n_TOF, and will serve as the first step towards a multimegawatt proton source at CERN, the Superconducting Proton Linac (SPL). The long-term goal of the ISOLDE community is to realise the European Isotope Separation On-Line Radioactive Ion Beam Facility (EURISOL) – a high-intensity radioactive beam facility that will enable nuclear physicists to probe even further into the unknown. The SPL would be a suitable driver for EURISOL.

The opportunities at the present nuclear-physics and astrophysics facilities at CERN are clearly not yet exhausted. The proposed upgrades will further boost the scientific reach of the facilities and serve a community of more than 500 users. It will be interesting to follow the development of this programme over the coming years.

PHYSTAT: making the most of statistical techniques

Statistics has always been an essential tool in experimental particle physics, and today this is truer than ever. In the early days emulsions and bubble-chamber photographs were scanned slowly by hand; now modern electronic detectors perform equivalent processing quickly and automatically. However, physicists still classify and count their events and then, eagerly or reluctantly, turn to statistical methods to decide whether the numbers are significant and what results are valid.
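
The archetypal decision of this kind is a counting experiment: is an observed excess over the expected background significant? A minimal frequentist sketch, with illustrative numbers only:

```python
from math import exp

def poisson_p_value(n_obs, b):
    """One-sided p-value: probability of >= n_obs counts from background
    alone, for a Poisson background with known mean b."""
    # P(N >= n_obs) = 1 - sum_{k=0}^{n_obs-1} e^{-b} b^k / k!
    term = cumulative = exp(-b)
    for k in range(1, n_obs):
        term *= b / k
        cumulative += term
    return 1.0 - cumulative

# Example: 12 events observed over an expected background of 5.
print(f"p-value: {poisson_p_value(12, 5.0):.4f}")   # ~0.0055
```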

As the subject has progressed, new themes have emerged. The high numbers of events obtained by CERN’s Large Electron-Positron collider (a Z-factory), the B-factories PEP-II and KEKB at SLAC and KEK respectively, and the experiments at DESY’s HERA collider, mean that statistical errors below 1% are now common. Many areas have become dominated by systematic effects, a relatively untrodden and much less well understood field.

On the theoretical side, the high precision of theories such as quantum electrodynamics and quantum chromodynamics means that the tiny uncertainties in their predictions have to be carefully studied and understood. Supersymmetry and other “new physics” models predict signals that depend on several parameters of the theory, and when an experiment fails to see such a signal the restrictions this places on possible values for these parameters has to be worked out. When different experiments probe the same basic theory, we need to evaluate the combined implication of their results.

The science of statistics is also developing fast. The availability of large amounts of processing power opens new possibilities for evaluating statistical models and their predictions. Bayesian statistics is a rapidly growing field in which a great deal of progress has been made. Machine learning techniques, such as artificial neural networks and decision trees, are flourishing, with further applications continually being found that open up new possibilities for exploiting data.
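
For contrast with the frequentist counting test sketched earlier, here is a minimal Bayesian treatment of the same kind of experiment, with a flat prior chosen purely for illustration (the choice of prior is exactly the sort of question these meetings debate):

```python
import math

def bayesian_upper_limit(n, b, cl=0.90, s_max=30.0, steps=3000):
    """Credible upper limit on a Poisson signal s, given n observed events,
    known background b and a flat prior on s >= 0, via a simple grid scan."""
    ds = s_max / steps
    grid = [i * ds for i in range(steps + 1)]
    # Posterior ~ likelihood for a flat prior: Poisson(n | s + b).
    weights = [math.exp(-(s + b)) * (s + b) ** n for s in grid]
    total, cumulative = sum(weights), 0.0
    for s, w in zip(grid, weights):
        cumulative += w
        if cumulative >= cl * total:
            return s
    return s_max

print(f"90% upper limit on s: {bayesian_upper_limit(n=3, b=2.0):.2f}")
```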

Astronomers and cosmologists are also developing the power and sophistication of their statistical techniques. Telescopes are becoming larger and more powerful, and the readout from their instruments with charge-coupled devices produces a torrent of data. Observations at different wavelengths, from gamma rays to radio waves, from ground-based observatories and satellites are combined to yield clues about the nature of distant objects, the processes that power them and other features of the universe. Details of the distribution of the cosmic microwave background will, when properly interpreted, tell us what happened in the Big Bang at energies beyond the reach of man-made accelerators.

The PHYSTAT series of conferences and workshops provides a forum in which different communities can meet and exchange ideas. A first workshop of particle physicists at CERN in 2000 was followed by one at Fermilab in 2001, and then a full conference in Durham in 2002, which benefited from the presence of statisticians as well as physicists. At SLAC in 2003, astronomers and cosmologists were included (see CERN Courier March 2004 p22). This was so successful that it was repeated at the most recent conference, “Statistical Problems in Particle Physics, Astrophysics and Cosmology”, held in Oxford in September 2005 and organized by Louis Lyons.

PHYSTAT 2005 consisted of a wide-ranging programme of parallel and plenary talks. One of the most influential statistical thinkers of the 20th century, David Cox of Oxford University, gave the opening keynote speech, in which he provided an authoritative account of the Bayesian and frequentist approaches to inference. The official programme was supplemented by intense discussions in corridors, coffee lounges and local pubs, as the participants thrashed out ideas that ranged from the philosophical abstractions of the meaning of probability to the pragmatic and technical details of different computer systems.

These techniques are being fed back into the community through the activities of the participants, many of whom are active in analyses on various experiments; through further meetings (a follow-up afternoon meeting in Manchester attracted 80 particle physicists from the UK); through the academic training programmes offered at CERN and other laboratories; and through graduate conferences and summer schools. Plans are developing for a repository of software that performs these increasingly sophisticated statistical tests. Further workshops are planned for 2006 and beyond.

Further reading

More information can be found at www.physics.ox.ac.uk/phystat05/ and at www.pa.msu.edu/people/linnemann/stat_resources.html.
