Stephen Hawking is the best known physicist alive. His book A Brief History of Time was a bestseller around the world, but sometimes his star status obscures his continued activity at the frontiers of theoretical physics. For some 40 years his research has centred on theoretical cosmology and black holes. When he started work as a theoretical physicist in the 1960s, particle physics and cosmology seemed to be separate worlds, but now these topics are increasingly intertwined. So it was not surprising that Hawking should want to visit the Theory Unit at CERN, to meet fellow theorists and give a seminar on his current work.
In the 1960s, Hawking and Roger Penrose developed the famous singularity theorems in general relativity, which provided a precise set of general conditions under which gravitational collapse is inevitable, leading to black holes. In particular, Hawking provided a fairly complete theory of black holes and their classical properties, which led to intriguing analogies with the laws of thermodynamics. An important application of this work led to the theory that our universe began with a Big Bang, starting from an initial singularity.
Perhaps more famously, however, in the early 1970s Hawking shocked the world by showing that if one considers quantum mechanics (quantum field theory) in the presence of black holes, they are not black after all, but rather they emit radiation with a thermal spectrum and a temperature that depends only on the basic characteristics of the black-hole state: namely, its mass, angular momentum and charge. In the simplest case, the temperature is inversely proportional to the mass. This phenomenon is known as black-hole evaporation.
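As a point of reference (the formula is standard, though not quoted in this report), the Hawking temperature of a non-rotating, uncharged black hole of mass $M$ is

$$T_H = \frac{\hbar c^3}{8\pi G M k_B} \approx 6\times10^{-8}\,\mathrm{K}\,\frac{M_\odot}{M},$$

so astrophysical black holes are extremely cold, whereas a sufficiently light black hole would be hot enough to evaporate within the age of the universe.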
Hawking went even further, however, and made the provocative proposal that the laws of quantum mechanics must be changed in the presence of black holes. Specifically, quantum information and quantum coherence are irreversibly lost in the formation and evaporation of black holes. This proposal has generated much work in the past 30 years, and we are only now beginning to understand that quantum mechanics does not after all need to be modified. However, many aspects of the theory of quantum gravity need to be understood in detail before we can claim that the paradox has been resolved.
The Hawking evaporation phenomenon also has basic observational consequences. If a black hole has a small mass, then it will radiate more copiously, so we can put an observational limit on small black holes surviving from the origin of the universe. Furthermore, if CERN’s Large Hadron Collider (LHC) produces mini black holes, we know that they will evaporate with a nearly thermal spectrum, an important characteristic in identifying them.
Currently Hawking is working on quantum cosmology. He is studying a top-down approach to cosmology that combines the string landscape with the scenario of no-boundary initial conditions. The theory seminar that he presented at CERN was based on this work in collaboration with Thomas Hertog, who is currently a fellow with the Theory Unit.
In his general colloquium, which attracted an audience of 850, Hawking discussed one of his favourite topics – the origin of the universe. He argued that thanks to what we have learned over the past 100 years we may finally have a scientific way to address this subject. General relativity predicts that the universe, and time itself, would have begun in a Big Bang. It also predicts that time ends in black holes. The discoveries of the cosmic microwave background radiation and of black holes support these conclusions; adding in quantum mechanics begins to yield the rudiments of a theory of structure in the observed universe. However, much still remains to be understood on this subject, although as Hawking argued in his seminar, a scientific cosmogony is both possible and within reach of theoretical and experimental work.
During his stay at CERN, Hawking also visited the ATLAS and CMS experiments, as well as the tunnel of the LHC, where installation work proceeds rapidly. He was interested in the details of the experiments, and in the possible discovery of mini black holes. He also met with the director-general, Robert Aymar, and their discussion covered topics ranging from open-access publishing to the start-up of the LHC. Congratulating Aymar and the CERN community on their scientific work, he commented “You have an exciting two years ahead of you.”
Festivity met science when the Lawrence Berkeley National Laboratory (LBNL) celebrated its 75th anniversary with a Founders Day party on 26 August. The day capped a summer of celebratory events, including historical lectures, speeches, music, fire-spinning, Scottish dance, films, science demonstrations, tours, birthday cake for 600 and a time capsule to be opened in 2031. There was even a display of vintage cars, including one driven by the lab’s founder and inventor of the cyclotron, Ernest Orlando Lawrence.
Originally known as the Rad Lab, LBNL was founded in 1931 to house the latest of Lawrence’s increasingly large cyclotrons. His first cyclotrons were small, and could easily be accommodated in his laboratory in the physics department at the University of California Berkeley. However, by 1931 he was working on a monster 27 inch cyclotron with an 80 ton magnet. This required a separate building with reinforced flooring, and on 26 August 1931 Lawrence was given use of the former Civil Engineering Testing Laboratory. Lawrence renamed it the Radiation Laboratory, and so the Rad Lab was born.
The 27 inch cyclotron accelerated protons to 3.6 MeV. Lawrence’s next machine boasted a 37 inch diameter, and accelerated deuterons to 8 MeV and alpha particles to 16 MeV. One of its major accomplishments was the production of the first artificial element, technetium. The next iteration had a 220 ton magnet around a 60 inch cyclotron. This machine required a new, dedicated building: the Crocker Radiation Laboratory. In 1939, Louis Alvarez and Robert Cornog used the 60 inch machine to discover helium-3, and Martin Kamen found radioactive carbon-14. Carbon-14’s potential as a radioactive tag for biology studies was quickly recognized. This machine also saw the beginning of the lab’s diversification, as Lawrence’s brother, John, used it to study nuclear medicine.
The steady growth in accelerator size and energy produced a steady increase in cost. Part of Lawrence’s genius was his ability to manage physics on an industrial scale; another part was his ability to persuade both government agencies and private philanthropists to fund his work. The fundraising was especially impressive in the midst of the Great Depression. Although these challenges are well recognized today, 75 years ago large accelerator laboratories were unknown, and many of the organizational techniques that he brought to particle physics have become ubiquitous.
During the Second World War, the Rad Lab shifted focus, playing a key role in early studies of magnetic separation of uranium isotopes. However, by 1944 magnetic separation had largely moved to Oak Ridge, and the Rad Lab returned to basic physics, with the construction of a mammoth 184 inch cyclotron. By 1950, this was in use for studies of physics, nuclear chemistry and nuclear medicine. The war also produced a large turnover in personnel, as many pre-war leaders left to work on atomic weapons, radar and other military technology.
The lab’s next machine was a big step up in energy and complexity: a 6.5 GeV synchrotron, the Bevatron. Its energy could reach the threshold for antiproton production; by 1955 the accelerator was complete and antiprotons were indeed observed. Later, it supported a long series of experiments that used progressively larger bubble chambers. Throughout the late 1950s and early 1960s, researchers used these chambers to discover a large number of meson and baryon resonances. Still later, the Bevatron was converted to accelerate heavy ions, ushering in the new field of relativistic heavy-ion collisions.
In 1958, following the death of Ernest Lawrence, Ed McMillan became the lab’s second director. The 1960s saw a period of expansion, with many new buildings and facilities, including the 88 inch cyclotron, still used for nuclear structure studies. The Heavy Ion Linear Accelerator (HILAC), and later the Super-HILAC, continued the laboratory’s studies of heavy elements, producing atoms of elements 102 and 103. The lab took its first steps beyond the world of particle and nuclear physics, with initiatives in materials science (initially to study the effects of radiation on different materials) and, later, chemical lasers.
Of course, the laboratory was part of the Berkeley community. By the late 1960s, the US was involved in the Vietnam War, and Berkeley was the scene of massive antiwar protests. Lab scientists were themselves divided, but most had great sympathy for the antiwar movement, and many actively demonstrated against the war. The 1960s also led to new concerns for human rights, and LBNL scientists were active in supporting oppressed scientists in the former Soviet Union.
Besides the antiwar movement, the 1960s and 1970s brought tremendous political and intellectual ferment. Energy became scarce and environmentalism appeared. In 1971, US president Richard Nixon declared war on cancer. Basic research lost some of its lustre. Under the leadership of its third director, Andrew Sessler, the lab responded to these pressures by diversifying into a variety of fields: biology, earth science and materials science.
Today, this diversity is a hallmark of LBNL. The lab has major programmes in many fields, including synchrotron radiation (the 1.5 GeV advanced light-source accelerator and a strong accelerator development programme), computing (the National Energy Research Supercomputing Center), genome sequencing and cancer biology, as well as continuing programmes in electron microscopy, energy efficiency in buildings, and nanotechnology. The lab has also developed strong electrical and mechanical engineering and computer-science groups; their contributions are apparent in the complicated instrumentation built at LBNL. Recent examples include the vertex detectors for the BaBar experiment at SLAC and CDF at Fermilab, and contributions to ATLAS at CERN; the time projection chamber for the STAR detector at Brookhaven’s Relativistic Heavy Ion Collider (RHIC); the support structure for the Sudbury Neutrino Observatory; and the Gammasphere germanium detector.
Although birthdays are good opportunities to reminisce, the lab also used Founders Day to look forward to the next 75 years. The foreseeable future looks bright. In particle physics, a strong programme in the ATLAS experiment is accompanied by a cosmology programme, which is exploring the nature of dark energy, most notably by studying distant supernovae. SNAP, an orbiting telescope with a billion-pixel camera, should launch early in the next decade. In nuclear physics, the STAR time-projection chamber continues to study heavy-ion collisions at RHIC, and LBNL is contributing to the effort to build an electromagnetic calorimeter for ALICE at CERN. Future nuclear structure studies will be built around GRETINA, a precision germanium tracking calorimeter currently under construction. Efforts in neutrino oscillation, θ13, double beta decay and neutrino astronomy with IceCube complement these large programmes. Accelerator design has always been a hallmark of the lab; future designs include a low-energy, high-current light-ion accelerator for astrophysical studies and work on linear colliders. Other accelerator efforts are focused on producing ultra-short pulses of X-rays, and studies using lasers to accelerate particles. Over the next few decades, these efforts are likely to lead to radically new types of accelerators.
A key focus of current lab director Steve Chu is helping to solve the world energy crisis, through studies including hydrogen storage, carbon sequestration, solar energy (perhaps involving photosynthesis), biomass-to-fuel conversion and improved nuclear power systems. This effort will involve many of the lab’s divisions.
Founders Day included activities and exhibits looking back at many of these periods. Memorabilia from Lawrence’s research, including an early cyclotron, were on display, as well as clothing worn by some of the lab’s 10 Nobel prize winners. An exhibition of vintage cars included a 1935 Dodge Brothers Coupé, said to have been driven by Lawrence and Robert Oppenheimer on clandestine late-night beer runs. A cinema showed classic science-fiction greats, from Flash Gordon to Frankenstein, and modern documentaries. Dance performances included traditional Scottish dance and modern fire-spinning. For children, there were hands-on scientific activities ranging from bubble-blowing to build-your-own electric motors and extracting genes from strawberries, as well as two bouncy inflatables where they could expend their energy. Who knows where the next Lawrence will come from?
When analysing data from particle-physics experiments, the best statistical techniques can produce a better-quality result. Given that statistical computations are not expensive, while accelerators and detectors are, it is clearly worthwhile investing some effort in the former. The PHYSTAT series of conferences and workshops, which started at CERN in January 2000, has been devoted to just this topic (CERN Courier May 2000 p17). The latest workshop took place at Banff in the Canadian Rockies in July, and was also a culminating part of the spring 2006 astrostatistics programme held earlier in the year at the Statistical and Applied Mathematical Sciences Institute (SAMSI) in the Research Triangle Park, North Carolina.
The initiative for the Banff workshop came from Nancy Reid, a statistician from Toronto who had delivered invited talks at the PHYSTAT conferences at SLAC and Oxford (see CERN Courier March 2004 p22 and January/February 2006 p35). The Banff International Research Station sponsors workshops on a variety of mathematical topics, including statistics. The setting for these meetings is the Banff Centre, an island of tranquillity and vigorous intellectual activity in the town of Banff. Most of the activities at the centre are in the arts, but science and mathematics are found there too.
Thirty-three people attended the workshop, of whom 13 were statisticians; the remainder were mostly experimental particle physicists, with astrophysicists making up the balance. It concentrated on three specific topics: upper limits, in situations where there are systematic effects (nuisance parameters); assessing the significance of possible new effects, in the presence of nuisance parameters; and the separation of signal events from background events, a classification process that is required in almost every statistical analysis in high-energy physics. For each of these topics there were two coordinators, a physicist and a statistician.
The three topics, of course, interact with each other. Searches for new physics will result in an upper limit when little or no effect is seen, but will need a significance calculation when a discovery is claimed. The multivariate techniques are generally used to provide the enriched subsample of data on which these searches are performed. Just as for limits or significance, nuisance parameters can be important in multivariate separation methods.
As this was a workshop, the organizers encouraged participants to be active in the weeks before the meeting. Reading material was circulated, as well as some simulated data on which participants could run computer programs that incorporated their favourite algorithms. This enabled all participants to become familiar with the basic issues before the start of the meeting. The workshop began with introductory talks on particle physics and typical statistical analyses, and Monte Carlo simulations in high-energy physics. These primarily described for statisticians the terminology, the sort of physics issues that we try to investigate in experimental particle physics, what the statistical problems are and how we currently cope with them, and so on.
Jim Linnemann of Michigan State University publicized a new website, www.phystat.org, which provides a repository for software that is useful in statistical calculations for physics. Everyone is encouraged to contribute suitable software, which can range from packages that are suitable for general use, to the code that is specifically used in preparing a physics publication.
The convenors of the various subgroups led the remaining sessions on the first day. Very few talks were scheduled for subsequent days, specifically to leave plenty of time for discussions and summary talks, and to provide an opportunity for exploring fundamental issues.
Limits and significance
The discussion about limits ranged from Bayesian techniques, via profile likelihood to pure frequentist methods. Statisticians made the interesting suggestion that hierarchical Bayes might be a good approach for a search for new physics in several related physics channels. There was a lively discussion about the relative merits of the possible approaches, and about the relevant criteria for the comparison. After a late evening session, it was decided that data would be made available by the limits convenor, Joel Heinrich of the University of Pennsylvania, so that participants could try out their favourite methods, and Heinrich would compare the results. This work is expected to continue until November.
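To make the profile-likelihood option concrete for non-specialist readers, here is a minimal sketch (Python with NumPy/SciPy, written for this summary rather than taken from the workshop) of a 95% confidence-level upper limit in a Poisson counting experiment with one Gaussian-constrained background nuisance parameter; all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson, norm, chi2

n_obs, b0, sigma_b = 12, 8.0, 2.0   # hypothetical observed count and background estimate

def nll(s, b):
    """Negative log-likelihood: Poisson(n_obs | s + b) times a Gaussian constraint on b."""
    return -(poisson.logpmf(n_obs, s + b) + norm.logpdf(b, b0, sigma_b))

def profiled_nll(s):
    """Minimise the negative log-likelihood over the nuisance parameter b at fixed signal s."""
    res = minimize_scalar(lambda b: nll(s, b), bounds=(1e-6, 50.0), method="bounded")
    return res.fun

s_grid = np.linspace(0.0, 30.0, 601)
pnll = np.array([profiled_nll(s) for s in s_grid])
q = 2.0 * (pnll - pnll.min())        # profile-likelihood-ratio test statistic

# Asymptotic (Wilks) approximation: a one-sided 95% CL upper limit corresponds to
# q crossing the 90% quantile of a chi-square with one degree of freedom (about 2.71).
threshold = chi2.ppf(0.90, df=1)
s_best = s_grid[np.argmin(pnll)]
crossing = s_grid[(s_grid > s_best) & (q > threshold)]
print("approximate 95% CL upper limit on the signal:",
      crossing[0] if crossing.size else "not found")
```

A full treatment of the kind compared at the workshop would also check coverage with toy experiments rather than relying on the asymptotic approximation.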
Discussions considered the significance issue within particle physics, along with several examples from astrophysics; indeed it arises in a range of subjects in which anomalous effects are sought. Luc Demortier of Rockefeller University in New York, the physics convenor on significance, detailed eight ways in which nuisance parameters can be incorporated into these calculations, and discussed their performance. This will be a crucial issue for new particle searches at the Large Hadron Collider at CERN, where the exciting discoveries that may be made include the Higgs boson, supersymmetric particles, leptoquarks, pentaquarks, free quarks or magnetic monopoles, extra spatial dimensions, technicolour, the substructure of quarks and/or leptons, mini black holes, and so on. In all cases some of the backgrounds will be known only approximately, and it will be necessary to distinguish genuine signals of new physics from peaks that are merely statistical fluctuations or the result of errors.
Demortier also addressed the issues of whether it is possible to assess the significance of an interesting effect, which is obtained by physicists adjusting selection procedures while looking at the data; and why particle physics usually demands the equivalent of a 5 σ fluctuation of the background before claiming a new discovery. (The probability of obtaining such a large fluctuation by chance is less than one part in a million.)
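The bracketed probability can be checked directly; the following two-line calculation (ours, for illustration) gives the one-sided Gaussian tail beyond 5 σ:

```python
# One-sided tail probability of a Gaussian fluctuation beyond 5 standard deviations.
from scipy.stats import norm

p = norm.sf(5.0)   # survival function, 1 - CDF
print(p)           # ~2.9e-7, i.e. roughly 1 chance in 3.5 million
```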
Signal or background?
The sessions on multivariate signal–background separation resulted in positive discussions between physicists and statisticians. Byron Roe of the University of Michigan explained the various techniques that are used for separating signal from background. He described how, for the MiniBooNE experiment at Fermilab, Monte Carlo studies showed that boosted decision trees yielded good separation and coped with more than 100 input variables. An important issue concerned assessing the effect of possible systematic effects on the physical result, in this case the neutrino oscillation parameters. One of the conventional methods for doing this is to vary each possible systematic effect by one standard deviation and to see how much this affects the result; the different sources are then combined. Roe pointed out that there is much to recommend an alternative procedure, which investigates the effect on the result of simultaneously varying all possible systematic sources at random.
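As an illustration of the boosted-decision-tree approach that Roe described, the sketch below trains a generic classifier with scikit-learn on synthetic data; it is not the MiniBooNE analysis code, and all numbers and variable names are invented.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_events, n_vars = 10000, 20                      # MiniBooNE used more than 100 input variables
background = rng.normal(0.0, 1.0, size=(n_events, n_vars))
signal = rng.normal(0.3, 1.0, size=(n_events, n_vars))   # signal shifted slightly in every variable
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n_events), np.zeros(n_events)])   # 1 = signal, 0 = background

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
bdt.fit(X_train, y_train)

score = bdt.predict_proba(X_test)[:, 1]           # per-event "signal-likeness"
print("area under the ROC curve:", roc_auc_score(y_test, score))
```

In an analysis, a cut on the per-event score (or a fit to its distribution) provides the enriched sample, and the systematic-variation studies mentioned above would be repeated on that selected sample.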
Radford Neal, a statistician from the University of Toronto, took up this theme in more detail, and also emphasized the need for any statistical procedure to be robust against possible uncertainties in its input assumptions. One of Neal’s favourite methods uses Bayesian neural nets. He also described graphical methods for showing which of the input variables were most useful in providing the separation of signal and background.
Ilya Narsky of Caltech gave a survey of the various packages that exist for performing signal–background separation, including R, WEKA, MATLAB, SAS, S+ and his own StatPatternRecognition. Narsky suggested that the criteria for judging the usefulness of such packages should include versatility, ease of implementation, documentation, speed, size and graphics capabilities. Berkeley statistician Nicolai Meinshausen gave a useful demonstration of the statistical possibilities within R.
The general discussion in this sub-group covered topics such as the identification of variables that were less useful, and whether to remove them by hand or in the program; the optimal approach when there are several different sources of background; the treatment of categorical variables; and how to compare the different techniques. This last issue was addressed by a small group of participants, who spent one evening running several different classifiers on a common simulated data set. Clearly there was not the time to optimize the adjustable parameters for each classification method, but it was illuminating to see how quickly a new approach could be used and comparative performance figures produced. Reinhard Schweinhorst of Michigan State University then presented the results.
As far as the workshop as a whole was concerned, it was widely agreed that it was extremely useful to have statisticians present to discuss new techniques, to explain old ones and to point out where improvements could be made in analyses. It was noted, however, that while astrophysicists have been successful in involving statisticians in their analyses, to the extent that statisticians’ names appear on experimental papers, this is usually not the case in particle physics.
Several reasons for this have been suggested. One is that statisticians enjoy analysing real data, with its interesting problems. Experimental collaborations in particle physics, however, tend to guard their data jealously and are unwilling to share it with anyone outside the collaboration until it is too old to be interesting. This results in particle physicists asking statisticians only very general questions, which the statisticians regard as unchallenging. If we really do want better help from statisticians, we have to be prepared to be far more generous in what we are ready to share with them. A second issue might be that in other fields scientists are prepared to provide financial support for a statistics post-doc to devote his or her time and skills to helping analyse the data. In particle physics this is, at present, very unusual.
There was unanimous agreement among attendees that the Banff meeting had been stimulating and useful. The inspiring location and environment undoubtedly contributed to the dynamic interaction of participants. The sessions were the scene of vigorous and enlightening discussion, and the work continued late into the evenings, with many participants learning new techniques to take back with them to their analyses. There was real progress in understanding practical issues that are involved in the three topics discussed, and everyone agreed that it would be useful and enjoyable to return to Banff for another workshop.
POSIPOL 2006, an international workshop that was held earlier this year at CERN, was dedicated to the production of polarized positrons using the Compton back-scattering of a high-power laser beam by electrons of a few giga-electron-volts. The particular focus was on applications to the two future linear-collider studies, the International Linear Collider (ILC) and the Compact Linear Collider (CLIC). The workshop, which attracted around 50 experts from Europe, Asia and the Americas, was jointly organized by the CLIC team at CERN, the European CARE-ELAN network, the Japanese High Energy Accelerator Research Organization (KEK) and the Laboratoire de l’Accélérateur Linéaire (LAL) at Orsay. It led to a roadmap and a series of recommendations for future R&D on positron sources for linear colliders.
Polarized positrons (POSItons POLarisés in French, hence POSIPOL) are produced by bombarding a tungsten target with polarized photons. The latter are generated either from a helical undulator or from the scattering of a polarized high-power laser beam with an unpolarized high-energy electron beam. For this second scheme, requirements on the intensity of both the electron beam and the laser beam are significantly relaxed by stacking the laser beam in an optical cavity with an enhancement factor of up to 1000, and by re-using the electrons, which are stored in a so-called Compton ring. This scheme also implies the stacking of the produced positrons in a storage ring. Control of the laser system and of the high-quality optical cavity is crucial, as is the electron-beam dynamics in the presence of electron-laser collisions. The various aspects of this scheme, as well as comparisons with the undulator method, formed the main topics for the sessions at the workshop.
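As rough orientation (illustrative numbers, not design values quoted at the workshop), the maximum energy of a photon back-scattered head-on from a relativistic electron is approximately

$$E_\gamma^{\max} \simeq \frac{4\gamma^2 E_L}{1 + 4\gamma E_L/(m_e c^2)},$$

where $\gamma$ is the electron Lorentz factor and $E_L$ the laser-photon energy. For an electron beam of around 1 GeV and a laser photon of about 1 eV this gives gamma-rays of a few tens of mega-electron-volts, the range used to pair-produce polarized positrons in the tungsten target.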
Robert Aymar, CERN’s director-general, opened the workshop, and was followed by Louis Rinolfi, POSIPOL chair, who set the scene with a look at the state-of-the-art for producing polarized positrons and a reminder of the scope of the workshop. In the first overview session, Gudrid Moortgat-Pick from CERN stressed the importance of positron polarization for future linear colliders. Both a Compton source and an undulator scheme are being considered for the ILC, as described by KEK’s Junji Urakawa and John Sheppard of SLAC respectively. Frank Zimmermann of CERN presented a proposal for a Compton source for CLIC, demonstrating that the pertinent requirements are much less demanding than they are for the ILC. He also emphasized the large synergy with ongoing developments for a Compton ring for medical applications.
Several talks by Susanna Guidicci of Frascati, Alessandro Variola of LAL, and Eugene Bulyak and Peter Gladkikh of the Kharkov Institute for Physics and Technology (KIPT) discussed the beam dynamics and optics designs for Compton rings. One suggestion was to use a pulsed mode of operation for the Compton ring with a specific technique for radio frequency (RF) phase modulation. Vitaly Yakimenko of Brookhaven National Laboratory described the merits of an alternative single-pass Compton scattering approach involving a high-duty-cycle electron linac and a battery of CO2 lasers.
Several talks addressed advances in laser systems, including those by Igor Pogorelsky of Brookhaven, Brent Stuart of Lawrence Livermore National Laboratory and Sudhir Dixit of Oxford. Yoann Zaouter of Amplitude Systems highlighted the dramatic evolution of fibre lasers over the past decade. Products from Time-Bandwidth, presented by Thomas Ruchti, achieve parameters close to what is needed.
Tsunehiko Omori of KEK presented the first experimental results on polarized positron production using Compton back-scattering at KEK’s Accelerator Test Facility (ATF), which have recently been published in Physical Review Letters. The E-166 undulator experiment at SLAC has also produced results, as described by Andreas Schaelicke of DESY/Zeuthen.
Ian Bailey of the UK’s Cockcroft Institute discussed aspects of positron production, in particular targets, and described the development of a conversion target. Vladimir Strakhovenko of the Budker Institute of Nuclear Physics (BINP) presented theoretical calculations of the radiation spectrum and photo-production in a target. Robert Chehab of IN2P3/Lyon and Wei Gai of Argonne National Laboratory addressed the positron production process, in particular matching and capturing positrons downstream of the target. Masao Kuriki of KEK talked about systems considerations for the positron source at the ILC, in particular construction, commissioning and availability of undulator and Compton schemes.
In the session on equipment and diagnostics, Fabian Zomer of LAL discussed ongoing studies of non-planar optical resonators, and Peter Schueler of DESY looked at beam polarimetry issues. A final R&D session focused on optical cavities. Viktor Soskov of the Institute for High Energy Physics, Moscow, reviewed R&D on a high-finesse cavity at LAL, and Hiroki Sato of Hiroshima described R&D on an optical cavity for the ILC. Kazuyuki Sakaue of Waseda University, Tokyo, described the experimental plan for X-ray generation using a pulse-stacking optical cavity.
At the end of the workshop, POSIPOL participants identified a number of critical issues that still need to be demonstrated, both for positron sources that are based on an undulator scheme and for the alternative Compton scattering scheme. The two approaches are in principle equivalent, but there are differences in the photon spectrum, photon energies, angular photon spectrum, power on the conversion target, collimation efficiency, operational efficiency and implementation cost. For the Compton scheme, the photon and positron yields must be simulated with a realistic lattice and energy spread. Many items require further study, including laser systems, 6D positron distribution, stacking in a pre-damping ring, required RF power, Touschek lifetime, beam instabilities and the heat-load limit of optical cavities.
The workshop also agreed on a roadmap for future R&D to address the common issues in Compton and undulator sources. Noteworthy common recommendations for the two schemes concern the analysis of the systematic errors in the polarization measurements, the comparison of yields and polarization, the optimization of pre- and post-selection of positrons and the evaluation of the cost.
For the undulator scheme, the main recommendations were the publication of E-166 results, the evaluation of emittance degradation in the undulator and the technical demonstration with an undulator several metres long. For the Compton scheme, the main recommendations were the publication of the design of a Compton ring with a chicane and with optimization of the energy of the Compton photons, the development of a reliable power laser taking into account the polarization, the simulation of stacking into a damping ring, the comparison of a single-pass scheme with the ring scheme and the comparison of CO2, YAG and fibre lasers.
The workshop also addressed validating design choices and demonstrating feasibility in experiments at KEK’s ATF, Brookhaven’s ATF and the DAFNE storage ring at Frascati. In May, the KEK–ATF Technical Board approved an experimental programme of installing and operating laser pulse-stacking cavities in the ATF damping ring during 2006 and 2007. The goal is the simultaneous demonstration of the high enhancement factor that is required by POSIPOL, the small laser spot and a small beam–laser collision angle in multi-bunch operation. Optimized optical cavities from LAL may be installed later at the KEK-ATF, enabling the study of high-intensity multi-bunch gamma-ray generation by Compton scattering.
Another new project at KEK would allow accumulation experiments with electron beams. An experimental optimization for the Compton source inside a laser cavity is also foreseen at the Brookhaven ATF, and single-pass Compton collisions could be tested with the drive beam of the CLIC Test Facility 3 at CERN.
Since the POSIPOL workshop, LAL has written a letter of intent concerning R&D activities on polarized positron sources. The idea is to submit a proposal as a Joint Research Activity (JRA–POSIPOL) in the context of the European Framework Programme 7. Several institutes have already expressed their interest: LAL, INFN/Frascati, CERN, DESY Zeuthen, the Institut de Physique Nucléaire de Lyon, BINP, the National Science Center KIPT, Université-Paris-XI, KEK, Waseda University and Kyoto University.
As particle physics heads towards tera-electron-volt energies with the Large Hadron Collider, it may be surprising to find that not all valuable research requires hadron beams of the highest energy available. Indeed, the opposite can be true. Experiments on processes that involve hadrons at kilo-electron-volt or even electron-volt energies can address some unresolved questions in quantum chromodynamics (QCD), its associated symmetries such as chiral symmetry, and CPT-invariance. This quickly developing field, which connects atomic, nuclear and particle physics, as well as astrophysics, was the subject of an international workshop, Exotic Hadronic Atoms, Deeply Bound Kaonic Nuclear States and Antihydrogen: Present Results, Future Challenges, which was held at the European Centre for Theoretical Nuclear Physics and Related Areas (ECT*) in Trento on 19–24 June. The workshop brought together some 50 experts in exotic atoms and nuclei to assess the current experimental and theoretical status of the field, and identify the most relevant topics to be addressed in the future. The rich programme extended from the pionic, kaonic and antiprotonic varieties of exotic atoms to antihydrogen, and exotic nuclear clusters, better known today as deeply bound kaonic nuclei. The workshop discussed the latest results from many experiments on these exotic atoms, and outlined future plans that are based on improved experimental techniques in detectors and/or hadronic beams.
Studies of exotic atoms in which a hadron such as a π−, K− or p̄ replaces an electron can reveal important information about spontaneous chiral-symmetry breaking in QCD, which governs the low-energy interactions of the lightest pseudoscalar mesons (pions and kaons) with nucleons. Such experiments can access hadronic scattering lengths at zero energy by direct measurements of bound-state parameters, which is not possible through other experimental approaches. The Paul Scherrer Institut in Villigen has investigated pionic hydrogen (π−p) and deuterium (π−d), and DAFNE in Frascati has investigated their kaonic counterparts. Other no-less-important species include kaonic and antiprotonic helium, which have been studied at the Japanese High Energy Accelerator Research Organization (KEK) and CERN, and yet another exotic variety is formed by the non-baryonic π+π− (pionium) and πK atoms. Finally, the antihydrogen atom, p̄e+, which CERN has copiously produced, is in a class of its own owing to its importance for testing the CPT theorem to extremely high precision.
The latest kaonic hydrogen results from the DAFNE Exotic Atoms Research (DEAR) experiment, as presented by Johann Marton of SMI Vienna, gave rise to a lively discussion on the possibility of accommodating them with kaon–nucleon scattering data. More precise data and further theoretical calculations will evidently be needed in this domain. The SIDDHARTA experiment that is planned for DAFNE, which aims at 10 times higher precision on the kaonic hydrogen atom and the first measurement on kaonic deuterium, should contribute to a better understanding of the physics of the kaon–nucleon interaction at very low energies.
In the pion–pion scattering sector, the DIRAC experiment at CERN has measured pionium and yielded new values for the scattering lengths. However, the study of the kaon decaying to three pions provides a valid alternative for determining these quantities to an unprecedented accuracy. Gianmaria Collazuol, of INFN and Pisa, presented results from the NA48/2 experiment at CERN on the (a0–a2) difference of pion–pion scattering lengths with a precision of about 6%, equally shared between systematic and statistical uncertainties. This value is in agreement with various theoretical predictions that were discussed at the workshop. Further refinements in the precision of the results will need an interplay between experiment and theory, since most of the systematic error is caused by theoretical uncertainties.
Researchers at the Antiproton Decelerator (AD) at CERN are pursuing precision spectroscopy of antiprotonic helium, as Ryugo Hayano of Tokyo described. The study of the metastable three-body system p̄–e−–He²⁺ has led to the most stringent tests of the equality of the charge and mass of the proton and antiproton at a relative precision of 2 ppb and, for the first time, produced a value of the antiproton-to-electron mass ratio.
Antihydrogen, the simplest atom of antimatter, offers even higher sensitivity to violations of CPT symmetry, because the properties of its conjugate system, hydrogen, are known to very high precision (10⁻¹² for the ground-state hyperfine structure, and a few parts in 10¹⁴ for the 1S–2S two-photon transition). The ATRAP, ALPHA and ASACUSA collaborations at the AD are pursuing the formation of cold antihydrogen atoms for precision spectroscopy. Nikolaos Mavromatos of King’s College London and Ralf Lehnert of the Massachusetts Institute of Technology discussed theoretical aspects of these interesting tests of CPT and Lorentz invariance, as well as the possibility of using antihydrogen to investigate the gravitational properties of antimatter.
Finally, an important part of the workshop programme discussed a new type of exotic nucleus – the antikaon (K̄)-mediated bound nuclear system, with K̄ mesons as constituents. Soon after the theoretical prediction of such states, both KEK and the Laboratori Nazionali di Frascati reported preliminary experimental evidence for their existence, and several experiments in Europe and Japan are currently searching for them. Attendees heard a critical review of the present experimental results, followed by an extended discussion on the foundations of the predictions. A lively discussion took place between those defending the existence of these deeply bound kaonic nuclear states and others who expressed doubts about their existence. There was also a critical analysis of the conditions under which such states might exist. New experiments, studying both the formation and the decay processes of the exotic nuclei, will play a key role in clarifying this interesting physics. These include a recently accepted proposal at the Japanese Proton Accelerator Research Complex (J-PARC), the AMADEUS experiment at DAFNE and the future possibility of investigating double-K̄ systems at the Facility for Low-energy Antiproton and Ion Research (FLAIR) within the international Facility for Antiproton and Ion Research (FAIR) to be built at Darmstadt.
A round table on the deeply bound K̄-nuclear systems, led by Wolfram Weise of Munich, concluded the workshop. He stressed the importance of new experimental studies and further theoretical efforts, showing that progress is expected on one side from next-generation experiments, such as SIDDHARTA for the K̄–nucleon interaction, and on the other from realistic modelling of the nucleon–nucleon interaction and of the K̄ + nucleon + nucleon → hyperon + nucleon absorption process. Weise also discussed the interesting connection with dense matter, namely kaon condensation in high-density media.
The workshop confirmed that many fundamental and still-open questions in low-energy QCD and related symmetries can be assessed and answered with experiments in the low-energy domain, by creating and measuring new forms of matter – exotic atoms, exotic nuclei and antihydrogen. An active and growing scientific community supports the great expectations of the field.
For the first time, on 18 August, the OPERA detector in the Gran Sasso Laboratory in Italy recorded the interactions of neutrinos sent from CERN, 732 km away. During the following 11 days, OPERA had about 120 hours of beam time and identified around 300 interactions correlated with the neutrino beam, in line with predictions. This marked both a successful culmination for the beam-commissioning phase of the CERN Neutrinos to Gran Sasso (CNGS) project and a promising start to data-taking for OPERA. The ultimate aim is to observe oscillations of muon neutrinos into tau neutrinos.
The CNGS beam line incorporates a target to produce pions and kaons and a magnetic horn and reflector to focus them before they decay to muons and neutrinos in a 1 km vacuum pipe. At the end of the decay pipe, a barrier of graphite and iron absorbs the remaining hadrons, leaving only muons, which are quickly absorbed downstream in the rock, and a beam of neutrinos travelling towards Gran Sasso.
The CNGS team began gradually to commission the installation at the start of 2006, beginning with tests without beam of some 200 items of equipment. Then, from 10 July, protons from CERN’s Super Proton Synchrotron were sent step-by-step through the beam line to the target, using a beam 100 times less intense than the nominal beam. After analysis at the end of July, commissioning resumed at high intensity.
The OPERA Collaboration, comprising 170 physicists from 35 research institutes and universities worldwide, began installing their detector in the underground laboratory at Gran Sasso in 2003. The detector has two identical Super Modules, each containing a target section and a large-aperture spectrometer. The target consists of alternate walls of lead/emulsion bricks and modules of scintillator strips for the target tracker. The spectrometer, which detects muons emerging from neutrino interactions, consists of a precision drift-tube tracker, a 1.55 T magnet and resistive plate chambers (RPCs) inserted into the magnet.
During this first run, researchers verified the operation of the electronic detectors (4000 m2 of RPCs, 6000 m2 of scintillator strips and two magnets), checked the synchronization of the OPERA and CNGS clocks and tested the algorithms governing the selection of interesting events. CNGS can send at most two bunches of neutrinos every 6 s, and to reduce the background noise, the experiment has to select events exactly as a bunch passes through. The two 10.5 µs bunches are separated by 50 ms, and the design synchronization accuracy between CERN and OPERA is better than 100 ns.
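The kind of timing selection described here can be sketched as follows (an illustrative fragment, not the OPERA software; only the quoted numbers come from the text, and the function and variable names are invented):

```python
# Schematic timing selection: keep events that coincide with one of the two CNGS
# extractions.  The numbers follow the figures quoted above; everything else is invented.
BUNCH_LENGTH = 10.5e-6   # s, duration of one extraction (bunch)
BUNCH_GAP    = 50e-3     # s, separation between the two extractions
SYNC_MARGIN  = 100e-9    # s, allowed CERN-Gran Sasso clock synchronization error

def in_beam_window(event_time, extraction_time):
    """Return True if the event time (seconds, common clock) falls inside
    either 10.5 microsecond bunch window of this CNGS cycle."""
    for start in (extraction_time, extraction_time + BUNCH_GAP):
        if start - SYNC_MARGIN <= event_time <= start + BUNCH_LENGTH + SYNC_MARGIN:
            return True
    return False
```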
OPERA is now ready to enter the next phase, aimed at observing neutrino interactions in the emulsions of the detector’s 200,000 target bricks, which will be produced and installed over the coming months. The long search for tau neutrinos will start in earnest at the end of October and will last at least five years.
The world’s largest superconducting solenoid magnet, built for the CMS experiment at the Large Hadron Collider (LHC), reached full field on 28 August. In addition, elements of the detector already in place within the magnet have been successfully recording the tracks of cosmic rays as part of the magnet test and cosmic challenge (MTCC).
The CMS magnet is a marvel of modern technology. Weighing in at more than 10,000 tonnes, the magnet is built around a 6 m diameter, 13 m long superconducting coil that generates a field of 4 T and stores 2.5 GJ of energy. When it was designed in the early 1990s, it was beyond the state of the art. What makes it remarkable is not just its high magnetic field, but also the fact that the field is maintained with high uniformity over such a large volume. New techniques had to be developed to allow the solenoid coil to be more compact than 1990s technology could have achieved.
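The quoted stored energy can be cross-checked with a rough estimate (our back-of-envelope calculation, assuming the 4 T field uniformly fills the coil bore):

$$E \approx \frac{B^2}{2\mu_0}\,V = \frac{(4\,\mathrm{T})^2}{2\times 4\pi\times 10^{-7}\,\mathrm{T\,m/A}} \times \pi\,(3\,\mathrm{m})^2\,(13\,\mathrm{m}) \approx 6.4\,\mathrm{MJ/m^3} \times 370\,\mathrm{m^3} \approx 2.3\,\mathrm{GJ},$$

in reasonable agreement with the quoted 2.5 GJ.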
Construction of the magnet was approved in 1996, and began in earnest in 1998. By 2002, fabrication of the superconducting wire was complete. Winding the cable for the solenoid coil began in 2000 and took five years. By the end of 2005, the solenoid was ready for testing, and in February 2006, it was cooled to its operating temperature of 4.5 K. Following the insertion of various particle detectors, the MTCC was ready to begin at the end of July.
During these tests, which lasted until the end of August, more than 25 million cosmic events were recorded at a trigger rate of around 200 Hz. It was a big task to provide the trigger, optimize the performance of the various detector systems and ensure the data integrity. Highlights included data transfer to some Tier-1 centres of the LHC Computing Grid and fast off-line running at Fermilab. Now that the maximum field has been reached in this first phase of the MTCC, the next step is to map it with a precision of 1 in 10,000 in the space that will later be filled by the electromagnetic barrel calorimeter and tracking detectors.
• The magnet is a common project to which all 155 institutes of the CMS Collaboration have contributed financially. Major innovative and technical contributions have been made by the French Atomic Energy Commission in Saclay (CEA) for the original concept and general engineering; CERN for the project coordination, all ancillaries, and the magnet yoke and assembly; the Swiss Federal Institute of Technology (ETH Zurich) for the development and production of the compound superconductor and organization of major magnet procurement including the barrel yoke; the US Department of Energy’s Fermi National Accelerator Laboratory near Chicago for the superconducting wire and field mapping; the Italian National Institute of Nuclear Physics (INFN) in Genoa for the design and execution of the winding operation; the Russian Institute for Theoretical and Experimental Physics (ITEP) in Moscow; and the University of Wisconsin for the endcap yoke.
After years of design, construction and commissioning, the two outer detectors – the transition radiation tracker (TRT) and the semiconductor tracker (SCT) – of the inner detector barrel were moved to the ATLAS cavern from the nearby cleanroom at the end of August. The journey was only about 100 m, but it required weeks of planning and a bit of luck concerning the weather. Special measures were in place to minimize shock and vibration during transportation. Accelerometers fitted to the barrel to provide real-time monitoring recorded no values greater than 0.1 g, satisfying the transport specification for this extremely precise and fragile detector. Then, with only a few millimetres of clearance, the detector was inserted into the liquid-argon calorimeter cryostat.
The SCT and TRT are two of the three major parts of the ATLAS inner detector. The innermost layer (pixels) will be installed in 2007. The barrel part of the inner detector, containing the two outer subsystems, was assembled in February and underwent a complete characterization of its performance during tests in the spring. An eighth of the TRT and a quarter of the SCT were equipped with complete readout chains, and during testing particular attention was paid to ensuring that the SCT did not generate noise in the TRT and vice versa. The results were a triumph for the designers – mechanical engineers, electronics engineers and physicists. The two detectors, which are completely independent, can operate at thresholds close to those defined by thermal noise.
The last large component of the Large Hadron Collider beauty (LHCb) experiment has descended into the cavern on the LHC ring, and the delicate installation of the important beryllium vacuum chambers has begun. LHCb will focus on the precision measurement of CP violation and rare decays of hadrons with b quarks, with a spectrometer covering only one side of the collisions in the LHC. Beryllium was chosen for 12 m of the 19 m long beampipe to minimize the level of background in the experiment.
The 10 tonne, 18 m long metal structure known as the bridge, which will support the LHCb tracking system, was lowered into the cavern in June. This was a challenge, as there were only a few centimetres to spare as the structure was turned and moved into its final position. The bridge is made of stainless steel, which was chosen because it is only very weakly magnetic and so avoids creating interference in the experiment. It has rails onto which will slide the three stations of the silicon inner tracker and the three stations of the outer tracker, which consists of straw-tube detectors.
More recently, at the end of August, the first beryllium section of LHCb’s beam vacuum chamber was installed. The three-day operation demanded patience and precision as the first of four sections of the beampipe was connected to the vacuum vessel of the vertex locator (VeLo). This first section comprises a conical tube of 1 mm thick beryllium, nearly 2 m long, and an 800 mm diameter spherical window made from 2 mm thick aluminium alloy. The window is connected to the conical part of the beampipe through an aluminium alloy bellow, which allows mechanical alignment once the assembly is installed.
For installation, the beampipe was placed on a frame that slides over rails to move it gently into position. A wakefield suppressor was then inserted and connected electrically, and finally the spherical window was connected to the VeLo vessel using a metal seal. After installation was completed, the system was pumped down and a leak test conducted. The aim is to reach an average pressure of 10⁻⁹ millibar with the beam passing through the beampipe.
In mid-July, the ALICE Collaboration reached important milestones with the installation of the trigger and tracking chambers of the muon spectrometer. They are the first detectors to be installed in their final position in the ALICE cavern of the Large Hadron Collider.
The role of the trigger detector is to select events containing a muon pair coming, for instance, from the decay of J/ψ or Υ resonances. All of the eight half-planes of the resistive plate chambers (RPCs) are now in position behind the muon filter. The company General Tecnica fabricated the internal parts of the RPCs, which are made of bakelite, and groups from INFN Torino and Alessandria are constructing the readout chambers. The IN2P3 laboratory in Clermont-Ferrand has developed the front-end electronics and Subatech Nantes has produced the readout electronics.
At the same time, workers at ALICE have installed the first half-station of the tracking system a few metres before the muon wall. The main task of this system is to sample the trajectory of muons with a resolution better than 100 µm. It is composed of cathode-pad/strip chambers, among the first of their kind, made from composite material. Extremely thin but still very rigid, the composite material helps to minimize the scattering of the muons. INFN Cagliari, the Petersburg Nuclear Physics Institute in Gatchina, Subatech Nantes and CEA Saclay constructed the big chambers, while the Institut de Physique Nucléaire at Orsay, and the Saha Laboratory in Kolkata, India, made the smaller ones.