Smaller institutes look to gain from scientific fallout

Large laboratories obtain scientific data in vast quantities and, driven by competition, analyse this material rapidly. The majority of important results are extracted in as short a time as possible. When new data appear, older data lose their importance and are abandoned or placed at the disposal of smaller labs that can make use of them.

This has been the case in the past with data obtained at laboratories such as CERN, Fermilab and JINR, which came in such quantities that they could not be exhaustively analysed by the researchers there. The data were therefore given to various universities and other smaller laboratories, which over a long period of time have analysed the events in question and sometimes made valid discoveries.

More recently, data from the CDF and D0 experiments at Fermilab have become available via the web. A more leisurely analysis phase is also under way with data from experiments at LEP, whose activity is winding down. This gives researchers at smaller scientific institutions the opportunity to follow up the work and make new findings. For example, institutes in the “Post L3” collaboration are currently analysing some LEP data in their own time, with no obligation to provide results by a specific deadline.

The pictures made in the late 1960s with the CERN 2 m hydrogen bubble chamber show the possible importance of this approach. Its films ended up in various universities, either for further analysis or for didactic purposes, because bubble-chamber pictures are useful for students. Consequently, during the 1970s, the University of Bucharest and JINR in Dubna obtained 125,000 pictures courtesy of CERN. The pictures were found to contain a number of interesting items that had earlier been overlooked because in the principal analysis they had been viewed with different criteria in mind.

In one particular example, V M Karnauhov, V I Moroz, C Coca and A Mihul were able to report a resonance in πp interactions at 16 GeV, with a mass of 3520 ± 3 MeV/c2 and a width of 7 +20/−7 MeV, at a significance of eight standard deviations (Karnauhov et al. 1992). At the time this seemed very strange, and most physicists were not particularly interested, as the resonance corresponded to a five-quark particle (uud ccbar), which did not then fit into any theoretical framework.

During the past year, however, evidence for several exotic resonances has been reported. A real “gold rush” for similar phenomena – the “pentaquarks” – has begun, even though there are few, if any, irrefutable theoretical explanations. Their masses have not yet been calculated, due to the lack of a theoretical basis. These include the Θ* (mass 1540 MeV and width 17 MeV) and the Ξ (1862) baryon with S = −2, which have still to be established with high accuracy. They appear to be states of five quarks (pentaquarks), i.e. four quarks and one antiquark, yielding a system without colour, as is necessary for the state to be observable.

The 2 m bubble-chamber data suggested long ago that at least one more exotic baryonic state had been found, with a mass of 3520 ± 3 MeV/c2, a width of 7 +20/−7 MeV and S = 0. This was a pentaquark baryon with neutral strangeness. The essential difference between the Θ* and Ξ (1862) and what was found long ago is that the old resonance was formed by quarks including a ccbar pair, while the new ones contain s (sbar) quarks, giving a substantial difference in the final mass. Other teams have also reported possible sightings of pentaquarks in data from the 2 m chamber, and now the H1 experiment at DESY has evidence for a uuddcbar state with a mass of 3100 MeV/c2.
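As a quick consistency check of the quark assignments quoted above, a short Python sketch can add up the electric charges of the three candidate pentaquarks. The quark charges are standard; the helper function and quark lists are purely illustrative and not part of the original analyses.

```python
from fractions import Fraction as F

# Electric charges of the relevant quarks, in units of e.
# An antiquark carries the opposite charge of its quark.
CHARGE = {"u": F(2, 3), "d": F(-1, 3), "s": F(-1, 3), "c": F(2, 3)}

def charge(quarks):
    """Total electric charge of a quark combination.

    Quarks are space-separated; a trailing 'bar' marks an antiquark.
    """
    total = F(0)
    for q in quarks.split():
        if q.endswith("bar"):
            total -= CHARGE[q[0]]
        else:
            total += CHARGE[q]
    return total

# The three pentaquark candidates mentioned in the text.
print(charge("u u d d sbar"))   # Theta* (1540): charge +1
print(charge("u u d c cbar"))   # old 2 m chamber state (3520): charge +1
print(charge("u u d d cbar"))   # H1 state (3100): charge 0
```

Each state combines four quarks and one antiquark, consistent with the text: the old 3520 MeV/c2 resonance carries a hidden-charm ccbar pair, while the Θ* and the H1 state obtain their exotic quantum numbers from a single sbar or cbar.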

So what can we learn from this experience? The distribution of data to smaller institutions, which perhaps have more time to follow different or unfashionable lines of analysis, must continue. Besides the benefits that this activity can bring to the institutes themselves, the long-term process also brings fresh minds to the analysis, as younger physicists with new approaches replace older ones.

The Grid should also be able to overcome some of the difficulties of the past. It aims at providing a global computing facility, which will allow the smaller laboratories to participate in the primary research. However, the Grid is being developed to provide enormous computing power; it will not be able to provide the thinking time that is necessary for the best job to be done. This can only be provided by the researchers performing long-term analysis generally in the smaller laboratories.

OECD committee endorses future linear collider

Ministers meeting at the end of January for the Committee for Scientific and Technological Policy of the OECD (Organisation for Economic Cooperation and Development) have acknowledged the importance of ensuring access to large-scale research infrastructures in high-energy physics and of the long-term vitality of the field. The ministers also noted the worldwide consensus of the scientific community in choosing an electron-positron linear collider as the next accelerator-based facility to complement and expand on discoveries that are likely to emerge from the Large Hadron Collider (LHC) at CERN. They agreed that the planning and implementation of such a large, multi-year project should be carried out on a global basis, and should involve consultations among not only scientists but also representatives of science funding agencies from interested countries.

At their previous meeting in 1999, the ministers had endorsed the creation of the OECD Global Science Forum, which provided a useful venue for consultations among senior science policy officials and programme managers, and was a valuable mechanism for bringing together government officials with representatives of scientific communities. Now, at the January 2004 meeting, the ministers were in a position to devote their attention to the forum’s work concerning high-energy physics. In particular the ministers endorsed the statement prepared by the forum’s Consultative Group on High-Energy Physics and noted several important points that were articulated in the group’s report. These included the need to have large, next-generation facilities funded, designed, built and operated as global-scale collaborations; the need to educate, attract and train young people in the fields of high-energy physics, astrophysics and cosmology; and the need for a strong international R&D collaboration and studies of the various issues required to realize the next major accelerator facility on the consultative group’s roadmap – a next-generation electron-positron collider with a significant period of concurrent running with the LHC.

CERN strengthens links with AMS experiment

CERN and the collaboration behind the Alpha Magnetic Spectrometer (AMS) experiment have signed a new memorandum of understanding (MOU) for the execution of the experiment, which will take place not at CERN, or elsewhere on Earth, but in space. The new MOU foresees the establishment at CERN of the experiment’s Payload Operations and Control Centre, and the Science Operations Centre. CERN will also provide areas for the assembly and testing of the AMS detector, as well as offices for users and secretarial support.

AMS is a major international collaboration that is led by Sam Ting of MIT. The principal goal of the AMS experiment, which will be located on board the International Space Station, is to look for antiparticles in the primary cosmic radiation of outer space. Other objectives of the experiment include searching for dark matter and carefully analysing details of the cosmic-ray spectrum. The detector will be equipped with a powerful superconducting magnet and sophisticated detectors for precision tracking, particle identification and photon detection.

AMS has been a “recognized experiment” at CERN since 1997. The new MOU, which is a significant upgrade of the previous agreement, has a duration of five years and can be renewed.

PPARC approves new funding for UK accelerator R&D

The UK Particle Physics and Astronomy Research Council (PPARC) has approved a £21 million (~€31 million) programme of accelerator R&D for future facilities in particle physics, including a linear collider and a possible neutrino factory. This will develop the UK’s academic base in these important areas. PPARC’s investment, in partnership with the Council for the Central Laboratory of the Research Councils (CCLRC), will fund a research programme and create two new university research centres. The aim is to build on existing academic expertise and develop a strong research base in accelerator R&D, in order to enhance the UK’s position in experimental particle physics.

The two centres that are being created are the Cockcroft Institute: National Centre for Accelerator Science and the Oxford/Royal Holloway Centre. The Cockcroft Institute is being established with £7.03 million (~€10.50 million) from PPARC, in partnership with the Northwest Development Agency, and the universities of Liverpool, Lancaster and Manchester. The second centre, which will receive £2 million (~€3 million) from PPARC, is a partnership with the University of Oxford and Royal Holloway, University of London. The centres will work closely with CCLRC’s Accelerator Science and Technology Centre to create a leading capability in accelerator science in the UK.

An electron-positron linear collider has been accepted by the international particle-physics community as the next large facility that is needed, and construction could start as early as 2009. UK scientists are focusing on developing the beam delivery system, which will take the accelerated particles to the collision point.

The neutrino factory is a proposed international experiment to study neutrinos, and will rely on a beam of muons to create the neutrinos. To achieve this, a new mechanism has been proposed for cooling the muons, and the Muon Ionisation Cooling Experiment (MICE) is designed to test this principle. A collaboration of more than 150 physicists and engineers from Europe, the US and Japan would like to build and test a section of a realistic cooling channel on a beamline, which could be constructed on the ISIS accelerator at CCLRC’s Rutherford Appleton Laboratory. The funding for MICE is at present only provisional, and depends on the project passing through some further review procedures.

The W and Z particles: a personal recollection

The decade between 1967 and 1976 witnessed an impressive sequence of experimental and theoretical discoveries that changed the vision we had of the world – from the prediction of electroweak unification in the lepton sector (1967) and the discovery of scaling in deep-inelastic electron scattering (1969), to asymptotic freedom and quantum chromodynamics (1973) and the discoveries of the J/ψ (1974) and naked charm (1976). By 1976 the Standard Model of particle physics was in place, ready to confront experiments, and it was clear that a new accelerator was required to explore the electroweak unification sector. This is where the weak gauge bosons, W and Z, were expected, with approximate masses of 65 and 80 GeV/c2, respectively. The arguments for the future LEP machine were already strong.

I remember being asked by John Adams (then executive director-general of CERN) to convene the Large Electron Positron Collider (LEP) study group in April 1976, and to edit the report. In practice this meant learning from theorists John Ellis and Mary K Gaillard all the beautiful new physics that was waiting for us, putting together some documents on the feasibility of the machine (which were available following Burt Richter’s seminal paper), and wrapping it all up as quickly as possible together with some bread-and-butter experimental comments. It took only seven months to get it all done to the satisfaction of Adams, who wanted to push the LEP project in the wake of the success of the Super Proton Synchrotron (SPS), which was about to start operation.

The proton-antiproton choice

The situation in 1976 sets the context in which the proton-antiproton decision was made. The pressure to discover the W and Z was so strong that the long design, development and construction time of the LEP project left most of us, even the most patient, dissatisfied. A quick (but hopefully not dirty) look at the new bosons would have been highly welcome. But when proton-proton colliders such as the Superconducting Intersecting Storage Rings (SCISR) were proposed in this spirit, they were “killed in the egg” by the management at CERN, with the argument that they would delay – or even worse, endanger – the LEP project. This was accepted as a serious argument even by the proponents of such colliders.

The same argument did not apply to the proton-antiproton collider as it did not require the construction of a new collider ring and could be proposed as an experiment. One might object that this sounds like a bad joke, because it implied the construction of an antiproton source, and that turned out later to include a collector/accumulator accelerator complex (AC/AA).

However, it remains true that the existence of the SPS, which was soon shown to perform extremely well, was obviously an essential element of the success of the proton-antiproton project, for which John Adams has to be credited. It is also true that he found it hard to swallow that his newborn baby should be tinkered with at such a young age and turned into a collider that had only a small chance of working. This was indeed the feeling of the vast majority of machine experts at the time, and much of Carlo Rubbia’s merit is that he pushed his ideas for the proton-antiproton collider with an untiring determination in such an adverse climate. Indeed, he pushed not only with determination but also with a clear vision of what his proposals would lead to, and with a deep understanding of the machine-physics issues at stake.

A threat from Fermilab

Another argument also made it possible for the proton-antiproton project to break the LEP taboo. If CERN did not buy Carlo’s idea, it was most likely that he would sell it to Fermilab. This threat was clear and had a great deal of weight when the decision was made at CERN. Despite the fact that the Fermilab machine was not performing well enough at the time to be used as a proton-antiproton collider, the threat very effectively accelerated the well known sequence of events that followed the publication in 1976 of the paper by Carlo Rubbia, Peter McIntyre and David Cline. In 1977, after the proposal had been made to CERN and Fermilab to produce the W and Z with existing machines, a feasibility study was undertaken by Franco Bonaudi, Simon Van der Meer and Bernard Pope that led to the Antiproton Accumulator (AA) design. At the same time a detector study was initiated under Carlo that led to the UA1 design, and the Initial Cooling Experiment (ICE) was proposed to the SPS Committee. The success of ICE was demonstrated in June 1978 and the approval for the UA1 detector followed immediately. Only six months later UA2 was also approved.

I strongly believe that if it had not been for Carlo, there would have been no proton-antiproton collider physics in the world for a long time, maybe forever. Whether the weak bosons would have been discovered at LEP, at the Stanford Linear Collider (SLC), or at some other collider is another matter, but it would have taken another six years at least. One might argue that six years is not really that long, but the top quark would not have been discovered either (other than indirectly from radiative corrections at LEP), nor would we have learned from the vast and rich amount of strong and electroweak physics data that have been collected at the SPS and Tevatron colliders – not to mention the low-energy LEAR physics, antihydrogen, glueballs, CP violation, antiprotonic helium atoms, etc.

The influence of the CERN ISR

I would like to say a word here about the CERN ISR and the seminal role that they played in the success of the proton-antiproton project. The ISR were the world’s first hadron collider. This was the machine on which the young generation of machine physicists who designed, built and operated the antiproton source and the proton-antiproton collider (and later on, maybe to a lesser extent, LEP) gained their experience and their expertise. It worked superbly, exceeding its design goals in both energy and luminosity. It is the machine on which Van der Meer’s ideas on stochastic cooling were tried for the first time, where they were studied and understood. It is also the machine with which a generation of physicists learned how to design experiments at hadron colliders.

When the first ISR experiments were being designed the strong interaction was still a complete mystery; when the machine was finally shut down QCD was in place. I do not mean to say that it is ISR physics that has taught us about QCD, but it contributed to the development of several of its ideas. ISR physics has helped us greatly in drawing a clear picture of hadron collisions, without which we would not have been able to design so effectively the UA experiments at CERN, and CDF and D0 at Fermilab. We, in UA2, were particularly indebted to the ISR, where many of us had previously been working and for whom this experience was an essential asset in designing a good detector.

I would also like to recall the extraordinary concentration of outstanding talents that the proton-antiproton project succeeded in attracting. One reason was of course that between the SPS and LEP projects – one completed and the other as yet unborn – its timing was in some sense ideal. But the other reason, possibly more important, was the challenging nature of the project, which attracted extremely bright engineers and physicists, both machine physicists and particle physicists.

The challenge of designing, constructing and assembling the antiproton source and detectors, and of getting them to work in such a short time, was enormous; as was that of digging and equipping the large experimental halls required for housing the new detectors that had to be alternately rolled in and out between collider and fixed target periods; and that of transforming the SPS into a collider. The amount of ingenuity that went into all these achievements was truly outstanding.

My best memory of those times may indeed be the good fortune I had to work with so many talents, and, in the case of UA2, to enjoy collaborating with such bright colleagues, senior physicists, postdocs, students or physicists of the same generation as mine.

The UA1/UA2 competition

The competition between UA1 and UA2 was real and lively, but relatively unimportant; it was more a kind of game, and we had a lot of fun playing it. There was no doubt that Carlo was the king of the proton-antiproton kingdom and was recognized as such by all of us. Undoubtedly, he would have had to take the blame if the proton-antiproton project had been a failure, but as it turned out to be a success he deserved to take the fame.

Personally, I had been working in Carlo’s group for six years or so, mostly on K physics. I had joined him as a postdoc in the mid-1960s, coming from nuclear physics, and I had learned from him the basis of experimental particle physics. I had always been impressed by his brightness, by the readiness of his mind and by his far-reaching vision; and I respected him, as I do today, as someone of a clearly outstanding stature. To respect him as the king did not mean to belong to his court, however, and we in UA2 were particularly keen on finding occasions when we could proclaim that: “The king was naked.” Such occasions were very rare – the king was usually dressed splendidly – so they were all the more enjoyable.

The design of the UA2 detector was a success and its construction and running-in went extremely smoothly. We were rightly proud of it. For only one-third the cost of UA1 – a condition of our approval was that UA2’s cost should be significantly lower – we managed to build a detector that was ready on time, that saw the W and Z as soon as the collider luminosity made it possible (and at the same time as UA1 did), that measured the W and Z masses more accurately than UA1, and that was better than UA1 at detecting and measuring hadron jets. It was easier to design UA2 than UA1 because UA2 did not have to be a multi-purpose detector and could afford simply to ignore some of the physics, in particular to be blind to muons. While the main asset of the UA1 detector was its central detector, that of UA2 was its calorimetry.

One difficulty in the design process had been judging how well the machine would perform, how long it would take to get going, and how noisy and hostile an experimental environment had to be expected. Sam Ting’s detector (which was ultimately not approved) could have run in almost any background conditions, but could only see muons; the UA1 central detector required very clean conditions; UA2 was somewhere in between.

Expectations exceeded

The collider turned out to be an exceedingly clean machine and we had grossly underestimated how fast its luminosity would increase. In particular we had left an open wedge in our calorimeter, instrumented with a magnetic spectrometer, to do quietly (so we thought) some exploratory measurements while the machine was being tuned and run in. The wedge did not stay open very long, for the performance of the machine progressed at high speed, and we were able to tackle the first high-luminosity run with full calorimetric coverage.

I do not wish to repeat here the oft-told stories about the first seminars and the first publications reporting the UA1 and UA2 discoveries of the weak bosons, but I wish to comment on how we perceived these events. As I have already said, we were all expecting to see the weak bosons, we had no competition to fear from other laboratories and there was no question of UA2 “scooping” UA1 in the sense of stealing a Nobel prize or whatever. There was no doubt in our minds that Carlo and of course Simon deserved the whole credit for the success. The real outstanding achievement was the production of the weak bosons, not their detection. Without Carlo and Simon there would have been no proton-antiproton collider, but without UA1 and UA2 there would have been other experiments that would undoubtedly have done as good a job. The success of UA2 was largely due to the quality of the many physicists who had worked together very efficiently, with an excellent team spirit, and it was impossible to single out a few of them as deserving a larger part of the credit.

Of course there was competition; we enjoyed being faster or more clever than UA1 whenever we could afford to be, as when we were the first to report, at the 1982 Paris Conference, the observation of very clear hadron jets, a breakthrough in the history of strong interaction physics. But this was not the dish, it was just the spices. The dish was serious business. It was reporting to the physics community what we had been finding. It was writing papers that would stay forever as important documents in the history of science.

In retrospect I am proud we resisted the pressure that was exerted on us to publish faster than we thought we should. It would have been stupid and childish to give in, and would not have shown much respect for science. In fact this pressure made us almost overreact and, in the case of the Z, it caused a delay of nearly two months between the UA1 and UA2 publications, because we preferred to wait for the imminent new run and collect more statistics before publishing. There was virtually no dissenting opinion in UA2 – we all felt quite strongly about it. In particular the wiser and more experienced members of the collaboration (I mean the generation before mine) gave their full support to this line.

It is obvious today that there would have been no point in making a fuss about an event detected in 1982 that was most likely a Z, but in which one of the decay electrons escaped identification because it hit a coil of our forward spectrometer magnets. We were wise to wait for more statistics before publishing the Z results. The issue at stake was not to bet on the truth, but to behave as if we were the only experiment.

Scientists of my generation are very fortunate to have witnessed such amazing progress in our understanding of nature, in phase with our own scientific life. It is remarkable that this has not only been the case in particle physics but also, and maybe to an even greater extent, in astronomy and life sciences. While many questions remain unanswered in each of these three fields, none can be put aside any longer as being a mystery inaccessible to science. Our vision of the world has changed dramatically. Having had an opportunity to contribute to this progress, however modest our contribution may have been, is very good fortune. May science be smiling on the next generation as kindly as it did on us, with the new physics that the LHC will soon reveal.

Further reading

www.nobel.se/physics/laureates/1984

Apart from the Nobel lectures of Rubbia and Van der Meer, the interested reader may consult a list of relevant references in John Krige, History of CERN, volume III, chapter 6 (Elsevier, Amsterdam, 1996).

• This article is based on a talk given at the symposium held at CERN in September 2003, “1973: neutral currents, 1983: W± and Z0 bosons. The anniversary of CERN’s discoveries and a look into the future.” The full proceedings will be published as volume 34 issue 1 of The European Physical Journal C. Hardback ISBN 3540207503.

Perspectives for nuclear-physics research in Europe

In Vienna, Austria, in December 2001 the Nuclear Physics European Collaboration Committee (NuPECC) started to prepare a new long-range plan for nuclear physics in Europe. NuPECC’s goal was to produce “a self-contained document that reflects on the next five years and provides vision into the next 10-15 years”. The previous long-range plan had been published as a report, “Nuclear Physics in Europe: Highlights and Opportunities”, in December 1997.

NuPECC first defined the active areas of nuclear physics that were to be addressed. Working groups were formed, spanning all the subfields of nuclear physics and its applications: nuclear structure; phases of nuclear matter; quantum chromodynamics; nuclear physics in the universe; and fundamental interactions and applications. Convenors for each of these groups were appointed and two liaison members of NuPECC were assigned to each of them. The working groups were then asked to provide recommendations for possible future directions and a prioritized list of the facilities and instrumentation needed to address them.

The next step in the process was a town meeting, organized at GSI Darmstadt on 30 January – 1 February 2003, to discuss the long-range plan. Prior to this, the preliminary reports of the groups had been posted on the NuPECC website. The town meeting was well attended with around 300 participants, including many young scientists, and the following summarizes the general trends and exciting ideas about modern nuclear physics that were presented at the meeting and given in the report.

Progress in nuclear research

At a deeper level, nuclear physics is the physics of hadrons. Here, recent developments in lattice quantum chromodynamics (QCD) calculations have raised a great deal of interest in hadron spectroscopy. According to QCD, gluon-rich hadrons can be formed, as well as hybrid states combining quark and gluonic excitations. There is also interest in quark dynamics, since in hadrons the polarization of gluons and the orbital angular momentum of quarks play an important role, together with a large transverse quark polarization. Nowadays the measurement of generalized parton distributions – which generalize the usual distributions describing the momentum or helicity distributions of the quarks in the nucleon – receives much attention, as the measurements will improve our knowledge of the structure of the hadron. Quark confinement and the study of phenomena in the non-perturbative regime of QCD will be addressed in future.

Phase transitions of nuclear matter are being investigated in two regimes: at the Fermi energy, where a liquid-gas phase transition is expected, and at very high energies and/or densities, where a quark-gluon plasma (QGP) is expected. In the first phase transition, interesting isospin effects turn out to play a role in the formation of exotic isotopes, whereas at the second phase transition the deconfinement of quarks is expected at very high temperatures, and colour superconductivity at low temperatures and very high densities.

A long-term and fundamental goal of nuclear physics is to explain low-energy phenomena starting from QCD. In a first step, the connection could be made through QCD-motivated effective field theories. This should go hand in hand with experimental investigations that allow tests of these models. Recent developments have revived interest in nuclear structure: besides improved equipment and refined detection methods, it is now possible to use exotic beams of unstable nuclei. Furthermore, thanks to the increase in computing capacity, ab initio calculations with two- and three-body forces can be performed up to mass 12. Experimentally, it is now possible to broaden the field of research from the roughly 300 stable nuclei to the approximately 6000 atomic nuclei that are predicted to exist. This means that a number of questions can now be addressed, such as what happens in extreme conditions of the neutron-to-proton (N/Z) ratio, at high excitation energy, at extreme angular momentum, or at very heavy mass – that is, at considerably more extreme conditions than those investigated so far. Phenomena to be addressed here include neutron halo structures, super-heavy elements, new magic numbers, hyperdeformation and many other exotic forms of atomic nuclei.

In the past 20 years nuclear astrophysics has developed into an important subfield of nuclear physics. It is a truly interdisciplinary field, concentrating on primordial and stellar nucleosynthesis, stellar evolution, and the interpretation of cataclysmic stellar events such as novae and supernovae. It combines astronomical observation and astrophysical modelling with research into meteoritic anomalies, and with measurements and theory in nuclear physics. With the use of new methods, as well as the availability of radioactive-ion-beam (RIB) accelerators, astrophysically relevant nuclear reactions are already being measured. In future, this research will be intensified with the new generation of RIB facilities.

In the past, research on symmetries and fundamental interactions (and the physics beyond the Standard Model) has made large steps with the development of techniques that facilitate precision measurements. In this subfield, research on the properties of neutrinos (mass measurement), time-reversal and charge-parity violation (through measurements of electric-dipole moments of molecules, atoms and nucleons as well as correlations between electrons and neutrinos in ß-decay), and the determination of fundamental constants, is in progress.

Finally, there has been progress in the applications of nuclear-physics techniques and methods. These cross over into several disciplines, such as life sciences, medical technology, environmental studies, archaeology, future energy supplies, art, solid-state and atomic physics, and civilian safety.

Research facilities

Several new research facilities are now being developed or built. The most ambitious is the International Accelerator Facility for Beams of Ions and Antiprotons (IAFBIA) at GSI in Darmstadt (see IAFBIA box), which will be available for experiments after 2010. Nuclear-structure and related studies at extreme N/Z ratios require RIB facilities, which can be realized either by the in-flight fragmentation (IFF) technique, as planned for the IAFBIA, or by the isotope-separation online (ISOL) method (see figure 1). In Europe there is a plan to build the European ISOL (EURISOL) facility, which would be ready after 2013. Intermediate to this are the ISOL facilities already operational at CERN, GANIL and Louvain-la-Neuve, the REX-ISOLDE and SPIRAL2 upgrades, and the future facilities SPES in Legnaro and MAFF in Munich.


Recommendations

The first of NuPECC’s recommendations is to exploit fully the existing, competitive lepton, proton, stable-isotope and radioactive ion-beam facilities and instrumentation. In addition to their physics-research potential, they will serve as important training sites and as facilities where major beam-production development and detector R&D can be performed in the next 5 to 10 years. In its previous long-range plan, NuPECC gave high priority to the ALICE experiment at CERN, which has an extensive programme to investigate the quark-gluon plasma (QGP) within the large and active heavy-ion programme at the Large Hadron Collider (LHC). A huge European effort is already under way to build the ALICE detector in time for the LHC. In accordance with the high priority given to ALICE in the previous long-range plan, NuPECC strongly recommends its timely completion to allow early and full exploitation at the start of the LHC.

Support of the university-based nuclear-physics groups, including their local infrastructure, is seen by NuPECC as essential for the success of the programmes at the present facilities and at future large-scale projects. Furthermore, NuPECC recommends that efforts should be taken to strengthen local theory groups in order to guarantee the development needed to address the challenging basic issues that exist or may arise from new experimental observations. NuPECC also recognizes the positive role played by the ECT* centre in Trento in nuclear theory, especially in its mission of strengthening unifying contacts between nuclear and hadron physics. In addition, NuPECC recommends that efforts to increase literacy in nuclear science among the general public be intensified.

Priorities for the future

The specific recommendations and priorities follow on from the new experimental facilities and advanced instrumentation that have been proposed, or are under construction, to address the challenging basic questions posed by nuclear science. NuPECC supports, as the highest priority for a new construction project, the building of the IAFBIA. This international facility (see IAFBIA box) will provide new opportunities for research in the different subfields of nuclear science. Envisaged for producing high-intensity radioactive ion beams using the IFF technique, the facility is highly competitive, even surpassing in certain respects similar facilities that are either planned or under construction in the US or in Japan. With the experimental equipment available at low and high energies, and at the New Experimental Storage Ring with its internal targets and electron collider ring, the facility will be a world leader in research in nuclear structure and nuclear astrophysics, in particular for research performed with short-lived exotic nuclei far from the valley of stability. The high-energy, high-intensity stable heavy-ion beams will facilitate the exploration of compressed baryonic matter with new penetrating probes. The high-quality cooled antiproton beams in the High-Energy Storage Ring, in conjunction with the planned detector system, PANDA, will provide the opportunity to search for the new hadron states that are predicted by QCD, and to explore the interactions of the charmed hadrons in the nuclear medium. In short, this facility is broadly supported since it will provide almost all fields of nuclear science with new research opportunities.

After the construction of the IAFBIA, NuPECC recommends as the next highest priority the construction of the advanced ISOL facility, EURISOL. The ISOL technique for producing radioactive beams is clearly complementary to the IFF method. First-generation ISOL-based facilities have produced their first results and have been shown to work convincingly. The next-generation ISOL-based RIB facility, EURISOL, aims to increase, beyond 2013, the variety of radioactive beams and their intensities by orders of magnitude over what is available at present, for various scientific disciplines including nuclear physics, nuclear astrophysics and fundamental interactions. EURISOL will employ a high-power (several MW) proton/deuteron (p/d) driver accelerator. A large number of possible projects, such as a neutrino factory, an antiproton facility, a muon factory and a neutron spallation source, could benefit from the availability of such a p/d driver, and synergies with closely and less closely related fields of science are abundant. Given the wide interest in such an accelerator, NuPECC proposes joining with other interested communities to carry out the Research and Technological Development (RTD) and design work necessary to realize the high-power p/d driver in the near future.

NuPECC also gives a high priority to the installation at the Gran Sasso underground laboratory of a compact, high-current 5 MV accelerator for light ions, equipped with a high-efficiency 4π array of germanium detectors. Such a facility will enhance the uniqueness of the present facility at Gran Sasso, and its potential to measure astrophysically important reactions down to relevant stellar energies.

On a longer timescale, the full exploration of non-perturbative QCD, e.g. unravelling hadron structure and performing precision tests of various QCD predictions, will require a high-intensity, high-energy lepton-scattering facility. NuPECC considers physics with a high-luminosity multi-GeV lepton-scattering facility to be very interesting and of high scientific potential. However, the construction of such a facility would require worldwide collaboration, so NuPECC recommends that the community pursues this research from an international perspective, incorporating it into any existing or planned large-scale facilities.

To exploit the current and future facilities fully and most efficiently, advanced instrumentation and detection equipment will be required to carry on the various programmes. The AGATA project for the construction of a 4π array of highly segmented germanium detectors for γ-ray tracking will benefit research programmes in the subfields of nuclear science at the various facilities in Europe. NuPECC gives its full support to the construction of AGATA, and recommends that the R&D phase be pursued with vigour.

• For more information about NuPECC, see www.nupecc.org.

The importance of funding outreach


Science and technology play an increasingly important role in our everyday lives, and many of life’s decisions now depend on some sort of scientific or technical knowledge. At the same time, advances in modern science occur quickly as each subject evolves and entirely new subjects are created, so it is often difficult for the general public and for teachers to keep up with scientific discoveries and technological innovations. However, science can be made more accessible and interesting to students, teachers and the public if they are exposed to the exciting ideas and discoveries of the latest research, for example in the field of high-energy particle physics.

Research in particle physics involves advanced technology, such as the large-scale use of superconductivity, precision particle detectors, and state-of-the-art electronics and computing systems. The technology of particle accelerators and detectors can also be applied to medicine and many other areas of science and industry, bringing alive the “appliance of science” to everyday life. Moreover, research has led to advances in information technology, such as the World Wide Web, which can bring about a “high tech” approach to learning about science. Aspects of classical physics, such as electromagnetism, optics and kinematics, can also be given a new lease of life through examples from modern physics, as compared with traditional teaching.

It is now generally agreed that education and awareness in science have to be strengthened in modern society. Indeed during the past few years increasing efforts have been made to improve awareness in the general public – especially young people in schools – of the importance of natural science to everyone. Scientific outreach, which promotes awareness and an appreciation of current research, has become an essential task for the research community and for many scientists.

As a result of an increased awareness of the importance of outreach activities, the European particle-physics community created the European Particle Physics Outreach Group (EPOG) in 1997 to promote outreach activities in particle physics. EPOG members represent the particle-physics communities of the 20 member states of CERN and, more recently, the US, together with the major laboratories of CERN, DESY and INFN. The group has received its mandate from the High Energy Particle Physics (HEPP) division of the European Physical Society (EPS) and the European Committee for Future Accelerators (ECFA). EPOG aims to help make scientific results and discoveries accessible to schools and the general public, and to introduce modern science into the school curricula.

Since its inception, the members of EPOG have both learned from each other and worked together on joint activities. There are many particle physicists active within their own countries who are working on a variety of initiatives, such as the development of new teaching materials, the translation of materials, workshops and masterclasses for both students and teachers, and visits to CERN. This work is often undertaken on a voluntary basis, with little or no official funding, and is dependent on the goodwill of the hardworking contributors and their institutions.

To be really successful though, outreach activities have to be done in a professional manner. Leading scientists who have a specialist knowledge in their subjects can form a powerful team with educators and those familiar with modern techniques in disseminating information to large groups of people. However, as we are competing with television and other leisure pursuits, outreach activities also require proper funding to be able to produce an attractive and engaging image of the natural sciences.

For this reason EPOG, together with ECFA and the EPS HEPP division, has written to a number of science research councils and other funding bodies in various countries to encourage them to recognize scientific outreach as an important and natural part of the research process, and to make financing available to the scientists for professional outreach activities. As we say in the letter, we realize that in some countries the importance of scientific outreach activities has already been recognized and is regarded as a natural part of the research activity. A particularly good example is the awareness in the US, which has resulted in organized funding. In many other countries, however, this is still not the case, and we believe that proper funding is crucial for an increased interest in and awareness of science and technology.

In summary, it is important that outreach activities are taken seriously by the bodies that fund our research. They should be recognized as a natural and logical part of research, and as an important link between research and society. With appropriate funding we could have the opportunity to make our mark and, who knows, to make a real difference.

Le miroir aux neutrinos (The Neutrino Mirror)

by François Vannucci, Odile Jacob. ISBN 2738113311, €23.50.


Neutrinos have excited scientists since 1930 and have allowed some important discoveries: Gargamelle’s 1973 observation of neutral currents in fact constituted the first manifestation of the Z boson, and as such marked the experimental foundation of the Standard Model. More recently, the beautiful phenomenon of neutrino oscillations has demonstrated that the Standard Model needs to be enlarged to account for neutrino masses. In a nutshell, neutrinos are in the spotlight.

For this reason it is very pleasant to see one of our colleagues undertake to communicate to a broad public his enthusiasm and excitement for these particles that are so hard to detect. The “mirror” through which Vannucci invites us to discover these neutrinos is, in the end, that of his own personality. The reader finds a typically French character, profoundly cultured, who revels in the company of literary quotations that mirror his thoughts and enrich them with a touch of melancholic beauty. Marcel Proust and Oscar Wilde top his list of favourite authors, which extends from Saint Augustine to Daniel Pennac, via Jean-Paul Sartre and the medical dictionary. Sometimes a schoolboy’s wink, and often a sensuous shiver, express themselves through these quotations – testimony to the fact that science speaks not only to the brain but also to the heart. I am not sure that I have grasped what these quotations are supposed to explain, but they certainly carry a form of emotion.

The book tells the story of neutrinos, at a level that is meant to be accessible to pupils in the final years of high school (15-18 years old), as well as scientifically cultivated adults. It begins with a discussion of perception and detection, first of ordinary objects and then of particles. Then we arrive at Pauli and his “radioactive ladies and gentlemen”, followed immediately by UA1 and the discovery of the W. (Sartre and Le Verrier are quoted…but no word of Carlo Rubbia. This will soothe the feelings of all those who felt they should have appeared.) Then we go back to the experiments to measure the neutrino mass followed by neutrinoless double-beta decay, and the detection of the first neutrino interactions by Fred Reines. As one can see, the experiments that have established the properties of neutrinos are listed thematically and not necessarily historically, something that I appreciated.

With occasional irony towards his colleagues (or himself?), Vannucci takes us around the experiments that made history in neutrino physics; those that were right and those that were wrong, those that made us understand and those that got us confused. This is followed by a discussion on uncertainties and the scientific method. I am not sure I agree fully when what we don’t know yet but are striving to know and will hopefully understand (“the big bang cannot be considered a physical event”), is compared with medieval legends (“angels, archangels and cherubim of the middle ages”). However, do read carefully and you will find the definition of the “miroir aux alouettes”, which inspired the title of the book and is taken from a quotation in…a dictionary.

It is not obvious for whom this book is best suited. For whom would I buy it? It seems more for our fathers – and mothers – or our colleagues than for teenagers, who may be discouraged by the unlikely mix of literature and science.

The Global Approach to Quantum Field Theory

by Bryce DeWitt, Oxford University Press (vols I and II). Hardback ISBN 0198510934, £115 ($230).


It is difficult to describe or even summarize the huge amount of information contained in this two-volume set. Quantum field theory (QFT) is the most basic language in which to express the fundamental laws of nature. It is a difficult language to learn, not only because of its technical intricacies but also because it contains so many conceptual riddles – even more so when the theory is considered in the presence of a gravitational background. The applied field-theory techniques used in concrete computations of cross-sections and decay rates are scarce in this book, probably because they are adequately explained in many other texts. The driving force of these volumes is to provide, from the beginning, a manifestly relativistic-invariant construction of QFT.

Early in the book we come across objects such as Jacobi fields, Peierls brackets (as a replacement of Poisson brackets), the measurement problem, Schwinger’s variational principle and the Feynman path integral, which form the basis of many things to come. One advantage of the global approach is that it can be formulated in the presence of gravitational fields. There are various loose ends in regular expositions of QFT that are clearly tied in the book, and one can find plenty of jewels throughout: for instance a thorough analysis of the measurement problem in quantum mechanics and QFT, something that is hard to find elsewhere. The treatment of symmetries is rather unique. DeWitt introduces local (gauge) symmetries early on; global symmetries follow at the end as a residue or bonus. This is a very modern point of view that is spelt out fully in the book. In the Standard Model, for example, the global symmetry (B-L, baryon minus lepton number) appears only after we consider the most general renormalizable Lagrangian consistent with the underlying gauge symmetries. In most modern approaches to the unification of fundamental forces, global symmetries are quite accidental. String theory is an extreme example where all symmetries are related to gauge symmetries.

There are many difficult and elaborate technical areas of QFT that are very well explained in the book, such as heat kernel expansions, quantization of gauge theories, quantization in the presence of gravity and so on. There are also some conceptually difficult and profound questions that DeWitt addresses head on with authority and clarity, including the measurement problem mentioned previously and the Everett interpretation of quantum mechanics and its implications in quantum cosmology. There is also a cogent and impressive study of QFT in the presence of black holes, their Hawking emission, the final-state problem for quantum black holes and a long etcetera.

The book’s presentation is very impressive. Conceptual problems are elegantly exhibited and there is an inner coherent logic of exposition that could only come from someone who had long and deeply reflected on the subject, and made important contributions to it. It should be said, however, that the book is not for the faint hearted. The level is consistently high throughout its 1042 pages. Nonetheless it does provide a deep, uncompromising review of the subject, with both its bright and dark sides clearly exposed. One can read towards the end of the preface: “The book is in no sense a reference book in quantum field theory and its applications to particle physics…”. I agree with the second statement but strongly disagree with the first.

Das große Stephen Hawking Lesebuch, Leben und Werk (The Big Stephen Hawking Reader)

by Hubert Mania (ed.), Rowohlt Verlag. Hardback ISBN 3498044885, €17.90.


The Big Stephen Hawking Reader includes excerpts from books written by Hawking, as well as information about his life and work. This naturally divides the book into two parts: the first half is a short biography of Hawking interspersed with sections explaining the basic physics of his work. In this way it not only introduces Hawking himself, but also his thoughts and ideas.

Mania admits in the prologue that he wrote the biography from a “respectful distance”, honouring Hawking’s wish to be remembered for his work and not his “involuntary presence in the gossip columns”. Because of this, Mania sometimes leaves out things that could shed a less favourable light on Hawking. For example, Hawking’s treatment of his first wife is only mentioned very briefly. Nevertheless there are some nice anecdotes about Hawking, such as when he was thinking about A Brief History of Time. “If he was going to neglect his research to write a popular book, then it should be profitable for him.”

The second half of the book is made up of excerpts from A Brief History of Time, The Illustrated A Brief History of Time and Einstein’s Dream. The chapters are well chosen and understandable with the help of Mania’s comments.

Even if Mania’s book is sometimes a little sketchy, I enjoyed reading it and would recommend it to anyone who wants a short introduction to Stephen Hawking’s life and work – and it whets the appetite for more books about and by this well known scientist.
