An update on particle physics and closely related topics in Switzerland was presented at the University of Zurich in March to a subpanel of the European Committee for Future Accelerators, RECFA. Members were welcomed to the meeting by the university’s rector, Hans Weder. In his opening address, Professor Weder emphasized his belief in the importance of basic research. He said that the University of Zurich intends to be among the very best in basic research. As a theologian, he found particle physics “a most fascinating human endeavour”. He concluded: “You may be proud of your contribution to human culture.”
The position of the Swiss Government was described by Charles Kleiber, Secretary of State for education and science, who declared that “Switzerland believes in CERN” (see below). An overview of particle physics in Switzerland was given by Claude Amsler, followed by several talks on the various Swiss activities in particle physics, in Europe and elsewhere, as well as a few contributions on spin-offs, such as medical applications of particle physics techniques.
RECFA delegates were impressed by the extent and quality of the activities. Switzerland, in spite of being a small country, is almost omnipresent at CERN. Swiss scientists are active in a large number of experiments all the way from the lowest energies – experiments with antihydrogen – to the highest-energy experiments preparing for the Large Hadron Collider (LHC). There is also a very strong community of theoretical particle physicists in Switzerland.
An interesting recent development in Switzerland concerns a proposal by the Forum of Swiss High Energy Physicists to construct a dedicated Swiss facility to meet the challenges of LHC computing. This is to be situated at the Swiss Center for Scientific Computing in the Italian-speaking Canton of Ticino. Switzerland also benefits from a large multidisciplinary national laboratory, the Paul Scherrer Institute (PSI). In addition to being a research laboratory, the PSI enables Swiss physicists to engage in activities beyond those that are possible at universities, such as building large equipment and having access to test-beams.
Are there then no clouds on the horizon for Swiss particle physics? The funding system is complicated, and post-doctoral fellows are expensive and not easy to find. However, Swiss particle physicists seem to have found their way through the funding labyrinths, and RECFA was pleased to find the Swiss particle physics community so strong and dynamic.
In telling RECFA delegates that Switzerland believes in CERN, the Swiss secretary of state for education and science, Charles Kleiber, said: “CERN is the world’s focal point for high-energy physics and therefore an invaluable asset for research in this field. Moreover, member states have invested heavily in CERN and it would simply be a waste of money not to continue to use it to the maximum extent possible. CERN motivates young students to study physics and serves as a first-class learning site by offering excellent training possibilities for the next generation of physicists. CERN is also ‘la part de rêve’ which is so necessary today. CERN disposes of motivated and competent personnel with an excellent record of success, but working also with great passion – and sometimes under very difficult conditions – on the future of CERN. Let’s protect and take advantage of the human resources available.”
Charles Kleiber is the new head of Switzerland’s delegation to CERN Council.
In the spring of 1960, CERN’s proton synchrotron (PS) was delivering its first beams. In the middle of this critical phase for European particle physics, CERN’s director-general, Cornelis Bakker, was killed in an aeroplane accident. Although CERN’s governing Council acted swiftly by appointing John Adams as acting director-general, this step necessarily prolonged the period that in retrospect may be characterized by the dominance of brilliant accelerator scientists.
At the same meeting in June 1960 which confirmed Adams’ appointment, the “modern” structure of research committees with at least as many members from outside as inside the laboratory was also approved, and the search for Bakker’s successor began. In any case, Adams would have to leave CERN to take up an important position in the UK. The discussion centred around two eminent scientists – Hendrik B G Casimir and Victor F Weisskopf. Weisskopf was already well known at CERN, having worked in the Theory Division from 1957 until 1958. With characteristic modesty he doubted his talents for such a position, but he expressed his willingness to act as a director of research. Casimir made it clear that his position with Philips would make it very difficult to take over the post of CERN director-general.
During the following months, a formal nomination procedure of candidates in the Scientific Policy Committee (where Weisskopf was formally proposed by Greece), extensive deliberations and successful persuasion led to Weisskopf’s election by Council on 8 December 1960. His term was envisaged to run from 1 August 1961 until 31 July 1963, but this was later extended until 31 December 1965. It is no exaggeration to say that in that period, under Weisskopf’s guidance, the future of CERN was shaped for many years to come.
CERN was fortunate to be led by a personality such as Weisskopf at this time. The difficult situation for the laboratory, whose harmonious development had been interrupted at a critical point in its evolution, needed a director-general with special abilities. Every fast-developing scientific organization must cope with the effects that its very size has on its aims. Scientists with little inclination towards administrative matters must submit to administrative and bureaucratic rules, especially in an international organization.
The selection of collaborators and the future style of work is determined at the stage of most rapid initial growth, because the natural inertia of a structure made up of human beings makes it extremely difficult later on to rectify earlier mistakes. At the end of 1960 the number of CERN staff and visiting scientists was 1166; this rose to 2530 at the time of Weisskopf’s departure in 1965.
Therefore, at this time in the history of CERN even more than at others, the director-general had to be a physicist who set the direction of the laboratory towards an absolute priority of science. To achieve this he had to rely on a high reputation in his field, together with an ability to deal with the administrative needs of a rapidly growing organization. CERN was placed in the delicate position of having to restore European research parity with that of the US, profiting as much as possible from the experience gained already in the US, while retaining the European character of the new organization.
Born in Vienna in 1908, Weisskopf followed a truly cosmopolitan scientific career as a theoretical nuclear physicist, working with the most important founding fathers of modern quantum theory, and contributing important results himself. He was familiar not only with Germany (his collaboration with Heisenberg), Switzerland (with Pauli) and the Nordic countries (with Niels Bohr at Copenhagen) from extended stays in these countries, but also with Russia (with Landau at Kharkov), and eventually accepted a position at Rochester, US, in 1937.
His qualities as a leader of a technological project in which theoretical physics played only an auxiliary role were exploited in the Manhattan Project (Los Alamos) towards the end of the Second World War. The European background of many of his collaborators there was an excellent preparation for the task of leading a European laboratory. Even when pursuing the same scientific goal, the individual style of scientists varies greatly, especially if they are of different nationalities.
After the war, as professor at the Massachusetts Institute of Technology (MIT), Weisskopf resumed contacts with Europe, which was slowly recovering from the dark years. In addition to his outstanding qualifications as a theoretical physicist and as a leader of scientific enterprises, Weisskopf possessed a special quality that physics in Europe was lacking to a large degree. Possibly because of the general structure of secondary education in Europe, mathematics plays an extremely important role in theoretical physics. Hence theoretical physics frequently becomes almost a mathematical discipline, with the physical ideas being submerged by an overemphasized mathematical formalism. Among experimentalists this can cause uncertainty, or even outright rejection, when it comes to judging theoretical ideas.
In the US only a handful of gifted physicists knew how to bridge this gap. Weisskopf was a master of this. Before coming to CERN, he had already taught a generation of nuclear physicists how to pick out the essential physical ideas which are always transparent and simple (once they have been understood), but which may be hidden under many layers of mathematical formalism. The true masters of mathematical physics always knew how to isolate the physical content of complicated mathematical arguments, but unfortunately the majority of theoreticians in Europe are to this day sometimes over-fascinated by the mathematical aspects of the physical description of nature.
The understanding of physical phenomena often does not even require the use of precise formulae. Students at MIT had invented the notion of the “Weisskopfian”, which naturally takes care of numerical factors such as ±1, i and 2π. Also in the book Theoretical Nuclear Physics by John M Blatt and Weisskopf, which remains a standard textbook to this day, the emphasis on simple, physically transparent arguments by Weisskopf and the more precise, but more formal, presentation of topics by his co-author are clearly discernible.
From MIT to CERN
To facilitate his transition from MIT to CERN, and to make optimal use of his period as director-general of CERN, Weisskopf became a part-time member of the CERN directorate in September 1960, dividing his time equally between MIT and CERN. Unfortunately in February 1961 he was involved in a traffic accident, and needed complicated hip surgery and a long stay in hospital. At the start of his term as director-general, and to a lesser extent during a large part of his stay in Geneva, Weisskopf was hampered in his movements. I vividly remember his tall figure walking with crutches through the corridors, obviously in pain, but he never lost his friendly disposition.
The first progress report to CERN Council in December 1961 clearly reflects the situation of CERN at the beginning of the Weisskopf era. Two years after the first beam at the PS, breakdowns and construction work on beams had prevented completely satisfactory use of this machine, whereas the smaller synchrocyclotron was working very well. Research director Gilberto Bernardini aptly remarked that European researchers with a nuclear physics background had had little difficulty orienting their work towards the synchrocyclotron. The PS, on the other hand, was a novelty for physicists, so certain mistakes had been made, particularly with insufficient time for preparation of experiments.
Nevertheless, 1961 was the first year with a vigorous research programme at CERN. Not surprisingly, organizational problems and difficulties in the management of relations with universities in the member states became acute. It was recognized that at least track chamber experiments required collaboration with institutes outside CERN for the scanning, measuring and evaluation of data. For electronic experiments such a need was not yet seen.
The construction of the 2 m bubble chamber was continuing well, but experimental work was still done on the basis of data from the tiny 30 cm chamber and with the 81 cm Saclay chamber. The heavy liquid chamber had looked in vain for fast neutrinos in the neutrino beam. Simon van der Meer’s neutrino horn, intended to improve this situation, had just finished its design stage.
Addressing Council for the first time on the problem of the long-term future of CERN, the new director-general already strongly emphasized two directions of development which, as subsequent history has shown, were decisive for the laboratory’s future success. One project, based upon design work by Kjell Johnsen and collaborators, foresaw the construction of storage rings; the other was aimed at a much larger 300 GeV accelerator.
The financial implications of such proposals and the necessity to formalize budget preparations more than a year in advance led to the creation of a working group headed by the Dutch delegate, Jan Bannier. From this group emerged the remarkable “Bannier procedure”, under which firm and provisional estimates of budget figures for the coming years are fixed annually. It was decided that the cost variation index should not be applied automatically, and that Council should make a decision on this index each year.
First research successes
The discovery that the neutrinos associated with electrons and with muons are distinct was made in 1962, not at CERN, but at Brookhaven. In retrospect it was clear that CERN’s attempt was bound to fail for technical reasons. However, the disappointment did not overshadow some remarkable successes in the first full year of CERN under Weisskopf’s leadership. The shrinking of the diffraction peak in elastic proton collisions was first seen at CERN – in agreement with the new ideas of Regge pole theory, which had also originated in Europe. The cascade anti-hyperon was found simultaneously with Brookhaven, but the beta decay of the π meson and the anomalous magnetic moment of the muon were “pure” CERN discoveries. For the first time, the development of a novel type of scanning device for bubble-chamber pictures (the Hough-Powell device), which had started at CERN, was taken up by US institutions. However, Weisskopf had to complain to Council about the “equipment gap” at the PS, caused by the lack of increase in real value of the budgets in 1960 and 1961.
In some sense, the most important experimental result of 1963 was the determination of the positive relative parity between the Λ and the Σ hyperons, obtained at CERN in the evaluation of data from the 81 cm bubble chamber. This result was in disagreement with some much-publicized predictions from Heisenberg, and gave further support to the growing confidence in internal symmetries. Despite a long shutdown of the PS in order to install the fast ejection mechanism giving extracted beam energies up to 25 GeV, it now began its reliable and faithful operation, which to this day is the basis of all accelerator physics at CERN. Thanks to a neutrino beam 50 times more intense than that at Brookhaven, the first bubble-chamber pictures of neutrino events were made.
Parliament
In 1963 a new body of European physicists was created under the chairmanship of Edoardo Amaldi. Taking into account future plans outside Europe, this body strongly recommended the storage ring project, as well as the plans for a 300 GeV accelerator. CERN Council authorized a “supplementary programme” for 1964 to study the technical implications of these two projects. This Amaldi Committee was set up as a working group of CERN’s Scientific Policy Committee, and was the forerunner of the European Committee for Future Accelerators (ECFA), which was founded three years later, again under Amaldi’s chairmanship. ECFA has been the independent “parliament” of European particle physicists ever since.
Weisskopf’s clear vision of the importance of education resulted in his legendary theoretical seminars for experimentalists at CERN. I had the privilege of collaborating with him at that time on some aspects of the preparation of these seminars, and my view of theoretical physics has been decisively influenced by his insistence on stressing the physical basis of new theoretical methods.
From 1964, CERN’s synchrocyclotron started to concentrate on nuclear physics alone, whereas the PS was now the most intense and most reliable accelerator in the world. Another world premiere was the first radiofrequency separator, allowing K-meson beams of unprecedented energy. At CERN, also for the first time, small online computers were employed in electronic experiments. A flurry of excitement was caused by the analysis of muon and muon-electron pairs in the neutrino events seen in the spark chamber. When it turned out that they could not have been produced by the intermediate W-boson (to be discovered at CERN exactly 20 years later at much higher energies), these events were more or less disregarded. Only 10 years later, after the charmed quark was found in the US, was it realized that these events were examples of charm decay – admittedly very difficult to understand on the basis of the knowledge available in 1964. The unsuccessful hunt for free quarks also started in 1964, together with the acceptance of the concept of quarks as fundamental building blocks of matter.
Making decisions
Thanks to Weisskopf’s relentless prodding in 1964, CERN member states were convinced that the time was ripe for a decision on the future programme of CERN. Rather than rush into an easier but one-sided decision, Weisskopf was careful to emphasize the need for a comprehensive step involving three elements:
*further improvements of existing CERN facilities, comprising among other things two very large bubble chambers containing respectively 25 m³ of hydrogen and 10 m³ of heavy liquid;
*the construction of intersecting storage rings (ISR) on a new site offered by France adjacent to the existing laboratory;
*the construction of a 300 GeV proton accelerator in Europe.
Although a decision had to be postponed in 1964 – due to the difficult procedure to be set up for the site selection of the new 300 GeV laboratory – optimism prevailed that such a decision would be possible in 1965. After recommending the ISR supplementary programme in June 1965, the formal decision by Council was finally taken in December 1965.
The novel ISR project had no counterpart elsewhere in the world. Although experience had been gained at the CESAR test ring with stacking electrons and with ultra-high vacuum, this decision reflected the increasing self-confidence of European physics. Thus the foundation was laid for the dominant role of European collider physics, which eventually led to the antiproton-proton collider, the LEP electron-positron collider and the LHC proton collider. At the same time as the ISR project was authorized, a supplementary programme for the preparation of the 300 GeV project was also approved.
When Weisskopf’s mandate ended at the end of 1965, particle physics had passed through perhaps its most important stage of development. From being an appendix to nuclear physics and cosmic-ray experiments, it had become a field with genuine new methods and results. The many new particle states disentangled by CERN and other laboratories gradually found a place in a framework determined by a new substructure, the quarks. In addition, many new discoveries in weak interactions, and especially at the unique neutrino beam of CERN, showed close similarities between weak and electromagnetic interactions, and paved the way for their unified field theory.
Much of the enthusiasm that enabled CERN experimentalists to participate so successfully was due to Weisskopf. He made a point of regularly talking to the scientists, and more than once he visited experiments during the night. These frequent contacts on the experimental floor with physicists at all levels gave CERN a new atmosphere and created contacts between different groups – something which was lacking before. Weisskopf himself was aware of this. When asked on his departure from CERN what he thought his main contribution had been, he replied that the administration and committees would have functioned perfectly well without him, but that he thought he had given CERN “atmosphere”.
During the Weisskopf era, directions were set for the distant future. Almost 40 years later, the basis of the CERN programme is still determined by those decisions taken in 1965. How could Weisskopf have been so successful in his promotion of CERN in Europe, at a time when there was always at least one member state with special problems regarding the support of particle physics and CERN?
Politicians must be able to trust the experts they rely on. Weisskopf achieved so much for the laboratory because he was deeply trusted by the representatives of the member states. Although enthusiastic in his support of new ideas in scientific projects, he never lost his self-critical attitude, and was quick to try to understand opposing points of view in science and in scientific policy. The enthusiasm, honesty and modesty of Victor Weisskopf have proved to be a rich inheritance, and have determined the future of CERN.
For more than 40 years, CERN’s library has collaborated with institutes and universities worldwide to collect carefully documented results of scientific research. Initially, this prodigious output was all on paper, and the CERN library regularly received papers from scientists at these institutes and universities via mailing lists. Because of its visibility, CERN received far more of this material than most institutes, and a major attraction of a visit to CERN was to peruse the latest pre-prints on view in the library.
With the advent of electronic publishing, more and more documents became accessible online. To complete the picture, documents still received on paper were scanned to offer Web access. Today this practice is diminishing as grey literature (library-speak for pre-prints and other material not published by a publishing house) in science, particularly in physics, is more widely available in electronic form.
Saving time and money
Having distributed documents for some years both on paper and electronically, many institutes have now chosen to use only the electronic route. This offers undeniable advantages: cost savings; quick and easy distribution; full text availability at a distance; the possibility of enriching the catalogue; and cheap online access. The virtual library has become a reality. Paper documents are increasingly rare, and authors generally prefer to submit their papers electronically. Most major research centres also offer Web access to their documents and have ceased to send out paper copies via mailing lists, encouraging other scientific libraries and the researchers themselves to consult their Web pages and databases.
Faced with this evolution, library acquisition policies must be reconsidered and adapted to the new standards of scientific information dissemination.
The problem in this new context is the need to consult multiple databases. To find a document, a researcher must search many resources, which is a time-consuming and tedious task with often dubious results. To facilitate searching and to offer users a single search interface, the CERN library chose to import as many electronic documents as possible. In 1999 the information support team introduced its Uploader program, which allows automatic importation of bibliographic records extracted from several sources. This has led to three main advantages: papers can now be found directly from institutes’ sites; the number of documents received from different research centres has increased; and new databases have been explored.
From any database or Web page, Uploader formats the records and adapts them to the cataloguing format used at CERN – Machine Readable Cataloguing (MARC). The program also updates existing records, searching for duplicates before importation. Which databases to explore was a difficult choice. First, the websites of all institutes from which CERN still received paper documents were consulted to see if the institutes offered the same documents online. This showed that more or less all institutes offer their publications on the Web in some form.
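The workflow can be pictured in a few lines of Python. This is a minimal sketch, assuming hypothetical field names and a simplified mapping onto standard MARC tags (245 for title, 700 for added author entries, 088 for report number); the actual internals of Uploader are certainly more elaborate.

    from dataclasses import dataclass

    @dataclass
    class SourceRecord:
        title: str
        authors: list       # e.g. ["Smith, J.", "Jones, A."]
        report_number: str  # e.g. "CERN-TH-2000-1"

    def to_marc_fields(rec):
        # Map a harvested record onto simplified MARC tags:
        # 245 = title, 700 = added author entries, 088 = report number.
        return {"245": rec.title, "700": rec.authors, "088": rec.report_number}

    def import_record(rec, catalogue):
        # Search for a duplicate (here, keyed by report number) before
        # importing; update the existing entry rather than adding a second one.
        key = rec.report_number
        if key in catalogue:
            catalogue[key].update(to_marc_fields(rec))
            return False  # existing record refreshed, not newly imported
        catalogue[key] = to_marc_fields(rec)
        return True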
The same survey of institute websites also revealed that CERN received, via mailing lists, only a third of the documents available on the Web. There are two possible explanations: research centres, perhaps for economic reasons, select which documents to send out; and mailing lists are not always kept up to date. The need for automatic importation of these documents from websites became obvious, but there were technical problems to overcome.
Sources can be divided into two types: Web pages and online databases, which are handled differently. Medium-sized research centres and information sources that do not offer online databases generally offer Web pages presenting the work of their researchers (usually theses). Searching can be primitive if no real search engine is implemented. The number of documents is also often limited. This means that manual submission of the full text of the documents is the most efficient way of acquiring the documents. The constant evolution of Web pages also argues against automatic importation. Since alerting services for such sites are rare, the CERN library set up its own alert system for some 80 information sources at 30 institutes. This tells the librarians when the available information changes, allowing them to acquire new documents as they become available.
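As a minimal sketch of such an alert system (the URL list and the storage of previous digests are assumptions, not a description of the library's actual tool), each source page can be fingerprinted and compared with the digest recorded on the previous visit:

    import hashlib
    import urllib.request

    def page_fingerprint(url):
        # Download the page and return a digest of its raw contents.
        with urllib.request.urlopen(url) as response:
            return hashlib.sha256(response.read()).hexdigest()

    def changed_sources(urls, previous):
        # Return the URLs whose content differs from the stored digest,
        # updating the store as we go; 'previous' maps URL -> digest.
        changed = []
        for url in urls:
            digest = page_fingerprint(url)
            if previous.get(url) != digest:
                changed.append(url)
            previous[url] = digest
        return changed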
Online databases often allow multicriteria searching. In contrast to Web pages, however, it is usually impossible to put an alert on the search results. This means that for online databases that do not offer an alert system, a different approach is needed. The method adopted by the CERN library is a monthly or annual search.
The Uploader program helps CERN’s librarians to manage an effective document supply service, but the huge diversity of online information sources means that there is no shortage of work for the librarians. Document structure can vary from page to page, or even within the same page. In the majority of cases the pages are therefore presented as free text with no common structure. With virtually no constraints imposed by databases, no common import protocol is possible, and material must be input manually. Inconsistencies can arise when Web pages are not handled rigorously, causing confusion in bibliographic cataloguing – most frequently for authors’ names. Some databases allow external submission of documents and bibliographies, which results in many irregularities and loss of homogeneity in the presentation of the documents. Information can be presented in multiple forms. Pre-print numbers, for example, can appear as IUAP-00-xxx (number not yet attributed), CERN-TH-2K-1 (instead of CERN-TH-2000-1) or MPS15600 (instead of MPS-2000-156). Vital pieces of information, such as collaboration lists, are sometimes missing. All of these problems require traditional librarianship skills. CERN’s library aims to offer a coherent and homogeneous database, with validated and improved metadata. Knowledge databases recognize retrievable work and provide links to relevant articles on the Web, while a computer program appends and corrects bibliographic data, keeping manual checking to a minimum.
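To make the normalization problem concrete, here is a hedged sketch handling only the two irregular forms quoted above; a production rule set would be far larger and curated by the librarians.

    import re

    def normalise_report_number(raw):
        # "CERN-TH-2K-1" -> "CERN-TH-2000-1": expand the "2K" year shorthand.
        m = re.fullmatch(r"(CERN-TH)-2K-(\d+)", raw)
        if m:
            return f"{m.group(1)}-2000-{m.group(2)}"
        # "MPS15600" -> "MPS-2000-156": the trailing two digits are the year.
        m = re.fullmatch(r"(MPS)(\d+)(\d{2})", raw)
        if m:
            return f"{m.group(1)}-20{m.group(3)}-{m.group(2)}"
        return raw  # leave anything unrecognized untouched

    assert normalise_report_number("CERN-TH-2K-1") == "CERN-TH-2000-1"
    assert normalise_report_number("MPS15600") == "MPS-2000-156"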
There is no doubt that electronic uploading saves a considerable amount of time compared with manual submission. It has also greatly increased the number of documents made accessible and available at CERN. However, source databases must be carefully selected. The richer the database, the more time-consuming the procedure becomes. In addition, the volatility of Web pages requires close follow-up. Automatic importation has taken over from manual submission, but specialist monitoring remains essential.
The electronic approach was initially investigated at CERN on a test basis, to ascertain technical feasibility and to judge what the advantages would be. Since then, its use has spread and the laboratory has reached agreements with Cornell, Fermilab and several other information sources. Today, more than 90% of the material entering the CERN library database is imported or created electronically. Of this, only 8% comes from CERN.
The Large Hadron Collider (LHC) is without a doubt the most technologically challenging project that CERN has ever embarked upon. It is also the most costly, and it was approved under the strictest financial conditions that CERN has ever faced. This should have been cause for the laboratory to reflect on its way of working, but reflection did not come until September last year, when the results of a comprehensive cost-to-completion review showed that CERN would have to find an additional SwFr 850 million for the LHC and its experiments.
CERN is built on a tradition of excellence, in terms of both its personnel and its facilities. In the world of particle physics, the laboratory has a well deserved reputation for building the finest machines. Our first big accelerator, the PS, was completed in 1959 and is still going strong. And had the SPS not been built to CERN’s exacting standards, the Nobel prize-winning antiproton project might never have got off the ground. With the LHC being inaccessibly encased in its cryostat, high standards are needed more than ever. CERN, however, must also become more cost-aware.
Moving forward
At 18% of the material cost, the LHC overrun does not seem excessive for a project of this complexity, and is comparable to the percentage overrun incurred in the construction of the Large Electron Positron (LEP) collider. But the bill for the LHC is three times that for LEP. The lesson we have learned is that contingency in big projects must now be measured in absolute and not percentage terms. Our mistake was that we failed to realize that the scale of the LHC would require new monitoring and control systems at all levels of the laboratory.
Such systems are now being introduced with advice from an External Review Committee. CERN will introduce earned-value management techniques to allow the financial health of the laboratory to be easily monitored at any time, and we will move to full personnel-plus-materials accounting, which will introduce greater transparency. These measures are essential for completing the LHC within the boundaries set by last year’s cost-to-completion review, and they will position CERN well for the longer term.
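For readers unfamiliar with the technique, earned-value management compares three quantities: the budgeted cost of work scheduled, the budgeted cost of work performed (the “earned value”) and the actual cost incurred. The sketch below uses invented numbers, not CERN figures, to show the two standard indices:

    def earned_value_indices(planned, earned, actual):
        # CPI < 1 means the project is over cost;
        # SPI < 1 means it is behind schedule.
        cpi = earned / actual   # cost performance index
        spi = earned / planned  # schedule performance index
        return cpi, spi

    # Example: 100 units of work planned, 80 performed, 95 spent.
    cpi, spi = earned_value_indices(planned=100.0, earned=80.0, actual=95.0)
    print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}")  # CPI = 0.84, SPI = 0.80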
CERN’s mission is to provide the facilities that its user community wants. In the past, that has meant a diverse range of particle beams serving a wide range of relatively small experiments. With the LHC, our user base has consolidated to give a smaller number of much larger experiments, and we must adapt our facilities accordingly. That means a narrower programme, focused on the LHC. A large part of the required resources can be found by reallocating budget and personnel to the LHC. Further reallocations will come from internal restructuring, postponing the start-up of the LHC until 2007, extending the pay-back period until 2010 and cutting back on accelerator R&D until the LHC is running.
I am convinced that these moves will allow CERN to maintain its tradition of excellence. We continue to host a lively and diverse low-energy programme. The LHC will be the world’s foremost facility for high-energy physics, and by maintaining a minimum R&D base, we are providing a platform for the long term.
However, we are not yet out of the woods. An important part of the resources we plan to reallocate to the LHC has been identified but not yet secured. It will take a coherent effort across the laboratory to ensure that human resources released by the reduction of non-LHC activities are effectively deployed to the LHC. However, we are heading in the right direction, and I have every faith in the ability of CERN’s staff and users to meet the challenge.
The lessons CERN has learned are lessons for us all. The need to measure contingency in absolute terms requires management tools, risk analysis and strategy tuned to the size of the projects, much as we choose our physics instruments in relation to the precision we are aiming to achieve. Time will tell how these considerations can be applied to future projects. For now, however, we have learned our lesson and CERN is set to emerge leaner and fitter to face the future.
Electron Scattering for Nuclear and Nucleon Structure by John Dirk Walecka, Cambridge Monographs on Particle Physics, Nuclear Physics and Cosmology, Cambridge University Press, ISBN 0521780438, £60 (€ 98).
The author is well placed to write a monograph on this classic subject which, like no other, bridges the gap between nuclear and particle physics through common concepts and techniques. He can look back on a long and distinguished career in this field as professor of physics at Stanford University, as scientific director of CEBAF (now Jefferson Lab), and now as professor of physics at the College of William and Mary.
The book is based largely on a series of lectures on the subject given at CEBAF. Given the author’s track record, it is only natural that the book should focus on electron scattering in the few-gigaelectronvolts energy domain. At the same time, it exploits the power of the theoretical concepts developed originally for low-energy scattering, to address an audience much wider than students and researchers at laboratories such as MIT Bates, JLab and the Mainz microtron. This is achieved through a clear structure and pedagogical distinction between the theoretical framework and practical applications.
The book is organized into five parts, two of which contain the core material. Part 2 – “General analysis” – is one of the most comprehensive reviews of the theory and phenomenology of electron-nucleus and electron-nucleon scattering that can be found in the literature today. This part is recommended reading not only for nuclear physicists, but also for every graduate student working on electron, muon and neutrino scattering, to acquire a detailed understanding of the roots and the development of the formalism applied to present-day high-energy experiments. It includes discussions of polarized deep inelastic scattering and of parity violation in electron scattering.
Part 4 – “Selected examples” – is targeted specifically at nuclear physicists. The applications of scattering theory focus on detailed discussions of classic nuclear-structure problems and experiments; three sections on the quark model, QCD and the Standard Model embed the subject in the wider theoretical context. Recent deep inelastic electron and muon scattering experiments are not covered in a systematic way (however, they have been discussed in many other excellent reviews).
The three remaining parts of the book are more succinct. Part 1 is an easy-to-read introduction, and part 3 discusses quantum electrodynamics and provides an introduction to radiative corrections, which unfortunately is too concise to be of much practical use. Finally, part 5 gives an overview of future directions, which again focuses on the CEBAF/JLab experimental programme. A useful feature is an extensive set of appendices, providing handy reference material.
The author has accomplished a successful blend of textbook and monograph. Written by a nuclear physicist for nuclear physicists, it is a must for students and seasoned researchers alike who are engaged in electron-nucleus scattering. The deep-inelastic-scattering community will also find this book eminently useful and rewarding, as a way to learn about the origins of their field and its intimate relationship with one of the most important subjects in nuclear physics.
Chaos in the Kitchen = Symmetry at the Table, edited by Beate Block and Maggie DeWolf, Mountainair Press, ISBN 092952618X, $25 (€ 29).
It is somewhat unusual to review a cookbook in CERN Courier, but then this is a somewhat unusual cookbook. A collection of recipes assembled by the wife of a physicist at the Aspen Center for Physics, the book is as much a glimpse into the mindset of physicists as it is a book about cooking.
The introductory pages deal with Aspen, but you’d have to have been there to get the most out of them. The recipes begin with sections devoted to extraordinary chefs. There you’ll learn how to make risk-free mayonnaise, and why it’s best to whip egg whites in copper bowls. You’ll also learn some of the culinary secrets of Fermilab’s famed Chez Leon. One of the extraordinary chefs is Tita Alvarez Johnson, who founded the restaurant and gives it a memorable atmosphere to this day.
The rest of the book is divided into chapters sorted by region. Contributors are often mentioned by name, occasionally along with tasters. This makes for interesting, if slightly voyeuristic, reading. Here physicists will find the recipes of colleagues, their wives and even mothers-in-law. The presence of fondue Chinois betrays a CERN influence, and at least one Aspen visitor must have been to a La Thuile meeting, since the Aosta valley speciality “la Grolla” makes an appearance. In these chapters, you can even learn that at least one delegate to CERN Council has a soft spot for chocolate (a Belgian, of course).
The final section, “Drinks and amusements”, is definitely by physicists for physicists. There you’ll find a learned treatise on “Interparticle forces in multiphase colloid systems” – or how to resurrect coagulated sauce béarnaise. The thermodynamics of the perfect Martini are also covered here.
A chef once told me that to review a cookbook properly, you have to make all of the recipes. After spotting mysterious ingredients, such as powder steam and others still more exotic, this reviewer shied away from that approach and chose instead to dip into the book simply for the pleasure of it. Making the recipes will follow, starting with those from Chez Leon. Although the book may tell you how to make Tita’s recipes, unfortunately it doesn’t give her recipe for creating a memorable atmosphere – that you will have to discover for yourself.
All proceeds from Chaos in the Kitchen = Symmetry at the Table go to the Aspen Center for Physics. Ordering information is available from The Aspen Center for Physics, 700 West Gillespie St, Aspen, Colorado 81611, USA, or order by email.
At the March meetings of CERN’s governing body, Council, the laboratory’s management presented preliminary ideas for absorbing the cost overrun for the Large Hadron Collider (LHC) project identified last year. These focus more of the laboratory’s resources on the LHC, with compensatory reductions being made in other scientific programmes.
Under the management’s proposals, the running time for CERN’s existing accelerators could be reduced by up to 30% each year until the LHC starts up. The largest accelerator, the Super Proton Synchrotron, which provides test beams for the LHC experiments and supports the current high-energy programme, would not run at all in 2005. Other potential areas for savings have been identified in long-range research and development, LHC computing, the fellows and associates programme (CERN fellowships are fixed-term appointments for young people; associateships allow sabbatical periods to be spent at CERN later on), and general overheads. Savings could also be made in services contracted in to the laboratory. The total amount to be redirected to the LHC is expected to amount to SwFr 500 million (€ 341 million).
The plan envisages LHC start-up in 2007 and full payment for the new facility by 2010 with no budget increase. CERN’s director-general, Luciano Maiani, nevertheless urged Council to consider an increase in the laboratory’s budget over the medium term. This would allow the LHC to be financed by 2009 and would enable limited research and development to continue, standing the laboratory in good stead for the longer term.
CERN’s staff association, along with French and Swiss unions representing employees of companies working on the CERN site, also made their opinions known by presenting letters to Council. The staff association argued that in its opinion, more resources are needed to complete the LHC. The unions expressed their concerns over the impact of cutbacks at CERN on local employment.
A decision on the management’s proposals will be taken at the next meeting of Council in June. By then, the report of an external review committee set up in November (see CERN reacts to increased LHC costs) will be ready, and management proposals will also be complete. In the meantime, Council agreed to release SwFr 20 million from the 5% of the laboratory’s 2002 budget initially held back pending resolution of LHC funding issues. In a separate initiative, the Swiss delegation said that Switzerland would advance SwFr 90 million to CERN over the next three years, to be deducted from later contributions.
More than a year after being asked to study the opportunities and priorities for US nuclear physics research in the coming decade, the Department of Energy/National Science Foundation Nuclear Science Advisory Committee (NSAC) has recently submitted its latest long-range plan for the field. This is the fifth in an influential series of reports that NSAC has prepared on a regular basis since 1979. The US nuclear physics community is a diverse one which has its roots in nuclear structure studies, but which has branched out in recent years to address questions at the forefront of a number of related areas including nucleon structure, nuclear astrophysics, the nature of hot nuclear matter and searches for physics beyond the Standard Model. As part of the planning process, town meetings sponsored by the Division of Nuclear Physics of the American Physical Society for major subfields have provided a forum for presenting new ideas. A long-range plan working group then drafted overall priorities, taking into account current developments in nuclear physics on the world scene.
Recent investments in facilities such as the Relativistic Heavy-Ion Collider (RHIC) at Brookhaven, CEBAF at Jefferson Laboratory and the newly upgraded National Superconducting Cyclotron Laboratory at Michigan State University have positioned the field well for the future. Because of this, the plan concludes that “the highest priority of the nuclear science community is to exploit the extraordinary opportunities for scientific discoveries made possible by these investments.” Unfortunately, as with many branches of the physical sciences, funding for nuclear physics in the US has not kept pace with inflation in recent years. The plan’s first recommendation therefore calls for a 15% increase in base funding, which would allow more effective operation of accelerator facilities, increase support for university researchers, and revitalize the nuclear theory programme.
Looking further into the future, the plan recommends investment in areas where capabilities in the US can be dramatically improved, providing significant new capabilities on the international scene. The highest priority for major new construction is given to the Rare Isotope Accelerator – RIA (see Climbing out of the nuclear valley). This will provide higher intensities of radioactive beams than any present or planned facility worldwide, and will be used primarily for nuclear structure and astrophysics studies, with opportunities also for experiments on fundamental symmetries and in a number of applied areas.
Next, the plan recommends the construction of the world’s deepest underground science laboratory, noting that: “This laboratory will provide a compelling opportunity for nuclear scientists to explore fundamental questions in neutrino physics and astrophysics.” The plan also recommends the upgrade of CEBAF to 12 GeV by the addition of further high-field superconducting cavities (see How CERN became popular with US physicists).
Finally, the plan endorses a number of smaller initiatives, including R&D towards an electron-ion collider that could be integrated into the RHIC facility. The scientific case for such a facility is currently under active consideration within the nuclear physics community.
In the late 1970s, CERN made the bold decision to convert its new Super Proton Synchrotron (SPS), only just getting into its stride as a fixed-target machine, into a proton-antiproton collider. The fast-tracked project began operation in 1981 and soon led to CERN’s first Nobel prize. It was a new watershed for European physics.
The innovative idea to convert a major proton synchrotron in this way came from David Cline, Peter McIntyre and Carlo Rubbia, then all working in the US. It had been initially proposed to Fermilab, but the US laboratory committed itself instead to increasing the beam energy of its existing synchrotron by adding superconducting magnets. Converting the Fermilab machine into a proton-antiproton collider became a longer-term goal. With such a project scheduled in the US, there was no immediate migration to CERN’s fast-tracked version.
For the CERN collider, the lessons of the Intersecting Storage Rings (ISR) had been learned. “Keyhole” physics was not the way to go. Carlo Rubbia’s 2000 tonne UA1 experiment for the new collider completely surrounded the proton-antiproton collision point, and its sheer size was impressive by the standards of the day. From the US, it had on board a Riverside contingent (a tradition having been set at the ISR) and David Cline, then at Wisconsin. As UA1 gained momentum, more physicists came from Rubbia’s base at Harvard, from MIT, and from Wisconsin.
The CERN proton-antiproton collider was built to house more than one experiment, and there were several contenders. An unsuccessful bid was made by Sam Ting of MIT, who at that time was the leader of the Mark-J experiment at the PETRA electron-positron collider at the DESY laboratory, Hamburg. UA2, the second major experiment approved for the proton-antiproton collider, was essentially European. Additional high-energy antiproton experiments at CERN included a gas-jet target, attracting groups from Michigan and Rockefeller (including ISR pioneer Rod Cool), and a study of jet structure by a dedicated UCLA group.
For low-energy antiprotons, CERN had the LEAR ring, and several US groups contributed to experiments here. The tradition continues with the Antiproton Decelerator (AD), notably with Gerry Gabrielse’s Harvard group making precision measurements of antiproton parameters.
The LEP era
Even while the proton-antiproton collider was getting into its stride in the early 1980s, CERN began a push for its next large machine, the 27 km LEP electron-positron collider. Such a large machine was again unique, and therefore an attraction for US physicists, who proposed the LOGIC detector. There were four slots for experiments at LEP and more than four proposals. LOGIC did not make it.
Ting, having lost out at the proton-antiproton collider, was determined to get a front seat at LEP. He, like Lederman, understood the importance of studying lepton pairs, and had got together a major international effort with scientists based in the US, China and Europe for a detector to analyse muons using a huge magnetic spectrometer. The proposal was initially labelled “L3” as it was the third letter of intent to be tabled for LEP, and the collaboration hoped that a more positive title would emerge. The experiment was approved, but no better name appeared. It went on to become a major US effort, with groups from Alabama, Boston, Caltech, Carnegie-Mellon, Harvard, Johns Hopkins, Los Alamos, Michigan, Northeastern, Oak Ridge, Princeton, Purdue and UC San Diego being introduced to research at CERN.
US researchers also collaborated in the other three LEP experiments. ALEPH, initially led by Jack Steinberger, included groups from Florida State, UC Santa Cruz, Washington/Seattle, and from Wisconsin, under Sau-Lan Wu, who had moved to LEP after previous research with the TASSO experiment at PETRA. OPAL included groups from Duke, Indiana, Maryland, Oregon and UC Riverside. DELPHI had participation from Ames, Iowa.
The SPS fixed-target programme had initially had little attraction for US physicists, as Fermilab’s machine had higher energy and was commissioned earlier. However, a new development came in the 1980s when the SPS became the scene of experiments using high-energy beams of nuclei (although the initial US push had been for an alternative heavy-ion scenario). This was a natural extension of work which had been pioneered at the Berkeley Bevalac, and the Lawrence Berkeley Laboratory made vital contributions to the ion source and nuclear beam infrastructure for these experiments.