
Calorimetry: Energy Measurement in Particle Physics

by Richard Wigmans, Oxford University Press, ISBN 019 850296 6, 726pp, £85.


The role of calorimetry in high-energy physics has become increasingly important during the last 20 years. This is due to the increase in energy of the particle beams available at the major accelerators and to the need for hermetic detectors. The 1980s, in particular the second half of the decade, saw an important breakthrough in the understanding of the mechanisms underlying the development of hadronic cascades and their energy loss.

The theme around which this breakthrough took place is “compensation”: for a compensating calorimeter e/h = 1, where e represents the response to an electromagnetic shower and h the response to a non-electromagnetic, that is purely hadronic, shower of the same energy. For compensating calorimeters the energy measurement of electrons and hadrons of the same energy yields the same average response at all energies, which in turn leads to optimal hadronic energy resolution. Compensation is also a prerequisite for linearity of the hadronic energy measurement.
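The consequence of e/h ≠ 1 can be sketched numerically. The following is a rough illustration only (not taken from the book), using a simple power-law parameterisation of the mean electromagnetic shower fraction; the parameter values k and E0 are illustrative assumptions:

```python
def hadron_response(E, e_over_h=1.3, k=0.82, E0=1.0):
    """Average response per unit energy to a hadron of energy E (GeV),
    normalized so that the electromagnetic response e = 1.

    Assumes a simple parameterisation of the mean electromagnetic
    shower fraction, f_em = 1 - (E/E0)**(k-1), valid for E > E0.
    Parameter values are illustrative, not from the reviewed book.
    """
    f_em = 1.0 - (E / E0) ** (k - 1)   # grows slowly with energy
    h = 1.0 / e_over_h                  # hadronic response, e = 1
    return f_em * 1.0 + (1.0 - f_em) * h
```

For e/h = 1 the response is unity at every energy; for e/h = 1.3 it creeps upward with energy, which is precisely the hadronic non-linearity that compensation removes.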

In practice, very few compensating calorimeters have been built for major experiments (one example is the calorimeter of the ZEUS experiment at HERA, discussed in the book), probably because, in practice, achieving compensation means making a concession to the electromagnetic energy resolution. None of the experiments planned at the Large Hadron Collider, for example, will employ a compensating calorimeter. The importance of the research into compensation is nevertheless very large in that it led to a much better understanding of calorimetry in general. The author of the book has made original and essential contributions to this field through his own research.

The book reflects the deep and encyclopedic knowledge that the author has of the subject. This makes the book a rich source of information that will be useful for those designing calorimeters and for those analysing calorimeter data, for a long time to come. At the same time the book is not always successful in finding a way of organizing and conveying all of this knowledge in a clearly structured and efficient way. Parts of the book are rather narrative and long-winded.

The most important chapters are those on Shower Development, Energy Response, Fluctuations and Calibration. The chapter on Instrumental Aspects also contains essential information. The chapters on generic studies and on existing (or meanwhile dismantled) and planned calorimeter systems are interesting but less necessary parts of a textbook. Moreover, the author does not always keep to the subject – calorimetry – leading to unnecessary excursions and, what is worse, outdated material. It would, on the other hand, have been interesting if the author, in his description of the calorimeters under construction for the Atlas experiment, had been a bit more explicit about what, in the light of the ideas developed earlier in the book, the optimal approach would be to (inter)calibrating this very complex calorimeter system.

The chapter on Calibration is probably the most essential part of the book, bringing together many of the fundamental issues of shower development, signal generation and detection. Reading this chapter, one gets the impression that it is in fact impossible to calibrate calorimeters, but the style chosen by the author only serves to emphasize that the issue is subtle and that great care must be taken. The chapter contains information that is extremely worthy of consideration, culminating in the recommendation that, in the case of non-compensating calorimeters, individual (longitudinal) calorimeter sections should be calibrated with the same particles generating fully contained showers in each section – a recommendation that, in practice, cannot always be satisfied. In his ardour to emphasize the importance of the (inter)calibration of longitudinal calorimeter segments, the author even invokes decays, such as that of the neutral rho into two neutral pions, that do not exist in nature – we get the point and forgive him. It is, however, true that there are more places where the book would have profited from a critical, final edit.

Calorimetry is a book that describes the essential physics of calorimetry. It also contains a wealth of information and practical advice. It is written by a leading expert in the field. The fact that the discussions sometimes do not follow the shortest path to the conclusion and that perhaps the “textbook part” of this work should have been accommodated in a separate volume does not make the book less important: it will be amply used by those trying to familiarize themselves with calorimetry and in particular by those analysing the data of the very complex calorimeter systems of future experiments, such as at LHC.

Quarks and Gluons: a Century of Particle Charges

by M Y Han (Duke), World Scientific Publishing, 168pp, ISBN 981 02 3704 9 hbk $34/£21, ISBN 981 02 3745 6 pbk $16/£10.


This is a readable little book on particle physics and is aimed at those with no previous exposure to the subject. It starts with the discovery of the electron in 1897 and works its way more or less historically up to the present. That means, of course, that it contains a lot about leptons and photons as well as the quarks and gluons of the title.

The guiding theme is the discovery of different kinds of conserved charges – first electric charge, then baryon number and the lepton numbers, and finally the more subtle kind of charges that are the source of the colour force between the quarks.

Like Stephen Hawking, the author manages to avoid all equations, except E = mc². The style is chatty and colloquial (American), which will have some non-native English readers running for their phrase books. For example, correct predictions are “right on the money”, and when the terminology seems comical the reader is exhorted to “get a grip on yourself”. Nevertheless, as one would expect from a leading contributor to the field, Han takes care to get things right even when using simple language, as for example in his discussion of spin.

The jacket says that the book will be “both accessible to the layperson and of value to the expert”. I imagine that the latter refers to its value in helping us to communicate with non-experts.

I have some misgivings about this book, mainly because its insistence on discussing only those charges that are (within current limits) absolutely conserved leaves the reader with the impression that nothing much is understood about the weak interaction. The author even says that the weak charges have yet to be identified. All of the beautiful developments of electroweak unification are omitted. Also, there is no mention of the exciting possibilities that lie in the near future. This makes the subject seem a bit moribund and musty. For example, we are told that the discovery of the pion in 1947 was “one of the last hurrahs” of cosmic-ray physics, whereas in fact that field continues to show astonishing vitality, with neutrino studies, ultrahigh-energy primaries and other fascinating phenomena promising a rich future.

Anomalies in Quantum Field Theory

by R A Bertlmann, Oxford University Press, ISBN 019 850762 3, pbk £29.95.


Field theory “anomalies” have long been a rich source of both physics and mathematics. They remain fascinating for physicists and mathematicians alike, as ongoing developments in string and brane theory show.

This book gives a comprehensive description of the many facets of this subject that were known before the mid-1980s. It is essentially self-contained and thus deserves to be called a textbook. Both mathematicians and physicists can learn from this volume.

With a modest knowledge of quantum mechanics, a mathematician can read about the history of the subject: the puzzle of the decay of the neutral pion into two gamma-ray photons; the inconsistencies of the perturbative treatment of gauge theories related to the occurrence of anomalies; the original Feynman graph calculations; and the theoretical constructions that introduced relationships with topology, up to the elementary versions of the index theorem for families.

The physicist will find all of the necessary equipment in elementary topology and differential geometry combined in constructions that are familiar to professional mathematicians. S/he will find thorough descriptions of the algebraic aspects that emerged from perturbation theory, both in the case of gauge theories and in the case of gravity, and an introduction to the way in which they tie up with index theory for elliptic operators and families thereof.

The book reads fluently and is written so clearly that one not only gets an overview of the subject, but also can learn it at an elementary level.

The bibliography is a rather faithful reflection of the physics literature and includes a few basic mathematical references, which give the reader the opportunity to learn more in whichever direction s/he chooses.

As mentioned, the subject is still developing in the direction of new mathematics and, possibly, new physics in the context of strings and branes. One may therefore regret that the book stops around the developments that took place in the mid-1980s.

The book already runs to more than 500 pages, yet it is essentially self-contained: every topic dealt with is described in sufficient detail to allow a non-specialist to get acquainted with it, at least at an elementary level. The mathematical techniques do not go beyond elements of differential geometry, together with homology, cohomology and homotopy theory. Generalized cohomology theories, including K-theory, appear only in a phenomenological disguise, in connection with the description of the index theorem for families in the particular case relevant to gauge theories, but not as mathematical prerequisites.

As a consequence of the principle of maximal perversity, one may expect that physics will exhibit subtle effects describable in terms of the above-mentioned constructions. In such an event, there remains the hope for a corresponding textbook as understandable as this one, possibly written by the same author.

Selected Papers of Richard Feynman (with commentary)

edited by Laurie M Brown, World Scientific Series in 20th Century Physics, Vol. 27, ISBN 981 02 4130 5 hbk, ISBN 981 02 4131 3 pbk.


After A Quantum Legacy, the selected papers of Julian Schwinger, it is fitting that the next volume in this carefully selected series covers the work of Richard Feynman.

Now a cult figure, Feynman is fast becoming one of the most prolifically documented physicists of the past century. As well as his own popular works (Surely You’re Joking, Mr Feynman!, What Do You Care What Other People Think?) and his various lectures, there are biographies or biographical material by Gleick, Brown and Rigden, Mehra, Schweber, Sykes, and Gribbin and Gribbin.

Anecdotes about such a flamboyant character are easy to find, but the man’s reputation ultimately rests on his major contributions to science, which this book amply documents. Chapters, of various lengths, deal with his work in quantum chemistry, classical and quantum electrodynamics, path integrals and operator calculus, liquid helium, the physics of elementary particles, quantum gravity and computer theory. Each has its own commentary.

As a foretaste of things to come, the first chapter serves up just a single paper – “Forces in molecules” – written by Feynman at the age of 21, in his final year as an undergraduate at MIT. This result – the Hellmann-Feynman theorem – has played an important role in theoretical chemistry and condensed-matter physics.

Chapter 2 begins with Feynman’s 1965 Nobel Lecture, goes on to include work with John Wheeler at Princeton, which explored the underlying assumptions about the interaction of radiation and matter, and concludes with the classic 1949 papers that presented his revolutionary approach to quantum electrodynamics.

The Nobel Lecture alone is worth reading – clearly a major early source of Feynman anecdote, such as the Slotnick episode. One is struck by Feynman’s ambivalent attitudes – his enormous regard for father figures such as Wheeler and Bethe on the one hand, and his clear disdain for many contemporaries on the other. Another good read in this chapter is Feynman’s paper presented at the 1961 Solvay meeting, and the ensuing discussion.

Chapter 3 deals with the detailed presentation of the path integral approach, which enabled Feynman to dissect electrodynamics and look at it from a fresh, uncluttered viewpoint.

From 1953 to 1958, Feynman looked for fresh pasture and produced a series of seminal papers on the atomic theory of superfluid helium, which are presented in Chapter 4.

Chapter 5 is split into two parts. The first, on weak interactions, includes the classic 1957 paper with Gell-Mann and some lecture notes from the 1960s exploring the consequences of SU3 symmetry for weak interactions. The second part – by far the largest section of the book – deals with his approach to partons, quarks and gluons. Feynman began thinking about describing hadrons simply as an assembly of smaller parts – his partons – just when experiments were beginning to probe this inner structure. This is a good example of how Feynman, arriving at a fresh interest, would invariably strip problems down to their essential parts before reassembling them in a way that he, and many other people too, understood better.

Feynman’s interest in numerical computation went back to his time at Los Alamos, when he had to model the behaviour of explosions using only the mechanical calculators of the time. Coming back to the subject in the 1980s, he went on to pioneer the idea of quantum computers. Apart from the prophetic papers published here, this aspect of his work has been well documented in The Feynman Lectures on Computing (ed. A J G Hey and R W Allen, Perseus).

Selected Papers of Richard Feynman concludes with a full bibliography. Even without the burgeoning Feynman cult, such a selection of key papers is a useful reference. However, at almost 1000 pages, the book could perhaps have been better signposted. The selected papers are not listed in the initial contents and the pages have no running heads to indicate how the chapters fall.

JAERI/KEK project gets government approval

Phase 1 of the new joint project between the Japanese Atomic Energy Research Institute (JAERI) and the national KEK laboratory on high-intensity proton accelerators (see Proton collaboration is under way in Japan) has been given the go-ahead to begin construction.

Although formal approval of the budget has not yet been given, notice from the government means that Phase 1 of the project has effectively already been approved.

Phase 1 of the new project will include:

  • a 400 MeV normal-conducting linac;
  • a 3 GeV rapid-cycling proton synchrotron operating at 1 MW;
  • a 50 GeV PS operating at 0.75 MW;
  • a major part of the 3 GeV neutron/meson facility;
  • a portion of the 50 GeV experimental facility.

The total budget for Phase 1 is 1335 Oku Yen (1 Oku Yen is equal to 10⁸ yen, or approximately $860 000) and it is expected to be completed within six years. The entire cost of the project, including Phase 2, is expected to be in the region of 1890 Oku Yen.
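For readers unused to oku, the conversion works out as in the sketch below, using the approximate rate quoted above (1 oku yen ≈ $860 000):

```python
OKU_IN_YEN = 10**8        # 1 oku = 10^8 yen
USD_PER_OKU = 860_000     # approximate dollar rate quoted in the text

def oku_to_usd(oku: float) -> float:
    """Convert an amount in oku yen to approximate US dollars."""
    return oku * USD_PER_OKU

phase1_usd = oku_to_usd(1335)   # Phase 1 budget
total_usd = oku_to_usd(1890)    # Phases 1 and 2 combined
```

At the quoted rate, Phase 1 comes to roughly $1.15 billion and the full project to roughly $1.6 billion.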

Physics meeting reflects Vietnam’s prosperity

The latest of the now traditional Rencontres du Vietnam, organized by Trân Thanh Vân, took place in Hanoi last summer. Some 200 participants from all over the world attended, including a conspicuous number of Vietnamese physicists. The conference on Physics at Extreme Energies was held in the Horizon Hotel in central Hanoi, one of the new hotels signalling the rapid economic development of Vietnam. The change was especially evident to participants who were present at the first event of the series in 1993.


Two Nobel prizewinners, Jerome Friedman (who gave a very successful public talk entitled “Are we made of quarks?”) and Norman Ramsey, attended. The packed programme covered all of the most significant recent results in particle physics and cosmology.

Roberto Peccei (UCLA) gave the introductory talk on the fundamental energy scales in particle physics and in the universe. Highlights of the meeting included the recent breakthroughs in the measurement of cosmological parameters; the results of experiments on neutrino oscillations; the latest news from LEP (especially on the search for the Higgs particle and for new physics); and the review of the indications for quark-gluon plasma in heavy-ion collisions. Also of interest were the updates on flavour physics, with the results on CP violation in K decay and the start of the BaBar and Belle “beauty factories” that will unveil CP violation in B decays; and the summaries on the status of such diverse fields as QCD, electroweak theory, quantum gravity, astrophysics and cosmic rays.

Nguyên Van Hiêu, chairman of the local organizing committee, described the development and present status of physics in Vietnam. The concluding talks, one on experiment and one on theory, were given by Pierre Darriulat (formerly of CERN and now a distinguished professor at Hanoi) and Guido Altarelli of CERN. Away from the science, concerts of Vietnamese music were organized, introduced and explained by talented musicologist Tran Van Khe, who has become a feature of the whole series of Rencontres.


School for science

Since the first Rencontres meeting in Hanoi in December 1993, an international school in theoretical physics has been held there annually. This attracts not only Vietnamese scientists, but also those from the Association of South East Asian Nations, China and Bangladesh. The seventh such school was held last year under the direction of Patrick Aurenche (Annecy).

In September 1994 Jim Cronin (Chicago) and Alan Watson (Leeds) were invited to look at the possibility of including a Vietnamese group in the international Pierre Auger high-energy cosmic-ray collaboration. For three years now a group led by Vo Van Thuan has been part of this project. Its activities have increased, thanks to the arrival of former CERN physicist Pierre Darriulat, who plays an important role in directing the research of the group.

An advanced technology school, directed by Jean Badier of the Ecole Polytechnique, began in 1996, just after the second Vietnam Rencontres. The first two such schools focused on the physics of silicon, while the latest two covered electrochemical sensors to measure water quality. During this year’s summer meeting, numerous new contacts were made between local laboratories and international research centres. A collaboration between physicists from Ho Chi Minh City and Fermilab is under study to enable Vietnamese scientists to work in major groups at Fermilab.

An important aspect of these meetings is the enthusiasm that they generate among young scientists. Since 1995, talented Vietnamese students have sat the entrance exams for the prestigious Paris Ecole Polytechnique. About 20 of them are currently studying there, and many others are attending French and US universities.

Scientists seek the secret of start-up and spin-off success


Basic science does not usually have immediate benefits for industry or the economic world in general, and delays in visible return are often difficult to reconcile with the short-term expectations of market-driven activities.

However, during the last decade the jobs that have been generated by start-up companies have injected extra liquidity into a once stagnant labour market. Since the early 1970s, universities and their incubator schemes, particularly in the US, have been supporting young entrepreneurs. This new culture has led to the establishment of a large number of start-up companies. However, the gold rush aspects of such mass migration can also have negative implications.

Aspects of this new scene were reflected in a Basic Science and Entrepreneurship workshop that was held during the recent IEEE Nuclear Science Symposium and Medical Imaging Conference in Lyon, France, and organized by François Bourgeois, CERN; Alan Jeavons, Oxford Positron Systems (UK); Yves Jongen, Ion Beam Applications (Belgium); and Gert Muehllehner, UGM (US).

The workshop aimed to highlight the factors that are necessary for success in entrepreneurship and the best practices to be adopted in the research and development environment. During a session entitled “The do’s and the don’ts of entrepreneurship”, five founders of spin-off companies reported on the problems they faced when developing their businesses. In addition to the well known problems – establishment of a business plan, funding, marketing and growth – the panel discussion gave useful indications on requirements of particular relevance to scientist-entrepreneurs: to match a high-tech product with market and customer needs; to team up with third parties knowledgeable in business and administration (e.g. local business schools); and to know how to produce a business plan.

As Muehllehner said: “To succeed, the scientist-entrepreneur needs to have a finished product, an established market, a team of people (finance, marketing and sales) and a source of money. Failing to have one of these [means] the chance of success drops to 80%; failing to have two [means] it is only 25%; and don’t even start if you’re missing more than two.”

During the session entitled “How to turn a scientist into an entrepreneur”, representatives of major research and development laboratories and European institutes presented their most recent initiatives. The oral presentations gave special attention to training actions, support given to entrepreneurs (identification of nascent technologies, intellectual property, seed capital and funding), and measures aimed at fostering a more entrepreneurial spirit.

Panel discussions agreed that there was substantial value in the direct exploitation of technology as compared with licensing. The need to foster an entrepreneurial spirit among scientists, and their evident willingness to transfer technology, were also examined. Raising scientists’ awareness of the value of intellectual property and of how to exploit its worth, together with the need for networking with other entrepreneurs and venture capitalists, was seen as a key measure likely to foster a change of culture, at least in Europe.

Strength in numbers: particle physics goes global


If science knows no geographical frontiers, then its parliaments too need to be international. One such platform is the Global Science Forum (GSF) of the influential Organization for Economic Cooperation and Development (OECD). The GSF, the successor to the OECD’s Megascience Forum, which was established in 1992, has set up working groups in several specialist areas, in which particle physics has always featured prominently.

A GSF meeting in London on 13-15 April 2000 agreed to form a Consultative Group to advise the GSF on charting a “roadmap” for high-energy physics over the coming 20-30 years to prepare the way for new large facilities.

The group’s membership of active physicists and scientific administrators represents OECD member states and also non-member states that have an active high-energy physics programme.

At the London meeting, the group was mandated to consider both accelerator- and non-accelerator-based experimental and theoretical particle physics, plus particle astrophysics, and to report to the GSF in mid-2002.

The GSF initiative has come during a period of rapid innovation in high-energy physics. The Large Hadron Collider is now being constructed at CERN with a collision energy seven times that of the Fermilab Tevatron. Japan, the US and Europe have all developed plans for the construction of a 0.5-1 TeV electron-positron linear collider. The physics case for such a collider is strong and complements that of the LHC. However, the construction costs of such a machine are high.

At the same time, new accelerator ideas have prompted promising R&D on muon storage rings and the resultant creation of intense neutrino beams. At higher energies, R&D on a multi-TeV electron-positron linear collider (CLIC) is continuing, and R&D is starting on muon colliders and higher-energy hadron colliders. In parallel, the marriage of astrophysics and particle physics at both the experimental and the theoretical level is resulting in a significant programme.

First meeting

Approximately 50 delegates attended the first meeting of the Consultative Group at DESY, Hamburg, on 9-11 November 2000, which was chaired by Ian Corbett of the UK. The meeting was also attended by observers from CERN; the various Asian, US, European and international high-energy physics communities (the Asian Committee for Future Accelerators, ACFA; the US High Energy Physics Advisory Panel, HEPAP; the European Committee for Future Accelerators, ECFA; and the International Committee for Future Accelerators, ICFA); the particle astrophysics branch of the International Union of Pure and Applied Physics (IUPAP-PANAGIC); and the European Union. Future meetings of the group will be held at CERN, in Japan and in the US before the group reports to the GSF.

The meeting at DESY was especially relevant because of the proposed construction of a 500-800 GeV electron-positron superconducting linear collider (TESLA) by an international collaboration in which DESY plays a central role. Following an introduction by Hermann-Friedrich Wagner of the German delegation, and a physics perspective by Brian Foster (Bristol), DESY director Albrecht Wagner described in detail the planning and prototype activities of the project. In particular, he showed impressive results achieved by the collaboration on the development of high-gradient accelerator cavities as part of the Tesla Test Facility (TTF2). This will be used as a high-intensity X-ray source from 2003 (the SASE FEL project; see Towards the ultimate X-ray source: the X-ray laser). Wagner announced the submission in March of a Technical Design Report for consideration by the German Government, and he expressed ideas on how such a machine might be built (in particular, involving a Global Accelerator Network of national laboratories in the construction of the machine; see Accelerators to span the globe).

Peter Rosen from the US Department of Energy (DOE) pointed out the impressive new particle and nuclear physics facilities coming on line in the US (the upgraded TeV2 proton-antiproton collider at Fermilab; the B factory at SLAC, Stanford; and Brookhaven’s RHIC heavy-ion collider) and the need to exploit these facilities. He noted ongoing R&D towards a high-energy electron-positron linear collider, as well as towards muon storage rings and hadron colliders beyond the LHC (VLHC). He said that the US community would discuss future perspectives at a workshop in Snowmass in June-July, to be followed by an HEPAP panel that would report to the DOE and the National Science Foundation later this year.

Long-term health

Ger van Middelkoop (NIKHEF), expressing the viewpoint of the smaller European nations that have an active high-energy physics programme, emphasized the importance of CERN to the long-term health of both European and international particle physics.

Noting the increasing international character of Japanese activities (the KEK B-Factory and the K2K neutrino project in Japan; LEP and the LHC at CERN; and CDF at Fermilab), Sakue Yamada (KEK) described the major accelerator R&D in Japan towards a high-energy electron-positron linear collider design using the Accelerator Test Facility (ATF) at KEK. He emphasized the importance of input from the physics communities before reaching decisions on the construction of the next accelerator facility.

Finally, Kurt Hübner from CERN described longer-term R&D towards CLIC, a CERN design for a multi-TeV electron-positron linear collider, and collaborations with other European laboratories on R&D towards a muon storage ring and intense neutrino beam. He also noted studies towards upgrading the intensity and/or energy of LHC beams.

In the following discussion there was vigorous support for increasing the R&D expenditure on accelerator technologies. There was also a strong plea to maintain some regional competition in the development of promising physics and accelerator programmes. Owing to the overlapping R&D activities in Europe, the US and Japan towards a 0.5-1 TeV electron-positron linear collider design, and with parties in each region pushing to build such a machine, “bottom-up” assessments of the high-energy physics situation have been requested. Reports from these bodies are expected during 2001. In particular, ECFA:

  • is sponsoring an ECFA-DESY working group on physics and detectors at an electron-positron linear collider that will form part of the Technical Design Report for TESLA;
  • is supporting a series of European R&D initiatives towards a muon storage ring and intense neutrino beam complex;
  • has formed a working group chaired by Lorenzo Foà on the future of accelerator-based high-energy physics activities in Europe.

The working group, with its membership representing the different CERN member states, has been requested to reach a European physicist consensus on a roadmap for accelerator-based particle physics beyond the LHC, as well as the infrastructure required and the R&D still needed. The group expects to be in a position to complete its report in the middle of this year. At the same time, ICFA has created subgroups to study the technical and organizational issues related to Wagner’s Global Accelerator Network of accelerator construction.

The PANAGIC subgroup of IUPAP has recently reported to IUPAP the progress in charting its roadmap outlining key activities, and this was distributed by PANAGIC chairman Alessandro Bettini.

The GSF consultative group also set up a small subgroup to work with the OECD on the organizational and sociological issues of a “world laboratory”, and data will be collected on the funding and governmental policies of participating countries. These studies, together with the reports of ICFA, ECFA, ACFA, HEPAP and PANAGIC, will provide the major input to the consultative group.

The road ahead


CERN is fortunate to have a major accelerator project, the LHC, under active construction. This will take particle physics into a new energy regime, where we are confident that it will resolve many of the puzzles raised by the brilliant confirmation of the Standard Model by experiments at LEP and elsewhere. The LHC is the key to the future of high-energy physics and of CERN, and it offers bright prospects to the new generation of young particle physicists.

The LHC is a highly complex project, both technically and organizationally. The accelerator and the detectors involve sophisticated technologies, in many cases on industrial scales never attempted before in a scientific project.

Moreover, the LHC is truly a global project, with contributions to the accelerator from many countries outside Europe, as well as CERN and its member states, posing difficult problems of coordination and planning. One should also not forget that the long-term plan approved in 1996 left CERN with a reduced budget and no adequate contingency for the LHC.

There have already been unforeseeable delays in the civil engineering for the LHC. The industrialization of the successful prototype magnet technologies remains a challenge, and there are undoubtedly many more obstacles ahead.

Nonetheless, the LHC project is progressing steadily, contracts for a large fraction of subsystems have been adjudicated on schedule, and I consider it one of my primary responsibilities as director-general of CERN to further it as best I can, and, as the doctors vow, avoid doing it any harm.

Weighing the implications

Over the past two years, dedicated work by CERN’s accelerator staff and the installation of advanced LHC cryogenics made it possible to run LEP at energies above its design value, and for one year longer than originally planned.

When data from LEP in the first part of 2000 revealed hints of new physics, the CERN management extended its run twice, in all from mid-September to the beginning of November, after first reassuring itself that these extensions would have no significant impact on the LHC. I was delighted to hear that the rapid and innovative combination of data from the four LEP experiments by their joint Higgs working group found that these early hints were strengthened, with the most likely interpretation being a Higgs boson weighing about 115 GeV.

In parallel with these extensions of the LEP run, the CERN directorate commissioned a study of the possible implications for the LHC if LEP were to run in 2001. Two aspects needed to be considered. The LHC will be housed in the same tunnel as LEP, and the dismantling of LEP and modifications to the tunnel to accommodate the LHC are on the critical path. Also, the staff required for the operation of LEP would not be transferred as foreseen to LHC construction. Several ingenious ways to reschedule part of the essential work were tried, but finally we came to the conclusion that the LHC would inevitably be delayed by about a year if LEP were to run for a full year in 2001.

Extra cost

There were also financial and personnel problems with a further LEP extension. It would have required around 100 million Swiss francs: about 40 million in running costs, with the rest going to penalties for civil-engineering contracts, additional rescheduling expenses and other items that are difficult to quantify a priori.

I was grateful to see that some CERN delegations were willing to consider providing their share of these extra costs, but the bulk would inevitably have been borne by the regular CERN budget. Thus I came equally reluctantly to the conclusion that an LEP extension would be a major squeeze on the resources needed for the LHC project.

Projections of the signal seen at LEP in 2000 indicated that a year’s running might not lead to a conclusive result, particularly if the mass of the Higgs boson was in the upper part of the indicated range, namely around 116 GeV. This reflects the fact that the signal is seen at the very end of the LEP energy range.

To put this region under real scrutiny would require a significant energy increase, which in turn implies significant further expense and a prolongation of at least two years, one year being approximately the time needed for the industrial production of new accelerating cavities. That would have led to a major disruption of the LHC project.

Overall


Putting all of these reasons together, and after consultation with the scientific committees, my colleagues in the CERN directorate and I became convinced that running LEP in the year 2001 would put the LHC under unacceptable pressure, and we decided that the CERN programme should not be changed to accommodate it. This decision had to be taken rapidly, precisely so as not to impact on the LHC schedule. I appreciated the efforts of the scientific review committees, which provided their advice and presented vigorously a variety of views, under the pressure of time.

Unlike the running of LEP in the year 2000, the issue of whether one should prolong LEP in 2001 divided the community and the scientific committees, and no consensus solution could be proposed. CERN management eventually cut the Gordian knot in favour of the LHC.

I understand the frustration and sadness of those who feel that they had the Higgs boson within their grasp, and fear that it may be years before their work can be confirmed.

Nevertheless, I am convinced that the best way forward for particle physics is the LHC. A Higgs boson as light as 115 GeV is most likely the signal of a rich supersymmetric particle spectrum at low energy, and the LHC will be the ideal instrument to put CERN and the physics community in a position to explore fully the new frontier in particle physics, which we may have glimpsed through the fascinating LEP events.

I hope that the high-energy physics community will join us in working wholeheartedly towards this exciting and challenging goal.

Against the Donning of the Gown by Galileo Galilei in 1590

translated into English by Giovanni Bignami, Moon Books Limited 2000. Information and orders via http://www.galileounaluna.com/


It is with pity and anguish that I see
Students and seekers of the Greatest Good
Fail yet again to strike where it may be

So begins an epic verse penned in 1590 in Pisa, not by the poet Francesco Berni, who defined the rhythmic style of the poem, nor by Pisa’s Cardinal Antonio Pozzi, but by his contemporary, Galileo Galilei. To those familiar only with Galileo’s scientific work, the fact that he also composed poetry might come as something of a revelation. That he should begin by talking of the greatest good, even more so. Yet the subject matter of this work was close to the young scientist’s heart, as soon becomes apparent in Giovanni Bignami’s wonderful English translation.

Bignami, head of science at the Italian Space Agency, is a master of modern English. With this work he has gone one step further by translating the poem into the English of Shakespeare, and Berni’s rhythmic form into iambic pentameter. Moreover, as Bignami himself points out, the challenge of translating poetry from a language with 7 vowel sounds to one with 52 was daunting in its own right. But Bignami has succeeded spectacularly. The translation reads with fluid clarity, and the humour is as intact as can be expected after its journey through time and language.

It is a few pages in that we begin to learn what stirred Galileo to put pen to paper:

I now conclude, and turn to you, signor,
And force you to confess, against your will,
The Greatest Good will be all clothes to abhor

As a young lecturer in Pisa, Galileo railed against a system in which he was obliged to wear his academic gown at all times, on pain of heavy fines, and this poem is his response. His technique is to take the very idea of wearing – or rather not wearing – clothes to its logical conclusion and to propose, tongue firmly planted in cheek, that we do as the beasts do and go naked.

Hilarious and profoundly irreverent consequences rapidly ensue as Galileo examines, for example, the potential repercussions for matchmaking and marriage.

Moon Books of Milan has given the translation a fitting treatment by producing a volume using the materials and techniques of the time. It is rare to find a book of such beauty as the company’s calf-bound limited edition printed on hand-made paper and lavishly illustrated with original drawings by Donata Almici. It is even rarer to find such a treat in store on opening the cover, and it would be a great shame if Prof. Bignami’s efforts, and indeed those of Galileo, were limited to the 2000 copies produced by Moon Books. Prof. Bignami is seeking a mainstream publisher to produce a more affordable edition. Here’s hoping that he succeeds.
