by Isaac Elishakoff and Yongjian Ren, Oxford University Press. Hardback ISBN 0198526318, £45 ($74.50).
Published in the series of Oxford texts on applied and engineering mathematics, this book is the first to discuss the finite-element method for structures with large stochastic variations. It has an impressive bibliography.
by Steven Weinberg, Cambridge University Press. Hardback ISBN 052182351X, £18.95 ($25).
I have been an admirer of this book since its first edition 20 years ago, and have recommended it on many courses for the general public, where people might be making their first encounter not only with particle physics but with physics itself. In my opinion that is the great strength of Weinberg’s book: it sets the discoveries of the first subatomic particles – the electron, proton and neutron – against a background of experimentation in physics, explaining in simple terms how we know that this is the way that matter is.
The electron, the longest- and best-known of subatomic particles, takes up the first half of the book, which is enriched by “flashbacks” to discuss topics such as energy and electric and magnetic forces. These subjects may be the bread-and-butter of the physicist’s world, but they are often less than obvious to most other people, who left these ideas behind when they left school. By presenting such concepts in an historical manner, Weinberg allows the reader to learn in a way that mirrors how the physicists of the 18th and 19th centuries themselves learned.
The members of the subatomic “zoo” discovered in the second half of the 20th century – from neutrinos to gluons – are covered in 10 or so pages at the end of the book. As Weinberg points out, it was not his intention to write another popular book on modern physics, and nowadays there are other books that readers can pick up once they have read Weinberg’s. This revised edition, however, brings the section on the modern particles up to date, and Weinberg has also taken the opportunity to point out the links between the historic discoveries and the work of particle physics today. I’m pleased the book is back in print and shall certainly continue to recommend it.
by T J M Boyd and J J Sanderson, Cambridge University Press. Hardback ISBN 0521452902, £80 ($120). Paperback ISBN 0521459125, £29.95 ($50).
Plasma physics is the study of the ionized state of matter; most of the baryonic matter in the universe is in the plasma state. Plasmas occur quite naturally whenever ordinary matter is heated to temperatures greater than 10⁴ K. The resulting plasmas are electrically charged gases or fluids. They are profoundly influenced by the long-range Coulomb interactions of the ions and electrons, and by magnetic fields, either applied externally or generated by currents within the plasma.
The plasma medium is inherently nonlinear because the electromagnetic fields are produced self-consistently by the charge density and currents associated with the plasma particles. The dynamics of such systems are complex, and understanding them requires new concepts and techniques. Plasma physics describes elementary processes in completely or partially ionized matter, using well-known principles at the microscopic level.
The Physics of Plasmas provides a systematic approach to the subject by discussing the models used to describe plasmas, starting with particle-orbit theory, then proceeding to the fluid description, magneto-hydrodynamics, wave modes, the kinetic description, radiation processes, nonlinear effects, and ending with a chapter on the mathematical structure underlying the theoretical models used in plasma physics.
The book provides a comprehensive and refreshing view of plasmas, concentrating on the physical interpretation of plasma phenomena. It is advertised as ideally suited to advanced undergraduate and graduate students of physics, astronomy, applied physics and engineering, with which I wholly agree. The advanced researcher will also find the book of interest and value in its treatment of both natural and laboratory plasmas. In fact anyone interested in plasma physics will find it a very useful book.
String theory in India recently received an unexpected boost when Jeffrey Epstein, a billionaire based in New York, gave string theorists associated with the Tata Institute of Fundamental Research (TIFR) in Mumbai a cheque for $100,000 (€86,200). The money will be managed by the Physics Department of Harvard University as a “TIFR String Theory Travel Fund”. Andrew Strominger of Harvard, who facilitated the gift, said that this team has “the highest intellectual output per dollar of any such group in the world”. A strong collaboration between the Physics Department at Harvard and TIFR is expected to begin when the young string theorist Shiraz Minwalla of Harvard joins the TIFR in 2004.
The gift is a real reward for the Indian string theorists who have made sustained and influential contributions to string theory over many years, notably in the areas of string dualities, black-hole physics, matrix models and cosmology.
Parvez Butt, chairman of the Pakistan Atomic Energy Commission (PAEC), visited CERN on 15 October. He is seen here (on the left) in the ATLAS assembly hall with (from left to right) Abdul Hai, Heavy Mechanical Complex-3 project director, Mohammad Naeem, Scientific Engineering Services, and ATLAS spokesperson Peter Jenni. Butt inspected the Pakistan-built saddle supports for the calorimeter, and toured the CMS assembly hall and surface civil engineering works. He also met with members of the Joint CERN-Pakistan Committee.
Most developing countries experience great difficulties because of adverse economic conditions and political instability, which means they lag behind in scientific and technological development. Building science facilities can be very expensive, so there is the potential for an enormous gap between the rich and the poor. However, science has been quite successful in leap-frogging this gap, enabling scientists from developing countries to participate in many scientific activities. This has taken many forms, including the interaction between scientists by e-mail, and visits by senior scientists and graduate students. Large facilities have also opened their doors to scientists from economically disadvantaged countries, literature and equipment have been donated by both organizations and individuals, and conference access has been made available.
With the advent of the World Wide Web and the rapid exchange of information via the Internet, one might naively have thought that much of the gap between developed and developing nations would disappear, even if problems still persisted for those areas of science that need expensive facilities. However, access to information, peer reviewed or not, depends on having the appropriate hardware, i.e. a computer, and Internet connectivity, and there is a serious problem with access to the Internet in developing countries. Gaining access to a computer is more a question of economics, one that we will assume can somehow be overcome. In this article we will instead concentrate on the issue of Internet connectivity.
Most of the countries with the lowest income economies have or have had serious problems with bandwidth, as well as with the high cost of access to the Internet. The high cost of connectivity is mainly due to the monopolies that communication companies are able to establish in developing countries. These costs, added to the low bandwidth, do not allow scientists to have timely access to information. In addition, there is also the expense of scientific literature, which is often prohibitive.
In most cases scientists in basic research do not attach economic value to their product, and so are willing to share their knowledge with fellow scientists, independent of their nationality or race. In addition, many scientific publishing companies are run at least partially by scientists, and most are willing to allow those from disadvantaged countries to access their journals, despite the usually high prices. This has given birth to some very successful initiatives in areas such as medicine (HINARI – Health InterNetwork Access to Research Initiative), biology (PERI – Programme for the Enhancement of Research Information), agriculture, fishery and forestry (AGORA – Access to Global Online Research in Agriculture), and physics, mathematics and biology (eJDS – electronic Journals Delivery Service). These are run by the World Health Organization, the International Network for the Availability of Scientific Publications, the Food and Agriculture Organization of the United Nations, and the Abdus Salam International Centre for Theoretical Physics (ICTP), respectively, in close collaboration with major publishing companies and societies.
All these initiatives, even if they use different ways to access the sources of information, have a common characteristic: they allow scientists in the least-developed countries (individually, or through their libraries) to access the best and most appreciated literature in their fields. And in most cases this access is free.
Renowned Ghanaian scientist Francis Allotey, who is active in the politics of science in Ghana, has said: “We paid the price for not taking part in the Industrial Revolution of the late eighteenth century because we did not have the opportunity to see what was taking place in Europe. Now we see that information and communication technology (ICT) has become an indispensable tool. This time, we should not miss out on this technological revolution.”
Up until a year ago it was not clear, despite the efforts of many, whether bridging the digital divide with Africa would be feasible. However, at a recent meeting in Trieste, “Open Round Table on Developing Countries Access to Scientific Knowledge: Quantifying the Digital Divide”, we were able to see that some of the African countries had come up with very ingenious ideas to keep up with ICT. It is clear that Africa has decided to take Allotey’s words to heart, and has engaged in whatever may be necessary to bridge the technological divide. It is therefore more important than ever that the efforts to help them that have already begun do not stop; the results are there to see. If this ICT revolution is not to be missed, scientific institutions must keep up to date, and this in turn relies on the Internet connectivity of these institutions. But do they have it?
For the first time, at the same round table, Les Cottrell of SLAC, Warren Matthews of the Georgia Institute of Technology and Enrique Canessa of ICTP, Trieste, presented results on the connectivity of institutions in Third World countries, using measurements performed by the SLAC/PingER project. Figure 1 shows all the countries covered by this project, which measures the return time of an Internet packet between monitor sites and remote sites. Monitoring Internet connectivity or Internet performance in this way gives a good idea of trends – who is catching up, who is falling behind – and also allows a comparison of Internet performance with economic indicators. Since it is a quick way of measuring trends in ICT, decisions on investments can be made quickly enough to avoid irreversible damage.
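The basic PingER measurement can be illustrated with a minimal sketch. This is not the project's actual code; the host name is hypothetical, and the sketch assumes a Unix-like `ping` command is available on the monitor site:

```python
import re
import subprocess

def parse_rtt_ms(ping_output):
    """Extract the round-trip time in milliseconds from Unix `ping` output."""
    match = re.search(r"time=([\d.]+)\s*ms", ping_output)
    return float(match.group(1)) if match else None

def measure_rtt(host):
    """Send one ICMP echo request to a remote site and return the RTT in ms
    (None if the packet was lost or the host is unreachable)."""
    result = subprocess.run(["ping", "-c", "1", host],
                            capture_output=True, text=True)
    return parse_rtt_ms(result.stdout)

# Hypothetical remote site monitored from SLAC:
# measure_rtt("monitor.example.org")
```

Repeating such probes at regular intervals, and recording both the round-trip times and the fraction of lost packets, yields exactly the kind of long-term trend data shown in the PingER results.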
The PingER project started some years ago to monitor Internet performance for data exchange in the large high-energy and nuclear-physics collaborations around the world. Recently, following an agreement with the ICTP and eJDS, the measurements have been extended to a selection of institutional hosts in developing countries, and are now available for around 80 countries.
Figure 2 shows throughputs in kilobytes per second as a function of time between January 1995 and January 2003. The measurements were made from SLAC in the US. The results show that Latin America, China and Africa, while at much lower levels of performance than the US (Edu), Canada and Europe, are keeping pace with these countries. Russia is quickly improving, but surprisingly India is lagging behind. This is a piece of information that should worry policy makers in India, as it is a country with a very well developed ICT sector. So why are their institutions behind? Part of the reason may be the choice of hosts being monitored in India, which are at academic and research institutions. But even so, this means that some institutions lag far behind, with very poor connectivity.
The amount of data gathered is not enough to give a complete picture of the whole world, but it does show that the technology to monitor is there. Policy makers from developing countries can benefit from the data, since the information is freely available on the eJDS website (www.ejds.org). Moreover, it can also help when large funding agencies decide to invest in development, and will give an idea of the performance of the various countries. This measure of connectivity should be considered as a new variable in the complex field of economics.
Several studies have indicated that there are significant returns on financial investment via “Big Science” centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each euro invested in industry by Big Science generates roughly a three- to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields – for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits.
We have therefore analysed the technological learning and innovation benefits derived from CERN’s procurement activity during 1997-2001. Our study was based on responses from technology-intensive companies of some financial significance. The main aim was to open up the “black box” of CERN as an environment for innovation and learning. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm’s organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings imply ways in which CERN – and by implication other Big Science centres – can further boost spill-over benefits for industrial knowledge and enhance their contribution to industrial R&D and innovation.
The method and sample
The empirical section of the study had several parts. First, a series of case studies was carried out to develop a theoretical framework describing influences on organizational learning in relationships between Big Science and suppliers. The theoretical framework for the study is indicated in figure 1. Second, a wide survey of CERN’s supplier companies was carried out from autumn 2002 to March 2003. Finally, a parallel survey was carried out among CERN staff responsible for coordinating purchase projects in order to explore learning effects and collaboration outcomes at CERN.
The focus of the survey was CERN-related learning, organizational and other benefits that accrue to supplier companies by virtue of their relationships with CERN. The questionnaire was designed according to best practice with multi-item scales used to measure both predictor and outcome variables. All scales were pre-tested in test interviews, and the feedback was used to iron out any inconsistencies and potential misunderstandings.
The base population of the study consisted of 629 companies – representing about 1197 million SwFr (€768 million) in procurement – which were selected as being technology-intensive suppliers with order sizes greater than 25,000 SwFr (€16,000). This base population was selected from a total of 6806 supplier companies, which generated supplies worth 2132 million SwFr (€1368 million) during this period (see “The selection process” table). The selection procedure therefore indicates that around 10% of CERN’s suppliers are firms that are both genuinely technology-intensive and also of some financial significance; when combined, they represent more than 50% of CERN’s total procurement budget during the period under study. We received 178 valid answers to our questionnaire from 154 of these companies, most of which were still supplying high-technology products or services to CERN beyond 2001. These answers formed the basis for our statistical study.
The findings
The learning outcomes documented in the study range from technology development and product development to organizational changes. The main findings show that while benefits vary extensively among suppliers, learning and innovation benefits tend to occur together: one type of learning is positively associated with other types of learning. Technological learning stands out as the main driver. Indeed, 44% of respondents indicated that they had acquired significant technological learning through their contract with CERN.
The study revealed important signs of development of new businesses, products and services, and increased internationalization of sales and marketing operations. As many as 38% of all respondents said they had developed new products as a direct result of their contractual relationship with CERN, while some 60% of the firms had acquired new customers. Extrapolating to the base population of 629 suppliers suggests that some 500 new products have been developed, attracting around 1900 new customers, essentially from outside high-energy physics. In addition, 41% of respondents said they would have had a poorer technological performance without the benefit of the contract with CERN, and 52% of these believed they would have had poorer sales.
Virtually all CERN’s suppliers appear to derive great value from CERN as a marketing reference (figure 2). In other words, CERN provides a prestigious customer reference that firms can use for marketing purposes. As a result of their interaction with CERN, 42% of the respondents have increased their international exposure, and 17% have opened a new market.
Suppliers also find benefits in terms of organizational effects from their involvement in CERN projects, such as improvements to their manufacturing capability, quality-control systems and R&D processes. In particular, 13% of the respondents indicated that they had formed new-product R&D teams and 14% reported the creation of new business units. A final aspect of the impact on organizational capabilities concerned project management, with 60% indicating strengthened capabilities in this area.
In examining what determines the outcomes of the relationships between suppliers and CERN, we found that learning and innovation benefits appear to be regulated by the quality of the relationship. The greater the social capital – for example, the trust, personal relationships and access to CERN contact networks – built into the relationship, the greater the learning and innovation benefits. This emphasizes the benefits of a partnership-type approach for technological learning and innovation.
In particular we found the frequency of interaction with CERN during the project to be relevant, together with how widely used the technology was at the start of the project, and the number of the supplier’s technical people who frequently interacted with CERN during the project.
During the study we also interviewed physicists and engineers at CERN, to cross-check the information provided by the companies. These interviews highlighted the benefits that these interactions with industry have for CERN personnel. We observed that the mutually beneficial effect is independent of the level of in-house experience and participation in high-tech projects, confirming the importance of maintaining an active “industry-Big Science” interaction. Furthermore the results confirmed the importance of technologically challenging projects for the high levels of knowledge acquisition and motivation of highly qualified staff at CERN.
The implications
The study has shown that, to a greater extent than foreseen, many suppliers have benefited from technological and other types of learning during the execution of contracts with CERN. The benefits suggest that for companies participating in highly demanding and cutting-edge technological projects, conventional procurements and stringent requirements are not the most appropriate modes of interaction, especially if learning and innovation are to be nurtured in the long term. For a Big Science organization such as CERN, measures to facilitate true partnership, and ways to sustain the derived benefits over projects with long development life-spans, will have to be investigated carefully – either within the present financial and commercial rules or through other possible routes. In particular, procurement policy should involve industry early on in the R&D phase of high-tech projects that will lead to significant contracts.
Lastly, note that the methodology developed is not specific to CERN. It would be interesting to determine if and how these findings from CERN compare with those of other Big Science centres.
Twenty years ago, in 1983, CERN announced the discovery of the W and Z bosons, a feat that earned the laboratory its first Nobel prize in 1984. Ten years previously, physicists working at CERN had found indirect evidence for the existence of the Z particle in the “neutral currents”. These discoveries together provided convincing evidence for the electroweak theory that unifies the weak force with the electromagnetic force, and that has become a cornerstone of the modern Standard Model of particles and forces.
These breakthroughs brought modern physics closer to one of its main goals: to understand nature’s particles and forces in a single theoretical framework. James Clerk Maxwell took the first steps along this path in the 1860s, when he realized that electricity and magnetism were manifestations of the same phenomenon. It would be another hundred years, however, before theorists succeeded with the next stage: unifying Maxwell’s electromagnetism with the weak force in a new electroweak theory.
In 1979, three of the theorists responsible for the electroweak theory, Sheldon Glashow, Abdus Salam and Steven Weinberg, were awarded the Nobel prize. In 1984, Carlo Rubbia and Simon van der Meer from CERN shared the prize for their part in the discovery of the W and Z particles. The results ushered in more than a decade of precision measurements at the Large Electron-Positron collider (LEP), which tested predictions of the Standard Model that could be calculated thanks to the work of theorists Gerard ‘t Hooft and Martinus Veltman, who shared the Nobel prize in 1999.
To celebrate the 30th and 20th anniversaries, respectively, of the discoveries of neutral currents and the W and Z bosons, CERN held a special symposium on 16 September. The authors (some 250) of the papers that announced the discoveries were invited to attend, together with others who had played valuable roles in the development of the Standard Model, including Glashow, Weinberg and Veltman. Weinberg gave the opening talk – a masterful discourse on “The making of the Standard Model”. He began in the 1950s, which he described as a time of frustration after the triumph of quantum electrodynamics, before moving on to an account of his own contributions to electroweak theory during the second half of the 1960s. He also mentioned the work of many other physicists, some of whom were in the audience.
An important step towards confirming electroweak unification came in 1973, when the late André Lagarrigue and colleagues working with the Gargamelle bubble chamber at CERN observed neutral currents – the neutral manifestation of the weak force that had been predicted by electroweak theory but never previously observed. Later in the same decade, Rubbia proposed turning the laboratory’s most powerful particle accelerator into a particle collider, an idea that received the support of the then directors-general, John Adams and Léon Van Hove. By colliding counter-rotating proton and antiproton beams head on, enough energy would be concentrated to produce W and Z particles. This was made possible, in particular, through van der Meer’s invention of “stochastic cooling” to produce sufficiently dense antiproton beams.
In the second presentation at the symposium, Giorgio Brianti described CERN’s various contributions to accelerators and beams, beginning in 1961 with van der Meer’s invention of the magnetic horn to provide a more intense neutrino beam by focusing the charged particles (mostly pions) that decay to provide the neutrinos. A decade later the Intersecting Storage Rings became the world’s first proton-proton collider, and also the test-bed where stochastic cooling was invented.
Dieter Haidt then took up the story of the experiments, with his presentation on the discovery of neutral currents. In the early 1970s Haidt, now at DESY, was a member of the Gargamelle collaboration. He described the background to the search for neutral currents, and the difficulties encountered in convincing everyone that they had indeed been observed in Gargamelle. Ultimately the discovery led to the high-precision neutrino experiments at CERN, and allowed the prediction of the mass of the W boson from electroweak theory – which in turn led to the proton-antiproton project.
By 1981 the search for the W and Z particles was on. The observation of W particles by the UA1 and UA2 experiments was announced at CERN on 20 and 21 January 1983. The first observation of Z particles by UA1 followed soon after, with the announcement on 27 May (CERN Courier May 2003 p26). Pierre Darriulat, who was spokesperson of UA2 in 1978-1985, recalled the background to the decision to go ahead with the programme of converting the Super Proton Synchrotron (SPS) to a proton-antiproton collider, and cast his own personal illumination on the nature of the competition between the two experiments, UA1 and UA2.
With the discovery of the W and Z particles, electroweak theory became truly established. The challenge was now for a different kind of collider – LEP – to provide a precision test-bed for the theory. In the final presentation of the morning, Peter Zerwas from DESY looked back on the era of LEP, 1989-2000, and its many tests of the Standard Model of particle physics – testing not only electroweak theory but also quantum chromodynamics, the theory of the strong interaction.
In addition to reflecting on past findings, speakers at the symposium also looked to the future of CERN, in particular the Large Hadron Collider (LHC), which is due to start up in 2007. By colliding particles at extremely high energies, the LHC should shed light on such questions as: why do elementary particles have mass? What is the nature of the dark matter in the universe? Why did matter triumph over antimatter in the first moments of the universe, making our existence possible? And what was the state of matter a few microseconds after the Big Bang? These are the topics that John Ellis from CERN covered in the first presentation after lunch.
This was followed by a session on the challenges of the LHC. Lyn Evans, director of the LHC Project, presented the challenges of constructing and operating the collider; Jos Engelen, currently director of NIKHEF and soon to be chief scientific officer at CERN, described the challenge of building the detectors for the LHC; and Paul Messina, of Caltech, talked of the challenges to be met once the LHC and the experiments are running, when vast amounts of data will need to be collected and analysed.
The discovery of the W and Z particles owed much to the development of detector techniques, in particular by Georges Charpak at CERN, who was rewarded with the Nobel prize in 1992. In his presentation, Charpak recalled his early days at CERN, when he was invited to come to the laboratory for six months only to stay eventually for many years, leading many developments in particle detectors. He also spoke of the opportunities for spreading enthusiasm for physics to the younger generations.
In the final presentation CERN’s director-general Luciano Maiani gave his personal view of the future of the organization, and its importance for particle physics in general. This was followed by a panel discussion on the future of particle physics, chaired by Rubbia – a former director-general as well as Nobel laureate. Panel members included Robert Aymar, who will be director-general of CERN from 1 January 2004, Lev Okun, of ITEP, Donald Perkins of Oxford University, and van der Meer and Veltman, as well as symposium speakers Charpak, Darriulat, Maiani and Weinberg. Among the many points discussed, one of the most important, as Rubbia said in his summary, is that “we should not underestimate the surprise capacity of the LHC”.
The symposium was organized by Roger Cashmore and Jean-Pierre Revol. Many of those attending had participated in the discoveries being celebrated – and many of the younger members of the audience will hope to participate in future discoveries at CERN, in particular in the coming years with the LHC.
Particle physicists were a rare commodity in Spain when the country joined CERN in 1961 as the organization’s 14th member state. The civil war (1936-1939) had virtually created a scientific void; in George Orwell’s words from Homage to Catalonia, “It is not easy to convey the nightmare atmosphere of that time.” Blas Cabrera Felipe, who could have created a modern school of physics, left the country in 1939 and died in exile in Mexico in 1945, aged 67. Nor was science a priority in the aftermath of the war. “In fact it is difficult to understand why Spain joined CERN, devoid as she was of a scientific community in the field,” says the distinguished Spanish physicist Pedro Pascual.
Nonetheless, joining CERN gave a cardinal impetus to the development of particle physics in Spain. So when the country left CERN at the end of 1968 it was under vigorous protests from a small community of young and enthusiastic Spanish physicists. Fourteen years later, in 1983, Spain was welcomed back to CERN, now much better prepared for the challenges lying ahead.
Among the duties of the European Committee for Future Accelerators (ECFA) are visits by a sub-panel called RECFA (Restricted ECFA) to the CERN member states. The aim is to gauge the status of particle physics and closely related disciplines and to help improve the prevailing conditions by making recommendations to physicists, funding agencies and politicians.
Past problems
On its first visit to Spain, in 1983, RECFA noted that it would take time and real dedication to build up a community of experimental physicists in the country. Another issue of concern was the urgent need for networking facilities for high-energy physics – both within the country and for connections with other member states – because of, as RECFA phrased it, “the very large distances separating the universities in Spain”.
Now, 20 years later, the distances are no longer perceived as large, yet networking is still an issue, albeit on a much more sophisticated level. The unprecedented need for computing and networking has led, for example, to the LHC Computing Grid (LCG) project, with strong participation from Spain.
This year, in a welcome speech delivered on behalf of the minister of science and technology, the RECFA members were reminded that, ever since Spain rejoined CERN, the country has taken its responsibilities very seriously. In 1983 a special programme (the National Plan for Particle Physics and Large Accelerators) was created, within the framework of the Spanish “National Plan for R&D”. This programme, which funds participation in international experiments and pays some personnel costs, is still running. “The creation of the National Plan for R&D is the single most important event in the history of Spanish science,” says Enrique Fernandez, a former chairman of ECFA. Indeed, RECFA was delighted to learn that the Spanish investment has paid off handsomely – the growth of the community and the quality of the work are impressive.
Experimental particle physics in Spain is a young field, taking off in 1983 when the country rejoined CERN. By then, however, the period of rapid expansion at the universities had come to an end. In Spanish universities, positions are usually created as a response to the required teaching load, so the new experimental groups could not reach the critical mass needed for efficient participation in large collaborations. In addition, finding jobs at universities for engineers and technicians has always been very difficult.
To overcome these obstacles, Spanish physicists have in the past repeatedly expressed their desire for a national institute, similar to INFN in Italy or IN2P3 in France. This was a major theme of RECFA visits in 1992 and 1997, and it resurfaced at this year’s meeting, where it received strong support from the sub-panel. An important point is that Spain has 17 regions, each with its own government. These are responsible for education, which was formerly dealt with by the central authorities. This decentralization favours a multifaceted system, where research centres are funded by several sources, local as well as national. Indeed, there has been increasing support for particle physics from several regional governments. RECFA therefore recommends that a national institute be created, but with a structure loose enough not to jeopardize local support.
Current structures and activities
In Spain, research in particle physics is carried out at universities as well as at non-university research institutes or centres. Most of the funds for specific projects, such as participation in Large Hadron Collider (LHC) experiments, are granted by the Ministry of Science and Technology, on a competitive basis through the National Plan for Particle Physics and Large Accelerators.
Experimental particle physics has in fact grown rapidly in Spain compared with many other countries. Currently the field is being pursued not only in Madrid, Valencia, Barcelona and Zaragoza, as was the case some years ago, but also in Santiago de Compostela, Santander, Granada and Seville. The amount of work carried out is vast and this article can give only a brief account.
A major centre for research is CIEMAT, the Research Centre for Energy, Environment and Technology in Madrid, which is financed by the Ministry of Science and Technology. It is a big organization with around 1150 employees, half with university degrees. CIEMAT has five research departments, one of them being the Department of Fusion and Elementary Particles. In addition, it has departments that can provide technical services and R&D support. This enables the researchers at CIEMAT to play a leading role in detector construction and R&D projects. Current activities include the construction of the barrel muon detector of CMS and participation in nTOF experiments at CERN, as well as an experiment at the PSI to determine the muon-decay coupling constant 20 times more accurately than previously. In astroparticle physics, CIEMAT is a major partner in the AMS experiment, to be performed on the International Space Station.
Several specialized research institutes are primarily funded by the Consejo Superior de Investigaciones Científicas (CSIC), an agency of the Ministry of Science and Technology. CSIC operates nationwide and provides funds for around 100 research institutes within a broad range of disciplines, similar to the structure of CNRS in France. Its prioritized areas of research include:
• elementary particle physics, including work at CERN;
• LHC computing and Grid technology;
• neutron physics – nTOF at CERN, and experiments at ILL in Grenoble and at the proposed European Spallation Source;
• synchrotron-radiation research at the ESRF, Grenoble, and at LURE, Orsay;
• detector and accelerator technologies.
The largest institute for particle physics funded by CSIC is the Instituto de Física Corpuscular (IFIC) in Valencia. Here, CSIC-funded researchers and those of the University of Valencia share premises for reasons of synergy. Experimental and theoretical research is performed primarily in areas of high-energy, nuclear and astroparticle physics. There are several activities in high-energy physics, among them construction of the ATLAS detector and the Grid project at CERN. Nuclear-physics activities include nTOF, and work at ISOLDE (CERN), GSI (Darmstadt), LNL (Padova) and in Jyväskylä, Finland; nuclear medicine is another line of research. IFIC is participating in the neutrino project ANTARES and in the gamma-ray mission INTEGRAL. IFIC is also an excellent centre for theoretical physics. The Instituto de Física de Cantabria (IFCA), at the University of Cantabria in Santander, is also co-financed by CSIC and the university. The institute participates in CMS and Grid projects as well as in the CDF experiment at Fermilab.
A centre of somewhat different character is the Institut de Física d’Altes Energies (IFAE) in Barcelona, which is a consortium between the local government and the Autonomous University of Barcelona (UAB). In addition to its own staff, the institute has associate members who are affiliated with the UAB or the University of Barcelona (UB). IFAE has the status of an institute of UAB and as such its members are allowed to teach doctoral courses at the university. This facilitates contact with young students, avoiding the danger of isolation that affects some free-standing research institutes.
Current projects at IFAE include detector construction for the ATLAS experiment and work on the Grid project, as well as participation in the CDF experiment. The astrophysics experiment MAGIC, for the detection of cosmic gamma rays, constitutes another major activity (see “MAGIC opens up the gamma-ray sky”). The institute is also moving into the domain of neutrino physics by analysing data from the K2K experiment in Japan, and it intends to participate in the proposed second-generation experiment, J-PARC-Nu. R&D, especially pertaining to the development of a novel X-ray detector for use in medical imaging, is another major activity. Many projects in theoretical particle physics are carried out at IFAE as well as at UAB and UB. In addition the UB has a small group that is involved in the BaBar experiment at SLAC and is participating in the LHCb experiment at CERN and the Grid project.
The university environment
Madrid is a centre for substantial activity in particle physics at two universities. Researchers from the Autonomous University of Madrid (UAM) have for a long time been involved with the ZEUS experiment at DESY. A UAM group is also participating in the construction of the ATLAS detector and another has started working on CMS. Researchers from the Universidad Complutense de Madrid (UCM) are contributing to the MAGIC experiment. Both universities have strong theory groups. Recently, a joint CSIC-UAM institute has been created, which will no doubt strengthen theoretical particle physics in Spain and lead to further excellence.
Researchers from the University of Santiago de Compostela are engaged in the Auger Project (see “Auger ready for ultra-high-energy cosmic rays”) as well as in the construction of the LHCb detector. They have also been heavily involved in the DIRAC experiment at CERN. The University of Granada, meanwhile, is a newcomer in the area of experimental particle physics, having begun only in 2002. The Granada group is participating in the preparation of the ICARUS experiment, to be done at the Gran Sasso Laboratory in Italy. Theoretical particle physics, by contrast, has a longer history in Granada.
For 15 years the country has also had an underground laboratory, Canfranc, situated in a tunnel between France and Spain. During this time the laboratory has undergone several upgrades; current research projects include double beta-decay experiments and searches for dark matter.
Finally, it is interesting to note that the nTOF experiments at CERN attract a rather large community of scientists from Spain, not only from large centres for particle physics but also from polytechnic universities and the University of Seville. Grid computing is another “unifying project”, with the Spanish groups participating in LHC experiments working together to set up the necessary infrastructure. A scientific information centre, the Port d’Informació Científica (PIC), coordinates these efforts.
The impact of the Large Electron-Positron (LEP) collider on the Spanish particle-physics community has been remarkable. Work at LEP has led to 64 PhD theses and about 1000 publications. In addition about 20 PhDs are awarded each year in theoretical particle physics. As mentioned above, both the increase in the size of the community and the financial support have been very substantial compared with the pre-LEP era. There are now around 270 experimenters and 150 particle theorists in Spain.
RECFA found that Spanish experimental particle physicists are at the forefront of international research and make an impressive contribution. They are indispensable partners in the collaborations to which they belong. RECFA was also very impressed by the vitality and intellectual leadership of the theoretical particle-physics community in Spain, which the committee found to be among the strongest in Europe.
Despite the Spanish success story, however, serious structural problems loom on the horizon. Finding permanent positions at the universities for gifted young physicists is currently deemed to be almost impossible, not only for demographic reasons but also because of a downward trend in the number of first-year physics students (in the past two years, however, this number has been more stable). Solving this problem would require finding new methods of funding.
Another issue of great concern is the lack of engineers and technicians at universities because, in general, there are no permanent positions available in these categories. These people, vital to research, have access only to temporary positions, supported by project funds, or to posts created within regional institutes.
The above problems must be solved urgently. The present major commitment of Spanish universities and institutes to the LHC project is expected to result in a repeat of the success story of LEP. Spain is an attractive country for high-level scientists, postdocs and visitors. There are great opportunities – but much more could be done if only the necessary funds and posts were available.
Dear Professor Bourquin,
The European laboratory for particle physics, CERN, is often cited as the jewel of international collaboration. Created almost half a century ago by a consortium of 11 governments, CERN is now the world’s largest research laboratory, with 20 member states and significant support and collaboration from many other nations. Its scientific accomplishments are legion. CERN has been the worldwide leader in elementary-particle research for decades past and, hopefully, it will continue to be for decades to come.
The research carried out at CERN by its member states is “pure” in the sense of being guided primarily by the drive to understand the universe in which we live. It is an established fact that the practical consequences of even the most apparently exotic research have always been plentiful and unexpected: the Web, born at CERN, is only one example.
CERN is now embarked on an ambitious and fascinating endeavour: the construction of the Large Hadron Collider (LHC) and the deployment of large detectors to exploit it. Scientists within and outside the discipline of particle physics agree on the necessity for the timely completion of this promising project.
A series of circumstances has led its governing Council to require that all of CERN’s efforts, resources and manpower be focused on the LHC. While this may have become unavoidable, we wish to point out that such measures will be extremely damaging to the long-term future of the laboratory.
The scope of research now being done at CERN, as well as R&D for future CERN projects not related to the LHC, has already been reduced to a bare minimum. The moratorium on most lines of research will virtually terminate the training of the next generations of high-energy physicists. The severe cuts in R&D will have similar consequences for the training of accelerator and detector scientists, engineers and technicians. Moreover, without sufficient long-term investment now, there will be no further endeavours: CERN will have no post-LHC future, and its member states will have lost their leadership role in one of the most fundamental sciences. The CERN experience is not something that can be restarted “at the turn of a key”.
To remedy this situation, we would suggest that a small percentage of the CERN budget be earmarked for continued and renewed non-LHC research and for further R&D on future detectors and accelerators. We suggest that an appropriate ad hoc committee be constituted to study this proposal in detail. A modest effort of this kind would restore the enthusiastic morale once shared by all CERN personnel and users engaged in the quest for new knowledge; would preserve and transmit to future generations the carefully honed art and know-how underlying our discipline; and would enable CERN to continue to contribute to a better Europe and a better world.
We encourage the governments of the member states of CERN to weigh the present restrictive measures against a relatively small effort that would have an enormous impact upon the present and future research accomplishments at CERN, with considerable benefits to both science and society.
Sincerely,
Georges Charpak, Val L Fitch, Sheldon L Glashow, Gerard ‘t Hooft, Leon Lederman, Tsung-Dao Lee, Martin L Perl, Norman F Ramsey, Burton Richter, Carlo Rubbia, Martinus J G Veltman, Jack Steinberger, Steven Weinberg and Chen Ning Yang.
cc. Professors Aymar and Maiani, the members of Council and the members of the Scientific Policy Committee.
Maurice Bourquin and Luciano Maiani reply:
We feel very honoured that such a distinguished panel of scientists thinks so highly of the organization and of its achievements and shows concern about its future.
CERN was forced to make painful choices at the end of 2001, when the organization was confronted with a funding shortfall for the completion of the LHC project. In line with the recommendations of the External Review Committee, CERN management submitted a long-term plan, in June 2002, aimed at focusing resources and manpower on the LHC machine and experiments. The target of commissioning the machine in April 2007 was confirmed. The plan was finally accepted by Council in December 2002, and it puts the LHC construction on a firm, if very tight, basis.
The LHC is the worldwide priority in high-energy physics. The reduction in the scientific activities at CERN during the LHC construction was the price to pay for the future possession of this powerful tool.
A limited number of ring-fenced parallel activities have been maintained, however, to keep a minimum of scientific diversity (COMPASS, long base-line neutrino beam, low-energy antiprotons, low-energy nuclear and neutron physics) or to preserve vital options for CERN’s future (CLIC and its test facility and the design of the front-end of the Superconducting Proton Linac).
When submitting the strategic decisions adopted at the end of 2002, CERN management and several delegations urged Council to consider providing additional resources to extend CERN’s scientific activities once the LHC financial situation was consolidated.
We welcome the recommendations made by the authoritative panel of Nobel laureates and we are sure they will help in opening a new fruitful discussion with the member states on the future of CERN and of international research in particle physics.