Developing countries and the global science web

Most developing countries experience great difficulties because of adverse economic conditions and political instability, which means they lag behind in scientific and technological development. Building science facilities can be very expensive, so there is the potential for an enormous gap between the rich and the poor. However, science has been quite successful in leap-frogging this gap, enabling scientists from developing countries to participate in many scientific activities. This has taken many forms, including interaction between scientists by e-mail, and visits by senior scientists and graduate students. Large facilities have also opened their doors to scientists from economically disadvantaged countries, literature and equipment have been donated by both organizations and individuals, and conference access has been made available.

With the advent of the World Wide Web and the rapid exchange of information via the Internet, one might naively have thought that much of the gap between developed and developing nations would disappear, even if problems still persisted for those areas of science that need expensive facilities. However, access to information, peer reviewed or not, depends on having the appropriate hardware, i.e. a computer, and Internet connectivity, and there is a serious problem with access to the Internet in developing countries. Gaining access to a computer is more of a question of economics, and one that we will assume will somehow be overcome. In this article we will instead concentrate on the issue of Internet connectivity.

Most of the countries with the lowest income economies have or have had serious problems with bandwidth, as well as with the high cost of access to the Internet. The high cost of connectivity is mainly due to the monopolies that communication companies are able to establish in developing countries. These costs, added to the low bandwidth, do not allow scientists to have timely access to information. In addition, there is also the expense of scientific literature, which is often prohibitive.

In most cases scientists in basic research do not attach economic value to their product, and so are willing to share their knowledge with fellow scientists, independent of their nationality or race. In addition, many scientific publishing companies are run at least partially by scientists, and most are willing to allow those from disadvantaged countries to access their journals, despite the usual high prices. This has given birth to some very successful initiatives in areas such as medicine (HINARI – Health InterNetwork Access to Research Initiative), biology (PERI – Programme for the Enhancement of Research Information), agriculture, fishery and forestry (AGORA – Access to Global Online Research in Agriculture), and physics, mathematics and biology (eJDS – electronic Journals Delivery Service). These are run by the World Health Organization, the International Network for the Availability of Scientific Publications, the Food and Agriculture Organization of the United Nations, and the Abdus Salam International Centre for Theoretical Physics (ICTP), respectively, in close collaboration with major publishing companies and societies.

All these initiatives, even if they use different ways to access the sources of information, have a common characteristic: they allow scientists in the least-developed countries (individually, or through their libraries) to access the best and most appreciated literature in their fields. And in most cases this access is free.

Renowned Ghanaian scientist Francis Allotey, who is active in the politics of science in Ghana, has said: “We paid the price for not taking part in the Industrial Revolution of the late eighteenth century because we did not have the opportunity to see what was taking place in Europe. Now we see that information and communication technology (ICT) has become an indispensable tool. This time, we should not miss out on this technological revolution.”

Up until a year ago it was not clear, despite the efforts of many, whether bridging the digital divide with Africa would be feasible. However, at a recent meeting in Trieste, “Open Round Table on Developing Countries Access to Scientific Knowledge: Quantifying the Digital Divide”, we were able to see that some of the African countries had come up with very ingenious ideas to keep up with ICT. It is clear that Africa has decided to take Allotey’s words to heart, and has engaged in whatever may be necessary to bridge the technological divide. It is therefore more important than ever that the efforts to help them that have already begun do not stop; the results are there to see. If this ICT revolution is not to be missed, scientific institutions must keep up to date, and this in turn relies on the Internet connectivity of these institutions. But do they have it?

For the first time, at the same round table, Les Cottrell of SLAC, Warren Matthews of the Georgia Institute of Technology and Enrique Canessa of ICTP, Trieste, presented results on the connectivity of institutions in Third World countries, using measurements performed by the SLAC/PingER project. Figure 1 shows all the countries covered by this project, which measures the round-trip time of Internet packets sent between monitoring sites and remote sites. Monitoring Internet connectivity or Internet performance in this way gives a good idea of trends – who is catching up, who is falling behind – and also allows a comparison of Internet performance with economic indicators. Since it is a quick way of measuring trends in ICT, decisions on investments can be made quickly enough to avoid irreversible damage.
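
To make the measurement concrete, here is a minimal sketch (not the PingER project’s actual code) of the kind of probe it performs: a monitoring host sends a few ICMP echo requests to a remote host and records the round-trip times and packet loss. It assumes a Unix-like system with the standard ping utility available, and the remote host name is purely illustrative; repeated regularly over months or years, such measurements yield exactly the kind of trend data discussed here.

```python
# Minimal sketch of a PingER-style probe (illustrative only, not PingER code).
# Assumes a Unix-like system where the standard `ping` utility is available.
import re
import subprocess
from statistics import mean, median

def ping_rtts(host, count=10, timeout_s=30):
    """Return the individual round-trip times (in ms) for `count` pings to `host`."""
    out = subprocess.run(
        ["ping", "-c", str(count), host],
        capture_output=True, text=True, timeout=timeout_s,
    ).stdout
    # Typical output line: "64 bytes from 192.0.2.1: icmp_seq=1 ttl=52 time=123 ms"
    return [float(t) for t in re.findall(r"time=([\d.]+) ms", out)]

if __name__ == "__main__":
    count = 10
    rtts = ping_rtts("example.org", count=count)   # hypothetical remote site
    loss = 100.0 * (count - len(rtts)) / count
    print(f"packet loss: {loss:.0f}%")
    if rtts:
        print(f"min/median/mean RTT: {min(rtts):.1f}/{median(rtts):.1f}/{mean(rtts):.1f} ms")
```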

The PingER project started some years ago to monitor Internet performance for data exchange in the large high-energy and nuclear-physics collaborations around the world. Recently, following an agreement with the ICTP and eJDS, the measurements have been extended to a selection of institutional hosts in developing countries, and are now available for around 80 countries.

Figure 2 shows throughputs in kilobytes per second as a function of time between January 1995 and January 2003. The measurements were made from SLAC in the US. The results show that Latin America, China and Africa, while at much lower levels of performance than the US (Edu), Canada and Europe, are keeping pace with these countries. Russia is quickly improving, but surprisingly India is lagging behind. This is a piece of information that should worry policy makers in India, as it is a country with a very well developed ICT industry. So why are its institutions behind? Part of the reason may be the choice of hosts being monitored in India, which are at academic and research institutions. But even so, this means that some institutions are lagging badly, with very poor connectivity.
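
The throughputs quoted by projects of this kind are not measured with file transfers; they are derived from the ping round-trip time and packet-loss rate. One widely quoted rule of thumb is the Mathis formula, throughput ≈ MSS/(RTT·√loss). Whether figure 2 was produced with exactly this estimator is an assumption on our part, and the numbers in the sketch below are invented purely to illustrate how strongly the estimate depends on RTT and loss.

```python
# Rough illustration of a "derived throughput" from ping statistics, using the
# Mathis et al. rule of thumb: throughput <= MSS / (RTT * sqrt(loss)).
# Assumption: this is only one common estimator; the inputs below are invented.
from math import sqrt

def derived_throughput_kbit_s(rtt_ms, loss_fraction, mss_bytes=1460):
    """Estimated TCP throughput in kbit/s from RTT (ms) and packet-loss fraction."""
    rtt_s = rtt_ms / 1000.0
    return mss_bytes * 8 / (rtt_s * sqrt(loss_fraction)) / 1000.0

# A well-connected link versus a poorly connected one (hypothetical values).
print(f"{derived_throughput_kbit_s(rtt_ms=30,  loss_fraction=0.001):8.0f} kbit/s")
print(f"{derived_throughput_kbit_s(rtt_ms=600, loss_fraction=0.05):8.0f} kbit/s")
```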

The amount of data gathered is not enough to give a complete picture of the whole world, but it does show that the technology to monitor is there. Policy makers from developing countries can benefit from the data, since the information is freely available on the eJDS website (www.ejds.org). Moreover, it can also help when large funding agencies decide to invest in development, and will give an idea of the performance of the various countries. This measure of connectivity should be considered as a new variable in the complex field of economics.

Can companies benefit from Big Science?

Several studies have indicated that there are significant returns on financial investment via “Big Science” centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each Euro invested in industry by Big Science generates a two- to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields – for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits.

We have therefore analysed the technological learning and innovation benefits derived from CERN’s procurement activity during 1997-2001. Our study was based on responses from technology-intensive companies of some financial significance. The main aim was to open up the “black box” of CERN as an environment for innovation and learning. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm’s organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings suggest ways in which CERN – and by implication other Big Science centres – can further translate technology transfer into spill-over benefits for industrial knowledge, and enhance their contribution to industrial R&D and innovation.

The method and sample

The empirical section of the study had several parts. First, a series of case studies was carried out to develop a theoretical framework describing influences on organizational learning in relationships between Big Science and suppliers; this framework is indicated in figure 1. Second, a wide survey of CERN’s supplier companies was carried out from autumn 2002 to March 2003. Finally, a parallel survey was carried out among CERN staff responsible for coordinating purchase projects, in order to explore learning effects and collaboration outcomes at CERN.

The focus of the survey was CERN-related learning, organizational and other benefits that accrue to supplier companies by virtue of their relationships with CERN. The questionnaire was designed according to best practice with multi-item scales used to measure both predictor and outcome variables. All scales were pre-tested in test interviews, and the feedback was used to iron out any inconsistencies and potential misunderstandings.
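
As a concrete illustration of how multi-item scales of this kind are typically handled – though not the study’s actual instrument or data – the sketch below averages several questionnaire items into one scale score per responding firm and computes Cronbach’s alpha as a quick internal-consistency check. The item responses are invented.

```python
# Minimal sketch of multi-item scale construction (illustrative data only).
import numpy as np

def cronbach_alpha(items):
    """items: array of shape (n_respondents, n_items); returns Cronbach's alpha."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical 5-point Likert responses to three items of one scale,
# e.g. a "technological learning" outcome scale (five respondents).
responses = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [3, 3, 2],
    [4, 4, 4],
])
scale_scores = responses.mean(axis=1)            # one score per respondent
print("scale scores:", scale_scores)
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```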

The base population of the study consisted of 629 companies – representing about 1197 million SwFr (€768 million) in procurement – which were selected as being technology-intensive suppliers with order sizes greater than 25,000 SwFr (€16,000). This base population was selected from a total of 6806 supplier companies, which generated supplies worth 2132 million SwFr (€1368 million) during this period (see “The selection process” table). The selection procedure therefore indicates that around 10% of CERN’s suppliers are firms that are both genuinely technology-intensive and also of some financial significance; when combined, they represent more than 50% of CERN’s total procurement budget during the period under study. We received 178 valid answers to our questionnaire from 154 of these companies, most of which were still supplying high-technology products or services to CERN beyond 2001. These answers formed the basis for our statistical study.

The findings

The learning outcomes documented in the study range from technology development and product development to organizational changes. The main findings show that while benefits vary extensively among suppliers, learning and innovation benefits tend to occur together: one type of learning is positively associated with other types of learning. Technological learning stands out as the main driver. Indeed, 44% of respondents indicated that they had acquired significant technological learning through their contract with CERN.

The study revealed important signs of development of new businesses, products and services, and increased internationalization of sales and marketing operations. As many as 38% of all respondents said they had developed new products as a direct result of their contractual relationship with CERN, while some 60% of the firms had acquired new customers. Extrapolating to the base population of 629 suppliers suggests that some 500 new products have been developed, attracting around 1900 new customers, essentially from outside high-energy physics. In addition, 41% of respondents said they would have had a poorer technological performance without the benefit of the contract with CERN, and 52% of these believed they would have had poorer sales.

Virtually all CERN’s suppliers appear to derive great value from CERN as a marketing reference (figure 2). In other words, CERN provides a prestigious customer reference that firms can use for marketing purposes. As a result of their interaction with CERN, 42% of the respondents have increased their international exposure, and 17% have opened a new market.

Suppliers also find benefits in terms of organizational effects from their involvement in CERN projects, such as improvements to their manufacturing capability, quality-control systems and R&D processes. In particular, 13% of the respondents indicated that they had formed new-product R&D teams and 14% reported the creation of new business units. A final aspect of the impact on organizational capabilities concerned project management, with 60% indicating strengthened capabilities in this area.

In examining what determines the outcomes of the relationships between suppliers and CERN, we found that learning and innovation benefits appear to be regulated by the quality of the relationship. The greater the social capital – for example, the trust, personal relationships and access to CERN contact networks – built into the relationship, the greater the learning and innovation benefits. This emphasizes the benefits of a partnership-type approach for technological learning and innovation.

In particular, we found three factors to be especially relevant: the frequency of interaction with CERN during the project, how widely used the technology was at the start of the project, and the number of the supplier’s technical people who interacted frequently with CERN.
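
To show the kind of analysis this implies – relating an outcome score to relationship variables such as interaction frequency, technology novelty at the start of the project and the number of technical staff in contact with CERN – the sketch below fits an ordinary least-squares model to invented data. The variable names, sample size and coefficients are hypothetical and are not taken from the study.

```python
# Illustrative OLS regression of a learning-benefit score on relationship
# variables; all data here are synthetic, generated only for the example.
import numpy as np

rng = np.random.default_rng(0)
n = 150                                       # of the order of the valid responses
interaction_freq = rng.uniform(0, 10, n)      # contacts per month (hypothetical)
tech_novelty = rng.uniform(0, 1, n)           # 0 = widely used, 1 = novel technology
n_tech_staff = rng.integers(1, 15, n)         # supplier staff in frequent contact

# A hypothetical "true" relationship plus noise, just to have something to fit.
learning = (0.4 * interaction_freq + 2.0 * tech_novelty
            + 0.1 * n_tech_staff + rng.normal(0, 1, n))

X = np.column_stack([np.ones(n), interaction_freq, tech_novelty, n_tech_staff])
coeffs, *_ = np.linalg.lstsq(X, learning, rcond=None)
for name, b in zip(["intercept", "interaction_freq", "tech_novelty", "n_tech_staff"],
                   coeffs):
    print(f"{name:17s} {b:+.3f}")
```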

During the study we also interviewed physicists and engineers at CERN, to cross-check the information provided by the companies. These interviews highlighted the benefits that these interactions with industry have for CERN personnel. We observed that the mutually beneficial effect is independent of the level of in-house experience and participation in high-tech projects, confirming the importance of maintaining an active “industry-Big Science” interaction. Furthermore, the results confirmed the importance of technologically challenging projects for the high levels of knowledge acquisition and motivation of highly qualified staff at CERN.

The implications

The study has shown that, to a greater extent than foreseen, many suppliers have benefited from technological and other types of learning during the execution of contracts with CERN. The benefits suggest that for companies participating in highly demanding and cutting-edge technological projects, conventional procurement and stringent requirements are not the most appropriate modes of interaction, especially if learning and innovation are to be nurtured in the long term. For a Big Science organization such as CERN, measures to facilitate true partnership, and ways of sustaining these benefits over projects with long development life-spans – either within the present financial and commercial rules or through other possible routes – will have to be carefully investigated. In particular, procurement policy should involve industry early on in the R&D phase of high-tech projects that will lead to significant contracts.

Lastly, note that the methodology developed is not specific to CERN. It would be interesting to determine if and how these findings from CERN compare with those of other Big Science centres.

Neutral currents and W and Z: a celebration

Steven Weinberg

Twenty years ago, in 1983, CERN announced the discovery of the W and Z bosons, a feat that earned the laboratory its first Nobel prize in 1984. Ten years previously, physicists working at CERN had found indirect evidence for the existence of the Z particle in the “neutral currents”. These discoveries together provided convincing evidence for the electroweak theory that unifies the weak force with the electromagnetic force, and that has become a cornerstone of the modern Standard Model of particles and forces.

Carlo Rubbia

These breakthroughs brought modern physics closer to one of its main goals: to understand nature’s particles and forces in a single theoretical framework. James Clerk Maxwell took the first steps along this path in the 1860s, when he realized that electricity and magnetism were manifestations of the same phenomenon. It would be another hundred years, however, before theorists succeeded with the next stage: unifying Maxwell’s electromagnetism with the weak force in a new electroweak theory.

Peter Zerwas (left) and John Ellis

In 1979, three of the theorists responsible for the electroweak theory, Sheldon Glashow, Abdus Salam and Steven Weinberg, were awarded the Nobel prize. In 1984, Carlo Rubbia and Simon van der Meer from CERN shared the prize for their part in the discovery of the W and Z particles. The results ushered in more than a decade of precision measurements at the Large Electron-Positron collider (LEP), which tested predictions of the Standard Model that could be calculated thanks to the work of theorists Gerard ‘t Hooft and Martinus Veltman, who shared the Nobel prize in 1999.

Don Perkins (right)

A big celebration

To celebrate the 30th and 20th anniversaries, respectively, of the discoveries of neutral currents and the W and Z bosons, CERN held a special symposium on 16 September. The authors (some 250) of the papers that announced the discoveries were invited to attend, together with others who had played valuable roles in the development of the Standard Model, including Glashow, Weinberg and Veltman. Weinberg gave the opening talk – a masterful discourse on “The making of the Standard Model”. He began in the 1950s, which he described as a time of frustration after the triumph of quantum electrodynamics, before moving on to an account of his own contributions to electroweak theory during the second half of the 1960s. He also mentioned the work of many other physicists, some of whom were in the audience.

Giorgio Brianti

An important step towards confirming electroweak unification came in 1973, when the late André Lagarrigue and colleagues working with the Gargamelle bubble chamber at CERN observed neutral currents – the neutral manifestation of the weak force that had been predicted by electroweak theory but never previously observed. Later in the same decade, Rubbia proposed turning the laboratory’s most powerful particle accelerator into a particle collider, an idea that received the support of the then directors-general, John Adams and Léon Van Hove. By colliding counter-rotating proton and antiproton beams head on, enough energy would be concentrated to produce W and Z particles. This was made possible, in particular, through van der Meer’s invention of “stochastic cooling” to produce sufficiently dense antiproton beams.

Dieter Haidt

In the second presentation at the symposium, Giorgio Brianti described CERN’s various contributions to accelerators and beams, beginning in 1961 with van der Meer’s invention of the magnetic horn to provide a more intense neutrino beam by focusing the charged particles (mostly pions) that decay to provide the neutrinos. A decade later the Intersecting Storage Rings became the world’s first proton-proton collider, and also the test-bed where stochastic cooling was invented.

Georges Charpak (left)

Dieter Haidt then took up the story of the experiments, with his presentation on the discovery of neutral currents. In the early 1970s Haidt, now at DESY, was a member of the Gargamelle collaboration. He described the background to the search for neutral currents, and the difficulties encountered in convincing everyone that they had indeed been observed in Gargamelle. Ultimately the discovery led to the high-precision neutrino experiments at CERN, and allowed the prediction of the mass of the W boson from electroweak theory – which in turn led to the proton-antiproton project.

Lyn Evans, Jos Engelen, Paul Messina

By 1981 the search for the W and Z particles was on. The observation of W particles by the UA1 and UA2 experiments was announced at CERN on 20 and 21 January 1983. The first observation of Z particles by UA1 followed soon after, with the announcement on 27 May (CERN Courier May 2003 p26). Pierre Darriulat, who was spokesperson of UA2 in 1978-1985, recalled the background to the decision to go ahead with the programme of converting the Super Proton Synchrotron (SPS) to a proton-antiproton collider, and cast his own personal illumination on the nature of the competition between the two experiments, UA1 and UA2.

Sam Ting, John Ellis, Georges Charpak, Steven Weinberg and Carlo Rubbia

With the discovery of the W and Z particles, electroweak theory became truly established. The challenge was now for a different kind of collider – LEP – to provide a precision test-bed for the theory. In the final presentation of the morning, Peter Zerwas from DESY looked back on the era of LEP, 1989-2000, and its many tests of the Standard Model of particle physics – testing not only electroweak theory but also quantum chromodynamics, the theory of the strong interaction.

Fabiola Gianotti (left) and Ignatios Antoniadis

A challenging future

In addition to reflecting on past findings, speakers at the symposium also looked to the future of CERN, in particular the Large Hadron Collider (LHC), which is due to start up in 2007. By colliding particles at extremely high energies, the LHC should shed light on such questions as: why do elementary particles have mass? What is the nature of the dark matter in the universe? Why did matter triumph over antimatter in the first moments of the universe, making our existence possible? And what was the state of matter a few microseconds after the Big Bang? These are the topics that John Ellis from CERN covered in the first presentation after lunch.

Pierre Darriulat

This was followed by a session on the challenges of the LHC. Lyn Evans, director of the LHC Project, presented the challenges of constructing and operating the collider; Jos Engelen, currently director of NIKHEF and soon to be chief scientific officer at CERN, described the challenge of building the detectors for the LHC; and Paul Messina, of Caltech, talked of the challenges to be met once the LHC and the experiments are running, when vast amounts of data will need to be collected and analysed.

Simon van der Meer

The discovery of the W and Z particles owed much to the development of detector techniques, in particular by Georges Charpak at CERN, who was rewarded with the Nobel prize in 1992. In his presentation, Charpak recalled his early days at CERN, when he was invited to come to the laboratory for six months only to stay eventually for many years, leading many developments in particle detectors. He also spoke of the opportunities for spreading enthusiasm for physics to the younger generations.

Robert Aymar (left)

In the final presentation CERN’s director-general Luciano Maiani gave his personal view of the future of the organization, and its importance for particle physics in general. This was followed by a panel discussion on the future of particle physics, chaired by Rubbia – a former director-general as well as Nobel laureate. Panel members included Robert Aymar, who will be director-general of CERN from 1 January 2004, Lev Okun, of ITEP, Donald Perkins of Oxford University, and van der Meer and Veltman, as well as symposium speakers Charpak, Darriulat, Maiani and Weinberg. Among the many points discussed, one of the most important, as Rubbia said in his summary, is that “we should not underestimate the surprise capacity of the LHC”.

Herwig Schopper

The symposium was organized by Roger Cashmore and Jean-Pierre Revol. Many of those attending had participated in the discoveries being celebrated – and many of the younger members of the audience will hope to participate in future discoveries at CERN, in particular in the coming years with the LHC.

Further reading

See www.cern.ch/cerndiscoveries.

The proceedings of the symposium will be published as:

1973: neutral currents, 1983: W± and Z⁰ bosons – the anniversary of CERN’s discoveries and a look into the future, The European Physical Journal C 34 1.

Particle physics is thriving in Spain

Particle physicists were a rare commodity in Spain when the country joined CERN in 1961 as the organization’s 14th member state. The civil war (1936-1939) had virtually created a scientific void; in George Orwell’s words from Homage to Catalonia, “It is not easy to convey the nightmare atmosphere of that time.” Blas Cabrera Felipe, who could have created a modern school of physics, left the country in 1939 and died in exile in Mexico in 1945, aged 67. Nor was science a priority in the aftermath of the war. “In fact it is difficult to understand why Spain joined CERN, devoid as she was of a scientific community in the field,” says the distinguished Spanish physicist Pedro Pascual.

Nonetheless, joining CERN gave a cardinal impetus to the development of particle physics in Spain. So when the country left CERN at the end of 1968 it was under vigorous protests from a small community of young and enthusiastic Spanish physicists. Fourteen years later, in 1983, Spain was welcomed back to CERN, now much better prepared for the challenges lying ahead.

Among the duties of the European Committee for Future Accelerators (ECFA) are visits by a sub-panel called RECFA (Restricted ECFA) to the CERN member states. The aim is to gauge the status of particle physics and closely related disciplines and to help improve the prevailing conditions by making recommendations to physicists, funding agencies and politicians.

Past problems

On its first visit to Spain, in 1983, RECFA noted that it would take time and real dedication to build up a community of experimental physicists in the country. Another issue of concern was the urgent need for networking facilities for high-energy physics – both within the country and for connections with other member states – because of, as RECFA phrased it, “the very large distances separating the universities in Spain”.

Now, 20 years later, the distances are no longer perceived as large, yet networking is still an issue, albeit on a much more sophisticated level. The unprecedented need for computing and networking has led, for example, to the LHC Computing Grid (LCG) project, with strong participation from Spain.

This year, in a welcome speech delivered on behalf of the minister of science and technology, the RECFA members were reminded that, ever since Spain rejoined CERN, the country has taken its responsibilities very seriously. In 1983 a special programme (the National Plan for Particle Physics and Large Accelerators) was created, within the framework of the Spanish “National Plan for R&D”. This programme, which funds participation in international experiments and pays some personnel costs, is still running. “The creation of the National Plan for R&D is the single most important event in the history of Spanish science,” says Enrique Fernandez, a former chairman of ECFA. Indeed, RECFA was delighted to learn that the Spanish investment has paid off handsomely – the growth of the community and the quality of the work are impressive.

Experimental particle physics in Spain is a young field, taking off in 1983 when the country rejoined CERN. By then, however, the period of rapid expansion at the universities had come to an end. In Spanish universities, positions are usually created as a response to the required teaching load, so the new experimental groups could not reach the critical mass needed for efficient participation in large collaborations. In addition, finding jobs at universities for engineers and technicians has always been very difficult.

To overcome these obstacles, Spanish physicists have in the past repeatedly expressed their desire to have a national institute, similar to INFN in Italy or IN2P3 in France. This was a major theme of RECFA visits in 1992 and 1997, and it resurfaced at this year’s meeting where it received strong support from the sub-panel. An important point is that Spain has 17 regions, each with its own government. These are responsible for education, which was formerly dealt with by the central authorities. This decentralization favours a multifaceted system, where research centres are funded by several sources, local as well as national. Indeed, there has been increasing support for particle physics from several regional governments. Therefore RECFA recommends that a national institute be created, but with a structure loose enough not to jeopardize local support.

Current structures and activities

In Spain, research in particle physics is carried out at universities as well as at non-university research institutes or centres. Most of the funds for specific projects, such as participation in Large Hadron Collider (LHC) experiments, are granted by the Ministry of Science and Technology, on a competitive basis through the National Plan for Particle Physics and Large Accelerators.

Experimental particle physics has in fact grown rapidly in Spain compared with many other countries. Currently the field is being pursued not only in Madrid, Valencia, Barcelona and Zaragoza, as was the case some years ago, but also in Santiago de Compostela, Santander, Granada and Seville. The amount of work carried out is vast and this article can give only a brief account.

A major centre for research is CIEMAT, the Research Centre for Energy, Environment and Technology in Madrid, which is financed by the Ministry of Science and Technology. It is a big organization with around 1150 employees, half with university degrees. CIEMAT has five research departments, one of them being the Department of Fusion and Elementary Particles. In addition, it has departments that can provide technical services and R&D support. This enables the researchers at CIEMAT to play a leading role in detector construction and R&D projects. Current activities include the construction of the barrel muon detector of CMS and participation in nTOF experiments at CERN, as well as an experiment at the PSI to determine the muon-decay coupling constant 20 times more accurately than previously. In astroparticle physics, CIEMAT is a major partner in the AMS experiment, to be performed on the International Space Station.

Several specialized research institutes are primarily funded by the Consejo Superior de Investigaciones Científicas (CSIC), an agency of the Ministry of Science and Technology. CSIC operates nationwide and provides funds for around 100 research institutes within a broad range of disciplines, similar to the structure of CNRS in France. Its prioritized areas of research include:
• elementary particle physics, including work at CERN;
• LHC computing and Grid technology;
• neutron physics – nTOF at CERN, and experiments at ILL in Grenoble and at the proposed European Spallation Source;
• synchrotron-radiation research at the ESRF, Grenoble, and at LURE, Orsay;
• detector and accelerator technologies.
The largest institute for particle physics funded by CSIC is the Instituto de Física Corpuscular (IFIC) in Valencia. Here, CSIC-funded researchers and those of the University of Valencia share premises for reasons of synergy. Experimental and theoretical research is performed primarily in areas of high-energy, nuclear and astroparticle physics. There are several activities in high-energy physics, among them construction of the ATLAS detector and the Grid project at CERN. Nuclear-physics activities include nTOF, and work at ISOLDE (CERN), GSI (Darmstadt), LNL (Padova) and in Jyväskylä, Finland; nuclear medicine is another line of research. IFIC is participating in the neutrino project ANTARES and in the gamma-ray mission INTEGRAL. IFIC is also an excellent centre for theoretical physics. The Instituto de Física de Cantabria (IFCA), at the University of Cantabria in Santander, is also co-financed by CSIC and the university. The institute participates in CMS and Grid projects as well as in the CDF experiment at Fermilab.

A centre of somewhat different character is the Institut de Física d’Altes Energies (IFAE) in Barcelona, which is a consortium between the local government and the Autonomous University of Barcelona (UAB). In addition to its own staff, the institute has associate members who are affiliated with the UAB or the University of Barcelona (UB). IFAE has the status of an institute of UAB and as such its members are allowed to teach doctoral courses at the university. This facilitates contact with young students, avoiding the danger of isolation that affects some free-standing research institutes.

Current projects at IFAE include detector construction for the ATLAS experiment and work on the Grid project, as well as participation in the CDF experiment. The astrophysics experiment MAGIC, for the detection of cosmic gamma rays, constitutes another major activity (see “MAGIC opens up the gamma-ray sky”). The institute is also moving into the domain of neutrino physics by analysing data from the K2K experiment in Japan, and it intends to participate in the proposed second-generation experiment, J-PARC-Nu. R&D, especially pertaining to the development of a novel X-ray detector for use in medical imaging, is another major activity. Many projects in theoretical particle physics are carried out at IFAE as well as at UAB and UB. In addition the UB has a small group that is involved in the BaBar experiment at SLAC and is participating in the LHCb experiment at CERN and the Grid project.

The university environment

Madrid is a centre for substantial activity in particle physics at two universities. Researchers from the Autonomous University of Madrid (UAM) have for a long time been involved with the ZEUS experiment at DESY. A UAM group is also participating in the construction of the ATLAS detector and another has started working on CMS. Researchers from the Universidad Complutense de Madrid (UCM) are contributing to the MAGIC experiment. Both universities have strong theory groups. Recently, a joint CSIC-UAM Institute has been created, which will no doubt strengthen theoretical particle physics in Spain and lead to further excellence.

Researchers from the University of Santiago de Compostela are engaged in the Auger Project (see “Auger ready for ultra-high-energy cosmic rays”) as well as in the construction of the LHCb detector. They have also been heavily involved in the DIRAC experiment at CERN. The University of Granada, meanwhile, is a newcomer in the area of experimental particle physics, having begun only in 2002. The Granada group is participating in the preparation of the ICARUS experiment, to be done at the Gran Sasso Laboratory in Italy. Theoretical particle physics, by contrast, has a longer history in Granada.

For 15 years Spain has also had an underground lab, Canfranc, situated in a tunnel between France and Spain. During this time the laboratory has undergone several upgrades; current research projects include double beta-decay experiments and searches for dark matter.

Finally, it is interesting to note that the nTOF experiments at CERN attract a rather large community of scientists from Spain, not only from large centres for particle physics but also from polytechnic universities and the University of Seville. Grid computing is another “unifying project”, with the Spanish groups participating in LHC experiments working together to set up the necessary infrastructure. A scientific information centre, the Port d’Informació Científica (PIC), coordinates these efforts.

The impact of the Large Electron-Positron (LEP) collider on the Spanish particle-physics community has been remarkable. Work at LEP has led to 64 PhD theses and about 1000 publications. In addition about 20 PhDs are awarded each year in theoretical particle physics. As mentioned above, both the increase in the size of the community and the financial support have been very substantial compared with the pre-LEP era. There are now around 270 experimenters and 150 particle theorists in Spain.

RECFA found that Spanish experimental particle physicists are at the forefront of international research and make an impressive contribution. They are indispensable partners in the collaborations to which they belong. RECFA was also very impressed by the vitality and intellectual leadership of the theoretical particle-physics community in Spain, which the committee found to be among the strongest in Europe.

Despite the Spanish success story, however, serious structural problems loom on the horizon. Finding permanent positions at the universities for gifted young physicists is currently deemed to be almost impossible, not only for demographic reasons but also because of a downward trend in the number of first-year physics students (in the past two years, however, this number has been more stable). Solving this problem would require finding new methods of funding.

Another issue of great concern is the lack of engineers and technicians at universities because, in general, there are no permanent positions available in these categories. These people, vital to research, have access only to temporary positions, supported by project funds, or to posts created within regional institutes.

The above problems must be solved urgently. The present major commitment of Spanish universities and institutes to the LHC project is expected to result in a repeat of the success story of LEP. Spain is an attractive country for high-level scientists, postdocs and visitors. There are great opportunities – but much more could be done if only the necessary funds and posts were available.

The case for investing in the future

Dear Professor Bourquin,
The European laboratory for particle physics, CERN, is often cited as the jewel of international collaboration. Created almost half a century ago by a consortium of 11 governments, CERN is now the world’s largest research laboratory, with 20 member states and significant support and collaboration from many other nations. Its scientific accomplishments are legion. CERN has been the worldwide leader in elementary-particle research for decades past and, hopefully, it will continue to be for decades to come.

The research carried out at CERN by its member states is “pure” in the sense of being guided primarily by the drive to understand the universe in which we live. It is an established fact that the practical consequences of even the most apparently exotic research have always been plentiful and unexpected: the Web, born at CERN, is only one example.

CERN is now embarked on an ambitious and fascinating endeavour: the construction of the Large Hadron Collider (LHC) and the deployment of large detectors to exploit it. Scientists within and outside the discipline of particle physics agree on the necessity for the timely completion of this promising project.

A series of circumstances has led its governing Council to require that all of CERN’s efforts, resources and manpower be focused on the LHC. While this may have become unavoidable, we wish to point out that such measures will be extremely damaging to the long-term future of the laboratory.

The scope of research now being done at CERN, as well as R&D for future CERN projects not related to the LHC, has already been reduced to a bare minimum. The moratorium on most lines of research will virtually terminate the training of the next generations of high-energy physicists. The severe cuts on R&D will have similar consequences for the training of accelerator and detector scientists, engineers and technicians. Moreover, in the absence of sufficient long-term investment now, there will be no further endeavours, CERN will have no post-LHC future, and its member states will have lost their leadership role in one of the most fundamental sciences. The CERN experience is not something that can be restarted “at the turn of a key”.

To remedy this situation, we would suggest that a small percentage of the CERN budget be earmarked for continued and renewed non-LHC research and for further R&D on future detectors and accelerators. We suggest that an appropriate ad hoc committee be constituted to study this proposal in detail. A modest effort of this kind would restore the enthusiastic morale once shared by all CERN personnel and users engaged in the quest for new knowledge; would preserve and transmit to future generations the carefully honed art and know-how underlying our discipline; and would enable CERN to continue to contribute to a better Europe and a better world.

We encourage the governments of the member states of CERN to weigh the present restrictive measures against a relatively small effort that would have an enormous impact upon the present and future research accomplishments at CERN, with considerable benefits to both science and society.
Sincerely,
Georges Charpak, Val L Fitch, Sheldon L Glashow, Gerard ‘t Hooft, Leon Lederman, Tsung-Dao Lee, Martin L Perl, Norman F Ramsey, Burton Richter, Carlo Rubbia, Martinus J G Veltman, Jack Steinberger, Steven Weinberg and Chen Ning Yang.

cc. Professors Aymar and Maiani, the members of Council and the members of the Scientific Policy Committee.

Maurice Bourquin and Luciano Maiani reply:
We feel very honoured that such a distinguished panel of scientists thinks so highly of the organization and of its achievements and shows concern about its future.

CERN was forced to make painful choices at the end of 2001, when the organization was confronted with a funding shortfall for the completion of the LHC project. In line with the recommendations of the External Review Committee, CERN management submitted a long-term plan, in June 2002, aimed at focusing resources and manpower on the LHC machine and experiments. The target of commissioning the machine in April 2007 was confirmed. The plan was finally accepted by Council in December 2002, and it puts the LHC construction on a firm, if very tight, basis.

The LHC is the worldwide priority in high-energy physics. The reduction in the scientific activities at CERN during the LHC construction was the price to pay for the future possession of this powerful tool.

A limited number of ring-fenced parallel activities have been maintained, however, to keep a minimum of scientific diversity (COMPASS, the long-baseline neutrino beam, low-energy antiprotons, low-energy nuclear and neutron physics) or to preserve vital options for CERN’s future (CLIC and its test facility, and the design of the front-end of the Superconducting Proton Linac).

While submitting to Council the strategic decisions adopted at the end of 2002, CERN management and several delegations pleaded with Council to consider providing additional resources to ensure an extension of CERN’s scientific activities, once the LHC financial situation was consolidated.

We welcome the recommendations made by the authoritative panel of Nobel laureates and we are sure they will help in opening a new fruitful discussion with the member states on the future of CERN and of international research in particle physics.

Weep for ISABELLE – a rhapsody in a minor key

by Mel Month, Avant Garde Press. ISBN 1410732533, $28.95.

This book attempts to unravel a complicated politico-scientific tapestry, but in trying to unpick some tricky knots it creates a few new tangles of its own. In 1982 the US high-energy physics community organized a meeting at Snowmass to look at the future of national high-energy physics. After riding the crest of a wave for 30 years, the community felt in danger of falling into deep water. Across the Atlantic, CERN’s proton-antiproton collider had not yet discovered the W and Z carriers of the weak nuclear force, but the writing on the wall was clear (the crucial discovery came in 1983).

The US community was pushing for an ultra-high-energy proton collider to probe a distant energy frontier and search for the “Higgs mechanism” – which drives the subtle electroweak symmetry breaking and ensures that the weak W and Z carriers are much heavier than the massless photon that mediates electromagnetism. Thus Snowmass helped paint the wagon for the US Superconducting Supercollider (SSC), which was to emerge as the nation’s bid to regain particle-physics superiority. But by 1993 the global financial climate had cooled and the SSC was sacrificed, leaving the field clear for CERN’s LHC to become the world focus for high-energy physics.

Among those at Snowmass in 1982 was Mel Month of the Brookhaven National Laboratory and founder of the US Particle Accelerator School. One lunchtime, Month blurted out his views on the current US physics scene. These innocent remarks were not meant for general consumption, but bosses have long ears and there was an abrasive run-in. Month’s career then became slow-tracked. A heavy chip on the shoulder can be difficult to offload. To help, Month compiled this 600-page book, in which he portrays himself as “Mickey”, a highly motivated but politically naive young Brookhaven researcher.

The first half of the book depicts the evolution of particle physics in the second half of the 20th century as seen through Brookhaven eyes. Brookhaven was the site of major postwar US high-energy machines, which from 1960 to 1975 made many discoveries and reaped an impressive Nobel harvest. But as the US continued to disperse its high-energy physics effort, Brookhaven began to lag behind in this research sector. Its contender, the ISABELLE proton collider, was overshadowed by other US plans and hampered by difficult technology for the superconducting magnets needed to guide its high-energy protons. Eventually ISABELLE had to make way for the new SSC, and Brookhaven looked to have missed the boat. (Ironically, when the SSC was finally cancelled, ISABELLE was reincarnated as the RHIC high-energy nuclear collider now in full swing at Brookhaven.) The laboratory’s stock tumbled further in the 1990s with unwarranted scaremongering about a tritium leak from its nuclear reactor.

The evolution of particle physics as seen from Brookhaven is a little like the British view of Europeanism – interesting but distorted because of evolved isolation. Month attributes blame, while his skewed overview brings some fresh insight and provides some vivid quotes: “Always the bridesmaid and never the bride”, referring to CERN’s early history; and for the SSC, “Like Lady ISABELLE a decade earlier, this dressed-to-kill damsel turned out to be a flash in the pan”.

The survey would be more valuable with a detailed index to help track through the intricate history. However, as the book calls itself a “historical novel”, none is supplied. The “novel” content is mainly confined to the second half of the book, where Month imagines Mickey interviewing the “Players”, the major characters in the book, most of whom are ex-Brookhaven management.

The book is difficult to read without an insider’s knowledge of particle physics. In the very first paragraph, BNL (for Brookhaven National Laboratory) appears without explanation, the first of much in-your-face shorthand, not all of which gets sorted out in the glossary. There are also some inaccuracies, such as: “1993 – Rubbia forced to resign as CERN director-general”.

In the push and shove of ruthless competition, most people experience at some time the bitterness of career injustice. These unpleasant episodes can be sublimated into fresh motivation, or simply filed. This book looks to have been a catharsis for Month, but does so much subjective detail need to be displayed?

From the Preshower to the New Technologies for Supercolliders

Edited by Björn H Wiik, Albrecht Wagner and Horst Wenninger, World Scientific. Hardback ISBN 9812381996, £53 ($78).

In 2000, the city of Bologna was the European Capital for Culture. To mark the occasion the University of Bologna and its Academy of Sciences published the achievements of their most distinguished members in the field of science and technology. This collection acknowledges the contributions of Antonino Zichichi and his colleagues in the development of experimental techniques that have contributed to the discovery of new particles and phenomena in the field of high-energy physics.

The collection was originally prepared by Björn Wiik, who at the time was director of DESY. After Wiik’s untimely death in 1999, Albrecht Wagner, Wiik’s successor, continued and completed his work, with the help of Horst Wenninger of CERN.

In his introduction Wiik recalls how in the early 1960s, when the dominant detector was the bubble chamber and the dominating field of interest was the physics of hadrons and neutrinos, Zichichi started to study the unfashionable topic of lepton-pair production in hadronic interactions. During the 1960s and early 1970s, Zichichi and colleagues, mainly working at CERN, developed a number of techniques to help in the problem of particle identification. This foresight was vindicated with the discovery in 1974 of the J/Ψ particle. The “pre shower” technology was essential to this discovery. In fact, this early emphasis on the development of innovative detection techniques continued to be one of Zichichi’s main scientific motivations.

The first section of this collection contains the major contributions from Zichichi and his co-workers on the development of three techniques that have come to be widely used in high-energy physics experiments: the “early shower development” method (universally used and now called the “pre shower” method), the study of range curves for high-energy muons in order to discriminate against pion penetration (“muon punch-through method”), and the “lead scintillator sandwich telescope” – the precursor of today’s calorimeters.

In the second section there are original papers by Zichichi and colleagues on high-precision time-of-flight (TOF) counters and the neutron missing-mass spectrometer technique. This section also includes an extract of a paper by Federico Palmonari on the AMS experiment. This experiment uses a TOF system that relies heavily on the early work of Zichichi and his colleagues at Bologna and CERN.

The third part of the collection describes the achievements of the LAA project. This was initiated by Zichichi, funded by the Italian government and implemented at CERN in 1986. The goal of the project was to prove the feasibility of a series of detector technologies that could be used in a future multi-TeV hadron collider. Zichichi had long promoted the construction in his native Sicily of a very high-energy hadron collider, the “Eloisatron”, with a collision energy of 200-1000 TeV and luminosities of up to 10³⁶ cm⁻² s⁻¹. The machine parameters that served as a basis of the LAA project were those of a 10% model of the Eloisatron, surprisingly close to those of the LHC. The book reproduces the CERN report by Zichichi on the main achievements of the LAA project. All aspects of detector layout were considered in the project and, in view of the demands of the machine, special attention was paid to radiation hardness, rate capability, hermeticity and momentum resolution of the detector assemblies.

From 1990 to 1996 the LAA was transformed into the CERN Detector R&D programme. The fourth section is a review by Wenninger of the impact of the results from these two programmes on the design of the LHC detectors. Although the solutions adopted for the LHC may differ from those studied at the LAA, Wenninger argues convincingly that the initial work had a great influence and measurable impact on the design of the present LHC detectors.

Through this collection of papers, which touch only on one aspect of his work, Zichichi emerges as a person highly motivated by the development of experimental techniques to meet the challenges of future high-energy particle-physics experiments. The early work carried out directly by Zichichi and colleagues and the later LAA work that he inspired have certainly had a significant and continuing influence on particle detector design.

Evolution of Networks: From Biological Nets to the Internet and WWW

By S N Dorogovtsev and J F F Mendes, Oxford University Press. Hardback ISBN 0198515901, £49.95 ($95).

Imagine some collections of diverse objects: autonomous systems on the Internet, pages on the World Wide Web, neurons, genes, proteins, citations of scientific publications, the words in a human language, etc. Is it not fascinating that the graph-based abstractions of these different systems reveal that certain qualitative relationships among the objects remain invariant from one system to another? For instance, it was found that in a graph of the protein interactions of the yeast S. cerevisiae, the number of nearest neighbours follows a power-law distribution, just as in a graph of the pages on the World Wide Web.

This book is an introduction to the exciting area of networks modelled as random graphs. The authors describe some fundamental structural properties of these graphs, and give a tour through a variety of real-world examples. They explain the underlying mechanisms that drive the evolution of graphs over time, and discuss the impact that a structural property of a graph may have on performance issues such as virus spreading and network connectivity.

As Dorogovtsev and Mendes put it, this book was written by physicists but is aimed at a broader audience. The technical developments are kept at a low level, so no particular prerequisites are needed to follow them, and the authors present timely examples that cover the broad scope of the book.

The first chapter gives definitions of basic metrics that characterize a graph. The next chapter is devoted to preferential linking, the principle that, for instance, explains the emergence of the power-law distribution of the number of nearest neighbours of a vertex in a graph. The book proceeds with a discussion of a broad set of network examples, including scientific literature, communication systems and biological systems. In the subsequent two chapters, the authors separately cover equilibrium and non-equilibrium networks. The chapter on equilibrium networks analyses the stochastic recursive evolutions that drive a random graph to its steady state (equilibrium); a standard example in this context is the random-graph construction of Erdős and Rényi. The chapter on non-equilibrium networks focuses on temporal aspects, especially the evolution of some probability measures of a graph over time. The book continues with a chapter on the global properties of graphs and their effect on performance. The authors end with appendices containing some mathematical content and a detailed bibliography of the graph literature.
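
As a concrete illustration of preferential linking – a generic toy implementation rather than code from the book – the sketch below grows a graph in which each new vertex attaches to an existing vertex chosen with probability proportional to its current degree. Grown long enough, the degree distribution develops the heavy, power-law-like tail discussed above.

```python
# Toy preferential-attachment growth: new vertices link to existing vertices
# with probability proportional to degree (illustrative only).
import random
from collections import Counter

def grow_preferential(n_vertices, seed=0):
    random.seed(seed)
    # `endpoints` holds one entry per edge end, so choosing uniformly from it
    # picks an existing vertex with probability proportional to its degree.
    endpoints = [0, 1]                 # start from a single edge 0-1
    degree = Counter({0: 1, 1: 1})
    for v in range(2, n_vertices):
        target = random.choice(endpoints)
        endpoints += [v, target]
        degree[v] += 1
        degree[target] += 1
    return degree

degree = grow_preferential(100_000)
histogram = Counter(degree.values())
for k in (1, 2, 4, 8, 16, 32):
    print(f"vertices of degree {k:3d}: {histogram.get(k, 0)}")
```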

The exposition of the book is very pedagogical. Instead of rushing to examples, the authors first introduce readers to important elementary notions. After motivating the problems through well-chosen examples, they delve into specific subjects in detail, and the fundamental principles are unveiled in a suitable manner. There is, however, a certain imbalance in the lengths of different chapters.

This book should benefit readers who seek to gain an insight into the fundamental principles that underlie the random graphs found in diverse scientific disciplines.

CEBAF celebrates seven years of physics

Jefferson Lab in Newport News, Virginia, recently celebrated the first seven years of physics with the Continuous Electron Beam Accelerator Facility, CEBAF. The unique design of this electron accelerator allows three experimental halls to be operated simultaneously, with a total beam current of 200 µA and a beam polarization of up to 80%. With this facility, a user community of more than 1000 scientists from 187 institutions in 20 countries has completed 81 nuclear-physics experiments, with substantial data taken on 23 more. From the data obtained in these experiments, more than 250 refereed journal articles have been published and 146 doctoral degrees have been awarded. In the near future more than 60 experiments are planned, and there are currently 128 PhD theses in progress.

To celebrate and review these accomplishments, while also looking toward the future, the Jefferson Lab user group board of directors organized a symposium, which was held on 11-13 June and dedicated to the memory of Nathan Isgur, Jefferson Lab’s first chief scientist. The meeting was divided into eight physics topics: nucleon form factors, few-body physics, reactions involving nuclei, strangeness production, structure functions, parity violation, deep exclusive reactions and hadron spectroscopy. Each topic was presented by one experimentalist and one theorist.

The symposium began with presentations by Donal Day of Virginia and John Ralston of Kansas on nucleon form factors, which probe the electromagnetic structure of the proton and neutron. The presentations included a discussion of the most referenced and surprising result from Jefferson Lab, that the proton’s form factors do not follow an expected simple relation. While theorists have proposed different models to explain this result, the basic ingredient in almost all new models is the addition of relativistic effects.

The talks continued with presentations focusing on few-body systems, such as the deuteron and 3He, by Paul Ulmer of Old Dominion University and Franz Gross from the College of William and Mary. In these experiments, the Jefferson Lab electron beam is used to knock out a proton from the few-body system or to probe it with elastic scattering. The expected yield can be calculated exactly, assuming nucleons and mesons are the underlying particles. The presentations showed that even with beam energies of up to 5.7 GeV, the electron scattering results are surprisingly well explained by the nucleon-meson models to distance scales of the order of 0.5 fm. In contrast, experiments on deuteron photodisintegration, which probe even smaller distance scales, have revealed clear evidence of the limitations of the nucleon-meson models and of the onset of quark-gluon degrees of freedom.

For reactions involving nuclei, i.e. many-body systems such as oxygen and carbon, statistical methods in the context of the nucleon-meson picture are used to calculate the expected yields of the quasi-elastic reaction. Larry Weinstein of Old Dominion University presented a talk entitled “So where are the quarks?”, in which he showed that the nucleon-meson model describes even the highest momentum transfer Jefferson Lab data, while Misak Sargsian of Florida International presented a talk looking mostly to the future, when the quark-gluon nature of matter should become evident from experiments with a 12 GeV electron beam.

Reinhard Schumacher of Carnegie Mellon and Steve Cotanch of North Carolina State presented reactions involving strangeness production, which includes the production of particles such as kaons. They showed new Jefferson Lab data confirming the θ+ particle discovered at SPring-8 in Japan. This new particle comprises five quarks and has been dubbed the pentaquark. It has been described as the first observed baryon resonance with more than three valence quarks, and it has sparked international excitement. A new Jefferson Lab experiment to study the particle further has already been approved.

cernceb2_11-03

Structure-function experiments, which provide information on the quark and gluon structure of the nucleon, were presented by Keith Griffioen of the College of William and Mary, and Wally Melnitchouk of Jefferson Lab. While Jefferson Lab’s beam energy is relatively low for this type of experiment, the high luminosity available has allowed many high-precision structure-function results to be produced. An interesting feature of the Jefferson Lab data is that if one scales the smooth deep-inelastic cross-section results from high-energy physics to the laboratory’s kinematics, the scaled results will pass through the average of the resonant peaks of the laboratory’s data. This effect, known as duality, may lead to a better understanding of how the underlying quarks and gluons link to the nucleon-meson models.

Krishna Kumar of Massachusetts and Michael Ramsey-Musolf from the California Institute of Technology presented the parity-violation experiments, where the strange quark distributions in the proton can be extracted by measuring the extremely small asymmetry in the elastic scattering of polarized electrons from an unpolarized proton target. One series of these experiments has already been completed at Jefferson Lab and several more are planned, including the G0 and HAPPEX-II experiments scheduled for next year.

Deep exclusive reactions – experiments done in deep-inelastic kinematics but where the detection of multiple particles allows the final state of the system to be determined – were presented by Michel Garcon of SPhN/Saclay and Andrei Belitsky of Maryland. Generalized parton distribution models, which should enable a complete description of the nucleon’s quark and gluon distributions to be extracted from this type of data, were presented along with the results of the deeply virtual Compton scattering experiments at HERMES at DESY and at Jefferson Lab. The results indicate that generalized parton distributions can indeed be extracted from this type of data. Several high-precision experiments are also planned for the coming years.

cernceb3_11-03

Steve Dytman of Pittsburgh and Simon Capstick of Florida State presented the wealth of hadron spectroscopy data that is coming from Jefferson Lab. The analysis of the vast set of data produced by the laboratory on the nucleon resonances has been only partially completed, but hints of new states are already emerging and work on a full partial-wave analysis of the data is now getting underway.

Following the physics presentations, Larry Cardman, the Jefferson Lab associate director for physics, presented the long-term outlook for the laboratory. This talk focused primarily on upgrading the CEBAF to a 12 GeV electron machine and building a fourth experimental hall. The higher energy will allow Jefferson Lab to continue its mission of mapping out the transition from the low-energy region where matter can be thought of as made of nucleons and mesons, to the high-energy region that reveals the fundamental quark and gluon nature of matter.

DESY looks to the future

In February 2003 Edelgard Bulmahn, the German federal minister of education and research, decided to support several large-scale facilities for basic scientific research. These included the 4 km long X-ray free-electron laser (XFEL), which was originally conceived as part of the project proposed by the international TESLA collaboration for a 33 km electron-positron linear collider to be built near DESY, Hamburg, together with an integrated X-ray laser laboratory. At the same time, the German government decided not to proceed nationally with the linear collider part of the TESLA project and not to propose a German site for such a machine at this moment, but to wait for international developments. These decisions will have important implications for DESY in the coming years.

The German government is thus proposing Hamburg as the site for a European XFEL facility, and is prepared to carry half of the investment costs of €673 million. A decision on construction should be possible within two years, and would be followed by a construction period of about six years. Since the announcement in February, the German government has entered into bilateral discussions with other European governments. The first goal is to set up two European working groups, one on scientific and technical issues, and one on organizational and administrative matters. In parallel, the European Strategy Forum on Research Infrastructures (ESFRI) recently organized a workshop at DESY, on 30-31 October, on the technological challenges of X-ray lasers. So far discussions have led to the conclusion that only one major facility for research with hard X-ray radiation should be developed in Europe. The XFEL is the only project proposal in Europe in this field.

In addition, the ministry foresees €120 million for the conversion of the PETRA storage ring – which currently serves as a pre-accelerator for HERA – into a high-performance third-generation synchrotron radiation source. This upgrade is scheduled to start in 2007, after the conclusion of the HERA physics programme, and is intended to strengthen further the research with synchrotron radiation.

Regarding the linear collider part of TESLA, the German government decided that DESY will continue work on the project as part of the international research and development effort. At the EPS conference on High Energy Physics held in Aachen in July, Hermann Schunck, director-general of the Federal Ministry of Education and Research, said: “We have to wait for international developments. But we will continue our efforts so that we can participate in a global linear collider project. Let me underline this – my government is the first one to have announced that it is in principle committed to participating in this project.”

Testing acceleration structures

For the past 10 years the TESLA collaboration has made decisive progress with the superconducting accelerator technology that forms the basis of the TESLA linear collider and the XFEL. A test accelerator of 250 MeV – the TESLA Test Facility (TTF) – has been built and has operated at DESY since 1997. The international partners in the project provided about 35% of the investment and personnel funding. At the TTF the collaboration successfully tested the superconducting acceleration structures and made groundbreaking progress on the SASE (Self-Amplified Spontaneous Emission) principle for a free-electron laser at short wavelengths around 100 nm. The first experiments with this new type of laser provided an impressive demonstration of the high scientific potential of free-electron lasers in the UV and X-ray region. The TTF is currently being extended to reach an energy of 1 GeV, corresponding to a length of 260 m. Starting in 2004, it will be available as a user facility for experiments with soft X-ray laser radiation at wavelengths above 6 nm. As such it will allow researchers to gain important experience with free-electron laser experiments in the X-ray region, and it will provide valuable operating experience for the linear collider.

In co-operation with European partners, DESY is actively preparing for the construction of the 20 GeV superconducting linear accelerator for the XFEL laboratory, and is focusing on issues related to the industrialization, mass production, quality assurance and reliability of all the linear accelerator components. A first step in the concrete planning of the XFEL will be the commissioning of the free-electron laser for soft X-ray radiation at the expanded TTF. Since the XFEL is to be realized as a European project, discussions are being held with scientists and politicians in countries that are interested in participating in the effort. In these discussions a number of issues must be examined and clarified, such as the exact operational parameters of the laser and the organizational models for the laser laboratory. The inclusion of international partners from a very early stage in the planning and development of the superconducting accelerator within the TESLA project has proved very helpful in this respect.

At the same time, the TESLA collaboration continues to pursue the high-gradient programme to demonstrate the accelerating field of 35 MV/m that is required to reach 800 GeV for a 33 km TESLA collider. Substantial progress has been made in this area. In a test at low RF power, four nine-cell cavities have shown the required performance of 35 MV/m after electro-polishing. Two of these cavities were then fully assembled with all their ancillaries and have reached gradients above 35 MV/m in long-term testing under typical collider operating conditions (at Q > 5 × 10⁹ with an RF loading as required for linear collider operation), but without beam. Each of these cavities corresponds to one-eighth of a TESLA cryo-module. This represents a significant step towards the milestones set by the International Linear Collider Technical Review Committee for “Phase II” of TESLA. A full test of one module – eight cavities – at 35 MV/m with beam will, however, take more time due to constraints on the resources available at DESY.
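As a back-of-the-envelope consistency check (my own illustrative numbers, not figures from the article), the quoted gradient and collider energy fit together roughly as follows: 800 GeV in the centre of mass means about 400 GeV per beam, and at 35 MV/m that requires on the order of 11 km of active accelerating structure per linac, which is compatible with a 33 km site once fill factors and beam-delivery sections are allowed for.

```python
# Rough, illustrative arithmetic only; the assumptions (equal energy split,
# neglecting injection energy and fill factor) are mine, not the article's.
energy_per_beam_mev = 400e3          # 800 GeV centre of mass -> ~400 GeV per beam
gradient_mv_per_m = 35.0             # target accelerating gradient
active_length_m = energy_per_beam_mev / gradient_mv_per_m
print(f"{active_length_m / 1000:.1f} km of active structure per beam")  # ~11.4 km
```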

Towards a linear collider

The next major step towards a global collider project concerns the choice of technology. The International Linear Collider Steering Committee is currently setting up an advisory group (“wise persons”), which will be charged with performing an analysis of the status of the two competing technologies (“warm” and “cold”) and with making a technology recommendation before the end of 2004.

If the chosen linear collider technology is “cold”, a major synergy will exist between the work on the XFEL and the linear collider. In this case the contribution of DESY and the partners in the TESLA collaboration will most likely be in the area of the main accelerator of the collider. A recent analysis of the work that needs to be done on the basic accelerator unit for the XFEL, the cryo-module, has shown that more than 90% of the issues to be tackled are the same for the XFEL and the linear collider. The synergy arises because the work now being done for the XFEL will be largely of direct use for the linear collider. In addition, the R&D funds now spent on the XFEL will not need to be spent again for a collider built using the cold technology.

If the chosen technology is “warm”, a major reassessment of the contributions will be necessary. In this case DESY will probably participate in other subsystems and its contribution will probably be less pronounced than for a “cold” machine due to the commitment to the XFEL.

DESY will continue to participate in the linear collider working groups of ICFA and ECFA, and once the technology choice has been made, the laboratory will be a partner in a European team within an international linear collider design effort. DESY will also play a major role in the design, construction and future operation of the collider detector(s).

The international efforts for the coming years aim at reaching an agreement, in principle, to start the construction of a linear collider in time for commissioning in 2014/15, in accordance with the recommendations of ACFA, ECFA, HEPAP and the OECD Global Science Forum. Taking into account a construction time of seven to eight years, this requires a decision to go ahead to be made in 2007. Such a schedule also requires the first funds to become available in 2007, although the major investment spending for the linear collider will typically begin three years after the project starts, i.e. around 2010, as has been the case for other major accelerators.

The future for DESY

The strength of DESY is the result of an in-house synergy in three key areas: accelerator development, particle physics and research with synchrotron radiation. Particle physics has been the driving force behind accelerator development, and this also applies to the TESLA project. The decisions of the research ministry have secured DESY’s long-term future as one of the world’s leading centres for research at accelerators.

In particle physics, DESY’s research and its contributions to both the linear collider itself and the detector will ensure that the laboratory remains a major contributor to the realization of the project, regardless of whether or not the facility is built in Germany.
