
CERN and Saclay: 40 years of co-operation


The Saclay Research Centre near Paris is the largest of the French Atomic Energy Commission’s research centres. Conceived from the start as a multi-purpose centre to bring together fundamental research and technical innovation, it has since expanded to explore many different aspects of nuclear physics and its applications, including, of course, particle physics. In 1963, when physicists from Saclay began experiments on the PS at CERN, it marked the start of a long and fruitful collaboration between the two laboratories. By the time the LHC is commissioned, this collaboration will have encompassed more than 30 different experiments, to which Saclay has brought its expertise in instrumentation, data acquisition and data analysis.

The bubble-chamber era

When researchers from Saclay first came to CERN in the 1960s, the majority of experiments in particle physics involved bubble chambers. Saclay was one of the pioneers of their construction and use at a number of accelerators: first Saturne at Saclay, then Nimrod – the Rutherford Laboratory’s 8 GeV proton synchrotron – and then the PS at CERN and the 70 GeV accelerator at Serpukhov. The first series of experiments at the PS by physicists from Saclay’s SPCHE (see “The early days of Saclay” box) involved the use of an 81 cm hydrogen bubble chamber. This was developed by the technical services at Saturne for the laboratory of the Ecole Polytechnique under the leadership of Bernard Gregory, who later became director-general of CERN. The chamber was used to study K⁻p, K⁻n, π⁻p, π⁺n and p̄p scattering at energies of a few GeV, with the aim of understanding the collision mechanisms. The same theme was repeated, but at higher energies, in a second series of experiments at the PS on CERN’s 2 m hydrogen bubble chamber.


The 1970s saw a further increase in collision energy, and bubble chambers became bigger and bigger in line with the increasing multiplicity and energy of the final particles. In 1971, André Lagarrigue led the construction of the Gargamelle bubble chamber at the Saturne laboratory. Exposed to the neutrino beam from CERN’s PS, Gargamelle led to the discovery of neutral currents in 1973. This was followed by the Big European Bubble Chamber (BEBC) at the SPS, which involved energies approximately 10 times higher than at the PS. In addition to investigating strong interaction mechanisms and resonances, these experiments also explored neutrino-nucleon and antineutrino-nucleon scattering, the first stage in a better understanding of neutrino physics.


During its 30 years of bubble-chamber experiments, Saclay’s DPhPE (Département de physique des particules élémentaires) – which the SPCHE had become in 1966 – built up strong teams, of around 150 people, that specialized in the scanning and measurement of images, before going on to develop automatic scanning techniques that allowed more than 10 million images to be analysed. As a result the DPhPE, together with CERN, had the greatest measurement and data-handling capacity of any European laboratory, and so was able to play a major role in the collaborations in which it participated. In developing bubble chambers, the DPhPE’s technical services also acquired skills in the fields of magnetism, cryogenics and control systems, as well as experience in the design, construction and running of large projects. This expertise was to come in useful in the new generation of experiments at CERN in which the DPhPE took part.

The first electronic experiments

The spark chamber was invented at the beginning of the 1960s, and when used in conjunction with counters equipped with fast electronic read-out systems, it allowed events to be pre-selected – something that is impossible in bubble chambers. Spark chambers are also able to record signals much more quickly than bubble chambers, and their use became widespread in high-energy physics, marking the start of the “electronic” detector era. The SPCHE soon turned to this new technology. Its first electronic experiment at CERN, performed in 1964 by the team of Paul Falk-Vairant, involved the measurement of the high-energy charge-exchange reaction π⁻p → π⁰n at the PS, in an extension of an earlier experiment at Saturne. The equipment designed at the SPCHE consisted of scintillation counters and optical spark chambers to detect electromagnetic showers. Working first with a liquid hydrogen target, the experiment seemed to confirm the simple Regge-pole theory favoured at the time; but when it was repeated with a polarised target, the results showed that a more complex interpretation was needed.


The DPhPE continued its extensive study of strong interaction mechanisms at the PS and also began to study strange particles following the 1964 discovery of CP violation in the neutral-kaon system, to which René Turlay, later a key figure at Saclay, contributed. In 1971, the start-up of the Intersecting Storage Rings (ISR) at CERN allowed matter to be explored at much higher energies, and physicists from the DPhPE took part in two experiments there. One of these was R702, whose purpose was to measure the production of particles with large transverse momentum, and whose results corroborated the theory of the granular structure of the proton. In these early electronic experiments, the DPhPE contributed detector elements and the associated electronics commonly used at the time: scintillation counters for the trigger, Cherenkov counters for particle identification, spark chambers for measuring trajectories, and polarised targets. The end of the 1960s saw the appearance of wire chambers for tracking, which were faster and more precise than spark chambers, and which allowed larger, easier-to-operate detectors to be built.

By 1977, when CERN’s 200-400 GeV proton synchrotron, the SPS, had been commissioned, the results of the previous 15 years had changed the perception of particle physics. In particular, the discoveries of neutral currents at Gargamelle in 1973 and of charmed particles in 1974 represented an initial experimental validation of the Glashow-Weinberg-Salam theory of the electroweak force and of quantum chromodynamics (QCD), the theory of the strong interaction. The various experiments at the SPS set out to test these theories in more depth. The DPhPE played an active role, taking part in the deep inelastic scattering experiments with neutrinos (CDHS) and muons (BCDMS), as well as experiments in hadroproduction (WA11, NA3) and photoproduction (NA14). These brought a large haul of results to which the DPhPE’s physicists made significant contributions: measurement of nucleon structure functions, confirmation of the violations of scale invariance predicted by QCD, precise measurements of sin²θW and αs, and charm studies.


The DPhPE built proportional or drift chambers of various sizes and geometries for all of these experiments at the SPS. Given the large number of wire chambers needed, the assembly lines the laboratory had at that time were a valuable asset. In particular, large hexagonal chambers with 4 m sides were designed for CDHS, and were subsequently used in numerous detector tests under beam conditions before being integrated into the recent CHORUS experiment. The DPhPE also built its first large-scale calorimeter for the NA3 experiment. Comprising lead plates and scintillating tiles, its size (5 m × 2 m) necessitated the development of a new low-cost type of scintillator with a long attenuation length. Again, the skills acquired through participation in these projects were put to good use in the next generation of experiments.


In the 1980s, CERN took the major step of converting its SPS into a 540 GeV proton-antiproton collider, which later ran at 630 GeV. Commissioned in 1981, the Spp̄S and its two general-purpose experiments, UA1 and UA2, led to the discovery of the W± and Z bosons, bringing resounding proof of the Glashow-Weinberg-Salam model for electroweak interactions (“When CERN saw the end of the alphabet”). The DPhPE took part in both experiments, contributing not only through technical achievements but also in obtaining physics results, in particular regarding the W± and Z bosons, jets, and the search for the top quark. Building on its experience with NA3, the DPhPE became involved in calorimetry in both UA1 and UA2, with lead-sandwich electromagnetic calorimeters and scintillators for the UA1 and UA2 endcaps, followed by scintillating optical-fibre detectors for the fibre tracker in the second phase of UA2. The scintillator “gondolas” of the UA1 calorimeter, a key component in identifying and reconstructing W± and Z bosons through their decays to electrons, were one of the DPhPE’s most significant achievements in terms of the specific developments and equipment needed. These included the development of a new extruded polystyrene scintillator that allowed large, thin sheets of uniform thickness to be made.

The LEP era

The DPhPE was involved in LEP right from the outset, and participated in the ALEPH, DELPHI and OPAL experiments. In ALEPH, the contributions involved the superconducting solenoid – which was 5 m in diameter, 7 m long, with a field of 1.5 T – the lead-sandwich electromagnetic calorimeter that incorporated proportional tubes, and the silicon-tungsten luminosity calorimeter. For DELPHI, the DPhPE was involved with the tracker – a time projection chamber – and its associated data acquisition and read-out electronics. For OPAL, the contributions included the scintillator hodoscope for time-of-flight measurement and general trigger electronics. The 12 years of data taking at LEP contributed in many essential ways to refining the Standard Model. Physicists from DAPNIA, which the DPhPE became in 1991, were involved in this progress, taking part in, for example, beauty studies, the accurate measurement of the W boson mass, and the search for the Higgs boson and supersymmetric particles.

At the end of the 1980s, as the experiments at LEP progressed, the fixed-target programme began to focus again on the subjects for which this kind of experiment is still best suited. The DPhPE decided to become involved in four areas of research, namely CP violation in the neutral-kaon system (CPLEAR, then NA48), nucleon spin structure (NMC, SMC, then COMPASS), neutrino oscillations (NOMAD) and the quark-gluon plasma (NA34). While most of these experiments have been completed, NA48 and COMPASS are still taking data.

Into the 21st century

Chambers are still the dominant instrument in particle physics but technologies have evolved, resulting for example in the “micromégas” (micromesh gaseous structure) chambers, which are able to absorb particle fluxes 1000 times more intense than conventional chambers, and which are also faster and more accurate. Developed by the DPhPE, together with the necessary state-of-the-art electronics, these chambers are used in COMPASS and have recently been adopted by the CAST experiment.


At the same time, the make-up of the Saclay teams has also evolved. The particle physicists no longer have a monopoly on experiments at CERN. They have been joined by teams of nuclear physicists from DAPNIA, studying the structure of the nucleus in the SMC and COMPASS experiments, for example, and in the neutron time-of-flight programme (nTOF). The future of the fixed-target programme at CERN also concerns DAPNIA, whose physicists and engineers are contributing to the proposal for a future Superconducting Proton Linac (SPL) accelerator complex at CERN, and to the design of experiments that would use the SPL’s intense neutrino beams for the study of CP violation in the lepton sector.

From LEP to the LHC

CERN’s latest machine, the Large Hadron Collider (LHC), will open up a new high-energy domain, and its experiments should clarify the precise nature of the electroweak symmetry-breaking mechanism once and for all. DAPNIA is investing heavily in this future, with its particle physicists taking part in the ATLAS and CMS experiments and its nuclear physicists participating in ALICE. It is also involved in designing and monitoring the manufacture of the quadrupoles for the machine itself. Participation in ATLAS involves the design of the superconducting air-cored toroid magnet system and the construction of the central electromagnetic liquid-argon calorimeter. Involvement in CMS covers the on-line calibration system for the crystal electromagnetic calorimeter, which is based on the injection of laser light, as well as the general design and monitoring of certain components of the experiment’s superconducting solenoid magnet, which is 6 m in diameter, 12.5 m long, and has a field of 4 T. In ALICE, DAPNIA is contributing the design and production of the wire chambers for the muon spectrometer. Muons, electrons and photons are all among the signatures of the signals that these experiments hope to discover or measure, whether it be the Higgs boson at ATLAS and CMS, or the quark-gluon plasma at ALICE. What more promising subjects could one hope for to continue the 40-year co-operation between Saclay and CERN?

Let the data free!

Making astronomical data from the telescopes in space or on Earth freely available is common practice. A first step in this direction for particle physics data has been undertaken recently with QUAERO, a scheme developed at Fermilab to make high-energy data from the D0 experiment generally available (Abazov et al. 2001). This kind of “experimental transparency” allows any physicist in the world to test a new theoretical idea or evaluation algorithm. However, the practice does not exist for data taken from dark-matter experiments, although the most natural approach for this relatively new cross-disciplinary field of astroparticle physics should be that the data do not remain the private property of each experimental collaboration, but become public, as in the case of astronomical data.


We do not believe that the continuing secrecy in experimental astroparticle physics has been introduced intentionally. On the contrary, the reason most probably lies in the lack, as yet, of any direct signature of dark-matter particles, which are believed to dominate the gravitational mass of the universe. This situation has existed for decades, but despite this the challenging experimental question of the nature of dark matter is now fascinating more and more physicists across different disciplines. To our knowledge, there is no comparable example in the past.

As long as dark-matter physicists believe they have a null result in their data, they will focus on improving detector performance to stay at the forefront of their field of research. Who, then, has the time and the courage to consider releasing data collected over several years, which have been downgraded at best to measurements of background? There is no lack of data coming out of the underground dark-matter experiments worldwide, but these data have already been effectively disqualified because they do not fit the widely accepted picture of dark-matter interactions at the Earth.

However, in the past even dark-matter data have been re-evaluated following a new theoretical approach from inside as well as outside a collaboration, and this is exactly why astroparticle physicists should release their data. Most, if not all, dark-matter experiments are not complex, and their data can easily be formatted for non-experts. Scientific problems know no frontiers, and certainly not those defined by a collaboration, even an international one. The dark-matter problem itself might also require some kind of synergy, or even a cross-correlation, between different experiments that have been declared – or even not declared – as dark-matter experiments.

As in particle physics, astroparticle physics theory is far ahead of experimental performance. However, it could be that the generally accepted theoretical picture does not point the experimentalists in the right direction. After all, there have been plenty of unanticipated discoveries in the past. For example, if the recently widely discussed theory of extra dimensions reflects reality, at least some of the approaches of the dark-matter searches must be revised because the particles they are aiming to detect have completely different properties from those assumed so far. Obviously, we must be sure that a signature in dark-matter data from previous experiments has not been overlooked, otherwise the broken dreams of dark-matter physicists will become their nightmares.

Making the data from astroparticle physics public will certainly promote scientific collaboration and will increase the number of “amateurs” working in this field. Scientific transparency can only be beneficial to the science we are supposed to serve, and we have therefore suggested to the astroparticle physics community that it release its data (Hoffmann et al. 2003). CERN, with its astroparticle physics programme, could once more be the pioneer of a new approach.

A Brazilian feast of cosmology and gravitation

The Brazilian School of Cosmology and Gravitation celebrated its 25th anniversary in 2002 by launching a website that contains all 93 lectures and seminars of the nine schools that have been organized since the first school in 1977. The site, set up by the Cosmology and Gravitation Group at the Brazilian Center of Scientific Research (CBPF), which organizes the schools, contains an impressive collection of talks by many of the most important scientists in the areas of cosmology, gravitation, astrophysics and field theory. It is an important resource for students and researchers, which also shows the evolution of these areas of physics during the past 25 years. The material, which is in PDF format, can be accessed via the website of the Cosmology and Gravitation Group at the CBPF.

The proceedings of the 10th school, which was held from 29 July to 9 August 2002, will be published this year by AIP.

Theory of Optical Processes in Semiconductors

by P K Basu, Oxford University Press. Paperback ISBN 0198526292, £39.95.


Now out in paperback, this book, aimed at graduate students in physics and engineering and other beginners in the field, provides a simple quantum mechanical theory of important optical processes in semiconductors.

Plasma Waves: Second Edition

by D G Swanson, Institute of Physics Publishing. Hardback ISBN 075030927X, £48 ($75).


This extended and revised edition encompasses waves in cold, warm and hot plasmas and relativistic plasmas. Written as a textbook for students, it also provides essential reference material for researchers.

Dall’Atomo al Cosmo (From Atoms to the Cosmos)

by Franco Foresta Martin, Editoriale Scienza. ISBN 8873072305, €12.90.


From the title it might seem that this book is a scientific voyage from the infinitely small to the infinitely large, but in fact it is more like a historical trip from the origins of science to today’s research in particle physics. From the Greek philosophers to the Standard Model, it introduces the reader to the most important contributors to today’s physical description of the constituents of matter. Although cosmology and astrophysics are not discussed, the book explores the history of particle physics in 12 chapters. It chronologically presents the most important challenges and breakthroughs, and includes some fascinating anecdotes, which make the book more pleasant to read.

The layout of the book looks inviting, especially (in the reviewer’s opinion) for a younger audience, and no formulae are used. Simple experiments presented at the end of almost every chapter can be easily performed at home or at school. The book is produced in collaboration with the Italian National Institute for Nuclear Physics (INFN), which is distributing it free to schools in Italy. The final chapter describes the activities of the INFN and thereby shows how Italian researchers contribute to physics. In this way, students are informed about the opportunities they can expect if they study physics at university.

The author, Franco Foresta Martin, is a well known science journalist and popularizer. His personal experience as a writer, as well as his scientific accuracy, comes through throughout the book. However, it is not clear whether the historical approach and the (often too?) simple experiments can really be effective with today’s young people, in the era of the Internet and high technology. The difficulty of finding the right level of explanation is sometimes particularly apparent, as the reader can easily get lost between the history and lives of the scientists and some of the deeper notions of the physics they pursued. The chapter on radioactivity shows this very well: Rutherford’s theory of the atom is introduced at the same time as ions and natural radioactivity. On the other hand, the experiment proposed at the end of the same chapter – radioactivity seen on a TV screen – seems better tailored to today’s young generation, and therefore more appropriate.

The “Quarkoscopio”, or quarkoscope, is the activity proposed at the end of the last chapter, which is about the Standard Model and the use of fundamental particles in medicine. With this simple instrument, which can be built using cardboard, one can easily find out the quark constituents of the most common particles. The quarkoscope is quite an original idea and could turn out to be useful even to more experienced physicists!

Cohesion: A Scientific History of Intermolecular Forces

by J S Rowlinson, Cambridge University Press. Hardback ISBN 0521810085, £65.


A detailed historical account of how leading scientists of the past 300 years have tried to understand why matter sticks together, this book will interest physicists and physical chemists, as well as historians of science.

German government pronounces on TESLA projects

The German Federal Ministry of Education and Research gave the go-ahead on 5 February for the TESLA X-ray laser to be built at DESY as a European project. At the same time, it pledged continued support for R&D on the TESLA linear collider, while recognizing that decisions on the location of such a machine must be made at an international level. These decisions were part of a package to support large-scale projects in basic research, worth €1.6 billion, which also includes approval of a new accelerator complex at the GSI laboratory in Darmstadt.

DESY is to receive half the costs of the TESLA X-ray laser, which total €673 million, from the German government. The next step will be for DESY to work with interested European partners to develop the appropriate financial, technological and organizational framework for the project. The aim will be to make a decision within about two years on the construction of the machine, which will take around six years.

The Ministry also recognized the importance of the TESLA linear collider for Germany, by agreeing continued support for R&D work at DESY. This will allow DESY to continue working at an international level on the coordination and decision processes, which are currently in progress around the world.

Albrecht Wagner, chairman of the DESY Directorate, has welcomed the decisions. “The possibility to realize the TESLA X-ray laser as a European project at DESY opens up outstanding research possibilities”, he said after the announcement by the Ministry. “For the linear collider for particle physics, which is being planned on a longer term basis, DESY is able to continue the international research work.” Wagner also said that the decisions represent “a great recognition of the achievements of the TESLA collaboration, which have been widely acknowledged throughout the world.”

Ireland invests in a scientific future


In 1938, the prime minister of Ireland, Eamon de Valera, invited Erwin Schrödinger to join the newly established Institute for Advanced Studies in Dublin. Today, the Irish government is following that lead with a new initiative. In February 2000, following an investigation by the Irish Technology Foresight panel into the issues pertaining to basic research in Ireland, the Irish government established Science Foundation Ireland (SFI). Its remit is to attract world-class research scientists and engineers in information and communications technology (ICT) and biotechnology to academic appointments in Ireland. Under the Irish National Development Plan 2000-2006, SFI was allocated €646 million. It has been charged with investing this money in the individuals most likely to generate new knowledge, leading-edge technologies and competitive enterprises. The intention is that SFI will help Ireland to diversify, and its economy to grow, by recruiting and retaining creative individuals with advanced research experience in areas that are critical to the development of a knowledge-based economy. By the end of 2002, SFI had committed approximately €152 million to projects and teams working in these areas.

SFI recognizes that the future competitiveness of the Irish economy will be increasingly based on the quality of the intellectual capital available to stimulate innovation, excellence and entrepreneurship. Therefore, its aim is to use the resulting capability to create a reservoir of ideas, skills and talent that will profit Ireland in the future. To meet this goal, SFI is working in partnership with all tertiary educational institutions in Ireland, both to raise the quality of research and to increase the amount carried out. The best way to achieve this is by investing in creative and successful teacher-scholars who are in these institutions, and who have been selected on a competitive basis. The focus is on enhancing Ireland’s strengths in the fields that underpin biotechnology and ICT, as these fields currently promise more than others to drive scientific and economic advancement in the decades ahead.

About SFI’s programmes


Since its establishment, SFI has developed five flexible programmes for making its grants and awards. SFI Fellow Awards are five-year awards to attract senior, distinguished researchers to Ireland in the fields underpinning biotechnology and ICT; the grants are normally €1 million or more per year. Investigator Programme Grants are four-year awards to recruit leading researchers in the science and engineering sectors that underpin biotechnology and ICT. These grants can be as large as fellowships, but are usually between €100,000 and €250,000 per year. Centres for Science, Engineering and Technology (CSET) Grants – campus-industry partnerships – have been established to fund researchers who will build collaborative efforts that develop internationally competitive research programmes together with researchers from industry. Such grants can be valued at up to €5 million per year initially, for up to five years, and they are to support research partnerships linking scientists, engineers and industry. E T S Walton Visitor Awards (named after Ireland’s Nobel-prize-winning accelerator pioneer) have been instituted with the aim of bringing international researchers to Ireland for periods of up to one year. These grants usually total €200,000 per year, including salary, laboratory and moving expenses. SFI Workshop and Conference Grants are set up to support events either sponsored by or involving Irish scientists and research bodies that reach an international scientific audience.

SFI has initially concentrated on assessing research activities within Ireland’s R&D community, and establishing and completing the funding for a core set of internationally competitive research programmes. Grants and awards to successful researchers are made after a process of international peer review of research proposals by distinguished scientists and engineers. The reviewers apply the criteria approved by SFI’s board – namely, quality of the idea, quality of the recent track record of the researcher, and strategic relevance of the research.

In summary, SFI is seeking to support the continued growth and development of a thriving research base from which the country can benefit. Its aim is to support innovative and creative individuals in carrying out their work in Ireland, and we look forward to making additional investments in researchers in both ICT and biotechnology, using our grants and awards programmes.

Science, technology and the Third World


Abdus Salam, who died on 21 November 1996, would have been 77 on 29 January 2003. In remembering him on such occasions, one misses his sharp intellect and his passion for promoting science and technology in Third World countries. Few have discovered a universal law of nature, and still fewer have founded an institute for the underprivileged. Salam accomplished both. In addition to seeking “unity in seemingly disparate forces of nature”, he sought unity in mankind, and his crowning achievement was the creation in 1964 of the International Centre for Theoretical Physics at Trieste – now named after him – which has touched the lives of physicists and other scientists the world over.

Yet Salam failed in one of his lifelong goals, perhaps the one closest to his heart. Near the end of his life, he lamented: “Countries like Turkey, Egypt and my own country, Pakistan, have no science communities geared to development because we do not want such communities. We suffer from a lack of ambition towards acquiring science, a feeling of inferiority towards it, bordering sometimes even on hostility.”

Passive tolerance of poverty in the Third World was of deep concern to Salam. The greatest failure of science and technology is their failure to act as a social equalizer, and the gap between rich and poor has increased, despite the fact that the wealth created by science and technology is sufficient to alleviate poverty. “Predictions that the ‘poor might not always be with us’ have not come true. In 1990, there were optimistic forecasts that the percentage of absolute poor in the world (those with income below US$1 a day) would drop to 18% by 2000. By 1998, the figure was at 24% and the trend-line had turned upward” (Mooney 1999).

This echoes what Salam said in 1988: “This globe of ours is inhabited by two distinct types of humans. According to the UNDP count of 1983, one-quarter of mankind – some 1.1 billion people – are developed. They inhabit two-fifths of the land area of the Earth and control 80% of the world’s natural resources, while 3.6 billion developing humans – ‘les misérables’, the ‘mustazeffin’ – live on the remaining three-fifths of the globe. What distinguishes one type of human from the other is the ambition, the power, the élan which basically stems from their differing mastery and utilization of present-day science and technology. It is a political decision on the part of those (principally from the South) who decide on the destiny of developing humanity if they will take steps to let the less miserable create, master and utilize modern science and technology for their betterment.”

Again he wrote: “Today the Third World is only slowly waking up to the realization that in the final analysis, creation, mastery and utilization of modern science and technology is basically what distinguishes the South from the North. On science and technology depend the standards of living of a nation. The widening gap in economics and influence between the nations of the South and the North is essentially the science and technology gap. Nothing else – neither differing cultural values, nor differing perceptions or religious thoughts, nor differing systems of economics or of governance – can explain why the North (to the exclusion of the South) can master this globe of ours and beyond.”

Indeed, scientific knowledge and innovation are becoming leading factors of production and economic development around the world. There can be no high technology without first-rate science. Science develops new tools in laboratories for its progress, and trains students and technicians to build them. These tools find users outside, and some young people become entrepreneurs and launch their own companies, which then grow into large enterprises. However, such companies grow around big centres of scientific research, for example Silicon Valley around Stanford. But the Third World countries do not have big centres of research. So do they have a chance, or have they lost out for ever? I believe the answer lies in linkages with big science centres in developed countries. A fine example is CERN, where high technology and fundamental science reinforce each other.

Let me end by quoting from a paper by Salam, presented on 11 May 1983 in Bahrain: “We forget that an accelerator like the one at CERN develops sophisticated modern technology at its furthest limit. I am not advocating that we should build a CERN for Islamic countries. However, I cannot but feel envious that a relatively poor country like Greece has joined CERN, paying a subscription according to the standard GNP formula. I cannot rejoice that Turkey, or the Gulf countries, or Iran, or Pakistan seems to show no ambition to join this fount of science and get their men catapulted into the forefront of the latest technological expertise. Working with CERN accelerators brings at the least this reward to a nation, as Greece has had the perception to realize.”

Since then, Pakistan and Iran have joined CERN collaborations and, if Salam were alive today, I am sure he would be delighted to see that aspects of his vision are at last being transformed into reality.
