By Daniel Z Freedman and Antoine Van Proeyen, Cambridge University Press
Hardback: £45
E-book: $64
Since the work of Emmy Noether nearly a century ago, the idea of symmetry has played an increasingly important role in physics, resulting in spectacular successes such as Yang-Mills gauge theory along the way. Albert Einstein, in particular, realized that symmetry could be a foundational principle; his understanding that the space–time dependent (“local”) symmetry of general co-ordinate invariance could be used to build general relativity had an enormous impact on the development of 20th-century physics.
The current zenith of the local symmetry principle is the theory of supergravity, which combines general relativity with the spin-intermingling theory of supersymmetry to construct the richest and deepest symmetry-based theory yet discovered. Supergravity also lies at the foundation of string theory – a theory whose own symmetry principle has not yet been uncovered – and so is one of the central ideas of modern high-energy theoretical physics.
Unfortunately, since its invention in the 1970s, supergravity has been an infamously difficult subject to learn. Now, two of the inventors and masters of supergravity – Dan Freedman and Antoine Van Proeyen – have produced a superb, pedagogical textbook that covers the classical theory in considerable depth.
The book is notably self-contained, with substantial and readable introductory material on the ideas and techniques that combine to make up supergravity, such as global supersymmetry, gauge theory, the mathematics of spinors and general relativity. There are many well-chosen problems for the student along the way, together with compact discussions of complex geometry. After the backbone of the book on N=1 and N=2 supergravities, there is an excellent and especially clear chapter on the anti-de Sitter supergravity/conformal field theory correspondence as an application.
Naturally, any finite book has to cut short some deserving topics. I hope that any second edition has an expanded discussion on superspace to complement the current, clear treatment based on the component multiplet calculus, as well as a greater discussion on supergravity and supersymmetry in the quantum regime.
Overall, this is a masterful introduction to supergravity for students and researchers alike, which I strongly recommend.
A new initiative to provide open access to peer-reviewed particle physics research literature was launched at CERN on 1 October by the Sponsoring Consortium for Open Access Publishing in Particle Physics – SCOAP3. Open dissemination of preprints has been the norm in particle physics for two decades but this initiative now brings the peer-review service provided by journals into the open-access domain.
In the SCOAP3 model, funding agencies, research institutions, libraries and library consortia pool the resources that are currently used to subscribe to journal content and use them to support the peer-review system directly. Publishers then make electronic versions of their journals open access. Articles funded by SCOAP3 will be available under a Creative Commons CC BY licence, meaning that they can be copied, distributed, transmitted and adapted as needed, with proper attribution.
Representatives from the science-funding agencies and library communities of 29 countries were present at the launch. The publishers of 12 journals, accounting for the vast majority of articles in particle physics, have been identified for participation in SCOAP3 through an open and competitive process. With a projected SCOAP3 budget of SwFr36 million over three years, more partnerships with key institutions in Europe, America and Asia are foreseen as the initiative moves through the technical steps of organizing the re-direction of funds from the current subscription model towards a common internationally co-ordinated fund. SCOAP3 expects to be operational for articles published as of 2014.
On 10–12 September, some 500 physicists attended an open symposium in Krakow for the purpose of updating the European Strategy for Particle Physics, which was adopted by CERN Council in 2006. The meeting provided an opportunity for the global particle-physics community to express views on the scientific objectives of the strategy in light of developments over the past six years. With the aid of a local organizing committee, it was arranged by a preparatory group chaired by Tatsuya Nakada (see Viewpoint Charting the future of European particle physics).
In the early 1960s, a 4-km-long strip of land in the rolling hills west of Stanford University was transformed into the longest, straightest structure in the world – a linear particle accelerator. It was first dubbed Project M and affectionately known as “the Monster” by the scientists at the time. Its purpose was to explore the mysterious subatomic realm.
Fifty years later, more than 1000 people gathered at SLAC National Accelerator Laboratory to celebrate the scientific successes generated by that accelerator and the ones that followed, and the scientists who developed and used them. The two-day event on 24–25 August, for employees, science luminaries and government and university leaders, was more than a tribute to the momentous discoveries and Nobel prizes made possible by the minds and machines at SLAC. It also provided a look ahead at the lab’s continuing evolution and growth into new frontiers of scientific research, which will keep it at the forefront of discovery for decades to come.
A history of discovery
The original linear-accelerator project, approved by Congress in 1961, was a supersized version of a succession of smaller accelerators, dubbed Mark I to Mark IV, which were built and operated at Stanford University and reached energies of up to 730 MeV. The “Monster” would accelerate electrons to much higher energies – ultimately to 50 GeV – for ground-breaking experiments in creating, identifying and studying subatomic particles. Stanford University leased the land to the federal government for the new Stanford Linear Accelerator Center (SLAC) and provided the brainpower for the project. This set the stage for a productive and unique scientific partnership that continues today, supported and overseen by the US Department of Energy.
Soon after the new accelerator reached full operation, a research team that included physicists from SLAC and Massachusetts Institute of Technology (MIT) used the electron beam in a series of experiments starting in 1967 that provided evidence for hard scattering centres within the proton – in effect, the first direct dynamical evidence for quarks. That research led to the awarding of the 1990 Nobel Prize in Physics to Richard Taylor and Jerome Friedman of SLAC and Henry Kendall of MIT.
SLAC soon struck gold again with discoveries that were made possible by another major technical feat – the Stanford Positron Electron Asymmetric Ring, SPEAR. Rather than aiming the electron beam at a fixed target, the SPEAR ring stored beams of electrons and positrons from the linear accelerator and brought them into steady head-on collisions.
In 1974, the Mark I detector at SPEAR, run by a collaboration from SLAC and Lawrence Berkeley National Laboratory, found clear signs of a new particle – but so had an experiment on the other side of the US. In what became known as the “November Revolution” in particle physics, Burton Richter at SLAC and Samuel Ting at Brookhaven National Laboratory announced their independent discoveries of the J/ψ particle, which consists of a paired charm quark and anticharm quark. They received the Nobel Prize in Physics for this work in 1976. Only a year after the J/ψ discovery, SLAC physicist Martin Perl announced the discovery of the τ lepton, a heavy relative of the electron and the first of a new family of fundamental building blocks. He went on to share the Nobel Prize in Physics in 1995 for this work.
These and other discoveries that reshaped understanding of matter were empowered by a series of colliders and detectors. The Positron–Electron Project (PEP), a collider ring with a diameter almost 10 times larger than SPEAR, ran during the years 1980–1990. The Stanford Linear Collider (SLC), completed in 1987, focused electron and positron beams from the original linac into micron-sized spots for collisions at a total energy of 100 GeV. Making thousands of Z bosons in its lifetime, the SLC hosted a decade of seminal experiments. It also pioneered the concepts behind the current studies for a linear electron–positron collider to reach energies in the region of 1 TeV.
PEP was followed by the PEP-II project, which included a set of two storage rings and operated in the years 1998–2008. PEP-II featured the BaBar experiment, which created huge numbers of B mesons and their antimatter counterparts. In 2001 and 2004, BaBar researchers and their Japanese colleagues at KEK’s Belle experiment announced evidence supporting the idea that matter and antimatter behave in slightly different ways, confirming theoretical predictions of charge-parity violation.
Synchrotron research and an X-ray laser
Notably, new research areas and projects at SLAC have often evolved as the offspring of the original linear accelerator and storage rings. Researchers at Stanford and SLAC quickly recognized that electromagnetic radiation generated by particles circling in SPEAR, while considered a nuisance to the particle collision experiments, could be extracted from the ring and used for other types of research. They developed this synchrotron radiation – in the form of beams of X-ray and ultraviolet light – as a powerful scientific tool for exploring samples at a molecular scale. This early research blossomed as the Stanford Synchrotron Radiation Project (SSRP), a set of five experimental stations that opened to visiting researchers in 1974.
Its modern descendant, the Stanford Synchrotron Radiation Lightsource (SSRL), now supports 30 experimental stations and about 2000 visiting researchers a year. SPEAR – or more precisely, SPEAR3 following a series of upgrades – became dedicated to SSRL operations 20 years ago. This machine, too, has allowed Nobel-prize winning research. Roger Kornberg, professor of structural biology at Stanford, received the Nobel Prize in Chemistry in 2006 for work detailing how the genetic code in DNA is read and converted into a message that directs protein synthesis. Key aspects of that research were carried out at the SSRL.
Cutting-edge facilities
Meanwhile, sections of the linear accelerator that defined the lab and its mission in its formative years are still driving electron beams today as the high-energy backbone of two cutting-edge facilities: the world’s most powerful X-ray free-electron laser, the Linac Coherent Light Source (LCLS), which began operating in 2009; and FACET, a test bed for next-generation accelerator technologies. LCLS-II, an expansion of the LCLS, should begin construction next year. It will draw electrons from the middle section of the original linear accelerator and use them to generate X-rays for probing matter with high resolution at the atomic scale.
The late Wolfgang “Pief” Panofsky, who served as the first director of SLAC from 1961 until 1984, often noted that big science is powered by a ready supply of good ideas. He referred to this as the “innovate or die” syndrome. In 1983, Panofsky wrote that he had been asked since the formation of the lab, “How long will SLAC live?” The answer was and still is: “about 10 to 15 years, unless somebody has a good idea. As it turns out, somebody always has had a good idea which was exploited and which has led to a new lease on life for the laboratory.”
Under the leadership of its past two directors – Jonathan Dorfan, who helped launch the BaBar experiment and the astrophysics programme, and Persis Drell, who presided over the opening of the LCLS – SLAC’s scientific mission has grown and diversified. In addition to its original focus on particle physics and accelerator science, SLAC researchers now delve into astrophysics, cosmology, materials and environmental sciences, biology, chemistry and alternative energy research. Visiting scientists still come by the thousands to use lab facilities for an even broader spectrum of research, from drug design and industrial applications to the archaeological analysis of fossils and cultural objects. Much of this diversity in world-class experiments is based on continuing modernizations at the SSRL and the unique capabilities of the LCLS.
SLAC’s scientists and engineers continue to collaborate actively in international projects – designing machines and building components, running experiments and sharing data with other accelerator laboratories in the US and countries around the globe, including China, France, Germany, Italy, Japan, Korea, Latin America, Russia, Spain and the UK. The lab’s long-standing collaboration with CERN provided an important spark in the formative years of the World Wide Web and led to SLAC’s launch of the first web server in the US. SLAC is also playing an important role in the ATLAS experiment at CERN’s LHC. In the area of synchrotron science, collaborations with US national laboratories and with overseas labs such as DESY in Germany and KEK in Japan have contributed greatly to the development of advanced tools and methodologies, with enormous scientific impact.
Expertise in particle detectors has even elevated the lab’s research into outer space. SLAC managed the development of the Large Area Telescope, the main instrument on board the Fermi Gamma-ray Space Telescope, which was launched into orbit in 2008 and continues to make numerous discoveries. The lab has also earned a role in building the world’s largest digital camera for an Earth-based observatory, the Large Synoptic Survey Telescope, with construction scheduled to begin in 2014 for eventual operation on a mountaintop in Chile.
Richter, who served as SLAC director from 1984 to 1999, has said that the fast-evolving nature of science necessitates a changing path and pace of research. “Labs can remain on the frontiers of science only if they keep up with the evolution of those frontiers,” remarks Richter. “SLAC has evolved over its first 50 years and is still a world leader in areas beyond what was thought of when it was first built. It is up to the scientists of today to keep it moving and keep it on some perhaps newly discovered frontiers for the next 50.”
This article is based on the one published on the SLAC News Centre.
Stanford University, in California, already has a leading position as far as linear accelerators are concerned. It operates a whole family of linacs, several of which are used for medical purposes. The 200 ft machine [Mark III] in operation there produces 700 MeV electrons and its energy will be stepped up to 1050 MeV.
Late in May, Stanford made the scientific headlines – again with a linac.
Addressing a science research symposium in Manhattan, President Eisenhower announced that he would recommend to the US Congress the financing of a “large new electron linear accelerator … a machine two miles long, by far the largest ever built”.
This machine, intended for Stanford University, would be one of the most spectacular atom smashers ever devised. Two parallel tunnels would have to be driven for two miles into the rock of a small mountain in the vicinity of Palo Alto. Such natural cover would, of course, stop any dangerous radiation. One of the tunnels, the smaller in diameter, would house the accelerator proper, while the bigger one would be used for maintenance purposes.
The proposed new linac for Stanford would initially produce 15 BeV (GeV) electrons; it is announced that this energy could later be raised to 40 BeV. It is believed that the machine would take six years to build, at a cost of 100 million dollars. Approval of the project, to be taken only after Congressional hearings, depends on a decision expected in July.
By Gordon Fraser, Oxford University Press
Hardback: £25
Don’t be misled by the title of this book. It contains a surprising amount of information, going well beyond the exodus of Jewish scientists from Germany after the rise of the Nazi Party. The book puts anti-Semitism into a broad historical perspective, starting with the destruction of the Temple in Jerusalem, the expulsion of Jews all across Europe and the growth of a mild and sometimes hidden anti-Semitism. This existed in Germany in the 19th century and even to some extent under the Nazis, when the initial objective was to cleanse German culture of all non-Aryan influences. However, various phases led eventually to the Holocaust. A political spark was ignited in early 1933, when Adolf Hitler became Chancellor and, weeks later, the parliamentary building in Berlin went up in flames. The Civil Service Law that was soon introduced forbade Jews from being employed by the state; it was followed by the burning of books and the Kristallnacht, during which Jewish shops were destroyed – all further steps towards the “final solution”.
In parallel to these political developments, Quantum Exodus describes the rise of physics in Germany during the 19th century and the birth of quantum physics in the early 20th, with protagonists such as Alexander von Humboldt, Wilhelm Röntgen, Hermann von Helmholtz, Max Planck, Walther Nernst and Arnold Sommerfeld. They attracted many Jewish scientists from all over Europe, among them Hans Bethe, Max Born, Peter Debye, Albert Einstein, Lise Meitner, Leó Szilárd, Edward Teller and Eugene Wigner, who went on to become key players in 20th-century physics. Most of them left Germany – some early on, others escaping at the last moment – mostly for the UK or US, often via Denmark, with Niels Bohr’s institute as a temporary shelter. An exodus also started from other countries such as Austria and Italy. The book recounts the adventurous and disheartening fates of many of these physicists. Arriving as refugees, they were initially often considered aliens and during the war sometimes even spies. The author gives some spice to his narrative by adding amusing details from the private lives of some of the protagonists.
A detailed account is given of the Manhattan Project and how the famous letter by Einstein to President Franklin Roosevelt initiated the building of the fission bomb. It was written as a result of pressure by Szilárd, the main mover behind the scenes. What is less well known is the crucial importance of a paper by Otto Frisch and Rudolf Peierls in the UK, which already contained the detailed ideas of the fission bomb. Robert Oppenheimer, an American Jew, became scientific director of the Manhattan Project after his studies in Europe, bringing the European mindset to the US. He attracted many émigrés to the project, such as Bethe, Teller, Felix Bloch and Victor Weisskopf. The book relates vividly how Teller, because of his stubborn character, could not be well integrated into the project; rather, he pushed in parallel for the H-bomb.
The author implies, although somewhat indirectly, that the rise of Nazism and the development of the nuclear bomb have a deeper correlation, without giving convincing details. However, the interaction of science (and its stars) with politics is well described. Bohr, although at the centre of nuclear physics, had limited influence – partly because of his mumbling and bad English (something that I witnessed at the Geneva Atoms for Peace Conference in 1957, where his address in English had to be simultaneously translated into English).
Many of the exiled physicists who worked on the Manhattan Project developed considerable remorse after the events of Hiroshima and Nagasaki. When I invited Isidor Rabi to speak at the 30th anniversary of CERN, he said that he considered his involvement in the foundation of CERN a kind of recompense for his wartime activities.
The descriptive account of science in the US and Europe after the Second World War is interesting. In the US, politicians’ interest in science decreased substantially and a change was introduced only when the shock of Sputnik led eventually to the “space race”. Basic science also benefited from this change, leading for example to the foundation of various national laboratories such as Fermilab. In Europe, a new stage for science emerged when a pan-European centre to provide resources on a continental rather than a national scale was proposed and CERN was founded in 1954.
The book benefits from the fact that the author is competent in physics, which he sometimes describes poetically, but never wrongly. He has done extremely careful research, giving many references and a long list of Jewish emigrants. I found few points to criticise. Minor objections concern passages about CERN, even though the author knows the organization well. For example, CERN’s response to the Superconducting Super Collider was the final choice of the circumference of the LEP tunnel (27 km), made in view of the possibility of a later proton–proton or proton–electron collider in the same tunnel, whereas the definite LHC proposal came only in 1987; and the LHC magnets are superconducting to achieve the necessary high magnetic fields, not so much to save electricity.
The various chapters are not written in chronological order, and political or scientific developments are integrated with human destinies. This makes for easy and entertaining reading. Older readers who, like me, have known many of the protagonists will not avoid poignant emotions. For young readers, the book is recommended because they will learn many historical facts that should not be forgotten.
One intriguing question (probably unanswerable) that was not considered is: what would have happened to US science without the contribution of Jewish immigrants?
On 5 October 1962, five nations signed the convention that founded the European Southern Observatory (ESO). Belgium, France, the Federal Republic of Germany, the Netherlands and Sweden were soon followed by Denmark. They were later joined by Switzerland, Italy, Portugal, the UK, Finland, Spain, the Czech Republic and, most recently, Austria in 2009. Brazil, whose membership is pending ratification, will be the 15th member state and the first from outside Europe. The organization’s main mission, laid down in the convention signed in 1962, is to provide state-of-the-art research facilities to astronomers and astrophysicists, allowing them to conduct front-line science in the best conditions. With headquarters in Garching near Munich, ESO operates three observing sites high in the Atacama Desert region of Chile, which are home to a world-leading collection of observing facilities.
ESO’s ruling body is its council, which delegates day-to-day responsibility to the executive under the director-general, while other governing bodies of ESO include the Finance Committee and the Committee of Council. If this sounds familiar, it is probably because the origins of ESO bear more than a passing resemblance to those of CERN. The founding of ESO has its roots in a statement signed on 26 January 1954 by leading astronomers from six countries – the five nations that would later sign the ESO convention, plus the UK (which was to go in a different direction and join ESO only in 2002). The statement pointed to the lack of coverage of the skies of the southern hemisphere – which include interesting regions such as the Magellanic Clouds – by powerful telescopes at that time. It went on to put the case that although no one country had sufficient resources for such a project, it could be possible through international collaboration. Finally, it recommended the establishment of a joint observatory in South Africa that would house a 3 m telescope and a 1.2 m Schmidt telescope with a wide field of view, which would be valuable for surveys. These instruments would complement the 5 m Hale Telescope and the 1.2 m Schmidt that had been observing the skies of the northern hemisphere from the Palomar Observatory in California since 1948.
The idea for a joint European effort had originated the previous spring, when the pioneering Dutch astronomer, Jan Oort, invited Walter Baade, a renowned German working at the Mt Wilson and Palomar Observatories, to stay at Leiden for a couple of months. Oort mobilized a group of leading European astronomers for a meeting with the influential visitor on 21 June 1953, where Baade proposed capitalizing on existing designs for a 3 m telescope being built for the Lick Observatory in California and for the Schmidt telescope at Palomar. Also present at the meeting was Jan Bannier, director of the Dutch national science foundation and president of the provisional CERN Council.
In November 1954, Bannier and Gösta Funke, director of the Swedish National Research Council and a member of the newly established formal CERN Council, drew up the first draft of a convention for ESO, with key similarities to the CERN convention. ESO would have a council with two delegates (at least one an astronomer) from each member state; each country would have an equal vote; financial contributions would be in proportion to national income up to a fixed limit.
Further progress was slow because the project’s supporters grappled with financial and political difficulties in their countries. Important impetus came with Oort’s successful application in 1959 for a grant of $1 million from the Ford Foundation in the US – a fifth of the estimated cost at the time – on condition that at least four of the five potential members sign the convention. It took another three years for further issues to be resolved and for the convention to be signed on 5 October 1962, in the Ministry of Foreign Affairs in Paris. Even then, it was only in early 1964 that real work could begin (and the grant from the Ford Foundation be released), when France became the fourth country to ratify the convention, after the Netherlands, Sweden and the German Federal Republic.
The original idea had been to locate the observatory in South Africa and over the period 1953–1963 searches for suitable places were followed by systematic tests at three sites in the Karoo region. However, in 1959 astronomers in the US began to explore the possibilities in the Chilean Andes, through the Association of Universities for Research in Astronomy (AURA). It soon became clear that the Andes might offer better climatic conditions than South Africa for astronomy and in November 1962 two members of ESO’s site-testing team went to Chile. Their findings indicated a general superiority, in particular longer spells of clear weather and smaller temperature differences during the night (owing, in fact, to the higher altitude).
So, in June 1963 Otto Heckmann, the embryonic organization’s provisional director-general, and others including Oort went to Chile to meet members of AURA and see the mountains chosen by the Americans. Although the ESO convention had still to be ratified by the requisite four countries, in November the ESO Committee opted unanimously for the Andes, a decision that the formal ESO Council approved at its first meeting in 1964. Later that year, ESO decided on a site that was independent of the Americans – a mountaintop at 2400 m that Heckmann proposed naming La Silla (the saddle).
ESO went on to develop La Silla, first installing a number of intermediate-size telescopes that had been foreseen in the convention, as well as some smaller national telescopes. The official inauguration, by the president of the Republic of Chile, Eduardo Frei Montalva, took place on 25 March 1969.
In the meantime, there was mounting concern about the slow progress on the larger telescopes described in the ESO convention and in March 1969 a working group was set up to advise the ESO Council on this and various administrative matters. In particular, it was to look into budget procedures and the project for the 3.6 m telescope. (The proposed size had grown after experience in the US had shown that the observer’s cage for a 3 m instrument raised problems for larger astronomers.) The working group was chaired by Funke and both he and Augustin Alline, the French government ESO Council delegate, were members of CERN Council. Their recommendations led to the introduction at ESO of the “Bannier process”, which had been established at CERN for budgetary matters; and at Alline’s suggestion, ESO also followed CERN’s example in setting up a Committee of Council, whose informal meetings of fewer people could iron out potential difficulties between meetings of Council.
It was at the meeting of CERN’s Committee of Council in November 1969 that CERN’s director-general, Bernard Gregory, reported on discussions with his counterpart at ESO about a possible collaboration between the two organizations – in essence, a rescue plan for the 3.6 m telescope. The project was similar in size and complexity to that of a large bubble chamber and there was also a strong feeling that particle physicists and astronomers could benefit from closer contact. The committee gave Gregory the go-ahead to report to the meeting of CERN Council in December, which in turn authorized him to continue the discussions with ESO. At the meeting, Bannier, who was then president of the ESO Council, pointed out that, with its greater experience in building large-scale apparatus and in dealing with industry, CERN would bring valuable expertise to advance the 3.6 m project.
By June 1970, a draft co-operation agreement had been drawn up that foresaw the setting up of ESO’s Telescope Project Division at CERN. CERN would provide administrative, technical and professional services – the latter covering the project management as well as technical and scientific advice. This would be at no cost to CERN because all would be financed by ESO and no additional staff at CERN would be required. The June council meetings at ESO and CERN consented to collaboration between the two organizations and on 16 September the agreement was signed by Gregory and Adriaan Blaauw, ESO’s director-general. Within six months, the nucleus of the Telescope Project (TP) Division had formed at CERN. Led by ESO’s Svend Laustsen, it included his small technical group. The division then grew to comprise some 40 astronomers, engineers and technicians, all involved in the final design, construction and testing of the 3.6 m telescope, while benefiting from CERN’s experience in engineering and the administrative aspects of implementing a large project.
The members of TP interacted mainly with CERN’s Proton Synchrotron Department (particularly Wolfgang Richter and the department head, Kees Zilverschoon), the Technical Services and Building Division (Henri Laporte and E Leroy) and the Data Handling Division (Detmar Wiskott), while the placing of contracts involved working with the Finance Division. The first two years focused on completing the design of the telescope and the building to house it, with a first design report issued in February 1971. A year later, the group was awarding contracts related to the construction of the telescope, the building and a computer system, both to steer the telescope and for data-acquisition and some online analysis.
November 1972 saw another development at CERN, with the inauguration of the ESO Sky Atlas Laboratory. To match the atlas of the northern sky made by the 1.2 m Schmidt telescope at Mt Palomar, ESO and the UK were pooling the resources of ESO’s 1 m Schmidt in Chile and the UK’s 1.2 m Schmidt in Australia. A copy of each of the glass plates recorded in Chile was sent to the lab at CERN for further copying onto film. After a first rapid survey, ESO’s Schmidt telescope went on to cover red wavelengths in detail, while the UK’s instrument covered blue. The Sky Atlas Lab was involved in producing 200 copies of the complete atlas, the full view totalling 200 m2 of film. One highlight of this work was the discovery of a new comet on 5 November 1975, named after its discoverer, the lab’s head, Danish astronomer Richard West.
In April 1975, the 3.6 m telescope was ready for testing in Europe. One innovation concerned the use of a fully automated control system, which involved some 120 individual computer-controlled motors for steering. The 18 m tall structure was assembled in a hall with a specially constructed pit to accommodate it at the Société Creusot-Loire at St Chamond. There, a van from CERN packed with electronic control-circuitry tested out the control system, determining the optimum configuration for driving the telescope’s two orientation axes. With testing complete, the telescope was dismantled and packed up for its journey to Chile, where it would be fitted with its giant mirror. The mirror blank had been ordered from Corning in the US as early as 1965 but a number of problems meant that its final processing to achieve a surface accuracy of 0.06 μm was not completed by the Recherches et études d’optique et de sciences connexes (REOSC), near Paris, until early 1972.
A year after it arrived in Chile, the telescope finally saw its “first light” on the night of 7–8 November 1976. The links with CERN were not quite over, however. A smaller 1.4 m instrument – the Coudé Auxiliary Telescope (CAT) – was later designed by the TP team at CERN. Manufactured mainly by industry, it was assembled at CERN in early 1979 before going to Chile, where it fed the 3.6 m Coudé Echelle Spectrometer through a light tunnel. Fully computer controlled, the CAT was used for many different astronomical observations, including measuring the ages of ancient stars. The 3.6 m itself has since gone on to be highly productive, most recently with the High Accuracy Radial velocity Planet Searcher (HARPS), the world’s foremost hunter of planets beyond the solar system.
Writing in ESO’s journal, The Messenger, in 1981, Charles Fehrenbach, the director of the Haute Provence Observatory, who was involved with ESO for many of the early years, stated: “There is no doubt in my mind that it was the installation in Geneva which saved our organization.” The strong links with CERN certainly helped to set ESO on its way and the older organization can now look on with pleasure at its younger sibling’s many achievements.
1948: The 5 m Hale telescope is inaugurated in Palomar, California. 1954: At the instigation of Jan Oort and Walter Baade, a group of renowned European astronomers meets to discuss how, by pooling the efforts of several countries, Europe could rise to the challenge and keep an important place in astronomical research; Jan Bannier, president of the CERN Council, is also present. A statement is adopted: “There is not a more urgent task for astronomers than to install powerful instruments in the southern hemisphere, and in particular a telescope … of at least 3 m.” But the scars of the Second World War are there and it will take several years of discussion before, on 5 October 1962, five governments (Belgium, France, the Federal Republic of Germany, the Netherlands and Sweden) sign the convention that creates the European Southern Observatory, ESO. The convention was drafted by Bannier, largely adapted from the CERN convention in its constitutional set-up, its financial basis and its personnel regulations (ESO and CERN: a tale of two organizations). Thus, in a sense, ESO is a younger sibling of CERN.
Soon, it was decided to establish the observatory at a site in Chile, in the Atacama Desert, chosen for its large proportion of clear nights and its excellent sky quality. A suitable piece of land was purchased at La Silla, close to La Serena. By 1969, a number of 1-m-class telescopes were in operation. Attention then focused on the construction of a 3.6 m telescope. The young organization had not yet mastered the skills necessary for such an endeavour and problems appeared on many fronts. CERN offered its help and soon the ESO Telescope Project Division moved to CERN. A participant in the preceding discussions, CERN’s Kees Zilverschoon reported that “practically everyone … emphasized the importance of the collaboration between astronomy and high-energy physics [and] common technical developments … and the political aspect: formation of a ‘Communauté scientifique européenne’.” This was long before the discussions on a European area of research started at the political level. With the help of some CERN engineers, the 3.6 m telescope was completed by 1976. It is still in use today, in particular for the successful search for extra-solar planets with the HARPS spectrometer.
ESO was offered new headquarters in Garching by the German government, settling there in 1980. By then, it had an excellent set of experienced engineers and in 1989 deployed a revolutionary 3.5 m telescope, the New Technology Telescope (NTT). This introduced “active optics” in which the effects of gravity, winds and temperature on image quality are counteracted by controlling the shape of the primary mirror and the position of the secondary mirror.
Even before the first light of the NTT, ESO had begun the Very Large Telescope (VLT) project. It all started in December 1977 with a lively conference at CERN on “Optical Telescopes of the Future”. Detailed studies led to the selection of an array of four telescopes of 8.2 m aperture and with active optics, with the NTT serving as a prototype for the construction of the VLT. An impressive suite of first- and second-generation instruments, most of them developed in national laboratories, have been placed at the 11 available foci, while the 12th is reserved for visitor instruments. The second ESO observatory, on Mt Paranal – with its four large VLT telescopes, four 1.8 m telescopes dedicated to interferometry and two telescopes devoted to surveys of the sky in the optical and the infrared – is now the most productive observatory in the world, allowing major advances in virtually all fields of astrophysics.
It is in the same vicinity, on Mt Armazones, that ESO plans to erect its Extremely Large Telescope (ELT), based on a novel concept that features five mirrors in sequence instead of the usual two, with a segmented primary mirror 39 m in diameter. Corrections for blurring owing to turbulence in the atmosphere, which are today made with small deformable mirrors at the level of the instruments (“adaptive optics”), will – in the ELT – be made partially by two of the five mirrors of the telescope itself.
Following an agreement signed by ESO and the US National Science Foundation in 2003, which was soon joined by the National Astronomical Observatory of Japan in collaboration with Taiwan, ALMA, an ambitious millimetre and submillimetre observatory featuring 66 antennas has been under construction for the past few years on the Chajnantor plateau in the Atacama, at an altitude of 5000 m. The inauguration will take place next March but early science, with 16 telescopes, is already bringing highly exciting results (see, for example, ALMA tastes sugar around a Sun-like star).
In 2000, ESO fostered the creation of EIROforum, a partnership of seven European research organizations with the mission of combining the resources, facilities and expertise of its members to support European science in reaching its full potential. Chaired most recently by CERN in 2011–2012, it has just been joined by a new member, the European X-ray free-electron laser project, XFEL.
ESO and CERN share a range of scientific interests and have held stimulating joint conferences in the past, the last ones also involving ESA. Today, cosmology, dark matter, dark energy, high-energy gamma rays, neutrinos, gravitational waves, general relativity and processes in the vicinity of black holes are all hot topics for both communities and would deserve a new joint conference in the near future.
By Stanley Greenberg; Introduction by David C Cassidy Hirmer Verlag
Hardback: €39.90 SwFr53.90 £39.95 $59.95
The American photographer Stanley Greenberg travelled 130,000 km over five years to create the 82 black-and-white photographs included in this large-format book. They are a record of the extraordinary and sometimes surreal complexity of the machinery of modern particle physics. From a working replica of an early cyclotron to the LHC, Greenberg covers the world’s major accelerators, laboratories and detectors. There are images from Gran Sasso, Super-Kamiokande, Jefferson Lab, DESY and CERN, as well as Fermilab, SLAC and LIGO, Sudbury Neutrino Observatory, IceCube at the South Pole and many more.
The LUNA experiment at Gran Sasso is like a giant steel retort-vessel suspended in the air; a LIDAR installation at the Pierre Auger Cosmic Ray Observatory in Argentina is a fantastically hatted creature from outer space bearing the warning “RADIACION LASER”; and the venerable 15-foot bubble chamber sits on the prairie at Fermilab like a massive space capsule that landed in the 1960s. (Who knows where its occupants might be now?)
Not a single person is seen in these beautiful images. They are clean, almost clinical studies of ingenious experiments and intricate machines and they document a world of pipes, concrete blocks, polished steel, electronics and braided ropes of wires. Greenberg has said that his earlier books, such as Invisible New York – which explores the city’s underbelly, its infrastructure, waterworks and hidden systems – are “about how cities and buildings work”, whereas Time Machines is about “how the universe works”. More accurately, perhaps, it is about the things that we build to help us understand how the universe works – but here the builders are invisible, like the particles that they are studying.
In a book whose photographs clearly demonstrate the global nature of particle physics, David Cassidy, author of an excellent biography of Werner Heisenberg, includes a one-sided introduction, concentrating on US labs and achievements. Accelerators are “prototypically American” and his main comment on the LHC is that the US has contributed half a billion dollars to it and that Americans form its “largest national group”. There are also inaccuracies: electroweak theory was confirmed by the discovery of the W and Z bosons at CERN in 1983, not 1973; and the top quark discovery was announced in 1995, not 2008. The introduction does not do justice to Greenberg’s excellent and wide-ranging photography but, fortunately, nor does it detract from it.
By Laurence Plévert World Scientific
Paperback: £32
E-book: £41
Pierre-Gilles de Gennes received the Nobel Prize in Physics in 1991 “for discovering that methods developed for studying order phenomena in simple systems can be generalized to more complex forms of matter, in particular to liquid crystals and polymers”. Neither an invention nor a discovery, it is a curious citation. The committee seems to be honouring a man rather than an identified contribution. Indeed, the life of de Gennes reads like an epic. He was born in 1932 into a family combining banking and aristocracy. His parents separated and he was doubly pampered. When war broke out, it became the occasion for alpine holidays. This extraordinary childhood taught him discipline and curiosity, and gave him great self-confidence.
Drawn to science at the age of 15, he got through the difficult years of the classe préparatoire by playing in a jazz band. Admitted first in his class to the École normale supérieure, he began the free life of a normalien, marrying and becoming a father before the agrégation. He developed a passion for quantum mechanics and group theory, which he dissected from books. Feynman was his model. Intuition must remain sovereign, and he applied this in politics too, rejecting the fashions of the time. The Les Houches summer school was a revelation, where he met Pauli, Peierls and others – the two most important months of his life, he said. His vocation for physics was confirmed there, but which path to follow? Nuclear physics? “I have the impression that no one knows how to describe an interaction except by adding parameters in an ad hoc manner.”
On leaving the ENS, he joined the theory division at Saclay. After his military service, he became a professor at Orsay at the age of 29. Given carte blanche, he tackled superconductivity, building a laboratory from nothing. Letting himself be guided by imagination, he mixed experiment and theory. He instilled enthusiasm in the youngest researchers; his charisma worked on everyone.
He left Orsay in 1971, called to the prestigious Collège de France, to create his own laboratory there. There he developed the science of “soft matter”, encompassing bubbles and sands, gels, polymers and more. A theorist of the practical, he advocated strong collaboration with industry. Multidisciplinary before the term existed, he exploited the analogies suggested by his broad scientific culture.
In this faultless trajectory, one hesitation appears. “Midway along life’s path”, he felt the challenge of age. He rose to it cheerfully, founding a second family while maintaining good relations with the first, home to three grown children. His wife did not protest. His private life was as fertile as his career, and three more children were born.
Then came the hour of honours. He was elected to the Académie des Sciences, received the CNRS gold medal and the Légion d’honneur, and was offered a ministerial post. While remaining at the Collège de France, he was called to the directorship of the ESPCI, which he reshaped to his taste, acquiring there a reputation as a despot. He was a great boss who assumed his role fully. Indeed, his natural authority inspired a sacred awe in his collaborators. The apotheosis of the Nobel Prize allowed him to apply his ideas with even less restraint. A great communicator, he popularized his ideas on television.
Cancer struck, but he clung to his activities. After retiring from the Collège de France, he continued his research life at the Institut Curie in the field of neuroscience. He died in 2007 after a hard battle.
Pierre-Gilles de Gennes was a man of convictions. Sometimes criticized for his positions, he was unafraid to shake up habits by attacking sclerotic structures: “The university needs a revolution.” Another hobby-horse was “Big Science”: he opposed the Soleil synchrotron-radiation laboratory and the ITER project. A humanist, he published a delightful gallery of characters in the manner of La Bruyère, and confessed: “I tend to believe that our mind has irrational needs as much as rational ones.”
Bubbling with ideas, the author of 550 publications and a man of influence who spoke his mind, he dared to say: “We must accelerate the slow death of exhausted fields such as nuclear physics,” and remarked: “When I opened PRL in 1960, I found a revolutionary idea every time; today I come across two or three ideas a year, in a journal that has become five times thicker.” It is true that new ideas are becoming rare. We live off the gains of past theoretical advances, and the recently discovered Higgs boson was postulated 50 years ago. Hence the unsettling feeling that progress now advances more laboriously.
Pierre-Gilles de Gennes was a fertile and passionate mind, but he also lived in a favourable period, one that offered virgin fields in which research could multiply. A career like his seems impossible today, with specialities pushed to the extreme stifling individual initiative.
The biography, very well written by the journalist Laurence Plévert, is packed with anecdotes; it reads like a novel and fills the reader with renewed optimism about the potential of the human adventure and of fundamental research.
“Renaissance man,” says the back cover; I would venture to compare Pierre-Gilles de Gennes to an enlightened monarch in the style of a condottiere, which does not contradict the aphorism of a journalist summing up the man’s appeal: “He is someone you would like to have as a friend, to share the privilege of feeling, for a moment, more intelligent.”
This book is a translation of the original French edition Pierre-Gilles de Gennes. Gentleman physicien (Belin, 2009).
On 4 July, particle physicists around the world eagerly joined many who had congregated early at CERN to hear the latest news on the search for the Higgs boson at the LHC (4 July: a day to remember). It was a day that many will remember for years to come. The ATLAS and CMS collaborations announced that they had observed clear signs of a new boson consistent with being the Higgs boson, with a mass of around 126 GeV, at a significance of 5 σ. In this issue of CERN Courier the two collaborations present their evidence (Discovery of a new boson – the ATLAS perspective and Inside story: the search in CMS for the Higgs boson) and CERN’s director-general reflects on broader implications (Viewpoint: an important day for science). There was further good news from Fermilab with new results on the search for the Higgs at the Tevatron, described above.