The 4% Universe is, as you might gather from the title, an account of how the scientific community has arrived at the idea that only (a little over) 4% of the universe seems to be made of the same stuff as you and me. In other words, normal matter is only a tiny percentage of all that there is; the remainder is about 23% dark matter, which holds galaxies together, and 73% dark energy, which drives the acceleration of cosmic expansion.
This account is unusual, written more like a thriller than in the style of many popularizations. There is a great emphasis not only on describing the sequence of events leading to the discoveries of dark matter and dark energy but also on the people involved. Personalities, co-operation and disagreement, collaboration and individualism all take a large part of the stage, making the book lively and readable. I had originally planned to read it in chunks over a few days but found myself taking it all in during a single sitting, somewhat later into the night than I had planned!
This book would be a nice gift for anyone with a genuine interest in science but, oddly enough, it may be a hard read for someone without at least some background knowledge. At the same time, it is short on details (no equations, graphs, plots or photographs) for a practising physicist who is not so interested in the personal dramas involved. If you’re looking for a book about dark matter and dark energy per se, then this may not be the best choice. While the science is probably more than 4% of the book, the bulk is about sociology, history and politics.
Nevertheless, technical terms are well explained, down to footnotes for those who need to know what the Kelvin scale or a megaparsec is. The physics is pretty good, too, though not perfect in places. For example, the discussion of the Casimir effect seems not quite to register that the energy density between the plates is negative with respect to the region outside.
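For readers unfamiliar with the result being alluded to, here is the standard textbook formula (not taken from the book under review): for two ideal, perfectly conducting parallel plates separated by a distance $d$, the Casimir energy per unit area, measured relative to the vacuum outside, is

$$\frac{E}{A} = -\frac{\pi^{2}\hbar c}{720\,d^{3}},$$

which is indeed negative; differentiating with respect to $d$ gives the familiar attractive pressure $P = -\pi^{2}\hbar c/(240\,d^{4})$ between the plates.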
The emphasis is very much on astronomy and astronomical observation and how data are collected and presented. Particle physicists should not expect to find much about the direct search for particles that could make up dark matter. The LHC merits a brief mention but without further discussion. Axions and neutralinos are introduced as dark-matter candidates but without any explanation of the ideas that gave rise to them.
Apart from the insights into the sociology of how “big astronomy” is done, I think that the book’s greatest merit is to drive home how much our view of the universe has changed in the past 100 or so years – from a rather simple, static universe to an expanding, even accelerating one, with far more stars and galaxies than had ever been imagined and, now, the realization that all of that visible matter may be only a few per cent of all that is. It also shows how cosmology has made the giant step from being little different from theology to being a real scientific discipline.
In 1963, Bob Dylan penned the song The Times They Are a-Changin’, which quickly became the anthem for a new generation. But according to Adam Frank’s provocative book, the times have always been changing: first, hunter-gatherers driven by the immediacy of hunger; then pioneer farmers whose days were dictated by the seasons. After that came a series of industrial revolutions: workers having to move to towns and adapt to factory drudgery; mechanical transport extending the reach of daily life; and today’s digital devices compressing time and distance even further (with the constant pressure to download the latest app or install the newest browser update).
About Time compares the accelerating pace of this race towards no clear destination with the evolution of cosmology, from ancient mythology to the modern picture of multiple universes. The changing world picture is continually benchmarked against the seemingly unpredictable emergence of new lifestyles as technology advances.
As it does so, the storyline can lurch startlingly at times. It leaps from the introduction of labour-saving electrical household appliances in the early 20th century to the commissioning of the Hooker telescope on Mount Wilson; from the measurement of galactic redshifts and an apparently expanding universe to the cultural revolution brought about by domestic radio. The ideas of quantum mechanics are then wedged into two pages.
Frank’s illustrations cover a wide range. I appreciated being reminded of the tragic figure of British music producer Joe Meek, whose 1962 instrumental piece marking the technological miracle of Telstar resonated with contemporary lifestyles as the first British recording to top the US charts – one year before the Beatles, whom the mercurial Meek had meanwhile chosen to ignore.
The book traces the key historical giants, from the Ancient Greek philosophers and before through to Albert Einstein, Edwin Hubble and beyond. Some figures are less familiar, for example Ambrose Crowley, a British industrial magnate who was a contemporary of Isaac Newton. Despite his obscurity, Crowley’s impact on technology is compared with that of Newton on science.
Some conventional ideas are sold short, for example the role of time in quantum physics and its deep connection with antimatter. Paul Dirac, the pioneer of antimatter, appears in a cameo role to introduce a whole section on the iconoclast Julian Barbour and his provocative book The End of Time. Barbour suggests that the continual quest to understand time fails because time itself is an illusion.
Although Frank’s About Time does not venture that far, it is an unconventional book, which could motivate an inquisitive young mind.
Higgs By Jim Baggott Oxford University Press
Hardback: £14.99 $24.95
Jim Baggott is the author of The Quantum Story, an exceptionally interesting and detailed “biography” of quantum physics, very nicely presented over almost 500 pages. Having had the pleasure of reviewing this wonderful book for the CERN Courier, I was quite happy to learn, through a text by Steven Weinberg that appeared on 9 July this year on The New York Review of Books (NYRB) website, that Baggott had written a new book, succinctly titled Higgs. However, I was perplexed to realize that the new book had been finished just two days after the seminar at CERN on 4 July, when the ATLAS and CMS collaborations announced “the discovery of a new particle that seems to be the long-sought Higgs particle” (to quote Weinberg). Indeed, most of the book had been written well before, in anticipation of the day when the discovery would be announced.
Unfortunately, I became rather disappointed soon after getting my hands on the new book. Apart from Weinberg’s “foreword” (most of it available through the NYRB blog) and from the final chapters, most of the book left me with a feeling of “déjà vu”, constantly reminding me of pages from The Quantum Story. As the author writes in the preface, “the present book is based, in part, on that earlier work”. Some sentences were refurbished and some (not all) minor mistakes were corrected, but if you have read the original you will feel that much of the new book is a “remake”. At least Baggott has added a few Feynman diagrams, which were clearly lacking in The Quantum Story, such as the one relating the GIM mechanism to the dimuon decay of the neutral kaons, but a lot more illustrations (and a few equations) could have been included to facilitate the understanding of certain narratives.
The final three chapters of Higgs, written specifically for the new book, should have gone through an extra round of editing to eliminate several imperfections. For instance, the general reader will be puzzled to read that the CMS collaboration is led by Guido Tonelli (page 188), that the CMS spokesperson is Tejinder Virdee (page 189) and that Joe Incandela is “acting as spokesperson for CMS” (page 215); the three sentences were no doubt correct when they were written, but producing a good book implies more than copy/pasting sentences written over a period of several years. In general, the original chapters provide enjoyable reading, but some details reveal that the author followed the action from far away and, in a few instances, became sidetracked by blog-driven excitement. For some readers (myself included) this is an eye-opening experience: having followed the discovery as an insider and now seeing how it is written up in a popular-science book will allow me to assess the kind of “acceptance correction” that I should apply to analogous descriptions of the many things of which I have no direct knowledge. As an aside, I was amused to see that Baggott decided to illustrate the LHC’s achievements using a dimuon mass distribution that I helped to prepare, but astonished to see that an error was introduced in the CMS Higgs plot when it was restyled for inclusion in the book. Things were really done in too much of a hurry.
If you are looking for a good book to read over the end-of-year break, I highly recommend The Quantum Story, a dense plot with heroic characters, covering the fantastic odyssey of quantum physics. But how many of us have crossed paths with Einstein, Bohr, Pauli or Dirac? It is refreshing to read books about present-day physics and physicists, where one can enjoy the plot and recognize the main characters. In that respect, Higgs is an interesting alternative and has the advantage of being much faster to read. Another option, for people specifically interested in reading about the “hunt for the God particle”, is Massive, by Ian Sample, an easy-to-read, lively book that gives a fast-paced and good-humoured overview of the history behind and surrounding the Higgs boson up to mid-2010, although the reader needs to be patient and ignore the annoying detail of seeing CERN written as Cern and RHIC as Rick … oh, well.
I am looking forward to reading more books about the LHC experiments and their discoveries, concerning Higgs physics and other topics, written by the people who made those experiments and those discoveries. These are important issues and they deserve to be treated by professionals with direct knowledge of the inside action, who can provide much more information – and much more accurately – than (award-winning) popular-science authors.
On 5 October, CERN’s director-general, Rolf Heuer, and the minister of education and culture of the Republic of Cyprus, George Demosthenous, signed an agreement under which the Republic of Cyprus will become an associate member state in the pre-stage to membership. Before it comes into force, the agreement has to be ratified by the Parliament of Cyprus.
In the early 1990s, physicists from the Republic of Cyprus took part in the L3 experiment at CERN’s Large Electron–Positron collider before joining the CMS collaboration in 1995. A memorandum of understanding was signed between the University of Cyprus and CMS in 1999, under which Cypriot physicists have contributed to the development of the solenoid magnet and of the CMS electromagnetic calorimeter. They are also involved in the physics analyses of the CMS experiment, including searches for the Higgs boson and studies of beauty quarks.
The Republic of Cyprus is the third country to accede to the status of associate member state in the pre-stage to membership, following Israel in 2011 and Serbia earlier this year.
Advanced Topics in Quantum Field Theory: A Lecture Course By M Shifman Cambridge University Press
Hardback: £45 $80
E-book: $64
Many interesting developments have taken place in quantum field theory (QFT) since the 1970s, and there is no better place to learn about them than this book. The author has been an active contributor to the field over the past four decades and has produced a personal book based on his lectures over the years. Reading the table of contents virtually gives you vertigo: the depth and breadth of the topics covered are simply staggering.
The book is structured in two parts: before and after supersymmetry – although many of the concepts introduced in the first part have an extension in the supersymmetric context, with interesting conceptual variations. All subjects are treated thoroughly and with great clarity. The opening chapter deals with the important subject of the phases of gauge theories, and the book continues with the many exotic objects and phenomena that populate QFT – kinks, domain walls, strings, vortices, monopoles, skyrmions, instantons, chiral anomalies, confinement and chiral-symmetry breaking – together with a quick overview of lower-dimensional models related to the theory of phase transitions. The treatment of each subject is rather complete, and many difficult subjects are explained with exemplary clarity, for example the use of collective co-ordinates and their importance in the quantization of semiclassical states, as well as the interplay between chiral-symmetry breaking and confinement in QCD-like theories.
The author provides one of the most elegant and concise presentations of the use of instantons in gauge theories that I have ever seen. This is a notoriously complex subject, with important physical implications, such as the vacuum angle in QCD and the strong CP problem. The computations are carried out in detail and the reader is led by a sure hand through all of the delicate aspects of rather complex calculations. This is a great service to anyone trying to learn advanced QFT after a grounding in the standard courses in the subject.
In the second part of the book, the author provides a remarkably lucid and complete presentation of supersymmetry, supersymmetric gauge theories and all of their associated phenomenology. It would be difficult to pack more information into the 150 pages dedicated to the subject. The phase diagram of supersymmetric gauge theories is rather complex and subtle, and there is no one better suited than the author to introduce the subject; he has been one of its main contributors over the years. Supersymmetry anomalies, the Witten index, the implications and uses of instantons within supersymmetry, the super-Higgs mechanism and the Russian beta-function are but a few of the topics featured in this part of the volume.
This book is a must for anyone interested in learning about the developments in advanced field theory over the past few decades. It is a pedagogical and deeply insightful presentation by one of the masters in the field.
Supergravity By Daniel Z Freedman and Antoine Van Proeyen Cambridge University Press
Hardback: £45
E-book: $64
Since the work of Emmy Noether nearly a century ago, the idea of symmetry has played an increasingly important role in physics, resulting in spectacular successes such as Yang-Mills gauge theory along the way. Albert Einstein, in particular, realized that symmetry could be a foundational principle; his understanding that the space–time dependent (“local”) symmetry of general co-ordinate invariance could be used to build general relativity had an enormous impact on the development of 20th-century physics.
The current zenith of the local symmetry principle is the theory of supergravity, which combines general relativity with the spin-intermingling theory of supersymmetry to construct the richest and deepest symmetry-based theory yet discovered. Supergravity also lies at the foundation of string theory – a theory whose own symmetry principle has not yet been uncovered – and so is one of the central ideas of modern high-energy theoretical physics.
Unfortunately, since its invention in the 1970s, supergravity has been an infamously difficult subject to learn. Now, two of the inventors and masters of supergravity – Dan Freedman and Antoine Van Proeyen – have produced a superb, pedagogical textbook that covers the classical theory in considerable depth.
The book is notably self-contained, with substantial and readable introductory material on the ideas and techniques that combine to make up supergravity, such as global supersymmetry, gauge theory, the mathematics of spinors and general relativity. There are many well-chosen problems for the student along the way, together with compact discussions of complex geometry. After the backbone of the book on N=1 and N=2 supergravities, there is an excellent and especially clear chapter on the anti-de Sitter supergravity/conformal field theory correspondence as an application.
Naturally, any finite book has to cut short some deserving topics. I hope that any second edition will expand the discussion of superspace, to complement the current, clear treatment based on the component multiplet calculus, and will say more about supergravity and supersymmetry in the quantum regime.
Overall, this is a masterful introduction to supergravity for students and researchers alike, which I strongly recommend.
A new initiative to provide open access to peer-reviewed particle physics research literature was launched at CERN on 1 October by the Sponsoring Consortium for Open Access Publishing in Particle Physics – SCOAP3. Open dissemination of preprints has been the norm in particle physics for two decades but this initiative now brings the peer-review service provided by journals into the open-access domain.
In the SCOAP3 model, funding agencies, research institutions, libraries and library consortia pool the resources currently used to subscribe to journal content and use them to support the peer-review system directly. Publishers then make electronic versions of their journals open access. Articles funded by SCOAP3 will be available under a Creative Commons (CC BY) licence, meaning that they can be copied, distributed, transmitted and adapted as needed, with proper attribution.
Representatives from the science-funding agencies and library communities of 29 countries were present at the launch. The publishers of 12 journals, accounting for the vast majority of articles in particle physics, have been identified for participation in SCOAP3 through an open and competitive process. With a projected SCOAP3 budget of SwFr36 million over three years, more partnerships with key institutions in Europe, America and Asia are foreseen as the initiative moves through the technical steps of organizing the re-direction of funds from the current subscription model towards a common internationally co-ordinated fund. SCOAP3 expects to be operational for articles published as of 2014.
On 10–12 September, some 500 physicists attended an open symposium in Krakow for the purpose of updating the European Strategy for Particle Physics, which was adopted by CERN Council in 2006. The meeting provided an opportunity for the global particle-physics community to express views on the scientific objectives of the strategy in light of developments over the past six years. With the aid of a local organizing committee, it was arranged by a preparatory group chaired by Tatsuya Nakada (see Viewpoint “Charting the future of European particle physics”).
In the early 1960s, a 4-km-long strip of land in the rolling hills west of Stanford University was transformed into the longest, straightest structure in the world – a linear particle accelerator. It was first dubbed Project M and affectionately known as “the Monster” by the scientists at the time. Its purpose was to explore the mysterious subatomic realm.
Fifty years later, more than 1000 people gathered at SLAC National Accelerator Laboratory to celebrate the scientific successes generated by that accelerator and the ones that followed, and the scientists who developed and used them. The two-day event on 24–25 August, for employees, science luminaries and government and university leaders, was more than a tribute to the momentous discoveries and Nobel prizes made possible by the minds and machines at SLAC. It also provided a look ahead at the lab’s continuing evolution and growth into new frontiers of scientific research, which will keep it at the forefront of discovery for decades to come.
A history of discovery
The original linear-accelerator project, approved by Congress in 1961, was a supersized version of a succession of smaller accelerators, dubbed Mark I to Mark IV, which were built and operated at Stanford University and reached energies of up to 730 MeV. The “Monster” would accelerate electrons to much higher energies – ultimately to 50 GeV – for ground-breaking experiments in creating, identifying and studying subatomic particles. Stanford University leased the land to the federal government for the new Stanford Linear Accelerator Center (SLAC) and provided the brainpower for the project. This set the stage for a productive and unique scientific partnership that continues today, supported and overseen by the US Department of Energy.
Soon after the new accelerator reached full operation, a research team that included physicists from SLAC and the Massachusetts Institute of Technology (MIT) used the electron beam in a series of experiments, starting in 1967, that provided evidence for hard scattering centres within the proton – in effect, the first direct dynamical evidence for quarks. That research led to the awarding of the 1990 Nobel Prize in Physics to Richard Taylor of SLAC and Jerome Friedman and Henry Kendall of MIT.
SLAC soon struck gold again with discoveries that were made possible by another major technical feat – the Stanford Positron Electron Asymmetric Ring, SPEAR. Rather than aiming the electron beam at a fixed target, the SPEAR ring stored beams of electrons and positrons from the linear accelerator and brought them into steady head-on collisions.
In 1974, the Mark I detector at SPEAR, run by a collaboration from SLAC and Lawrence Berkeley National Laboratory, found clear signs of a new particle – but so had an experiment on the other side of the US. In what became known as the “November Revolution” in particle physics, Burton Richter at SLAC and Samuel Ting at Brookhaven National Laboratory announced their independent discoveries of the J/ψ particle, which consists of a paired charm quark and anticharm quark. They received the Nobel Prize in Physics for this work in 1976. Only a year after the J/ψ discovery, SLAC physicist Martin Perl announced the discovery of the τ lepton, a heavy relative of the electron and the first of a new family of fundamental building blocks. He went on to share the Nobel Prize in Physics in 1995 for this work.
These and other discoveries that reshaped the understanding of matter were empowered by a series of colliders and detectors. The Positron–Electron Project (PEP), a collider ring with a diameter almost 10 times that of SPEAR, ran during 1980–1990. The Stanford Linear Collider (SLC), completed in 1987, focused electron and positron beams from the original linac into micron-sized spots for collisions at a total energy of 100 GeV. Producing thousands of Z bosons in its lifetime, the SLC hosted a decade of seminal experiments. It also pioneered the concepts behind the current studies for a linear electron–positron collider to reach energies in the region of 1 TeV.
PEP was followed by the PEP-II project, which included a set of two storage rings and operated in the years 1998–2008. PEP-II featured the BaBar experiment, which created huge numbers of B mesons and their antimatter counterparts. In 2001 and 2004, BaBar researchers and their Japanese colleagues at KEK’s Belle experiment announced evidence supporting the idea that matter and antimatter behave in slightly different ways, confirming theoretical predictions of charge-parity violation.
Synchrotron research and an X-ray laser
Notably, new research areas and projects at SLAC have often evolved as the offspring of the original linear accelerator and storage rings. Researchers at Stanford and SLAC quickly recognized that electromagnetic radiation generated by particles circling in SPEAR, while considered a nuisance to the particle collision experiments, could be extracted from the ring and used for other types of research. They developed this synchrotron radiation – in the form of beams of X-ray and ultraviolet light – as a powerful scientific tool for exploring samples at a molecular scale. This early research blossomed as the Stanford Synchrotron Radiation Project (SSRP), a set of five experimental stations that opened to visiting researchers in 1974.
Its modern descendant, the Stanford Synchrotron Radiation Lightsource (SSRL), now supports 30 experimental stations and about 2000 visiting researchers a year. SPEAR – or more precisely, SPEAR3 following a series of upgrades – became dedicated to SSRL operations 20 years ago. This machine, too, has allowed Nobel-prize winning research. Roger Kornberg, professor of structural biology at Stanford, received the Nobel Prize in Chemistry in 2006 for work detailing how the genetic code in DNA is read and converted into a message that directs protein synthesis. Key aspects of that research were carried out at the SSRL.
Cutting-edge facilities
Meanwhile, sections of the linear accelerator that defined the lab and its mission in its formative years are still driving electron beams today as the high-energy backbone of two cutting-edge facilities: the world’s most powerful X-ray free-electron laser, the Linac Coherent Light Source (LCLS), which began operating in 2009; and FACET, a test bed for next-generation accelerator technologies. LCLS-II, an expansion of the LCLS, should begin construction next year. It will draw electrons from the middle section of the original linear accelerator and use them to generate X-rays for probing matter with high resolution at the atomic scale.
The late Wolfgang “Pief” Panofsky, who served as the first director of SLAC from 1961 until 1984, often noted that big science is powered by a ready supply of good ideas. He referred to this as the “innovate or die” syndrome. In 1983, Panofsky wrote that he had been asked since the formation of the lab, “How long will SLAC live?” The answer was and still is: “about 10 to 15 years, unless somebody has a good idea. As it turns out, somebody always has had a good idea which was exploited and which has led to a new lease on life for the laboratory.”
Under the leadership of its past two directors – Jonathan Dorfan, who helped launch the BaBar experiment and the astrophysics programme, and Persis Drell, who presided over the opening of the LCLS – SLAC’s scientific mission has grown and diversified. In addition to its original focus on particle physics and accelerator science, SLAC researchers now delve into astrophysics, cosmology, materials and environmental sciences, biology, chemistry and alternative energy research. Visiting scientists still come by the thousands to use lab facilities for an even broader spectrum of research, from drug design and industrial applications to the archaeological analysis of fossils and cultural objects. Much of this diversity in world-class experiments is based on continuing modernizations at the SSRL and the unique capabilities of the LCLS.
SLAC’s scientists and engineers continue to collaborate actively in international projects – designing machines and building components, running experiments and sharing data with other accelerator laboratories in the US and abroad, including in China, France, Germany, Italy, Japan, Korea, Latin America, Russia, Spain and the UK. The lab’s long-standing collaboration with CERN provided an important spark in the formative years of the World Wide Web and led to SLAC’s launch of the first web server in the US. SLAC is also playing an important role in the ATLAS experiment at CERN’s LHC. In the area of synchrotron science, collaborations with US national laboratories and with overseas labs such as DESY in Germany and KEK in Japan have contributed greatly to the development of advanced tools and methodologies, with enormous scientific impact.
Expertise in particle detectors has even elevated the lab’s research into outer space. SLAC managed the development of the Large Area Telescope, the main instrument on board the Fermi Gamma-ray Space Telescope, which was launched into orbit in 2008 and continues to make numerous discoveries. The lab has also earned a role in building the world’s largest digital camera for an Earth-based observatory, the Large Synoptic Survey Telescope, with construction scheduled to begin in 2014 for eventual operation on a mountaintop in Chile.
Richter, who served as SLAC director from 1984 to 1999, has said that the fast-evolving nature of science necessitates a changing path and pace of research. “Labs can remain on the frontiers of science only if they keep up with the evolution of those frontiers,” he remarks. “SLAC has evolved over its first 50 years and is still a world leader in areas beyond what was thought of when it was first built. It is up to the scientists of today to keep it moving and keep it on some perhaps newly discovered frontiers for the next 50.”
This article is based on one published by the SLAC News Centre.
Stanford University, in California, already has a leading position as far as linear accelerators are concerned. It operates a whole family of linacs, several of which are used for medical purposes. The 200 ft machine [Mark III] in operation there produces 700 MeV electrons and its energy will be stepped up to 1050 MeV.
Late in May, Stanford made the scientific headlines – again with a linac.
Addressing a science research symposium in Manhattan, President Eisenhower announced that he would recommend to the US Congress the financing of a “large new electron linear accelerator … a machine two miles long, by far the largest ever built”.
This machine, intended for Stanford University, would be one of the most spectacular atom smashers ever devised. Two parallel tunnels would have to be driven for two miles into the rock of a small mountain in the vicinity of Palo Alto. Such natural cover would, of course, stop any dangerous radiation. One of the tunnels, the smaller in diameter, would house the accelerator proper, while the bigger one would be used for maintenance purposes.
The proposed new linac for Stanford would initially produce 15 BeV (GeV) electrons; it is announced that this energy could later be raised to 40 BeV. It is believed that the machine would take six years to build, at a cost of 100 million dollars. Approval of the project, which can come only after Congressional hearings, depends on a decision due in July.
Quantum Exodus By Gordon Fraser Oxford University Press
Hardback: £25
Don’t be misled by the title of this book. It contains a surprising amount of information, going well beyond the exodus of Jewish scientists from Germany after the rise of the Nazi Party. The book puts anti-Semitism into a broad historical perspective, starting with the destruction of the Temple in Jerusalem, the expulsion of Jews all across Europe and the growth of a mild and sometimes hidden anti-Semitism. This existed in Germany in the 19th century and even, to some extent, in the early years of the Nazis, when the initial objective was to cleanse German culture of all non-Aryan influences. However, various phases led eventually to the Holocaust. A political spark was ignited when Adolf Hitler became Chancellor in January 1933 and the Reichstag, the parliamentary building in Berlin, went up in flames the following month. The Civil Service Law, which forbade Jews from being employed by the state, was soon introduced, followed by the burning of books and the Kristallnacht, during which Jewish shops were destroyed – all further steps towards the “final solution”.
In parallel to these political developments, Quantum Exodus describes the rise of physics in Germany during the 19th and early 20th centuries, with protagonists such as Alexander von Humboldt, Wilhelm Röntgen, Hermann von Helmholtz, Max Planck, Walther Nernst and Arnold Sommerfeld. They attracted many Jewish scientists from all over Europe – among them Hans Bethe, Max Born, Peter Debye, Albert Einstein, Lise Meitner, Leó Szilárd, Edward Teller and Eugene Wigner – who went on to become key players in 20th-century physics. Most of them left Germany, some early on, others escaping at the last moment, and most went to the UK or US, often via Denmark, with Niels Bohr’s institute as a temporary shelter. An exodus also started from other countries, such as Austria and Italy. The book recounts the adventurous and disheartening fates of many of these physicists. Arriving as refugees, they were often initially considered aliens and, during the war, sometimes even spies. The author adds spice to his narrative with amusing details from the private lives of some of the protagonists.
A detailed account is given of the Manhattan Project and of how the famous letter from Einstein to President Franklin Roosevelt initiated the building of the fission bomb. It was written as a result of pressure from Szilárd, the main mover behind the scenes. Less well known is the crucial importance of a paper by Otto Frisch and Rudolf Peierls in the UK, which already contained the detailed ideas of the fission bomb. Robert Oppenheimer, an American Jew, became scientific director of the Manhattan Project after his studies in Europe, bringing the European mindset to the US. He attracted many émigrés to the project, such as Bethe, Teller, Felix Bloch and Victor Weisskopf. The book relates vividly how Teller, because of his stubborn character, could not be well integrated into the project; rather, he pushed in parallel for the H-bomb.
The author implies, although somewhat indirectly, that the rise of Nazism and the development of the nuclear bomb have a deeper correlation, without giving convincing details. However, the interaction of science (and its stars) with politics is well described. Bohr, although at the centre of nuclear physics, had limited influence – partly because of his mumbling and bad English (something that I witnessed at the Geneva Atoms for Peace conference in 1957, where his address in English had to be translated simultaneously into English).
Many of the exiled physicists who worked on the Manhattan Project developed considerable remorse after the events of Hiroshima and Nagasaki. When I invited Isidor Rabi to speak at the 30th anniversary of CERN, he described his involvement in the foundation of CERN as a kind of recompense for his wartime activities.
The descriptive account of science in the US and Europe after the Second World War is interesting. In the US, politicians’ interest in science decreased substantially, and this changed only when the shock of Sputnik eventually led to the “space race”. Basic science also benefited from this change, leading for example to the foundation of various national laboratories, such as Fermilab. In Europe, a new stage for science emerged when a pan-European centre was proposed to provide resources on a continental rather than a national scale, and CERN was founded in 1954.
The book benefits from the fact that the author is competent in physics, which he sometimes describes poetically, but never wrongly. He has done extremely careful research, giving many references and a long list of Jewish emigrants. I found few points to criticize. Minor objections concern passages about CERN, even though the author knows the organization well. For example, CERN’s response to the Superconducting Super Collider was the choice of a 27 km circumference for the LEP tunnel, in view of the possibility of a later proton–proton or proton–electron collider in the same tunnel, while the definitive LHC proposal came only in 1987; and the LHC magnets are superconducting to achieve the necessary high magnetic fields, not so much to save electricity.
The various chapters are not written in chronological order, and political and scientific developments are integrated with human destinies. This makes for easy and entertaining reading. Older readers who, like me, have known many of the protagonists will not escape poignant emotions. For young readers, the book is recommended because they will learn many historical facts that should not be forgotten.
One intriguing question (probably unanswerable) that was not considered is: what would have happened to US science without the contribution of Jewish immigrants?