A Century of Physics

by D Allan Bromley, Springer-Verlag New York, ISBN 0387952470, $59.95.

Senior statesman of US physics, Allan Bromley, has chosen the centenary of the American Physical Society to offer an illustrated review of the last 100 years of physics. At various times in his career, Professor Bromley has been president of the American Physical Society, the American Association for the Advancement of Science and the International Union of Pure and Applied Physics. He was also founder of Yale’s nuclear structure laboratory, and is Sterling professor of the sciences and dean of engineering at Yale. All of these achievements make him very well qualified to present a successful and accessible overview of 20th-century physics.

Pictures are the stars of this book, bringing the highly readable narrative to life. The reader is treated to images such as that of J Robert Oppenheimer and Edward Teller shaking hands despite their well-publicized differences. The great advances in accelerators are brought home by the picture of the original Cockcroft-Walton machine, and the hilarious photo of Isidor Rabi cooking hot dogs on the coil head of the Columbia cyclotron demonstrates not only that cooling technology has improved over the years, but also that physicists can have a delicious sense of humour.

What A Century of Physics necessarily lacks in depth, it more than makes up for in breadth. After covering events from the early part of the century, such as the Annus Mirabilis of 1932 and the Manhattan project, Bromley moves on to discuss post-war physics. He covers subjects as diverse as superconductivity and the evolution of computers, and he explains the Standard Model and covers the research activity of laboratories all around the world.

At the end of the book, Bromley draws connections between particle physics research and cosmology. In the book’s final breath, he goes back to the start of it all. Ten unanswered questions conclude his report, opening the door to a new century of physics.

The Casimir Effect: Physical Manifestation of Zero-Point Energy

by K A Milton, World Scientific, ISBN 9810243979, $87/£58.

In 1948 Hendrik Casimir showed that, according to quantum electrodynamics (QED), two parallel conducting plates should exert a force on each other – an effect that now bears his name. This force, according to one of several possible interpretations, is a direct result of the existence of zero-point vacuum fluctuations of the electromagnetic field. In simple terms, as the plates are placed closer and closer together, more and more modes of the electromagnetic field are excluded, with a corresponding reduction in the (admittedly infinite) amount of zero-point energy between them and an associated (and, amazingly, finite) attractive force. Similar effects occur whenever boundaries are placed in the vacuum, and all are collectively considered to be manifestations of the Casimir effect.
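
For orientation, here is the standard textbook result, which the review itself does not quote: for ideal conducting plates a distance d apart, the attractive pressure is

\[ \frac{F}{A} = -\frac{\pi^2 \hbar c}{240\, d^4} , \]

corresponding to an energy per unit area of $-\pi^2 \hbar c / (720\, d^3)$. At d = 1 μm this is a pressure of only about 1.3 mPa, which gives some feel for why the effect is so delicate to measure.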

Milton’s book reviews this remarkable phenomenon from a theoretical viewpoint. Starting with parallel conducting plates, he goes on to extensions to different geometries; partially conductive and dielectric materials in place of conductors; the relation to van der Waals forces; dimensions more and less than the usual 3+1; contributions due to fermion fields; finite temperature effects; radiative corrections; and implications for hadronic physics and even cosmology.

With all these calculations and applications one might imagine that the Casimir effect would be well understood, but hardly anything could be further from the truth. For example, Casimir forces can be repulsive; they tend to expand a spherical shell, which is by no means intuitively obvious and in fact is a bit of a pity. Anticipating a force of the opposite sign, Casimir had hoped that they might supply the Poincaré stresses needed to stabilize a model of an electron as a tiny spherical shell of charge, and even lead to a calculation of the numerical value of the fine structure constant.

If the sign of the Casimir effect for a spherical shell is somewhat surprising, what happens in other cases can be even stranger. Change a spherical shell to a cubical box and it still tries to expand, but make it a long thin rectangular box and it tends to collapse. Go to an even number of space dimensions and the force on a hyperspherical shell becomes infinite. It’s all wonderfully bewildering.

Perhaps the most interesting recently recognized manifestation of the Casimir effect – if indeed that’s what it is – is the phenomenon of sonoluminescence, in which an acoustically tickled bubble of air in water releases visible light in 100 picosecond bursts. While the jury is still out on what exactly is going on, there are calculations suggesting that this could be due to a dynamical version of the Casimir effect in which vibrations of the bubble excite the QED vacuum. Here, however, the theory is much more difficult to work out, and different approximations lead to wildly differing estimates of how big the effect ought to be.

The Casimir effect is about a lot more than a force between two metal plates, and Milton’s book offers a great opportunity to read about it and learn the techniques by which it can be calculated. My one criticism of the book, which is probably not really fair given that the author is a theorist, is that it would be beneficial to have a discussion of the techniques by which the effect is observed in the laboratory. That said, the book is very comprehensive, clearly written and filled with wonderful physics.

Quantum Electrodynamics

by V Gribov and J Nyiri, 2001 Cambridge University Press (Cambridge monographs on particle physics, nuclear physics and cosmology no. 13), ISBN 0521662281, £55/$80.

This short book is based on the lectures of Vladimir Gribov that were given in Leningrad in 1974. It was completed, after his death in 1997, by his collaborator Julia Nyiri and it provides a pleasant introduction to the basics of field theory and quantum electrodynamics (QED). One of the book’s strengths is its intuitive and relatively leisurely introduction to quantum field theory (QFT) via the Feynman propagator and diagram approach that is particularly suited to students on their first approach to the forbidding machinery of modern QFT. Indeed, in its treatment of elementary but fundamental topics – such as the construction of the scattering amplitude; the relation between causality, unitarity and analyticity in the Mandelstam plane; and tree-level processes such as the Compton effect or soft electron bremsstrahlung – it can be compared to two of the best older texts on quantum electrodynamics – Feynman’s own book of this title and the volume on QED of the Landau and Lifshitz series. Unfortunately it also inherits deficiencies from its origins in the early 1970s.

Though the last two chapters discuss radiative corrections in QED and some aspects of renormalization theory, such as Ward identities, no mention is made of the central topic of the renormalization group, either in its older Gell-Mann-Low form or in the more modern Wilsonian guise within the effective field theory picture. Thus there are no anomalous dimensions of operators or running couplings as encapsulated in beta-functions – apart from some remarks on the “zero charge problem” that the student may find rather confusing. Without these crucial tools a student is ill-prepared to explore the deeper properties of quantum field theory.

In addition there is no discussion of spontaneous symmetry breaking, the Higgs mechanism, Yang-Mills theory, ghosts, dimensional regularization, anomalies, or the operator product expansion. Therefore none of the physics of the theory of the strong or weak interactions can be discussed. So sadly, despite its pleasing and pedagogical introduction to the basics of QED, it cannot compete as a full introductory course with modern quantum field theory texts, such as Peskin and Schroeder’s An Introduction to Quantum Field Theory. However, I can recommend it as an enjoyable basic supplement to more complete texts.

How US physicists first came to work at CERN

In the late 1940s, Europe was struggling to emerge from the ruins of the Second World War. The US had played a vital role in the conflict, but had been less affected materially, and a shining vision of life across the Atlantic was a beacon of hope for millions of Europeans living in austerity, if not misery.

In a speech at Harvard on 5 June 1947, US Secretary of State George C Marshall said that the US should “assist in the return of normal economic health in the world”. North American “Marshall aid” was a major factor in restoring European economic health and dignity.

During the global conflict, many eminent European scientists had been drawn into the Manhattan Project at Los Alamos. Post-war, US science remained pre-eminent. Anxious to stem a “brain drain” of talent, farsighted pioneers saw that Europe needed a comparable scientific focus. This was the seed of an idea for a European centre for atomic research.

UNESCO role

One of the organizations established in the wake of the Second World War to help promote world peace and co-operation was the United Nations Educational, Scientific and Cultural Organization (UNESCO). At the UNESCO General Conference in Florence, Italy, in June 1950, the idea for a European scientific laboratory still lay dormant. Among the US delegation at Florence was Isidor Rabi, who had won a 1944 Nobel prize for his work on the magnetic properties of nuclei. Rabi had played a key wartime role at the MIT Radiation Laboratory, and understood how pressing scientific needs could be transformed into major new projects. After the war Rabi played a major role in establishing the US Brookhaven National Laboratory.

The establishment of an analogous European laboratory was to Rabi a natural and vital need. However, on arrival in Florence he was disturbed to find that there was no mention of this idea on the agenda. Two Europeans, Pierre Auger (then UNESCO’s director of exact and natural sciences) and Edoardo Amaldi, who was to be a constant driving force, helped Rabi through the intricacies of European committee formalities. So the European seed was fertilized and within a few years CERN was born.

Another major, and very different, US contribution to CERN came two years after the Florence meeting. In 1952 a group of European accelerator specialists – Odd Dahl of Norway, Frank Goward of the UK and Rolf Wideröe of Germany – visited Brookhaven. CERN’s initial goal was to build a scaled-up version of Brookhaven’s new synchrotron – the Cosmotron – and the CERN group were anxious to admire the highest-energy accelerator in the world at that time.

To prepare a welcome for the European visitors, Stanley Livingston at Brookhaven called together his accelerator specialists to see how they could help the Europeans. During one of these meetings Livingston pointed out that all of the machine’s C-shaped focusing magnets faced outwards. Why not make some of them face inwards? Quickly Ernest Courant and the rest of the Brookhaven team saw that arranging the magnets to face alternately inward and outward could increase the focusing power of the synchrotron. The European visitors arrived just as the implications of the “alternating gradient” idea were being appreciated.

The CERN team took the idea back to Europe and immediately incorporated it into their new synchrotron design. Two members of the Brookhaven team – John and Hildred Blewett – later went to CERN and played a major role in ensuring that the new CERN Proton Synchrotron (PS) delivered its first high-energy protons in November 1959, several months before Brookhaven’s Alternating Gradient Synchrotron. This was the start of a long tradition of US-Europe collaboration in development work for major particle beam machines, which continues to this day.

Ernest Courant later made important contributions to CERN’s Intersecting Storage Rings (ISR) project, helping to convince accelerator physicists that beams in a proton collider could remain stable for long periods. In the early 1970s, CERN specialists came to Fermilab to help build and commission the big new US synchrotron. A few years later, US machine physicists came to CERN when the comparable Super Proton Synchrotron was getting under way. In the mid-1970s, Burt Richter of SLAC, during a sabbatical sojourn at CERN, helped set the scale for CERN’s LEP electron-positron collider, which had to be built as large as possible to minimize the losses caused by synchrotron radiation. The design eventually settled on a circumference of 27 km.
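
The scaling at work here is a standard accelerator-physics result, added for context rather than taken from the article: the energy an electron radiates per turn grows as the fourth power of its energy and falls only linearly with the bending radius ρ,

\[ U_0 = \frac{4\pi}{3}\,\frac{r_e}{(m_e c^2)^3}\,\frac{E^4}{\rho} \approx 88.5\ \mathrm{keV} \times \frac{(E/\mathrm{GeV})^4}{\rho/\mathrm{m}} , \]

so at fixed energy the only cure is a bigger ring. For LEP’s bending radius of roughly 3.1 km, a 100 GeV beam still radiates close to 3 GeV on every turn.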

CERN officially came into being in 1954 when its convention document was ratified by the founding member states. Three years later, its first particle accelerator, a 600 MeV synchrocyclotron (SC), began operations. SC experiments on pion decay soon began to make their mark on the world particle physics scene. Slower to get going at the SC was a major effort to make precision measurements of the magnetic moment of the muon – the famous g-2 experiment.

CERN experiments and major international physics conferences at CERN and in Geneva in the late 1950s introduced many US experimentalists to the attractions of Europe for a short visit or a longer sabbatical stay. One of these was Leon Lederman, newly tenured at Columbia, who made many useful contacts during his first stay at CERN and who left resolved to return. The SC g-2 experiment involved a lot of physicists by the standards of the day and attracted several other major US figures. Some also collaborated in bubble-chamber studies at CERN to determine particle properties vital for the emerging particle classification schemes based on internal symmetry.

As one of CERN’s main aims was to stem the tide of scientific migration westwards across the Atlantic, it was natural for the laboratory to headhunt Europeans who had made the move to the US. So CERN’s first director-general was the Swiss physicist Felix Bloch, who had left Europe in 1933 and went on to win a 1952 Nobel prize for measurements of nuclear magnetism. However, Bloch’s move to CERN was not a success.

Another contemporary colossus straddling the Atlantic was Victor Weisskopf. Austrian by birth, Weisskopf had made pioneering contributions to quantum mechanics in Europe in the 1930s, and, like Bloch and many others, had fled to the US to escape Nazi persecution, eventually making his way to Rochester. During the Second World War Weisskopf had worked at Los Alamos as deputy to theory division leader Hans Bethe. At Los Alamos, Weisskopf developed a flair for 20th-century “big science”.

When CERN was looking for a new director-general in the early 1960s, Weisskopf was a natural candidate, a distinguished European with experience in the management of big physics projects. Despite his protests that he knew little about administration, he was pushed into the job and CERN flourished. During Weisskopf’s mandate CERN developed a strong sense of purpose, and ambitious new projects for the future were authorized.

Younger Europeans who had been working in the US also chose CERN as their research base for a return to Europe in the early 1960s. Several were to go on to become very influential. Jack Steinberger emigrated to the US in 1934 and went on to make landmark contributions, mainly with bubble chambers, at the new generation of post-war accelerators at Berkeley, Columbia and Brookhaven. At CERN, Steinberger switched to electronic detectors.

After completing his degree at Pisa, Carlo Rubbia moved to Columbia for a taste of front-line research in weak interaction physics before moving to the SC at CERN. In their subsequent careers, Rubbia and Steinberger were highly visible from either side of the Atlantic. Both these physicists participated in the first studies at CERN of the phenomenon of CP violation, discovered at Brookhaven in 1964.

The Ford Foundation provided generous funding so that scientists from nations that were not signatories to the CERN Convention could participate in the laboratory’s research programme. In this way, more young US researchers were able to visit. One was Sam Ting, who worked at the PS as a Ford Foundation fellow with Giuseppe Cocconi.

ISR testbed

Under Weisskopf, CERN’s next major project was the innovative ISR, the world’s first proton-proton colliding beam machine, which came into operation in 1971. It was unique: for the first time, Europe had a front-line particle physics machine of a kind that the US did not, opening up a totally new energy range, and many scientists were keen to see what it could do. Among the first to make the eastward pilgrimage to Geneva were Leon Lederman of Columbia and Rod Cool of Rockefeller.

Working at Brookhaven, Lederman had studied the production of muon pairs, initially hunting for the intermediate boson, the carrier of the weak nuclear interaction. This hunt was some 15 years premature, but on the other hand it convinced Lederman, and others, of the value of lepton pairs as a signature of basic interactions.

At the ISR, a Europe-Columbia-Rockefeller collaboration was among the first to see that under ISR conditions some high-energy particles emerged at wide angles to the direction of the colliding beams. This suggested that occasionally something violent happened when the proton beams clashed together. It was a few years after the historic experiments at SLAC, which had used electrons to probe deep inside the proton and see that it contained hard scattering centres, but the ISR experiments saw the constituents deep inside protons colliding with each other.

Over the lifetime of the ISR (1971-84), US participation in experiments at CERN developed from small bands of intrepid pioneers to major groups. Other active collaborations involved researchers from Brookhaven, Harvard, MIT, Northwestern, Riverside, Stony Brook, Syracuse and UCLA.

A major US contribution at CERN was the 1979 discovery at the ISR of direct single photons from quark processes – the first sighting of electromagnetic radiation from quarks. Playing an important role here was Bob Palmer, a European migrant to the US who retained an attachment to CERN.

Over its first decade of operation the ISR made it clear that “keyhole physics”, using just a small sample of the produced particles, was not the only way to go, and colliding beam machines needed big detectors to intercept as many as possible of the emerging particles. With their ISR apprenticeship, US physicists learned this lesson early.

The second half of this history will look at US involvement in modern collider physics at CERN.

Forty years of research on the structure of matter

Some 50 km south of San Francisco, a long, low structure stretches for 3 km through the rolling, oak-studded hills behind the Stanford University campus to the base of the Santa Cruz mountains. This curious feature is the klystron gallery of the Stanford Linear Accelerator Center (SLAC) – by far the world’s largest electron microscope. It is one of the longest buildings on the surface of the Earth.

Ever since this powerful scientific instrument began operating in the mid-1960s, SLAC has been generating intense, high-energy beams of electrons and photons for research on the structure of matter. Physicists using its facilities have received three Nobel prizes for the discovery of the quarks and the tau lepton, both recognized today as fundamental building blocks of matter. Led by Wolfgang Panofsky and Burton Richter, its first two directors, the centre has also played a leading role in developing electron-positron storage rings and large “4π” detectors to observe subatomic debris spewing out from high-energy particle collisions.

Since the mid-1970s, other scientists have employed SLAC’s ultrabright X-ray beams to study the structure and behaviour of matter at atomic and molecular scales in the Stanford Synchrotron Radiation Laboratory (SSRL), now a division of SLAC. Molecular biologists, for example, have used these X-ray beams to determine the detailed structures of important biological molecules such as HIV protease and RNA polymerase. Still others have examined the behaviour of catalysts, semiconductors, superconductors, and the endless variety of advanced materials that are becoming increasingly essential in today’s high-tech industries.

SLAC is a national laboratory operated by Stanford University on behalf of the US Department of Energy (DOE), which supports its operations. The National Institutes of Health and the National Science Foundation provide additional funding for specific equipment and experiments. Use of SLAC’s facilities is available to qualified researchers from around the world; about 3000 users come to the centre each year from more than 20 countries to perform research in groups ranging in size from a few to several hundred scientists. In addition, SLAC has a staff of about 1400, of whom more than 300 are scientists involved in ongoing research. The results of all research performed at the laboratory are published openly in scientific and technical journals; no classified research is carried out on the premises.

Three Nobel prizes

The principal focus of research at SLAC is elementary particle physics – the field in which the centre has earned its three Nobel prizes. The first of these went to Richter, who shared the 1976 prize with Sam Ting for the discovery of the famous J/psi particle (which was eventually found to be made of charm quarks) two years earlier. In 1990, Jerome Friedman, Henry Kendall and Richard Taylor shared the prize for uncovering the quark substructure of protons and neutrons by studying deep-inelastic electron scattering from these targets in the late 1960s and early 1970s. SLAC’s third Nobel prize was awarded to Martin Perl in 1995 for his discovery of the tau lepton in the mid-1970s.

Stanford and SLAC physicists have also spearheaded the development of linear electron accelerators since the late 1940s. In the past two decades they have pioneered the development of linear electron-positron colliders. This work began in the early 1980s when SLAC upgraded its linear accelerator and converted it into the Stanford Linear Collider (SLC). Whereas CERN’s Large Electron-Positron (LEP) collider achieved higher energies, beam polarization proved to be the SLC’s forte, allowing researchers to probe subtle phenomena in the dominant Standard Model of particle physics. During the late 1980s and early 1990s, experiments at the SLC and LEP studying the decays of massive Z particles pinned down the exact number of light neutrino species and measured many key parameters of the Standard Model – especially the weak mixing angle – to high levels of precision. Since the shutdown of LEP in 2000, SLAC has been generating the highest-energy electron and positron beams in the world.

Working with colleagues from other high-energy physics laboratories in Japan, Europe and the US, SLAC physicists have developed accelerator technology for a next-generation instrument called the Next Linear Collider, which will be 30 km long. In January 2002, the US High-Energy Physics Advisory Panel recommended that US physicists play a leading role in an international effort to design and build such a linear collider.

Current research programme

Today the SLAC high-energy physics programme pivots around the PEP-II B-Factory. This facility was built during the mid-1990s under the leadership of SLAC’s current director Jonathan Dorfan as an upgrade of the original PEP storage ring. The electron-positron collider resides in a roughly circular tunnel that courses for 2200 m under one end of the 450-acre site. Inside the sophisticated 1200-ton BaBar particle detector, beams of electrons and positrons collide at unequal energies – 9.0 and 3.1 GeV – creating millions of pairs of B mesons per month. An international collaboration – involving about 550 physicists from more than 70 institutions in nine countries – is examining how these particles disintegrate and searching for subtle differences between matter and antimatter. During the summer of 2001, they uncovered conclusive evidence for such an asymmetry, known as CP violation, in certain specific decays of neutral B mesons. The BaBar collaboration is continuing to seek further examples of this rare phenomenon, which is widely believed to be responsible for the great preponderance of matter in the universe.
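
A quick check, not spelled out in the article, shows why these particular unequal energies were chosen. For head-on beams the centre-of-mass energy is

\[ \sqrt{s} = \sqrt{4 E_- E_+} = \sqrt{4 \times 9.0 \times 3.1}\ \mathrm{GeV} \approx 10.6\ \mathrm{GeV} , \]

which sits right on the Υ(4S) resonance, a state that decays almost exclusively into pairs of B mesons. The asymmetry also gives the collision products a Lorentz boost along the beam axis, βγ = (E₋ − E₊)/√s ≈ 0.56, so the B mesons travel a measurable distance before decaying – essential for the time-dependent CP-violation studies mentioned above.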

Physics research continues to thrive in SLAC’s cavernous End Station A. Since the landmark discovery of quarks there, nuclear and high-energy physicists have used this fixed-target experimental facility to study the substructure of nuclear matter in great detail, most recently with polarized beams of 50 GeV electrons that became available after the construction of the SLC. They are now using these beams to make an exacting measurement of the weak mixing angle by scattering polarized electrons from atomic electrons and measuring the extremely slight asymmetries that are expected to occur. These physicists are also developing high-energy beams of polarized photons to continue their research on the quark-gluon substructure of protons and neutrons.

Looking to the long-term future of high-energy physics research, a group of SLAC physicists and engineers has been working for several years on advanced particle acceleration techniques. In collaboration with university researchers, for example, the group is developing laser-induced plasmas that can boost the energy of an electron beam substantially over very short distances. This team has worked on “plasma lenses” to focus and accelerate particle beams.

SLAC has also been moving aggressively into the closely related fields of particle astrophysics and cosmology, using insights and techniques from particle physics to study the heavens. (In fact, the leading cosmological theory of inflation was conceived at SLAC in 1980 by Alan Guth, then a postdoctoral researcher.) Aided by scientists from universities and laboratories in Europe, Japan and the US, SLAC physicists are designing and building the Gamma-ray Large Area Space Telescope (GLAST), a high-resolution detector of energetic (up to about 300 GeV) photons scheduled for launch into Earth orbit in 2005. GLAST will employ sophisticated particle-detection and data-acquisition techniques that were originally developed for ground-based particle-physics experiments. Jointly funded by the DOE, the National Aeronautics and Space Administration and foreign scientific agencies, this satellite will examine sudden outbursts of gamma rays from black holes and other exotic astrophysical sources.

Synchrotron radiation research

Cutting-edge research into the atomic and molecular structure of matter occurs at SSRL. Using the SPEAR storage ring, which was adapted to function as a dedicated synchrotron radiation source, scientists generate intense X-ray beams from a circulating 3 GeV electron beam. Each year, more than 1600 scientists from many different disciplines use this radiation for research in such areas as designing new drugs, developing advanced information technologies (for example flat-panel computer displays and high-density microchips) and remediation of environmental contamination. Since its inception in 1974, SSRL has pioneered this burgeoning field of synchrotron radiation research by developing equipment and experimental techniques commonly used today in nearly 50 such laboratories around the world. A major upgrade of the SPEAR facility that is currently under way will greatly increase the brightness of its X-ray beams and help to keep SSRL competitive with these other facilities.

Today, SLAC and SSRL are poised to begin building a next-generation facility, to be called the Linac Coherent Light Source, that will help to roll back the frontiers of X-ray research. Electrons accelerated in the final third of the linear accelerator will be compressed into tiny bunches that will then be directed through a special magnet array to produce laser-like X-ray beams of unparalleled brilliance. This unique instrument should open up new avenues of scientific research on such topics as ultrafast chemical reactions.

SLAC is also the world’s leader in developing high-power klystrons, which generate the microwaves used to accelerate electrons. Invented in 1937 at Stanford University, klystrons are also used to power radar arrays and for medical accelerators employed in cancer therapy. For decades, SLAC and the nearby Varian Corporation shared people, designs and ideas in a symbiotic relationship that has steadily advanced klystron technology. Medical accelerators are now a billion-dollar industry; they are used to give cancer treatments to more than 100,000 people every day. In addition, a software program called EGS (for Electron-Gamma Shower), developed by SLAC’s Ralph Nelson to simulate showers of subatomic particles, is used by hundreds of hospitals throughout the world to plan radiation dosages for cancer therapy.

Computers and telecommunications are other areas where SLAC research has strongly affected both the US and world economies. In December 1991, Paul Kunz expanded the then-fledgling World Wide Web (invented at CERN by Tim Berners-Lee) to North America, establishing the first US website at SLAC and making its popular SPIRES database easily accessible. The following year, another SLAC physicist developed an influential graphical Web browser to help communicate the reams of data and publications that are produced in the field every year.

Scientific education is ultimately one of SLAC’s most important goals. The thousands of students who have come to the laboratory to participate in advanced research have learned from working side by side with some of the best scientists on the planet, helping to push back the frontiers of their disciplines. They return to universities across the country and around the world – or take positions in industry or government – with a much better understanding of what it means to carry out scientific research.

Les neutrinos vont-ils au paradis? [Do Neutrinos Go to Heaven?]

by François Vannucci. Published in French by EDP Sciences, ISBN 2 86883 559 7, €18.

François Vannucci presented me with his manuscript as his first detective story. I am not surprised that his literary debut takes this form. Vannucci’s early career at CERN was followed by a stint at the Stanford linear accelerator. He has been lucky enough to have worked on experiments led by great figures of science in research leading to key discoveries.

That Vannucci is bent wholly on the pursuit of rare game comes as no surprise, and neutrinos with their mysteries were an ideal hunting ground. The nature of the prey lent flavour to the chase. As the secrets of each particle discovered were revealed, they shed new light on the structure of matter at the smallest scale, and on the micro-instants that followed the Big Bang.

Research went on in powerful groups, sometimes numbering several hundred, with ever-growing budgets. The sociology of this special world had little in common with the atmosphere prevailing in the small university labs of yesteryear. The groups were often led by outstanding physicists whose laurels had been won as a result of remarkable discoveries – frequently the combined fruit of a great theoretical background and an encyclopaedic knowledge of engineering methods.

But humans being humans, with genetic baggage built up through scores of millennia of the struggle for life, high-energy physics has included in its ranks the same proportions of geniuses, madmen, the generous, the envious, the selfish, the disinterested, the brutes, the power-hungry, the poets, the mystics and the cynics as any other group of humanity swept up in an adventure on an equivalent scale. Vannucci introduces us to the way in which the never-ending human comedy is played out at any elementary particle research centre. He brings us into a little world of Parisian physicists headed by a boss with boundless ambition and a fascination for neutrinos. This individual suffers from the shortcomings often found in such people – he is the monster whom no boss would admit to being, yet many ranking physicists will find that he has features that smack of their own bosses.

In this book, the research ends in a fiasco that is both material and social. The writer draws a ferocious and desperate fictional portrait of the lives of this team, worn out by failure and disappointed by a leader who had nonetheless fascinated them. Some are still neurotically attached to their boss despite the blind alley into which he has led them. The boss hangs himself, and we anxiously follow the narrator’s efforts to escape the spell that the dead man has cast.

I hope Vannucci’s new-found narrative gift will persuade him to inform the public of other secrets from the world of the physicist.

Great Physicists: the Life and Times of Leading Physicists from Galileo to Hawking

by William H Cropper, OUP, ISBN 0 19 513748 5, £24.95.

Physics is the study and formulation by physicists of how nature works. Without physicists, nature would still work but there would be nothing to describe it. Few, even among the physics community, know much about physicists, other than some hype about cult figures like Einstein, Feynman and Hawking.

Only a handful of geniuses, active at a time when their talents can bear fruit, can achieve the milestone discoveries or reveal the new insights that make science history. Here, William Cropper provides biographical snapshots of 30 famous physicists (in 29 chapters – Erwin Schrödinger and Louis de Broglie share a bed), extending through time from Galileo to Hawking, who was born on 8 January 1942, exactly 300 years after Galileo’s death. Hawking himself has remarked on this coincidence, and these two dates neatly bracket the span of the book.

The portraits are drawn from standard biographies, and those who are acquainted with these works will find nothing new. As Cropper explains in his preface, “No claim is made that this is a comprehensive or scholarly study…Read these chapters casually and for entertainment, and learn the lesson that science is a human endeavour.”

The first section covers the giant figures of Galileo and Newton, who centuries later still tower over the subject. Subsequent sections deal with thermodynamics (from Carnot to Nernst); 19th-century electromagnetism (Faraday and Maxwell); statistical mechanics (Boltzmann alone); relativity (Einstein supreme); quantum mechanics; nuclear physics (Curie, Rutherford, Meitner, and Fermi); particle physics (Dirac, Feynman and Gell-Mann); and astronomy, astrophysics and cosmology (Hubble, Chandrasekhar and Hawking). Most of the book, therefore, deals with 20th-century figures.

The cast of characters is Cropper’s choice and spans the whole spectrum of personality and destiny: tragic figures like Boltzmann, victims of fate like Meitner and Planck, ascetics like Dirac, the flamboyant Feynman, intellectual aristocrats like Gell-Mann and simple geniuses like Rutherford.

The book’s subjects include two women (Curie and Meitner) but are mainly confined to Europe and North America. The exceptions are Chandrasekhar, born in India, who spent his research career in Europe and the US; and Rutherford, born in New Zealand, who spent his research career in England and Canada. There are no Russians, which is a pity, considering the wealth of contributions to physics made by scientists in that country, who have been well recognized by Nobel awards.

Each biographical snapshot is prefaced by a useful summary, before a fuller account and an appraisal of the science (including some assimilable equations). Each is also labelled by a thumbnail portrait illustration, but otherwise there are no photographs of events (other than a bubble chamber). There is a separate chronology of the main events of the period covered, but there is no systematic indication of exact dates of birth and death, such as in Asimov’s work.

However, Cropper has done physics a great service by compiling this book, which compresses between two covers valuable material that would otherwise need a small library.

It Must be Beautiful: Great Equations of Modern Science

edited by Graham Farmelo, Granta Books, ISBN 1 86207 479 8, £20.

In this lively volume of semipopular essays, 12 leading scientists, historians of science and science writers discuss “beautiful” equations of 20th-century science. Some of the essays are elegant and revealing discourses centring on the equations themselves; others are equally interesting but more historical in nature, sometimes verging on the biographical. Almost all are accessible to a broad audience with a little scientific background.

Roger Penrose and Frank Wilczek thoughtfully discuss the meaning of Einstein’s equations of general relativity and Dirac’s equation respectively. Steven Weinberg, in his extended afterword, also discusses the Dirac equation, and both Wilczek and Weinberg focus on how the equation has survived despite our significantly altered understanding of its meaning since Dirac’s time.

The meaning of the possibly less well known, but certainly beautiful, equations of Yang-Mills theory (as well as such topics as the Higgs mechanism) is also nicely introduced by Christine Sutton. Igor Aleksander provides a rewarding piece on Claude Shannon’s great work founding information theory, and John Maynard-Smith discusses some fascinating aspects of the theory of evolution (including his own use of the theory of games in evolution theory, to which this essay provides a good introduction), while Robert May introduces the deceptively simple logistic equation with its chaotic solutions.

The essays from Graham Farmelo, Peter Galison, Aisling Irwin and Arthur Miller are also stimulating. Since they tend to be less centred on the equations, they leave room for dispute. For example, Arthur Miller makes a remark near the end of “Erotica, aesthetics and Schrödinger’s wave equation” (Schrödinger’s erotic life is endlessly fascinating to historians) that “the Heisenberg-Schrödinger dispute…was fundamentally one of aesthetic choice”, and he points out that physicists use Schrödinger’s formalism rather than Heisenberg’s matrix mechanics for aesthetic reasons. But Born’s great work on the probability interpretation showed that Schrödinger’s interpretation of the wavefunction was incorrect, giving, for example, no understanding of the interference terms in a sum of wavefunctions. Furthermore, surely the reason Schrödinger’s wavefunction (given the correct interpretation) is so popular is that it is easier to use than matrix mechanics, and that it stimulates visualization in the reader, which ultimately leads to suggestions for applications.

Surprisingly, the contents include an essay on Drake’s equation. This is the formula for the number of technological civilizations in our galaxy, depending on such things as the rate of star formation, the likelihood of intelligent life evolving and, least knowable of all, the typical lifespan of a technological civilization. This sums up this collection nicely – you can expect to be entertained and informed in equal measure, often by surprise, and hopefully its success will lead to a second volume.
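
For readers meeting it for the first time, Drake’s equation is conventionally written as

\[ N = R_{\ast}\, f_p\, n_e\, f_l\, f_i\, f_c\, L , \]

where R* is the rate of star formation in the galaxy, f_p the fraction of stars with planets, n_e the number of habitable planets per planetary system, f_l, f_i and f_c the fractions of those on which life, intelligence and communicating technology respectively arise, and L the average lifetime of a technological civilization – the last being, as noted, the least knowable factor of all.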

Hands-on particles appeal to students

Elementary particles are the universe’s simplest constituents, but their interactions are far from simple. When two elementary particles collide, all sorts of things can happen. Viewed through the eye of a big detector, the outcome is usually a convoluted maze of secondary particles and it is difficult to see at first or even at nth glance what is going on. Usually the experiment’s computers have to use complex pattern recognition procedures to join up the individual read-out “dots” and reveal the underlying particle tracks. Even then, additional complex analysis is needed.

Occasionally, however, the interactions recorded by the detector are particularly simple, especially for collision scenarios like those at CERN’s LEP collider, which from 1989 to 2000 threw high-energy beams of electrons and positrons together.

Electrons and positrons, unlike protons, are truly elementary and contain no constituents (at least as far as we know). They are also particle and antiparticle of each other, and they can mutually annihilate to produce another particle-antiparticle pair, such as two oppositely charged muons. These rare but simple processes provide a direct window into the most basic interactions of nature.

The other tool that can be brought to bear is what CERN researcher Erik Johansson calls “topology”. The computers’ pattern-recognition programs can reveal regularities in the way the produced particles emerge. These patterns reflect the elementary particle interactions in a direct way.

If an underlying scheme is not clear, information quickly becomes baffling and incomprehensible. An example is the subway system of a large city like London or Paris. Street maps of big cities are fine for finding one’s way around on foot. They also usually show where the underground stations are, but this does not make clear how the lines are arranged, so it is not easy to plan an underground journey from such a map.

The key is to use a different map, in which the streets have been thrown away and the station dots are connected by different-coloured lines. Immediately, everything becomes clear – to get from A to B, take the red line and transfer at C onto the blue line. The detailed paths taken by the underground lines are not of vital importance to the traveller, only their general direction and interconnections, so such maps are often schematic. This simplification is a great help to understanding.

So it is with elementary particles. The most versatile elementary particles are the quarks, of which there are six varieties, or “flavours”, arranged in three pairs – up and down; strange and charmed; and beauty (or bottom) and top. Quarks are (as far as we know) the ultimate layer in the structure of matter. Substances are made up in turn of molecules, then atoms, then electrons and nuclei, the last being composed of protons and neutrons, and these in turn being built of quarks.

Quarks are different from all of the other constituents of matter. Molecules can be broken into atoms, atoms into electrons and nuclei, and nuclei into protons and neutrons. We know that protons and neutrons are built of quarks, but quarks cannot be isolated. Although we can see that protons and neutrons each contain three quarks, quarks cannot, under ordinary conditions, exist on their own as free particles. How, then, can we explore them?

Mapping quark jets

When an electron and a positron annihilate, one possibility is for them to create a quark-antiquark pair (in fact the electron and positron first annihilate into a neutral Z boson, which within 10⁻²⁴ s materializes into the quark-antiquark pair). However, because quarks and antiquarks cannot exist as free particles, the emergent quark-antiquark pair manifests itself as two narrow sprays (“jets”) of subnuclear particles. These jets, the progeny of the produced quarks, fly off back-to-back, each jet confined around the direction of the parent quark. Mapping these jets thus reveals how the parent quarks were produced.

An electron and a positron can also produce other quark and antiquark arrangements, both with and without accompanying gluons (the particles that transmit the forces between quarks). Each quark-gluon arrangement produces a characteristic jet pattern. For example, a quark-antiquark pair that is accompanied by a single gluon will produce three jets of subnuclear particles.

These jet patterns, particularly when emphasized by the use of colour, are the quark-gluon physics equivalent of the subway map.

It is rather like monitoring an underground/subway/metro system by watching how the passengers emerge above ground. A burst of passengers means that a train has recently stopped underground.

Johansson’s idea is to select LEP events and prepare them in a way that appeals to 15- to 18-year-old students. He uses electron-positron interactions recorded by the Delphi detector at LEP and presented via a special “Hands-on CERN” Web site. The Web site also contains introductory material and explanations, together with supplementary material about subjects such as particles and their interactions, particle accelerators and Nobel Prizes. The interactions have been “cleaned up” to delete information that is only of interest to physics researchers and optimized for Internet access.

From these displays, students can quickly see how nature works at the most fundamental level. The displays show basic information, such as collision energy, the various particles produced and their energy and momentum. Analysing these collisions does not require any knowledge of quantum chromodynamics or any other exotic concept. Simple billiard-ball kinematics, with maybe a pinch of relativity, is all that is needed.
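
As an illustration of the kind of arithmetic involved – a minimal sketch added here, not code from the Hands-on CERN package – the invariant mass of a particle pair follows directly from the energies and momenta shown in an event display:

import math

def invariant_mass(e1, p1, e2, p2):
    """Invariant mass in GeV of two particles with energies e1, e2 (GeV)
    and momentum three-vectors p1, p2 (GeV), in natural units (c = 1):
    m^2 = (E1 + E2)^2 - |p1 + p2|^2."""
    e = e1 + e2
    px = p1[0] + p2[0]
    py = p1[1] + p2[1]
    pz = p1[2] + p2[2]
    return math.sqrt(max(e * e - (px * px + py * py + pz * pz), 0.0))

# Illustrative numbers only: a back-to-back muon pair from electron-positron
# annihilation at LEP's Z peak, each muon carrying half the collision energy.
m = invariant_mass(45.6, (20.0, 30.0, 27.8), 45.6, (-20.0, -30.0, -27.8))
print(f"dimuon invariant mass = {m:.1f} GeV")  # about 91.2 GeV, the Z mass

With real events the two momenta never cancel exactly, and the small mismatches are themselves instructive, hinting at measurement resolution or at particles, such as neutrinos, that escaped undetected.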

In this way, students all over the world can access frontier research data, but nothing can substitute for a visit to a major accelerator laboratory. Only in this way can students fully appreciate what large and complex instruments are needed to probe the smallest constituents of matter. Special programmes of lectures and study are regularly arranged for Swedish students by the Swedish Research Council secretariat at CERN along with Swedish CERN researchers. During a short stay at CERN, students get first-hand experience of how science works. “That day I thought I found the Higgs boson,” remarked one recent visitor.

Following a suggestion from Johansson, an extra dimension was recently added to a visit during which students from the UK joined their Swedish counterparts at CERN – at a time in their careers that is useful for future networking. These special programmes aim to rectify the lack of exposure to modern physics in many school curricula.

A keyhole to the birth of time

Building on Johansson’s original idea, CERN’s James Gillies and Richard Jacobsson produced an educational CD-ROM physics analysis package for schools. Particle Physics – a Keyhole to the Birth of Time contains the same real data and analysis tools as the “Hands-on CERN” Web site and comes complete with a comprehensive and visually attractive tutorial package. The authors’ goal is to provide a stand-alone product that teachers can use in class without detailed prior knowledge of particle physics. Bielefeld computer scientist Olaf Lenz designed an easily navigable structure and award-winning American cartoonist Nina Paley created the characters – Malard Decoy and Phyllis Ducque – who act as guides through the content of the CD-ROM. The package is available free of charge to schoolteachers on request to the authors at CERN (e-mail James Gillies or Richard Jacobsson).

Accelerator tradition is thriving at Brookhaven

Brookhaven, funded today by the US Department of Energy, was born of the dreams of scientists returning from Los Alamos after the Second World War. They were looking for facilities to continue their research into the mysteries of the atom and they were unable to find them at their home universities. Soon, championed by Columbia physicists Isidor Rabi and Norman Ramsey, the idea of universities coming together to build a common research institute began to take shape. In 1947, nine north-eastern US universities clubbed together to form Associated Universities, Inc. with the goal of establishing a laboratory, and the model for many of today’s major laboratories, including CERN, was set.

Man-made cosmic rays

Not long after, plans for the Cosmotron – Brookhaven’s first particle accelerator – were laid. Taking its name from the cosmic rays that constantly shower down on Earth, the Cosmotron was the first accelerator to break the giga-electronvolt barrier, reaching energies as high as 3.3 GeV before it was switched off in 1966. The fact that it was also the first accelerator in the world to provide an extracted beam led to the Cosmotron being dubbed the world’s biggest slingshot by Popular Science magazine.

Scientifically, the Cosmotron lived up to its name, allowing all kinds of particle formerly seen only in cosmic rays to be studied in the laboratory. It was also the machine behind Brookhaven’s first Nobel prize. Two guest scientists working at the laboratory in 1956 – T D Lee and C N Yang – interpreted Cosmotron data as arising from parity violation in weak interactions, earning themselves a trip to Stockholm just one year later.

It is perhaps to accelerator physics that the Cosmotron left its greatest legacy, however. From the start, the Cosmotron’s builders recognized the limitations of synchrotrons as they were used at the time. In such machines, increasing particle energy was invariably accompanied by increasing orbit instability in the horizontal plane. To build a more powerful machine would require more powerful – and vastly heavier – magnets, imposing a practical upper limit on the energy achievable. The solution, developed by Ernest Courant, Stanley Livingston and Hartland Snyder in the 1950s, was to alternate the horizontal orientation of the bending magnets so that the field gradients in the horizontal plane also alternated. This principle became known as strong focusing and it opened the door to much higher energies.
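
The gain can be made quantitative with a standard thin-lens argument, added here for illustration where the article stays qualitative. A focusing and a defocusing lens of equal strength, with focal lengths +f and −f, separated by a drift of length d, combine to give

\[ \frac{1}{F} = \frac{1}{f} - \frac{1}{f} + \frac{d}{f^2} = \frac{d}{f^2} > 0 , \]

so the pair is net focusing, and by symmetry the same holds in the other transverse plane with the roles of the lenses exchanged. Alternating the gradients around the ring therefore confines the beam far more tightly than any single-sign arrangement, allowing smaller magnet apertures and higher energies.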

By this time, Europe’s new laboratory, CERN, was getting off the ground. It was founded on the Brookhaven collegiate model with member states taking the place of Brookhaven’s universities. Links between the two laboratories were close and news of the strong focusing idea reached the European laboratory in time for it to recast its new proton synchrotron (PS) as a strong focusing machine. The CERN PS duly became the first operational, strong-focusing proton synchrotron in the world with a design energy of 25 GeV instead of the 10 GeV that would otherwise have been possible. Brookhaven’s Alternating Gradient Synchrotron (AGS) came on stream soon after and these two machines remain at the heart of the two laboratories’ accelerator complexes to this day.

The AGS has provided a rich harvest of physics for Brookhaven, earning the Nobel prize three times. Leon Lederman, Melvin Schwartz and Jack Steinberger had to wait until 1988 to receive the prize for their 1962 discovery of the muon neutrino. James Cronin and Val Fitch had a shorter wait, receiving the call to Stockholm in 1980 for their 1963 observation of CP-violation. Sam Ting picked up the prize for his 1974 discovery of the J/psi particle, along with Burton Richter of California’s SLAC laboratory, just two years later.

Physics in collision

Flush with the success of the AGS, Brookhaven set its sights high and in 1970 accelerator physicist John Blewett revived an earlier idea of building a machine to store and collide proton beams. Named ISABELLE, the project called for a pair of intersecting storage rings using the AGS as the injector. R&D for the new machine soon got under way and a ground-breaking ceremony was held in 1978. Soon after, however, a mixture of technical problems and changing political winds led to ISABELLE being dropped in favour of an even more ambitious project elsewhere – the Superconducting Super Collider – the downfall of which was later to send shockwaves around the world’s particle physics community.

Brookhaven persevered and was soon back with a new proposal to build a relativistic heavy-ion collider (RHIC) on the ISABELLE site. RHIC’s main aim would be to seek out and explore the exotic states of matter produced when heavy ions collide at huge energy densities. Using the AGS as the injector, RHIC would also serve as a proton-proton collider with the ability to collide polarized protons, helping to unravel the long-running mystery of nucleon spin. This part of the programme led to the establishment of a joint research initiative between Brookhaven and the Japanese Institute of Physical and Chemical Research (RIKEN) in 1995. The RIKEN-Brookhaven Research Center is now involved in the full RHIC programme and is also building a high-performance supercomputer for lattice QCD. Set to start up in March 2003, the new machine will reach 10 teraflops for the modest price tag of $5 million (about 50 cents per megaflop).

Accelerator-based programme

RHIC was switched on in 2000 following a decade of development and construction. The timing interlocked perfectly with a fixed-target, heavy-ion programme at CERN and provided a new focus for this line of research. RHIC’s first polarized protons were injected in December last year. With the AGS fixed-target research programme running concurrently with RHIC for the first time in 2001, Brookhaven’s accelerator-based programme is in rude health. The major thrust of the AGS programme is a long-running rare kaon decay experiment, which recently published results based on a single event seen in 6 trillion kaon decays.

Particle physics at another AGS experiment was also recently in the spotlight with a new measurement of the muon’s magnetism, which appeared at first to be at odds with the Standard Model. A closer look, however, showed that the discrepancy lay in the theoretical calculation rather than in the Brookhaven measurement. Science proceeds as an ongoing debate between experiment and theory, but in modern-day particle physics it is rare for experiment to lead the discussion.

National light source

Synchrotron radiation at Brookhaven could not come with a better pedigree. The phenomenon was predicted in the 1940s by, among others, the influential Brookhaven accelerator physicist John Blewett, who was then working for the General Electric Company. It was not until 1978, however, that synchrotron light research first made an appearance at the laboratory. Then, when the Department of Energy recognized the need for a national, second-generation light source, Brookhaven was chosen as the site. The National Synchrotron Light Source (NSLS) produced its first light in 1982 from a vacuum ultraviolet ring. An X-ray ring came on stream a few years later, and between them these two synchrotrons provide X-ray, ultraviolet, visible and infrared light to around 100 beamlines.

For the future, a proposed Center for Functional Nanomaterials will complement the NSLS. This will provide researchers with tools to make and study functional nanoscale materials. Functional materials are those that exhibit a predetermined chemical or physical response to external stimuli. The centre aims to achieve a basic understanding of how these materials respond when in nanoscale form. Nanomaterials offer different chemical and physical properties from bulk materials and have the potential to form the basis of new technologies.

Accelerating to the future

Today, Brookhaven is a laboratory relying heavily on its accelerator facilities for particle and nuclear physics as well as synchrotron radiation research. Keeping an eye on the future of these fields, the laboratory maintains an accelerator test facility (ATF) with a mission to explore new ideas on how to accelerate particles to higher energies and produce X-ray beams of greater brightness than ever. The ATF puts a range of accelerator and laser components at the disposal of a user community investigating the possibilities of novel acceleration techniques that will be necessary in the long term as experimental demands outstrip the possibilities of current-day technology.

The other main string to the fledgling Brookhaven laboratory’s bow was reactor-based physics. The laboratory’s first reactor – the Brookhaven Graphite Research Reactor (BGRR) – was developed at the same time as the Cosmotron. When it came on stream in 1950, it was the first peacetime reactor to be built in the US after the Second World War. Its dual mission was to produce neutrons for experiments and to refine reactor technology. The BGRR pursued a more applied line of research than its sister facility, the Cosmotron, leading, among other things, to the development of multigrade motor oils through the study of wear in engine components.

Reactor technology moved on and by the late 1950s, Brookhaven embarked on the construction of a new reactor capable of delivering much higher neutron fluxes than the BGRR. The High-Flux Beam Reactor (HFBR) produced its first self-sustaining reaction in 1965. Three years later the BGRR was shut down. HFBR research covered topics as diverse as basic nuclear physics and the development of radioactive isotopes for use in medicine.

The HFBR’s illustrious scientific career was marred by an unfortunate end in 1997 when a tritium leak was discovered at Brookhaven, leading to the most delicate period of the laboratory’s history. The tritium came from a leak in the HFBR’s spent-fuel pool and had remained undetected for many years. Careful sampling showed that the leak was confined and posed no danger to Brookhaven employees or the public. Brookhaven, however, found itself in the spotlight, both locally and globally, and implemented a strongly proactive community and media relations programme. Its image recovered, but too late for the HFBR, which has been permanently shuttered since 1999. The closure of a smaller reactor dedicated to medical research in 2000 marked the end of reactors at Brookhaven.

To physicists, Brookhaven is best known for its accelerator and reactor-based research, but as a multidisciplinary laboratory it also supports major programmes in life sciences and energy research. It was at Brookhaven that Lewis Dahl first identified the link between salt and high blood pressure in 1952. Also in the 1950s, Brookhaven scientists Walter Tucker and Powell Richards developed technetium-99m, the world’s most commonly used medical tracer. In the 1960s, George Cotzias began a programme of research at Brookhaven that led to the use of L-dopa in the treatment of Parkinson’s disease.

In energy research, one of the laboratory’s highlights was triggered by the 1973 OPEC oil embargo when the US government turned to the laboratory for energy conservation solutions. This led to the Brookhaven energy house – a design concept aimed at reducing energy consumption in a family home simply by using conventional technology wisely. Built in 1980, the house uses solar energy and thermal storage to achieve dramatic energy savings. Its design has been widely imitated.

Today, Brookhaven is managed by Brookhaven Science Associates. It hosts a thriving research community at its flagship facility, RHIC. In particle physics, the laboratory is also the focal point for US participation in the ATLAS experiment at CERN. Brookhaven’s proud tradition in accelerator physics continues at the ATF, and the NSLS supports a thriving user community of some 2500 researchers. Brookhaven has also become a focus for the local community. Its mere presence on Long Island put more than $24 million into the local economy in 2001. More importantly for the laboratory and for the image of science locally, Brookhaven has become a centre of culture. A visit to its Web site at the time of writing revealed not only news about the molecular structure of cancer-related proteins unravelled at the NSLS, but also about an extravaganza of gospel music in an auditorium more used to the somewhat more sober proceedings of scientific colloquia.
