by François Vannucci. Published in French by EDPSciences, ISBN 2 86883 559 7, 18.
François Vannucci presented me with his manuscript as his first detective story. I am not surprised that his literary debut takes this form. Vannucci’s early career at CERN was followed by a stint at the Stanford linear accelerator. He has been lucky enough to have worked on experiments led by great figures of science in research leading to key discoveries.
That Vannucci is bent wholly on the pursuit of rare game comes as no surprise, and neutrinos with their mysteries were an ideal hunting ground. The nature of the prey lent flavour to the chase. As the secrets of each particle discovered were revealed, they shed new light on the structure of matter at the smallest scale, and on the micro-instants that followed the Big Bang.
Research went on in powerful groups, sometimes numbering several hundred, with ever-more burgeoning budgets. The sociology of this special world had little in common with the atmosphere prevailing in the small university labs of yesteryear. The groups were often led by outstanding physicists whose laurels had often been won as a result of remarkable discoveries – frequently the combined fruit of a great theoretical background and an encyclopaedic knowledge of engineering methods.
But humans being humans, with a genetic baggage built up through scores of millennia of the struggle for life, high-energy physics has included in its ranks the same numbers of men of intelligence, geniuses, madmen, the generous, the envious, the selfish, the disinterested, the brutes, the power-hungry, the poets, the mystics and the cynics as any other group of humanity swept up in any adventure on an equivalent scale. Vannucci introduces us to the way in which the never-ending human comedy is played out at any elementary particle research centre. He brings us into a little world of Parisian physicists headed by a boss with boundless ambition and a fascination for neutrinos. This individual suffers from the shortcomings often found in such people – he is the monster whom no boss would admit to being, yet many ranking physicists will find he has features that smack of their own bosses.
In this book, the research ends in a fiasco that is both material and social. The writer draws a ferocious and desperate fictional portrait of the lives of this team, worn out by failure and disappointed by a leader who had nonetheless fascinated them. Some are still neurotically attached to their boss despite the blind alley that he has led them into. He hangs himself, and we anxiously follow the narrator’s efforts to escape the spell that he has cast.
I hope Vannucci’s new-found narrative gift will persuade him to inform the public of other secrets from the world of the physicist.
by William H Cropper, OUP, ISBN 0 19 513748 5, £24.95.
Physics is the study and formulation by physicists of how nature works. Without physicists, nature would still work but there would be nothing to describe it. Few, even among the physics community, know much about physicists, other than some hype about cult figures like Einstein, Feynman and Hawking.
Only a handful of geniuses, active at a time when their talents can bear fruit, can achieve the milestone discoveries or reveal the new insights that make science history. Here, William Cropper provides biographical snapshots of 30 famous physicists (in 29 chapters – Erwin Schrödinger and Louis de Broglie share a bed), extending through time from Galileo to Hawking, who was born on 8 January 1942, exactly 300 years after Galileo’s death. Hawking himself has remarked on this coincidence, and the two dates neatly bracket the span of the book as a whole.
The portraits are drawn from standard biographies, and those who are acquainted with these works will find nothing new. As Cropper explains in his preface, “No claim is made that this is a comprehensive or scholarly study…Read these chapters casually and for entertainment, and learn the lesson that science is a human endeavour.”
The first section covers the giant figures of Galileo and Newton, who centuries later still tower over the subject. Subsequent sections deal with thermodynamics (from Carnot to Nernst); 19th-century electromagnetism (Faraday and Maxwell); statistical mechanics (Boltzmann alone); relativity (Einstein supreme); quantum mechanics; nuclear physics (Curie, Rutherford, Meitner, and Fermi); particle physics (Dirac, Feynman and Gell-Mann); and astronomy, astrophysics and cosmology (Hubble, Chandrasekhar and Hawking). Most of the book, therefore, deals with 20th-century figures.
The cast of characters is Cropper’s choice and spans the whole spectrum of personality and destiny: tragic figures like Boltzmann, victims of fate like Meitner and Planck, ascetics like Dirac, the flamboyant Feynman, intellectual aristocrats like Gell-Mann and simple geniuses like Rutherford.
The book’s subjects include two women (Curie and Meitner) but are mainly confined to Europe and North America. The exceptions are Chandrasekhar, born in India, who spent his research career in Europe and the US; and Rutherford, born in New Zealand, who spent his research career in England and Canada. There are no Russians, which is a pity considering the wealth of contributions that scientists in that country have made to physics – contributions well recognized by Nobel awards.
Each biographical snapshot is prefaced by a useful summary, before a fuller account and an appraisal of the science (including some assimilable equations). Each is also labelled by a thumbnail portrait illustration, but otherwise there are no photographs of events (other than a bubble chamber). There is a separate chronology of the main events of the period covered, but there is no systematic indication of exact dates of birth and death, such as in Asimov’s work.
However, Cropper has done physics a great service by compiling this book, which compresses between two covers valuable material that would otherwise need a small library.
edited by Graham Farmelo, Granta Books, ISBN 1 86207 479 8, £20.
In this lively volume of semipopular essays, 12 leading scientists, historians of science and science writers discuss “beautiful” equations of 20th-century science. Some of the essays are elegant and revealing discourses centring on the equations themselves; others are equally interesting but more historical in nature, sometimes verging on the biographical. Almost all are accessible to a broad audience with a little scientific background.
Roger Penrose and Frank Wilczek thoughtfully discuss the meaning of Einstein’s equations of general relativity and Dirac’s equation respectively. Steven Weinberg, in his extended afterword, also discusses the Dirac equation, and both Wilczek and Weinberg focus on how the equation has survived despite our significantly altered understanding of its meaning since Dirac’s time.
The possibly less well known, but certainly beautiful, equations of Yang-Mills theory (as well as such topics as the Higgs mechanism) are also nicely introduced by Christine Sutton. Igor Aleksander provides a rewarding piece on Claude Shannon’s great work founding information theory, and John Maynard Smith discusses some fascinating aspects of the theory of evolution (including his own use of the theory of games in evolution theory, to which this essay provides a good introduction), while Robert May introduces the deceptively simple logistic equation with its chaotic solutions.
The essays from Graham Farmelo, Peter Galison, Aisling Irwin and Arthur Miller are also stimulating. Since they are less centred on the equations, they leave more room for dispute. For example, Arthur Miller remarks near the end of “Erotica, aesthetics and Schrödinger’s wave equation” (Schrödinger’s erotic life is endlessly fascinating to historians) that “the Heisenberg-Schrödinger dispute…was fundamentally one of aesthetic choice”, and he points out that physicists use Schrödinger’s formalism rather than Heisenberg’s matrix mechanics for aesthetic reasons. But Born’s great work on the probability interpretation showed that Schrödinger’s interpretation of the wavefunction was incorrect, giving, for example, no understanding of the interference terms in a sum of wavefunctions. Furthermore, surely the reason Schrödinger’s wavefunction (given the correct interpretation) is so popular is that it is easier to use than matrix mechanics, and that it stimulates visualization in the reader, which ultimately leads to suggestions for applications.
Surprisingly, the contents include an essay on Drake’s equation. This is the formula for the number of technological civilizations in our galaxy, depending on such things as the rate of star formation, the likelihood of intelligent life evolving and, least knowable of all, the typical lifespan of a technological civilization. This sums up this collection nicely – you can expect to be entertained and informed in equal measure, often by surprise, and hopefully its success will lead to a second volume.
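For readers tempted to try it, the Drake equation is nothing more than a product of seven factors, which makes its character easy to explore in a few lines of code. The Python sketch below uses purely illustrative parameter values (every one of them is debatable, and the variable names are this sketch’s own, not a standard):

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Drake equation: N = R* * fp * ne * fl * fi * fc * L.

    R_star: rate of star formation in the galaxy (stars/year)
    f_p:    fraction of stars with planets
    n_e:    habitable planets per planetary system
    f_l:    fraction of those on which life arises
    f_i:    fraction of those developing intelligence
    f_c:    fraction of those producing detectable technology
    L:      lifespan of a technological civilization (years)
    """
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Illustrative (and hotly debated) values
N = drake(R_star=1.0, f_p=0.5, n_e=2.0, f_l=0.5, f_i=0.1, f_c=0.1, L=10_000)
print(N)  # 50.0 civilizations with these assumptions
```

Cutting the civilization lifespan L from 10,000 years to 100 drops the answer from 50 to 0.5 – a neat demonstration of why the least knowable factor dominates the result.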
Elementary particles are the universe’s simplest constituents, but their interactions are far from simple. When two elementary particles collide, all sorts of things can happen. Viewed through the eye of a big detector, the outcome is usually a convoluted maze of secondary particles and it is difficult to see at first or even at nth glance what is going on. Usually the experiment’s computers have to use complex pattern recognition procedures to join up the individual read-out “dots” and reveal the underlying particle tracks. Even then, additional complex analysis is needed.
Occasionally, however, the interactions recorded by the detector are particularly simple, especially for collision scenarios like those at CERN’s LEP collider, which from 1989 to 2000 threw high-energy beams of electrons and positrons together.
Electrons and positrons, unlike protons, are truly elementary and contain no constituents (at least as far as we know). They are also particle and antiparticle of each other, and they can mutually annihilate to produce another particle-antiparticle pair, such as two oppositely charged muons. These rare but simple processes provide a direct window into the most basic interactions of nature.
The other tool that can be brought to bear is what CERN researcher Erik Johansson calls “topology”. The computers’ pattern-recognition programs can reveal regularities in the way the produced particles emerge. These patterns reflect the elementary particle interactions in a direct way.
If an underlying scheme is not clear, information quickly becomes baffling and incomprehensible. An example is the subway system of a large city like London or Paris. Street maps of big cities are fine for finding one’s way around on foot. They also usually show where the underground stations are, but this does not make clear how the lines are arranged, so it is not easy to plan an underground journey from such a map.
The key is to use a different map, in which the streets have been thrown away and the station dots are connected by different-coloured lines. Immediately, everything becomes clear – to get from A to B, take the red line and transfer at C onto the blue line. The detailed paths taken by the underground lines are not of vital importance to the traveller, only their general direction and interconnections, so such maps are often schematic. This simplification is a great help to understanding.
So it is with elementary particles. The most versatile elementary particles are the quarks, of which there are six varieties, or “flavours”, arranged in three pairs – up and down; strange and charmed; and beauty (or bottom) and top. Quarks are (as far as we know) the ultimate layer in the structure of matter. Substances are made up in turn of molecules, then atoms, then electrons and nuclei, the last being composed of protons and neutrons, and these in turn being built of quarks.
Quarks are different from all of the other constituents of matter. Molecules can be broken into atoms, atoms into electrons and nuclei, and nuclei into protons and neutrons. We know that protons and neutrons are built of quarks, but quarks cannot be isolated. Although we can see that protons or neutrons each contain three quarks, under ordinary conditions, quarks cannot exist on their own as free particles. How, then, can we explore them?
Mapping quark jets
When an electron and a positron annihilate, one possibility is for them to create a quark-antiquark pair (in fact the electron and positron first annihilate into a neutral Z boson, which within 10⁻²⁴ s materializes into the quark-antiquark pair). However, because quarks and antiquarks cannot exist as free particles, the emergent quark-antiquark pair manifests itself as two narrow sprays (“jets”) of subnuclear particles. These jets, the progeny of the produced quarks, fly off back-to-back, each confined around the direction of its parent quark. Mapping these jets thus reveals how the parent quarks were produced.
An electron and a positron can also produce other quark and antiquark arrangements, both with and without accompanying gluons (the particles that transmit the forces between quarks). Each quark-gluon arrangement produces a characteristic jet pattern. For example, a quark-antiquark pair that is accompanied by a single gluon will produce three jets of subnuclear particles.
These jet patterns, particularly when emphasized by the use of colour, are the quark-gluon physics equivalent of the subway map.
It is rather like monitoring an underground/subway/metro system by watching how the passengers emerge above ground. A burst of passengers means that a train has recently stopped underground.
Johansson’s idea is to select LEP events and prepare them in a way that appeals to 15- to 18-year-old students. He uses electron-positron interactions recorded by the Delphi detector at LEP and presented via a special “Hands-on CERN” Web site. The Web site also contains introductory material and explanations, together with supplementary material about subjects such as particles and their interactions, particle accelerators and Nobel Prizes. The interactions have been “cleaned up” to delete information that is only of interest to physics researchers and optimized for Internet access.
From these displays, students can quickly see how nature works at the most fundamental level. The displays show basic information, such as collision energy, the various particles produced and their energy and momentum. Analysing these collisions does not require any knowledge of quantum chromodynamics or any other exotic concept. Simple billiard-ball kinematics, with maybe a pinch of relativity, is all that is needed.
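As a concrete illustration of that billiard-ball-plus-relativity analysis, the Python sketch below (an illustration only, not part of the Hands-on CERN package) reconstructs the mass of a parent particle from the energies and momenta of two produced particles – exactly the quantity a student would extract from a muon-pair event display:

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system.

    Each particle is a four-vector (E, px, py, pz) in GeV;
    the result is sqrt((E1+E2)^2 - |p1+p2|^2).
    """
    E = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

# Two back-to-back muons from a LEP event at the Z peak
# (the muon mass is negligible at these energies, so |p| is taken equal to E)
mu_plus = (45.6, 45.6, 0.0, 0.0)
mu_minus = (45.6, -45.6, 0.0, 0.0)
print(invariant_mass(mu_plus, mu_minus))  # 91.2 GeV - the Z boson mass
```

The momenta cancel while the energies add, so the two 45.6 GeV muons point straight back to a 91.2 GeV parent: relativity enters only through the energy-momentum relation, the rest is bookkeeping.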
In this way, students all over the world can access frontier research data, but nothing can substitute for a visit to a major accelerator laboratory. Only in this way can students fully appreciate what large and complex instruments are needed to probe the smallest constituents of matter. Special programmes of lectures and study are regularly arranged for Swedish students by the Swedish Research Council secretariat at CERN along with Swedish CERN researchers. During a short stay at CERN, students get first-hand experience of how science works. “That day I thought I found the Higgs boson,” remarked one recent visitor.
Following a suggestion from Johansson, an extra dimension was recently added to a visit during which students from the UK joined their Swedish counterparts at CERN – at a time in their careers that is useful for future networking. These special programmes aim to rectify the lack of exposure to modern physics in many school curricula.
Further information
A keyhole to the birth of time
Building on Johansson’s original idea, CERN’s James Gillies and Richard Jacobsson produced an educational CD-ROM physics analysis package for schools. Particle Physics – a Keyhole to the Birth of Time contains the same real data and analysis tools as the “Hands-on CERN” Web site and comes complete with a comprehensive and visually attractive tutorial package. The authors’ goal is to provide a stand-alone product that teachers can use in class without detailed prior knowledge of particle physics. Bielefeld computer scientist Olaf Lenz designed an easily navigable structure and award-winning American cartoonist Nina Paley created the characters – Malard Decoy and Phyllis Ducque – who act as guides through the content of the CD-ROM. The package is available free of charge to schoolteachers on request to the authors at CERN (e-mail James Gillies or Richard Jacobsson).
Brookhaven, funded today by the US Department of Energy, was born of the dreams of scientists returning from Los Alamos after the Second World War. They were looking for facilities to continue their research into the mysteries of the atom, and they were unable to find them at their home universities. Soon, championed by Columbia physicists Isidor Rabi and Norman Ramsey, the idea of universities coming together to build a common research institute began to take shape. In 1947, nine north-eastern US universities clubbed together to form Associated Universities, Inc. with the goal of establishing a laboratory, and the model for many of today’s major laboratories, including CERN, was set.
Man-made cosmic rays
Not long after, plans for the Cosmotron – Brookhaven’s first particle accelerator – were laid. Taking its name from the cosmic rays that constantly shower down on Earth, the Cosmotron was the first accelerator to break the giga-electronvolt barrier, reaching energies as high as 3.3 GeV before it was switched off in 1966. The fact that it was also the first accelerator in the world to provide an extracted beam led to the Cosmotron being dubbed the world’s biggest slingshot by Popular Science magazine.
Scientifically, the Cosmotron lived up to its name, allowing all kinds of particle formerly seen only in cosmic rays to be studied in the laboratory. It was also the machine behind Brookhaven’s first Nobel prize. Two guest scientists working at the laboratory in 1956 – T D Lee and C N Yang – interpreted Cosmotron data as arising from parity violation in weak interactions, earning themselves a trip to Stockholm just one year later.
It is perhaps to accelerator physics that the Cosmotron left its greatest legacy, however. From the start, the Cosmotron’s builders recognized the limitations of synchrotrons as they were then built. In such machines, increasing particle energy was invariably accompanied by increasing orbit instability in the horizontal plane. To build a more powerful machine would require more powerful – and vastly heavier – magnets, imposing a practical upper limit on the energy achievable. The solution, developed by Ernest Courant, Stanley Livingston and Hartland Snyder in the 1950s, was to alternate the sign of the magnets’ field gradient around the ring, so that the beam was alternately focused and defocused in the horizontal plane. This principle became known as strong focusing, and it opened the door to much higher energies.
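Why alternating gradients stabilize the beam can be seen in a standard thin-lens simplification (a textbook sketch, not Courant and Snyder’s original analysis). The Python snippet below multiplies the 2×2 transfer matrices of one focus–drift–defocus–drift cell; the motion is stable only while half the trace of the cell matrix lies between −1 and 1, which for lens spacing L requires a focal length f greater than L/2:

```python
def mmul(a, b):
    """Multiply two 2x2 transfer matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def fodo_half_trace(f, L):
    """Half-trace of a thin-lens FODO cell: focus, drift, defocus, drift.

    f: lens focal length, L: drift length between lenses (same units).
    Horizontal motion is stable when the result lies in [-1, 1];
    analytically it equals 1 - L**2 / (2 * f**2).
    """
    focus = [[1.0, 0.0], [-1.0 / f, 1.0]]    # horizontally focusing lens
    defocus = [[1.0, 0.0], [1.0 / f, 1.0]]   # horizontally defocusing lens
    drift = [[1.0, L], [0.0, 1.0]]           # field-free drift space
    # Rightmost matrix acts first: focus, then drift, defocus, drift
    cell = mmul(drift, mmul(defocus, mmul(drift, focus)))
    return (cell[0][0] + cell[1][1]) / 2.0

print(fodo_half_trace(1.0, 1.0))  # 0.5 -> stable (f > L/2)
print(fodo_half_trace(0.4, 1.0))  # about -2.1 -> unstable (f < L/2)
```

A lens that defocuses in the horizontal plane focuses in the vertical one, so the same alternation stabilizes both planes at once – the essence of why strong focusing allowed lighter magnets and higher energies.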
By this time, Europe’s new laboratory, CERN, was getting off the ground. It was founded on the Brookhaven collegiate model with member states taking the place of Brookhaven’s universities. Links between the two laboratories were close and news of the strong focusing idea reached the European laboratory in time for it to recast its new proton synchrotron (PS) as a strong focusing machine. The CERN PS duly became the first operational, strong-focusing proton synchrotron in the world with a design energy of 25 GeV instead of the 10 GeV that would otherwise have been possible. Brookhaven’s Alternating Gradient Synchrotron (AGS) came on stream soon after and these two machines remain at the heart of the two laboratories’ accelerator complexes to this day.
The AGS has provided a rich harvest of physics for Brookhaven, earning the Nobel prize three times. Leon Lederman, Melvin Schwartz and Jack Steinberger had to wait until 1988 to receive the prize for their 1962 discovery of the muon neutrino. James Cronin and Val Fitch had a shorter wait, receiving the call to Stockholm in 1980 for their 1963 observation of CP-violation. Sam Ting picked up the prize for his 1974 discovery of the J/psi particle, along with Burton Richter of California’s SLAC laboratory, just two years later.
Physics in collision
Flush with the success of the AGS, Brookhaven set its sights high, and in 1970 accelerator physicist John Blewett revived an earlier idea of building a machine to store and collide proton beams. His plan, named project ISABELLE, was to build a pair of intersecting storage rings using the AGS as the injector. R&D for the new machine soon got under way and a ground-breaking ceremony was held in 1978. Soon after, however, a mixture of technical problems and changing political winds led to ISABELLE being dropped in favour of an even more ambitious project elsewhere – the Superconducting Super Collider – the downfall of which was later to send shockwaves around the world’s particle physics community.
Brookhaven persevered and was soon back with a new proposal to build a relativistic heavy-ion collider (RHIC) on the ISABELLE site. RHIC’s main aim would be to seek out and explore the exotic states of matter produced when heavy ions collide at huge energy densities. Using the AGS as the injector, RHIC would also serve as a proton-proton collider with the ability to collide polarized protons, helping to unravel the long-running mystery of nucleon spin. This part of the programme led to the establishment of a joint research initiative between Brookhaven and the Japanese Institute of Physical and Chemical Research (RIKEN) in 1995. The RIKEN-Brookhaven Research Center is now involved in the full RHIC programme and is also building a high-performance supercomputer for lattice QCD. Set to start up in March 2003, the new machine will reach 10 teraflops for the modest price tag of $5 million (about 50 cents per megaflop).
RHIC was switched on in 2000 following a decade of development and construction. The timing dovetailed perfectly with the fixed-target, heavy-ion programme at CERN and provided a new focus for this line of research. RHIC’s first polarized protons were injected in December last year. With the AGS fixed-target research programme running concurrently with RHIC for the first time in 2001, Brookhaven’s accelerator-based programme is in rude health. The major thrust of the AGS programme is a long-running rare kaon decay experiment, which recently published results based on a single event seen in 6 trillion kaon decays.
Particle physics at another AGS experiment was also recently in the spotlight with a new measurement of the muon’s magnetism, which appeared at first to be at odds with the Standard Model. A closer look at the theory, however, showed that it was the Standard Model that was at odds with the Brookhaven measurement. Science proceeds as an ongoing debate between experiment and theory. However, in modern-day particle physics it is rare that experiment leads the discussion.
National light source
Synchrotron radiation at Brookhaven could not come with a better pedigree. The phenomenon was predicted in the 1940s by, among others, the influential Brookhaven accelerator physicist John Blewett, who was then working for the General Electric Company. It was not until 1978, however, that synchrotron light research first made an appearance at the laboratory. Then, when the Department of Energy recognized the need for a national, second-generation light source, Brookhaven was chosen as the site. The National Synchrotron Light Source (NSLS) produced its first light in 1982 from a vacuum ultraviolet ring. An X-ray ring came on stream a few years later, and between them these two synchrotrons provide X-ray, ultraviolet, visible and infrared light to around 100 beamlines.
For the future, a proposed Center for Functional Nanomaterials will complement the NSLS. This will provide researchers with tools to make and study functional nanoscale materials. Functional materials are those that exhibit a predetermined chemical or physical response to external stimuli. The centre aims to achieve a basic understanding of how these materials respond when in nanoscale form. Nanomaterials offer different chemical and physical properties from bulk materials and have the potential to form the basis of new technologies.
Today, Brookhaven is a laboratory relying heavily on its accelerator facilities for particle and nuclear physics as well as synchrotron radiation research. Keeping an eye on the future of these fields, the laboratory maintains an accelerator test facility (ATF) with a mission to explore new ideas on how to accelerate particles to higher energies and produce X-ray beams of greater brightness than ever. The ATF puts a range of accelerator and laser components at the disposal of a user community investigating the possibilities of novel acceleration techniques that will be necessary in the long term as experimental demands outstrip the possibilities of current-day technology.
The other main string to the fledgling Brookhaven laboratory’s bow was reactor-based physics. The laboratory’s first reactor – the Brookhaven Graphite Research Reactor (BGRR) – was developed at the same time as the Cosmotron. When it came on stream in 1950, it was the first peacetime reactor to be built in the US after the Second World War. Its dual mission was to produce neutrons for experiments and to refine reactor technology. The BGRR pursued a more applied line of research than its sister facility, the Cosmotron, leading, among other things, to the development of multigrade motor oils through the study of wear in engine components.
Reactor technology moved on and by the late 1950s, Brookhaven embarked on the construction of a new reactor capable of delivering much higher neutron fluxes than the BGRR. The High-Flux Beam Reactor (HFBR) produced its first self-sustaining reaction in 1965. Three years later the BGRR was shut down. HFBR research covered topics as diverse as basic nuclear physics and the development of radioactive isotopes for use in medicine.
The HFBR’s illustrious scientific career was marred by an unfortunate end in 1997 when a tritium leak was discovered at Brookhaven, leading to the most delicate period of the laboratory’s history. The tritium came from a leak in the HFBR’s spent-fuel pool and had remained undetected for many years. Careful sampling showed that the leak was confined and posed no danger to Brookhaven employees or the public. Brookhaven, however, found itself in the spotlight, both locally and globally, and implemented a strongly proactive community and media relations programme. Its image recovered, but too late for the HFBR, which has been permanently shuttered since 1999. The closure of a smaller reactor dedicated to medical research in 2000 marked the end of reactors at Brookhaven.
To physicists, Brookhaven is best known for its accelerator and reactor-based research, but as a multidisciplinary laboratory it also supports major programmes in life sciences and energy research. It was at Brookhaven that Lewis Dahl first identified the link between salt and high blood pressure in 1952. Also in the 1950s, Brookhaven scientists Walter Tucker and Powell Richards developed technetium-99m, the world’s most commonly used medical tracer. In the 1960s, George Cotzias began a programme of research at Brookhaven that led to the use of dopamine in the treatment of Parkinson’s disease.
In energy research, one of the laboratory’s highlights was triggered by the 1973 OPEC oil embargo when the US government turned to the laboratory for energy conservation solutions. This led to the Brookhaven energy house – a design concept aimed at reducing energy consumption in a family home simply by using conventional technology wisely. Built in 1980, the house uses solar energy and thermal storage to achieve dramatic energy savings. Its design has been widely imitated.
Today, Brookhaven is managed by Brookhaven Science Associates. It hosts a thriving research community at its flagship facility, RHIC. In particle physics, the laboratory is also the focal point for US participation in the ATLAS experiment at CERN. Brookhaven’s proud tradition in accelerator physics continues at the ATF, and the NSLS supports a thriving user community of some 2500 researchers. Brookhaven has also become a focus for the local community. Its mere presence on Long Island put more than $24 million into the local economy in 2001. More importantly for the laboratory and for the image of science locally, Brookhaven has become a centre of culture. A visit to its Web site at the time of writing revealed not only news about the molecular structure of cancer-related proteins unravelled at the NSLS, but also about an extravaganza of gospel music in an auditorium more used to the somewhat more sober proceedings of scientific colloquia.
by Ray Mackintosh, Jim Al-Khalili, Björn Jonson and Teresa Peña, Canopus Publishing, ISBN 0953786838, £14.95.
A lot of care and attention have gone into this attractive book. The authors are all eminent nuclear physicists who have developed an interest in outreach and public awareness of science, and it shows. Around their text, the book has been tastefully designed and illustrated, depicting the quest to unravel the ultramicroscopic structure of matter, particularly during the 20th century.
The keynote is the book’s nuclear physics standpoint. Fundamental particles and their constituent quarks and gluons are only mentioned in passing, but this is no obstacle.
Physics is a collection of natural phenomena, but it needs physicists to interpret it and to make it understandable, and the book continually underlines the role played by pioneers like Rutherford, Hofstadter and Mottelson. After introducing the nuclear structure of our everyday world, it goes on to point out a wider nuclear landscape – the wealth of synthetic unstable isotopes and their production, behaviour and properties.
As well as its ominous implications for warfare and its still-considerable contribution to power supply, nuclear physics has a range of applications in medicine, industry, the environment and even the home, essentially through the manufacture of radioisotopes, which again are well documented and illustrated in this book.
Nuclear physics provides a prolific source of power on many different scales. At the beginning of the 20th century, physicists did not even understand what made the Sun shine. The subsequent understanding of the role of nuclear mechanisms in astrophysics evolved slowly through the work of major figures like Hans Bethe and William Fowler. At this point in the story, the book strangely deviates from its policy of presenting cameo portraits of key researchers.
Modern cosmologists try to work out what happened in the first tiny fraction of a second after the Big Bang. The universe had reached the ripe old age of one second before any nuclei appeared on the scene. Having carefully traced the role of nuclei back to this entry point, the book ironically ends.
There are a few minor errors – Rutherford is introduced twice and a bubble chamber photograph of the discovery of the positron is upside down. Perhaps the advertised selling price is another error. How can such an attractive book be made available so cheaply? Nucleus is one of the first books to be produced by Canopus, a new force in popular science publishing. Buy it while stocks last.
by G C Lowenthal and P L Airey, Cambridge University Press, ISBN 0521553059.
At a time when applications of nuclear and particle physics are gaining ground in environmental sciences, life sciences, nuclear energy and materials research, this book is a welcome new arrival. Mainly it addresses practical applications in industry, but it also covers medical and energy sciences. The development of novel concepts in radiation detection and particle accelerators for basic subatomic physics, with potential for future applied use, make it particularly timely.
The first half of the book is dedicated to the basics of nuclear physics for non-expert readers. For this reason, some of the latest phenomena are not covered and the terminology is not always in line with that of the most recent literature of nuclear science. The book provides information for practical work with radionuclides, including the basics of radioactive decays, interaction of radiation with matter and radiation detectors. The guiding principles for working with radioactive sources in both industrial and laboratory environments are well covered. Procedures for estimating dose rates in different environments are also discussed. Measurements and results receive attention, with the book providing guidance for data analysis from radiation measurements.
Applications in industry and the environment are covered in the second half of the book. There is a detailed description of techniques based on the interaction of radiation with matter, using examples covering transmission, scattering, absorption and activation by beta particles, protons, alpha particles, photons and neutrons. Applications discussed include paper analysis, moisture meters, neutron radiography, multi-element analysis, sterilization and polymerization. Radiotracer techniques are also broadly covered, with detailed formulation in flow measurements with radioactive tracer isotopes. Radionuclides in the environment are covered, both for naturally occurring and for man-made radioisotopes.
In short, this book is a sound addition to the limited literature dealing with applications of radioactivity and radionuclides. It also serves as a useful reference source for those working professionally with accelerators and radioisotopes.
by Peter K F Grieder, Elsevier Science, ISBN 0444507108, 190.59 euros/$207.
This book is a remarkable collection of graphs, tables, data and relevant discussions about cosmic-ray physics. As the subtitle, Researcher’s Reference Manual and Data Book, suggests, this is neither a text nor a tutorial, but a valuable resource for scientists in cosmic-ray research and related fields of physics and astrophysics.
In 1984 Peter Grieder of Bern co-wrote, with O C Allkofer of Kiel, a 379-page reference volume in the Physics Data series of the Karlsruhe Fachinformationszentrum (Cosmic Rays on Earth, ISBN 03448401), which contained much useful data and has been widely used. Following the death of his co-writer, Grieder has undertaken the daunting task of revising, updating and expanding the work by himself. In view of the significant quantity of new data that has appeared over the past two decades, this is most appropriate. The result is a comprehensive 1112-page hardcover volume.
Over the past two decades there has been a significant migration of physicists away from traditional accelerator-based particle physics into particle astrophysics (the current, more erudite, term for cosmic-ray physics). This is perhaps nowhere more apparent than in the formation and direction of major cosmic-ray collaborations by Jim Cronin and Sam Ting, both of whom are Nobel laureates in accelerator-based particle physics.
Mature physicists moving into cosmic-ray-related research from other areas should find this book a particularly valuable source of cosmic-ray knowledge. Of course, it is not intended to replace the role of classic texts such as Thomas Gaisser’s Cosmic Rays and Particle Physics. It covers cosmic rays in the atmosphere, at sea level, underground, underwater and under ice; the primary radiation; solar phenomena; and cosmic-ray history.
In view of the current lively interest in neutrino oscillations and the interpretation of data from Kamioka, Gran Sasso, Homestake, Soudan, Lake Baikal, Antarctica, the Mediterranean Sea and elsewhere, these detailed discussions are highly useful. They may also reflect Grieder’s close connections with the DUMAND and NESTOR underwater detector programmes.
There is a substantive discussion of, and data compilation related to, neutrinos, including both atmospheric and solar neutrinos, the latter as part of the chapter on heliospheric phenomena.
There are also comprehensive chapters in each section devoted to protons, neutrons, heavier nuclei, electrons, positrons and gamma rays (as well as muons and neutrinos). The existing data relevant to major problems and projects currently in progress are well presented. These include issues such as the primary cosmic-ray spectrum and composition above the GZK cut-off; the confusion concerning the elemental composition in the neighbourhood of the “knee”; questions surrounding the primary antiproton flux; and the limits on primordial antimatter cosmic rays. The author presents this vast accumulation of data without editorial prejudice, refraining from noting which data should be regarded as most reliable and which should be accepted with scepticism.
The latter portion of the book includes useful reference material, such as the “optical, etc properties of water and ice”; parameters of the atmosphere (pressure and temperature versus height above sea level); the “solar elemental abundances”; and tables and graphs of muon dE/dx versus energy in various substances. There is even a full-page table of the many cosmic-ray observing stations around the globe – past and present – together with the altitude and atmospheric overburden of each.
This book is certain to become a standard reference for scholars in the cosmic-ray community, as well as for students and for other physicists and astrophysicists whose activities interface with cosmic-ray issues. Certainly, many of this book’s graphs and tables will be superseded by more precise, forthcoming data from Superkamiokande, Kascade, ACCESS, IceCube, the Pierre Auger project and the many other ongoing and future cosmic-ray research activities, but its value as a reference will surely continue well into the 21st century – until someone else with Grieder’s breadth of comprehension and boundless energy steps forward to undertake another revision.
by Edmund Wilson, Oxford University Press 2001, ISBN 0198508298 (pbk), ISBN 0198520549 (hbk).
Having designed real accelerators himself and having spent the last nine years at the helm of the CERN Accelerator School, Edmund (Ted) Wilson is well placed to write this book, which is an excellent introduction to a fascinating field of activity.
The book provides students with an understanding of the basic physics of particle accelerators and conveys the flavour of their technology and applications. As such it fills a useful gap between journalistic descriptive works and landmark reference books, in that it treats the reader as an intelligent scientist or engineer, willing to invest some time in the understanding of the principles invoked, yet presents the information in an attractive and digestible form.
In this respect the introduction to the subject via the history of accelerators is certainly a good way to keep the reader interested while nevertheless introducing essential concepts. But, as we all know, to understand does not necessarily mean to learn, and the inclusion of a small set of exercises at the end of each chapter is an effective way of encouraging those who really want to learn about, rather than become simply acquainted with, the subject. Having the answers at the end of the book is a real encouragement to try the exercises.
After the historical introduction to the subject, the main body of the book is devoted to the behaviour of beams of particles and the methods that are used to focus, bend, accelerate and control them. In addition to classical linear theory, the mechanisms and problems associated with nonlinearities, resonances, space charge, instabilities and synchrotron radiation are all introduced.
There follows a well balanced description of the increasingly varied applications of these devices. The final chapter, outlining promising ideas for accelerating beams of particles that have not yet resulted in practical machines, should stimulate students who are interested in pursuing this path to adopt or invent new techniques to achieve ever more efficient machines.
This is not the only book on the subject, but it does serve as a well written and well balanced introduction – not only for students, but also for anyone drawn into the field in a related scientific, engineering or administrative capacity. The layout of the book is clear and the text is backed up with a wealth of good illustrations. Readers requiring a deeper insight into one aspect or another of the subject are invited to consult more specialized works, all of which are cited in the excellent bibliography.