Particle physics: a world without borders

Belorussian, Catalan, Taiwanese, Afrikaans, Japanese, Persian, Russian, Mandarin, Hebrew, Italian, Tagalog, Croatian, Malayalam, Serbian, German, Korean, Swedish, Cantonese, Turkish, Arabic, Romanian, Gujarati, Welsh, Georgian, Lëtzebuergesch…

These are just some of the more than 60 languages spoken by collaborators on Fermilab’s CDF and D0 experiments, as determined by a quick and utterly unscientific survey in mid-July. A poll of collaborations at CERN, DESY, KEK or any of the world’s particle physics laboratories would reveal a comparable population of polyglots – men and women from every corner of the world who have come together to explore the nature of matter and energy, space and time. Particle physics is truly one community, without borders.

Moreover, when it comes to advances in research at the world’s handful of particle physics laboratories, we are all in this together, for better or worse. When CERN gets a cold, Fermilab sneezes, and vice versa. At the moment, the particle physics world is watching as Fermilab struggles to fulfil the promise of Run II at the Tevatron; and CERN’s current LHC budget and schedule challenges have strong implications for the future of every physics laboratory. On a brighter note, a physics discovery at laboratory A inevitably builds upon work at laboratories B and C. It all comes together in one worldwide particle physics enterprise.

Yet while particle physics collaborations are international, particle physics communication is not. For the most part, each region and each laboratory communicates for itself, with little coordination on issues, strategies, resources and messages. Does a press release from one laboratory (my own, for example) trumpeting a new experimental result give more than a nod to the work at other laboratories that made the result possible? It’s doubtful. With difficult news to break, does one laboratory seek support from the others? Provide a clue that it’s coming? Not likely. In planning communication strategies, do communicators coordinate their efforts? Probably not. Global Communication Network? Forget it. When it comes to communication, every laboratory is an island.

Standard model of communication

It is high time particle physics communication caught up with the reality of particle physics collaboration. To achieve the kind of future that particle physicists everywhere would like for their field, the Standard Model of Physics Communication will have to change.

In December 2001, communicators from six of the world’s physics laboratories met at DESY in Hamburg to form a worldwide collaboration for physics communication. The immediate stimulus for the meeting was a message from Petra Folkerts, communication director at DESY, to Fermilab on 12 September 2001:

“I want to say that we are all with you in these days. I myself can’t find the right words to express my feelings after this terrible 11 September. From my point of view now it’s absolutely important that we, outreach people around the world, will meet as soon as possible, not only to figure out how to help international particle physics stay alive, but how we, in our field of activity, can set visible footprints for the significance of peaceful collaboration across all borders.”

The message gave impetus to a project that communicators at particle physics laboratories had pondered for some time, and led to the formation of an international laboratory communication council. Membership has grown to 10 laboratories from five countries.

Initial actions of the council include the development of a particle physics image bank comprising the best photographs and graphic resources from the world’s laboratories, appropriately captioned and credited – one-stop shopping for reporters, physicists, students, teachers and policy makers who need outstanding graphics to tell the particle physics story. The image bank will live on a new website – interactions.org – devoted not to the support of any one laboratory or region, but to all. Advance coordination of press releases among member laboratories has already begun, not only to enhance the recognition of discoveries wherever they occur, but also to foster the recognition of the interconnected nature of advances in particle physics throughout the world. The collaboration plans staff exchanges, workshops and panels at international physics conferences.

The time has passed when one laboratory or one sector of the particle physics field could profit at the expense of another. Progress at every laboratory and in every region depends on the success of particle physics everywhere. As the early American experimental physicist Benjamin Franklin told his colonial colleagues in 1776: “We must all hang together, or assuredly we shall hang separately.” The Quark Wars are over. The laboratory communication council represents a recognition of this reality by the world’s particle physics communicators. As Folkerts stressed in her message of September 2001, it is a collaborative endeavour.

Whether they speak Gujarati or Georgian, Swedish or Romanian, Tagalog or the Queen’s English, I hope that particle physicists everywhere will support this worldwide venture in physics communication.

Heavy Flavour Physics – Theory and Experimental Results in Heavy Quark Physics

edited by C T H Davies and S M Playfer, Institute of Physics Publishing, ISBN 0750308672, £40.00 (€ 63).

A graduate text based on lectures originally presented at the 55th Scottish Universities Summer School in Physics, held at St Andrews in 2001. The school was a NATO Advanced Study Institute.

Facing Up: Science and its Cultural Adversaries

by Steven Weinberg, Harvard University Press, ISBN 067400647X, £17.95 (€ 28).

These 23 essays, written by Steven Weinberg between 1985 and 1999, make a nice collection around the theme of reductionism. Each is preceded by a page or so describing its context, which is often a valuable addition to the main text of the essay. Professor Weinberg’s introduction to the set led me to believe that the book would be about facing up to the reality of a neutral universe: Tycho Brahe’s statue looking up to the sky is on the cover. The secondary title is more apt: the majority of the essays are in defence of the scientific approach to understanding our surroundings. Flaws in other approaches, especially constructionist ones, are pointed out.

Weinberg makes a strong case for reductionism. Phenomena can be explained in terms of other phenomena, but these explanations form a hierarchy that points clearly back to a theory of everything, as yet to be discovered. Physics is closest to this origin, and physicists are closing in.

Not being a physicist myself, I found many of the essays to be brilliant formulations of our understanding of physics, better than anything I have read before. Apart from two more or less political statements, which I felt were out of place, the collection is very homogeneous. However, this is also its weakness: points are necessarily repeated, and I will now certainly remember that the Standard Model has 18 parameters that we cannot yet calculate. From 1985 to 1999 many things happened in high-energy physics, such as the cancellation of the Superconducting Super Collider. Unless one knows the dates of these events, it is somewhat confusing for the non-physicist to follow the arguments, as there is neither a synopsis nor a statement of the current state of affairs.

One thing I am not so sure of is the “emergence” argument. According to Weinberg, apart from historical accidents (initial conditions), what we observe can be understood exclusively in terms of the hierarchy of explanations, with physics at the root. However, computer simulations (for example of neuronal systems) seem to indicate that more than one underlying “physics” can indistinguishably lead to the same behaviour, by construction. Does that not mean that the mathematics governing this behaviour is independent of those physics? Then there may be independent sciences after all.

My favourite essays are the one in which Weinberg takes the humorous view that non-physicists are somewhat odd, and the 19-page overview of the history of physics in the 20th century. The latter is by far the clearest article on the fundamental ideas behind relativity and quantum mechanics that I have encountered.

The argument that science advances and that it does so independently of the cultural background is certainly in agreement with my own limited experience. Wherever in the world you walk into a university you suddenly feel this, whether lunch is eaten with chopsticks or a totalitarian regime has just been shed.

I greatly enjoyed this collection – it makes me want an entire book in which Weinberg expands on the individual views rather than repeating them in the condensed form of the essays. We need more of this eminently clear exposition of how science works.

Europe coordinates astroparticle research

European astroparticle physics received a boost last year with the formation of the Astroparticle Physics European Coordination (APPEC), established in an agreement signed by funding agencies from France, Germany, Italy, the Netherlands and the UK. Astroparticle physics – which covers topics as diverse as cosmic rays, dark matter and gravitational radiation – falls between more traditional areas such as particle physics, nuclear physics and astronomy, and so can lose funding opportunities. Also, different countries have different ways of defining astroparticle physics. APPEC has been set up to promote co-operation within Europe’s growing astroparticle physics community, and to develop long-term strategies at the European level, in particular for funding.

APPEC’s activities are organized through two main committees: a steering committee, currently led by Jean-Jacques Aubert of the French CNRS; and a peer-review committee, chaired by Riccardo Barbieri of the Scuola Normale Superiore in Pisa. The steering committee, which meets twice a year and includes representatives from the initial partners, has already met in Berlin and London. One important action has been to begin work on a bid to the EU 6th Framework Programme for up to €20 million for an Integrated Infrastructure Initiative (I3). The committee also seeks to widen APPEC’s membership – Spain, for example, is joining, and other countries have been approached.

The peer-review committee, which also meets twice a year, aims to assess existing programmes in different areas of astroparticle physics, and to encourage future collaboration. The committee has already met twice, to review experiments in double beta decay and in dark matter. Its next meeting, in January 2003, will consider high-energy neutrino experiments.

David Wark from Sussex and RAL, who is one of the members of the steering committee, said: “I believe this is a positive step for astroparticle physics, as it can help bring some of the rigour, co-operation and international clout to astroparticle physics that organizations like CERN and DESY bring to accelerator physics. It will also help to get astroparticle physics projects judged using similar criteria in all the countries from which they require support.”

Aside from its committee meetings, APPEC will keep in touch with European astroparticle physicists throughout the year with an electronic newsletter and a website, to be launched later this year. In the meantime, to register interest in receiving the newsletter, please email sacquin@dapnia.cea.fr.

RIKEN and Brookhaven renew their vows

At a ceremony marking the beginning of spin physics at Brookhaven National Laboratory’s Relativistic Heavy Ion Collider (RHIC), the US laboratory renewed its collaboration agreement with Japan’s Institute of Physical and Chemical Research (RIKEN) for a further 5 years. Initially established in 1995, the RIKEN-BNL agreement has been instrumental in establishing the spin-physics programme at RHIC, and led to the establishment of the RIKEN-BNL Research Center (RBRC) in 1997.

Latin American physics centre celebrates 40th anniversary

The Rio de Janeiro-based Centro Latinoamericano de Física (CLAF) celebrated its 40th anniversary on 26 March. Founded under the auspices of UNESCO following the CERN model, CLAF was established to promote research in physics and provide postgraduate training to young physicists in the region. The physicists Juan José Giambiagi of Argentina, José Leite Lopes of Brazil and Marcos Moshinsky of Mexico were instrumental in its creation. Today, CLAF has 13 member states.

The celebrations took the form of a week-long international meeting in Rio that focused on CLAF’s international collaborations and looked at the long-term future of the centre. Speakers included Faheem Hussain of the International Centre for Theoretical Physics in Trieste, Italy, which has been involved in a programme of joint PhD work with Latin American institutions since 1997. Vladimir Kadyshevsky, director of the Dubna-based Joint Institute for Nuclear Research, and Pavel Bogoliubov, who is responsible for the institute’s international relations, spoke of a 4-year-old programme to train Latin American graduate students in Russia, and of the 25 MeV microtron built at Dubna to form the basis of a proposed regional laboratory in Havana, Cuba. CERN’s Juan Antonio Rubio, who is responsible for the laboratory’s education and technology transfer activity as well as links with Latin America, spoke of the agreement signed in 2001 between CERN and CLAF to organize a joint biennial school in Latin America. Staying with education, Ramón Pascual, former rector of the Universidad Autónoma in Barcelona, signed an agreement at the meeting allowing Latin American students to take part in the European Joint Universities Accelerator School.

Research was discussed by Ana María Cetto, coordinator of the Latin American Scientific Networks, who spoke of a project to foster greater Latin American use of observatories in Chile and the possibility of extending the facilities at Mount Chacaltaya in Bolivia.

Looking to the future, the meeting confirmed CLAF support for the second CERN-CLAF school, to be held in Mexico in 2003, and CLAF announced the creation of a biennial school in medical physics and synchrotron radiation. The centre is also promoting improvements in postgraduate education through the ICTP programme and by coordinating a regional Masters degree programme. In research, CLAF will assume a stronger role in coordinating Latin American efforts in medical physics, condensed-matter physics and optics. It will also examine the possibility of building a proton accelerator for cancer treatment. The meeting concluded with a request to the governments of Latin America to increase the percentage of GDP they spend on science and technology.

Globalization, collaboration and trust

Industry pushes economic globalization to strengthen its market position. The process is driven by the need and desire to increase efficiency and reduce costs, but also by the wish to make the best use of different competencies in different countries. Take the European aircraft industry: parts of planes, such as the wings, tail unit, body and engines, are built in different regions of Europe and finally assembled at one plant. This has enabled distributed regional industries to jointly play a major role on the international market. Yet people fear being at the mercy of anonymous pressures, and so increasingly oppose globalization.

Large-scale facilities in science are also increasingly tackled on a global scale. Radio-astronomers around the world have united behind the idea of jointly building their next project, ALMA, a merger of the major millimetre-array projects into one global project. Particle physics has for quite some time moved in the same direction: the large experiments have always been a role model for the shared construction of large equipment. The LHC is being built with components from around the world, like HERA before it.

Global challenge

To meet the challenges of the future, accelerator-based particle physics needs to become even more global than in the past. One possible concept, the Global Accelerator Network (GAN), was originally developed as a way to build a linear collider as an international collaboration, to make the best use of worldwide competence, ideas and resources, to maintain and foster the centres of excellence in accelerator physics around the world, and to root the linear collider as an international project firmly inside the national programmes (CERN Courier June 2000).

Global projects rely on collaboration. In the past, particle physicists have developed a culture of collaboration that has worked very successfully. Indeed, they had to do so to meet the scientific challenges. Collaborations function well if their leadership acknowledges the individuality and freedom of all the partners. They do not have a strong hierarchical structure, but are driven instead by a common scientific goal. They probably would not function with an industry-style management.

Therefore the question arises as to whether a model that works for experiments can be extended to accelerators. Or to put it differently: what is needed to make this model work for accelerators as well? These questions were studied by an ICFA working group in 2001, and are now being addressed within the framework of a series of workshops, the first of which, “Enabling the Global Accelerator Network”, took place in March at Cornell. This workshop dealt with technical aspects of the remote operation of facilities, which is a key ingredient of the shared operation of accelerators. No basic problems are expected here. In fact, the TESLA test facility has already been operated remotely from Italy and France.

On the other hand, it became clear at the workshop that the sociological aspects of such a joint endeavour are probably the true challenge. As the GAN concept is built on the principle of shared responsibility, the sharing of know-how and controls is also part of the concept. The laboratory at which the facility is located would therefore relinquish the project control it has traditionally held and become one equal partner among several. Mutual trust is the critical element required for such a collaboration to be successful.

It is well known that distributed organizations need to build up and maintain trust. Sharing working time from the very beginning is a powerful agent in establishing this trust. This requires a mixture of face-to-face interactions and the use of appropriate communication and collaboration technologies. These interactions should start as early as possible, even during the planning and R&D phase. Mutual trust and interest will continue to grow during the build-up time of the project, and will have to be sustained through the transition from early commissioning to operation and scientific exploitation. Industry is developing many tools to support the full spectrum of situations, ranging from planned, structured activities (such as scheduled meetings) to unplanned interactions.

Trust and involvement of both institutions and individuals have to be maintained over a long time – the duration of the project being typically more than 20 years. Producing exciting science and meeting technological challenges will be the key ingredients for ensuring a long-term interest of all the partners. Working on the frontiers of technology creates the need for a continuous upgrade culture. This culture needs to be distributed around the world.

However, even if the necessary trust is established, many key questions must still be resolved to guarantee the success of the project and the major investment it requires. These include the management structure and organizational form, and they too are closely related to trust – we cannot afford for scientists and engineers to become disenchanted and walk away. We need to approach global collaboration on large scientific infrastructure projects with a great deal of imagination and determination.

The future of particle physics is no longer determined by scientific challenges only.

The Atom in the History of Human Thought

by Bernard Pullman (late professor of Quantum Chemistry at the Sorbonne, and director, Institut de Biologie Physico-Chimique, France), Oxford University Press, ISBN 0195114477, £14.95 (€23). Translated from the original French, Éditions Fayard, ISBN 2213594635, €29.3.

“This book endeavours to describe the turbulent relationship between atomic theory and philosophy and religion over a period of 25 centuries,” states the preface – a daunting task by any standards. Pullman admits that he is neither a philosopher nor a man of religion, but a chemist “having long lived side-by-side with atoms”. As such, he achieves a great deal.

The book begins with the birth of the atomic theory – the “Greek miracle” of the 7th-5th centuries BC, in Pullman’s words – when a few Hellenic thinkers shed the Greek pantheon in favour of a natural philosophy. This began with theories advocating various primordial substances – water (Thales), air (Anaximenes), fire (Heraclitus) and earth (Xenophanes) – from which all things come to be. The two fundamental concepts of atomism – impenetrable, indivisible (atomos) corpuscles and the void through which they travel – were formulated around 450 BC by Leucippus and Democritus, and refined a century later by Epicurus, and later still by Lucretius, into a logical structure that remained essentially unchanged for the next 2000 years. The book also touches on Hindu and Buddhist atomism, which evolved independently at about the same time but had no impact on the atomic theory of the Western world.

The book then moves on to “a few scattered revivals” during the 1st-15th centuries AD. After describing the antiatomistic position of the Church as put forward by Basil of Caesarea, St Augustine and Thomas Aquinas (among others), some mediaeval Christian atomists make an appearance. These are divided into chroniclers (such as Isidore of Seville), sympathizers and proponents. The sympathizers include Adelard of Bath (a translator of scientific Arab texts) and Thierry of Chartres (a reviver of the works of antiquity). Among the proponents are Constantine the African, a physician from Carthage who explicitly defined atoms as the fundamental constituents of substances; William of Conches; and William of Ockham.

Jewish philosophy from the 9th to the 13th centuries is discussed. This was largely opposed to atomism, although Moses Maimonides (1135-1204) described the teaching of the Arab atomists. The schismatic Jewish sect of the Karaites (founded in the 8th century) adopted the atomic theory borrowed directly from teachings of Muslim philosophers and theologians.

While Greek atomism sought to free mankind from invisible powers, Arab atomism was decidedly religious in nature. The Arab atomic doctrine is expressed in the Kalam, a set of 12 propositions, one of which introduces the notion of “accidents”. These reside within atoms, and include characteristics such as life and intelligence, along with inanimate properties such as colour and odour.

Moving into the Renaissance and the age of enlightenment, Pullman describes the resurgence of atomic theory starting with Pierre Gassendi, who is counted among the Christian atomists along with the likes of Galileo, Bruno, Newton and Boyle. Gassendi criticized Aristotle and defended ancient atomists, especially Epicurus, whose teachings he tried to make acceptable to the Church. The doctrine of John Locke, who doubted any future experimental proof of the existence of these atoms, is labelled “agnostic atomism”. Pullman also discusses Maupertuis and Diderot, with their sensitive and intelligent atoms; Holbach, with his materialistic atoms; and Maxwell, who believed that atoms exist due to the action of a creator.

Christian antiatomists – philosophers or scientists who use religious arguments to reject the theory – include Descartes, who rejected the concept of void; and Leibniz with his metaphysical atoms (monads). Others mentioned are Roger Boscovitch, who tried to blend Leibniz’s monads with Newton’s laws of attraction and repulsion; George Berkeley, who rejected matter, material corpuscles and void; and Immanuel Kant, who is labelled an “atomist turned antiatomist”.

The final part of the book moves into the modern era with the advent of scientific atomism through the 19th and 20th centuries. Pullman begins with the demise of the 2000-year-old theory of four elements through the demonstrations of Lavoisier that water, and of Cavendish and Priestley that air, have a compound structure. Elements came to be defined as substances that could not be decomposed. Confusion over nomenclature followed until Cannizzaro formulated a distinction between atoms and molecules in 1860. Soon afterwards, Mendeleev arranged the 63 elements then known in the periodic table.

Controversy, however, continued. Philosophers such as Hegel and Schopenhauer were opposed to atomism, as were die-hard antiatomists like Berthelot, Mach and Ostwald, and a few whom Pullman calls “nostalgic philosophers”, such as Nietzsche, Marx and Bergson.

Nevertheless, atomic theory was almost universally accepted by the time J J Thomson discovered the electron in 1897, bringing the hypothesis of indivisible atoms to an end.

Pullman then brings us into the quantum age in 1900 with Planck’s famous constant. He guides us through Rutherford’s 1911 conclusion that atoms are mainly vacuum with a tiny nucleus surrounded by electrons, to Bohr’s 1913 observation that Planck’s constant leads to stable orbits in the atom and to discrete spectral lines. The rest of the modern atomic picture is carefully covered, with Chadwick’s 1932 discovery of the neutron; de Broglie’s postulation of the wave-like character of matter particles, and its subsequent confirmation by Davisson and Germer; and Schrödinger’s wave mechanics leading to serious conceptual difficulties among scientists.

Chemical bonding naturally plays a large part in the book, given that its author was a chemist. Covalent bonding, where electrons are shared between atoms, leads Pullman to an interesting analogy developed in the chapter “Society of atoms: marriage”, where he concludes that “as always in life, this implies the ability and even obligation both to give and to receive”.

In a closing chapter, Pullman delves into the nanoworld. Here he describes how the scanning-tunnelling microscope and the atomic-force microscope led to the visualization and manipulation of single atoms interacting with bulk surfaces, and how complete isolation of single (charged) atoms surrounded by vacuum was accomplished using ion traps.

No-one can contest that the atoms conceived 2500 years ago as invisible, indivisible, impenetrable philosophical constructs have today become divisible and visible objects of reality. But are they really in human thought? They are certainly in the thoughts of scientists and philosophers, but I doubt they are uppermost in the minds of most people, as Pullman suggests when he claims that “quantum physics has stoked an interest in the ‘problem of God’ among a general public”. The book is let down by its index, which is difficult to use and occasionally inaccurate. That said, reading this book is a fruitful learning exercise, and it has a host of informative notes.

Handbook of Radiation Effects

by Andrew Holmes-Siedle and Len Adams, 2nd edn (2002), Oxford University Press, ISBN 019850733X, £65 (€102).

This book is aimed at specialists – engineers and applied physicists – employing electronic systems and materials in radiation environments. Its prime role is to explain how to introduce tolerance to radiation into large electronic systems. The reader is expected to be familiar with the theory and operating principles of the various devices. The book mainly addresses components used in space, but also discusses issues specific to other fields, such as military and high-energy physics applications.

The book starts with a quick overview of radiation concepts, units and radiation-detection principles, followed by a brief review of the various radiation environments likely to degrade electronic devices and systems, as encountered in space, energy production (fission and fusion), high-energy physics and military applications (nuclear weapons). This is followed by a chapter giving a general description of the fundamental effects of radiation in materials and devices: atomic displacement and ionization, as well as the colourability of transparent materials, single-event phenomena and other transient effects.

Seven central chapters form the core of the handbook, addressing in detail the mechanisms responsible for the degradation of performance of various devices. Each chapter is dedicated to a class of devices: MOS; bipolar transistors and integrated circuits; diodes and optoelectronics such as phototransistors and CCDs; power semiconductors; various types of sensors; and miscellaneous electronic components. The physical problems of total-dose effects and how to predict the electrical changes caused in MOS devices are discussed, along with some of the best solutions to the radiation problem. The long-lived effects of various radiation types on bipolar transistors, which can be separated into surface and bulk mechanisms, are described, as is how these effects influence the radiation response of bipolar integrated circuits. The response of the many different types of diodes to radiation is thoroughly discussed in a dedicated chapter. Optoelectronic devices in a hostile environment are subject to multiple effects, and radiation can cause malfunctioning in a highly tuned, high-technology system. Silicon power devices used as regulators in power subsystems of large space equipment, radiation-generating equipment and nuclear-power sources also suffer from radiation damage. One chapter is devoted to discussing the physics, chemistry and practical problems associated with windows, lenses, optical coatings and optical fibres. Another chapter concentrates on the effects of radiation on polymers and other organics, classifying the main forms of organic degradation under irradiation and summarizing some of the most important examples and problems met with polymers in engineering and science.

Two chapters are dedicated to aspects of radiation shielding of electronic devices and to various computer methods for particle transport, essentially with reference to space applications (very thin shields). The three final chapters discuss radiation testing, equipment hardening and hardness assurance. Radiation testing is made unavoidable by the variability in the sensitivity of semiconductors and electronic devices to radiation, which makes it impossible to rely on theory alone to predict the effect of a given exposure to a given type of radiation on a device. The authors provide guidelines on the radiation sources that may be used in irradiation tests, on test procedures and on engineering standards. Finally, they discuss the technologies and methodologies employed in fabricating radiation-hard devices, and provide rules for hardening against various types of radiation and for various applications, including remote-handling equipment and robots.

Each chapter ends with a summary of its most important points. Besides the usual subject index, a useful author index helps greatly in searching through the large number of references provided at the end of each chapter. With respect to the first edition (1993), the book has been enriched with many references to useful websites, including databases. Surprisingly, the old units rad, rem and curie are used throughout the book, although SI units are provided in brackets. The authors admit they thought hard about which units to use, and finally opted for the old system.

It is unfortunate that this otherwise excellent volume contains, here and there, a number of typographical and punctuation errors, and mistakes in some formulae. In a few cases there are contradictory statements a few paragraphs apart. The impression is that the text was not proofread carefully enough before going to print. There are also a few statements that are clearly wrong, such as that X-rays and gamma rays leave no activity in the irradiated material (what about photonuclear reactions above a given threshold?), and others that are confusing, such as the discussion of the whole-body dose limit for members of the public. Activation phenomena and related problems are also somewhat underestimated throughout the book.

Nevertheless, this volume contains a lot of valuable material and is not only a handbook, but also an excellent textbook.

Quarks, Leptons and the Big Bang (2nd edition)

by Jonathan Allday, Institute of Physics Publishing, ISBN 0750308060, £16.99 (€27).

This edition is a revised and updated version of the popular high-school introduction to particle physics and cosmology by Jonathan Allday, a teacher at the King’s School, Canterbury.
