Challenges and Goals for Accelerators in the XXI Century
By Oliver Brüning and Stephen Myers (eds)
Also available at the CERN bookshop
This mighty 840-page book covers an impressive range of subjects divided into no fewer than 45 chapters. Owing to the expertise and international reputations of the authors of the individual chapters, few if any other books in this field have managed to summarise such a broad topic with such authority. While too numerous to list in the space provided, the full list of authors – a veritable “who’s who” of the accelerator world – can be viewed at worldscientific.com/worldscibooks/10.1142/8635#t=toc.
The book opens with two chapters devoted to a captivating historical review of the Standard Model and a general introduction to accelerators, and closes with two special sections. The first of these is devoted to novel accelerator ideas: plasma accelerators, energy-recovery linacs, fixed-field alternating-gradient accelerators, and muon colliders. The last section describes European synchrotrons used for tumour therapy with carbon ions and covers, in particular, the Heidelberg Ion Therapy Centre designed by GSI and the CERN Proton Ion Medical Machine Study. The last chapter describes the transformation of the CERN LEIR synchrotron into an ion facility for radiobiological studies.
Concerning the main body of the book, 17 chapters look back over the past 100 years, beginning with a concise history of the first three lepton colliders: AdA in Frascati, VEP-1 in Novosibirsk and the Princeton–Stanford electron–electron collider. A leap in time then takes the reader to CERN’s Large Electron–Positron collider (LEP), which is followed by a description of the Stanford Linear Collider. Unfortunately, this latter chapter is too short to do full justice to such an innovative approach to electron–positron collisions.
The next section is devoted to beginnings, starting from the time of the Brookhaven Cosmotron and Berkeley Bevatron. The origin of alternating-gradient synchrotrons is well covered through a description of the Brookhaven AGS and the CERN Proton Synchrotron. The first two hadron colliders at CERN – the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS) proton–antiproton collider – are then discussed. The ISR’s breakthroughs were numerous, including the introduction of Schottky scans, the demonstration of stochastic cooling and absolute luminosity measurements by van der Meer scans. Even more remarkable was the harvest of the SPS proton–antiproton collider, culminating with the Nobel prize awarded to Carlo Rubbia and Simon van der Meer. The necessary Antiproton Accumulator and Collector are discussed in a separate chapter, which ends with an amusing recollection: “December 1982 saw the collider arriving at an integrated luminosity of 28 inverse nanobarns and Rubbia offering a ‘champagne-only’ party with 28 champagne bottles!” Antiproton production methods are covered in detail, including a description of the manoeuvres needed to manipulate antiproton bunches and of the production of cold antihydrogen atoms. This subject is continued in a later chapter dedicated to CERN’s new ELENA antiproton facility.
The Fermilab proton–antiproton collider started later than the SPS, but eventually led to the discovery of the top quark by the CDF and D0 collaborations. The Fermilab antiproton recycler and main ring are described, followed by a chapter dedicated to the Tevatron, which was the first superconducting collider. The first author remarks that, over the years, some 10¹⁶ antiprotons were accumulated at Fermilab, corresponding to about 17 nanograms and more than 90% of the world’s total man-made quantity of nuclear antimatter. This section of the book concludes with a description of the lepton–proton collider HERA at DESY, the GSI heavy-ion facility, and the rare-isotope facility REX at ISOLDE. Space is also given to the accelerator that was never built, the US Superconducting Super Collider (SSC), of which “the hopeful birth and painful death” is recounted.
The following 25 chapters are devoted to accelerators for the 21st century, with the section on “Accelerators for high-energy physics” centred on the Large Hadron Collider (LHC). In the main article, magisterially written, it is recalled that the 27 km length of the LEP tunnel was chosen with the installation of a proton–proton collider already in mind, and that the first LHC workshop was organised as early as 1984. The following chapters are dedicated to ion–ion collisions at the LHC and to the upgrades of the main ring and the injector. The high-energy version of the LHC and the design of a future 100 km-circumference collider (with both electron–positron and proton–proton collision modes) are also covered, as well as the proposed TeV electron–proton collider LHeC. The overall picture is unique, complete and well balanced.
Other chapters discuss frontier accelerators: super B-factories, the BNL Relativistic Heavy Ion Collider (RHIC) and its electron–ion extension, linear electron–positron colliders, electron–positron circular colliders for Higgs studies and the European Spallation Source. Special accelerators for nuclear physics, such as the High Intensity and Energy ISOLDE at CERN and the FAIR project at GSI, are also discussed. Unfortunately, the book does not deal with synchrotron light sources, free-electron lasers and high-power proton drivers. However, the latter are discussed in connection with neutrino beams by covering the CERN Neutrinos to Gran Sasso project and neutrino factories.
The book is aimed at engineers and physicists who are already familiar with particle accelerators and may appreciate the technical choices and stories behind existing and future facilities. Many of its chapters could also be formative for young people thinking of joining one of the described projects. I am convinced that these readers will receive the book very positively.
• Ugo Amaldi, TERA Foundation.
Big Data: Storage, Sharing, and Security
By Fei Hu (ed.)
Nowadays, enormous quantities of data in a variety of forms are generated rapidly in fields ranging from social networks to online shopping portals to physics laboratories. The field of “big data” involves all the tools and techniques that can store and analyse such data, whose volume, variety and speed of production are not manageable using traditional methods. This new field therefore presents new challenges, which, together with their possible solutions, are the subject of this book of 17 chapters, clearly divided into two sections: data management and security.
Each chapter, written by different authors, describes the state of the art for a specific issue that the reader may face when implementing a big-data solution. Far from being a step-by-step manual, the book treats its topics theoretically while also describing their practical uses. Every subject is very well referenced, pointing to many publications for readers to explore in more depth.
Given the diversity of topics addressed, it is difficult to give a detailed opinion on each of them, but some deserve particular mention. One is the comparison between different communication protocols, presented in depth and accompanied by many graphs that help the reader to understand the behaviour of these protocols under different circumstances. However, the black-and-white print makes it difficult to differentiate between the lines in these graphs. Another topic that is nicely introduced is the SP (simplicity and power) system, which makes use of innovative solutions to aspects such as the variety of data when dealing with huge amounts. Even though the majority of the topics in the book are clearly linked to big data, some of them are related to broader computing topics such as deep-web crawling or malware detection in Android environments.
Security in big-data environments is widely covered in the second section of the book, spanning cryptography, accountability and cloud computing. As the authors point out, privacy and security are key: solutions are proposed to successfully implement a reliable, safe and private platform. When managing such amounts of data, privacy needs to be carefully treated since delicate information could be extracted. The topic is addressed in several chapters from different points of view, from looking at outsourced data to accountability and integrity. Special attention is also given to cloud environments, since they are not as controlled as those “in house”. Cloud environments may require data to be securely transmitted, stored and analysed to avoid access by unauthorised sources. Proposed approaches to apply security include encryption, authorisation and authentication methods.
The book is a good introduction to many of the aspects that readers might face or want to improve in their big-data environment.
• Daniel Lanza Garcia, CERN.
Thermodynamics and Equations of State for Matter: From Ideal Gas to Quark–Gluon Plasma
By Vladimir Fortov
This monograph presents a comparative analysis of different thermodynamic models of the equation of state (EOS). The author aims to present in a unified way both the theoretical methods and experimental material relating to the field.
Particular attention is given to the description of extreme states reached at high pressure and temperature. As a substance advances along the scale of pressure and temperature, its composition, structure and properties undergo radical changes, from the ideal state of non-interacting neutral particles described by classical Boltzmann statistics to the exotic forms of baryonic and quark–gluon matter.
Studying the EOS of matter under extreme conditions is important for the study of astrophysical objects at different stages of their evolution as well as in plasma, condensed-matter and nuclear physics. It is also of great interest for the physics of high-energy concentrations that are either already attained or can be reached in the near future under controlled terrestrial conditions.
Ultra-extreme astrophysical and nuclear-physical applications are also analysed. Here, the thermodynamics of matter is affected substantially by relativity, high-power gravitational and magnetic fields, thermal radiation, the transformation of nuclear particles, nucleon neutronisation, and quark deconfinement.
The book is intended for a wide range of specialists who study the EOS of matter and high-energy-density physics, as well as for senior students and postgraduates.
Theory of Quantum Transport at Nanoscale: An Introduction
By Dmitry A Ryndyk
This book provides an introduction to the theory of quantum transport at the nanoscale – a rapidly developing field that studies charge, spin and heat transport in nanostructures and nanostructured materials. The theoretical models and methods recollected in the volume are widely used in nano-, molecular- and bio-electronics, as well as in spin-dependent electronics (spintronics).
The book begins by introducing the basic concepts of quantum transport, including the Landauer–Büttiker method; the matrix Green function formalism for coherent transport; tunnelling (transfer) Hamiltonian and master equation methods for tunnelling; Coulomb blockade; and vibrons and polarons.
In the second part of the book, the author gives a general introduction to the non-equilibrium Green function theory, describing first the approach based on the equation-of-motion technique, and then a more sophisticated one based on the Dyson–Keldysh diagrammatic technique. The book focuses in particular on the theoretical methods able to describe the non-equilibrium (at finite voltage) electron transport through interacting nanosystems, specifically the correlation effects due to electron–electron and electron–vibron interactions.
The book would be useful both for master’s and PhD students and for researchers or professionals already working in the field of quantum transport theory and nanoscience.