The end of computing’s steam age

Steam once powered the world. If you wanted to build a factory, or a scientific laboratory, you needed a steam engine and a supply of coal. Today, for most of us, power comes out of the wall in the form of electricity.

The modern-day analogue is computing: if you want to run a large laboratory such as CERN, you need a dedicated computer centre. The time, however, is ripe for change.

For LHC physicists, this change has already happened. We call it the Worldwide LHC Computing Grid (WLCG), which is maintained by the global particle-physics community. As physicists move towards the High Luminosity LHC (HL-LHC), however, we need a new solution for our increasingly demanding computing and data-storage needs. That solution could look very much like the Cloud, which is the general term for distributed computing and data storage in broader society.

There are clear differences between the Cloud and the Grid. When developing the WLCG, CERN was able to factor in technology that was years in the future by banking on Moore’s law, which states that processing capacity doubles roughly every 18 months. After more than 50 years, however, Moore’s law is coming up against a hard technology limit. Cloud technology, by contrast, shows no sign of slowing down: more bandwidth simply means more fibre or colour-multiplexing on the same fibre.
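The arithmetic behind that bet is easy to make concrete. As a minimal illustration (my own sketch, using the 18-month doubling period quoted above, not a precise law):

```python
# Illustrative arithmetic only: the capacity growth implied by Moore's law,
# assuming processing capacity doubles every 18 months (1.5 years).
def moore_factor(years, doubling_period_years=1.5):
    """Multiplicative growth in capacity after `years` years."""
    return 2 ** (years / doubling_period_years)

for years in (3, 6, 15):
    print(f"{years:2d} years -> capacity x{moore_factor(years):,.0f}")
```

Over a 15-year planning horizon, this rule promised a thousand-fold increase in capacity, which is why it could be "banked on" when designing the WLCG.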

Cloud computing is already at an advanced stage. While CERN was building the WLCG, the Googles and Amazons of the world were building huge data warehouses to host commercial Clouds. Although we could turn to them to satisfy our computing needs, it is doubtful that such firms could guarantee the preservation of our data for the decades that it would be needed. We therefore need a dedicated “Science Cloud” instead.

CERN has already started to think about the parameters for such a facility. Zenodo, for example, is a future-proof and non-proprietary data repository that has been adopted by other big-data communities. The virtual nature of the technology allows various scientific disciplines to coexist on a given infrastructure, making it very attractive to providers. The next step requires co-operation with governments to develop computing and data warehouses for a Science Cloud.

CERN and the broader particle-physics community have much to bring to this effort. Just as CERN played a pioneering role in developing Grid computing to meet the needs of the LHC, we can contribute to the development of the Science Cloud to meet the demands of the HL-LHC. Not only will this machine produce a luminosity five times greater than that of the LHC, but data are increasingly sent straight from the sensors in the LHC detectors to our computer centre, with minimal processing and reduction along the way. Add to that CERN’s open-access ethos, which began in open-access publishing and is now moving towards “open data”, and you have a powerful combination of know-how relevant to designing future computing and data facilities. Particle physics can therefore help develop Cloud computing for the benefit of science as a whole.

In the future, scientific computing will be accessed much as electrical power is today: we will tap into resources simply by plugging in, without worrying about where our computing cycles and data storage are physically located. Rather than relying on our own large computer centre, there will be a Science Cloud composed of computing and data centres serving the scientific endeavour as a whole, guaranteeing data preservation for as long as it is needed. Its location should be determined primarily by its efficiency of operation.

CERN has been in the vanguard of scientific computing for decades, from the computerised control system of the Super Proton Synchrotron in the 1970s, to CERNET, TCP/IP, the World Wide Web and the WLCG. It is in that vanguard that we need to remain, to deliver the best science possible. Working with governments and other data-intensive fields of science, it’s time for particle physics to play its part in developing a world in which the computing socket sits right next to the power socket. It’s time to move beyond computing’s golden age of steam.

Tunnel visions

By M Riordan, L Hoddeson and A W Kolb
University of Chicago Press
Also available at the CERN bookshop

The Superconducting Super Collider (SSC), a huge accelerator to be built in Texas in the US, was expected by the physicists who supported it to be the place where the Higgs boson would be discovered. Instead, the remnants of the SSC facilities at Waxahachie are now the property of the chemical company Magnablend, Inc. What happened in between? What went wrong? What are the lessons to be learnt?

Tunnel Visions responds to these historical questions in a precise and exhaustive way. Contrary to my expectations, it is not a doom-and-gloom narrative but a down-to-earth story of the national pride, good physics and bad economics of one of the biggest collider projects in history.

The book depicts the political panorama during the roughly 10 years (1983–1993) of the SSC project’s life. It started in the Reaganomics era, hand in hand with the International Space Station (ISS), and concluded during the first Clinton presidency, after the recession of the early 1990s and the end of the Cold War. The ISS survived, possibly because political justifications for space adventure are easier to find, but most probably because it was an international project from the beginning. The book explains the management intricacies of such a large project, and the partisan support and disregard, up to the final demise of the SSC in the US Congress. For the particle-physics community this is a well-known tale, but the historical details are welcome.

However, the book is more than that, because it also sheds light on the lessons learnt. The final woes of the SSC marked the definitive opening of the US particle-physics community to full international collaboration. For 50 years, without doubt, the US had been the place to go for any particle physicist. Fermilab, SLAC and Brookhaven were, and still are, great stars in the physics firmament. Even if the SSC project had not been cut, those three laboratories had to keep working to maintain progress in the field. But that was too much for what was essentially a zero-sum budget game. The show had to go on, so Fermilab got the Main Injector, SLAC the BaBar B factory, and Brookhaven the RHIC collider. Thanks to these upgrades, the three laboratories made important progress in particle physics: the top-quark discovery; precision measurements of the W and Z bosons; the narrowing of the Higgs-boson mass window to between 113 and 170 GeV; the detection of possible discrepancies with the Standard Model in b-meson decays; and the discovery of the liquid-like quark–gluon plasma.

Why did the SSC project collapse? The authors explain the real reasons, which were related not to technical problems but to poor management in the first years and to the clash of cultures between the US particle-physics community and the US military–industrial system. There were also reasons of timing: the SSC was several steps beyond its time. To put it into context, during the years of the SSC project CERN converted the SPS into a collider, ran the whole LEP programme and began the LHC project. That effort precluded any possible European contribution to the SSC. The last-ditch attempt to internationalise the SSC into a trans-Pacific partnership with Japan was also unsuccessful. The lesson from history, the authors conclude, is that by the beginning of the 1990s the costs of frontier experimental particle physics had grown too large, even for a country like the US. Multilateral international collaboration was the only way out, as the ISS showed.

The Higgs-boson discovery happened at CERN. The book avoids any “hare and tortoise” comparison here, however, since at the dawn of the new century the US became a CERN observer state with a very important in-kind contribution. In my opinion, this is where the book grows in interest, because it explains how the US particle-physics community joined the LHC programme and played a decisive role. In particular, the US technological effort in developing superconducting magnets was not wasted. The book also recounts the suspense around the Higgs search when the Tevatron was the only machine still in the game, during the LHC shutdown that followed the infamous incident of September 2008.

Useful appendices providing notes, a bibliography and even a short explanation of the Standard Model complete the text.

Entropy Demystified: The Second Law Reduced to Plain Common Sense (2nd edition)

By Arieh Ben-Naim
World Scientific

In this book, the author explains entropy and the second law of thermodynamics in a clear and easy way, with the help of many examples. In particular, he intends to show that these laws of physics are not intrinsically incomprehensible, as they may appear at first. Students can be puzzled by the fact that entropy, which is defined in terms of heat and temperature, can also be expressed in terms of order and disorder, which are intangible concepts, and by the observation that entropy (in other words, disorder) increases perpetually. Some mystery seems inevitably to be associated with these concepts. The author asserts that, looking at the second law from the molecular point of view, everything clears up: what a student needs to know is the atomistic formulation of entropy, which comes from statistical mechanics.

The aim of the book is to clarify these concepts to readers who haven’t studied statistical mechanics. Many dice games and examples from everyday life are used to make readers familiar with the subject. They are guided along a path that allows them to discover by themselves what entropy is, how it changes, and why it always changes in one direction in a spontaneous process.

In this second edition, seven simulated games are also included, so that the reader can experiment with and appreciate the joy of understanding the second law of thermodynamics.
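The kind of game the book builds on can be sketched in a few lines of code. The following is my own minimal illustration, not one of the book’s seven simulations: N two-sided “dice” start in a perfectly ordered configuration, a randomly chosen die is re-thrown at each step, and the logarithm of the multiplicity of the resulting macrostate drifts towards its maximum. That drift is the second law, seen atomistically.

```python
import math
import random

def log_multiplicity(n, N):
    """ln W for the macrostate with n of N dice showing face 1,
    where W = C(N, n) counts the microstates."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

def simulate(N=1000, steps=20000, seed=42):
    random.seed(seed)
    dice = [0] * N                       # ordered start: W = 1, ln W = 0
    for _ in range(steps):
        i = random.randrange(N)          # pick one die at random...
        dice[i] = random.randint(0, 1)   # ...and re-throw it
    n = sum(dice)
    return n, log_multiplicity(n, N)

n, logW = simulate()
print(f"after mixing: n = {n} of 1000, ln W = {logW:.1f}")
```

Starting from ln W = 0, the system ends up near n = 500, where ln W is close to its maximum of 1000 ln 2 ≈ 693: the overwhelmingly most probable macrostate, not a mysterious force, is what drives the one-way change.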

Modern Physics Letters A: Special Issue on Hadrontherapy

By Saverio Braccini (ed.)
World Scientific

The applications of nuclear and particle physics to medicine have seen extraordinary development since the discovery of X-rays by Röntgen at the end of the 19th century. Medical imaging and oncologic therapy with photons and charged particles (specifically hadrons) are currently hot research topics.

This special issue of Modern Physics Letters is dedicated to hadron therapy, the frontier of cancer radiation therapy, and aims to fill a gap in the current literature on medical physics. Through 10 invited review papers, the volume presents the basics of hadron therapy, along with the most recent scientific and technological developments in the field. The first part covers topics such as the history of hadron therapy, radiation biophysics, particle accelerators, dose-delivery systems and treatment planning. The second part treats more specific topics, including dose and beam monitoring, proton computed tomography, ionoacoustics and microdosimetry.

This volume will be very useful to students, researchers approaching medical physics, and scientists interested in this interdisciplinary and fast-moving field.

Beyond the Galaxy: How Humanity Looked Beyond our Milky Way and Discovered the Entire Universe

By Ethan Siegel
World Scientific

This book provides an introduction to astrophysics and cosmology for absolute beginners, as well as for any reader looking for a general overview of the subject and an account of its latest developments.

Besides presenting what we know about the history of the universe and the marvellous objects that populate it, the author is interested in explaining how we came to such knowledge. He traces a trajectory through the various theories and the discoveries that defined what we know about our universe, as well as the boundary of what is still to be understood.

The first six chapters deal with the state-of-the-art of our knowledge about the structure of the universe, its origin and evolution, general relativity and the life of stars. The following five address the most important open problems, such as: why there is more matter than antimatter, what dark matter and dark energy are, what there was before the Big Bang, and what the fate of the universe is.

Written in plain English, without formulas and equations, and characterized by a clear and fluid prose, this book is suitable for a wide range of readers.

Bananaworld: Quantum Mechanics for Primates

By Jeffrey Bub
Oxford University Press

This is not another “quantum mechanics for dummies” book, as the author himself states. Nevertheless, it is a text that discusses quantum mechanics without being aimed at experts in the field. It explains complex concepts of theoretical physics almost without bringing up formulas, and assumes no specialist background.

The book focuses on an intriguing issue of present-day physics: nonlocality and the associated phenomenon of entanglement. Thinking in macroscopic terms, we know that what happens here affects only the surrounding environment. But down at the microscopic level where quantum mechanics applies, things work differently. Scientists discovered that in this case, besides the local effects, there are less evident effects that reveal themselves in strange correlations occurring instantaneously between remote locations. Even stronger nonlocal correlations, still consistent with relativity, have been proposed theoretically, but have not been observed so far.

This complex subject is treated by the author using a particular metaphor, which is actually more than just that: he constructs a metaphorical world made of magic bananas, and of simple actions that can be performed on them. Thanks to this, he is able to explain nonlocality and other difficult physics concepts in a relatively easy and comprehensible way.

Even if it requires some general knowledge of mathematics and familiarity with science, this book will be accessible and interesting to a wide range of readers, as well as being an entertaining read.

Particles and the Universe: From the Ionian School to the Higgs Boson and Beyond

By Stephan Narison
World Scientific

This book aims to present the history of particle physics, from the introduction of the concept of particles by the Greek philosophers to the discovery of the final piece of the Standard Model, the Higgs boson, at CERN in 2012. Following the development of this field chronologically, the author gives an overview of the most important notions and theories of particle physics.

The text is divided into seven sections. The first part provides the basic concepts and a summary of the history of physics, arriving at the modern theory of forces, which is the subject of the second part. The book then moves on to the Higgs-boson discovery and a description of some of the experimental apparatus used to study particles (from the LHC at CERN to cosmic-ray and neutrino experiments). The author also provides a brief treatment of general relativity, the Big Bang model and the evolution of the universe, and discusses future developments in particle physics.

In the main body of the book, the topics are presented in a non-technical fashion, in order to be accessible to non-experts. Nevertheless, a rich appendix provides demonstrations and further details for advanced readers. The text is accompanied by plenty of images, including paintings and photographs of many of the protagonists of particle physics.

Statistical Methods for Data Analysis in Particle Physics

By Luca Lista
Springer
Also available at the CERN bookshop

Particle-physics experiments are very expensive, not only in terms of the cost of building accelerators and detectors, but also due to the time spent by physicists and engineers in designing, building and running them. With the statistical analysis of the resulting data being relatively inexpensive, it is worth trying to use it optimally to extract the maximum information about the topic of interest, whilst avoiding claiming more than is justified. Thus, lectures on statistics have become regular in graduate courses, and workshops have been devoted to statistical issues in high-energy physics analysis. This also explains the number of books written by particle physicists on the practical applications of statistics to their field.

This latest book by Lista is based on the lectures that he has given at his home university in Naples and elsewhere. As part of the Springer series “Lecture Notes in Physics”, it has the attractive feature of being short – a mere 172 pages. The disadvantage is that some of the explanations of statistical concepts would have benefited from a somewhat fuller treatment.

The range of topics covered is remarkably wide. The book starts with definitions of probability, while the final chapter is about discovery criteria and upper limits in searches for new phenomena, and benefits from Lista’s direct involvement in one of the large experiments at CERN’s LHC. It covers such topics as the Feldman–Cousins method for confidence intervals, the CLs approach for upper limits, and the “look elsewhere effect”, which is relevant for discovery claims. However, there seems to be no mention of the fact that a motivation for the Feldman–Cousins method was to avoid empty intervals, or that the CLs method was introduced to protect against excluding the signal-plus-background hypothesis when the analysis has little or no sensitivity to the presence or absence of the signal.

The book has no index, nor problems for readers to solve. The latter is unfortunate. In common with learning to swim, play the violin and many other activities, it is virtually impossible to become proficient at statistics by merely reading about it: some practical exercise is also required. However, many worked examples are included.

There are several minor typos that the editorial system failed to notice; and in addition, figure 2.17, in which the uncertainty region for a pair of parameters is compared to the uncertainties in each of them separately, is confusing.

There are places where I disagree with Lista’s emphasis (although statistics is a subject that often produces interesting discussions). For example, Lista claims it is counter-intuitive that, for a given observed number of events, an experiment with a larger expected number of background events (b) yields a tighter upper limit than one with a smaller background, even though the latter is the better experiment. However, if 10 events are observed, it is reasonable that the upper limit on any possible signal is tighter if b = 10 than if b = 0. What is true is that the expected limit is better for the experiment with the smaller background.
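The point is easy to verify numerically for a simple counting experiment. The sketch below uses the standard CLs construction (my own illustration, not code from the book; `cls_upper_limit` is a name I have invented for it):

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def cls_upper_limit(n_obs, b, cl=0.95, step=0.01):
    """95% CLs upper limit on the signal s in a counting experiment:
    the smallest s with CLs(s) = P(N<=n_obs | s+b) / P(N<=n_obs | b) <= 1-cl."""
    p_b = poisson_cdf(n_obs, b)
    s = 0.0
    while poisson_cdf(n_obs, s + b) / p_b > 1 - cl:
        s += step
    return s

print(cls_upper_limit(10, b=0))   # looser limit, no expected background
print(cls_upper_limit(10, b=10))  # tighter limit for the same 10 observed events
```

For 10 observed events, the b = 0 limit comes out close to the familiar classical value of about 17 events, while the b = 10 limit is markedly tighter, in line with the behaviour described above.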

Finally, the last three chapters could be useful to graduate students and postdocs entering the exciting field of searching for signs of new physics in high energy or non-accelerator experiments, provided that they have other resources to expand on some of Lista’s shorter explanations.

Path Integrals for Pedestrians

By E Gozzi, E Cattaruzza and C Pagani
World Scientific

The path-integral formulation of quantum mechanics is one of the basic tools used to construct quantum field theories, especially gauge-invariant theories. It is the bread and butter of modern field theory. Feynman’s original formulation developed and extended work done by Dirac in the early 1930s, and provided an elegant and insightful solution to a generic Schrödinger equation.

This short book provides a clear, pedagogical and insightful presentation of the subject. The derivations of the basic results are crystal clear, and the applications worked out are rather original. It includes a nice presentation of the WKB approximation in this context, including the Van Vleck determinant and the functional determinant, the connection formulae and the semiclassical propagator.
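For reference, the two central formulae that the review alludes to can be written down (standard textbook results, quoted here independently of the book): the propagator as Feynman’s sum over paths, and its semiclassical (Van Vleck) approximation in one dimension.

```latex
% Feynman's path-integral representation of the propagator
K(x_b,t_b;x_a,t_a) = \int \mathcal{D}[x(t)]\, e^{\,i S[x]/\hbar},
\qquad
S[x] = \int_{t_a}^{t_b} L(x,\dot{x})\,\mathrm{d}t ,

% and its semiclassical (Van Vleck) approximation in one dimension,
% with S_cl the action evaluated on the classical path
K \approx \sqrt{\frac{1}{2\pi i\hbar}
  \left(-\frac{\partial^{2} S_{\mathrm{cl}}}{\partial x_a\,\partial x_b}\right)}\;
  e^{\,i S_{\mathrm{cl}}/\hbar}.
```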

An interesting innovation of this book is that the authors provide a clear presentation of the path-integral formulation of the Wigner functions, which are fundamental in the study of quantum statistical mechanics, and, for the first time in an elementary book, of the work of Koopman and von Neumann on classical and statistical mechanics.

The book closes with a well-selected set of appendices in which further technical details and clarifications are presented. Some of the more mathematical details of the basic derivations can be found there, as well as aspects of operator ordering as seen from the path-integral formulation, the formulation in momentum space, and the use of Grassmann variables.

It will be difficult to find a better and more compact introduction to this fundamental subject.

Record-breaking production at the LHC

The past few weeks have been a record-breaking period for the LHC, with the machine now delivering long fills with unprecedented luminosity. Following the interruption in late May due to problems with the PS main power supply, on 1 June the operations team established collisions with 2040 bunches for the first time this year. This is the maximum number of bunches achievable with the current limitations from the SPS beam dump, which allows the injection of trains of 72 bunches spaced by 25 ns.

The following week saw the LHC’s previous luminosity record at 6.5 TeV broken by a peak luminosity of just over 8 × 10³³ cm⁻² s⁻¹, representing 80% of the design luminosity. This was followed by a new record for integrated luminosity in a single fill, with 370 pb⁻¹ delivered in just 18 hours of colliding beams. The availability for collisions during this period was a remarkable 75%, more than double the annual average in 2015. Around 2 fb⁻¹ were delivered during one week, breaking the previous record of 1.4 fb⁻¹ established in June 2012.
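As a back-of-the-envelope check of those numbers (my own illustrative arithmetic, not from the report): if the peak luminosity were sustained for the whole 18-hour fill, the integrated luminosity would be about 520 pb⁻¹; the 370 pb⁻¹ actually delivered reflects the natural decay of luminosity over a fill.

```python
# Back-of-envelope check: integrated luminosity if the peak value
# of 8e33 cm^-2 s^-1 were sustained for the full 18-hour fill.
PEAK_LUMI = 8e33            # cm^-2 s^-1
HOURS = 18
CM2_PER_INV_PB = 1e36       # 1 pb^-1 corresponds to 1e36 cm^-2

integrated_cm2 = PEAK_LUMI * HOURS * 3600        # cm^-2
integrated_pb = integrated_cm2 / CM2_PER_INV_PB  # pb^-1
print(f"{integrated_pb:.0f} pb^-1 if the peak were sustained")  # ~518
```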

These records follow the decision taken at the end of May to focus on delivering the highest possible integrated luminosity for the summer conferences. Following a short technical stop that ended on 9 June, the machine was re-validated from a machine-protection perspective for a sustained period of 2040-bunch operation at high luminosity. New records were set immediately afterwards, with one fill on 13–14 June producing more than 0.5 fb⁻¹ in around 27 hours, and the following fill recording a peak luminosity of over 9 × 10³³ cm⁻² s⁻¹. The record integrated luminosity delivered in seven days now stands at 2.4 fb⁻¹. Finally, on 26 June, the team hit the LHC design luminosity (10³⁴ cm⁻² s⁻¹) for the first time. With such performance, the operations team hopes to deliver over 10 fb⁻¹ to both ATLAS and CMS before the summer conferences.

This is truly a new phase for the LHC and thanks are due to all the teams who have worked tirelessly to make it possible. This year the smaller beam size at the interaction points provides almost double the instantaneous luminosity compared to 2015, yet the machine is behaving impeccably. The stunning and surprising availability is due to a sustained effort over the years by hardware groups such as cryogenics, quench protection, power converters, RF, collimation, injection and others to maximise the reliability of their systems. Of particular note is the major effort co-ordinated by the radiation to electronics team to mitigate the effects of beam-induced radiation on tunnel electronics.

Another case in point concerns cryogenics. With so many bunches circulating, the heat load deposited by the electron cloud on the LHC beam screens in the arcs can reach 150 W per half-cell (a half-cell in an arc includes one quadrupole and three dipole magnets). This is just below the maximum of 160 W that can be sustained by the cryogenics system. With a new cryogenic feed-forward system in place to tune the beam-screen cooling parameters according to the intensity stored in the machine and the beam energy, operation with the high electron cloud currently present in the machine is significantly smoother than in 2015. Of course, the LHC remains a hugely complex machine and the availability is always liable to fluctuations.
