The exploration of W- and Z-boson interactions at the energy frontier probes the heart of the Brout–Englert–Higgs mechanism. The cross-section for the scattering of longitudinally polarised weak bosons would grow without bound at high energies, violating unitarity, were it not for the exact cancellation provided by Higgs-boson contributions. The key processes for this exploration are the scattering of W and Z bosons emitted by quarks in proton–proton collisions, which are among the rarest processes of the Standard Model (SM) and which remained inaccessible until very recently.
At the 2018 International Conference on High Energy Physics (ICHEP), held in Seoul on 4–11 July, ATLAS reported the observation of the W±W±jj final state, and, for the first time, the observation of the W±Zjj final state produced by pure electroweak processes, among which vector-boson scattering (VBS) is dominant. Observation of the electroweak production of W±W±jj was reported by the CMS collaboration in 2017.
ATLAS data corresponding to an integrated luminosity of 36 fb⁻¹, collected in 2015 and 2016 at a centre-of-mass energy of 13 TeV, were used. The two final states were searched for using W- and Z-boson decays to leptons (electrons or muons), exploiting the typical signature of a centrally produced diboson system accompanied by two forward jets that are well separated in rapidity. The large invariant mass (mjj) of the two-jet system was used to isolate signal events from the overwhelming background arising from strong interactions. Further selection requirements, exploiting additional features of the two channels, were necessary to suppress this background.
In the W±W±jj channel, the strong-interaction contribution can be greatly reduced by selecting events in which the two W bosons have the same charge. The remaining backgrounds arise from processes in which leptons are misidentified or a lepton's charge is mismeasured, so the analysis focused on the reduction and control of these backgrounds, which were estimated from data. An additional background from incompletely reconstructed WZ events was estimated from simulations. The final mjj distribution of the selected events is shown in the left-hand figure, with the signal accumulating at large mjj values. The analysis yielded a significance of 6.9σ, qualifying as an observation.
Most of the background in the W±Zjj channel arises from strong-interaction processes contributing to the same final state. Kinematic variables that differ markedly between electroweak and strong production were combined into a multivariate discriminant using a boosted decision tree to isolate the signal (figure, right). The analysis led to an observed significance of 5.6σ.
These observations open up a new era of exploration of a yet largely unknown part of the SM: the quartic couplings of weak bosons. The larger amounts of data collected during LHC Run 2 and future runs will allow for a detailed characterisation of VBS interactions using differential cross-section measurements. Such measurements combined with refined theory modelling provide sensitive tests of the electroweak sector of the SM, and may reveal signs of new physics.
Every second, each square metre of the Earth is struck by thousands of charged particles travelling from deep space. It is now more than a century since cosmic rays were discovered, yet they still present major challenges to physics. The origin of the highest-energy cosmic rays is the biggest mystery: their energies are too high to be explained by known astrophysical sources such as supernovae, pulsars or even black holes. But cosmic rays are also of interest beyond astrophysics. Recent studies at CERN’s CLOUD experiment, for example, suggest that cosmic rays may influence cloud cover through the formation of new aerosols, with important implications for the evolution of Earth’s climate.
This year, two independent missions were mounted in the Arctic and in Antarctica – Polarquest2018 and Clean2Antarctica – to understand more about the physics of high-energy cosmic rays. Both projects have a strong educational and environmental dimension, and are among the first to measure cosmic rays at such high latitudes.
Geomagnetic focus
Due to the shape of the geomagnetic field, the intensity of charged cosmic radiation is higher at the poles than in equatorial regions. At the end of the 1920s it was commonly believed that cosmic rays were high-energy neutral particles (i.e. gamma rays), implying that the Earth’s magnetic field would not affect cosmic-ray intensity. However, early observations of the dependence of cosmic-ray intensity on latitude ruled out this hypothesis, showing that cosmic rays consist mainly of charged particles and leading to the first quantitative calculations of their composition.
The interest in measuring the cosmic-ray flux close to the poles is related to the fact that the geomagnetic field shields the Earth from low-energy charged cosmic rays, with an energy threshold (geomagnetic cut-off) depending on latitude, explains Mario Nicola Mazziotta, an INFN researcher and member of the Polarquest2018 team. “Although the geomagnetic cut-off decreases with increasing latitude, the cosmic-ray intensity at Earth reaches its maximum at latitudes of about 50–60°, where the cut-off is of a few GeV or less, and then seems not to grow anymore with latitude. This indicates that cosmic-ray intensity below a given energy is suppressed, due to solar effects, and makes the study of cosmic rays near the polar regions a very useful probe of solar activity.”
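Mazziotta's point about the latitude dependence can be made concrete with the classic Störmer dipole approximation, in which the vertical cut-off rigidity falls off as the fourth power of the cosine of the geomagnetic latitude. The sketch below is illustrative only: the 14.9 GV coefficient is the standard dipole-field value, and real cut-offs also depend on longitude, altitude and solar conditions.

```python
import math

def vertical_cutoff_rigidity_gv(geomagnetic_latitude_deg):
    """Stormer dipole approximation for the vertical geomagnetic
    cut-off rigidity (in GV) at a given geomagnetic latitude."""
    lam = math.radians(geomagnetic_latitude_deg)
    return 14.9 * math.cos(lam) ** 4

# The cut-off collapses rapidly towards the poles:
for lat in (0, 30, 50, 60, 80):
    print(f"{lat:3d} deg: {vertical_cutoff_rigidity_gv(lat):6.2f} GV")
```

By 50–60° the cut-off has dropped to a few GV or less, consistent with the plateau in cosmic-ray intensity described above.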
Polarquest2018 is a small cosmic-ray experiment that recently completed a six-week-long expedition to the Arctic Circle on board an 18 m-long boat called Nanuq, designed for sailing in extreme regions. The boat set out from Isafjordur, in north-west Iceland, on 22 July, circumnavigated the Svalbard archipelago in August and arrived in Tromsø on 4 September. The Polarquest2018 detectors reached 82° north, shedding light on the soft component of cosmic rays channelled towards the poles by Earth’s magnetic field.
Polarquest2018 is the result of more than a year of hard work by a team of a dozen people, with enthusiastic support from many other collaborators. Its three scintillator detectors, built at CERN by school students from Switzerland, Italy and Norway, measure the cosmic-ray flux at different latitudes: one was mounted on the Nanuq’s deck and the other two were installed in schools in Italy and Norway. The detectors had to operate with the limited electric power (12 W) available on board, both recording impinging cosmic rays and receiving GPS signals to timestamp each event with a precision of a few tens of nanoseconds. They also had to be mechanically robust enough to withstand the stresses of rough seas.
The three Polarquest2018 detectors join a network of around 60 others in Italy: the Extreme Energy Events – Science Inside Schools (EEE) experiment, proposed by Antonino Zichichi in 2004 and presently co-ordinated by the Centro Fermi research institute in Rome, with collaborators including CERN, INFN and various universities. The detectors (each made of three multigap resistive-plate chambers of about 2 m² area) were built at CERN by high-school students, and the large area spanned by the EEE network enables searches for very-long-distance correlations between cosmic-ray showers.
A pivotal moment in the Arctic expedition came when the Nanuq arrived close to the south coast of the Svalbard archipelago and was sailing in the uncharted waters of the Recherche Fjord. While the crew admired a large school of belugas, the boat struck the shallow seabed, damaging its starboard daggerboard and leaving the craft perched at a 45° incline. The crew fought to free the Nanuq, but in the end had to wait almost 12 hours for the tide to rise again. Amazingly, explains Polarquest2018 project leader Paola Catapano of CERN, the incident had its advantages. “It allowed the team to check the algorithms used to correct the raw data on cosmic rays for the inclination and rolling of the boat, since the data clearly showed a decrease in the number of muons due to a reduced acceptance.”
Analysis of the Polarquest2018 data will take a few months, but preliminary results show no significant increase in the cosmic-ray flux, even at the highest latitudes. This is contrary to what one might naively expect given the high density of the Earth’s magnetic field lines close to the pole, explains Luisa Cifarelli, president of Centro Fermi in Rome. “The lack of increase in the cosmic flux confirms the hypothesis formulated by Lemaître in 1932, with much stronger experimental evidence than was available up to now, and with data collected at latitudes where no published results exist,” she says. The Polarquest2018 detector has also since embarked on a road trip to measure cosmic rays along the length of the Italian peninsula, collecting data over a wide range of latitudes.
Heading south
Meanwhile, 20,000 km to the south, a Dutch expedition to the South Pole called Clean2Antarctica has just got under way, carrying a small cosmic-ray experiment from Nikhef on board a vehicle called Solar Voyager. The solar-powered cart, built from recycled 3D-printed household plastics, will make the first ground measurements in Antarctica of the muon decay rate and of charged particles from extensive cosmic-ray air showers. Cosmic rays will be measured by a roof-mounted scintillation device as the cart makes a 1200 km, six-week-long journey from the edge of the Antarctic icefields to the geographic South Pole.
The team taking the equipment across Antarctica to the South Pole comprises mechanical engineer Edwin ter Velde, who initiated the Clean2Antarctica project, and his wife Liesbeth; both are active ocean sailors. Back in the warmer climes of the Netherlands, researchers from Nikhef will remotely monitor for any gradients in the incoming particle fluxes as the magnetic field lines converge towards the pole. In theory, the magnetic field will funnel charged particles from the high atmosphere to the Earth’s surface, leading to higher fluxes near the pole. But the incoming muon signal should not be affected, as it is produced by high-energy particles generating extensive air showers of charged particles, explains Nikhef project scientist Bob van Eijk. “But this is experimental physics and a first, so we will just do the measurements and see what comes out,” he says.
The scintillation panel used is adapted from the HiSPARC rooftop cosmic-ray detectors that Nikhef has been supplying to high schools in the Netherlands, the UK and Denmark for the past 15 years. Under professional supervision, students and teachers build these roof-box-sized detectors themselves and run the detection programme and data analysis in their science classes. Some 140 rooftop stations are online, and many thousands of pupils have been involved over the years, stimulating interest in science and research.
Pristine backdrop
The panel being taken to Antarctica is a doubled-up version with half the usual area of the HiSPARC panels, owing to strict space restrictions. Two gyroscope systems will correct for any changes in the level of the panel as it traverses the Antarctic landscape. All the instruments are solar powered, with the power coming from photovoltaic panels on two additional carts pulled by the main electric vehicle. The double detection depth of the panels will allow photomultiplier tubes to detect muon decays as well as regular cosmic-ray shower particles such as electrons and photons. Data from the experiment will be relayed regularly by satellite from the Solar Voyager vehicle so that analysis can take place in parallel, and will be made public through a dedicated website.
The Clean2Antarctica expedition set off in mid-November from Union Glacier Camp station near the Antarctic Peninsula. It is sponsored by Dutch companies and by crowdfunding, and has benefitted from extensive press and television coverage. The trip will take the team across bleak snow plains and up to altitudes of 2835 m and, despite it being the height of the Antarctic summer, temperatures could fall to –30 °C. The mission aims to use the pristine backdrop of Antarctica to raise public awareness about waste reduction and recycling.
“This is one of the rare occasions that a scientific outreach programme, with genuine scientific questions targeting high-school students as prime investigators, teams up with an idealist group that tries to raise awareness on environmental issues regarding circular economy,” says van Eijk. “The plastic for the vehicles was collected by primary-school kids, while three groups of young researchers formed ‘think tanks’ to generate solutions to questions about environmental issues that industrial sponsors/partners have raised.” Polarquest2018 had a similar goal, and its MantaNet project became the first to assess the presence and distribution of microplastics in the Arctic waters north of Svalbard at a record latitude of 82.7° north. According to MantaNet project leader Stefano Alliani: “One of the conclusions already drawn by sheer observation is that even at such high latitudes the quantity of macro plastic loitering in the most remote and wildest beaches of our planet is astonishing.”
Paul Kunz, who revolutionised particle-physics computing and established the first Web server in the US, passed away on 12 September.
After completing a PhD in physics at Princeton University, Kunz began his illustrious 35-year career at the Stanford Linear Accelerator Center (SLAC) in 1974 as a research associate in David Leith’s experimental physics Group B. As well as being an accomplished particle physicist, he quickly took an interest in one of the computing challenges facing experiments at the time: how to increase offline data-processing capability at a reasonable cost.
Using his deep understanding of computer architecture, software and hardware, he proposed a novel solution well beyond the norms of the time: the construction of a “farm” of interconnected processors each capable of executing IBM 370/168 instructions generated from standard FORTRAN code by an intermediate translator. In effect, the collection of interconnected computers in the farm would emulate, at a much lower cost, a single mainframe, distributing tasks to the individual processors, which became known as emulators. Each emulator would process entire events from particle interactions. Thus a simple parallel processing algorithm that did not require special programming or intricate modification to an experiment’s existing software was born.
After forming a small team at SLAC and building a prototype emulator known as the 168/E, Paul met CERN computer specialists David Lord and Adolfo Fucci, who immediately expressed a desire to join forces. They were looking for an online filter processor that would execute standard offline FORTRAN code to select events for fast analysis by CERN’s UA1 experiment, the so-called express line. Exhibiting his usual selfless interest in sharing ideas, Paul agreed, thereby establishing one of the first “real time” intercontinental collaborations, using the European Academic Research Network (EARN) and BITNET, courtesy of IBM.
The CERN/SLAC collaboration successfully constructed offline processing farms, notably for UA1. In parallel, one of the members of Paul’s original team, Hanoch Brafman, went on to build a complementary 370/E emulator at the Weizmann Institute of Science in Israel. Subsequently, the CERN/SLAC teams developed the next-generation emulator, the 3081/E, which was used by UA1 and the Large Electron–Positron Collider (LEP) experiments in online and offline environments. The farms were inherently extendable by simply adding processors, and are arguably the inspiration for today’s distributed offline computing facilities. The success of the emulator-based UA1 third-level trigger facility pioneered the use of a processor farm for the so-called high-level trigger system, which has since been employed by most collider experiments (at LEP, the Tevatron and the LHC), albeit with commercial processors.
In the 1990s, Paul turned his attention to the challenges of software development and became a guru and advocate of object-oriented programming. He was a passionate user of Steve Jobs’s NeXT computer, and on a historic visit to CERN, where he regularly gave courses on C++ programming, Paul immediately recognised the potential of the Web as demonstrated by Tim Berners-Lee and Robert Cailliau. Returning to SLAC, Paul not only installed the software on his NeXT, thereby establishing the first Web server in the US, but also connected it to the SPIRES database, giving the Web development team a “killer app” that demonstrated the huge potential of their project.
In his personal life, Paul was a champion BMW autocross driver and president of the Bay Area BMW club, where he was also, along with his wife, a driving instructor for teenagers. When travelling to CERN, he would often land in Frankfurt, hire a BMW and take it for a spin round the Nürburgring. He loved Chinese food and liked nothing better than to enjoy dinner in one of the many Chinese restaurants in the SLAC area with visiting friends and colleagues, often speaking French with his unique accent, a legacy of time spent at CEA Saclay in the 1970s.
Paul Kunz was a computing visionary and pioneer, and a great communicator who loved to share his ideas with irrepressible energy.
Leon Lederman, a pioneering US experimental particle physicist who shared the Nobel Prize in Physics for the discovery of the muon neutrino, passed away on 3 October at the age of 96. Lederman’s career spanned more than 60 years and had a major impact in putting the Standard Model of particle physics on empirical ground.
Lederman was born in New York City on 15 July 1922 to Russian–Jewish immigrant parents. He graduated from City College of New York with a degree in chemistry in 1943, but had already fallen under the influence of future physicists including Isaac Halpern and Martin Klein. After graduating he spent three years in the US Army, where he rose to the rank of 2nd lieutenant in the signal corps. In 1946 he entered the graduate school of physics at Columbia University, chaired by I I Rabi, and in 1951 he received his PhD in particle physics.
During the 1950s Lederman contributed to two major physics results: the discovery of the long-lived neutral K meson at Brookhaven National Laboratory’s 3 GeV Cosmotron in 1956; and, in 1957, the observation of parity violation in the pion–muon–electron decay chain at the Nevis 385 MeV synchrocyclotron at Columbia University. The latter experiment provided the first measurement of the muon magnetic moment, opening a path to the “g-2” experiment at CERN’s first accelerator, the synchrocyclotron. In 1958, shortly after he was promoted to professor, Lederman took his first sabbatical at CERN where he contributed to the organisation of the g-2 experiment. This programme lasted for almost two decades and involved many prominent CERN physicists, including Georges Charpak, Emilio Picasso, Francis Farley, Johannes Sens and Antonino Zichichi.
Lederman’s crowning achievement came in 1962 with the co-discovery of the muon neutrino at Brookhaven’s Alternating Gradient Synchrotron (AGS). For this work, he shared the 1988 Nobel Prize in Physics with Jack Steinberger and the late Melvin Schwartz “for the neutrino beam method and the demonstration of the doublet structure of the leptons through the discovery of the muon neutrino.” The experiment used a spark chamber to show that the neutrinos from beta decay and the neutrinos from muon decay were different, leading to the first direct observation of muon neutrinos and marking a key step in the understanding of weak interactions. Steinberger, who joined CERN six years after the muon-neutrino discovery and is now 97, has fond memories of his collaboration with Lederman. “What I remember about Leon is that we worked together at the same labs, at Nevis and at Brookhaven, and we got the Nobel Prize together, with Mel Schwartz. It was for a non-trivial experiment. He was a good friend. I’m very sad that he’s gone.”
At the end of the 1960s, in another experiment at the AGS, Lederman discovered the production of muon pairs with a continuous mass distribution in proton–nucleon collisions – an unexpected phenomenon that was soon interpreted as the result of quark–antiquark annihilation. As a follow-up to this experiment, in collaboration with physicists from CERN, Columbia and Rockefeller universities, Lederman proposed to study the production of electron–positron pairs at CERN’s Intersecting Storage Rings (ISR), which started operation in 1971. The experiment, known as R-103, discovered the production of neutral pions at high transverse momentum with a yield several orders of magnitude larger than expected. The result was again interpreted in terms of hard collisions between point-like proton constituents, demonstrating that these constituents also feel the strong interaction. “I had the privilege of working with Leon at Columbia in 1969–1970, when the R-103 experiment was proposed, and during the first years of ISR operation when he came to CERN,” says Luigi Di Lella of CERN. “I remember Leon as a physicist with enormous imagination, a boundless source of stimulating and often unconventional new ideas, which he always presented in a friendly and joyful atmosphere.”
As leader of the E70 and E288 experiments at Fermilab in the 1970s, Lederman also drove the effort that led to the discovery of the upsilon, the bound state of a bottom quark and antiquark, in 1977 (CERN Courier June 2017 p35). But his influence on the field of particle physics extends far beyond his specific areas of research. In particular, he was a passionate advocate of education and worked with governments and schools to create opportunities for students and better integrate physics into public education. He also had the rare quality of not taking himself too seriously. Concerning the Higgs boson, he famously coined the term “God Particle” by using it in the title of his 1993 popular-science book The God Particle: If the Universe Is the Answer, What Is the Question? – though legend has it that he had originally wanted to call it the “goddamn particle” because the Higgs boson was so difficult to find.
Lederman was director of the Nevis Laboratories at Columbia from 1961 to 1978. In 1979 he became director of Fermilab, where his vision and strategic planning led to the construction of the Tevatron (which operated from 1987 to 2011 and was the world’s highest-energy accelerator before the Large Hadron Collider came along). He stepped down as Fermilab director in 1989 and joined the faculty of the University of Chicago and, later, the Illinois Institute of Technology.
“Leon Lederman provided the scientific vision that allowed Fermilab to remain on the cutting edge of technology for more than 40 years,” says Nigel Lockyer, Fermilab’s current director. “Leon’s leadership helped to shape the field of particle physics, designing, building and operating the Tevatron and positioning the laboratory to become a world leader in accelerator and neutrino science. Leon had an immeasurable impact on the evolution of our laboratory and our commitment to future generations of scientists, and his legacy will live on in our daily work and our outreach efforts.”
If there is one area where particle accelerators have had the most significant scientific impact, it is surely X-ray science. Each year, tens of thousands of experiments are carried out at the 50 or so synchrotron X-ray facilities worldwide, spanning a wide range of disciplines – from materials science and archaeology, through structural biology, planetary science, environmental science, nanotechnology and more.
Nestled between two rivers in northwest Grenoble, France, the 22-partner-nation European Synchrotron Radiation Facility (ESRF) is one of the world’s leading light sources. It is based on an 844 m-circumference, 6 GeV electron storage ring with 44 specialised experimental stations serving around 5000 users per year, and so far experiments there have contributed to over 32,000 peer-reviewed publications. Established in 1988 and inaugurated in 1994, the ESRF was the first third-generation synchrotron, using periodic magnetic arrays called undulators to deliver the world’s brightest X-ray beams. Since then, numerous national light sources have sprung up, alongside flagship third-generation facilities in Japan and the US.
A decade ago, the ESRF embarked on a 330-million-euro upgrade programme to help ensure Europe’s leading position in the light-source world. At its core is a first-of-its-kind storage ring offering X-ray brightness and coherent flux 100 times higher than before, in addition to new X-ray beamlines, instrumentation, computing and other improvements. The upgrade will allow users to probe complex materials at the atomic level in greater detail, with higher quality and much faster. The final beam of the current ESRF storage ring will be extracted on 10 December, and operations will cease for 20 months while the new machine – called the Extremely Brilliant Source (EBS) – takes shape.
Synchrotron radiation, which spans a broad spectrum up to the hard X-ray region, is famously something particle physicists try to avoid, as it limits the energy of circular colliders. But the power of X-rays to elucidate the structure of matter had been known since the early 1900s, and it wasn’t long before this by-product of particle physics was put to use. Following the rise of storage rings during the 1960s, nascent synchrotron X-ray users carried out experiments “parasitically” by placing their apparatus in the path of synchrotron radiation from circular particle colliders. By the 1970s, dedicated second-generation sources started to be built, but it was not until the 1990s that third-generation light sources such as the ESRF became possible. Instead of producing X-rays only from the curved trajectories of electrons passing through a dipole, these facilities use long, zip-like arrays of alternating-polarity magnets called undulators to produce much brighter, more spectrally coherent X-ray beams, allowing users to probe matter at shorter spatial and temporal scales.
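How “hard” the radiation from a bending magnet is can be characterised by the critical photon energy, which splits the emitted power spectrum into two equal halves. A minimal sketch using the standard practical formula; the 25 m bending radius is an assumed round number for a multi-GeV ring, not an exact ESRF parameter:

```python
def critical_energy_kev(electron_energy_gev, bending_radius_m):
    """Critical photon energy of dipole synchrotron radiation,
    via the practical formula eps_c [keV] = 2.218 * E[GeV]**3 / rho[m]."""
    return 2.218 * electron_energy_gev ** 3 / bending_radius_m

# A 6 GeV beam on a 25 m bending radius radiates well into the hard X-ray range:
print(f"{critical_energy_kev(6.0, 25.0):.1f} keV")
```

The steep cubic dependence on beam energy is why high-energy rings such as the ESRF reach the hard X-ray region so comfortably.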
Key to increased X-ray brightness at a synchrotron is a parameter called the beam emittance, a measure of the transverse size and angular divergence of the electron beam as it circulates. Third-generation storage rings reduce the vertical emittance to the X-ray diffraction limit, thanks to a lattice design that decouples the vertical and horizontal motions of the electrons. Reducing the horizontal component is more challenging. One approach is to build a bigger storage ring (the 6.3 km tunnel at Fermilab that housed the Tevatron collider has been mooted as a possible site for an ultimate X-ray source, for example). However, that option isn’t always suitable, and the ESRF had to find a smarter solution.
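The diffraction limit mentioned here has a compact form: a beam is diffraction-limited once its emittance falls below λ/4π at the photon wavelength of interest. A quick illustration, assuming 10 keV photons (a typical hard X-ray energy):

```python
import math

H_C_KEV_NM = 1.23984  # hc, in keV*nm

def diffraction_limited_emittance_pm(photon_energy_kev):
    """Emittance (in pm*rad) below which the electron beam is
    diffraction-limited for photons of the given energy: eps = lambda/(4*pi)."""
    wavelength_nm = H_C_KEV_NM / photon_energy_kev
    return wavelength_nm * 1e3 / (4.0 * math.pi)  # convert nm -> pm

print(f"{diffraction_limited_emittance_pm(10.0):.1f} pm rad")
```

At 10 keV the target is around 10 pm·rad – roughly what third-generation rings already achieve vertically, but far below their horizontal emittances.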
The existing ESRF storage ring consists of 32 cells, each 27 m long and made up of sequences of magnets, vacuum chambers and position monitors. Technically, it is a double-bend achromat lattice: it relies on two dipoles per cell, and electrons of different energies are bent and focused in the same way (achromatically), resulting in highly collimated and stable beams.
The new EBS lattice has the same footprint as the previous machine and will leave the present beamline layout unchanged. It is based on a hybrid multi-bend achromat (HMBA) with seven, as opposed to two, bending magnets per cell, and optics that maximise the stable phase-space volume available for the electron beam, reducing the horizontal emittance. The result is a tighter packing of electrons, increasing the brightness and degree of coherence of the X-rays by two orders of magnitude. This gives the EBS beams laser-like properties approaching those of X-ray free-electron lasers (XFELs) such as the European XFEL (CERN Courier July/August 2017 p18), and will make EBS the first high-energy fourth-generation synchrotron light source.
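The two-orders-of-magnitude gain can be roughly decomposed. To first approximation, brightness scales inversely with the horizontal emittance (at fixed vertical emittance and photon flux); the emittance values below are publicly quoted design figures and should be treated as approximate:

```python
# Horizontal-emittance reduction at the ESRF upgrade (approximate,
# publicly quoted design values -- treat as illustrative):
eps_old_pm_rad = 4000.0  # previous ESRF ring, ~4 nm*rad
eps_new_pm_rad = 134.0   # EBS design target, ~134 pm*rad

reduction = eps_old_pm_rad / eps_new_pm_rad
print(f"horizontal emittance reduced by a factor of ~{reduction:.0f}")
# Combined with other optics improvements, this underlies the
# quoted ~100-fold increase in brightness and coherent flux.
```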
Inspired by SuperB
The EBS lattice uses concepts developed for the former SuperB project (an asymmetric electron–positron collider for flavour-physics studies), which, unusually for a collider, had an optimal horizontal emittance close to zero. The physicist behind the design, Pantaleo Raimondi, led the SuperB project at INFN Frascati and is also the originator of the crab-waist technique for maximising collider luminosity, currently being considered for almost all future high-energy colliders. He became director of accelerators at the ESRF in 2012 and set about the task of building a new kind of light source. An important moment in the evolution of the HMBA came at a workshop at CERN in 2011, when Raimondi realised the synergy with the multi-bend achromat design for an advanced synchrotron in Lund, Sweden, called MAX IV (CERN Courier September 2016 p39).
The challenges facing the ESRF’s accelerator physicists, engineers and technicians are huge. The new storage ring requires 1000 innovative magnets – nearly twice as many as in the previous ring. These have to be squeezed into the same space inside the accelerator tunnel, so each of them has to be more compact and generate magnetic fields up to three times stronger than existing models. The vacuum chambers also have had to be redesigned to fit the limited space in and around the magnets, while the mechanical tolerances of some components have shrunk to within a hundredth of a millimetre. Non-evaporable getter (NEG) technologies pioneered at CERN are to play an important role in overcoming these challenges. EBS will use permanent-magnet technology for its 128 dipoles, and around 90% of the existing ESRF infrastructure will also be reused. The reduced beam energy loss due to unwanted synchrotron radiation and the optimised magnet design have also resulted in a decrease of the power consumption of the synchrotron by about 30%.
Last year, engineers built an entire EBS cell, consisting of girders, magnets, vacuum chambers and other components. This allowed the team to confirm the engineering principles of the new arc and to start series production of the storage-ring components. The existing storage ring will be dismantled by April 2019, and EBS will be installed later in the year and commissioned by March 2020, with user service resuming by September that year.
Four brand new beamlines will be designed to take full advantage of the EBS, while others will be refurbished, and the project has a special development programme to address the optics that transform the raw X-ray beam into a form suited for experiments. The X-ray detectors themselves are also being developed to handle the increased flux, with one technology based on the MEDIPIX3RX chip developed by the Medipix3 collaboration, of which the ESRF and CERN are partners. Control systems, mechatronics and software are other key areas of work to ensure the EBS is ready when it enters user mode two years from now.
Lattice travels
The EBS lattice has inspired other major light sources around the world. In addition to MAX IV, which was inaugurated in 2016, several multi-bend achromat synchrotrons are planned or under construction, including Sirius in Brazil and upgrades at high-energy synchrotrons such as the Advanced Photon Source (APS) at Argonne National Laboratory in the US. Meanwhile, the Advanced Light Source (ALS) in California is moving towards a conceptual design report, and SPring-8 in Japan is pursuing an HMBA that will enter operation on a similar timescale. The long-term goal is to one day build an “ultimate” storage ring in which the horizontal emittance reaches the fundamental X-ray diffraction limit, as is the case today for the vertical emittance. At the same time, XFELs are starting to sprout up in the way storage rings did around the turn of the millennium.
Particle physics will continue to play a vital role in X-ray science by driving accelerators forward and sharing vacuum, detector and other technologies. In fact, when workers come to dismantle and rebuild the ESRF storage ring in the coming weeks and months, they will also be ripping out and remounting parts of the radio-frequency (RF) system of CERN’s Large Electron–Positron (LEP) collider, the forerunner to the LHC. LEP’s RF system was the starting point of the ESRF design and teams at both laboratories, located just 150 km apart, collaborated closely to turn third-generation light sources from an idea into reality.
This book provides a comprehensive overview of high-energy-density physics (HEDP), which concerns the dynamics of matter at extreme temperatures and densities. Such matter is present in stars, active galaxies and planetary interiors, while on Earth it is not found in normal conditions, but only in the explosion of nuclear weapons and in laboratories using high-powered lasers or pulsed-power machines.
After introducing, in the first three chapters, many fundamental physics concepts necessary to the understanding of the rest of the book, the author delves into the subject, covering many key aspects: gas dynamics, ionisation, the equation-of-state description, hydrodynamics, thermal energy transport, radiative transfer and electromagnetic wave–material interactions.
The author is an expert in radiation-hydrodynamics simulations and is known for developing the HYADES code, which is widely used within the HEDP community. This book can serve as a resource for research scientists and graduate students in physics and astrophysics.
Quantised Detector Networks (QDN) theory was invented to reduce the level of metaphysics in the application of quantum mechanics (QM), moving the focus from the system under observation to the observer and the measurement apparatuses. This approach is based on the consideration that “labstates”, i.e. the states of the system we use for observing, are the only things we can actually deal with, while we have no means to prove that the objects under study “exist” independently of observers or observations.
In this view, QM is not a theory describing objects per se, but a theory of entitlement, which means that it provides physicists with a set of rules defining what an observer is entitled to say in any particular context.
The book is organised in four parts: Basics, Applications, Prospects, and Appendices. The author first provides the formalism of QDN and then applies it to a number of experiments that show how it differs from the standard quantum formalism. In the third part, the prospects for future applications of QDN are discussed, as well as the possibility of constructing a generalised theory of observation. Finally, the appendices collect collateral material referred to at various places in the book.
The aim of the author is to push readers to look in a different way at the world they live in, to show them the cognitive traps caused by realism – i.e. the assumption that what we observe has an existence independent of our observation – and to alert them that various speculative concepts and theories discussed by some scientists do not actually have an empirical basis. In other words, they cannot be experimentally tested.
Enrico Fermi formulated his eponymous paradox during a casual lunchtime chat with colleagues in Los Alamos: the great physicist argued that, probabilistically, intelligent extraterrestrial lifeforms had time to develop countless times in the Milky Way, and even to travel across our galaxy multiple times; but if so, where are they?
The author of this book, Milan Cirkovic, claims that, with the wealth of scientific knowledge accumulated in the many decades since then, the paradox is now even more severe. Space travel is no longer speculative, and we know that planetary systems – including Earth-like planets – are common, that life on our planet started very early, and that our solar system is a relative latecomer on the cosmic scene; hence, we should expect many civilisations to have evolved way beyond our current stage. Given the huge numbers involved, Cirkovic remarks, the paradox would not even be completely solved by the discovery of another civilisation: we would still have to figure out where all the others are!
The Great Silence aims at an exhaustive review of the solutions proposed to this paradox in the literature (where “literature” is to be understood in the broadest sense, ranging from scholarly astrobiology papers to popular-science essays to science-fiction novels), following a rigorous taxonomic approach. Cirkovic’s taxonomy is built from the analysis of which philosophical assumptions create the paradox in the first place. Relaxing the assumptions of realism, Copernicanism, and gradualism leads, respectively, to the families of solutions that Cirkovic labels “solipsist”, “rare Earth”, and “neocatastrophic”. His fourth and most heterogeneous category of solutions, labelled “logistic”, arises from considering possible universal limitations of physical, economic or metabolic nature.
The book starts by setting a rigorous foundation for discussion, summarising the scientific knowledge and dissecting the philosophical assumptions. Cirkovic does not seem interested in captivating the reader from the start: the preface and the first three chapters are definitely scholarly in their intentions, and assume that the reader already knows a great deal about Fermi’s paradox. As a particularly egregious example, Kardashev’s speculative classification of civilisations, based on the scale of their energy consumption, plays a very important role in this book; one would have therefore expected a discussion about that, somewhere at the beginning. Instead, the interested reader has to resort to a footnote for a succinct definition of the three types of civilisation (Type I: exploiting planetary resources; Type II: using stellar system resources; Type III: using galactic resources).
However, after these introductory chapters, Cirkovic’s writing becomes very pleasant and engaging, and his reasoning unfolds clearly. Chapters four to seven are the core of the book, each of them devoted to the solutions allowed by negating one assumption. Every chapter starts with an analogy with a masterpiece in cinema or literature, followed by a rigorous philosophical definition. Then, the consequent solutions to Fermi’s paradox are reviewed and, finally, a résumé of take-home messages is provided.
This parade of solutions gives a strange feeling: each of them sounds either crazy, or incredibly unlikely, or insufficient to solve the paradox (at least in isolation). Still, once we accept Cirkovic’s premise that Fermi’s paradox means that some deeply rooted assumption cannot be valid, we are compelled to take seriously some outlandish hypotheses. The reader is invited to ponder, for example, how the solution to the paradox might depend on the politics of the Milky Way in the last few billion years: extraterrestrial civilisations may have all converged to a Paranoid Style in Galactic Politics, or we might unknowingly be under the jurisdiction of an Introvert Big Brother (Cirkovic has a talent for catchy titles). Some Great Old Ones might be temporarily asleep, or we (and any conceivable biological intelligence) might be limited in our evolution by some Galactic Stomach-Ache. A large class of very gloomy hypotheses assumes that all our predecessors were wiped out before reaching high Kardashev scores, and Cirkovic seems particularly fond of the idea of swarms of Deadly Probes that may still be roaming around, ready to point at us as soon as they notice our loudness – unless we reach the aforementioned state of galactic paranoia, which makes for a very nice synergy between two distinct solutions of the paradox.
The author not only classifies the proposed solutions, but also rates them by how fully they would solve this paradox. The concluding chapter elaborates on several philosophical challenges posed by Fermi’s paradox, in particular to Copernicanism, and on the link between it and the future of humanity.
Cirkovic is a vocal (and almost aggressive) critic of most of the SETI-related literature, claiming that it relies on excessive assumptions that strongly limit SETI searches. In his words, the failure of SETI so far has mostly occurred on philosophical and methodological levels. He quotes Kardashev in saying that extraterrestrial civilisations have not been found because they have not really been searched for. Hence Cirkovic’s insistence on a generalisation of targets and search methods.
An underlying theme in this book is the relevance of philosophy for the advancement of science, in particular when a science is in its infancy, as Cirkovic argues is the case for astrobiology. He draws an analogy with early 20th-century cosmology, including a similarity between Fermi’s and Olbers’ paradoxes (the latter being: how can the night sky be dark, if we are reachable by the light of an infinite number of stars in an infinitely old universe?).
I warmly recommend The Great Silence to any curious reader, in spite of its apparent disinterest in a broad readership. In it, Cirkovic makes a convincing case that Fermi’s paradox is a fabulously complex and rich intellectual problem.
In this book, Timothy Jorgensen, a professor of radiation medicine at Georgetown University in the US, recounts the story of the discovery of radioactivity and how mankind has been transformed by it, with the aim of sweeping away some of the mystery and misunderstanding that surrounds radiation.
The book is structured in three parts. The first is devoted to the discovery of ionising radiation in the late 19th century and its rapid application, notably in the field of medical imaging. The author establishes a vivid parallel with the discovery and exploitation of radio waves, the non-ionising counterpart of the higher-energy X-rays. A dynamic narrative, peppered with personal anecdotes from key actors, succeeds in transmitting the decisive scientific and societal impact of radiation and related discoveries. The interleaving of the history of the discovery with aspects of the lives of inspirational figures such as Ernest Rutherford and Enrico Fermi is certainly very relevant, attractive and illustrative.
In the second part, the author focuses on the impact of ionising radiation on human health, mostly through occupational exposure in different working sectors. A strong focus is on the case of the “radium girls” – female factory workers who were poisoned by radiation from painting watch dials with self-luminous paint. This section also depicts the progress in radiation-protection techniques and the challenges related to quantifying the effects of radiation and establishing limits for the exposure to it. The text succeeds in outlining the difficulties of linking physical quantities of radiation with its impact on human health.
The risk assessment related to radiation exposure and its impact on human health is further covered in the third part of the book. Here, Jorgensen aims to provide quantitative tools for the public to be able to evaluate the benefits and risks associated with radiation exposure. Despite his effort to offer a combination of complementary statistical approaches, readers are left with the impression that many aspects of the impact of radiation on human health are not fully understood. On the other hand, the large number of radiation-exposure cases in the Hiroshima and Nagasaki nuclear bombings, after which it was possible to correlate the absorbed dose with the location of the various victims at the time of the explosion, provides a scientifically valuable sample for studying both deterministic and stochastic effects of radiation on human health.
In part three, the book also digresses at length about the role of nuclear weapons in the US defence and geopolitical strategy. This topic seems somewhat misplaced with respect to the more technical and scientific content of the rest of the text. Moreover, it is highly US-centric, often neglecting the analogous role of such weapons in other countries.
It is noteworthy that the book does not cover radiation in space and its crucial impact on human spaceflight. Likewise, the discovery of cosmic radiation through Hess’s balloon experiments in 1911–1912 – an essential finding that complemented the already known radioactivity of elements on the Earth’s surface – is completely overlooked.
Despite the lack of space-radiation coverage and the somewhat unrelated US defence considerations, this book is definitely a very good read that will satisfy the reader’s curiosity and interest with respect to radiation and its impact on humans. In addition, it provides insight into the more general progress of physics, especially in the first half of the 20th century, in a highly dynamic and entertaining manner.
The 11th ROOT Users’ Workshop was held on 10–13 September in Sarajevo, Bosnia and Herzegovina, at the Academy of Science and Arts: an exceptional setting that also provided an opportunity to involve Bosnia and Herzegovina in CERN’s activities.
The SoFTware Development for Experiments group in the experimental physics department at CERN drives the development of ROOT, a modular software toolkit for processing, analysing and visualising scientific data. ROOT is also a means to read and write data: LHC experiments alone produced about 1 exabyte of data stored in the ROOT file format.
Thousands of high-energy physicists use ROOT daily to produce scientific results. For the ROOT team, this is a big responsibility, especially considering the challenges that Run 3 at the LHC and the High-Luminosity LHC (HL-LHC) pose to all of us. Luckily, we can rely on a lively user community, whose contributions are so valuable that a ROOT users’ workshop is organised periodically. The event’s objective is to gather the ROOT community of users and developers to collect criticism, praise and suggestions: a unique occasion to shape the future of the ROOT project.
More than 100 people attended this year’s workshop, a 30% increase from 2015, making the event a success. What’s more, the diversity of the attendees – students, analysis physicists, software experts and framework developers – brought different levels of expertise to the event. The workshop featured 69 contributions as well as engaging discussions. Software companies participated, with three invited contributions: Peter Müßig from SAP presented OpenUI5, the core of the SAP JavaScript framework that will be used for ROOT’s graphical user interface; Chandler Carruth from Google discussed ways to make large-scale software projects written in C++, the language for number-crunching code in high-energy and nuclear physics (HENP), simpler, faster and safer; and Sylvain Corlay from QuantStack showed novel ways to tackle numerical analysis with multidimensional array expressions. These speakers said they enjoyed the workshop and plan to come to CERN to extend the collaboration.
ROOT’s renovation was the workshop’s main theme. To stay at the bleeding edge of software technology, ROOT – which has been the cornerstone of virtually all HENP software stacks for two decades – is undergoing an intense upgrade of its key components. This effort represents an exciting time for physicists and software developers. At the event, ROOT users expressed their appreciation of the effort to make it easier to use and faster on modern computer architectures, with the sole objective of reducing the time interval between data delivery and the presentation of plots.
In particular, the spotlight was on the modernisation of the I/O subsystem, crucial for the future LHC physics programme; ROOT’s parallelisation, a prerequisite to face Run 3 and HL-LHC analyses; as well as on new graphics, multivariate tools and an interface to the Python language, which are all elements of prime importance for scientists’ everyday work.
The participants’ feedback was enthusiastic, the atmosphere was positive, and the criticism received was constructive and fruitful for the ROOT team. We thank the participating physicists and computer scientists: we appreciated your engagement and are looking forward to organising the next ROOT workshop.