
Facing up to the exabyte era

The high-luminosity Large Hadron Collider (HL-LHC) will dramatically increase the rate of particle collisions compared with today’s machine, boosting the potential for discoveries. In addition to extensive work on CERN’s accelerator complex and the LHC detectors, this second phase in the LHC’s life will generate unprecedented data challenges.

The increased rate of collisions makes the task of reconstructing events (piecing together the underlying collisions from millions of electrical signals read out by the LHC detectors) significantly more complex. At the same time, the LHC experiments are planning to employ more flexible trigger systems that can collect a greater number of events. These factors will drive a huge increase in computing needs for the start of the HL-LHC era in around 2026. Using current software, hardware and analysis techniques, the required computing capacity is roughly 50–100 times higher than today, with data storage alone expected to enter the exabyte (10^18 bytes) regime.

It is reasonable to expect that technology improvements over the next seven to 10 years will yield an improvement of around a factor 10 in both processing and storage capabilities for no extra cost. While this will go some way to address the HL-LHC’s requirements, it will still leave a significant deficit. With budgets unlikely to increase, it will not be possible to solve the problem by simply increasing the total computing resources available. It is therefore vital to explore new technologies and methodologies in conjunction with the world’s leading information and communication technology (ICT) companies.
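As a back-of-the-envelope illustration, the shortfall can be estimated from the figures quoted above alone (the tidy round numbers are ours, for illustration only):

```python
# Illustrative estimate of the HL-LHC computing gap, using only the
# round figures quoted in the text: capacity needs grow 50-100x, while
# technology trends deliver roughly 10x at flat cost.

required_growth = (50, 100)   # factor by which capacity must grow by ~2026
technology_gain = 10          # expected improvement at constant cost

# Remaining shortfall once technology improvements are accounted for
deficit = tuple(r / technology_gain for r in required_growth)
print(deficit)  # (5.0, 10.0): a further factor 5-10 must come from R&D
```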

CERN openlab, which was established by the CERN IT department in 2001, is a public–private partnership that enables CERN to collaborate with ICT companies to meet the demands of particle-physics research. Since the start of this year, CERN openlab has carried out an in-depth consultation to identify the main ICT challenges faced by the LHC research community over the coming years. Based on our findings, we published a white paper in September on future ICT challenges in scientific research.

The paper identifies 16 ICT challenge areas that need to be tackled in collaboration with industry, and these have been grouped into four overarching R&D topics. The first focuses on data-centre technologies to ensure that: data-centre architectures are flexible and cost effective; cloud-computing resources can be used in a scalable, hybrid manner; new technologies for solving storage-capacity issues are thoroughly investigated; and long-term data-storage systems are reliable and economically viable. The second major R&D topic relates to the modernisation of code, so that the maximum performance can be achieved on the new hardware platforms available. The third R&D topic focuses on machine learning, in particular its potentially large role in monitoring the accelerator chain and optimising the use of ICT resources.

The fourth R&D topic in the white paper identifies ICT challenges that are common across research disciplines. With ever more research fields such as astrophysics and biomedicine adopting big-data methodologies, it is vital that we share tools and learn from one another – in particular to ensure that leading ICT companies are producing solutions that meet our common needs.

In summary, CERN openlab has identified ICT challenges that must be tackled over the coming years to ensure that physicists worldwide can get the most from CERN’s infrastructure and experiments. In addition, the white paper demonstrates the emergence of new technology paradigms, from pervasive ultra-fast networks of smart sensors in the “internet of things”, to machine learning and “smart everything” paradigms. These technologies could revolutionise the way big science is done, particularly in terms of data analysis and the control of complex systems, and also have enormous potential for the benefit of wider society. CERN openlab, with its unique collaboration with several of the world’s leading IT companies, is ideally positioned to help make this a reality.

• openlab.cern.

Health Physics: Radiation-Generation Devices, Characteristics, and Hazards

By Joseph John Bevelacqua
Wiley-VCH


When developing technologies that involve the use of nuclear material or ionising radiation, a number of safety issues and potential risks have to be addressed. The author of this book, a certified health physicist and an expert in radiation protection, discusses these emerging topics related to radiation-generating technologies and the associated hazards.

The book opens with a brief overview of modern radiation-protection challenges, before delving into specific areas. First, the author discusses the nuclear-fuel cycle, analysing its steps and related issues such as reactors, new technologies for uranium enrichment and waste disposal. In the following section, he deals with nuclear accidents and radiological emergencies – making specific reference to the well-known disasters of Three Mile Island, Chernobyl and Fukushima Daiichi – and with the risk of terrorist events involving sabotage or the use of improvised nuclear weapons and devices.

Today, nuclear material is also widely employed in medical imaging and therapy, so part of the book is devoted to these technologies and to the consequent increase in public radiation exposure. Finally, the last section focuses on regulatory issues, limitations and challenges.

Meant for upper-level undergraduate and graduate students of health-physics and engineering courses, the book would also be a useful reference for scientists and professionals working in radiation protection, fuel-cycle technology and nuclear medicine. More than 300 problems with solutions accompany the text and many appendices provide background information.

Neutrino Astronomy: Current Status, Future Prospects

By T Gaisser and A Karle (eds)
World Scientific


This review volume is motivated by the 2014 observation of a high-energy neutrino flux of extraterrestrial origin by the IceCube experiment at the South Pole. The energy of the events recorded ranges from 30 to 2000 TeV, with the latter marking the highest-energy neutrino interaction ever observed. The study of neutrinos originating from violent astrophysical sources enhances our knowledge not only of cosmological phenomena but also of neutrinos themselves.

This book gives an overview of the current status of research in the field and of existing and future neutrino observatories. The first group of chapters presents the physics of potential sources of high-energy neutrinos, including gamma-ray bursts, active galactic nuclei, star-forming galaxies and sources in the Milky Way. A chapter is then dedicated to the measurements performed by IceCube, the results of which are discussed in terms of energy spectrum, flavour ratio and arrival-direction isotropy. Following this, the results of two underwater neutrino experiments, ANTARES and Baikal, are presented.

After a brief discussion of other research topics in which the study of high-energy astrophysical neutrinos can play an important role, such as the quest for dark matter, the book examines the next generation of cosmic neutrino detectors. In particular, the future KM3NeT experiment, which will consist of a network of underwater telescopes located in the Mediterranean Sea, and IceCube-Gen2, characterised by unprecedented sensitivity and higher angular resolution compared to IceCube, are described.

Finally, a review of present and in-planning experiments aiming at detecting radio emissions from high-energy neutrino interactions concludes the volume.

An Introduction to Gauge Theories

By N Cabibbo, L Maiani and O Benhar
CRC Press


There is always great excitement among the academic community when a new book by renowned scientists is published. Written by leading experts in particle physics, this book by Luciano Maiani and Omar Benhar, with contributions from the late Nicola Cabibbo, does not disappoint in this regard. Former CERN Director-General Maiani co-proposed the GIM mechanism, which suppresses flavour-changing neutral currents at tree level and postulated the existence of a fourth quark, discovered in 1974 at SLAC and BNL. Cabibbo proposed a solution to the puzzle of electroweak decays of strange particles, which was later extended to give rise to the Cabibbo–Kobayashi–Maskawa mixing matrix. Omar Benhar, an INFN research director and professor at the University of Rome “La Sapienza”, is an expert in the theory of many-particle systems, the structure of compact stars and the electroweak interactions of nuclei.

Their book is the third volume of a series dedicated to relativistic quantum mechanics, gauge theories and electroweak interactions, based on material taught to graduate students at the University of Rome over a period of several decades. Given that gauge theories are the basis of interactions between elementary particles, it is not surprising that there are many books about gauge theories already out there – among the best are those written by Paul Frampton, I J R Aitchison and Anthony Hey, Chris Quigg, Ta-Pei Cheng and Ling-Fong Li. One might therefore think that it is hard to add something new to the field, but this book introduces the reader in a concise and elegant manner to a modern account of the fundamentals of renormalisation in quantum field theories and to the concepts underlying gauge theories.

Containing more than 300 pages organised in 20 chapters and several appendices, the book focuses mainly on quantum electrodynamics (QED), which – despite its simplicity and limitations – serves as the template for a gauge theory while possessing high predictive power and numerous applications. The first part of this treatise deals with the quantisation of QED via the path-integral method, from basic to advanced concepts, followed by a brief discussion of the renormalisation of QED and some of its applications, such as bremsstrahlung, the Lamb shift and the electron anomalous magnetic moment. The prediction of the latter is considered one of the great achievements of QED.

In the second part of the book, the authors cover the renormalisation group equations of QED and introduce the quantisation of non-Abelian gauge theories, finishing with a proof of the asymptotic freedom of quantum chromodynamics. Afterwards, the concept of the running coupling constant is used to introduce a few ideas about grand unification. The final chapters are devoted to concepts related to the Standard Model of particle physics, such as the Higgs mechanism and the electroweak corrections to the muon anomalous magnetic moment. Finally, a few useful formulas and calculations are provided in several appendices.

Throughout the book the authors not only present the mathematical framework and cover basic and advanced concepts of the field, but also introduce several physical applications. The most recent discoveries in the field of particle physics are discussed. This is a book targeted at advanced students accustomed to mental challenges. A minor flaw is the lack of problems at the end of the chapters, which would offer students the possibility to apply the acquired knowledge, although the authors do encourage readers to complete a few demonstrations. This text will be very helpful for students and teachers interested in a treatment of the fundamentals of gauge theories via a concise and modern approach in the constantly changing world of particle physics.

Centennial of General Relativity: A Celebration

By César Augusto Zen Vasconcellos (ed.)
World Scientific


In 1915 Albert Einstein presented to the Royal Prussian Academy of Sciences his theory of general relativity (GR), which represented a breakthrough in modern physics and became the foundation of our understanding of the universe at large. A century later, this elegant theory is still the basis of the current description of gravitation and a number of predictions derived from it have been confirmed in observations and experiments – most recently with the direct detection of gravitational waves.

This book celebrates the centenary of GR with a collection of 11 essays by different experts, which offer an overview of the theory and its numerous astrophysical and cosmological implications. After an introduction to GR, the Tolman–Oppenheimer–Volkoff equations describing the structure of relativistic compact stars are derived and their extension to deformed compact stellar objects presented. The book then moves to the so-called pc-GR theory, in which GR is algebraically extended to pseudo-complex co-ordinates in an attempt to get around singularities. Other topics covered are strange matter, in particular a conjecture that pulsar-like compact stars may be made of a condensed three-flavour quark state, and the use of a particular solution of the GR equations to construct multiple non-spherical cosmic structures.

Keeping the book contemporary, it also gives an overview of the most recent experimental results in particle physics and cosmology. Several contributions are devoted to the search for physics beyond the Standard Model at CERN, studies of cosmic objects and phenomena through gamma-ray lenses and, finally, to the recent detection of gravitational waves by the LIGO experiment.

Strangeness in quark matter

The 17th edition of the International Conference on Strangeness in Quark Matter (SQM 2017) was held from 10 to 15 July at Utrecht University in the Netherlands. The SQM series focuses on new experimental and theoretical developments on the role of strangeness and heavy-flavour production in heavy-ion collisions, and in astrophysical phenomena related to strangeness. This year’s SQM event attracted more than 210 participants from 25 countries, 20% of whom were women. A two-day graduate school on the role of strangeness in heavy-ion collisions, with 40 participants, preceded the conference.

The scientific programme consisted of 53 invited plenary talks, 70 contributed parallel talks and a poster session. Three discussion sessions provided scope for the necessary debates on crucial observables for characterising strongly interacting matter at extreme conditions of high baryon density and high temperature, and for defining possible future directions. One of the discussions centred on the production of hadron resonances and their interactions in the partonic and hadronic phases, which provide evidence for an extended hadronic lifetime even in small collision systems and might affect other QGP observables. Moreover, astrophysical consequences for SQM following the recent detection of gravitational waves were outlined: gravitational waves from relativistic neutron-star collisions can serve as cosmic messengers for the phase structure and equation of state of dense and strange matter, quite similar to the environment created in relativistic heavy-ion collisions.

Representatives from all major collaborations at CERN’s Large Hadron Collider and Super Proton Synchrotron, Brookhaven’s Relativistic Heavy Ion Collider (RHIC), and the Heavy Ion Synchrotron SIS at the GSI Helmholtz Centre in Germany made special efforts to release new data at this conference. Thanks to the excellent performance of these accelerator facilities, a wealth of new data on the production of strangeness and heavy quarks in nuclear collisions have become available.

Physics at the festival frontier

Following a successful debut in 2016 that witnessed 4000 people happily packing themselves into a tent over three days in Charlton Park in the UK, CERN returned to the WOMAD festival this year with the Physics Pavilion. Featuring talks on theremins and electric guitars, the physics of the NA62 experiment, and the origins of creativity, the Pavilion was once again packed during a year with record attendance at the festival. Those who craved something more hands-on were not disappointed: The Lab, one of two new additions requested by the festival’s management team, allowed participants to build their own cloud chamber, particle collision event, and more. CERN virtual-reality headsets created a queue at a third location called Outside at The Lab, where people eagerly explored the CMS experiment, and Devoxx4Kids allowed children to understand physics by modelling catapults in Minecraft code – perhaps inspiring the next generation of physics model-builders.

The team behind the Physics Pavilion includes physicists from CERN, the UK Science and Technology Facilities Council, the Institute of Physics, and Lancaster University. The enthusiasm for science was clear to see, with proud parents asking for careers advice for their young Einsteins and Curies. “Can we visit?”, “What if there are things smaller than quarks?”, “What are you looking for now?” and “What can we use the Higgs boson for?” were just some of the tough questions asked by hungry minds at a festival that celebrates diversity and culture. With events like this, CERN and its partners are reaching out to a wide range of people, sharing knowledge and excitement, and showing that science is part of our common cultural heritage.

Construction of protoDUNE detector begins

The Deep Underground Neutrino Experiment (DUNE) in the US, the cavern for which entered construction this summer, will make precision studies of neutrinos produced 1300 km away at Fermilab as part of the international Long-Baseline Neutrino Facility. The DUNE far detector will be the largest liquid-argon (LAr) neutrino detector ever built, comprising four cryostats holding 68,000 tonnes of liquid, and prototype detectors called protoDUNE are being built at CERN.

Each protoDUNE detector comprises a 10 × 10 × 10 m LAr time projection chamber with a single-phase (SP) or dual-phase (DP) configuration, containing about 800 tonnes of LAr. While the two big cryostats housing the detectors are about to be completed, the construction of the protoDUNE detectors themselves has just started. The first of six anode-plane-assembly modules for the protoDUNE-SP detector, which will detect electrons produced by ionising particles passing through the detector (pictured), recently arrived at CERN. The module will be tested, together with its electronics, and then installed in its final position inside the cryostat.

In parallel with the anode-plane-assembly, other parts of the protoDUNE-SP detector are being assembled at CERN, including the field cage, which keeps the electric field uniform inside the volume of the detector. Around a quarter of the 28 field-cage modules have already been assembled and are stored in CERN’s EHN1 hall, ready to be installed. The assembly and installation of the detector parts is expected to be completed by spring next year, in order for protoDUNE-SP to take data in autumn 2018.

The protoDUNE detectors are among several major activities taking place at the CERN neutrino platform, which was initiated in 2013 to develop detector technology for neutrino experiments in the US and Japan.

Study links solar activity to exotic dark matter

The origin of solar flares, powerful bursts of radiation appearing as sudden flashes of light, has puzzled astrophysicists for more than a century. The temperature of the Sun’s corona, which is several hundred times hotter than its surface, is also a long-standing enigma.

A new study suggests that the solution to these solar mysteries is linked to a local action of dark matter (DM). If true, it would challenge the traditional picture of DM as being made of weakly interacting massive particles (WIMPs) or axions, and suggest that DM is not uniformly distributed in space, as is traditionally thought.

The study is not based on new experimental data. Rather, lead author Sergio Bertolucci, a former CERN research director, and collaborators base their conclusions on freely available data recorded over a period of decades by geosynchronous satellites. The paper presents a statistical analysis of the occurrences of around 6500 solar flares in the period 1976–2015 and of the continuous solar emission in the extreme ultraviolet (EUV) in the period 1999–2015. The temporal distribution of these phenomena, finds the team, is correlated with the positions of the Earth and two of its neighbouring planets: Mercury and Venus. Statistically significant (above 5σ) excesses of the number of flares with respect to randomly distributed occurrences are observed when one or more of the three planets find themselves in a slice of the ecliptic plane with heliocentric longitudes of 230°–300°. Similar excesses are observed in the same range of longitudes when the solar irradiance in the EUV region is plotted as a function of the positions of the planets.
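The kind of excess test described above can be sketched in a few lines. The flare longitudes below are synthetic stand-ins (the real analysis uses some 6500 catalogued flares), and the simple binomial z-score is our own illustration, not the authors' exact method:

```python
import math
import random

def excess_significance(longitudes_deg, lo=230.0, hi=300.0):
    """Naive binomial z-score for an excess of events whose heliocentric
    longitude falls in [lo, hi), against a uniform-in-longitude null."""
    n = len(longitudes_deg)
    k = sum(lo <= x % 360.0 < hi for x in longitudes_deg)
    p = (hi - lo) / 360.0            # null probability of landing in the slice
    mean = n * p
    sigma = math.sqrt(n * p * (1.0 - p))
    return (k - mean) / sigma

# Synthetic stand-in: uniform longitudes plus an injected excess in the slice
random.seed(1)
sample = [random.uniform(0, 360) for _ in range(6000)] + \
         [random.uniform(230, 300) for _ in range(500)]
print(f"excess significance: {excess_significance(sample):.1f} sigma")
```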

These results suggest that active-Sun phenomena are not randomly distributed, but instead are modulated by the positions of the Earth, Venus and Mercury. One possible explanation, says the team, is the existence of a stream of massive DM particles with a preferred direction, coplanar with the ecliptic plane, that is gravitationally focused by the planets towards the Sun when one or more of the planets enter the stream. Such particles would need to have a wide velocity spectrum centred around 300 km s^–1 and interact with ordinary matter much more strongly than typical DM candidates such as WIMPs. The non-relativistic velocities of such DM candidates make planetary gravitational lensing more efficient and can enhance the flux of the particles by up to a factor of 10^6, according to the team.

Co-author Konstantin Zioutas, spokesperson for the CAST experiment at CERN, accepts that this interpretation of the solar and planetary data is speculative – particularly regarding the mechanism by which a temporarily increased influx of DM actually triggers solar activity. However, he says, the long-persisting failure to detect the ubiquitous DM might be due to the widely assumed small cross-section of its constituents with ordinary matter, or to erroneous DM modelling. “Hence, the so-far-adopted direct-detection concepts can lead us towards a dead end, and we might find that we have overlooked a continuous communication between the dark and the visible sector.”

Models of massive DM streaming particles that interact strongly with normal matter are few and far between, although the authors suggest that “antiquark nuggets” are best suited to explain their results. “In a few words, there is a large ‘hidden’ energy in the form of the nuggets,” says Ariel Zhitnitsky, who first proposed the quark-nugget dark-matter model in 2003. “In my model, this energy can be precisely released in the form of the EUV radiation when the anti-nuggets enter the solar corona and get easily annihilated by the light elements present in such a highly ionised environment.”

The study calls for further investigation, say the researchers. “It seems that the statistical analysis of the paper is accurate and the obtained results are rather intriguing,” says Rita Bernabei, spokesperson of the DAMA experiment, which in 1998 was the first to claim detection of dark matter in the form of WIMPs, on the basis of an observed seasonal modulation of a signal in its scintillation detector. “However, the paper appears to be mostly hypothetical in terms of this new type of dark matter.”

The team now plans to produce a full simulation of planetary lensing taking into account the simultaneous effect of all the planets in the solar system, and to extend the analysis to include sunspots, nano-flares and other solar observables. CAST, the axion solar telescope at CERN, will also dedicate a special data-taking period to the search for streaming DM axions.

“If true, our findings will provide a totally different view about dark matter, with far-reaching implications in particle and astroparticle physics,” says Zioutas. “Perhaps the demystification of the Sun could lead to a dark-matter solution also.”

Miniature detector first to spot coherent neutrino–nucleus scattering

The COHERENT collaboration at Oak Ridge National Laboratory (ORNL) in the US has detected coherent elastic scattering of neutrinos off nuclei for the first time. The ability to harness this process, predicted 43 years ago, offers new ways to study neutrino properties and could drastically reduce the scale of neutrino detectors.

Neutrinos famously interact very weakly, requiring very large volumes of active material to detect their presence. Typically, neutrinos interact with individual protons or neutrons inside a nucleus, but coherent elastic neutrino–nucleus scattering (CEνNS) occurs when a neutrino interacts with an entire nucleus. For this to happen, the momentum transferred must be small enough that its associated wavelength is large compared to the size of the nucleus. This restricts the process to neutrino energies below a few tens of MeV, in contrast to the charged-current interactions by which neutrinos are usually detected. The signature of CEνNS is a low-energy nuclear recoil with all nucleon wavefunctions remaining in phase, but until now the difficulty of detecting these low-energy nuclear recoils has prevented observations of CEνNS – despite the predicted cross-section for this process being the largest of all low-energy neutrino couplings.
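The coherence condition can be made quantitative with a rough estimate of our own: coherence requires the momentum transfer q to satisfy qR ≲ 1 (in natural units), with the empirical nuclear radius R ≈ 1.2 A^(1/3) fm. The numbers below are illustrative, not figures from the COHERENT paper:

```python
HBAR_C = 197.327  # MeV fm, conversion constant in natural units

def coherence_scale_mev(mass_number):
    """Rough momentum-transfer scale q ~ hbar*c / R below which a
    neutrino scatters coherently off the whole nucleus (q*R <~ 1)."""
    radius_fm = 1.2 * mass_number ** (1.0 / 3.0)   # empirical nuclear radius
    return HBAR_C / radius_fm

# For caesium (A = 133) the scale comes out at a few tens of MeV,
# consistent with the energy range quoted in the text.
print(f"coherence scale for Cs: {coherence_scale_mev(133):.0f} MeV")
```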

The COHERENT team, comprising 80 researchers from 19 institutions, used ORNL’s Spallation Neutron Source (SNS), which generates the most intense pulsed neutron beams in the world while simultaneously creating a significant yield of low-energy neutrinos. Approximately 5 × 10^20 protons are delivered per day, each returning roughly 0.08 isotropically emitted neutrinos per flavour. The researchers placed a detector, a caesium-iodide scintillator crystal doped with sodium, 20 m from the neutrino source with shielding to reduce background events associated with the neutron-induced nuclear recoils produced from the SNS. The results favour the presence of CEνNS over its absence at the 6.7σ level, with 134±22 events observed versus 173±48 predicted.
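The quoted beam figures imply a substantial neutrino flux at the detector. A rough estimate of ours, assuming isotropic emission over three flavours as stated above:

```python
import math

# Figures quoted in the text
protons_per_day = 5e20        # protons delivered by the SNS per day
nu_per_proton_flavour = 0.08  # neutrinos emitted per proton, per flavour
flavours = 3
distance_cm = 20.0 * 100.0    # detector sits 20 m from the source

# Isotropic emission: flux at the detector = yield / (4*pi*r^2)
nu_per_day = protons_per_day * nu_per_proton_flavour * flavours
flux = nu_per_day / (4.0 * math.pi * distance_cm ** 2)
print(f"{flux:.1e} neutrinos per cm^2 per day")  # of order 10^12
```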

Crucially, the result was achieved using the world’s smallest neutrino detector, with a mass of 14.5 kg. This is a consequence of the large nuclear mass of caesium and iodine, which results in a large CEνNS cross-section.

The intense scintillation of this material for low-energy nuclear recoils, combined with the large neutrino flux of the SNS, also contributed to the success of the measurement. In effect, CEνNS allows the same detection rates as conventional neutrino detectors that are 100 times more massive.

“It is a nearly ideal detector choice for coherent neutrino scattering,” says lead designer Juan Collar of the University of Chicago. “However, other new coherent neutrino-detector designs are appearing over the horizon that look extraordinarily promising in order to further reduce detector mass, truly realising technological applications such as reactor monitoring.”

Yoshi Uchida of Imperial College London, who was not involved in the study, says that detecting neutrinos via the neutral-current process as opposed to the usual charged-current process is a great advantage because it is “blind” to the type of neutrino being produced and is sensitive at low energies. “So in combination with other types of detection, it could tell us a lot about a particular neutrino source of interest.” However, he adds that the SNS set-up is very specific and that, outside such ideal conditions, it might be difficult to scale a similar detector in a way that would be of practical use. “The fact that the COHERENT collaboration already has several other target nuclei (and detection methods) being used in their set-up means there will be more to come on this subject in the near future.”
