
A new era for particle physics

Driven by technology, scale, geopolitical reform, and even surprising discoveries, the field of particle physics is changing dramatically. This evolution is welcomed by some and decried by others, but ultimately is crucial for the survival of the field. The next 50 years of particle physics will be shaped not only by the number, character and partnerships of labs around the world and their ability to deliver exciting science, but also by their capacity to innovate.

Increasingly, the science we wish to pursue as a field has demanded large, even mega-scale, facilities. CERN’s LHC is the largest, but the International Linear Collider (ILC) project and plans for a large circular collider in China are not far behind, and either could become the next mega-science facility. The main centres of particle physics around the world have changed significantly in the last several decades because of this trend. But a new picture is emerging, one in which co-ordination among, and participation in, these mega-efforts are required at an international level. A more co-ordinated global playing field is developing, but “in transition” best describes the situation at present.

Europe and North America have clear directions of travel. CERN has emerged as the leader of particle physics in Europe, while in the US only Fermilab is devoted entirely to particle physics. Other national labs in the US and Europe have diversified into other areas of science, and some have found a niche for particle physics while also connecting strongly to the research programmes at Fermilab and CERN. Indeed, in the US a network of laboratories has emerged involving SLAC, Brookhaven, Argonne and Berkeley. Once quite separate, these labs now contribute to the LHC and to the neutrino programme based at Fermilab. Furthermore, each lab is becoming specialised in certain science and technology areas, while their collective talents are being developed and protected by the Department of Energy. As a result, CERN and Fermilab are stronger partners than ever before. Likewise, several European labs are engaged, or planning to engage, in the ambitious neutrino programme hosted by Fermilab, for which CERN continues to be the largest partner.

Will this trend continue, with laboratories worldwide functioning more like a network, or will competition slow down the inevitable end game? There are at least a couple of wild cards in the deck at the moment. Chief issues for the future are how aggressively and at what scale will China enter the field with its Circular Electron Positron Collider and Super Proton–Proton Collider projects, and whether Japan will move forward to build the ILC. The two projects have obvious science overlap and some differences, and if either project moves forward, Europe and the US will want to be involved and the impact could be large.

The global political environment is also fast evolving, and science is not immune from fiscal cuts or other changes taking place. While it is difficult to predict the future, fiscal austerity is here to stay for the near term. The consequences may be dramatic, or may simply continue the trend of the last few decades of shrinking and consolidating our field. Rising above this trend will take focus, co-ordination and hard work. More than ever, the worldwide community needs to demonstrate the value of basic science to funding stakeholders and the public, and to propose compelling reasons why the pursuit of particle physics deserves its share of funding.

Top-quality, world-leading science should be the primary theme, but it is not enough. The importance of social and economic impact will loom ever larger, and the usual rhetoric about it taking 40 years to reap the benefits from basic research – even if true – will no longer suffice. Innovation, more specifically the role of science in fuelling economic growth, is the favourite word of many governments around the world. Particle physics needs to be engaged in this discussion and contribute its talent. Is this a laboratory effort, a network of laboratories effort, or a global effort? For the moment, innovation is local, sometimes national, but our field is used to thinking even bigger. The opportunity to lead in globalisation is on our doorstep once again.

Theory of Quantum Transport at Nanoscale: An Introduction

By Dmitry A Ryndyk
Springer


This book provides an introduction to the theory of quantum transport at the nanoscale – a rapidly developing field that studies charge, spin and heat transport in nanostructures and nanostructured materials. The theoretical models and methods collected in the volume are widely used in nano-, molecular- and bio-electronics, as well as in spin-dependent electronics (spintronics).

The book begins by introducing the basic concepts of quantum transport, including the Landauer–Büttiker method; the matrix Green function formalism for coherent transport; tunnelling (transfer) Hamiltonian and master equation methods for tunnelling; Coulomb blockade; and vibrons and polarons.

In the second part of the book, the author gives a general introduction to the non-equilibrium Green function theory, describing first the approach based on the equation-of-motion technique, and then a more sophisticated one based on the Dyson–Keldysh diagrammatic technique. The book focuses in particular on the theoretical methods able to describe the non-equilibrium (at finite voltage) electron transport through interacting nanosystems, specifically the correlation effects due to electron–electron and electron–vibron interactions.

The book would be useful both for master’s and PhD students and for researchers or professionals already working in the field of quantum transport theory and nanoscience.

Thermodynamics and Equations of State for Matter: From Ideal Gas to Quark–Gluon Plasma

By Vladimir Fortov
World Scientific


This monograph presents a comparative analysis of different thermodynamic models of the equation of state (EOS). The author aims to present in a unified way both the theoretical methods and experimental material relating to the field.

Particular attention is given to the description of extreme states reached at high pressure and temperature. As a substance advances along the scale of pressure and temperature, its composition, structure and properties undergo radical changes, from the ideal state of non-interacting neutral particles described by the classical statistical Boltzmann function to the exotic forms of baryon and quark–gluon matter.

Studying the EOS of matter under extreme conditions is important for the study of astrophysical objects at different stages of their evolution, as well as in plasma, condensed-matter and nuclear physics. It is also of great interest for the physics of high energy densities that are either already attained or could be reached in the near future under controlled terrestrial conditions.

Ultra-extreme astrophysical and nuclear-physical applications are also analysed. Here, the thermodynamics of matter is affected substantially by relativity, high-power gravitational and magnetic fields, thermal radiation, the transformation of nuclear particles, nucleon neutronisation, and quark deconfinement.

The book is intended for a wide range of specialists who study the EOS of matter and high-energy-density physics, as well as for senior students and postgraduates.

Big Data: Storage, Sharing, and Security

By Fei Hu (ed.)
CRC Press


Nowadays, enormous quantities of data in a variety of forms are generated rapidly in fields ranging from social networks to online shopping portals to physics laboratories. The field of “big data” involves all the tools and techniques that can store and analyse such data, whose volume, variety and speed of production are not manageable using traditional methods. As such, this new field requires us to face new challenges. These challenges and their possible solutions are the subject of this book of 17 chapters, which is clearly divided into two sections: data management and security.

Each chapter, written by different authors, describes the state of the art for a specific issue that the reader may face when implementing a big-data solution. Far from being a step-by-step manual, the book treats each topic theoretically and describes its practical uses. Every subject is very well referenced, pointing to many publications for readers to explore in more depth.

Given the diversity of topics addressed, it is difficult to give a detailed opinion on each of them, but some deserve particular mention. One is the comparison between different communication protocols, presented in depth and accompanied by many graphs that help the reader to understand the behaviour of these protocols under different circumstances. However, the black-and-white print makes it difficult to differentiate between the lines in these graphs. Another topic that is nicely introduced is the SP (simplicity and power) system, which applies innovative solutions to aspects such as data variety when dealing with huge volumes. Even though the majority of the topics in the book are clearly linked to big data, some of them relate to broader computing topics such as deep-web crawling or malware detection in Android environments.

Security in big-data environments is widely covered in the second section of the book, spanning cryptography, accountability and cloud computing. As the authors point out, privacy and security are key: solutions are proposed to successfully implement a reliable, safe and private platform. When managing such amounts of data, privacy needs to be carefully treated since delicate information could be extracted. The topic is addressed in several chapters from different points of view, from looking at outsourced data to accountability and integrity. Special attention is also given to cloud environments, since they are not as controlled as those “in house”. Cloud environments may require data to be securely transmitted, stored and analysed to avoid access by unauthorised sources. Proposed approaches to apply security include encryption, authorisation and authentication methods.

The book is a good introduction to many of the aspects that readers might face or want to improve in their big-data environment.

Challenges and Goals for Accelerators in the XXI Century

By Oliver Brüning and Stephen Myers (eds)
World Scientific

Also available at the CERN bookshop


This mighty 840-page book covers an impressive range of subjects divided into no fewer than 45 chapters. Owing to the expertise and international reputations of the authors of the individual chapters, few if any other books in this field have managed to summarise such a broad topic with such authority. The authors – a veritable “who’s who” of the accelerator world – are too numerous to list in the space provided, but the full list can be viewed at worldscientific.com/worldscibooks/10.1142/8635#t=toc.

The book opens with two chapters devoted to a captivating historical review of the Standard Model and a general introduction to accelerators, and closes with two special sections. The first of these is devoted to novel accelerator ideas: plasma accelerators, energy-recovery linacs, fixed-field alternating-gradient accelerators, and muon colliders. The last section describes European synchrotrons used for tumour therapy with carbon ions and covers, in particular, the Heidelberg Ion Therapy Centre designed by GSI and the CERN Proton Ion Medical Machine Study. The last chapter describes the transformation of the CERN LEIR synchrotron into an ion facility for radiobiological studies.

Concerning the main body of the book, 17 chapters look back over the past 100 years, beginning with a concise history of the first three lepton colliders: AdA in Frascati, VEP-1 in Novosibirsk and the Princeton–Stanford electron–electron collider. A leap in time then takes the reader to CERN’s Large Electron–Positron collider (LEP), which is followed by a description of the Stanford Linear Collider. Unfortunately, this latter chapter is too short to do full justice to such an innovative approach to electron–positron collisions.

The next section is devoted to beginnings, starting from the time of the Brookhaven Cosmotron and Berkeley Bevatron. The origin of alternating-gradient synchrotrons is well covered through a description of the Brookhaven AGS and the CERN Proton Synchrotron. The first two hadron colliders at CERN – the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS) proton–antiproton collider – are then discussed. The ISR’s breakthroughs were numerous, including the discovery of Schottky scans, the demonstration of stochastic cooling and absolute luminosity measurements by van der Meer scans. Even more remarkable was the harvest of the SPS proton–antiproton collider, culminating with the Nobel prize awarded to Carlo Rubbia and Simon van der Meer. The necessary Antiproton Accumulator and Collector are discussed in a separate chapter, which ends with an amusing recollection: “December 1982 saw the collider arriving at an integrated luminosity of 28 inverse nanobarns and Rubbia offering a ‘champagne-only’ party with 28 champagne bottles!” Antiproton production methods are covered in detail, including a description of the manoeuvres needed to manipulate antiproton bunches and of the production of cold antihydrogen atoms. This subject is continued in a later chapter dedicated to CERN’s new ELENA antiproton facility.

The Fermilab proton–antiproton collider started later than the SPS, but eventually led to the discovery of the top quark by the CDF and D0 collaborations. The Fermilab antiproton recycler and main ring are described, followed by a chapter dedicated to the Tevatron, which was the first superconducting collider. The first author remarks that, over the years, some 10¹⁶ antiprotons were accumulated at Fermilab, corresponding to about 17 nanograms and more than 90% of the world’s total man-made quantity of nuclear antimatter. This section of the book concludes with a description of the lepton–proton collider HERA at DESY, the GSI heavy-ion facility, and the rare-isotope facility REX at ISOLDE. Space is also given to the accelerator that was never built, the US Superconducting Super Collider (SSC), of which “the hopeful birth and painful death” is recounted.

The following 25 chapters are devoted to accelerators for the 21st century, with the section on “Accelerators for high-energy physics” centred on the Large Hadron Collider (LHC). In the main article, magisterially written, it is recalled that the 27 km length of the LEP tunnel was chosen having already in mind the installation of a proton–proton collider, and the first LHC workshop was organised as early as 1984. The following chapters are dedicated to ion–ion collisions at the LHC and to the upgrades of the main ring and the injector. The high-energy version of the LHC and the design of a future 100 km-circumference collider (with both electron–positron and proton–proton collision modes) are also covered, as well as the proposed TeV electron–proton collider LHeC. The overall picture is unique, complete and well balanced.

Other chapters discuss frontier accelerators: super B-factories, the BNL Relativistic Heavy Ion Collider (RHIC) and its electron–ion extension, linear electron–positron colliders, electron–positron circular colliders for Higgs studies and the European Spallation Source. Special accelerators for nuclear physics, such as the High Intensity and Energy ISOLDE at CERN and the FAIR project at GSI, are also discussed. Unfortunately, the book does not deal with synchrotron light sources, free-electron lasers and high-power proton drivers. However, the latter are discussed in connection with neutrino beams by covering the CERN Neutrinos to Gran Sasso project and neutrino factories.

The book is aimed at engineers and physicists who are already familiar with particle accelerators and may appreciate the technical choices and stories behind existing and future facilities. Many of its chapters could also be formative for young people thinking of joining one of the described projects. I am convinced that these readers will receive the book very positively.

LHC cold and preparing for beam

Following a longer than usual technical stop, which began in December last year to allow for the replacement of the CMS inner tracker, all eight sectors of the Large Hadron Collider (LHC) have been cooled to their operating temperature of 1.9 K. The machine is now being prepared for the return of proton beams in May.

During the extended year-end technical stop (EYETS), one full sector of the LHC – lying between the ATLAS and ALICE experiments – was warmed up to room temperature to replace one of its 15 m-long superconducting dipoles, which had exhibited abnormal behaviour on a few occasions during the 2016 physics run. The rest of the machine was maintained at 20 K. To make sure none of the LHC’s precious liquid-helium coolant would be lost during the EYETS interventions, the machine was emptied and its 130-tonne supply was temporarily stored on the surface.

Following the successful replacement and reconnection of the dipole magnet, pre-cooling of the sector to 80 K started on 17 February using 1200 tonnes of liquid nitrogen carried by 60 thermally insulated trucks, and was completed by 4 March. The re-filling of the arcs with liquid helium started about one week later, with electrical quality-assurance tests taking place at the end of the month. Powering tests took place during April, with the machine check-out scheduled for mid-April and the start of commissioning with beam during the first week of May.

Several other changes were made during the EYETS, not only to the LHC but also to the injectors. The PS Booster (PSB) underwent a massive de-cabling campaign, which has paved the way for the installation of new equipment for the LHC Injector Upgrade (LIU) project in the coming years. In response to a vacuum leak that developed in the SPS internal beam dump in April last year, a new internal beam dump was designed and constructed in record time and installed in the SPS. This will allow the SPS to reach its full performance again for the 2017 run, and in particular will lift the limit on the number of bunches for the LHC from 96 to 288 per SPS extraction.

With the LHC handed over from the engineering department to the operations group on 14 April, the second phase of LHC Run 2 will soon be under way, with first collisions expected around the end of May.

Crab cavities promise brighter collisions

As the Large Hadron Collider (LHC) gears up for its 2017 restart, teams in the background at CERN and around the world are making rapid progress towards a major LHC upgrade due to be operational from 2025. A significant milestone towards the High-Luminosity LHC (HL-LHC), which will boost the number of proton–proton collisions, was passed in late February with the first tests of a new accelerator structure called a crab cavity.

Increasing the luminosity of the LHC (a measure of its collision rate) requires new magnets located on either side of the LHC detectors that squeeze the incoming proton beams into tighter bunches and force them to cross one another at a steeper angle. These upgraded inner-triplet quadrupoles for the HL-LHC, prototypes of which have recently been completed at CERN and in the US, have larger apertures than the present LHC magnets and are based on more advanced niobium-tin superconducting technology (CERN Courier March 2017 p23).

Crab cavities are essential to fully exploit the inner-triplet upgrade, since they allow the crossing angle of the proton beams to be compensated so as to maximise their overlap at the collision points. Constructed from high-purity niobium sheets, the HL-LHC crab cavities will operate at 2 K to reach their nominal performance. Unlike the accelerating RF cavities currently used at the LHC, which accelerate protons in their direction of motion, the crab cavities give the bunches a time-dependent transverse kick in the plane perpendicular to their motion to improve “luminosity levelling”.
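As a rough illustration of why this matters (a textbook estimate, not a figure from the HL-LHC design), the luminosity lost to a crossing angle can be approximated, for Gaussian bunches, by the geometric reduction factor

R \simeq \frac{1}{\sqrt{1 + \left(\frac{\theta_c\,\sigma_z}{2\,\sigma_x}\right)^{2}}},

where θ_c is the full crossing angle, σ_z the bunch length and σ_x the transverse beam size at the collision point (symbols introduced here purely for illustration). By tilting the bunches so that they still meet effectively head-on, the crab cavities push R back towards unity even with the larger HL-LHC crossing angle.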

So far, two superconducting crab cavities have been manufactured at CERN and RF tests at 2 K performed in a superfluid helium bath. The first cavity tests demonstrated a maximum transverse kick voltage exceeding 5 MV, surpassing the nominal operational voltage of 3.4 MV. This kick voltage corresponds to extremely high electric and magnetic fields on the cavity surfaces: 57 MV/m and 104 mT, respectively. By the end of 2017, the two crab cavities will have been inserted into a specially designed cryomodule that will be installed in the Super Proton Synchrotron (SPS) to undergo validation tests with proton beams. This will be the first time that a crab cavity has ever been used for manipulating proton beams, and a total of 16 cavities (eight near ATLAS and eight near CMS) will be required for the HL-LHC project.

Sterile neutrinos in retreat

An experiment in Korea designed to search for light sterile neutrinos has published its first results, further constraining the possible properties of such a particle. Even though the number of light neutrinos cannot exceed three, it is still possible to have additional neutrinos if they are “sterile”. Such particles, which are right-handed singlets under the electromagnetic, strong and weak interactions, are predicted by extensions of the Standard Model and would reveal themselves by altering the rate of oscillation between the three standard neutrino flavours. An early hint for such a state came from observations of the mixing between electron and muon neutrinos by the LSND experiment, although more recent results from other experiments are so far inconclusive.

The NEOS detector is a Gd-loaded liquid scintillator located just 24 m from the core of the 2.8 GW Hanbit nuclear power plant in South Korea, which generates a high flux of antineutrinos. Based on precise measurements of an antineutrino energy spectrum over an eight-month period, the NEOS team found no evidence for oscillations involving sterile neutrinos. On the other hand, the team recorded a small excess of antineutrinos above an energy of around 5 MeV that is consistent with anomalies seen at longer-baseline neutrino experiments.

With no strong evidence for “3 + 1” neutrino oscillations, the new results set stringent upper limits on the θ₁₄ mixing angle (see figure) of sin²2θ₁₄ < 0.1 for Δm²₄₁ in the range 0.2–2.3 eV² at 90% confidence level. The results further improve the constraints on the LSND anomaly parameter space, say the team. With the NEOS experiment now completed, the team is discussing a further reactor-neutrino programme using commercial reactors to be built in the near future in Korea.
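For context, in the standard two-flavour approximation usually applied to such short-baseline searches (quoted here for orientation, not taken from the NEOS paper), the electron-antineutrino survival probability in a “3 + 1” scenario is

P(\bar{\nu}_e \to \bar{\nu}_e) \approx 1 - \sin^2 2\theta_{14}\,\sin^2\!\left(\frac{1.27\,\Delta m^2_{41}\,[\mathrm{eV^2}]\;L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]}\right),

so with L = 24 m and reactor antineutrino energies of a few MeV the experiment is most sensitive to Δm²₄₁ of order 1 eV², the region covered by the new limits.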

GBAR falls into place

On 1 March, the first component of a new CERN experiment called GBAR (Gravitational Behaviour of Antihydrogen at Rest) was installed: a 1.2 m-long linear accelerator that will be used to generate positrons. Located in the Antiproton Decelerator (AD) hall, GBAR is the first of five experiments that will be connected to the new ELENA deceleration ring and it is specifically designed to measure the effect of gravity on antihydrogen atoms. The experiment will use antiprotons supplied by ELENA and positrons created by the linac to produce antihydrogen ions, which will be slowed almost to a standstill using lasers and then allowed to fall under gravity over a vertical distance of 20 cm.
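For a sense of the timescales involved (a back-of-the-envelope estimate assuming antihydrogen falls with the standard acceleration g, which is precisely the assumption GBAR is designed to test), the free-fall time over that distance would be

t = \sqrt{\frac{2h}{g}} = \sqrt{\frac{2 \times 0.20\ \mathrm{m}}{9.81\ \mathrm{m\,s^{-2}}}} \approx 0.2\ \mathrm{s},

long enough, for atoms prepared nearly at rest, to time the fall and compare the measured acceleration of antimatter with that of ordinary matter.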

Although antimatter is not expected to fall “up”, detecting even the tiniest difference between the rate at which matter and antimatter fall would have profound implications for fundamental laws such as Einstein’s equivalence principle. Two further experiments based at the AD, AEGIS and ALPHA, are also studying the effect of gravity on antimatter. First results on the production of antihydrogen ions are expected next year, with gravity studies following later.

ATLAS pushes SUSY beyond 2 TeV

The ATLAS experiment has released several new results in its search for supersymmetry (SUSY) using the full 13 TeV LHC data set from 2015 and 2016, achieving sensitivity to certain new particles with masses exceeding 2 TeV.


SUSY is one of the most studied extensions of the Standard Model (SM) and, if realised in nature, it would introduce partners for all the SM particles. Under the assumption of R-parity conservation, SUSY particles would be pair-produced and the lightest SUSY particle (LSP) would be stable. The strongly produced partners of the gluon and quarks, the gluino and squarks, would decay to final states containing energetic jets, possibly leptons, and two LSPs. If the LSP is only weakly interacting, which would make it a dark-matter candidate, it would escape the detector unseen, resulting in a signature with missing transverse momentum.
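For readers less familiar with the observable, missing transverse momentum is conventionally defined at a collider as the negative vector sum of the transverse momenta of all reconstructed objects (a standard definition, given here for orientation rather than quoted from the analyses):

\vec{p}_T^{\,\mathrm{miss}} = -\sum_i \vec{p}_{T,i}, \qquad E_T^{\mathrm{miss}} = \left|\vec{p}_T^{\,\mathrm{miss}}\right|.

A pair of undetected LSPs leaves this sum unbalanced, producing the large missing transverse momentum that the searches described below exploit.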

A recent ATLAS analysis [1] searches for this signature, while a second [2] targets models where each gluino decays via the partner of the top quark (the “stop”), producing events with many jets originating from b quarks (b jets). Both analyses find consistency with SM expectations, excluding gluinos and first- and second-generation squarks at 95% confidence level up to masses of 2 TeV (see figure). Pair-produced stops could decay to final states containing up to six jets, including two b jets, or through the emission of a Higgs or Z boson. Two dedicated ATLAS searches [3, 4] find no evidence for these processes, excluding stop masses up to 950 GeV.

SUSY might alternatively be manifested in more complicated ways. R-parity-violating (RPV) SUSY features an LSP that can decay, and hence evade searches based on missing transverse momentum. Moreover, SUSY particles could be long-lived or metastable, leading to unconventional detector signatures. Two dedicated searches [5, 6] for the production of gluino pairs and stop pairs decaying via RPV couplings have recently been performed by ATLAS, both looking for final states with multiple jets but little missing transverse momentum. In the absence of deviations from background predictions, strong exclusion limits are extracted that complement those of R-parity-conserving scenarios.

The production of metastable SUSY particles could give rise to decay vertices that are measurably displaced from the proton–proton collision point. An ATLAS search [7] based on a dedicated tracking and vertexing algorithm has now ruled out large regions of the parameter space of such models. A second search [8] exploited the new layer of the ATLAS pixel tracking detector to identify short track segments produced by particles decaying close to the LHC beam pipe, yielding sensitivity to non-prompt decays of SUSY charginos with lifetimes of the order of a nanosecond. The result constrains an important class of SUSY models where the dark-matter candidate is the partner of the W boson.

The ATLAS SUSY search programme with the new data set is in full swing, with many more signatures being investigated to close in on models of electroweak-scale supersymmetry.
