Injecting new life into the LHC

The Large Hadron Collider (LHC) is the most famous and powerful of all CERN’s machines, colliding intense beams of protons at an energy of 13 TeV. But its success relies on a series of smaller machines in CERN’s accelerator complex that serve it. The LHC’s proton injectors have already been providing beams with characteristics exceeding the LHC’s design specifications. This decisively contributed to the excellent performance of the 2010–2013 LHC physics operation and, since 2015, has allowed CERN to push the machine beyond its nominal beam performance.

Built between 1959 and 1976, the CERN injector complex accelerates proton beams to a kinetic energy of 450 GeV. It does this via a succession of accelerators: a linear accelerator called Linac 2 followed by three synchrotrons – the Proton Synchrotron Booster (PSB), the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS). The complex also provides the LHC with ion beams, which are first accelerated through a linear accelerator called Linac 3 and the Low Energy Ion Ring (LEIR) synchrotron before being injected into the PS and the SPS. The CERN injectors, besides providing beams to the LHC, also serve a large number of fixed-target experiments at CERN – including the ISOLDE radioactive-beam facility and many others.

Part of the LHC’s success lies in the flexibility of the injectors to produce various beam parameters, such as the intensity, the spacing between proton bunches and the total number of bunches in a bunch train. This was clearly illustrated in 2016, when the LHC reached peak luminosity values 40% higher than the design value of 10³⁴ cm⁻² s⁻¹, even though the number of bunches in the LHC was still about 27% below the maximum achievable. This gain was due to the production of a brighter beam with roughly the same intensity per bunch but in a beam envelope of just half the size.
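The arithmetic behind this gain can be sketched with the simplified scaling L ∝ n_b N²/ε: luminosity grows with the number of bunches n_b and the square of the bunch intensity N, and shrinks with the transverse emittance ε. The snippet below is only an illustrative toy model – the bunch counts and emittance values are hypothetical stand-ins, not figures from the article – but it shows how a twice-brighter beam can outweigh a roughly 27% bunch deficit.

```python
# Toy estimate of how LHC peak luminosity scales with injector beam
# quality, using the simplified scaling  L ∝ n_b * N^2 / ε
# (revolution frequency, energy and beta* held fixed).
# All numbers below are illustrative assumptions, not machine values.

def relative_luminosity(n_bunches, protons_per_bunch, emittance):
    """Luminosity up to a constant factor: L ∝ n_b * N^2 / ε."""
    return n_bunches * protons_per_bunch**2 / emittance

# Design-like reference configuration (hypothetical values)
L_design = relative_luminosity(n_bunches=2808,
                               protons_per_bunch=1.15e11,
                               emittance=3.75)   # µm normalised emittance

# 2016-style configuration: ~27% fewer bunches, similar bunch
# intensity, but a much brighter beam (roughly half the emittance)
L_2016 = relative_luminosity(n_bunches=2040,
                             protons_per_bunch=1.1e11,
                             emittance=1.9)

print(f"L(2016) / L(design) = {L_2016 / L_design:.2f}")
```

With these assumed numbers the ratio comes out above one: halving the emittance doubles the brightness term and more than compensates for the missing bunches, which is qualitatively the mechanism described above.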

Despite the excellent performance of today’s injectors, the beams produced are not sufficient to meet the very demanding proton-beam parameters specified by the high-luminosity upgrade of the LHC (HL-LHC). Indeed, the HL-LHC aims to accumulate an integrated luminosity of around 250 fb⁻¹ per year, to be compared with the 40 fb⁻¹ achieved in 2016. For heavy-ion operation the goals are just as challenging: with lead ions the objective is an integrated luminosity of 10 nb⁻¹ over four runs starting from 2021 (compared with the less than 1 nb⁻¹ achieved in 2015). This demands a significant upgrade programme that is now being implemented.

Immense challenges

To prepare the CERN accelerator complex for the immense challenges of the HL-LHC, the LHC Injectors Upgrade (LIU) project was launched in 2010. In addition to enabling the proton and ion injector chains to deliver the beams required for the HL-LHC, the LIU project must ensure the reliable operation and lifetime of the injectors throughout the HL-LHC era, which is expected to last until around 2035. Hence the LIU project is also tasked with replacing ageing equipment (such as power supplies, magnets and radio-frequency cavities) and improving radioprotection measures such as shielding and ventilation.

One of the first challenges faced by the LIU team members was to define the beam-performance limitations of all the accelerators in the injector chain and identify the actions needed to overcome them by the required amount. Significant machine and simulation studies were carried out over a period of years, while functional and engineering specifications were prepared to provide clear guidelines to the equipment groups. This was followed by the production of the first hardware prototype devices and their installation in the machines for testing and, where possible, early exploitation.

Significant progress has already been made concerning the production of ion beams. Thanks to the modifications in Linac 3 and LEIR implemented after 2015 and the intensive machine studies conducted within the LIU programme over the last three years, the excellent performance of the ion injector chain could be further improved in 2016 (figure 1). This enabled the recorded luminosity for the 2016 proton–lead run to exceed the target value by a factor of almost eight. The main remaining challenges for the ion beams will be to more than double the number of bunches in the LHC through complex RF manipulations in the SPS known as “momentum slip stacking”, as well as to guarantee continued and stable performance of the ion injector chain without constant expert monitoring.

Along the proton injector chain, the higher-intensity beams within a comparatively small beam envelope required by the HL-LHC can only be demonstrated after the installation of all the LIU equipment during Long Shutdown 2 (LS2) in 2019–2020. The main installations feature: a new injection region, a new main power supply and RF system in the PSB; a new injection region and RF system to stabilise the future beams in the PS; an upgraded main RF system; and the shielding of vacuum flanges together with partial coating of the beam chambers in order to stabilise future beams against parasitic electromagnetic interaction and electron clouds in the SPS. Beam instrumentation, protection devices and beam dumps also need to be upgraded in all the machines to match the new beam parameters. The baseline goals of the LIU project to meet the challenging HL-LHC requirements are summarised in the panel (final page of feature).

Execution phase

Having defined, designed and endorsed all of the baseline items during the last seven years, the LIU project is presently in its execution phase. New hardware is being produced, installed and tested in the different machines. Civil-engineering work is proceeding for the buildings that will host the new PSB main power supply and the upgraded SPS RF equipment, and to prepare the area in which the new SPS internal beam dump will be located.

The 86 m-long Linac 4, which will eventually replace Linac 2, is an essential component of the HL-LHC upgrade (see panel opposite). The machine, based on newly developed technology, became operational at the end of 2016 following the successful completion of acceleration tests at its nominal energy of 160 MeV. It is presently undergoing an important reliability run that will be instrumental in reaching beams with characteristics matching the requirements of the LIU project and in achieving an operational availability higher than 95%, an essential level for the first link in the proton injector chain. On 26 October 2016 the first 160 MeV negative hydrogen-ion beam was successfully sent to the injection test stand, which operated until the beginning of April 2017 and demonstrated the correct functioning of this new and critical CERN injection system, as well as of the related diagnostics and controls.

Most of the equipment needed for the injection of negative hydrogen ions from Linac 4 into the PSB has been completed, and work is progressing on the 2 GeV energy upgrade of the PSB rings and extraction, with installation planned for LS2 in 2019–2020. On the beam-physics side, studies have mainly focused on the deployment of the new wideband RF system, commissioning of beam diagnostics and investigation of space-charge effects. During the 2016–2017 technical stop, the principal LIU-related activities were the removal of a large volume of obsolete cables and the installation of new beam instrumentation (e.g. a prototype transverse-size measurement device and turn-by-turn orbit measurement systems). The unused cables, which had been individually identified and labelled beforehand, could be safely removed from the machine to allow cables for the new LIU equipment to be pulled.

The procurement, construction, installation and testing of upgrade items for the PS are also progressing. Some hardware, such as new corrector magnets and power supplies, a newly developed beam gas-ionisation monitor and new injection vacuum chambers to remove aperture limitations, was already installed during past technical stops. Mitigating anticipated longitudinal beam instabilities in the PS is essential for achieving the LIU baseline beam parameters. This requires reducing the parasitic electromagnetic interaction of the beam with the multiple RF systems and deploying a new feedback system to keep the beam stable. Beam-dynamics studies will determine the present intensity reach of the PS and identify any remaining needs to comfortably achieve the value required for the HL-LHC. Improved schemes of bunch rotation are also under investigation to better match the beam extracted from the PS to the SPS RF system and thus limit beam losses at injection energy in the SPS.

In the SPS, the LIU deployment in the tunnel has begun in earnest, with the re-arrangement and improvement of the extraction kicker system, the start of civil engineering for the new beam-dump system in LSS5, and the shielding of vacuum flanges in 10 half-cells together with the amorphous-carbon coating of the adjacent beam chambers (to mitigate electron-cloud effects). In a notable first, eight dipole and 10 focusing-quadrupole magnet chambers were coated in situ with amorphous carbon during the 2016–2017 technical stop, proving the industrialisation of this process (figure 2). The new overground RF building needed to accommodate the power amplifiers of the upgraded main RF system has been completed, while procurement and testing of the solid-state amplifiers have also commenced. The prototyping and engineering for the LIU beam dump are in progress, with the construction and installation of a new SPS beam-dump block that will be able to cope with the higher beam intensities of the HL-LHC and minimise radiation issues.

Regarding diagnostics, the development of beam-size measurement devices based on flying wires, gas ionisation and synchrotron radiation, all of which are part of the LIU programme, is already providing meaningful results (figure 3), addressing the challenge of measuring high-intensity, high-brightness operational beams with high precision. On the machine-performance and beam-dynamics side, measurements in 2015–2016 with the very high intensities available from the PS probed new regimes in terms of electron-cloud instabilities, RF power and losses at injection. More studies are planned in 2017–2018 to clearly identify a path for the mitigation of injection losses when operating with higher beam currents.

Looking forward to LS2

The success of LIU in delivering beams with the desired parameters is the key to achieving the HL-LHC luminosity target. Without the LIU beams, all of the other necessary HL-LHC developments – including high-field triplet magnets, crab cavities and new collimators – would only allow a fraction of the desired luminosity to be delivered to experiments.

Whenever possible, LIU installation work is taking place during CERN’s regular year-end technical stops. But the great majority of the upgrade requires an extended machine stop and therefore will have to wait until LS2 for implementation. The duration of access to the different accelerators during LS2 is being defined and careful preparation is ongoing to manage the work on site, ensure safety and level the available resources among the different machines in the CERN accelerator complex. After all of the LIU upgrades are in place, beams will be commissioned with the newly installed systems. The LIU goals in terms of beam characteristics are, by definition, uncharted territory. Reaching them will require not only a high level of expertise, but also careful optimisation and extensive beam-physics and machine-development studies in all of CERN’s accelerators.

Linac 4 complete and preparing to serve the high-luminosity LHC

Inaugurated in May, Linac 4 is CERN’s newest accelerator since the LHC and is key to increasing the luminosity delivered to the LHC. Linac 4 will send negative hydrogen ions with an energy of 160 MeV – more than three times the energy of its predecessor Linac 2 (which has been in service since 1978) – to the Proton Synchrotron Booster, which strips off the electrons at injection and accelerates the resulting protons further. Almost 90 m long, Linac 4 took nearly 10 years to build. After an extensive testing period that is already under way, it will be connected to CERN’s accelerator complex during the long technical shutdown in 2019–2020.

Overhauling CERN’s accelerator complex

The demands of the high-luminosity LHC (HL-LHC) will see major changes across CERN’s accelerator complex (above). The LHC Injectors Upgrade project is organised into five baseline work packages to boost the performance of the LHC injectors and match the challenging HL-LHC requirements:

• Improving the ion source and low-energy transport in Linac 3, alleviating ion losses in LEIR and the SPS, and implementing momentum slip stacking for ion beams in the SPS.

• Replacing Linac 2 with Linac 4 and using H⁻ charge-exchange injection into the PSB at the increased energy of 160 MeV.

• Raising the injection energy into the PS from the present 1.4 GeV to 2 GeV.

• Doubling the RF power, reducing the longitudinal impedance and mitigating the electron cloud in the SPS.

• Putting in place all of the other necessary equipment and operational upgrades across PSB, PS and SPS to make them capable of accelerating and manipulating higher-intensity/brightness beams (e.g. intercepting and dump devices, feedback systems, beam instrumentation and resonance compensation).


CERN’s recipe for knowledge transfer

Understanding what the universe is made of and how it started are the fundamental questions behind CERN’s existence. This quest alone makes CERN a unique knowledge-focused organisation and an incredible human feat. To achieve its core mission, CERN naturally creates new opportunities for innovation. A myriad of engineers, technicians and scientists develop novel technology and know-how that can be transferred to industry for the benefit of society. Twenty years ago, with the support of CERN Council, a reinforced structure for knowledge and technology transfer was established to strengthen these activities.

Advances in fields including accelerators, detectors and computing have had a positive impact outside of CERN. Although fundamental physics might not seem the most obvious discipline in which to find technologies with marketable applications, the many examples of applications of CERN’s technology and know-how – whether in medical technology, aerospace, safety, the environment or “industry 4.0” – constitute concrete evidence that high-energy physics is a fertile ground for innovation. That CERN’s expertise finds applications in multinational companies, small and medium enterprises and start-ups alike is further proof of CERN’s broader impact (see “From the web to a start-up near you”).

As an international organisation, CERN has access to a wealth of diverse viewpoints, skills and expertise. But what makes CERN different from other organisations in other fields of research? Sociologists have long studied the structure of scientific organisations, several using CERN as a basis, and they find that particle-physics collaborations uniquely engage in “participatory collaboration” that brings added value in knowledge generation, technology development and innovation. This type of collaboration, along with the global nature of the experiments hosted by CERN, adds high value to the laboratory’s knowledge-transfer activities.

Despite its achievements in knowledge transfer internationally, CERN is seldom listed in international innovation rankings; when it is present, it is never at the top. This is mainly a selection effect due to methodology. For example, the Reuters “Top 25 Global Innovators – Government” ranking relies on patents as a proxy for innovation (of the 10 innovation criteria used, seven are based on patents). CERN’s strategy is to focus on open innovation and to maximise the dissemination of our technologies and know-how, rather than focus on revenue. Although there is a wide range of intellectual-property tools useful for knowledge transfer, patent volume is not a relevant measure of successful intellectual-property management at CERN.

Instead, the CERN Knowledge Transfer group measures the number of new technology disclosures (91 in 2016) and the number of contracts and agreements signed with external partners and industry (42 in 2016, totalling 251 since 2011). We also monitor spin-off and start-up companies – there are currently 18 using CERN technology, some of which are hosted directly in CERN’s network of business-incubation centres. Together with the impressive breadth of application fields of CERN technologies, we believe these are clearer measures of impact.

In the future, CERN will continue to pursue and promote open innovation. We want to build a culture of entrepreneurship whereby more people leaving CERN consider starting a business based on CERN technologies, and use a wide range of metrics to quantify our innovation. Strong links with industry are important to help reinforce a market-pull rather than technology-push approach. The Knowledge Transfer group will also continue to provide a service to the CERN community through advice, support, training, networks and infrastructure for those who wish to engage with industry through our activities.   

Human capital is vital in our equation, since knowledge transfer cannot happen without CERN’s engineers, technicians and physicists. Our role is to facilitate their participation, which could start with a visit to our new website, an Entrepreneurship Meet-Up (EM-U), or a visit to one of our seminars. Since they were launched roughly two years ago, EM-Us and knowledge-transfer seminars have together attracted more than 2000 people. Whether you want to tell us about an idea you have, or are curious about the impact of our technologies on society, we hope to hear from you soon.

• Find out more at kt.cern.

An Overview of Gravitational Waves: Theory, Sources and Detection

By Gerard Auger and Eric Plagnol (eds)
World Scientific


In 2016, the first direct detection of gravitational waves – produced more than a billion years ago during the coalescence of two black holes of stellar origin – by the two detectors of the LIGO experiment was a tremendous milestone in the history of science. This timely book provides an overview of the field, presenting the basics of the theory and the main detection techniques.

The discovery of gravitational radiation is extraordinarily important, not only for confirming the key predictions of Einstein’s general relativity, but also for its implications. A new window on the universe is opening up, with more experiments – already built or in the planning stage – joining the effort to perform precise measurements of gravitational waves.

The book, composed of eight chapters, collects the contributions of many experts in the field. It first introduces the theoretical basics needed to follow the discussion on gravitational waves, so that no prior knowledge of general relativity is required. A long chapter dedicated to the sources of such radiation accessible to present and future observations follows. A section is then devoted to the principles of gravitational-wave detection and to the description of present and future Earth- and space-based detectors. Finally, an alternative detection technique based on cold atom interferometry is presented.

The Meaning of the Wave Function: In Search of the Ontology of Quantum Mechanics

By Shan Gao
Cambridge University Press


Does the wave function directly represent a state of reality, or merely a state of (incomplete) knowledge of it, or something else? This question is the starting point of this book, in which the author – a professor of philosophy – aims to make sense of the wave function in quantum mechanics and investigate the ontological content of the theory. A very powerful mathematical object, the wave function has always been the focus of a debate that goes beyond physics and mathematics to the philosophy of science.

The first part of the book (chapters 1–5) deals with the nature of the wave function and provides a critical review of its competing interpretations. In the second part (chapters 6 and 7), the author focuses on the ontological meaning of the wave function and proposes his view, which is that the wave function in quantum mechanics is real and represents the state of random discontinuous motion of particles in 3D space. He offers two main arguments supporting this new interpretation. The third part (chapters 8 and 9) is devoted to investigating possible implications. In particular, the author discusses whether the quantum ontology described by the wave function is enough to account for our definite experience, or whether additional elements, such as many worlds or hidden variables, are needed.

Aimed at readers familiar with the basics of quantum mechanics, the book could also appeal to students and researchers interested in the philosophical aspects of modern science theories.

Problem Solving in Quantum Mechanics: From Basics to Real-World Applications for Materials Scientists, Applied Physicists, and Device Engineers

By Marc Cahay and Supriyo Bandyopadhyay
Wiley


With the rapid development of nanoscience and nano-engineering, quantum mechanics can no longer be considered exclusively the interest of physicists. Indeed, a fundamental understanding of physical phenomena at the nanoscale will require future electronic engineers, condensed-matter physicists and material scientists to master the fundamental principles of quantum theory.

Noticing that many textbooks on quantum mechanics are not aimed at a wide audience of scientists, in particular those interested in practical applications and technologies at the nanoscale, the authors decided to fill this gap. They focus on the solution of problems that students and researchers working on state-of-the-art material and device applications might face. The problems are grouped by theme in 13 chapters, each concluded by a section of further reading.

An ideal resource for graduate students, the book is also of value to professionals who need to update their knowledge or to refocus their expertise towards nanotechnologies.

Anomaly! Collider Physics and the Quest for New Phenomena at Fermilab

By Tommaso Dorigo
World Scientific

Also available at the CERN bookshop


Anomaly! is a captivating story of supposed discoveries that turned out not to be. The book provides an honest and not always flattering description of how large high-energy-physics collaborations work, what makes experimental physicists excited, and the occasional interference between scientific goals and personal factors such as ambition, career issues, personality clashes and fear of being scooped. Dorigo, who complements his recollections with many interviews and archival searches, proves to be a highly skilled communicator of science to the general public, as readers of his often controversial blog A Quantum Diaries Survivor already know. Thanks to a well-chosen alternation of narration and explanation, several sections of the book read like a novel.

The main theme, as indicated by the title, is the anomalies (or outliers) that tantalised members of the CDF collaboration at Fermilab – and sometimes the external world – but ultimately turned out to be red herrings. The author uses these stories to show how cautious experimental particle physicists have to be when applying statistics in their data analysis. He also makes a point about the arbitrariness of the conventional 3σ and 5σ thresholds for claiming “evidence” and “discovery” of a new phenomenon.

Slightly off topic, given the title of the book, three chapters are devoted to the ultimately successful search for the top quark, the first evidence of which was very far from being an “anomaly”: its existence was expected in the mainstream and the “global fits” of other collider data were already pointing at the right mass range. Here Dorigo is interested in the opposite lesson: the conventional thresholds on p-values, originally motivated by the principle “extraordinary claims demand extraordinary proofs”, are hard to justify when a discovery is actually a confirmation of the dominant paradigm. (The author explicitly comments on the similarity with the Higgs boson discovery two decades later.) The saga of the top-quark hunt, which contains many funny and even heroic moments, is also an occasion for the author to elaborate on what he describes as over-conservative attitudes dominating in large teams when stakes are high.

In general, the book’s topics have clearly been chosen more by the importance of the lesson they teach than by their ultimate impact on science. Almost an entire chapter is devoted to a measurement of the Z boson mass at Fermilab, which was already known in advance to be doomed to obsolescence very soon, as the experiments at the upcoming LEP accelerator were more suited to that kind of measurement. Still, the chapter turns out to be an enthralling story, ending with a mysterious attempt by an unsporting competitor from another US laboratory to sabotage the first CDF report of this measurement at an international conference. In some other cases, the choice of topics is driven by their entertainment value, as in the case of the episode of the “Sacred Sword”, a radioactive-contamination incident that luckily ended well for its protagonists.

The author’s role in the book is at the same time that of an insider and of a neutral observer, attending crucial meetings and observing events unfold as a collaboration member among many others, with the remarkable exception of the final story where he plays the role of internal reviewer of one of the eponymous anomalies. In spirit and form, Anomaly! reminds me of Gary Taubes’ celebrated Nobel Dreams, but with more humour and explicit subjectivity. Although far from being scholarly, Anomaly! may also appeal to readers interested in the sociology of science or in the epistemological problem of how a scientific community finally settles on a single consensus, in the vein of Andrew Pickering’s Constructing Quarks, Peter Galison’s How Experiments End and Kent Staley’s The Evidence for the Top Quark: Objectivity and Bias in Collaborative Experimentation. The latter, in particular, is interesting to compare with the chapters of Anomaly! that narrate the same story.

Supersymmetry, Supergravity, and Unification

By Pran Nath
Cambridge University Press


This book discusses the role played by supersymmetry, and especially supergravity, in the quest for a unified theory of fundamental interactions. These are vast subjects, which not only embrace particle physics but also have ramifications in many other fields, such as modern mathematics, statistical physics and condensed-matter systems.

The author focuses on a rather specific subject: supergravity as a plausible scenario (perhaps more convincing than supersymmetry itself) for physics beyond the Standard Model. This justifies the way the author has chosen to distribute the material over the 24 chapters, for a total of 500 pages.

The first seven chapters introduce the field theories and symmetry principles on which a framework for the unification of particle forces would be based. After a short history of force unification, the author covers general relativity, Yang–Mills theories, spontaneous symmetry breaking, the basics of the Standard Model, the theory of gauge anomalies, effective Lagrangians and current algebra.

Supersymmetry is introduced next, with a short mathematical formulation including the concepts of graded Lie algebras, superfields and the basic tools needed to construct (rigid) supersymmetric field theories, their multiplets and invariant Lagrangians. Non-supersymmetric grand unified theories and their supersymmetric extensions are also reviewed, investigating in particular the potential role they play in gauge-coupling unification. It is surprising that the author does not discuss the original motivation for advocating supersymmetry in this context, which is related to the hierarchy problem and to the issue of the naturalness of scales. No such discussion appears either in this chapter or in the following one, devoted to the minimal supersymmetric Standard Model. The theory of supergravity and its mathematical structure, including matter couplings, is also briefly presented.

The second half of the book includes five chapters dedicated to the phenomenology of supergravity, covering in detail supergravity unification, CP violation, proton decay and supergravity in cosmology and astroparticle physics. In particular, supergravity inflation and supersymmetric candidates for dark matter are discussed at length. Further theories of supergravity and their connection to string theories in diverse dimensions are only briefly touched upon.

The last part of the book provides some tools, such as anti-commuting variables and spinor formalism, which are needed to write supersymmetric Lagrangians and to extract physical consequences. Notations, conventions and other miscellaneous arguments including further references conclude the volume.

The book can be considered as a valuable and updated addition to Steven Weinberg’s third volume on supersymmetry in The Quantum Theory of Fields series (2000, Cambridge University Press).

The author is a world expert on supersymmetry and supergravity phenomenology, who has contributed to the field with many original and outstanding works.

Certainly useful to graduate students in physics, the book could also prove to be a resource for advanced graduate courses in experimental high-energy physics.

A quarter century of DIS workshops

With a total of 304 talks, Deep Inelastic Scattering 2017 (DIS17) demonstrated how deep inelastic scattering (DIS) and related topics permeate most aspects of high-energy physics, and how much we still have to learn about strong interactions. Held at the University of Birmingham in the UK on 3–7 April, the workshop brought together more than 300 participants from 41 countries for a week of lively scientific discussion and largely unanticipated sunshine.

The first of this series of annual international workshops on DIS and related topics took place in Durham, UK, in the spring of 1993, when the first results from the world’s only lepton–hadron collider, HERA at DESY, were discussed by around 80 participants. A quarter of a century later, the workshop series has toured the globe, digested data from the full lifetime of HERA and numerous fixed-target DIS experiments, and played a major role in the development and understanding of hadron-collider physics.

The dominant theme of DIS17 was the relevance of strong interactions, parton densities (PDFs) and DIS to the LHC. But a wide and eclectic range of other topics was also covered, notably new results from experiments at the Relativistic Heavy Ion Collider (RHIC), JLab and HERA, as well as theoretical advances and future plans for the field.

After plenary review talks covering the latest news from the field, two and a half days followed in which seven working groups operated in up to six simultaneous parallel sessions, covering: PDFs; low proton-momentum-fraction (Bjorken-x) physics; Higgs and beyond-the-Standard-Model (BSM) studies in hadron collisions; hadronic, electroweak and heavy-flavour observables; spin and 3D hadron structure; and future facilities. The Birmingham event included a topical lecture on probing ultra-low-x QCD with cosmic neutrinos at IceCube and Auger, and a special session was devoted to the status and scientific opportunities offered by future proposed DIS facilities at CERN (the Large Hadron electron Collider, LHeC) and at BNL or JLab in the US (the Electron Ion Collider, EIC).

All aspects of proton–proton collisions at the LHC featured during this year’s DIS event, from the role of parton densities and perturbative QCD dynamics in beyond-the-Standard-Model searches and Higgs boson studies, through the measurement and interpretation of processes that are sensitive to parton densities (such as electroweak gauge boson production), to topics that challenge our understanding of strong-interaction dynamics in the semi- and non-perturbative regimes. Ten years after HERA completed data-taking, the collider still featured strongly. The final round of combined inclusive DIS data, published in 2016 by the H1 and ZEUS experiments, has been integrated into global PDF fits, and a handful of new measurements and combinations were also presented. Heavy-ion collision results from RHIC and the LHC were also well represented, as were insights into 3D proton structure and hadron spin from semi-inclusive DIS and polarised proton–proton collisions at COMPASS, JLab and RHIC, and current and future DIS measurements with neutrinos.

Data from HERA and the LHC have brought a new level of precision to the parton densities of the proton, with associated theoretical advances including the push towards higher order (next-to-next-to-next-to leading order) descriptions. Taming the “pathological” rise of the proton gluon density at low-x in the perturbative domain remains a major topic, which is now being addressed experimentally in ultra-peripheral collisions and forward measurements at the LHC, as well as through theoretical modelling of low-x, low-Q2 HERA data with nonlinear parton dynamics and resummation techniques. The related topic of diffractive electron–proton scattering and the heavily gluon-dominated diffractive PDFs is benefiting from the full HERA statistics. New insights into elastic and total cross-sections, such as TOTEM’s observation of a non-exponential term in the four-momentum transfer dependence of the elastic cross-section, are emerging from the LHC data. Uncertainties in PDFs remain large at high x, and intense work is ongoing to understand LHC observables such as top-quark pair production, which are sensitive in this region. New data and theoretical work are revealing the transverse structure of the proton for the first time in terms of transverse-momentum-dependent parton densities. The LHC’s proton–lead collision data are also constraining nuclear PDFs in an unprecedented low-x kinematic region.

Concerning the future of DIS, it became abundantly clear that potential revolutions in our understanding could come from polarised proton and heavy-ion targets, and from step changes in energy and luminosity. The EIC offers 3D hadron tomography and an unprecedented window on the spin and flavour structure of protons and ions. Its eA scattering programme would probe low-x parton dynamics in a region where collective effects ultimately leading to gluon saturation are expected to become important. The LHeC offers a standalone Higgs production programme complementary to that of the LHC, as well as a new level of precision in PDFs that could be applied to extend the sensitivity to new physics at the LHC. Its ep and eA scattering programme would also probe low-x parton dynamics in the region where gluon saturation is expected to be firmly established. Together, the proposed facilities open up an exciting set of new windows on hadronic matter with relevance to major questions such as quark confinement and hadronic mass generation.

The next instalment of DIS in April 2018, to be held in Kobe, Japan, is eagerly awaited.

Venice EPS event showcases the best of HEP

Major scientific gatherings such as the European Physical Society (EPS) biennial international conference on High Energy Physics offer a valuable opportunity to reflect on the immense work and progress taking place in our field, including the growing connections between particle physics and the universe at large. This year’s EPS conference, held in Venice, Italy, from 5–12 July, was also the first large conference where the results from the 2015 and 2016 runs of the Large Hadron Collider (LHC) at 13 TeV were presented.

Setting the bar just a day into the Venice event, LHCb announced the discovery of a new doubly charmed baryon from precision measurements of B decays, with heavy-flavour analyses continuing to offer a rich seam of understanding. LHCb also presented the intriguing anomalies being seen in the ratios of certain Standard Model decays that hint at deviations from lepton universality, with further data from LHC Run 2 hotly anticipated.

The LHC is firmly in the precision business these days. In the last two years, the machine has delivered large amounts of collision data to the experiments and striking progress has been made in analysis techniques. These have enabled measurements of rare electroweak processes such as the associated production of a top quark, a Z boson and a quark (tZq) by ATLAS, for example, and the definitive observation of WW scattering by CMS. Top physics is another booming topic, with new top-mass and single-top production measurements and many other results, including “legacy” measurements from the Tevatron experiments, on show.

At the core of the LHC’s analysis programme is the exploration of the Higgs boson, which now enters its sixth year. Particularly relevant is how the Higgs interacts with other particles, since this could be altered by physics beyond the Standard Model. While the Higgs was first spotted decaying into other bosons (W, Z, γ), ATLAS reported the first evidence for the decay of the Higgs boson to a pair of bottom quarks, with a significance of 3.6σ, while CMS presented the first observation by a single experiment of the decay to a pair of τ leptons, with a significance of 5.9σ. The measured Higgs mass is also narrowing in on 125 GeV, while the fundamental scalar nature of the new particle continues to raise hope that it will lead to new insights.

The lack of direct signs of new physics at the LHC is an increasing topic of discussion, and underlines the importance of precision measurements. Direct searches are pushing the mass limits for new particles well into the TeV range, but new physics could be hiding in small and subtle effects. It is clear that there is physics beyond the Standard Model; what is not clear is what it is. One issue is how to communicate this scientifically fascinating but non-headline-worthy aspect of today’s particle-physics landscape.

High precision is also being attained in studies of the strong interaction. ALICE, for example, reported an increase in strangeness production with charged multiplicity that seems to connect the regimes seen in pp, pPb and PbPb collisions smoothly. Overall, and increasingly with complementary results from the other LHC experiments, ALICE is closing in on the evolution of the quark–gluon plasma, and thus on understanding the very early universe.

Particle physics, astrophysics and cosmology are closer today than ever, as several sessions at the Venice event demonstrated. One clear area of interplay is dark matter: if dark matter interacts only through gravity, then finding it will be very difficult for accelerator-based studies, but if it has a residual interaction with some known particles, then accelerators will be leading the hunt for direct detection. Cosmology’s transformation to a precision science continues with the recent detection of gravitational waves, with LIGO’s results already placing the first limits on the mass of the graviton at less than 7.7 × 10–23 eV/c2. There were also updates from dark-energy studies, and about precision CMB explorers beyond Planck.
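The graviton-mass bound quoted above can be translated into a more intuitive length scale. As an illustrative back-of-the-envelope calculation (not part of the LIGO analysis itself), the sketch below converts the upper limit of 7.7 × 10–23 eV/c2 into a lower limit on the graviton’s Compton wavelength, λ = hc/(mc2):

```python
# Convert the LIGO graviton-mass upper bound into a lower bound
# on the graviton Compton wavelength: lambda = h*c / (m*c^2).
HC_EV_M = 1.23984e-6        # Planck constant times c, in eV·m
M_GRAVITON_EV = 7.7e-23     # LIGO upper bound on the graviton mass, in eV/c^2
LIGHT_YEAR_M = 9.4607e15    # metres per light-year

compton_wavelength_m = HC_EV_M / M_GRAVITON_EV
print(f"Compton wavelength > {compton_wavelength_m:.2e} m "
      f"(about {compton_wavelength_m / LIGHT_YEAR_M:.1f} light-years)")
```

The result, of order 1016 m (more than a light-year), illustrates why any deviation from massless-graviton propagation would only show up over astronomical distances.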

Neutrino physics is also an extremely vibrant field, with neutrino oscillations continuing to offer chances for discovery. The various neutrino-mixing angles are starting to be well measured, and NOvA and T2K are zooming in on the value of the CP-violating phase, which seems to be large, given tantalising hints from T2K. The hunt for sterile neutrinos continues, as does the search for neutrinoless double-beta decay, with several experiments ongoing worldwide.

In summary, the 2017 EPS-HEP conference clearly demonstrated how we are progressing towards a full understanding both of the vastness of the universe and of the tiniest constituents of matter. There are many more results to look forward to, many of which will be ready for the next EPS-HEP event in Ghent, Belgium, in 2019. As summed up by the conference highlights: the field is advancing on all fronts – and it’s impressive.

ITER’s massive magnets enter production

The ITER site

It is 14 m high, 9 m wide and weighs 110 tonnes. Fresh off a production line at ASG in Italy, and coated in epoxy Kapton-glass panels (image top left), it is the first superconducting toroidal-field coil for the ITER fusion experiment under construction in Cadarache, Southern France. The giant D-shaped ring contains 4.5 km of niobium-tin cable (each containing around 1000 individual superconducting wires) wound into a coil that will carry a current of 68,000 A, generating a peak magnetic field of 11.8 T to confine a plasma at a temperature of 150 million degrees. The coil will soon be joined by 18 others like it, 10 manufactured in Europe and nine in Japan. After completion at ASG, the European coils will be shipped to SIMIC in Italy, where they will be cooled to 78 K, tested and welded shut in a 180 tonne stainless-steel armour. They will then be impregnated with special resin and machined using one of the largest machines in Europe, before being transported to the ITER site.

Science doesn’t get much bigger than this, even by particle-physics standards. ITER aims to demonstrate the feasibility of fusion power by maintaining a plasma in a self-sustaining “ignition” phase. The project was established by an international agreement ratified in 2007 by China, the European Union (EU), Euratom, India, Japan, Korea, Russia and the US. Following years of delay relating to the preferred site and project costs, ITER entered construction a decade ago and is scheduled to produce first plasma by December 2025. The EU contribution to ITER, corresponding to roughly half the total cost, amounts to €6.6 billion for construction up to 2020.

Fusion for energy

The scale of ITER’s components is staggering. The vacuum vessel that will sit inside the field coils is 10 times bigger than anything before it, measuring 19.4 m across, 11.4 m high and requiring new welding technology to be invented. The final ITER experiment will weigh 23,000 tonnes, almost twice that of the LHC’s CMS experiment. The new toroidal-field coil is the first major magnetic element of ITER to be completed. A series of six further poloidal coils, a central solenoid and a number of correction coils will complete ITER’s complex magnetic configuration. The central solenoid (a 1000 tonne superconducting electromagnet in the centre of the machine) must be strong enough to contain a force of 60 MN – twice the thrust of the Space Shuttle at take-off.

Vacuum-pressure impregnation tooling

Fusion for Energy (F4E), the EU organisation managing Europe’s contribution to ITER, has been collaborating with industrial partners such as ASG Superconductors, Iberdrola Ingeniería y Construcción, Elytt Energy, CNIM, SIMIC, ICAS consortium and Airbus CASA to deliver Europe’s share of components in the field of magnets. At least 600 people from 26 companies have been involved in the toroidal-coil production, and the first coil is the result of almost a decade of work. This involved, among other things, developing new ways to jacket superconducting cables based on materials that are brittle and much more difficult to handle than niobium-titanium. In total, 100,000 km of niobium-tin strands are necessary for ITER’s toroidal-field magnets, increasing worldwide production by a factor of 10.

Since 2008, F4E has signed ITER-related contracts reaching approximately €5 billion, with the magnets amounting to €0.5 billion. Firms that are involved, such as SIMIC where the coils will be tested and Elytt, which has developed some of the necessary tooling, have much to gain from collaborating in ITER. According to Philippe Lazare, CEO of CNIM Industrial Systems Division: “In order to manufacture our share of ITER components, we had to upgrade our industrial facilities, establish new working methods and train new talent. In return, we have become a French reference in high-precision manufacturing for large components.”

CERN connection

Cooling the toroidal-field magnets requires about 5.8 tonnes of helium at a temperature of 4.5 K and a pressure of 6 bar, putting helium in a supercritical phase slightly warmer than it is in the LHC. But ITER’s operating environment is totally different to an accelerator’s, explains head of F4E’s magnets project team Alessandro Bonito-Oliva: “The magnets have to operate subject to lots of heat generated by neutron irradiation from the plasma and AC losses generated inside the cable, which has to be removed, whereas at CERN you don’t have this problem. So the ITER coolant has to be fairly close to the wire – this is why we used forced-flow of helium inside the cable.” A lot of ITER’s superconductor technology work was driven by CERN in improving the characteristics of superconductors, says Bonito-Oliva: “High-energy physics mainly looks for very high current performance, while in fusion it is also important to minimise the AC losses, which generally brings a reduction of current performance. This is why Nb3Sn strands for fusion and accelerators are slightly different.”

CERN entered formal collaboration with ITER in March 2008 via a co-operation agreement concerning the design of high-temperature superconducting current leads and other magnet technologies, with CERN’s superconducting laboratory in building 163 becoming one of the “reference” laboratories for testing ITER’s superconducting strands. Niobium-tin is the same material that CERN is pursuing for the high-field magnets of the High Luminosity LHC and also a possible future circular collider, although the performance demands of accelerator magnets require significant further R&D. Head of CERN’s technology department, Jose Miguel Jimenez, who co-ordinates the collaboration between CERN and ITER, says that in addition to helping with the design of the cable, CERN played a big role in advising on high-voltage testing of the cable insulation and, in particular, on the metallurgical aspect. “Metallurgy is one of the key areas of technology transfer from CERN to ITER. Another is the HTS current leads, which CERN has helped to design in collaboration with the Chinese group working on the ITER tokamak, and in simulating the heat transfer under real conditions,” he explains. “We also helped with the cryoplants, magnetic-field quality, and on central interlocks and safety systems based on our experience with the LHC.”
