
A unique vacuum environment

Vacuum technology for particle accelerators has been pioneered by CERN since its early days. The Intersecting Storage Rings (ISR) brought the most important breakthroughs. Half a century ago, this technological marvel – the world’s first hadron collider – required proton beams of unprecedented intensity and extremely low vacuum pressures in the interaction areas (below 10⁻¹¹ mbar). Addressing the former challenge led to innovative surface treatments such as glow-discharge cleaning, while the ultra-low-pressure requirement drove the development of materials and their treatments. It also led to novel high-performance cryogenic pumps and vacuum gauges that are still in use today, and CERN’s record for the lowest pressure ever achieved at room temperature (2 × 10⁻¹⁴ mbar) still stands.
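To put such pressures in perspective, the ideal-gas relation n = p/(k_B·T) converts them into molecular densities. The short sketch below is illustrative only; the function name and the 293 K assumption are ours, not CERN's:

```python
# Illustrative estimate of residual-gas density at ultra-high vacuum,
# using the ideal-gas relation n = p / (k_B * T).
K_B = 1.380649e-23  # Boltzmann constant in J/K

def molecules_per_cm3(pressure_mbar: float, temperature_k: float = 293.0) -> float:
    """Molecular number density (cm^-3) for a given pressure and temperature."""
    pressure_pa = pressure_mbar * 100.0           # 1 mbar = 100 Pa
    n_per_m3 = pressure_pa / (K_B * temperature_k)
    return n_per_m3 * 1e-6                        # m^-3 -> cm^-3

n_isr = molecules_per_cm3(1e-11)     # ISR interaction regions: ~2.5e5 cm^-3
n_record = molecules_per_cm3(2e-14)  # room-temperature record: ~500 cm^-3
```

Even at the record pressure, a few hundred molecules remain in every cubic centimetre – a reminder that "vacuum" is never truly empty.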

The Large Electron–Positron (LEP) collider opened a new chapter in CERN’s vacuum story. Even though LEP’s residual gas density and current intensities were less demanding than those of the ISR, its exceptional length and intense synchrotron-light power triggered the need for unconventional solutions at reasonable cost. Responding to this challenge, the LEP vacuum team developed extruded aluminium vacuum chambers and introduced, for the first time, linear pumping by non-evaporable getter (NEG) strips. In parallel, LEP project leader Emilio Picasso launched another fruitful development that led to the production of the first superconducting radio-frequency cavities based on niobium thin-film coating on copper substrates. It was a great success, and the present accelerating RF cavities of the LHC and HIE-ISOLDE are essentially based on the expertise acquired for LEP.

The coexistence at CERN of both NEG and thin-film expertise was the seed for another breakthrough in vacuum technology: NEG thin-film coatings, driven by the requirements of the LHC and its project leader Lyn Evans. The NEG material, a micron-thick coating made of a mixture of titanium, zirconium and vanadium, is deposited onto the inner wall of vacuum chambers and, after activation by heating in the accelerator, provides pumping for most of the gas species present in accelerators. The Low Energy Ion Ring was the first CERN accelerator to implement extensive NEG coating in around 2006. For the LHC, one of the technology’s key benefits is its low secondary-electron emission, which suppresses the growth of electron clouds in the room-temperature part of the machine.

New concepts

Synchrotron radiation-induced desorption and electron clouds at temperatures of around 4.3 K had to be studied in depth for the LHC, leading CERN’s vacuum experts to develop new concepts for vacuum systems at cryogenic temperatures, in particular the beam screen. The more intense beams of the high-luminosity LHC (HL-LHC) upgrade, presently under way, will amplify the effect of electron clouds on both the beam stability and the thermal load to the cryogenic systems. Since NEG coatings require heating for activation and are therefore limited to room-temperature beam pipes, an alternative strategy had to be found for the parts of the accelerators that cannot be heated, such as those in the HL-LHC’s inner triplet magnets.

Following an idea that originated at CERN in 2006, thin-film coatings made from carbon offer a solution, and the material has already been deposited on tens of SPS vacuum chambers within the LHC Injectors Upgrade project. Another idea to fight electron clouds for the HL-LHC involves laser-treating surfaces to make them rougher, so that secondary electrons are intercepted by the surrounding surfaces. In collaboration with UK researchers and GE Inspection Robotics, CERN’s vacuum team has recently developed a miniature robot that can direct the laser onto the LHC beam screen (see image above). The possibility of in situ surface treatments by lasers opens new perspectives for vacuum technology in the coming decades, including studies for future circular colliders. Another benefit of this study is the development of small robots for the in situ inspection of long ultra-high-vacuum beam pipes, such as those of the LHC’s arcs.

The Compact Linear Collider (CLIC) project, which envisages a high-energy linear electron–positron collider at CERN, demands quadrupole magnets with a very small-diameter beam pipe (about 8 mm) supporting pressures in the ultra-high vacuum range. This can be obtained by NEG-coating the vacuum vessel, but the coating process in such a high aspect-ratio geometry is not easy due to the very small space available for the material source and the plasma needed for its sputtering. This troublesome issue has been solved by a complete change of the production process in which the NEG material is no longer directly coated on the wall of the tiny pipe, but instead is coated on the external wall of a sacrificial mandrel made of high-purity aluminium.

Next-generation synchrotron-light sources share CLIC’s need for very compact magnets with magnetic poles as close as possible to the beam, so as to reduce costs and improve beam performance. CERN’s vacuum group collaborates closely with vacuum experts of light sources, MAX IV in Sweden and PSI in Switzerland being prominent examples, to develop the required very-small-diameter vacuum pipes. Further technology transfer has come from the sophisticated simulations necessary for the HL-LHC and the Future Circular Collider study, which have also found applications beyond the accelerator field, from the coating of electronic devices to space simulation.

Relations with industry are key to the operation of CERN’s accelerators, especially those in the LHC chain. The vacuum industry across CERN’s member countries provides us with state-of-the-art components, valves, pumps, gauges and control equipment that have contributed to the high reliability of the lab’s vacuum systems. In return, the LHC gives high visibility to industrial products. Indeed, the variety of projects and activities performed at CERN provides us with continuous stimulation to improve and extend our competences in vacuum technology. Beyond future colliders, these include: antimatter physics, which requires very low gas density; radioactive-beam physics, which imposes severe controls on contamination and gas exhausting; and gravitational-wave physics, for which the trade-off between cost and performance of vacuum systems is essential for the approval of next-generation observatories.

An orthogonal driver of innovation in vacuum technology is the reduction of costs and operational downtime of CERN’s accelerators. Achieving ultra-high vacuum in a matter of a few hours at a reduced cost would also have an impact well beyond the high-energy physics community. This and other challenges involved in fundamental exploration are guaranteed to drive further advances in vacuum technology.

The difficult work begins

The open symposium of the European Strategy for Particle Physics (ESPP) update, which drew to a close last week in Granada, Spain, was a moment for physicists to take stock of their field’s status and future. A week of high-quality presentations and focused discussions proved how far things have moved on since the previous strategy update concluded in 2013. In the past few years the LHC has proved the existence of the Higgs boson and so far suggested that there are no new particles beyond the SM at the electroweak scale. Spectacular progress has been made with neutrinos, dark-matter searches, flavour and electroweak physics, and gravitational-wave astronomy is beginning to take off. The deepest puzzles of the standard models of particle physics and cosmology remain at large, however, and large colliders are one of the best tools to address them.

Recommendations from the ESPP are due early next year. Dominating discussions at the open symposium last week was which project should succeed the LHC after its operations cease in the 2030s. The decision has significant consequences for the next generation of particle physicists, not just in Europe but internationally. Perspectives from Asia and the Americas, in addition to national views and inputs from the astroparticle- and nuclear-physics communities, brought into sharp focus the global nature of modern high-energy physics and the need for greater coordination at all levels.

The 130 or so talks and discussion sessions in Granada revealed a community united in its desire for a post-LHC collider, but less so in its choice of that collider’s form. Enormous efforts have gone into weighing up the physics reach of the various projects under study, a task complicated by the complexity of future accelerator technologies, detectors and analyses. Stimulating some heated exchanges, the ESPP saw the International Linear Collider (ILC) in Japan, a Compact Linear Collider (CLIC) or future circular electron–positron collider (FCC-ee) at CERN and a Circular Electron Positron Collider in China (CEPC) pitted against each other and against expectations from the high-luminosity LHC in terms of their potential in key areas such as Higgs physics.

Summary sessions

Summing up the situation for beyond-SM (BSM) physics, Gian Giudice of CERN said that the remaining BSM-physics space is “huge”, and pointed to four big questions for colliders: to what extent can we tell whether the Higgs is fundamental or composite? Are there new interactions or new particles around or above the electroweak scale? What cases of thermal relic WIMPs are still unprobed and can be fully covered by future collider searches? And to what extent can current or future accelerators probe feebly interacting sectors?

Neutrinos, the least well known of all the SM particles, were the subject of numerous presentations. The ESPP audience was reminded that neutrino masses, as established by neutrino oscillations, are the first particle-physics evidence for BSM phenomena. A vibrant programme is under way to fully measure the neutrino mixing matrix, and in particular the neutrino mass ordering and CP-violation phase. Other experiments are probing the neutrino’s absolute mass scale and testing whether neutrinos are Dirac or Majorana in nature. Along with gravitational waves, neutrinos play a powerful role in multimessenger astronomy.

Around a fifth of the 160 input documents to the ESPP were linked to flavour physics, covering topics such as lepton-flavour universality, electric-dipole moments and heavy-flavour studies.

Flavour physics is crucial for BSM searches since it is potentially sensitive to effects at scales as high as 10⁵ TeV, said Antonio Zoccoli of INFN in his summary. There is also much complementarity between low-energy physics, the high-energy frontier and searches for feebly interacting particles, he said. Oddities in b-decays seen by the LHCb collaboration are of particular interest. “Flavour is a major legacy of LHC,” Zoccoli concluded. “Charged hadron particle-ID should be mandatory for a full physics programme at future colliders.”

Summarising ESPP sessions on dark-matter and dark-sector physics, Shoji Asai of the University of Tokyo drew attention to a shift in sociology that is taking place. In the old view, dark-matter solutions arose as a byproduct of “top-down” approaches (such as supersymmetry) to solve the SM’s problems. The “new sociology” holds that dark matter needs an explanation of its own, and it’s to be considered a bonus if such a solution also elucidates important issues such as the strong-CP problem or baryogenesis. Among the “big questions” identified in this sector at the ESPP update were: What are the main differences between light hidden-sector dark matter and WIMPs? How broad is the parameter space for the QCD axion? How do we compare the results of different experiments in a more model-independent way? And how will direct and indirect dark-matter detection experiments inform/guide accelerator searches and vice versa? Asai said that consensus has emerged on the need for more coordination and support between accelerator-based direct detection and indirect detection dark-sector searches, as exemplified by the new European Center for AstroParticle Theory.

In summarising interests in the strong sector, Jorgen D’Hondt of Vrije Universiteit Brussel listed the many dedicated experiments in this area and the open questions identified at the ESPP symposium: “What are the experimental and theoretical prerequisites to reach an adequate precision of perturbative and non-perturbative QCD predictions at the highest energies? What can be learned from beams-on-target experiments at current and potential future accelerators? How to probe the quark–gluon plasma equation of state and to establish whether there is a first-order phase transition at high baryon density? What is known about the make-up of the proton (mass, radius, spin, etc) and how to extract it? And what is the role of strong interactions at very low and very high (up to astrophysical) energies?”

Electroweak sparks

Of all the scientific themes of the week, electroweak physics generated the most lively discussions, especially concerning how well the Higgs boson’s couplings to fermions, gauge bosons and to itself can be probed at current and future colliders. Summary speaker Beate Heinemann of DESY cautioned that such quantitative estimates should be treated with a degree of flexibility at this time, though a few things stand out: one is the impressive estimated performance from the HL-LHC in the next 15 or so years; another is that a long-term physics programme based on successive machines in a 100 km-circumference tunnel offers the largest overall physics reach on the Higgs boson and other key parameters. The long timescales required to master the technology for the next hadron collider were well noted. There is broad agreement that the next major collider after the LHC should collide electrons and positrons to fully explore the Higgs boson and make precision measurements of other electroweak parameters that are sensitive to phenomena at higher energy scales. Whether that machine is circular or linear, and built in Asia or Europe, are the billion-dollar questions facing the community now.

The closer involvement of particle physics with astroparticle physics, in particular following the discovery of gravitational waves, was the running theme of the open symposium. It was argued that, in terms of technology, next-generation gravitational-wave detectors such as the Einstein Telescope are essentially “accelerators without beams” and that CERN’s expertise in vacuum and cryogenic technologies (a result of the lab’s continual pursuit and execution of big-collider projects) would help to make such facilities a reality.

The closing discussion of the symposium offered a final hour for physicists to air their views, many of which were met with applause. Proponents of circular machines highlighted the high flexibility and exploratory potential of projects such as FCC-ee, pointing out that it would serve as an electroweak as well as a Higgs factory. Linear-minded participants cited factors such as the extendable nature of linacs, and the independence of their tunnels from a subsequent hadron collider. For others, the priority for CERN should be to enter negotiations as soon as possible for a 100 km tunnel in the Geneva region, buying time to decide which physics option should be installed. Warm applause followed a remark that CERN decides for itself what its next project should be, without relying on other labs. But there were reminders from others that high-energy physics is an international field and that, in times of scarce resources, all options should be considered.

The high-energy physics community has risen to the occasion of the ESPP update. New thinking, from basic theory to instrumentation, computing, analysis and global organisation, is clearly required to sustain the recent rate of progress. Now that the open symposium is over, the European Strategy Group (ESG) will start to prepare a briefing book. Further input can be submitted to the strategy secretariat during the next months, and at a special session organised by the European Committee for Future Accelerators on 14 July 2019 during the European Physical Society Conference on High Energy Physics in Ghent, Belgium. An ESG drafting session will take place on 20–24 January 2020 in Bad Honnef, Germany, and the update of the ESPP is due to be completed and approved by the CERN Council in May 2020.

Hans-Jürg Gerber 1929–2018

Swiss physicist Hans-Jürg Gerber passed away on 28 August last year. Born in Langnau/Kanton Bern, he studied and did his PhD from 1949 to 1959 at ETH Zurich on “Scattering and polarization effects of 3.27 MeV neutrons on deuterons”. He then worked at the University of Illinois in the US, before joining CERN from 1962 to 1968. There, he carried out experiments at the 28 GeV Proton Synchrotron (PS). He studied high-energy neutrino interactions using a spark chamber, and performed measurements of lepton universality. He also tested time-reversal invariance in the charged decay mode of the Λ hyperon. He was also PS coordinator from 1965 to 1966.

In 1968, Gerber became head of the research department at the Swiss Institute for Nuclear Physics (SIN). He was elected by the Swiss Federal Council to become associate professor of experimental physics in 1970, and in 1977 promoted to full professor. Gerber initiated basic research at SIN and later at the Paul Scherrer Institute (PSI) with his precision experiments on the decay of charged muons – experiments that continue to this day at PSI (see p45). His flair for the fundamental led to the most general determination of the leptonic four-fermion interaction for the normal and inverse muon decay using experimental data, which brought him international recognition.

In the 1980s and 1990s, Hans-Jürg returned to CERN to help set up and operate experiment PS195 (CPLEAR) for studying CP violation using a tagged neutral-kaon beam. The concept of the experiment, which involved tagging the flavour of the neutral kaon at the point of production, contrasted with the already operational kaon experiments based on K-short and K-long beams. As a skilled experimenter, he contributed significantly to the success of CPLEAR with unconventional ideas. For example, during a crisis when the liquid scintillator started to develop air bubbles due to the heat from nearby electronics, he invented a system to remove the air dissolved in the liquid using ultrasound. CPLEAR’s measurements on the violation of time-reversal invariance (T-invariance) and tests of quantum mechanics were the starting point for significant theoretical work he undertook on T, CP and CPT invariance.

Although he retired in the spring of 1997 after a long and extremely successful career, he continued working on particle physics, publishing on the interpretation of the CPLEAR results regarding tests of quantum mechanics and T- and CPT-violation. He was also a contributor to the Particle Data Group’s Review of Particle Physics.

Experiment, theory and teaching formed a unity for Hans-Jürg. This was particularly evident in his lectures, in which he enthusiastically conveyed the joy of physics to his students. We also remember dinners with Hans-Jürg after long working days setting up experiments, where we talked about all possible physics questions.

He is survived by his wife Hildegard, his three children and grandchildren.

Michael Atiyah 1929–2019

The eminent mathematician Michael Atiyah died in Edinburgh on 11 January, aged 89. He was one of the giants of mathematics whose work influenced an enormous range of subjects, including theoretical high-energy physics.

Atiyah’s most notable achievement, with Isadore Singer, is the “index theorem”, which occupied him for more than 20 years and generated results in topology, geometry and number theory using the analysis of elliptic differential operators. In mid-life, he learned that theoretical physicists also made use of the theorem and this opened the door to an interaction between the two disciplines, which he pursued energetically until the end of his life. It led him not only to mathematical results on Yang–Mills equations, but also to encouraging the importation of concepts from quantum field theory into pure mathematics.

Early years

Born of a Lebanese father and a Scottish mother, his early years were spent in English schools in the Middle East. He then followed the natural course for a budding mathematician in that environment by attending the University of Cambridge, where he ended up writing his thesis under William Hodge and becoming a fellow at Trinity College. As a student he had little interest in physics, but went to hear Dirac lecture largely because of his fame. The opportunity then arose to spend a year at the Institute for Advanced Study in Princeton in the US, where he met his future collaborators and close friends Raoul Bott, Fritz Hirzebruch and Singer.

A visit by Singer to the University of Oxford (where Atiyah had recently moved) in 1962 began the actual work on the index theorem, where the Dirac operator would play a fundamental role. This ultimately led to Atiyah being awarded a Fields Medal in 1966 and, with Singer, the Abel Prize in 2004. Over the years, proofs and refinements of the index theorem evolved. Although topology was at the forefront of the first approaches, in the early 1970s techniques using “heat kernels” became more analytic and closer to the calculations that theoretical physicists were performing, especially in the context of anomalies in quantum field theory. In the 1980s, in a proof by Luis Álvarez-Gaumé (who subsequently became a member of the CERN theoretical physics unit for 30 years), Hirzebruch’s polynomials in the Pontryagin classes – which form the topological expression for the index – emerged as a natural consequence of supersymmetry.

Singer visited Oxford again in 1977, this time bringing mathematical questions concerning Yang-Mills theory. Using quite sophisticated algebraic geometry and the novel work of Roger Penrose, this yielded a precise answer to physicists’ questions about instantons, specifically the so-called ADHM (Atiyah, Drinfeld, Hitchin, Manin) construction. That mathematicians and physicists had common ground in a completely new context made a huge impression on Michael, and he was energetic in facilitating this cooperation thereafter. He frequently engaged in correspondence and discussions with Edward Witten, out of which emerged the current fashion in mathematics of topological quantum field theories – beginning with a formalism that described new invariants of knots. Despite the quantum language of this domain, Michael’s mathematical work with a physical interface was more concerned with classical solutions, and the soliton-like behaviour of monopoles and skyrmions.

Founding father

During his life he took on many administrative tasks, including the presidency of the Royal Society and mastership of Trinity College. He was also the founding director of the Isaac Newton Institute for Mathematical Sciences in Cambridge.

With his naturally effervescent personality he possessed, in Singer’s words, “speed, depth, power and energy”. Collaborations were all-important, bouncing ideas around with both mathematicians and physicists. Beauty in mathematics was also a feature he took seriously, as was a respect for the mathematicians and physicists of the past. In later years he even campaigned successfully for a statue of James Clerk Maxwell to be erected in Edinburgh, his home city.

As for the index theorem itself, it is notable that one of the more subtle versions – the “mod 2 index” – played an important role in Kane and Mele’s theoretical prediction of topological insulators. As they wrote in their 2005 paper: “it distinguishes the quantum spin-Hall phase from the simple insulator.” A fitting tribute to an outstanding pure mathematician, whose intuition and technical power revealed so much in so many domains.

Dream machine

Muon colliders have been the topic of much discussion this week during the update of the European Strategy for Particle Physics (ESPP) in Granada. Proposals for such a machine are much less advanced than those for other projects under consideration for a post-LHC collider. A muon collider is therefore not being considered on the same footing as circular and linear electron–positron projects such as FCC-ee, CEPC, ILC and CLIC. Nevertheless, its potential to produce very high-energy collisions that can transfer all energy into the production of new particles in a relatively small facility is proving difficult for physicists to resist.

In one of 160 written inputs to the ESPP update, the Muon Collider Working Group points out that a 14 TeV muon collider provides an effective energy reach similar to that of a 100 TeV FCC proton–proton machine. In addition to its energy reach, the report states, a dedicated muon collider is able to precisely measure the Higgs boson’s mass and width, along with other observables: “A muon collider is therefore ideal to search for and/or study new physics and for resolving narrow resonances both as a precision facility and/or as an exploratory machine”. A 14 TeV muon machine would fit neatly into the tunnel that is currently occupied by the LHC, and in principle the technology could go to much higher energies.

Feynman diagram

Being some 200 times more massive than electrons, muons suffer significantly less from synchrotron-radiation losses and can therefore be accelerated efficiently in a circular machine. But reaching high luminosities is extremely tough owing to the short muon lifetime (2.2 μs at rest) and the difficulty of producing large numbers of muons in suitably shaped bunches.
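Both effects follow from simple relativistic kinematics, and a rough sketch makes the scales concrete. The 7 TeV beam energy below is chosen to match the 14 TeV collider discussed above; the code is our illustration, not the working group's:

```python
# Illustrative muon-collider kinematics: time dilation stretches the
# 2.2 us rest-frame lifetime, while synchrotron power per particle
# scales as 1/m^4 at fixed energy and bending radius.
M_MU_GEV = 0.1057    # muon mass
M_E_GEV = 0.000511   # electron mass
TAU_MU_S = 2.2e-6    # muon lifetime at rest

def lab_lifetime_s(beam_energy_gev: float) -> float:
    """Laboratory-frame lifetime gamma * tau for an ultra-relativistic muon."""
    gamma = beam_energy_gev / M_MU_GEV
    return gamma * TAU_MU_S

tau_lab = lab_lifetime_s(7000.0)          # ~0.15 s at 7 TeV per beam
suppression = (M_MU_GEV / M_E_GEV) ** 4   # ~2e9 less synchrotron power than electrons
```

Even a lifetime of roughly 0.15 s allows only of order a thousand turns in an LHC-sized ring, which is why rapid acceleration and cooling dominate the design challenge.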

One way to conquer this problem is to “cool” an initial low-energy muon beam, which has large transverse and longitudinal emittances, by several orders of magnitude in the 6D phase-space and then rapidly accelerate it. Last year, the UK-based Muon Ionisation Cooling Experiment (MICE) demonstrated the principle of ionisation cooling, by observing 4D cooling in a low-flux muon beam. “The results point the way to an exciting programme in which muon beams of high-brightness are exploited to seek new insights into the properties of neutrinos and to explore the energy frontier with multi-TeV lepton-antilepton collisions,” says MICE spokesperson Ken Long of Imperial College, London, who was present at the ESPP symposium. Fermilab in the US also pursued such technologies in its dedicated Muon Accelerator Program (MAP), while an experiment at J-PARC in Japan last year accelerated muons by a radio-frequency accelerator for the first time.

Recently, physicists in Italy and France proposed an alternative concept for a muon collider called the Low Emittance Muon Accelerator (LEMMA), which offers a naturally cooled muon beam with a long lifetime in the laboratory frame by capturing muon–antimuon pairs created in electron–positron annihilation.

The Muon Collider Working Group, which was established by CERN in 2017, has performed a first, high-level review of these two muon-collider schemes. “The focus has been on the positron-based scheme, which is really promising but has been found to require consolidation,” said Daniel Schulte of CERN, reporting the group’s findings in Granada this week. The group recommends that an international collaboration “promote muon colliders and organise the effort on the development of both accelerators and detectors and to define the road-map towards a CDR by the next ESPP update”. A muon collider would also produce muon neutrinos with a well-known flux and energy spectrum, serving as the source for a neutrino factory.

Given the broad spectrum of views expressed in Granada this week, both during the sessions and on the sidelines, the path to a muon collider could be bumpy. “At the Higgs pole, it’s not competitive with any of the other machines, although the perspective for Higgs physics at a very high-energy muon collider is actually very good,” pointed out one participant during a discussion session about Higgs and electroweak physics. “It’s total fantasy!” said another. But the view from the expert working group is more positive. Can muon colliders at this moment be considered for the next project? “Not yet,” said Schulte, but enormous progress has been made and it is clearly worthwhile to continue R&D. “A muon collider may be the best option for very high lepton collider energies beyond 3 TeV, and has strong synergies with other projects such as magnet and RF development,” concluded Schulte. “We should not miss this opportunity.”

Energy efficiency – a new frontier

A household freezer consumes about 1 kWh of electrical energy per day. An average household in Western Europe uses about 6 MWh per year. CERN’s total daily electricity consumption on average is about 3.5 GWh, about half of which is needed for the LHC. For reference, the total daily energy consumption of humankind is about 440 TWh – some three quarters of which is currently produced from a finite source of fossil fuels that is driving global temperature rises.
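These figures can be combined into a rough comparison; the arithmetic below simply restates the numbers quoted above, and the derived ratios are our own illustration:

```python
# Comparing the quoted consumption figures (input values from the text above).
freezer_kwh_day = 1.0
household_kwh_day = 6e3 / 365.0   # 6 MWh per year, expressed per day
cern_kwh_day = 3.5e6              # 3.5 GWh per day
world_kwh_day = 440e9             # 440 TWh per day

households_equiv = cern_kwh_day / household_kwh_day  # ~210,000 households
world_fraction = cern_kwh_day / world_kwh_day        # ~8 parts per million
```

By this reckoning, CERN draws roughly as much electricity per day as 200,000 Western European households, yet only a few millionths of humankind's daily energy use.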

Speaking on the second day of the update of the European Strategy for Particle Physics, which is taking place this week in Granada, Erk Jensen of CERN used these striking figures to illustrate the importance of energy efficiency in high-energy physics. In some proposals for post-LHC colliders, the energy consumed by radio-frequency (RF) and cryogenics systems is in the region of gigawatt hours per day, he said. This puts accelerators into the range where they become relevant for society and public discussion.

The production of renewable energy is enjoying huge growth. Recently, the SESAME lightsource in Jordan became the first major accelerator facility to be powered entirely by renewable energy (solar). For larger research infrastructures, in the absence of better energy-storage technologies, this approach is not yet realistic. The only alternative is to make facilities much more energy efficient. “This is our duty to society, but also a necessity for acceptance!” stated Jensen. The large scale of projects in high-energy physics allows dedicated R&D into more efficient technologies and practices to take place. Not only would this bring significant cost savings, explained Jensen, but concepts and designs developed to improve energy efficiency in accelerators will be relevant for society at large.

A new energy-management panel established at CERN in 2015 has already led to actions that significantly reduce energy consumption in specific areas. These include 5 GWh/y from free cooling and air-flow optimisation, 20 GWh/y from better optimised LHC cryogenics, and 40 GWh/y from the implementation of SPS magnetic cycles and stand-by modes. Recovering waste heat is another line of attack. A project at CERN that is now in its final phase will see thermal energy from LHC Point 8 (where the LHCb experiment is located) delivered to a heating network in the nearby town of Ferney-Voltaire.

For a collider that requires an RF power of 105 MW, such as the proposed electron–positron Future Circular Collider, a 10-percentage-point increase (from 70% to 80%) in the efficiency of technologies such as high-efficiency klystrons could reduce energy consumption by around 1 TWh over a period of 10 years. This corresponds to a saving of tens of millions of Swiss francs. The adoption of novel neon–helium refrigeration cycles to cool the magnets of a future hadron collider could save up to 3 TWh in 10 years, offering even greater cost reductions. Such savings could, for example, go into R&D for higher-performance power converters, better designed magnets and RF cavities, and other technologies. Novel accelerator schemes such as energy recovery linacs are another way in which the field can reduce the energy consumption and thus cost of its machines. “Energy efficiency is not an option, it is a must!” concluded Jensen. “A few million investment, to my mind, is well worth it.”
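The 1 TWh figure can be reproduced with back-of-the-envelope arithmetic, provided one assumes something like 5000 operating hours per year; that operating-hours figure is our assumption, not stated in the text:

```python
# Back-of-the-envelope check of the klystron-efficiency saving.
# ASSUMPTION: ~5000 operating hours per year (not stated in the text).
RF_POWER_MW = 105.0
HOURS_PER_YEAR = 5000.0

def grid_energy_twh(efficiency: float, years: int = 10) -> float:
    """Grid energy drawn to deliver the RF power at a given wall-plug efficiency."""
    grid_mw = RF_POWER_MW / efficiency
    return grid_mw * HOURS_PER_YEAR * years / 1e6  # MWh -> TWh

saving_twh = grid_energy_twh(0.70) - grid_energy_twh(0.80)  # ~0.94 TWh over 10 years
```

At a rough electricity price of 50 CHF per MWh, ~0.9 TWh indeed corresponds to several tens of millions of Swiss francs, consistent with the saving quoted above.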

Sustainable future

Energy efficiency is one of several important factors in making high-energy physics more sustainable in the long term. In one of the 160 written inputs to the ESPP update, Veronique Boisvert of Royal Holloway, University of London, and colleagues made three recommendations to ensure a more sustainable future in view of climate change.

The first is that European laboratories and funding agencies should include, as part of their grant-giving process, criteria evaluating the energy efficiency and carbon footprint of particle physics proposals. The second is that designs for major experiments and their associated buildings should consider plans for reduction of energy consumption, increased energy efficiency, energy recovery and carbon offset mechanisms. The third is that European laboratories should invest in the development and affordable deployment of next-generation digital meeting spaces such as virtual-reality tools, to minimise travel.

“Following the Paris agreement, it will be imperative to have a climate-neutral Europe by 2050,” says Boisvert. “It is therefore vital that big-science initiatives lead the way in greening their technologies and facilities.”

Addressing the outstanding questions

The success of the Standard Model (SM) in describing elementary particles and their interactions is beyond doubt. Yet, as an all-encompassing theory of nature, it falls short. Why are the fermions arranged into three neat families? Why do neutrinos have a vanishingly small but non-zero mass? Why does the Higgs boson that was discovered fit the simplest “toy model” of itself? And what lies beneath the SM’s 26 free parameters? Similarly profound questions persist in the universe at large: the mechanism of inflation; the matter–antimatter asymmetry; and the nature of dark energy and dark matter.

Surveying outstanding questions in particle physics during the opening session of the update of the European Strategy for Particle Physics (ESPP) on Monday, theorist Pilar Hernández of the University of Valencia discussed the SM’s unique weirdness. Quoting Newton’s assertion “that truth is ever to be found in simplicity, and not in the multiplicity and confusion of things”, she argued that a deeper theory is needed to solve the model’s many puzzles. “At some energy scale the SM stops making sense, so there is a cut-off,” she stated. “The question is where?”

This known unknown has occupied theorists ever since the SM came into existence. If it is assumed that the natural cut-off is the Planck scale – the scale at which gravity becomes relevant to the quantum world, some 12 orders of magnitude above the energies of the LHC – then fine tuning is necessary to explain why the Higgs boson (which generates mass via its interactions) is so light. Traditional theoretical solutions to this hierarchy problem – such as supersymmetry or large extra dimensions – imply the existence of new phenomena at scales not far above the mass of the Higgs boson. While initial results from the LHC severely constrain the most natural parameter spaces, the 10–100 TeV region is still an interesting scale to explore, says Hernández. At the same time, continues Hernández, there is a shift to more “bottom-up, rather than top-down” approaches to beyond-SM (BSM) physics. “Particle physics could be heading to crisis or revolution. New BSM avenues focus on solving open problems such as the flavour puzzle, the origin of neutrino masses and the baryon asymmetry at lower scales.”

Introducing a “motivational toolkit” to plough the new territories ahead, Hernández named targets such as axion-like and long-lived particles, and the search for connections between the SM’s various puzzles. She noted in particular that 23 of the 26 free parameters of the SM are related in one way or another to the Higgs boson. “If we are looking for the suspect that could be hiding some secret, obviously the Higgs is the one!”

Linear versus circular

The accelerator, detector and computing technology needed for future fundamental exploration was the main focus of the scientific plenary session on day one of the ESPP update. Reviewing Higgs-factory programmes, Vladimir Shiltsev, head of Fermilab’s Accelerator Physics Center, weighed up the pros and cons of linear versus circular machines. The former include the International Linear Collider (ILC) and the Compact Linear Collider (CLIC); the latter, a future circular electron–positron collider at CERN (FCC-ee) and the Circular Electron Positron Collider in China (CEPC). All need a high luminosity at the Higgs energy scale.

Linear colliders, said Shiltsev, are based on mature designs and organisation, are expandable to higher energies, and draw a wall-plug power similar to that of the LHC. On the other hand, they face potential challenges linked to their luminosity spectrum and beam current. Circular Higgs factories are also based on mature technology, with a strong global collaboration in the case of FCC. They offer a higher luminosity and more interaction points than linear options but require strategic R&D into high-efficiency RF sources and superconducting cavities, said Shiltsev. He also described a potential muon collider with a centre-of-mass energy of 126 GeV, which could be realised in a machine as short as 10 km. Although the cost would be relatively low, he said, the technology is not yet ready.

For energy-frontier colliders, the three current options – CERN’s HE-LHC (27 TeV) and FCC-hh (100 TeV), and China’s SppC (75 TeV) – demand high-field superconducting dipole magnets. These machines also present challenges in dealing with extreme levels of synchrotron radiation, collimation, injection, and overall machine design and energy efficiency. In a talk about the state of the art and challenges in accelerator technology, Akira Yamamoto of CERN/KEK argued that, while a lepton collider could begin construction in the next few years, the dipoles necessary for a hadron collider might take 10 to 15 years of R&D before construction could start. There are natural constraints in such advanced-magnet development regardless of budget and manpower, he remarked.

Concerning more futuristic acceleration technologies based on plasma wakefields, which offer accelerating gradients up to a factor of 1000 higher than today’s RF systems, impressive results have been achieved recently at facilities such as BELLA at Berkeley and AWAKE at CERN. Responding to a question about when these technologies might supersede current ones, Shiltsev said: “Hopefully 20–30 years from now we should be able to know how many thousands of TeV will be possible by the end of the century.”

Recognising detectors and computing

An energy-frontier hadron collider would produce radiation environments that current detectors cannot deal with, said Francesco Forti of INFN and the University of Pisa in his talk about the technological challenges of particle-physics experiments. Another difficulty for detectors is how to handle non-standard physics signals, such as long-lived particles and monopoles. Like accelerators, detectors require long time scales – it was the very early 1990s when the first LHC detector CDRs were written. From colliders to fixed-target to astrophysics experiments, detectors in high-energy physics face a huge variety of operating conditions and employ technologies that are often deeply entwined with developments in industry. The environmental credentials of detectors are also increasingly in the spotlight.

The focus of detector R&D should follow a “70–20–10” model, whereby 70% of efforts go into current detectors, 20% into future detectors and 10% into blue-sky R&D, argued Forti. Given that detector expertise is distributed among many institutions, the field also needs solid co-ordination. Forti cited CERN’s “RD” projects in diamond detectors, silicon radiation-hard devices, micro-pattern gas detectors and pixel readout chips for ATLAS and CMS as good examples of coordination towards common goals. Finally, he argued strongly for greater consideration of the “human factor”, stating that the current career model “just doesn’t work very well”. Your average particle physicist cannot be expert and innovative simultaneously in analysis, detectors, computing, teaching, outreach and other areas, he reasoned. “Career opportunities for detector physicists must be greatly strengthened and kept open in a systematic way,” he said. “Invest in the people and in the murky future.”

Computing for high-energy physics faces similar challenges. “There is an increasing gap between early-career physicists and the profile needed to program new architectures, such as greater parallelisation,” said Simone Campana of CERN and the HEP Software Foundation in a presentation about future computing challenges. “We should recognise the efforts of those who specialise in software because they can really change things like the speed of analyses and simulations.”

In terms of data processing, the HL-LHC presents a particular challenge. DUNE, FAIR, Belle II and other experiments will also create massive data samples, as will the generation of Monte Carlo samples. “Computing resources in HEP will be more constrained in the future,” said Campana. “We enter a regime where existing projects are entering a challenging phase, and many new projects are competing for resources – not just in HEP but in other sciences, too.” At the same time, the rate of advances in hardware performance has slowed in recent years, encouraging the community to adapt in order to take advantage of developments such as GPUs, high-performance computing and commercial cloud services.

The HEP Software Foundation released a community white paper in 2018 setting out the radical changes in computing and software – not just for processing but also for data storage and management – required to ensure the success of the LHC and other high-energy physics experiments into the 2020s.

Closing out

Closer examination of linear and circular colliders took place during subsequent parallel sessions on the first day of the ESPP update. Dark matter, flavour physics and electroweak and Higgs measurements were the other parallel themes. A final discussion session focusing on the capability of future machines for precision Higgs physics generated particularly lively exchanges between participants. It illuminated both the immensity of efforts to evaluate the physics reach of the high-luminosity LHC and future colliders, and the unenviable task faced by ESPP committees in deciding which post-LHC project is best for the field. It’s a point summed up well in the opening address by the chair of the ESPP strategy secretariat, Halina Abramowicz: “This is a very strange symposium. Normally we discuss results at conferences, but here we are discussing future results.”

Communicating the next collider

These days, in certain parts of the world at least, “hadron collider” and “Higgs boson” are practically household names. This is a consequence of the LHC, and the global communications that surrounded its construction, switch-on and eventual discovery of the Higgs boson. How should the next major project in particle physics be communicated to ensure its reach and success?

Communication is increasingly seen as integral to the research process, and is one of the strands of the open symposium of the European Strategy for Particle Physics (ESPP) update, which takes place from today in Granada, Spain. The ESPP update takes on board worldwide activities in particle physics and related topics, and is due to conclude early next year. It aims to reach a consensus on the scientific goals of the community and to assess the proposed projects and technologies to achieve those goals. Though no decisions will be made now, it is hoped that the process will bring clarity to the question of which project will succeed the LHC following the end of its high-luminosity operations in the mid-2030s.

The landscape of communications has changed dramatically since the pre-web/mobile days of the nascent LHC. In one of 160 written contributions to the ESPP update, the International Particle Physics Outreach Group (IPPOG) emphasises the strategic relevance of concerted, global outreach activities for future colliders, stating that the success of such endeavours “depends greatly on the establishment of broad public support, as well as the commitment of key stakeholders and policymakers throughout Europe and the world”. IPPOG proposes that particle physics outreach and communication be explicitly recognised as strategic pillars in the final ESPP update document in 2020.

A contribution from the European Particle Physics Communication Network with support from the Interactions Collaboration highlights specific challenges that communicators face. These include the pace of change in social media, the speed of dissemination of good news, bad news and rumours, and the need to maintain trust and transparency in an era where there appears to be a popular backlash against expert opinion. The document notes the complexities of maintaining press interest over long timescales, and in conveying the costs involved: “Proposals for major international particle-physics experiments are infrequent, and when they are proposed, they seem disproportionately expensive when compared to other science disciplines”. A plenary talk on education, communication and outreach will take place on Wednesday at this week’s symposium. The European Strategy Group has also established a working group to recognise and support researchers who devote their time to such activities.

Consensus in the community is a further factor for communications. In the early 1990s, when the LHC was seeking approval, there was broad agreement that a circular hadron collider was the right step for the field. The machine had a new energy territory to explore and a clear physics target (the mechanism of electroweak symmetry breaking), around which narratives could be built. The ability to witness the construction of the LHC and its four detectors was itself a massive draw. On the big-collider menu today, against a backdrop of the LHC’s discovery of a light Higgs boson but no particles beyond the Standard Model, are an International Linear Collider in Japan, a Compact Linear Collider or Future Circular Collider at CERN, and a Circular Electron Positron Collider in China. The projects would span decades and may require international collaboration on an entirely new scale. While not all equally mature, each has its own detailed physics and technology case that will be dissected this week.

Whether straight or circular, European or Asian, the next big collider requires a fresh narrative if it is to inspire the wider world. The rosy picture of eager experimentalists uncovering new elementary particles and wispy-haired theorists travelling to Stockholm to pick up prizes seems antiquated, now that all the particles of the Standard Model have been found. Short of major new theoretical insights, the best signposts in the dark and possibly hidden sectors ahead may come from experimental exploration. As Nima Arkani-Hamed put it recently in an interview with the Courier: “When theorists are more confused, it’s the time for more, not less experiments.”

Interrogating the Higgs boson – a completely new form of scalar matter with connections to the dynamics of the vacuum and other deep puzzles in the Standard Model – is the focus of all future collider proposals. Direct and indirect searches for new physics at much higher energy scales are another. However, as the range of contributions to the ESPP update illustrates – a point that is integral to communications efforts – frontier colliders are only one tool to enable progress. Enigmas such as dark matter and dark energy are being probed from multiple angles, both on the ground and in space; gravitational-wave astronomy is revolutionising astroparticle physics. Experiments large and small are closing in on the neutrino’s unique properties; heavy-ion, flavour, antimatter, fixed-target and numerous other programmes are thriving. A CERN initiative to specifically explore experimental programmes beyond high-energy colliders is advancing rapidly.

The LHC has demonstrated that there is a huge public appetite for the abstract, mind-expanding science made possible by awesomely large machines. There is no reason to think that the next leg of the journey in fundamental exploration will be any less inspiring, and every reason to shout about its impact. Above and beyond the knowledge it creates and the advanced technologies it drives, particle physics is one of the fields that attract young people into STEM subjects, many going on to pursue more applied research or industry careers. Large research infrastructures also have direct, though little-reported, economic and societal benefits. Last but not least, the success of big science sends a positive message about human progress and global collaboration at a time when many nations are looking inwards. Clearly, engaging the public, politicians and fellow scientists in the next high-energy physics adventure presents a golden opportunity for those of us in the comms business.

For now, though, it’s over to the 600 or so physicists here in Granada to carve out the new physics avenues ahead. The Courier will be following discussions throughout the week in an attempt to unravel the big picture.

Science communication: a new frontier

In the world of communication, everyone has a role to play. During the past two decades, the ability of researchers to communicate their work to funding agencies, policymakers, entrepreneurs and the public at large has become an increasingly important part of their job. Scientists play a fundamental role in society, generally enjoying an authoritative status, and this makes us accountable.

Science communication is not just a way to share knowledge, it is also about educating new generations in the scientific approach and attracting young people to scientific careers. In addition, fundamental research drives the development of technology and innovation, playing an important role in providing solutions in challenging areas such as health care, the provision of food and safety. This obliges researchers to disseminate the results of their work.

Evolving attitudes

Although science communication is becoming increasingly unavoidable, the skills it requires are not yet universal and some scientists are not prepared for it. Of course there are risks involved. Communication can distract individuals from research and objectives, or, if done badly, can undermine the very messages that the scientist needs to convey. The European Researchers’ Night, a highly successful annual event initiated in 2005 as a European Commission Marie Skłodowska-Curie Action, offers an opportunity for scientists to get more involved in science communication. It takes place on the last Friday of September each year, and illustrates how quickly attitudes are evolving.

In 2006, with a small group of researchers from the Italian National Institute for Nuclear Physics (INFN) located close to Frascati, we took part in one of the first Researchers’ Night events. Frascati is surrounded by important scientific institutions and universities, and from the start the Italian National Agency for New Technologies, Energy and Sustainable Economic Development, the European Space Agency and the National Institute for Astrophysics joined the collaboration with INFN, along with the Municipality of Frascati and the Cultural and Research Department of the Lazio region, which co-funded the initiative.

Since then, thousands of researchers, citizens, public and private institutions have worked together to change the public perception of science and of the infrastructure in the Frascati and Lazio regions, supported by the programme. Today, after 13 editions, it involves more than 60 scientific partners spread from the north to the south of Italy in 30 cities, and attracts more than 50,000 attendees, with significant media impact (figure 1). Moreover, it has now evolved to become a week-long event, is linked to many related events throughout the year, and has triggered many institutions to develop their own science-communication projects.

Analysing the successive Frascati Researchers’ Night projects allows a better understanding of how science-communication methodology has evolved. Back in 2006, scientists started to open their laboratories and research infrastructures to present their work in the most comprehensible way, with a view to increasing the scientific literacy of the public and filling their “deficit” of knowledge. They then tried to create a direct dialogue by meeting people in public spaces such as squares and bars, discussing the more practical aspects of science, such as how public money is spent and the extent to which researchers are responsible for their work. Those were the years in which the socio-economic crisis started to unfold. It was also the beginning of the European Union’s Horizon 2020 programme, when economic growth and terms such as “innovation” started to substitute for scientific progress and discovery. It was therefore becoming more important than ever to engage with the public and keep the science flag flying.

In recent years, this approach has changed. Two biennial projects that are also part of a Marie Skłodowska-Curie Action – Made in Science and BEES (BE a citizEn Scientist) – underline a different vision of science and of the methodology of communication. Made in Science (which ran between 2016 and 2017) was conceived to represent the “trademark” of research, aiming to communicate to society the importance of the science production chain in terms of quality, identity, creativity, know-how and responsibility. In this chain, which starts from fundamental research and ends with social benefits, no one is excluded: everyone can take part in the decision process and, where possible, in the research itself. Its successor, BEES (2018–2019), on the other hand, aims to bring citizens up close to the discovery process, showing how long it takes and how it can be tough and frustrating. Both projects follow the most recent trends in science communication, based on a participative or “public engagement” model rather than the traditional “deficit” model. Here, researchers are not the main actors but facilitators of the learning process, with a specific role: that of the expert.

Nerd or not a nerd?

Nevertheless, this evolution of science communication isn’t all positive. There are many examples of problems in science communication: the explosion of concerns about science (vaccines, autism, GMOs, homeopathy, etc.); the avoidance of science and technology in favour of a return to a more “natural” life; the exploitation of science results (positive or negative) to support conspiracy theories or influence democracies; and the overplaying of benefits for knowledge and technology transfer, to list a few examples. Last but not least, some strong biases still remain among both scientists and audiences, limiting the effectiveness of communication.

The first, and probably the hardest, is the stereotype bias: are you a “nerd”, or do you feel like a nerd? Often scientists refer to themselves as a category that can’t be understood by society, consequently limiting their capacity to interact with the public. On the other hand, scientists are sometimes real nerds, and seen by the public as nerds. This is true for all job categories, but in the case of scientists this strongly conditions their ability to communicate.

Age, gender and technological bias also still play a fundamental role, especially in the most developed European countries. Young people may understand science and technology more easily, while women still do not seem to have full access to scientific careers and to the exploitation of technology. Although the transition from a deficit to a participative model is already common in education and democratic societies, it is not yet complete in science, likely because of the strong biases that still exist among researchers and audiences. The Marie Skłodowska-Curie European Researchers’ Night is a powerful way in which scientists can address such issues.

Artistic encounters of the quantum kind

Take a leap and enter, past the chalkboard wall filled with mathematical equations written, erased and written again, into the darkened room of projected questions where it all begins. What is reality? How do we describe nature? And for that matter, what is science and what is art?

Quàntica, which opened on 9 April at the Centre de Cultura Contemporània de Barcelona, invites you to explore quantum physics through the lens of both art and science. Curated by Mónica Bello, head of Arts at CERN, and art curator José-Carlos Mariátegui, with particle physicist José Ignacio Latorre serving as its scientific adviser, Quàntica is the second iteration of an exhibition that brings together 10 artworks resulting from Collide International art residencies at CERN.

The exhibition illustrates how interdisciplinary intersections can present scientific concepts regarded by the wider public as esoteric in ways that bridge the gap, engage the senses and create meaning. Punctuating each piece is the idea that the principles of quantum physics, whether we like it or not, are pervasive in our lives today – from technological applications in smartphones and satellites to our philosophies and world views.

Nine key concepts – “scales”, “quantum states”, “overlap”, “intertwining”, “indeterminacy”, “randomness”, “open science”, “everyday quantum” and “change-evolution” – guide visitors through the meandering hallway. Each display point prompts a pause to consider a question that underlies the fundamental principles of quantum physics. Juxtaposed in the shared space are an artist-made particle detector and parts of experiments displayed as artistic objects. Video art installations are interspersed with video interviews of CERN physicists, including Helga Timko, who asks: if we were to teach children quantum physics at a very young age, would they perceive the world as we do? On the ceiling above is a projection of a spiral galaxy, part of Juan Cortés’ Supralunar. Inspired by Vera Rubin’s work on dark matter and the rotational motion of galaxies, Cortés made a two-part multisensorial installation: a lens through which you see flashing lights, and vibrating plates on which to rest your chin and experience, on some level, the intensity of a galaxy’s formation.

From the very large scale, move to the very small. A recording of Richard Feynman explaining the astonishing double-slit experiment plays next to a standing demonstration allowing you to observe the counterintuitive possibilities that exist at the subatomic level. You can put on goofy glasses for Lea Porsager’s Cosmic Strike, an artwork with a sense of humour, which offers an immersive 3D animation described as “hard science and loopy mysticism”. She engages the audience’s imagination to meditate on being a neutrino as it travels through the neutrino horn, one of the many scientific artefacts from CERN’s archives that pepper the path.

Around the corner are Erwin Schrödinger’s 1935 article, in which he first used the word “Verschränkung” (entanglement), and Anton Zeilinger’s notes explaining the protocol for quantum teleportation. Above these is projected a scene from Star Trek, which popularised the idea of teleportation.

The most visually striking piece in the exhibition is Cascade by Yunchul Kim, made up of three live elements. The first is Argos (see image), splayed metallic hands that hang like lamps from the ceiling – an operational muon detector whose 41 channels blink as it records the particles passing through the gallery. Each signal triggers the second element, Impulse, a chandelier-like fluid-transfer system that sends drops of liquid through microtubes into the transparent veins of the final element, Tubular. Kim, who won the 2016 Arts at CERN Collide International Award, is an artist who employs rigorous methods and experiments in his laboratory with liquids and materials. Cascade encapsulates the surprising results that knowledge-sharing can yield.

Quàntica is a must-see for anyone who views art and science as opposite ends of the academic spectrum. The first version of the exhibition was held in Liverpool in the UK last year. Co-produced by the ScANNER network (CERN, FACT, CCCB, iMAL and Le Lieu Unique), the exhibition continues until 24 September in Barcelona, before travelling to Brussels.
