
Energy efficiency – a new frontier

A household freezer consumes about 1 kWh of electrical energy per day. An average household in Western Europe uses about 6 MWh per year. CERN’s total daily electricity consumption is on average about 3.5 GWh, about half of which is needed for the LHC. For reference, the total daily energy consumption of humankind is about 440 TWh – some three quarters of which is currently produced from finite fossil-fuel sources that are driving global temperature rises.
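For readers who want to check the arithmetic, the ratios implied by these rounded figures are easy to compute. A minimal sketch in Python, using only the numbers quoted above:

# Rough energy-scale comparison using the rounded figures quoted above.
freezer_kwh_per_day = 1.0                  # household freezer
household_kwh_per_day = 6_000 / 365        # 6 MWh per year, Western European household
cern_kwh_per_day = 3.5e6                   # 3.5 GWh per day, CERN total
world_kwh_per_day = 440e9                  # 440 TWh per day, humankind

print(f"CERN uses as much per day as ~{cern_kwh_per_day / household_kwh_per_day:,.0f} households")
print(f"CERN's share of world consumption: ~{cern_kwh_per_day / world_kwh_per_day * 1e6:.0f} parts per million")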

Speaking on the second day of the update of the European Strategy for Particle Physics, which is taking place this week in Granada, Erk Jensen of CERN used these striking figures to illustrate the importance of energy efficiency in high-energy physics. In some proposals for post-LHC colliders, the energy consumed by radio-frequency (RF) and cryogenics systems is in the region of gigawatt hours per day, he said. This puts accelerators into the range where they become relevant for society and public discussion.

The production of renewable energy is enjoying huge growth. Recently, the SESAME lightsource in Jordan became the first major accelerator facility to be powered entirely by renewable energy (solar). For larger research infrastructures, in the absence of better energy-storage technologies, this approach is not yet realistic. The only alternative is to make facilities much more energy efficient. “This is our duty to society, but also a necessity for acceptance!” stated Jensen. The large scale of projects in high-energy physics allows dedicated R&D into more efficient technologies and practices to take place. Not only would this bring significant cost savings, explained Jensen, but concepts and designs developed to improve energy efficiency in accelerators will be relevant for society at large.

A new energy-management panel established at CERN in 2015 has already led to actions that significantly reduce energy consumption in specific areas. These include 5 GWh/y from free cooling and air-flow optimisation, 20 GWh/y from better optimised LHC cryogenics, and 40 GWh/y from the implementation of optimised SPS magnetic cycles and stand-by modes. Recovering waste heat is another line of attack. A project at CERN that is now in its final phase will see thermal energy from LHC Point 8 (where the LHCb experiment is located) supplied to a heating network in the nearby town of Ferney-Voltaire.

For a collider that requires an RF power of 105 MW, such as the proposed electron–positron Future Circular Collider, a 10-percentage-point increase in efficiency (from 70% to 80%) in technologies such as high-efficiency klystrons could reduce energy consumption by around 1 TWh over a period of 10 years. This corresponds to a saving of tens of millions of Swiss francs. The adoption of novel neon–helium refrigeration cycles to cool the magnets of a future hadron collider could save up to 3 TWh in 10 years, offering even greater cost reductions. Such savings could, for example, go into R&D for higher-performance power converters, better-designed magnets and RF cavities, and other technologies. Novel accelerator schemes such as energy-recovery linacs are another way in which the field can reduce the energy consumption, and thus the cost, of its machines. “Energy efficiency is not an option, it is a must!” concluded Jensen. “A few million investment, to my mind, is well worth it.”
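The scale of the quoted klystron saving can be reproduced with back-of-the-envelope arithmetic. The sketch below assumes roughly 6000 operating hours per year, a figure not given in the talk; with that assumption the step from 70% to 80% efficiency yields about the 1 TWh quoted:

# Back-of-the-envelope check of the klystron-efficiency saving quoted above.
rf_power_mw = 105           # RF power required by the collider
eta_low, eta_high = 0.70, 0.80
hours_per_year = 6000       # assumed operating hours per year (illustrative)
years = 10

wall_plug_low = rf_power_mw / eta_low      # MW drawn from the grid at 70% efficiency
wall_plug_high = rf_power_mw / eta_high    # MW drawn from the grid at 80% efficiency

saving_twh = (wall_plug_low - wall_plug_high) * hours_per_year * years / 1e6
print(f"Saving over {years} years: ~{saving_twh:.1f} TWh")   # ~1.1 TWh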

Sustainable future

Energy efficiency is one of several important factors in making high-energy physics more sustainable in the long term. In one of the 160 written inputs to the ESPP, Veronique Boisvert of Royal Holloway, University of London, and colleagues made three recommendations to ensure a more sustainable future in view of climate change.

The first is that European laboratories and funding agencies should include, as part of their grant-giving process, criteria evaluating the energy efficiency and carbon footprint of particle physics proposals. The second is that designs for major experiments and their associated buildings should consider plans for reduction of energy consumption, increased energy efficiency, energy recovery and carbon offset mechanisms. The third is that European laboratories should invest in the development and affordable deployment of next-generation digital meeting spaces such as virtual-reality tools, to minimise travel.

“Following the Paris agreement, it will be imperative to have a climate-neutral Europe by 2050,” says Boisvert. “It is therefore vital that big-science initiatives lead the way in greening their technologies and facilities.”

Addressing the outstanding questions

The success of the Standard Model (SM) in describing elementary particles and their interactions is beyond doubt. Yet, as an all-encompassing theory of nature, it falls short. Why are the fermions arranged into three neat families? Why do neutrinos have a vanishingly small but non-zero mass? Why does the Higgs boson, as discovered, fit the simplest “toy model” of itself? And what lies beneath the SM’s 26 free parameters? Similarly profound questions persist in the universe at large: the mechanism of inflation; the matter–antimatter asymmetry; and the nature of dark energy and dark matter.

Surveying outstanding questions in particle physics during the opening session of the update of the European Strategy for Particle Physics (ESPP) on Monday, theorist Pilar Hernández of the University of Valencia discussed the SM’s unique weirdness. Quoting Newton’s assertion “that truth is ever to be found in simplicity, and not in the multiplicity and confusion of things”, she argued that a deeper theory is needed to solve the model’s many puzzles. “At some energy scale the SM stops making sense, so there is a cut off,” she stated. “The question is where?”

This known unknown has occupied theorists ever since the SM came into existence. If it is assumed that the natural cut-off is the Planck scale – some 15 orders of magnitude above LHC energies, where gravity becomes relevant to the quantum world – then fine tuning is necessary to explain why the Higgs boson (which generates mass via its interactions) is so light. Traditional theoretical solutions to this hierarchy problem – such as supersymmetry or large extra dimensions – imply the existence of new phenomena at scales not far above the mass of the Higgs boson. While initial results from the LHC severely constrain the most natural parameter spaces, the 10–100 TeV region is still an interesting scale to explore, says Hernández. At the same time, continues Hernández, there is a shift to more “bottom-up”, rather than “top-down”, approaches to beyond-SM (BSM) physics. “Particle physics could be heading to crisis or revolution. New BSM avenues focus on solving open problems such as the flavour puzzle, the origin of neutrino masses and the baryon asymmetry at lower scales.”

Introducing a “motivational toolkit” to plough the new territories ahead, Hernández named targets such as axion-like and long-lived particles, and the search for connections between the SM’s various puzzles. She noted in particular that 23 of the 26 free parameters of the SM are related in one way or another to the Higgs boson. “If we are looking for the suspect that could be hiding some secret, obviously the Higgs is the one!”

Linear versus circular

The accelerator, detector and computing technology needed for future fundamental exploration was the main focus of the scientific plenary session on day one of the ESPP update. Reviewing Higgs factory programmes, Vladimir Shiltsev, head of Fermilab’s Accelerator Physics Center, weighed up the pros and cons of linear versus circular machines. The former category includes the International Linear Collider (ILC) and the Compact Linear Collider (CLIC); the latter a future circular electron–positron collider at CERN (FCC-ee) and the Circular Electron Positron Collider in China (CEPC). All need high luminosity at the Higgs energy scale.

Linear colliders, said Shiltsev, are based on mature designs and organisation, are expandable to higher energies, and draw a wall-plug power similar to that of the LHC. On the other hand, they face potential challenges linked to their luminosity spectrum and beam current. Circular Higgs factories are also based on mature technology, with a strong global collaboration in the case of FCC. They offer a higher luminosity and more interaction points than linear options but require strategic R&D into high-efficiency RF sources and superconducting cavities, said Shiltsev. He also described a potential muon collider with a centre-of-mass energy of 126 GeV, which could be realised in a machine as short as 10 km. Although the cost would be relatively low, he said, the technology is not yet ready.

[Image: Coffee break at the open symposium of the European Strategy for Particle Physics]

For energy-frontier colliders, the three current options – CERN’s HE-LHC (27 TeV) and FCC-hh (100 TeV), and China’s SppC (75 TeV) – demand high-field superconducting dipole magnets. These machines also present challenges such as how to deal with extreme levels of synchrotron radiation, collimation, injection and the overall machine design and energy efficiency. In a talk about the state-of-the-art and challenges in accelerator technology, Akira Yamamoto of CERN/KEK argued that, while a lepton collider could begin construction in the next few years, the dipoles necessary for a hadron collider might take 10 to 15 years of R&D before construction could start. There are natural constraints in such advanced-magnet development regardless of budget and manpower, he remarked.

Concerning more futuristic acceleration technologies based on plasma wakefields, which offer accelerating gradients roughly a factor of 1000 higher than today’s RF systems, impressive results have been achieved recently at facilities such as BELLA at Berkeley and AWAKE at CERN. Responding to a question about when these technologies might supersede current ones, Shiltsev said: “Hopefully 20–30 years from now we should be able to know how many thousands of TeV will be possible by the end of the century.”

Recognising detectors and computing

An energy-frontier hadron collider would produce radiation environments that current detectors cannot deal with, said Francesco Forti of INFN and the University of Pisa in his talk about the technological challenges of particle-physics experiments. Another difficulty for detectors is how to handle non-standard physics signals, such as long-lived particles and monopoles. Like accelerators, detectors require long time scales – it was the very early 1990s when the first LHC detector CDRs were written. From colliders to fixed-target to astrophysics experiments, detectors in high-energy physics face a huge variety of operating conditions and employ technologies that are often deeply entwined with developments in industry. The environmental credentials of detectors are also increasingly in the spotlight.

The focus of detector R&D should follow a "70–20–10" model, whereby 70% of effort goes into current detectors, 20% into future detectors and 10% into blue-sky R&D, argued Forti. Given that detector expertise is distributed among many institutions, the field also needs solid coordination. Forti cited CERN's "RD" projects in diamond detectors, silicon radiation-hard devices, micro-pattern gas detectors and pixel readout chips for ATLAS and CMS as good examples of coordination towards common goals. Finally, he argued strongly for greater consideration of the "human factor", stating that the current career model "just doesn't work very well." Your average particle physicist cannot be expert and innovative simultaneously in analysis, detectors, computing, teaching, outreach and other areas, he reasoned. "Career opportunities for detector physicists must be greatly strengthened and kept open in a systematic way," he said. "Invest in the people and in the murky future."

Computing for high-energy physics faces similar challenges. “There is an increasing gap between early-career physicists and the profile needed to program new architectures, such as greater parallelisation,” said Simone Campana of CERN and the HEP Software Foundation in a presentation about future computing challenges. “We should recognise the efforts of those who specialise in software because they can really change things like the speed of analyses and simulations.”

In terms of data processing, the HL-LHC presents a particular challenge. DUNE, FAIR, Belle II and other experiments will also create massive data samples. Then there is the generation of Monte Carlo samples. “Computing resources in HEP will be more constrained in the future,” said Campana. “We enter a regime where existing projects are entering a challenging phase, and many new projects are competing for resources – not just in HEP but in other sciences, too.” At the same time, the rate of advances in hardware performance has slowed in recent years, encouraging the community to adapt to take advantage of developments such as GPUs, high-performance computing and commercial cloud services.

The HEP Software Foundation released a community white paper in 2018 setting out the radical changes in computing and software – not just for processing but also for data storage and management – required to ensure the success of the LHC and other high-energy physics experiments into the 2020s.

Closing out

Closer examination of linear and circular colliders took place during subsequent parallel sessions on the first day of the ESPP update. Dark matter, flavour physics and electroweak and Higgs measurements were the other parallel themes. A final discussion session focusing on the capability of future machines for precision Higgs physics generated particularly lively exchanges between participants. It illuminated both the immensity of efforts to evaluate the physics reach of the high-luminosity LHC and future colliders, and the unenviable task faced by ESPP committees in deciding which post-LHC project is best for the field. It’s a point summed up well in the opening address by the chair of the ESPP strategy secretariat, Halina Abramowicz: “This is a very strange symposium. Normally we discuss results at conferences, but here we are discussing future results.”

Cross-fertilisation in detector development

More than 300 experts convened from 18–22 February for the 15th Vienna Conference on Instrumentation to discuss ongoing R&D efforts and set future roadmaps for collaboration. “In 1978 we discussed wire chambers as the first electronic detectors, and now we have a large number of very different detector types with performances unimaginable at that time,” said Manfred Krammer, head of CERN’s experimental physics department, recalling the first conference of the triennial series. “In the long history of the field we have seen the importance of cross-fertilisation, as developments for one specific experiment can catalyse progress on many fronts.”

Following this strong tradition, the conference covered fundamental and technological issues associated with the most advanced detector technologies, as well as the value of knowledge transfer to other domains. Over five days, participants covered topics ranging from sensor types and fast, efficient electronics to cooling technologies and mechanical structures.

Contributors highlighted experiments proposed in laboratories around the world, spanning gravitational-wave detectors, colliders, fixed-target experiments, dark-matter searches, and neutrino and astroparticle experiments. A number of talks covered upgrade activities for the LHC experiments ahead of LHC Run 3 and for the high-luminosity LHC. An overview of LIGO called for serious planning to ensure that future ground-based gravitational-wave detectors can be operational in the 2030s. Drawing a comparison between the observation of gravitational waves and the discovery of the Higgs boson, Christian Joram of CERN noted: “Progress in experimental physics often relies on breakthroughs in instrumentation that lead to substantial gains in measurement accuracy, efficiency and speed, or even open completely new approaches.”

Beyond innovative ideas and cross-disciplinary collaboration, the development of new detector technologies calls for good planning of resources and timescales. The R&D programme for the current LHC upgrades was set out in 2006, and it is already timely to start preparing for the third long shutdown in 2023 and the High-Luminosity LHC. Meanwhile, the CLIC and Future Circular Collider studies are developing clear ideas of the future experimental challenges in tackling the next exploration frontier.

Upping the tempo on wakefield accelerators

Around 50 experts from around the world met at CERN from 26 to 29 March for the second ALEGRO workshop to discuss advanced linear-collider concepts at the energy frontier.

ALEGRO, the Advanced Linear Collider Study Group, was formed as an outcome of an ICFA workshop on advanced accelerators held at CERN in 2017 (CERN Courier December 2017 p31). Its purpose is to unite the accelerator community behind a >10 TeV electron–positron collider based on advanced and novel accelerators (ANAs), which use wakefields driven by intense laser pulses or relativistic particle bunches in plasma, dielectric or metallic structures to reach gradients as high as 1 GeV/m. The proposed Advanced Linear International Collider – ALIC for short – would be shorter than linear colliders based on more conventional acceleration technologies such as CLIC and ILC, and would reach higher energies.
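A rough length estimate shows why such gradients matter for collider size. The sketch below compares the active linac length needed at 1 GeV/m with that needed at a conventional RF gradient of about 100 MV/m (an assumed CLIC-class reference value), ignoring staging overheads, drivers and final-focus systems:

# Naive active-length comparison for a 10 TeV electron-positron collider arm.
target_energy_gev = 10_000          # 10 TeV per beam

gradients = {
    "ANA (plasma/dielectric)": 1.0,    # GeV/m, upper end quoted for ANAs above
    "conventional RF (assumed)": 0.1,  # GeV/m, i.e. ~100 MV/m
}

for name, grad_gev_per_m in gradients.items():
    length_km = target_energy_gev / grad_gev_per_m / 1000
    print(f"{name:28s}: ~{length_km:,.0f} km of active linac per beam")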

The main research topics ALEGRO identified are the preservation of beam quality, the development of stable and efficient drivers (in particular laser systems), wall-plug-to-beam-power efficiency, operation at high-repetition rates, tolerance studies, the staging of two structures and the development of suitable numerical tools to allow for the simulation of the accelerator as a whole.

The next ALEGRO workshop will be held in March 2020 in Germany.

Particle colliders: accelerating innovation

Around 100 researchers, academics and industry delegates joined a co-innovation workshop in Liverpool, UK, on 22 March to discuss the strategic R&D programme for a Future Circular Collider (FCC) and associated benefits for industry. Motivated by the FCC study, the aim of the event was to identify joint R&D opportunities across accelerator projects and disciplines.

New particle colliders provide industry with powerful test-beds with a high publicity factor. Well-controlled environments allow novel technologies and processes to be piloted, and SMEs are ideal partners to bring these technologies – which include superconducting magnets, cryogenics, civil engineering, detector development, energy efficiency and novel materials and material processing techniques – to maturity.

Short talks about FCC-related areas for innovation, examples of successful technology-transfer projects at CERN, as well as current and future funding opportunities stimulated interesting discussions. Several areas were identified as bases for co-innovation, including resource-efficient tunnelling, the transfer of bespoke machine-learning techniques from particle physics to industry, detector R&D, cooling and data handling. The notes from all the working groups will be used to establish joint funding bids between participants.

The co-innovation workshop was part of a bigger event, “Particle Colliders – Accelerating Innovation”, which was devoted to the benefits of fundamental science to society and industry, co-hosted by the University of Liverpool and CERN together with partners from the FCC and H2020 EuroCirCol projects, and supported by EU-funded MSCA training networks. Almost 1000 researchers and industrialists from across Europe, including university and high-school students, took part. An industry exhibition allowed more than 60 high-tech companies to showcase their latest products, also serving university students as a careers fair, and more than a dozen different outreach activities were available to younger students.

A separate event, held at CERN on 4 and 5 March, reviewed the FCC physics capabilities following the publication of the FCC conceptual design report in January (CERN Courier January/February 2019 p8). The FCC study envisages the construction of a new 100 km-circumference tunnel at CERN hosting an intensity-frontier lepton collider (FCC-ee) as a first step, followed by an energy-frontier hadron machine (FCC-hh). It offers substantial and model-independent studies of the Higgs boson by extending the range of measurable Higgs properties to its total width and self-coupling. Moreover, the combination of superior precision and energy reach allows a complementary mix of indirect and direct probes of new physics. For example, FCC-ee would enable the Higgs coupling to the Z boson to be measured with an accuracy better than 0.17%, while FCC-hh would determine the ttH coupling model-independently to better than 1%.

Physics discussions were accompanied by a status report of the overall FCC project, reviewing the technological challenges for both accelerator and detectors, the project implementation strategy, and cost estimates. Construction of the more technologically ready FCC-ee could start by 2028, delivering first physics beams a decade later, right after the end of the HL-LHC programme. Another important aspect of the two-day meeting was the need for further improving theoretical predictions to match the huge step in experimental precision possible at the FCC.

Planning now for a 70-year-long programme may sound like a remote goal. However, as Alain Blondel of the University of Geneva remarked in the concluding talk of the conference, the first report on the LHC dates back more than 40 years. “Progress in knowledge has no price,” he said. “The FCC sets ambitious but feasible goals for the global community resembling previous leaps in the long history of our field.”

BELLA sets new record for plasma acceleration

A world record for laser-driven wakefield acceleration has been set by a team at the Berkeley Lab Laser Accelerator (BELLA) Center in the US. Physicists used a novel scheme to channel 850 TW laser pulses through a 20 cm-long plasma, allowing electron beams to be accelerated to an energy of 7.8 GeV – almost double the previous record set by the same group in 2014.

Proposed 40 years ago, plasma-wakefield acceleration can produce gradients hundreds of times higher than those achievable with conventional techniques based on radio-frequency cavities. It is often likened to surfing a wave. Relativistic laser pulses with a duration of the order of the plasma period generate large-amplitude electron plasma waves that displace electrons with respect to the background ions, allowing the plasma waves to accelerate charged particles to relativistic energies. Initial work showed that TeV energies could be reached in just a few hundred metres using multiple laser-plasma accelerator stages, each driven by petawatt laser pulses propagating through a plasma with a density of about 10¹⁷ cm⁻³. However, this requires the focused laser pulses to be guided over distances of tens of centimetres. While a capillary discharge is commonly used to create the necessary plasma channel, achieving a sufficiently deep channel at a plasma density of 10¹⁷ cm⁻³ is challenging.
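The record described above corresponds to an average gradient of tens of GV/m, as a quick calculation shows. In the sketch below, the ~30 MV/m comparison value for superconducting RF cavities is a typical figure added for scale, not taken from the BELLA paper:

# Average accelerating gradient implied by the BELLA result quoted above.
energy_gain_gev = 7.8        # electron energy reached
plasma_length_m = 0.20       # 20 cm-long plasma channel

gradient_gv_per_m = energy_gain_gev / plasma_length_m
print(f"Average gradient: ~{gradient_gv_per_m:.0f} GV/m")    # ~39 GV/m

srf_gradient_gv_per_m = 0.03  # ~30 MV/m, typical superconducting RF cavity (assumed for comparison)
print(f"Ratio to typical RF: ~{gradient_gv_per_m / srf_gradient_gv_per_m:.0f}x")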

In the latest BELLA demonstration, the plasma channel produced by the capillary discharge was modified by a nanosecond-long “heater” pulse that confined the focused laser pulses over the 20 cm distance. This allowed for the acceleration of electron beams with quasi-monoenergetic peaks up to 7.8 GeV. “This experiment demonstrates that lasers can be used to accelerate electrons to energies relevant to X-ray free-electron lasers, positron generation, and high-energy collider stages,” says lead author Tony Gonsalves. “However, the beam quality currently available from laser-wakefield accelerators is far from that required by future colliders.”

The quality of the accelerated electron beam is determined by how background plasma electrons are trapped in the accelerating and focusing “bucket” of the plasma wave. Several different methods of initiating electron trapping have been proposed to improve the beam emittance and brightness significantly beyond state-of-the-art particle sources, representing an important area of research. Another challenge, says Gonsalves, is to improve the stability and reproducibility of the accelerated electron beams, which are currently limited by fluctuations in the laser systems caused by air and ground motion.

In addition to laser-driven schemes, particle-driven plasma acceleration holds promise for high-energy physics applications. Experiments using electron-beam drivers are ongoing and planned at various facilities including FLASHForward at DESY and FACET-II at SLAC (CERN Courier January/February 2019 p10). The need for staging multiple plasma accelerators may even be circumvented by using energetic proton beams as drivers. Recent experiments at CERN’s Advanced Wakefield Experiment demonstrated electron acceleration gradients of around 200 MV/m using proton-beam-driven plasma wakefields (CERN Courier October 2018 p7).

Experiments at Berkeley in the next few years will focus on demonstrating the staging of laser-plasma accelerators with multi-GeV energy gains. “The field of plasma wakefield acceleration is picking up speed,” writes Florian Grüner of the University of Hamburg in an accompanying APS Viewpoint article. “If plasma wakefields can have gradients of 1 TV/m, one might imagine that a ‘table-top version of CERN’ is possible.”

Deciphering elementary particles

Particle physics began more than a century ago with the discoveries of radioactivity, the electron and cosmic rays. Photographic plates, gas-filled counters and scintillating substances were the early tools of the trade. Studying cloud formation in moist air led to the invention of the cloud chamber, which, in 1932, enabled the discovery of the positron. The photographic plate soon morphed into nuclear-emulsion stacks, and the Geiger tube of the Geiger–Marsden–Rutherford experiments developed into the workhorse for cosmic-ray studies. The bubble chamber, invented in 1952, represented the culmination of these “imaging detectors”, using film as the recording medium. Meanwhile, in the 1940s, the advent of photomultipliers had opened the way to crystal-based photon and electron energy measurements and Cherenkov detectors. This was the toolbox of the first half of the 20th century, credited with a number of groundbreaking discoveries that earned the toolmakers and their artisans more than 10 Nobel Prizes.

[Image: Extraction of the ALICE time projection chamber]

Game changer

The invention of the multi-wire proportional chamber (MWPC) by Georges Charpak in 1968 was a game changer, earning him the 1992 Nobel Prize in Physics. Suddenly, experimenters had access to large-area charged-particle detectors with millimetre spatial resolution and staggering MHz-rate capability. Crucially, the emerging integrated-circuit technology could deliver amplifiers small and cheap enough to equip many thousands of proportional wires. This ingenious and deceptively simple detector is relatively easy to construct. The workshops of many university physics departments could master the technology, attracting students and “democratising” particle physics. So compelling was experimentation with MWPCs that within a few years, large detector facilities with tens of thousands of wires were constructed – witness the Split Field Magnet at CERN’s Intersecting Storage Rings (ISR). Its rise to prominence was unstoppable: it became the detector of choice for the Proton Synchrotron, Super Proton Synchrotron (SPS) and ISR programmes. An extension of this technique is the drift chamber, an MWPC-type geometry in which the time difference between the passage of the particle and the onset of the wire signal is recorded, providing a measure of position with 100 µm-level resolution. The MWPC concept lends itself to a multitude of geometries and has found its “purest” application as the readout of time projection chambers (TPCs). Modern derivatives replace the wire planes with metallised foils perforated with holes in a sub-millimetre pattern, which amplify the ionisation signals.
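The drift-time measurement can be illustrated with representative numbers. In the sketch below, the drift velocity and timing resolution are typical values assumed for illustration, not tied to any particular chamber mentioned here:

# Position resolution of a drift chamber from a drift-time measurement (illustrative values).
drift_velocity_cm_per_us = 5.0    # typical electron drift velocity in the chamber gas
time_resolution_ns = 2.0          # assumed timing resolution of the readout

drift_velocity_um_per_ns = drift_velocity_cm_per_us * 1e4 / 1e3   # 50 µm per ns
position_resolution_um = drift_velocity_um_per_ns * time_resolution_ns
print(f"Position resolution: ~{position_resolution_um:.0f} µm")   # ~100 µm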


The ISR was a hotbed for accelerator and detector inventions. The world’s first proton–proton collider, an audacious project, was clearly ahead of its time and the initial experiments could not fully exploit its discovery potential. It prompted, however, the concept of multi-purpose facilities capable of obtaining “complete” collision information. For the first time, a group developed and used transition-radiation detectors for electron detection and liquid-argon calorimetry. The ISR’s Axial Field Spectrometer (AFS) provided high-quality hadron calorimetry with close to 4π coverage. These technologies are now widely used at accelerators and for non-accelerator experiments. The stringent performance requirements for experiments at the ISR encouraged the detector developers to explore and reach a measurement quality only limited by the laws of detector physics: science-based procedures had replaced the “black magic” of detector construction. With collision rates in the 10 MHz range, these experiments (and the ISR) were forerunners of today’s Large Hadron Collider (LHC) experiments. Of course, the ISR is most famous for its seminal accelerator developments, in particular the invention of stochastic cooling, which was the enabling technology for converting the SPS into a proton–antiproton collider.

The SPS marked another moment of glory for CERN. In 1976 first beams were accelerated to 400 GeV, initiating a diverse physics programme and motivating a host of detector developments. Advances in semiconductor technology led to the silicon-strip detector. With the experiments barely started, Carlo Rubbia and collaborators launched the idea, as ingenious as it was audacious, to convert the SPS into a proton–antiproton collider. The goal was clear: orchestrate quickly and rather cheaply a machine with enough collision energy to produce the putative W and Z bosons. Simon van der Meer’s stochastic-cooling scheme had to deliver the required beam intensity and lifetime, and two experimental teams were charged with the conception and construction of the equally novel detectors. The centrepiece of the UA1 detector was a 6 m-long and 2 m-diameter “electronic bubble chamber”, which adapted the drift-chamber concept to the event topology and collision rate, combined with state-of-the-art electronic readout. The electronic images were of such illuminating quality that “event scanning”, the venerable bubble-chamber technique, was again a key tool in data analysis. The UA2 team pushed calorimetry and silicon detectors to new levels of performance, provided healthy competition and independent discoveries. The discovery of the W and Z bosons was achieved in 1983 and, the following year, Rubbia and van der Meer became Nobel Laureates.

Laying foundations

In 1981, with the approval of the Large Electron Positron (LEP) collider, the community laid the foundation for decades of research at CERN. Mastering the new scale of the accelerator also brought a new approach to managing the larger experimental collaborations and to meeting their more stringent experimental requirements. For the first time, mostly outside collaborators developed and built the experimental apparatus – a non-trivial but necessary success in technology transfer. The detection techniques reached a new level of maturity. Silicon-strip detectors became ubiquitous. Gaseous tracking in a variety of forms, such as TPCs and jet chambers, reached new levels of size and performance. There were also some notable firsts. The DELPHI collaboration developed the ring-imaging Cherenkov (RICH) counter, a delicate technology in which the distribution of Cherenkov photons, imaged with mirrors onto photon-sensitive MWPC-type detectors, provides a measure of the particle’s velocity. The L3 collaboration aimed at ultimate-precision energy measurements of muons, photons and electrons, and put its money on a recently discovered scintillating crystal, bismuth germanate. Particle physicists, material scientists and crystallographers from academia and industry transformed this laboratory curiosity into mass-producible technology: ultimately, 12,000 crystals were grown, cut to size as truncated pyramids and assembled into the calorimeter, a pioneering trendsetter.

[Image: The multi-wire proportional chamber]

The ambition, style and success of these large, global collaborations was contagious. It gave the cosmic-ray community a new lease of life. The Pierre Auger Observatory, one of whose initiators was particle physicist and Nobel Laureate James Cronin, explores cosmic rays at extreme energies with close to 2000 detector stations spread over an area of 3000 km². The IceCube collaboration has instrumented around a cubic kilometre of Antarctic ice to detect neutrinos. One of the most ambitious experiments is the Alpha Magnetic Spectrometer, hosted by the International Space Station – again with a particle physicist and Nobel Prize winner, Samuel Ting, as a prime mover and shaker.

These decade-long efforts in experimentation find their present culmination at the LHC. Experimenters had to innovate on several fronts: all detector systems were designed for and had to achieve ultimate performance, limited only by the laws of physics, while operating at collision rates of a GHz or more and generating some 100 billion particles per second. “Impossible” was many an expert’s verdict in the early 1990s. The successful collaboration with industry giants in the IT and electronics sectors was a life-saver; and achieving all this – fraught with difficulties, technical and sociological – in international collaborations of several thousand scientists and engineers was an immense achievement. All existing detection technologies – ranging from silicon tracking, transition-radiation and RICH detectors, and liquid-argon, scintillator and crystal calorimeters to 10,000 m³-scale muon spectrometers – needed novel ideas, major improvements and daring extrapolations. The success of the LHC experiments is beyond the wildest dreams: hundreds of measurements achieve a precision previously considered possible only at electron–positron colliders. The Higgs boson, discovered in 2012, will be part of the research agenda for most of the 21st century, and CERN is in the starting blocks with ambitious plans.

Sharing with society

Worldwide, more than 30,000 accelerators are in operation. Particle and nuclear physics research uses barely more than 100 of them. Society is the principal client, and many of the accelerator innovations and particle detectors have found their way into industry, biology and health applications. A class of accelerators, to which CERN has contributed significantly, is specifically dedicated to tumour therapy. Particle detectors have made a particular impact on medical imaging, such as positron emission tomography (PET), whose origin dates back to an MWPC-based detector at CERN in the 1970s. Today’s clinical PET scanners use crystals very similar to those used in the discovery of the Higgs boson.

Possibly the most important benefit of particle physics to society is the collaborative approach developed by the community, which underpins the incredible success that has led us to the LHC experiments today. There are no signs that the rate of innovation in detectors and instrumentation is slowing. Currently the LHC experiments are undergoing major upgrades and plans for the next generation of experiments and colliders are already well under way. These collaborations succeed in being united and driven by a common goal, bridging cultural and political divides. 

Harnessing the web for humanity

What would you do if you were thrust into a world where suddenly you lacked control over who you were? If you had no way to prove where you were from, who you were related to, or what you had accomplished? If you lost all your documentation in a natural disaster, or were forced to leave your home without taking anything with you? Without proof of identity, people are unable to access essential systems such as health, education and banking services, and they are also exceedingly vulnerable to trafficking and incarceration. Having and owning your identity is an essential human right that too many people lack.

More than 68 million people worldwide have been displaced by war and conflict, and over 25 million have fled their countries and gone from the designation of “citizen” to “refugee”. They are often prevented from working in their new countries, and, even if they are allowed to work, many nations will not let professional credentials, such as licences to practise law or medicine, follow these people across their borders. We end up stripping away fundamental human dignities and leaving enormous amounts of untapped potential on the table. Countries need to recognise not just the right to identity but also that identity is portable across nation states.

The issue of sovereign identities extends much further than documentation. All over the world, individuals are becoming commodified by companies offering “free” services because their actual products are the users and their data. Every individual should have the right to decide to monetise their data if they want. But the speed, scale and stealth of such practices are making it increasingly difficult to retain control of our data.

All of this is happening as we celebrate the 30th anniversary of the web. While there is no doubt that the web has been incredibly beneficial for humanity, it has also turned people into pawns and opened them up to new security risks. I believe that we can not only remedy these harms, but that we’ve yet to harness even a small fraction of the good that the web can do. Enter The Humanized Internet – a non-profit movement founded in 2017 that is working to use new technologies to give every human being secure, sovereign control over their own digital identity.

New technologies like blockchain, which allows digital information to be distributed but not copied, can allow us to tackle this issue. Blockchain has some key differences from today’s databases. First, it allows participants to see and verify all data involved, minimising the chances of fraud. Second, all data is verified and encrypted before being added to an individual block, in such a way that a hacker would need exponentially more computing power to break in than is required in today’s systems. These characteristics allow blockchain to provide public ledgers that participants trust based on the agreed-upon consensus protocol. Once data transactions are on a block, they cannot be overwritten, and no central institution holds control, as these ledgers are visible to all the users connected to them. Users’ identities within a ledger are known only to the users themselves.
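The tamper-evidence described here comes from chaining cryptographic hashes: each block commits to the hash of its predecessor, so altering an old record invalidates every block that follows. A minimal, purely illustrative Python sketch of the idea (a toy, omitting the signatures, consensus protocol and encryption a real ledger would need):

import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic SHA-256 hash of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: dict) -> None:
    # Append a block that commits to the hash of the previous block.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def verify(chain: list) -> bool:
    # Check that every block still points to the correct hash of its predecessor.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

ledger: list = []
add_block(ledger, {"record": "birth certificate", "holder": "example-id-123"})
add_block(ledger, {"record": "medical licence", "holder": "example-id-123"})
print(verify(ledger))                       # True
ledger[0]["data"]["holder"] = "tampered"    # altering history...
print(verify(ledger))                       # False: the chain no longer verifies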

The first implication of this technology is that it can help to establish a person’s citizenship in their state of origin and enable registration of official records. Without this, many people would be considered stateless and granted almost no rights or diplomatic protections. For refugees, digital identities also allow peer-to-peer donations and transparent public transactions. Additionally, digital identities create the ability to practise selective disclosure, where individuals can choose to share their records only at their own discretion.

We now need more people to get on board. We are already working with experts to discuss the potential of blockchain to improve inclusion in state-authenticated identity programmes and how to combat potential privacy challenges, in addition to e-voting systems that could allow inclusive participation in voting at all policy levels. We should all be the centre of our universe; our identity should be wholly and irrevocably our own.

Inspired by software

High-energy code

Of all the movements to make science and technology more open, the oldest is “open source” software. It was here that the “open” ideals were articulated, and from which all later movements such as open-access publishing derive. Whilst it rightly stands on this pedestal, from another point of view open-source software was simply the natural extension of academic freedom and knowledge-sharing into the digital age.

Open source has its roots in the free-software movement, which grew in the 1980s in response to monopolising corporations and restrictions on proprietary software. The underlying ideal is open collaboration: peers freely, collectively and publicly build software solutions. A second ideal is recognition, in which credit for the contributions made by individuals and organisations worldwide is openly acknowledged. A third ideal concerns rights, specifically the so-called four freedoms granted to users: to use the software for any purpose; to study the source code to understand how it works; to share and redistribute the software; and to improve the software and share the improvements with the community. Users and developers therefore contribute to a virtuous circle in which software is continuously improved and shared towards a common good, minimising vendor lock-in for users.

Today, 20 years after the term “open source” was coined, and despite initial resistance from traditional software companies, many successful open-source business models exist. These mainly involve consultancy and support services for software released under an open-source licence and extend beyond science to suppliers of everyday tools such as the WordPress platform, Firefox browser and the Android operating system. A more recent and unfortunate business model adopted by some companies is “open core”, whereby essential features are deemed premium and sold as proprietary software on top of existing open-source components.

Founding principles

Open collaboration is one of CERN’s founding principles, so it was natural to extend the principle into its software. The web’s invention brought this into sharp focus. Having experienced first-hand its potential to connect physicists around the globe, in 1993 CERN released the web software into the public domain so that developers could collaborate and improve on it (see CERN’s ultimate act of openness). The following year, CERN released the next web-server version under an open-source licence with the explicit goal of preventing private companies from turning it into proprietary software. These were crucial steps in nurturing the universal adoption of the web as a way to share digital information, and exemplars of CERN’s best practice in open-source software.

Nowadays, open-source software can be found in pretty much every corner of CERN, as in other sciences and industry. Indico and Invenio – two of the largest open-source projects developed at CERN to promote open collaboration – rely on the open-source framework Python Flask. Experimental data are stored in CERN’s Exascale Open Storage system, and most of the servers in the CERN computing centre run on OpenStack – an open-source cloud infrastructure to which CERN is an active contributor. Of course, CERN also relies heavily on open-source GNU/Linux as both a server and desktop operating system. On the accelerator and physics-analysis side, it’s all about open source. From C2MON, a system at the heart of accelerator monitoring and data acquisition, to ROOT, the main data-analysis framework used to analyse experimental data, the vast majority of the software components behind the science done at CERN are released under an open-source licence.

Open hardware

The success of the open-source model for software has inspired CERN engineers to create an analogous “open hardware” licence, enabling electronics designers to collaborate and use, study, share and improve the designs of hardware components used for physics experiments. This approach has become popular in many sciences, and has become a lifeline for teaching and research in developing countries.

Being a scientist in the digital age means being a software producer and a software consumer. As a result, collaborative software-development platforms such as GitHub and GitLab have become as important to the physics department as they are to the IT department. Until recently, the software underlying an analysis has not been easily shared. CERN has therefore been developing research data-management tools to enable the publication of software and data, forming the basis of an open-data portal (see Preserving the legacy of particle physics). Naturally, this software itself is open source and has also been used to create the worldwide open-data service Zenodo, which is connected to GitHub to make the publication of open-source software a standard part of the research cycle.

Interestingly, as with the early days of open source, many corners of the scientific community are hesitant about open science. Some people are concerned that their software and data are not of sufficient quality or interest to be shared, or that they will be helping others to the next discovery before them. To triumph over the sceptics, open science can learn from the open-source movement, adopting standard licences, credit systems, collaborative development techniques and shared governance. In this way, it too will be able to reap the benefits of open collaboration: transparency, efficiency, perpetuity and flexibility. 

Open science: A vision for collaborative, reproducible and reusable research

The goal of practising science in such a way that others can collaborate, question and contribute – known as “open science” – long predates the web. One could even argue that it began with the first academic journal 350 years ago, which enabled scientists to share knowledge and resources to foster progress. But the web offered opportunities way beyond anything before it, quickly transforming academic publishing and giving rise to greater sharing in areas such as software. Alongside the open-source (Inspired by software), open-access (A turning point for open-access publishing) and open-data (Preserving the legacy of particle physics) movements grew the era of open science, which aims to encompass the scientific process as a whole.

Today, numerous research communities, political circles and funding bodies view open science and reproducible research as vital to accelerate future discoveries. Yet, to fully reap the benefits of open and reproducible research, it is necessary to start implementing tools to power a more profound change in the way we conduct and perceive research. This poses both sociological and technological challenges, starting from the conceptualisation of research projects, through conducting research, to how we ensure peer review and assess the results of projects and grants. New technologies have brought open science within our reach, and it is now up to scientific communities to agree on the extent to which they want to embrace this vision.

Particle physicists were among the first to embrace the open-science movement, sharing preprints and building a deep culture of using and sharing open-source software. The cost and complexity of experimental particle physics, which makes complete replication of measurements unfeasible, presents unique challenges in terms of open data and scientific reproducibility. It may even be considered that openness itself, in the sense of having unfettered access to data from its inception, is not particularly advantageous.

Take the existing data-management policies of the LHC collaborations: while physicists generally strive to be open in their research, the complexity of the data and analysis procedures means that data become publicly open only after a certain embargo period that is used to assess its correctness. The science is thus born “closed”. Instead of thinking about “open data” from its inception, it is more useful to speak about FAIR (findable, accessible, interoperable and reusable) data, a term coined by the FORCE11 community. The data should be FAIR throughout the scientific process, from being initially closed to being made meaningfully open later to those outside the experimental collaborations.

True open science demands more than simply making data available: it needs to concern itself with providing information on how to repeat or verify an analysis performed over given datasets, producing results that can be reused by others for comparison, confirmation or simply for deeper understanding and inspiration. This requires runnable examples of how the research was performed, accompanied by software, documentation, runnable scripts, notebooks, workflows and compute environments. It is often too late to try to document research in such detail once it has been published.


Two FAIR data repositories for particle physics – the “closed” CERN Analysis Preservation portal and the “open” CERN Open Data portal – emerged five years ago to address the community’s open-science needs. These digital repositories enable physicists to preserve, document, organise and share datasets, code and tools used during analyses. A flexible metadata structure helps researchers to define everything from experimental configurations to data samples, from analysis code to software libraries and environments used to analyse the data, accompanied by documentation and links to presentations and publications. The result is a standard way to describe and document an analysis for the purposes of discoverability and reproducibility.

Recent advancements in the IT industry allow us to encapsulate the compute environments where the analysis was conducted. Capturing information about how the analysis was carried out can be achieved via a set of runnable scripts, notebooks, structured workflows and “containerised” pipelines. Complementary to data repositories, a third service named REANA (reusable analyses) allows researchers to submit parameterised computational workflows to run on remote compute clouds. It can be used to reinterpret preserved analyses but also to run “active” analyses before they are published and preserved, with the underlying philosophy that physics analyses should be automated from inception so that they can be executed without manual intervention. Future reuse and reinterpretation starts with the first commit of the analysis code; altering an already-finished analysis to facilitate its eventual reuse after publication is often too late.
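The philosophy that an analysis should be “automated from inception” can be illustrated with a toy example: the whole analysis is a single parameterised, non-interactive entry point that a workflow system could later re-execute with different inputs. This is a hypothetical sketch, not the actual REANA or CERN Analysis Preservation interface:

# Toy "reusable analysis": one parameterised, scriptable entry point with no manual steps.
import argparse
import json

def run_analysis(input_path: str, cut: float) -> dict:
    # Toy selection: count values above a threshold in a JSON list of numbers.
    with open(input_path) as f:
        events = json.load(f)
    passed = [e for e in events if e > cut]
    return {"n_total": len(events), "n_passed": len(passed), "cut": cut}

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Reproducible toy analysis")
    parser.add_argument("--input", required=True, help="path to input data (JSON list)")
    parser.add_argument("--cut", type=float, default=30.0, help="selection threshold")
    parser.add_argument("--output", default="results.json", help="where to write results")
    args = parser.parse_args()

    results = run_analysis(args.input, args.cut)
    with open(args.output, "w") as f:
        json.dump(results, f, indent=2)
    # Because every step is parameterised and non-interactive, the same entry point can be
    # preserved alongside its data and environment, and re-run later with different parameters.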

Full control

The key guiding principle of the analysis preservation and reuse framework is to leave the decision as to when a dataset or a complete analysis is shared, privately or publicly, in the hands of the researchers. This gives the experiment collaborations full control over the release procedures, and thus fully supports internal processing and review protocols before the results are published on community services, such as arXiv, HEPData and INSPIRE.

The CERN Open Data portal was launched in 2014 amid a discussion as to whether primary particle-physics data would find any use outside of the LHC collaborations. Within a few years, the first paper based on open data from the CMS experiment was published (see Preserving the legacy of particle physics).

Three decades after the web was born, science is being shared more openly than ever and particle physics is at the forefront of this movement. As we have seen, however, simple compliance with data and software openness is not enough: we also need to capture, from the start of the research process, runnable recipes, software environments, computational workflows and notebooks. The increasing demand from funding agencies and policymakers for open data-management plans, coupled with technological progress in information technology, leads us to believe that the time is ripe for this change.

Sharing research in an easily reproducible and reusable manner will facilitate knowledge transfer within and between research teams, accelerating the scientific process. This fills us with hope that three decades from now, even if future generations may not be able to run our current code on their futuristic hardware platforms, they will be at least well equipped to understand the processes behind today’s published research in sufficient detail to be able to check our results and potentially reveal something new.
