
LHCb observes CP violation in charm decays

On the morning of 21 March, at the 2019 Rencontres de Moriond in La Thuile, Italy, the LHCb collaboration announced the discovery of charge-parity (CP) violation in the charm system. Met with an impromptu champagne celebration, the result represents a milestone in particle physics and opens a new area of investigation in the charm sector.

CP violation, which results in differences in the properties of matter and antimatter, was first observed in the decays of K mesons (which contain strange quarks) in 1964 by James Cronin and Val Fitch. Even though parity (P) violation had been seen eight years earlier, the discovery that the combined C and P symmetries are not conserved was unexpected. The story deepened in the early 1970s, when, building on the foundations laid by Nicola Cabibbo and others, Makoto Kobayashi and Toshihide Maskawa showed that CP violation could be included naturally in the Standard Model (SM) if at least six different quarks existed in nature. Their fundamental idea – whereby direct CP violation arises if a complex phase appears in the CKM matrix describing quark mixing – was confirmed 30 years later by the discovery of CP violation in B-meson decays by the BaBar and Belle collaborations. Despite decades of searches, CP violation in the decays of charmed particles escaped detection.

LHCb physicists used the unprecedented dataset accumulated in 2011–2018 to study the difference in decay rates between D0 and D̅0 mesons (which contain a c quark or antiquark, respectively) decaying into K+K− or π+π− pairs. To distinguish between the otherwise identical D0 and D̅0 decays, the collaboration exploited two different classes of decays: those of D*± mesons decaying into a D0 and a charged pion, where the presence of a π+ (π−) tags a D0 (D̅0) meson; and those of B mesons decaying into a D0, a muon and a neutrino, in which the presence of a μ− (μ+) identifies a D0 (D̅0). Counting the numbers of decays in the data sample, the collaboration obtained a final result of ΔACP = (−0.154 ± 0.029)%. At 5.3 standard deviations from zero, it represents the first observation of CP violation in the charm system.
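
For readers who want the quantity spelled out, a schematic definition in standard notation (the notation, and the remark about cancelling nuisance asymmetries, are ours rather than taken from the announcement) is:

```latex
% CP asymmetry for a final state f = K+K- or pi+pi-; production and detection
% asymmetries largely cancel in the difference between the two final states.
A_{CP}(f) \;=\; \frac{\Gamma(D^0 \to f) \,-\, \Gamma(\bar{D}^0 \to f)}
                     {\Gamma(D^0 \to f) \,+\, \Gamma(\bar{D}^0 \to f)},
\qquad
\Delta A_{CP} \;=\; A_{CP}(K^+K^-) \,-\, A_{CP}(\pi^+\pi^-) \;=\; (-0.154 \pm 0.029)\%.
```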

“This is a major result that could be obtained thanks to the very high charm- production cross section at LHC, and to the superb performance of both the LHC machine and the LHCb detector, which provided the largest sample of charm particles ever collected,” says LHCb spokesperson Giovanni Passaleva. “Analysing the tens of millions of D0 mesons needed for such a precise measurement was a remarkable collective effort by the collaboration. The result opens up a new field in particle physics, involving the study of CP-violating effects in the sector of up-type quarks and searches for new-physics effects in a completely new domain.”

CP violation is thought to be an essential ingredient in explaining the observed cosmological matter–antimatter asymmetry, but the level of CP violation present in the SM can account for only a fraction of the imbalance. In addition to hunting for novel sources of CP violation, physicists are making precise measurements of known sources to look for deviations that could indicate physics beyond the SM. The SM prediction for the amount of CP violation in charm decays is estimated to lie in the range 10⁻⁴–10⁻³ for the decay modes of interest. The new LHCb measurement is consistent with the SM expectation but falls at the upper end of the range, generating much discussion at Moriond 2019. Unusually for particle physics, the experimental measurement is much more precise than the SM prediction. This is because the charm quark is relatively light, which puts reliable perturbative-QCD and other approximate calculation techniques out of reach. Future theoretical improvements, and more data, will establish whether the seminal LHCb result is consistent with the SM.

“This is an important milestone in the study of CP violation,” Kobayashi, now professor emeritus at KEK in Japan, tells CERN Courier. “I hope that analysis of the results will provide a clue to new physics.”

Deciphering elementary particles

Particle physics began more than a century ago with the discoveries of radioactivity, the electron and cosmic rays. Photographic plates, gas-filled counters and scintillating substances were the early tools of the trade. Studying cloud formation in moist air led to the invention of the cloud chamber, which, in 1932, enabled the discovery of the positron. The photographic plate soon morphed into nuclear-emulsion stacks, and the Geiger tube of the Geiger–Marsden–Rutherford experiments developed into the workhorse for cosmic-ray studies. The bubble chamber, invented in 1952, represented the culmination of these “imaging detectors”, using film as the recording medium. Meanwhile, in the 1940s, the advent of photomultipliers had opened the way to crystal-based photon and electron energy measurements and Cherenkov detectors. This was the toolbox of the first half of the 20th century, credited with a number of groundbreaking discoveries that earned the toolmakers and their artisans more than 10 Nobel Prizes.

extraction of the ALICE time projection chamber

Game changer

The invention of the multi-wire proportional chamber (MWPC) by Georges Charpak in 1968 was a game changer, earning him the 1992 Nobel Prize in Physics. Suddenly, experimenters had access to large-area charged-particle detectors with millimetre spatial resolution and staggering MHz-rate capability. Crucially, the emerging integrated-circuit technology could deliver amplifiers small and cheap enough to equip many thousands of proportional wires. This ingenious and deceptively simple detector is relatively easy to construct. The workshops of many university physics departments could master the technology, attracting students and “democratising” particle physics. So compelling was experimentation with MWPCs that within a few years large detector facilities with tens of thousands of wires were constructed – witness the Split Field Magnet at CERN’s Intersecting Storage Rings (ISR). Its rise to prominence was unstoppable: it became the detector of choice for the Proton Synchrotron, Super Proton Synchrotron (SPS) and ISR programmes. An extension of this technique is the drift chamber, an MWPC-type geometry in which the time difference between the passage of the particle and the onset of the wire signal is recorded, providing a measure of position with 100 µm-level resolution. The MWPC concept lends itself to a multitude of geometries and has found its “purest” application as the readout of time projection chambers (TPCs). Modern derivatives replace the wire planes with metallised foils perforated with holes in a sub-millimetre pattern, which amplify the ionisation signals.


The ISR was a hotbed for accelerator and detector inventions. The world’s first proton–proton collider, an audacious project, was clearly ahead of its time and the initial experiments could not fully exploit its discovery potential. It prompted, however, the concept of multi-purpose facilities capable of obtaining “complete” collision information. For the first time, a group developed and used transition-radiation detectors for electron detection and liquid-argon calorimetry. The ISR’s Axial Field Spectrometer (AFS) provided high-quality hadron calorimetry with close to 4π coverage. These technologies are now widely used at accelerators and for non-accelerator experiments. The stringent performance requirements for experiments at the ISR encouraged the detector developers to explore and reach a measurement quality only limited by the laws of detector physics: science-based procedures had replaced the “black magic” of detector construction. With collision rates in the 10 MHz range, these experiments (and the ISR) were forerunners of today’s Large Hadron Collider (LHC) experiments. Of course, the ISR is most famous for its seminal accelerator developments, in particular the invention of stochastic cooling, which was the enabling technology for converting the SPS into a proton–antiproton collider.

The SPS marked another moment of glory for CERN. In 1976 the first beams were accelerated to 400 GeV, initiating a diverse physics programme and motivating a host of detector developments. Advances in semiconductor technology led to the silicon-strip detector. With the experiments barely started, Carlo Rubbia and collaborators launched the idea, as ingenious as it was audacious, of converting the SPS into a proton–antiproton collider. The goal was clear: orchestrate quickly and rather cheaply a machine with enough collision energy to produce the putative W and Z bosons. Simon van der Meer’s stochastic-cooling scheme had to deliver the required beam intensity and lifetime, and two experimental teams were charged with the conception and construction of the equally novel detectors. The centrepiece of the UA1 detector was a 6 m-long, 2 m-diameter “electronic bubble chamber”, which adapted the drift-chamber concept to the event topology and collision rate, combined with state-of-the-art electronic readout. The electronic images were of such illuminating quality that “event scanning”, the venerable bubble-chamber technique, was again a key tool in data analysis. The UA2 team pushed calorimetry and silicon detectors to new levels of performance, providing healthy competition and independent discoveries. The discovery of the W and Z bosons was achieved in 1983 and, the following year, Rubbia and van der Meer became Nobel Laureates.

Laying foundations

In 1981, with the approval of the Large Electron Positron (LEP) collider, the community laid the foundations for decades of research at CERN. Mastering the new scale of the accelerator also brought a new approach to managing the larger experimental collaborations and to meeting their more stringent experimental requirements. For the first time, the experimental apparatus was developed and built mostly by outside collaborators – a non-trivial but necessary success in technology transfer. The detection techniques reached a new level of maturity. Silicon-strip detectors became ubiquitous. Gaseous tracking in a variety of forms, such as TPCs and jet chambers, reached new levels of size and performance. There were also some notable firsts. The DELPHI collaboration developed the Ring Imaging Cherenkov counter, a delicate technology in which the distribution of Cherenkov photons, imaged with mirrors onto photon-sensitive MWPC-type detectors, provides a measure of the particle’s velocity. The L3 collaboration aimed at ultimate-precision energy measurements of muons, photons and electrons, and put its money on a recently discovered scintillating crystal, bismuth germanate. Particle physicists, material scientists and crystallographers from academia and industry transformed this laboratory curiosity into a mass-producible technology: ultimately, 12,000 crystals were grown, cut to size as truncated pyramids and assembled into the calorimeter, a pioneering trendsetter.

the multi-wire proportional chamber

The ambition, style and success of these large, global collaborations were contagious, giving the cosmic-ray community a new lease of life. The Pierre Auger Observatory, one of whose initiators was particle physicist and Nobel Laureate James Cronin, explores cosmic rays at extreme energies with close to 2000 detector stations spread over an area of 3000 km². The IceCube collaboration has instrumented around a cubic kilometre of Antarctic ice to detect neutrinos. One of the most ambitious experiments is the Alpha Magnetic Spectrometer, hosted by the International Space Station – again with a particle physicist and Nobel Prize winner, Samuel Ting, as a prime mover and shaker.

These decade-long efforts in experimentation find their present culmination at the LHC. Experimenters had to innovate on several fronts: all detector systems were designed for, and had to achieve, ultimate performance, limited only by the laws of physics, while operating at collision rates of a GHz or more and coping with some 100 billion particles per second. “Impossible” was many an expert’s verdict in the early 1990s. Successful collaboration with industry giants in the IT and electronics sectors was a life-saver; and achieving all this – fraught with difficulties, technical and sociological – in international collaborations of several thousand scientists and engineers was an immense achievement. All existing detection technologies – from silicon tracking, transition-radiation and RICH detectors to liquid-argon, scintillator and crystal calorimeters and 10,000 m³-scale muon spectrometers – needed novel ideas, major improvements and daring extrapolations. The success of the LHC experiments has exceeded the wildest dreams: hundreds of measurements achieve a precision previously considered possible only at electron–positron colliders. The Higgs boson, discovered in 2012, will be part of the research agenda for most of the 21st century, and CERN is in the starting blocks with ambitious plans.

Sharing with society

Worldwide, more than 30,000 accelerators are in operation, of which particle and nuclear physics research uses barely more than 100. Society is the principal client, and many accelerator innovations and particle detectors have found their way into industry, biology and health applications. A class of accelerators to which CERN has contributed significantly is specifically dedicated to tumour therapy. Particle detectors have made a particular impact on medical imaging, notably positron emission tomography (PET), whose origins date back to an MWPC-based detector at CERN in the 1970s. Today’s clinical PET scanners use crystals very similar to those used in the discovery of the Higgs boson.

Possibly the most important benefit of particle physics to society is the collaborative approach developed by the community, which underpins the incredible success that has led us to the LHC experiments today. There are no signs that the rate of innovation in detectors and instrumentation is slowing. Currently the LHC experiments are undergoing major upgrades and plans for the next generation of experiments and colliders are already well under way. These collaborations succeed in being united and driven by a common goal, bridging cultural and political divides. 

Paris event reflects on the history of the neutrino

Neutrinos, discovered in 1956, play an exceptional role in particle and nuclear physics, as well as astrophysics, and their study has led to the award of several Nobel prizes. In recognition of their importance, the first International Conference on the History of the Neutrino took place at the Université Paris Diderot in Paris on 5–7 September 2018.

The purpose of the conference, which drew 120 participants, was to cover the main steps in the history of the neutrino since 1930, when Wolfgang Pauli postulated its existence to explain the continuous energy spectrum of the electrons emitted in beta decay. Specifically, for each topic in neutrino physics, the aim was to pursue an historical approach and follow as closely as possible the discovery or pioneering papers. Speakers were chosen as much as possible for their roles as authors or direct witnesses, or as players in the main events.

The first session, “Invention of a new particle”, started with the prehistory of the neutrino – that is, the establishment of the continuous energy spectrum in beta decay – before moving into the discoveries of the three flavour neutrinos. The second session, “Neutrinos in nature”, was devoted to solar and atmospheric neutrinos, as well as neutrinos from supernovae and Earth. The third session covered neutrinos from reactors and beams including the discovery of neutral-current neutrino interactions, in which the neutrino is not transformed into another particle like a muon or an electron. This discovery was made in 1973 by the Gargamelle bubble chamber team at CERN after a race with the HPWF experiment team at Fermilab.

The major theme of neutrino oscillations from the first theoretical ideas of Bruno Pontecorvo (1957) to the Mikheyev–Smirnov–Wolfenstein effect (1985), which can modify the oscillations when neutrinos travel through matter, was complemented by talks on the discovery of neutrino oscillations by Nobel laureates Takaaki Kajita and Art McDonald. In 1998, the Super-Kamiokande experiment, led by Kajita, observed the oscillation of atmospheric neutrinos, and in 2001 the Sudbury Neutrino Observatory experiment, led by McDonald, observed the oscillation of solar neutrinos.

The role of the neutrino in the Standard Model was discussed, as was its intrinsic nature. Although physicists have observed the rare process of double beta decay with neutrinos in the final state, neutrinoless double beta decay – in which no neutrinos are produced – has been sought for more than 30 years, because its observation would prove that the neutrino is a Majorana particle (its own antiparticle) rather than a Dirac particle.

To complete the panorama, the conference discussed neutrinos as messengers from the wider universe, from the Big Bang to violent phenomena such as gamma-ray bursts or active galactic nuclei. Delegates also discussed wrong hints and tracks, which play a positive role in the development of science, and the peculiar sociological aspects that are common to particle physics and astrophysics.

Following the conference, a website dedicated to the history of this fascinating particle was created: https://neutrino-history.in2p3.fr.

Physics Beyond Colliders initiative presents main findings

In a workshop held at CERN on 16–17 January, researchers presented the findings of the Physics Beyond Colliders (PBC) initiative, which was launched in 2016 to explore opportunities at CERN offered by projects complementary to the LHC and future colliders (CERN Courier November 2016 p28). PBC members have weighed up the potential for such experiments to explore open questions in QCD and the existence of physics beyond the Standard Model (BSM), in particular searches for signatures of hidden-sector models in which the conjectured dark matter does not couple directly to Standard Model particles.

The BSM and QCD groups of the PBC initiative have developed detailed studies of CERN’s options and compared them to other worldwide possibilities. The results show the international competitiveness of the PBC options.

The Super Proton Synchrotron (SPS) remains a clear attraction, offering the world’s highest-energy beams to fixed-target experiments in the North Area (see Fixed target, striking physics). The SPS high-intensity muon beam could allow a better understanding of the theoretical prediction of the muon anomalous magnetic moment (MUonE project), and a significant contribution to the resolution of the proton radius puzzle by COMPASS(Rp). The NA61 experiment could explore QCD in the interesting region of “criticality”, while upgrades of NA64 and a few months of NA62 operation in beam-dump mode (whereby a target absorbs most of the incident protons and contains most of the particles generated by the primary beam interactions) would explore the hidden-sector parameter space. In the longer term, the KLEVER experiment could probe rare decays of neutral kaons, and NA60 and DIRAC could enhance our understanding of QCD.

A novel North Area proposal is the SPS Beam Dump Facility (BDF). Such a facility could, in the first instance, serve the SHiP experiment, which would perform a comprehensive investigation of the hidden sector with discovery potential in the MeV–GeV mass range, and the TauFV experiment, which would search for forbidden τ decays. The BDF team has made excellent progress with the facility design and is preparing a comprehensive design study report. Options for more novel exploitation of the SPS have also been considered: proton-driven plasma-wakefield acceleration of electrons for a dark-matter experiment (AWAKE++); the acceleration and slow extraction of electrons to light-dark-matter experiments (eSPS); and the production of well-calibrated neutrinos via a muon decay ring (nuSTORM).

Fixed-target studies at the LHC are also considered within PBC, and these could improve our understanding of QCD in regions where it is relevant for new-physics searches at the high-luminosity LHC upgrade. The LHC could also be supplemented with new experiments to search for long-lived particles, and PBC support for a small experiment called FASER has helped pave the way for its installation in the ongoing long shutdown of CERN’s accelerator complex.

2018 was a notable year for the gamma factory, a novel concept that would use the LHC to produce intense gamma-ray beams for precision measurements and searches (CERN Courier November 2017 p7). The team has already demonstrated the acceleration of partially stripped ions in the LHC, and is now working towards a proof-of-principle experiment in the SPS. Meanwhile, the charged-particle electric-dipole-moment (CPEDM) collaboration has continued studies, supported by experiments at the COSY synchrotron in Germany (CERN Courier September 2016 p27), towards a prototype storage ring to measure the proton EDM.

The PBC technology team has also been working to bring CERN’s skills base to bear on novel experiments, for example by exploring synergies between experiments and collaboration on technologies – in particular concerning light-shining-through-walls experiments and QED vacuum-birefringence measurements.

Finally, some PBC projects are likely to flourish outside CERN: the IAXO axion helioscope, now under consideration at DESY; the proton EDM ring, which could be prototyped at the Jülich laboratory, also in Germany; and the REDTOP experiment devoted to η meson rare decays, for which Fermilab in the US seems better suited.

The PBC groups have submitted their full findings to the European Particle Physics Strategy Update (http://pbc.web.cern.ch/).

BaBar celebrates its 25th anniversary

On 11 December 2018, 25 years after its inaugural meeting, the BaBar collaboration came together at the SLAC National Accelerator Laboratory in California to celebrate its many successes. David Hitlin, BaBar’s first spokesperson, described the inaugural meeting of what was then called the Detector Collaboration for the PEP-II “asymmetric” electron–positron collider, which took place at SLAC at the end of 1993. By May 1994 the collaboration had chosen the name BaBar in recognition of its primary goal: to study CP violation in the neutral B-B̅ meson system. Jonathan Dorfan, PEP-II project director, recounted how PEP-II was constructed by SLAC, LBL and LLNL. Less than six years later, PEP-II and the BaBar detector were built and the first collision events were collected on 26 May 1999. Twenty-five years on, BaBar has chalked up more than 580 papers on CP violation and many other topics.


The “asymmetric” descriptor of the collider refers to Pier Oddone’s concept of colliding electron and positron beams of unequal energies, with the collision energy tuned to 10.58 GeV – the mass of the ϒ(4S) meson, just above the threshold for producing a pair of B mesons. The resulting relativistic boost of the ϒ(4S) enabled measurements of the distance between the points where the two B mesons decay, which is critical for the study of CP violation. Equally critical was the entanglement of the B meson and anti-B meson produced in the ϒ(4S) decay: whether it was a B0 or a B̅0 that decayed to a given CP final state could be established by tagging the flavour of the other meson.
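
As a rough illustration of how the boost arises (the beam energies quoted here are the nominal PEP-II values of about 9 GeV electrons on 3.1 GeV positrons, taken from the accelerator literature rather than from this article):

```latex
% Head-on collision of ultra-relativistic beams with unequal energies E_- and E_+:
\sqrt{s} \;=\; 2\sqrt{E_-\,E_+} \;\approx\; 2\sqrt{9.0 \times 3.1}\ \text{GeV}
         \;\approx\; 10.58\ \text{GeV} \;=\; m_{\Upsilon(4S)},
\qquad
\beta\gamma \;=\; \frac{E_- - E_+}{\sqrt{s}} \;\approx\; 0.56 .
% The boosted Upsilon(4S) converts the proper-time difference between the two
% B decays into a measurable vertex separation along the beam axis:
\Delta z \;\approx\; \beta\gamma\, c\, \Delta t
         \;\sim\; 250\ \mu\text{m for } \Delta t \sim 1.5\ \text{ps}.
```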

By October 2000 PEP-II had achieved its design luminosity of 3 × 10³³ cm⁻² s⁻¹, and less than a year later BaBar published its observation of CP violation in the B0 meson system based on a sample of 32 × 10⁶ pairs of B0-B̅0 mesons – on the same day that Belle, its competitor at Japan’s KEK laboratory, published the same observation. These results led to Makoto Kobayashi and Toshihide Maskawa sharing the 2008 Nobel Prize in Physics. The ultimate luminosity achieved by PEP-II, in 2006, was 1.2 × 10³⁴ cm⁻² s⁻¹. BaBar continued to collect data on or near the ϒ(4S) meson until 2007, and in 2008 collected large samples of ϒ(2S) and ϒ(3S) mesons before PEP-II was shut down. In total, PEP-II produced 471 × 10⁶ B-B̅ pairs for BaBar studies – as well as a myriad of other events for other investigations.
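
For orientation, the quoted pair count is consistent with a simple cross-section-times-luminosity estimate (the ϒ(4S) production cross section of about 1.1 nb and the on-resonance integrated luminosity of roughly 430 fb⁻¹ are figures from the BaBar literature, not from this article):

```latex
N_{B\bar{B}} \;\approx\; \sigma\!\left(e^+e^- \to \Upsilon(4S)\right)\times\int\!\mathcal{L}\,dt
\;\approx\; 1.1\ \text{nb} \,\times\, 4.3\times10^{8}\ \text{nb}^{-1}
\;\approx\; 4.7\times10^{8}\ B\bar{B}\ \text{pairs}.
```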

The anniversary event also celebrated technical innovations, including “trickle injection” of beam particles into  PEP-II, which provided a nearly 40% increase in integrated luminosity; BaBar’s impressive particle identification, made possible by the DIRC detector; and the implementation of a computing model – spurred by PEP-II delivering significantly more than design luminosity – whereby countries provided in-kind computing support via large “Tier-A” centres. This innovation paved the way for CERN’s Worldwide LHC Computing Grid.

Notable physics results from BaBar include the first observation, in 2007, of D0–D̅0 mixing, while in 2008 the collaboration discovered the long-sought ηb, the lowest-energy particle of the bottomonium family. The team also searched for lepton-flavour violation in tau-lepton decays, publishing in 2010 what remain the most stringent limits on the τ → μγ and τ → eγ branching fractions. In 2012, in a result that made Physics World’s top-ten physics results of the year, the BaBar collaboration made the first direct observation of time-reversal violation by measuring the rates at which the B0 meson changes quantum states. Also published in 2012 was evidence for an excess of B̅ → D(*)τν̅τ decays, which challenges lepton universality and is an important part of the current Belle II and LHCb physics programmes. Several years after data-taking ended, it was recognised that BaBar’s data could also be mined for evidence of dark-sector objects such as dark photons, leading to the publication of two significant papers in 2014 and 2017. Another highlight, published last year, is a joint BaBar–Belle paper that resolved an ambiguity concerning the quark-mixing unitarity triangle.

Although BaBar stopped collecting data in 2008, this highly collegial team of researchers continues to publish impactful results. Moreover, BaBar alumni continue to bring their experience and expertise to subsequent experiments, ranging from ATLAS, CMS and LHCb at the LHC, Belle II at SuperKEKB, and long-baseline neutrino experiments (T2K, DUNE, HyperK) to dark-matter (LZ, SCDMS) and dark-energy (LSST) experiments in particle astrophysics.

Harnessing the web for humanity

What would you do if you were thrust into a world where suddenly you lacked control over who you were? If you had no way to prove where you were from, who you were related to, or what you had accomplished? If you lost all your documentation in a natural disaster, or were forced to leave your home without taking anything with you? Without proof of identity, people are unable to access essential systems such as health, education and banking services, and they are also exceedingly vulnerable to trafficking and incarceration. Having and owning your identity is an essential human right that too many people lack.

More than 68 million people worldwide have been displaced by war and conflict, and over 25 million have fled their countries, going from the designation of “citizen” to “refugee”. They are often prevented from working in their new countries and, even if they are allowed to work, many nations will not let professional credentials, such as licences to practise law or medicine, follow these people across their borders. We end up stripping away fundamental human dignities and leaving enormous amounts of untapped potential on the table. Countries need to recognise not just the right to identity, but also that identity must be portable across nation states.

The issue of sovereign identities extends much further than documentation. All over the world, individuals are being commodified by companies offering “free” services, because the actual products are the users and their data. Every individual should have the right to decide whether to monetise their data. But the speed, scale and stealth of such practices are making it increasingly difficult to retain control of our data.

All of this is happening as we celebrate the 30th anniversary of the web. While there is no doubt that the web has been incredibly beneficial for humanity, it has also turned people into pawns and opened them up to new security risks. I believe that we can not only remedy these harms, but that we’ve yet to harness even a small fraction of the good that the web can do. Enter The Humanized Internet – a non-profit movement founded in 2017 that is working to use new technologies to give every human being secure, sovereign control over their own digital identity.

New technologies like blockchain, which allows digital information to be distributed but not copied, can help us tackle this issue. Blockchain differs from today’s databases in some key ways. First, it allows participants to see and verify all the data involved, minimising the chances of fraud. Second, all data is verified and encrypted before being added to an individual block, in such a way that a hacker would need vastly more computing power to break in than is required in today’s systems. These characteristics allow blockchain to provide public ledgers that participants trust on the basis of an agreed-upon consensus protocol. Once data transactions are on a block they cannot be overwritten, and no central institution holds control, as the ledgers are visible to all the users connected to them. Users’ identities within a ledger are known only to the users themselves.
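
The tamper-evidence property described above is easiest to see in a toy model. The sketch below is illustrative only: real blockchains add consensus, digital signatures and peer-to-peer replication, and none of the names here come from any specific platform. It simply chains records by including each block’s hash in the next one, so altering an earlier record invalidates everything after it.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, record: dict) -> None:
    """Append a record, linking it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"timestamp": time.time(), "record": record, "prev_hash": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    """Re-compute every hash and check the links; any edit breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Toy usage: a hypothetical credential ledger.
ledger = []
add_block(ledger, {"holder": "anonymised-id-42", "credential": "medical licence"})
add_block(ledger, {"holder": "anonymised-id-42", "credential": "university degree"})
print(verify(ledger))                               # True
ledger[0]["record"]["credential"] = "forged entry"
print(verify(ledger))                               # False: tampering is detected
```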

The first implication of this technology is that it can help to establish a person’s citizenship in their state of origin and enable the registration of official records. Without this, many people would be considered stateless and granted almost no rights or diplomatic protections. For refugees, digital identities also allow peer-to-peer donation and transparent public transactions. Additionally, digital identities make it possible to practise selective disclosure, whereby individuals choose to share their records only at their own discretion.

We now need more people to get on board. We are already working with experts to discuss the potential of blockchain to improve inclusion in state-authenticated identity programmes and how to combat potential privacy challenges, in addition to e-voting systems that could allow inclusive participation in voting at all policy levels. We should all be the centre of our universe; our identity should be wholly and irrevocably our own.

In it for the long haul

Nima Arkani-Hamed

How do you view the status of particle physics?

There has never been a better time to be a physicist. The questions on the table today are not about this-or-that detail, but profound ones about the very structure of the laws of nature. The ancients could (and did) wonder about the nature of space and time and the vastness of the cosmos, but the job of a professional scientist isn’t to gape in awe at grand, vague questions – it is to work on the next question. Having ploughed through all the “easier” questions for four centuries, these very deep questions finally confront us: what are space and time? What is the origin and fate of our enormous universe? We are extremely fortunate to live in the era when human beings first get to meaningfully attack these questions. I just wish I could adjust when I was born so that I could be starting as a grad student today! But not everybody shares my enthusiasm. There is cognitive dissonance. Some people are walking around with their heads hanging low, complaining about being disappointed or even depressed that we’ve “only discovered the Higgs and nothing else”.

So who is right?

It boils down to what you think particle physics is really about, and what motivates you to get into this business. One view is that particle physics is the study of the building blocks of matter, in which “new physics” means “new particles”. This is certainly the picture of the 1960s leading to the development of the Standard Model, but it’s not what drew me to the subject. To me, “particle physics” is the study of the fundamental laws of nature, governed by the still mysterious union of space–time and quantum mechanics. Indeed, from the deepest theoretical perspective, the very definition of what a particle is invokes both quantum mechanics and relativity in a crucial way. So if the biggest excitement for you is a cross-section plot with a huge bump in it, possibly with a ticket to Stockholm attached, then, after the discovery of the Higgs, it makes perfect sense to take your ball and go home, since we can make no guarantees of this sort whatsoever. We’re in this business for the long haul of decades and centuries, and if you don’t have the stomach for it, you’d better do something else with your life!

Isn’t the Standard Model a perfect example of the scientific method?

Sure, but part of the reason for the rapid progress in the 1960s is that the intellectual structure of relativity and quantum mechanics was already sitting there to be explored and filled in. But these more revolutionary discoveries took much longer, involving a wide range of theoretical and experimental results far beyond “bump plots”. So “new physics” is much more deeply about “new phenomena” and “new principles”. The discovery of the Higgs particle – especially with nothing else accompanying it so far – is unlike anything we have seen in any state of nature, and is profoundly “new physics” in this sense. The same is true of the other dramatic experimental discovery in the past few decades: that of the accelerating universe. Both discoveries are easily accommodated in our equations, but theoretical attempts to compute the vacuum energy and the scale of the Higgs mass pose gigantic, and perhaps interrelated, theoretical challenges. While we continue to scratch our heads as theorists, the most important path forward for experimentalists is completely clear: measure the hell out of these crazy phenomena! From many points of view, the Higgs is the most important actor in this story amenable to experimental study, so I just can’t stand all the talk of being disappointed by seeing nothing but the Higgs; it’s completely backwards. I find that the physicists who worry about not being able to convince politicians are (more or less secretly) not able to convince themselves that it is worth building the next collider. Fortunately, we do have a critical mass of fantastic young experimentalists who believe it is worth studying the Higgs to death, while also exploring whatever might be at the energy frontier, with no preconceptions about what they might find.

What makes the Higgs boson such a rich target for a future collider?

It is the first example we’ve seen of the simplest possible type of elementary particle. It has no spin, no charge, only mass, and this extreme simplicity makes it theoretically perplexing. There is a striking difference between massive and massless particles that have spin. For instance, a photon is a massless particle of spin one; because it moves at the speed of light, we can’t “catch up” with it, and so we only see it have two “polarisations”, or ways it can spin. By contrast the Z boson, which also has spin one, is massive; since you can catch up with it, you can see it spinning in any of three directions. This “two not equal to three” business is quite profound. As we collide particles at ever increasing energies, we might think that their masses are irrelevant tiny perturbations to their energies, but this is wrong, since something must account for the extra degrees of freedom.
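
The counting behind this “two not equal to three” remark is standard and can be summarised compactly (a textbook aside, not part of the interview):

```latex
% Physical polarisation states of a spin-1 particle:
\text{massless (photon):}\;\; 2\ \text{helicity states}\ (\lambda=\pm1),
\qquad
\text{massive (}Z\text{):}\;\; 2s+1 = 3\ \text{states}\ (\lambda=-1,0,+1).
% In the Standard Model, the extra longitudinal (lambda = 0) states of the
% W and Z are supplied by the Brout-Englert-Higgs mechanism.
```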

The whole story of the Higgs is about accounting for this “two not equal to three” issue, to explain the extra spin states needed for massive W and Z particles mediating the weak interactions. And this also gives us a good understanding of why the masses of the elementary particles should be pegged to that of the Higgs. But the huge irony is that we don’t have any good understanding of what can explain the mass of the Higgs itself. That’s because there is no difference in the number of degrees of freedom between massive and massless spin-zero particles, and, related to this, simple estimates for the Higgs mass from its interactions with virtual particles in the vacuum are wildly wrong. There are also good theoretical arguments, amply confirmed in analogous condensed-matter systems and elsewhere in particle physics, for why we shouldn’t have expected to see such a beast alone, unaccompanied by other particles. And yet here we are. Nature clearly has other ideas for what the Higgs is about than theorists do.

Is supersymmetry still a motivation for a new collider?

Nobody who is making the case for future colliders is invoking, as a driving motivation, supersymmetry, extra dimensions or any of the other ideas that have been developed over the past 40 years for physics beyond the Standard Model. Certainly many of the versions of these ideas, which were popular in the 1980s and 1990s, are either dead or on life support given the LHC data, but others proposed in the early 2000s are alive and well. The fact that the LHC has ruled out some of the most popular pictures is a fantastic gift to us as theorists. It shows that understanding the origin of the Higgs mass must involve an even larger paradigm change than many had previously imagined. Ironically, had the LHC discovered supersymmetric particles, the case for the next circular collider would be somewhat weaker than it is now, because that would (indirectly) support a picture of a desert between the electroweak and Planck scales. In this picture of the world, most people wanted a linear electron–positron collider to measure the superpartner couplings in detail. It’s a picture people very much loved in the 1990s, and a picture that appears to be wrong. Fine. But when theorists are more confused, it’s the time for more, not less experiments.

What definitive answers will a future high-energy collider give us?

First and foremost, we go to high energies because it’s the frontier, and we look around for new things. While there is absolutely no guarantee we will produce new particles, we will definitely stress test our existing laws in the most extreme environments we have ever probed. Measuring the properties of the Higgs, however, is guaranteed to answer some burning questions. All the drama revolving around the existence of the Higgs would go away if we saw that it had substructure of any sort. But from the LHC, we have only a fuzzy picture of how point-like the Higgs is. A Higgs factory will decisively answer this question via precision measurements of the coupling of the Higgs to a slew of other particles in a very clean experimental environment. After that the ultimate question is whether or not the Higgs looks point-like even when interacting with itself. The simplest possible interaction between elementary particles is when three particles meet at a space–time point. But we have actually never seen any single elementary particle enjoy this simplest possible interaction. For good reasons going back to the basics of relativity and quantum mechanics, there is always some quantum number that must change in this interaction – either spin or charge quantum numbers change. The Higgs is the only known elementary particle allowed to have this most basic process as its dominant self-interaction. A 100 TeV collider producing billions of Higgs particles will not only detect the self-interaction, but will be able to measure it to an accuracy of a few per cent. Just thinking about the first-ever probe of this simplest possible interaction in nature gives me goosebumps.
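
To make the “three particles meeting at a point” concrete, the self-interaction in question is the cubic term of the Higgs potential after electroweak symmetry breaking – a textbook expansion quoted here for orientation, not taken from the interview:

```latex
% Standard Model Higgs potential expanded around the vacuum expectation value v:
V(h) \;=\; \tfrac{1}{2} m_h^2\, h^2 \;+\; \lambda_3\, v\, h^3 \;+\; \tfrac{1}{4}\lambda_4\, h^4,
\qquad
\lambda_3^{\rm SM} \;=\; \lambda_4^{\rm SM} \;=\; \frac{m_h^2}{2v^2} \;\approx\; 0.13
\quad (m_h \simeq 125\ \text{GeV},\ v \simeq 246\ \text{GeV}).
% The hhh vertex is the coefficient of the h^3 term; measuring it at the
% few-per-cent level is the 100 TeV goal described above.
```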

What are the prospects for future dark-matter searches?

Beyond the measurements of the Higgs properties, there are all sorts of exciting signals of new particles that can be looked for at both Higgs factories and 100 TeV colliders. One I find especially important is WIMP dark matter. There is a funny perception, somewhat paralleling the absence of supersymmetry at the LHC, that the simple paradigm of WIMP dark matter has been ruled out by direct-detection experiments. Nope! In fact, the very simplest models of WIMP dark matter are perfectly alive and well. Once the electroweak quantum numbers of the dark-matter particle are specified, you can unambiguously compute what mass an electroweak-charged dark-matter particle should have so that its thermal relic abundance is correct. You get a number between 1–3 TeV, far too heavy to be produced in any sizeable numbers at the LHC. Furthermore, such particles happen to have minuscule interaction cross sections for direct detection. So these very simplest theories of WIMP dark matter are inaccessible to the LHC and direct-detection experiments. But a 100 TeV collider has just enough juice either to see these particles or to rule out this simplest WIMP picture.
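
The “unambiguous computation” mentioned here is the standard thermal freeze-out estimate; in rough numbers (the figures are the usual order-of-magnitude values from the dark-matter literature, not from the interview):

```latex
% Thermal freeze-out: the relic density is inversely proportional to the
% annihilation cross section,
\Omega_\chi h^2 \;\approx\; 0.12 \,\times\,
\frac{2\times10^{-26}\ \text{cm}^3\,\text{s}^{-1}}{\langle\sigma v\rangle},
\qquad
\langle\sigma v\rangle \;\sim\; \frac{\pi\,\alpha_2^2}{M_\chi^2}.
% Matching the observed abundance with electroweak-strength couplings
% (alpha_2 ~ 1/30) points to M_chi of roughly 1 TeV (higgsino-like) up to
% about 3 TeV (wino-like), the range quoted in the interview.
```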

What is the cultural value of a 100 km supercollider?

Both the depth and the visceral joy of experiments in particle physics are revealed in how simply they can be explained: we smash things together with the largest machines that have ever been built, to probe the fundamental laws of nature at the tiniest distances we’ve ever seen. But it goes beyond that to something more important about our self-conception as people capable of doing great things. The world has all kinds of long-term problems, some of which might seem impossible to solve. So it’s important to have a group of people who, over centuries, give a concrete template for how to go about grappling with and ultimately conquering seemingly impossible problems, driven by a calling far larger than themselves. Furthermore, suppose it’s 200 years from now, and there are no big colliders on the planet. How can humans be sure that the Higgs or top particles exist? Because it says so in dusty old books? There is an argument to be made that, as we advance, we should still be able to do the things we did in the past. After all, the last time fundamental knowledge was shoved into dusty old books was in the dark ages, and that didn’t go very well for the West.

What about justifying the cost of the next collider?

There are a number of projects and costs we could be talking about, but let’s call it $5–25 billion. Sounds like a lot, right? But the global economy is growing, not shrinking, and the cost of accelerators as a fraction of GDP has barely changed over the past 40 years – even a 100 TeV collider is in this same ballpark. Meanwhile the scientific issues at stake are more profound than they have been for many decades, so we certainly have an honest science case to make that we need to keep going.

People sometimes say that if we don’t spend billions of dollars on colliders, then we can do all sorts of other experiments instead. I am a huge fan of small-scale experiments, but this argument is silly because science funding is infamously not a zero-sum game. So, it’s not a question of, “do we want to spend tens of billions on collider physics or something else instead”, it is rather “do we want to spend tens of billions on fundamental physics experiments at all”.

Another argument is that we should wait for some breakthrough in accelerator technology, rather than just building bigger machines. This is naïve. Of course miracles can always happen, but we can’t plan science around miracles. Similar arguments were made around the time of the cancellation of the Superconducting Super Collider (SSC) 30 years ago, with prominent condensed-matter physicists saying that the SSC should wait for the development of high-temperature superconductors that would dramatically lower the cost. Of course those dreamed-of practical superconductors never materialised, while particle physics continued from strength to strength with the best technology available.

What do you make of claims that colliders are no longer productive?

It would be only to the good to have a no-holds-barred public discussion about the pros and cons of future colliders, led by people with a deep understanding of the relevant technical and scientific issues. It’s funny that non-experts don’t even make the best arguments for not building colliders; I could do a much better job than they do! I can point you to an awesomely fierce debate about future colliders that already took place in China two years ago (Int. J. Mod. Phys. A 31 1630053 and 1630054). C N Yang, who is one of the greatest physicists of the 20th century and enormously influential in China, came out with a strong attack on colliders, not only in China but more broadly. I was delighted. Having a serious attack meant there could be a serious response, masterfully provided by David Gross. It was the King Kong vs Godzilla of fundamental physics, played out on the pages of major newspapers in China – fantastic!

What are you working on now?

About a decade ago, after a few years of thinking about the cosmology of “eternal inflation” in connection with solutions to the cosmological constant and hierarchy problems, I concluded that these mysteries can’t be understood without reconceptualising what space–time and quantum mechanics are really about. I decided to warm up by trying to understand the dynamics of particle scattering, like collisions at the LHC, from a new starting point, seeing space-time and quantum mechanics as being derived from more primitive notions. This has turned out to be a fascinating adventure, and we are seeing more and more examples of rather magical new mathematical structures, which surprisingly appear to underlie the physics of particle scattering in a wide variety of theories, some close to the real world. I am also turning my attention back to the goal that motivated the warm-up, trying to understand cosmology, as well as possible theories for the origin of the Higgs mass and cosmological constant, from this new point of view. In all my endeavours I continue to be driven, first and foremost, by the desire to connect deep theoretical ideas to experiments and the real world.


Assessing CERN’s impact on careers

Since the advent of the Large Hadron Collider (LHC), CERN has been recognised as the world’s leading laboratory for experimental particle physics. More than 10,000 people work at CERN on a daily basis. The majority are members of universities and other institutions worldwide, and many are young students and postdocs. The experience of working at CERN therefore plays an important role in their careers, be it in high-energy physics or a different domain.

The value of education

In 2016 the CERN management appointed a study group to collect information about the careers of students who completed their thesis work in one of the four LHC experiments. Similar studies were carried out in the past, also covering people who worked on the former LEP experiments, and were mainly based on questionnaires sent to the team leaders of the various collaborating institutes. The latest study collected a larger and more complete sample of up-to-date information from all the experiments, with the aim of also reaching young physicists who have left the field. This allows a quantitative measurement of the value, in finding jobs in other domains, of the education and skills acquired at CERN – of prime importance in evaluating the impact and role of CERN’s culture.

Following an initial online questionnaire with 282 respondents, the results were presented to the CERN Council in December 2016. The exercise demonstrated the potential for collecting information from a wider population and for deepening and customising the questions. Consequently, it was decided to enlarge the study to all persons who have been or are still involved with CERN, without any particular restrictions. Two distinct communities were polled with separate questionnaires: past and current CERN users (mainly experimentalists at any stage of their career), and theorists who had collaborated with the CERN theory department. The questionnaires were open for a period of about four months and attracted 2692 and 167 participants from the experimental and theoretical communities, respectively. A total of 84 nationalities were represented, with German, Italian and US nationals making up around half, and the distribution of participants by experiment was: ATLAS (994); CMS (977); LHCb (268); ALICE (102); and “other” (87), which mainly comprised members of the NA62 collaboration.

The questionnaires addressed various professional and sociological aspects: age, nationality, education, domicile and working place, time spent at CERN, acquired expertise, current position, and satisfaction with the CERN environment. Additional points were specific to those who are no longer CERN users, in relation to their current situation and type of activity. The analysis revealed some interesting trends.

For experimentalists, the CERN environment and working experience is considered satisfactory or very satisfactory by 82% of participants, a figure that is evenly distributed across nationalities. Seventy per cent of those who left high-energy physics did so mainly because of the long and uncertain path to a permanent position. Other reasons for leaving the field, although quoted by a lower percentage of participants, were interest in other domains, lack of satisfaction at work, and family reasons. The majority (63%) of participants who left high-energy physics are currently working in the private sector, often in information technology, advanced technologies and finance, where they occupy a wide range of positions and responsibilities. Those in the public sector are mainly involved in academia or education.

For those who left the field, several skills developed during their experience at CERN are considered important in their current work. The overall satisfaction of participants with their current position was high or very high for 78% of respondents, while 70% of respondents considered CERN’s impact on finding a job outside high-energy physics to be positive or very positive. CERN’s services and networks, however, were not found to be very effective in helping to find a new job – a situation that is being addressed, for example, by the recently launched CERN alumni programme.

Theorists participating in the second questionnaire mainly have permanent or tenure-track positions. A large majority of them spent time at CERN’s theory department with short- or medium-term contracts, and this experience seems to improve participants’ careers when leaving CERN for a national institution. On average, about 35% of a theorist’s scientific publications originate from collaborations started at CERN, and a large fraction of theorists (96%) declared that they are satisfied or highly satisfied with their experience at CERN.

Conclusions

As with all such surveys, there is an inherent risk of bias due to the formulation of the questions and the number and type of participants. In practice, only between 20 and 30% of the targeted populations responded, depending on the community addressed, which means the results of the poll cannot be considered representative of the whole CERN population. Nevertheless, it is clear that the impact of CERN on people’s careers is considered by a large majority of those polled to be mostly positive, with some areas for improvement, such as training and supporting the careers of those who choose to leave CERN and high-energy physics.

In the future this study could be made more significant by collecting similar information on larger samples of people, especially former CERN users. In this respect, the CERN alumni programme could help build a continuously updated database of current and former CERN users and also provide more support for people who decide to leave high-energy physics.

The final results of the survey, mostly in terms of statistical plots, together with a detailed description of the methods used to collect and analyse all the data, have been documented in a CERN Yellow Report, and will also be made available through a dedicated web page.

Fixed target, striking physics

As generations of particle colliders have come and gone, CERN’s fixed-target experiments have remained a backbone of the lab’s physics activities. Notable among them are those fed by the Super Proton Synchrotron (SPS). Throughout its long service to CERN’s accelerator complex, the 7 km-circumference SPS has provided a steady stream of high-energy proton beams to the North Area at the Prévessin site, feeding a wide variety of experiments. Sequentially named, they range from the pioneering NA1, which measured the photoproduction of vector and scalar bosons, to today’s NA64, which studies the dark sector. As the North Area marks 40 years since its first physics result, this hub of experiments large and small is as lively and productive as ever. Its users continue to drive developments in detector design, while reaping a rich harvest of fundamental physics results.

Specialised and precise

In fixed-target experiments, a particle beam collides with a target that is stationary in the laboratory frame, in most cases producing secondary particles for specific studies. High-energy machines like the SPS, which delivers proton beams with momenta up to 450 GeV/c, give the secondary products a large forward boost, providing intense sources of secondary and tertiary particles such as electrons, muons and hadrons. Compared with collider experiments, fixed-target experiments tend to be more specialised and to focus on precision measurements demanding very high statistics, such as those involving ultra-rare decays.
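
The energy-versus-intensity trade-off can be made explicit with the usual invariant-mass arithmetic (a generic illustration using the 450 GeV/c figure quoted above and a proton mass of about 0.94 GeV):

```latex
% A 450 GeV proton striking a nucleon at rest:
\sqrt{s_{\rm fixed\ target}} \;\simeq\; \sqrt{2\,E_{\rm beam}\,m_p}
\;\approx\; \sqrt{2\times450\times0.94}\ \text{GeV} \;\approx\; 29\ \text{GeV},
% versus two 450 GeV beams colliding head-on:
\qquad
\sqrt{s_{\rm collider}} \;\simeq\; 2\,E_{\rm beam} \;=\; 900\ \text{GeV}.
% The fixed-target geometry gives up collision energy, but the forward boost of
% the products is what makes intense secondary muon, electron and hadron beams possible.
```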

Fixed-target experiments have a long history at CERN, forming essential building blocks of the physics landscape in parallel with collider facilities. Among their achievements were the first studies of the quark–gluon plasma, the first evidence of direct CP violation and a detailed understanding of how nucleon spin arises from quarks and gluons. The first muons in CERN’s North Area were reported at the start of the commissioning run in March 1978, and the first physics publication – a measurement of the production rate of muon pairs by quark–antiquark annihilation, as predicted by Drell and Yan – came in 1979 from the NA3 experiment. Today, the North Area’s physics programme is as vibrant as ever.

The longevity of the North Area programme is explained by the unique complex of proton accelerators at CERN, where each machine is not only used to inject the protons into the next one but also serves its own research programme (for example, the Proton Synchrotron Booster serves the ISOLDE facility, while the Proton Synchrotron serves the Antiproton Decelerator and the n_TOF experiment). Fixed-target experiments using protons from the SPS started taking data while the ISR collider was already in operation in the late 1970s, continued during SPS operation as a proton–antiproton collider in the early 1980s, and again during the LEP and now LHC eras. As has been the case with collider experiments, physics puzzles and unexpected results were often at the origin of unique collaborations and experiments, pushing limits in several technology areas such as the first use of silicon-microstrip detectors.

The initial experimental programme in the North Area involved two large experimental halls: EHN1 for hadronic studies and EHN2 for muon experiments. The first round of experiments in EHN1 concerned studies of: meson photoproduction (NA1); electromagnetic form factors of pions and kaons (NA7); hadronic production of particles with large transverse momentum (NA3); inelastic hadron scattering (NA5); and neutron scattering (NA6). In EHN2 there were experiments devoted to studies with high-intensity muon beams (NA2 and NA4). A third, underground, area called ECN3 was added in 1980 to host experiments requiring primary proton beams and secondary beams of the highest intensity (up to 1010 particles per cycle).

Experiments in the North Area started a bit later than those in CERN’s West Area, which started operation in 1971 with 28 GeV/c protons supplied by the PS. Built to serve the last stage of the PS neutrino programme and the Omega spectrometer, the West Area zone was transformed into an SPS area in 1975 and is best known for seminal neutrino experiments (by the CDHS and CHARM collaborations, later CHORUS and NOMAD) and hadron-spectroscopy experiments with Omega. We are now used to identifying experimental collaborations by means of fancy acronyms such as ATLAS or ALICE, to mention two of the large LHC collaborations. But in the 1970s and the 1980s, one could distinguish between the experiments (identified by a sequential number) and the collaborations (identified by the list of the cities hosting the collaborating institutes). For instance CDHS stood for the CERN–Dortmund–Heidelberg–Saclay collaboration that operated the WA1 experiment in the West Area.

Los Alamos, SLAC, Fermilab and Brookhaven National Laboratory in the US, JINR and the Institute for High Energy Physics in Russia, and KEK in Japan, among others, also had fixed-target programmes, some dating back to the 1960s. As fixed-target programmes got into their stride, however, colliders were commanding the energy frontier. In 1980 the CERN North Area experimental programme was reviewed at a special meeting held in Cogne, Italy, and it was not at all obvious that a compelling physics case lay ahead. That pressure, however, led to highly optimised installations, thanks to strong collaborations and continuous support from the CERN management. Advances in detectors and innovations such as silicon detectors and aerogel Cherenkov counters, plus the hybrid integration of bubble chambers with electronic detectors, led to a revival in the study of hadron interactions at fixed-target experiments, especially for charmed mesons.

Physics landscape

Experiments at CERN’s North Area began shortly after the Standard Model had been established, when the scale of experiments was smaller than it is today. According to the 1979 CERN annual report, there were 34 active experiments at the SPS (West and North areas combined) and 14 were completed in 1978. This article cannot do justice to all of them, not even to those in the North Area. But over the past 40 years the experimental programme has clearly evolved into at least four main themes: probing nucleon structure with high-energy muons; hadroproduction and photoproduction at high energy; CP violation in very rare decays; and heavy-ion experiments (see “Forty years of fixed-target physics at CERN’s North Area”).

Aside from seminal physics results, fixed-target experiments at the North Area have driven numerous detector innovations. This is largely a result of their simple geometry and ease of access, which allow more adventurous technical solutions than might be possible at collider experiments. Examples of detector technologies perfected at the North Area include: silicon microstrips and active targets (NA11, NA14); rapid-cycling bubble chambers (NA27); holographic bubble chambers (NA25); Cherenkov detectors (CEDAR, RICH); liquid-krypton calorimeters (NA48); Micromegas gas detectors (COMPASS); silicon pixels with 100 ps time resolution (NA62); time-projection chambers with dE/dx measurement (ISIS, NA49); and many more. The sheer amount of data to be recorded in these experiments also led to the very early adoption of PC farms for the online systems of the NA48 and COMPASS experiments.

Another key function of the North Area has been to test and calibrate detectors. These range from the fixed-target experiments themselves to experiments at colliders (such as LHC, ILC and CLIC), space and balloon experiments, and bent-crystal applications (such as UA9 and NA63). New detector concepts such as dual-readout calorimetry (DREAM) and particle-flow calorimetry (CALICE) have also been developed and optimised. Recently the huge EHN1 hall was extended by 60 m to house two very large liquid-argon prototype detectors to be tested for the Deep Underground Neutrino Experiment under construction in the US.

If there is an overall theme in the development of the North Area’s fixed-target programme, it is the ability to evolve and adapt quickly to address the compelling questions of the day. This looks set to remain true, with many proposals for new experiments on the horizon, ranging from studies of very rare decays and light dark matter to studies of QCD with hadron and heavy-ion beams. There is even a study under way to extend the North Area with an additional very-high-intensity proton beam serving a beam-dump facility. These initiatives are being investigated by the Physics Beyond Colliders study (see p20), and many of the proposals explore the high-intensity frontier, complementary to the high-energy frontier at large colliders. Here’s to the next 40 years of North Area physics!

Forty years of fixed-target physics at CERN's North Area

Probing nucleon structure with high-energy muons

High-energy muons are excellent probes with which to investigate the structure of the nucleon. The North Area’s EHN2 hall was built to house two sets of muon experiments: the sequential NA2/NA9/NA28 (also known as the European Muon Collaboration, EMC), which observed that nucleons bound in nuclei differ from free nucleons; and NA4 (pictured), which confirmed interference effects between the weak and electromagnetic interactions. A particular success of the North Area’s muon experiments concerned the famous “proton spin crisis”. In the late 1980s, contrary to the expectation of the otherwise successful quark–parton model, data showed that the proton’s spin is not carried by the quark spins. This puzzle has occupied the community for decades, compelling CERN to investigate further by building the NA47 Spin Muon Collaboration experiment in the early 1990s (which established the same result for the neutron) and, subsequently, the COMPASS experiment (which studied the contribution of the gluon spins to the nucleon spin). A second phase of COMPASS, still ongoing today, is devoted to nucleon tomography using deeply virtual Compton scattering and, for the first time, polarised Drell–Yan reactions. Hadron spectroscopy is another area of research at the North Area, and among recent important results from COMPASS is the measurement of the pion polarisability, an important test of low-energy QCD.
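
The puzzle is often summarised by the decomposition of the nucleon’s spin into quark-spin, gluon-spin and orbital contributions (a standard textbook relation rather than a formula quoted by the experiments themselves):

\[
\frac{1}{2} = \frac{1}{2}\,\Delta\Sigma + \Delta G + L_q + L_g ,
\]

where ΔΣ is the contribution of the quark spins, ΔG that of the gluon spins, and L_q and L_g the orbital angular momenta of quarks and gluons. The surprise was that ΔΣ turned out to be far smaller than the quark–parton model expectation, which is what pushed the programme towards measuring ΔG and, eventually, the orbital terms.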

Hadroproduction and photoproduction at high energy

Following the first experiment to publish data in the North Area (NA3), on the production of μ+μ− pairs in hadron collisions, the ingenious combination of bubble chambers and electronic detectors led to a series of experiments. The European Hybrid Spectrometer facility housed NA13, NA16, NA22, NA23 and NA27, and studied charm production and many aspects of hadronic physics, while photoproduction of heavy bosons was the primary aim of NA1. A measurement of the charm lifetime using the first-ever silicon microstrip detectors was pioneered by the ACCMOR collaboration (NA11/NA32; see image of Robert Klanner next to the ACCMOR spectrometer in 1977), and hadron spectroscopy with neutral final states – in particular a search for glueballs – was pursued by NA12 (GAMS), which employed a large array of lead-glass counters. To study μ+μ− pairs from pion interactions at the highest possible intensities, the toroidal spectrometer NA10 was housed in the ECN3 underground cavern. Nearby in the same cavern, NA14 used a silicon active target and the first large silicon microstrip detectors (10,000 channels) to study charm photoproduction at high intensity. Later, experiment NA30 made a direct measurement of the π0 lifetime by employing thin gold foils to convert the photons from π0 decays. Today, electron beams are used by NA64 to look for dark photons, while hadron spectroscopy is still actively pursued, in particular at COMPASS.

CP violation and very rare decays

The discovery of CP violation in the decay of the long-lived neutral kaon to two pions, made at Brookhaven National Laboratory in 1964, was unexpected. To understand its origin, physicists needed to make a subtle comparison (in the form of a double ratio) between long- and short-lived neutral kaon decays into pairs of charged and neutral pions. In 1987 an ambitious experiment (NA31) showed that this double ratio deviates from one, providing the first evidence of direct CP violation (that is, CP violation occurring in the decay of the neutral mesons, not only in the mixing between neutral kaons). A second-generation experiment (NA48, pictured in 1996), located in ECN3 to accept a much higher primary-proton intensity, was able to measure the four decay modes concurrently thanks to the deflection of a tiny fraction of the primary proton beam onto a downstream target via channelling in a “bent” crystal. NA48 was approved in 1991, when it became evident that more precision was needed to confirm the original observation (a competing experiment at Fermilab, E731, did not find a significant deviation of the double ratio from unity). Both KTeV (the follow-up Fermilab experiment) and NA48 confirmed NA31’s result, firmly establishing direct CP violation. Continuations of the NA48 experiment studied rare decays of the short-lived neutral kaon and searched for direct CP violation in charged kaons. Nowadays the kaon programme continues with NA62, which is dedicated to the study of the very rare K+ → π+νν̄ decay and is complementary to the B-meson studies performed by the LHCb experiment.
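
In terms of decay widths, the double ratio compares the two CP-violating decay modes of the long- and short-lived kaons,

\[
R = \frac{\Gamma(K_L \to \pi^+\pi^-)\,/\,\Gamma(K_S \to \pi^+\pi^-)}
         {\Gamma(K_L \to \pi^0\pi^0)\,/\,\Gamma(K_S \to \pi^0\pi^0)}
  \approx 1 + 6\,\mathrm{Re}\!\left(\varepsilon'/\varepsilon\right),
\]

so a measured value of R different from one signals a non-zero ε′ and hence direct CP violation (the approximate relation is the standard small-ε′ expansion, not a formula taken from the NA31 or NA48 publications).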

Heavy-ion experiments

In the mid-1980s, with a view to reproducing in the laboratory the plasma of free quarks and gluons predicted by QCD and believed to have existed in the early universe, the SPS was modified to accelerate beams of heavy ions and collide them with nuclei. The lack of a single striking signature of the formation of the plasma demands that researchers look at as many final states as possible, exploiting the evolution of standard observables (such as the yield of muon pairs from the Drell–Yan process or the production rate of strange quarks) as a function of the degree of overlap of the nuclei participating in the collision (the centrality). By 2000 several experiments had, according to CERN Courier in March that year, found “tantalising glimpses of mechanisms that shaped our universe”. The experiments included NA44, NA45, NA49, NA50, NA52 and NA57, as well as WA97 and WA98 in the West Area. Among the most celebrated signatures was the suppression of the J/ψ yield in ion–nucleus collisions with respect to proton–proton collisions, observed by NA50. Improved sensitivity to muon pairs was provided by its successor, NA60. The current heavy-ion programme at the North Area includes NA61/SHINE (see image), the successor of NA49, which is studying the onset of phase transitions in dense quark–gluon matter at different beam energies and for different beam species. Studies of the quark–gluon plasma continue today, in particular at the LHC and at RHIC in the US. In parallel, NA61/SHINE is measuring the yields of mesons from replica targets for neutrino experiments worldwide, as well as particle production for cosmic-ray studies.
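
A commonly used way to quantify such suppression – the general heavy-ion convention rather than the specific observable published by NA50, which normalised the J/ψ yield to Drell–Yan pairs – is the nuclear modification factor,

\[
R_{AA} = \frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}
              {\langle N_{\mathrm{coll}}\rangle \;\mathrm{d}N_{pp}/\mathrm{d}p_T},
\]

which compares the yield measured in nucleus–nucleus collisions with the proton–proton yield scaled by the average number of binary nucleon–nucleon collisions; values below one indicate suppression beyond a simple superposition of independent collisions.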

Inspired by software

High-energy code

Of all the movements to make science and technology more open, the oldest is “open source” software. It was here that the “open” ideals were first articulated, and it is from open source that later movements such as open-access publishing derive. Whilst it rightly stands on this pedestal, from another point of view open-source software was simply the natural extension of academic freedom and knowledge-sharing into the digital age.

Open source has its roots in the free-software movement, which grew up in the 1980s in response to monopolising corporations and the restrictions imposed by proprietary software. The underlying ideal is open collaboration: peers freely, collectively and publicly build software solutions. A second ideal is recognition, whereby the contributions made by individuals and organisations worldwide are openly credited. A third ideal concerns rights, specifically the so-called four freedoms granted to users: to use the software for any purpose; to study the source code to understand how it works; to share and redistribute the software; and to improve the software and share the improvements with the community. Users and developers thereby contribute to a virtuous circle in which software is continuously improved and shared towards a common good, minimising vendor lock-in for users.

Today, 20 years after the term “open source” was coined, and despite initial resistance from traditional software companies, many successful open-source business models exist. These mainly involve consultancy and support services for software released under an open-source licence and extend beyond science to suppliers of everyday tools such as the WordPress platform, Firefox browser and the Android operating system. A more recent and unfortunate business model adopted by some companies is “open core”, whereby essential features are deemed premium and sold as proprietary software on top of existing open-source components.

Founding principles

Open collaboration is one of CERN’s founding principles, so it was natural to extend the principle into its software. The web’s invention brought this into sharp focus. Having experienced first-hand its potential to connect physicists around the globe, in 1993 CERN released the web software into the public domain so that developers could collaborate and improve on it (see CERN’s ultimate act of openness). The following year, CERN released the next web-server version under an open-source licence with the explicit goal of preventing private companies from turning it into proprietary software. These were crucial steps in nurturing the universal adoption of the web as a way to share digital information, and exemplars of CERN’s best practice in open-source software.

Nowadays, open-source software can be found in pretty much every corner of CERN, as in other sciences and industry. Indico and Invenio – two of the largest open-source projects developed at CERN to promote open collaboration – rely on the open-source Python framework Flask. Experimental data are stored in CERN’s Exascale Open Storage system, and most of the servers in the CERN computing centre run on OpenStack – an open-source cloud infrastructure to which CERN is an active contributor. Of course, CERN also relies heavily on the open-source GNU/Linux operating system on both servers and desktops. On the accelerator and physics-analysis side, it’s all about open source. From C2MON, a system at the heart of accelerator monitoring and data acquisition, to ROOT, the main framework used to analyse experimental data, the vast majority of the software components behind the science done at CERN are released under an open-source licence.
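
To give a flavour of what such a framework provides, the sketch below is a generic, minimal Flask service – an illustration only, not code taken from Indico or Invenio – of the kind on which larger web applications are built:

# minimal_service.py – illustrative Flask example (assumes "pip install flask")
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/status")
def status():
    # return a small JSON payload describing the service
    return jsonify(service="example", status="ok")

if __name__ == "__main__":
    # development server only; real deployments sit behind a production WSGI server
    app.run(port=5000)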

Open hardware

The success of the open-source model for software has inspired CERN engineers to create an analogous “open hardware” licence, enabling electronics designers to collaborate and to use, study, share and improve the designs of hardware components used in physics experiments. This approach has spread to many sciences and has become a lifeline for teaching and research in developing countries.

Being a scientist in the digital age means being both a software producer and a software consumer. As a result, collaborative software-development platforms such as GitHub and GitLab have become as important to the physics department as they are to the IT department. Until recently, however, the software underlying an analysis was not easily shared. CERN has therefore been developing research data-management tools to enable the publication of software and data, forming the basis of an open-data portal (see Preserving the legacy of particle physics). Naturally, this software is itself open source and has also been used to create the worldwide open-data service Zenodo, which is connected to GitHub to make the publication of open-source software a standard part of the research cycle.
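
Records can also be created programmatically. The sketch below is a minimal example based on Zenodo’s public REST API as documented at developers.zenodo.org (the endpoint and field names follow that documentation; the token and file name are placeholders):

# deposit_sketch.py – create and publish a Zenodo record (illustrative only)
import requests

TOKEN = "YOUR-ZENODO-ACCESS-TOKEN"  # personal access token (placeholder)
BASE = "https://zenodo.org/api/deposit/depositions"

# 1. create an empty deposition
r = requests.post(BASE, params={"access_token": TOKEN}, json={})
r.raise_for_status()
dep = r.json()

# 2. upload a file to the deposition's bucket
with open("analysis-code.tar.gz", "rb") as fp:
    requests.put(f"{dep['links']['bucket']}/analysis-code.tar.gz",
                 data=fp, params={"access_token": TOKEN}).raise_for_status()

# 3. attach minimal metadata
metadata = {"metadata": {"title": "Example analysis code",
                         "upload_type": "software",
                         "description": "Illustrative deposit.",
                         "creators": [{"name": "Doe, Jane"}]}}
requests.put(f"{BASE}/{dep['id']}", params={"access_token": TOKEN},
             json=metadata).raise_for_status()

# 4. publish the record (makes it public and mints a DOI)
requests.post(f"{BASE}/{dep['id']}/actions/publish",
              params={"access_token": TOKEN}).raise_for_status()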

Interestingly, as in the early days of open source, many corners of the scientific community remain hesitant about open science. Some worry that their software and data are not of sufficient quality or interest to be shared, or that sharing will help others beat them to the next discovery. To win over the sceptics, open science can learn from the open-source movement by adopting standard licences, credit systems, collaborative development techniques and shared governance. In this way, it too will be able to reap the benefits of open collaboration: transparency, efficiency, perpetuity and flexibility.
