
Centennial conference honours Feynman

2018 marked the 100th anniversary of the birth of Richard Feynman. As one of several events worldwide celebrating this remarkable figure in physics, a memorial conference was held at the Institute of Advanced Studies at Nanyang Technological University in Singapore from 22 to 24 October, co-chaired by Lars Brink, KK Phua and Frank Wilczek. The format was one-hour talks followed by 45-minute discussions.

Pierre Ramond began the conference with anecdotes from his time as Feynman’s next-door neighbour at Caltech. He discussed Feynman the MIT undergraduate, his first paper and his work at Princeton as a graduate student. There, Feynman learnt about Dirac’s idea of summing over histories from Herbert Jehle; when Jehle asked him about it a few days later, Feynman said that he had understood it and had derived the Schrödinger equation from it. Feynman’s adviser, John Wheeler, was toying with the idea of a single electron travelling back and forth in time – were you to look at a slice of time, you would observe many electrons and positrons. After his spell at Los Alamos, Feynman developed these ideas into the propagator, which considers antiparticles propagating backwards in time as well as particles propagating forwards. These ideas would soon underpin the quantum description of electromagnetism – QED – for which Feynman shared the 1965 Nobel Prize in Physics with Tomonaga and Schwinger.

Revolutionary diagrams

The propagator was the key to the eponymous diagrams Feynman then formulated to compute the Lamb shift and other quantities. At the Singapore conference, Lance Dixon described how Feynman diagrams revolutionised the calculation of scattering amplitudes. He offered as an example the calculation of the anomalous magnetic moment of the electron, which has now reached five-loop precision and includes 12,672 diagrams. Dixon also discussed the importance of Feynman’s parton picture for understanding deep-inelastic scattering, and the staggeringly complex calculations required to understand data at the LHC.
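To give a sense of what those loop orders mean, the first term in that perturbative series is Schwinger’s one-loop result, each further power of the fine-structure constant α corresponding to one more loop order – and to a rapidly growing number of diagrams:

\[
a_e \;=\; \frac{g-2}{2} \;=\; \frac{\alpha}{2\pi} + O(\alpha^2) \;\approx\; 0.00116 .
\]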

George Zweig, the most famous of Feynman’s students, and the inventor of “aces” as the fundamental constituents of matter, gave a vivid talk, recounting that it took a long time to convince a sceptical Feynman about them. He described life in the shadow of the great man as a graduate student at Caltech in the 1960s. At that time Feynman wanted to solve quantum gravity, and was giving a course on gravitation. He asked the students to suppose that Einstein had never lived: how would particle physicists discuss gravity? He quickly explained that there must be a spin-two particle mediating the force; by the second lecture he had computed the precession of the perihelion of Mercury, a point that other courses took months to reach. Zweig recounted that Feynman’s failure to invent a renormalisable theory of quantum gravity affected him for many years. Though he did not succeed, his insights continue to resound today. As Ramond earlier explained, Feynman’s contribution to a conference in Chapel Hill in 1957, his first public intervention on the subject, is now seen as the starting point for discussions on how to measure gravitational waves.

Cristiane Morais-Smith spoke on Feynman’s path integrals, comparing Hamiltonian and Lagrangian formulations, and showing their importance in perturbative QED. Michael Creutz, the son of one of Feynman’s colleagues at Princeton and Los Alamos, showed how the path integral is also essential for working on the inherently non-perturbative theory of quantum chromodynamics. Morais-Smith went on to illustrate how Feynman’s path integrals now have a plethora of applications outside particle physics, from graphene to quantum Brownian motion and dissipative quantum tunnelling. Indeed, the conference did not neglect Feynman’s famous interventions outside particle physics. Frank Wilczek recounted Feynman’s insight that there is “plenty of room at the bottom”, telling of the legendary after-dinner talk in 1959 that foreshadowed many developments in nanotechnology. Wilczek concluded that there is plenty of room left in Hilbert space, describing entanglement, quantum cryptography, quantum computation and quantum simulations. Quantum computing was the last subject that Feynman worked hard on. Artur Ekert described the famous conference at MIT in 1981 when Feynman first talked about the subject. His paper from that occasion, “Simulating Physics with Computers”, was the first paper on quantum computers and set the stage for the present developments.

Biology hangout

Feynman also had a long-standing interest in biology. Curtis Callan painted a picture of Feynman “hanging out” in Max Delbrück’s laboratory at Caltech, even taking a sabbatical at the beginning of the 1960s to work there, exploring the molecular workings of heredity. In 1969 he gave the famous Hughes Aerospace lectures, offering a grand overview of biology and chemistry – but this was also the time of the parton model, and somehow that interest took over.

Robbert Dijkgraaf spoke about the interplay between art and science in Feynman’s life and thinking. He pointed out how important beauty is, not only in nature, but also in mathematics, for instance whether one uses a geometric or algebraic approach. Another moving moment of this wide-ranging celebration of Feynman’s life and physics was Michelle Feynman’s words about growing up with her father. She showed him both as a family man and also as a scientist, sharing his enthusiasm for so many things in life.

  • Recordings of the presentations are available online.

Serbia becomes CERN Member State

Serbia became the 23rd Member State of CERN on 24 March, following receipt of formal notification from UNESCO. Ever since the early days of CERN (the former Yugoslavia was one of the 12 founding Member States in 1954, until its departure in 1961), the Serbian scientific community has made strong contributions to CERN’s projects, including at the Synchrocyclotron, Proton Synchrotron and Super Proton Synchrotron facilities. In the 1980s and 1990s, physicists from Serbia worked on the DELPHI experiment at CERN’s LEP collider. In 2001, CERN and Serbia concluded an International Cooperation Agreement, leading to Serbia’s participation in the ATLAS and CMS experiments at the LHC, in the Worldwide LHC Computing Grid, as well as in the ACE and NA61 experiments. Serbia’s main involvement with CERN today is in the ATLAS and CMS experiments, in the ISOLDE facility, and on design studies for future particle colliders – FCC and CLIC – both of which are potential new flagship projects at CERN.

Serbia was an Associate Member in the pre-stage to membership from March 2012. As a Member State, Serbia will have voting rights in the CERN Council, while the new status will also enhance the recruitment opportunities for Serbian nationals at CERN and for Serbian industry to bid for CERN contracts. “Investing in scientific research is important for the development of our economy and CERN is one of the most important scientific institutions today,” says Ana Brnabić, Prime Minister of Serbia. “I am immensely proud that Serbia has become a fully-fledged CERN Member State. This will bring new possibilities for our scientists and industry to work in cooperation with CERN and fellow CERN Member States.”

Welcome to the Science Gateway

On 8 April, CERN unveiled plans for a major new facility for scientific education and outreach. Aimed at audiences of all ages, the Science Gateway will include exhibition spaces, hands-on scientific experiments for schoolchildren and students, and a large amphitheatre to host science events for experts and non-experts alike. It is intended to satisfy the curiosity of hundreds of thousands of visitors every year and is core to CERN’s mission to educate and engage the public in science.

“We will be able to share with everybody the fascination of exploring and learning how matter and the universe work, the advanced technologies we need to develop in order to build our ambitious instruments and their impact on society, and how science can influence our daily life,” says CERN director-general, Fabiola Gianotti. “I am deeply grateful to the donors for their crucial support in the fulfilment of this beautiful project.”

The overall cost of the Science Gateway, estimated at 79 million Swiss francs, is entirely funded through donations. Almost three quarters of the cost has already been secured, thanks in particular to a contribution of 45 million Swiss francs from Fiat Chrysler Automobiles. Other donors include a private foundation in Geneva and Loterie Romande, which distributes its profits to public-utility projects. CERN is looking for additional donations to cover the full cost of the project.

The Science Gateway will be hosted in iconic buildings with a 7000 m² footprint, linking CERN’s Meyrin site and the Globe of Science and Innovation. It is being designed by the renowned architects Renzo Piano Building Workshop and is intended to “celebrate the inventiveness and creativity that characterise the world of research and engineering”. Construction is planned to start in 2020 and be completed in 2022.

SESAME synchrotron goes all-solar

On 26 February, a new solar power plant powering the SESAME light source in Jordan was officially inaugurated. In addition to being the first synchrotron-light facility in the Middle East region, SESAME is now the world’s first major research infrastructure to be fully powered by renewable energy.

Electricity from the solar power plant will be supplied by an on-grid photovoltaic system constructed 30 km away, and its 6.48 MW power capacity is ample to satisfy SESAME’s needs for several years. “As in the case of all accelerators, SESAME is in dire need of energy, and as the number of its users increases so will its electricity bill,” says SESAME director Khaled Toukan. “Given the very high cost of electricity in Jordan, with this solar power plant the centre becomes sustainable.”

Energy efficiency and other environmental factors are coming under growing scrutiny at large research infrastructures worldwide. The necessary funding for the SESAME installation became available in late 2016, when the Government of Jordan agreed to allocate JD 5 million (US$7.05 million) from funds provided by the European Union (EU) to support the deployment of clean energy sources. The power plant, which uses monocrystalline solar panels, was built by the Jordanian company Kawar Energy, and power transmitted to the grid will be credited to SESAME.

SESAME opened its beamlines to users in July 2018. Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Palestine and Turkey are currently members of SESAME, with 16 further countries – plus CERN and the EU – listed as observers.

Physicists digest Japan’s ILC statement

The Japanese government has put on hold a decision about hosting the International Linear Collider (ILC), to the disappointment of many hoping for clarity ahead of the update of the European strategy for particle physics. At a meeting in Tokyo on 6–7 March, Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) announced, with input from the Science Council of Japan (SCJ), that it has “not yet reached declaration” for hosting the ILC at this time. A statement from MEXT continued: “The ILC project requires further discussion in formal academic decision-making processes such as the SCJ Master Plan, where it has to be clarified whether the ILC project can gain understanding and support from the domestic academic community… MEXT will continue to discuss the ILC project with other governments while having an interest in the ILC project.”

The keenly awaited announcement was made during the 83rd meeting of the International Committee for Future Accelerators (ICFA) at the University of Tokyo. During a press briefing, ICFA chair Geoffrey Taylor emphasised that colliders are long-term projects. “At the last strategy update in 2013 the ILC was seen as an important development in the field, and we were hoping there would be a definite statement from Japan so that it can be incorporated into the current strategy update,” he said. “We don’t have that positive endorsement, so it will proceed at a slower rate than we hoped. ICFA still supports Japan as hosts of the ILC, and we hope it is built here because Japan has been working hard towards it. If not, we can be sure that there will be somewhere else in the world where the project can be taken up.”

The story of the ILC, an electron–positron collider that would serve as a Higgs factory, goes back more than 15 years. In 2012, physicists in Japan submitted a petition to the Japanese government to host the project. A technical design report was published the following year. In 2017, the original ILC design was revised to reduce its centre-of-mass energy by half, shortening it by around a third and reducing its cost by up to 40%.

Meanwhile, MEXT has been weighing up the ILC project in terms of its scientific significance, technical challenges, cost and other factors. In December 2018, the SCJ submitted a critical report to MEXT highlighting perceived issues with the project, including its cost and international organisation. Asked at the March press briefing why the SCJ should now be expected to change its views on the ILC, KEK director-general Masanori Yamauchi responded: “We can show that we already have solutions for the technical challenges pointed out in the latest SCJ report, and we are going to start making a framework for international cost-sharing.”

Writing in LC NewsLine, Lyn Evans, director of the Linear Collider Collaboration (which coordinates planning and research for the ILC and CERN’s Compact Linear Collider, CLIC), remains upbeat: “We did not get the green light we hoped for. Nevertheless, there was a significant step forward with a strong political statement and, for the first time, a declaration of interest in further discussions by a senior member of the executive. We will continue to push hard.”

Japan’s statement has also been widely interpreted as a polite way for the government to say “no” to the ILC. “The reality is that it is naturally difficult for people outside the machinery of any national administration to understand fully how procedures operate, and this is certainly true of the rest of the world with regard to what is truly happening with ILC in Japan,” says Phil Burrows of the University of Oxford, who is spokesperson for the CLIC accelerator collaboration.

A full spectrum of views was expressed at a meeting of the linear-collider community in Lausanne, Switzerland, on 8–9 April, with around 100 people present. “The global community represented at the Lausanne meeting restated the overwhelming physics case for an electron–positron collider to make precision measurements in the Higgs and top-quark sectors, with superb sensitivity to new physics,” says Burrows. “We are in the remarkable situation that we have not one, but two, mature options for doing this: ILC and CLIC. I hope that the European Strategy Update recommendations will reflect this consensus on the physics case, position Europe to play a leading role, and hence ensure that one of these projects proceeds to realisation.”

Assessing CERN’s impact on careers

Since the advent of the Large Hadron Collider (LHC), CERN has been recognised as the world’s leading laboratory for experimental particle physics. More than 10,000 people work at CERN on a daily basis. The majority are members of universities and other institutions worldwide, and many are young students and postdocs. The experience of working at CERN therefore plays an important role in their careers, be it in high-energy physics or a different domain.

The value of education

In 2016 the CERN management appointed a study group to collect information about the careers of students who have completed their thesis studies in one of the four LHC experiments. Similar studies were carried out in the past, also covering people who worked on the former LEP experiments, and were mainly based on questionnaires sent to the team leaders of the various collaborating institutes. The latest study collected a larger and more complete sample of up-to-date information from all the experiments, with the particular aim of reaching young physicists who have left the field. This allows a quantitative measurement of the value of the education and skills acquired at CERN in finding jobs in other domains, which is of prime importance for evaluating the impact and role of CERN’s culture.

Following an initial online questionnaire with 282 respondents, the results were presented to the CERN Council in December 2016. The experience demonstrated the potential for collecting information from a wider population and also for deepening and customising the questions. Consequently, it was decided to enlarge the study to all persons who have been or are still involved with CERN, without any particular restrictions. Two distinct communities were polled with separate questionnaires: past and current CERN users (mainly experimentalists at any stage of their career), and theorists who had collaborated with the CERN theory department. The questionnaires were open for a period of about four months and attracted 2692 and 167 participants from the experimental and theoretical communities, respectively. A total of 84 nationalities were represented, with German, Italian and US nationals making up around half, and the distribution of participants by experiment was: ATLAS (994); CMS (977); LHCb (268); ALICE (102); and “other” (87), which mainly included members of the NA62 collaboration.

The questionnaires addressed various professional and sociological aspects: age, nationality, education, domicile and working place, time spent at CERN, acquired expertise, current position, and satisfaction with the CERN environment. Additional points were specific to those who are no longer CERN users, in relation to their current situation and type of activity. The analysis revealed some interesting trends.

For experimentalists, the CERN environment and working experience is considered satisfactory or very satisfactory by 82% of participants, a figure that is evenly distributed across nationalities. Seventy per cent of those who left high-energy physics did so mainly because of the long and uncertain path to obtaining a permanent position. Other reasons for leaving the field, although quoted by a lower percentage of participants, were: interest in other domains; lack of satisfaction at work; and family reasons. The majority of participants (63%) who left high-energy physics are currently working in the private sector, often in the information-technology, advanced-technology and finance domains, where they occupy a wide range of positions and responsibilities. Those in the public sector are mainly involved in academia or education.

For those who left the field, several skills developed during their experience at CERN are considered important in their current work. The overall satisfaction of participants with their current position was high or very high for 78% of respondents, while 70% of respondents considered CERN’s impact on finding a job outside high-energy physics to be positive or very positive. CERN’s services and networks, however, are not found to be very effective in helping to find a new job – a situation that is being addressed, for example, by the recently launched CERN alumni programme.

Theorists participating in the second questionnaire mainly have permanent or tenure-track positions. A large majority of them spent time in CERN’s theory department on short- or medium-term contracts, and this experience seems to benefit participants’ careers when they leave CERN for a national institution. On average, about 35% of a theorist’s scientific publications originate from collaborations started at CERN, and a large fraction of theorists (96%) declared that they are satisfied or highly satisfied with their experience at CERN.

Conclusions

As with all such surveys, there is an inherent risk of bias due to the formulation of the questions and the number and type of participants. In practice, only between 20 and 30% of the targeted populations responded, depending on the community addressed, which means the results of the poll cannot be considered representative of the whole CERN population. Nevertheless, it is clear that the impact of CERN on people’s careers is considered by a large majority of those polled to be mostly positive, with some areas for improvement, such as training and supporting the careers of those who choose to leave CERN and high-energy physics.

In the future this study could be made more significant by collecting similar information on larger samples of people, especially former CERN users. In this respect, the CERN alumni programme could help build a continuously updated database of current and former CERN users and also provide more support for people who decide to leave high-energy physics.

The final results of the survey, mostly in terms of statistical plots, together with a detailed description of the methods used to collect and analyse all the data, have been documented in a CERN Yellow Report, and will also be made available through a dedicated web page.

Fixed target, striking physics

As generations of particle colliders have come and gone, CERN’s fixed-target experiments have remained a backbone of the lab’s physics activities. Notable among them are those fed by the Super Proton Synchrotron (SPS). Throughout its long service to CERN’s accelerator complex, the 7 km-circumference SPS has provided a steady stream of high-energy proton beams to the North Area at the Prévessin site, feeding a wide variety of experiments. Sequentially named, they range from the pioneering NA1, which measured the photoproduction of vector and scalar bosons, to today’s NA64, which studies the dark sector. As the North Area marks 40 years since its first physics result, this hub of experiments large and small is as lively and productive as ever. Its users continue to drive developments in detector design, while reaping a rich harvest of fundamental physics results.

Specialised and precise

In fixed-target experiments, a particle beam collides with a target that is stationary in the laboratory frame, in most cases producing secondary particles for specific studies. High-energy machines like the SPS, which produces proton beams with momenta up to 450 GeV/c, give the secondary products a large forward boost, providing intense sources of secondary and tertiary particles such as electrons, muons and hadrons. Compared with collider experiments, fixed-target experiments tend to be more specialised and focus on precision measurements that demand very high statistics, such as those involving ultra-rare decays.
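As a rough illustration of the trade-off with colliders, consider a 450 GeV proton striking a proton at rest: the available centre-of-mass energy grows only with the square root of the beam energy, whereas two colliding beams add their energies directly,

\[
\sqrt{s}_{\text{fixed target}} \simeq \sqrt{2\,E_{\text{beam}}\,m_p} = \sqrt{2 \times 450~\text{GeV} \times 0.94~\text{GeV}} \approx 29~\text{GeV},
\qquad
\sqrt{s}_{\text{collider}} = 2\,E_{\text{beam}} = 900~\text{GeV}.
\]

This is why colliders command the energy frontier, while fixed targets excel at delivering intensity to secondary beams.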

Fixed-target experiments have a long history at CERN, forming essential building blocks of the physics landscape in parallel to collider facilities. Milestones include the first studies of the quark–gluon plasma, the first evidence of direct CP violation and a detailed understanding of how nucleon spin arises from quarks and gluons. The first muons in CERN’s North Area were reported at the start of the commissioning run in March 1978, and the first physics publication – a measurement of the production rate of muon pairs by quark–antiquark annihilation, as predicted by Drell and Yan – came in 1979 from the NA3 experiment. Today, the North Area’s physics programme is as vibrant as ever.

The longevity of the North Area programme is explained by the unique complex of proton accelerators at CERN, where each machine is not only used to inject protons into the next one but also serves its own research programme (for example, the Proton Synchrotron Booster serves the ISOLDE facility, while the Proton Synchrotron serves the Antiproton Decelerator and the n_TOF experiment). Fixed-target experiments using protons from the SPS started taking data while the ISR collider was already in operation in the late 1970s, continued during SPS operation as a proton–antiproton collider in the early 1980s, and carried on through the LEP and now LHC eras. As has been the case with collider experiments, physics puzzles and unexpected results were often at the origin of unique collaborations and experiments, pushing limits in several technology areas, such as the first use of silicon-microstrip detectors.

The initial experimental programme in the North Area involved two large experimental halls: EHN1 for hadronic studies and EHN2 for muon experiments. The first round of experiments in EHN1 concerned studies of: meson photoproduction (NA1); electromagnetic form factors of pions and kaons (NA7); hadronic production of particles with large transverse momentum (NA3); inelastic hadron scattering (NA5); and neutron scattering (NA6). In EHN2 there were experiments devoted to studies with high-intensity muon beams (NA2 and NA4). A third, underground, area called ECN3 was added in 1980 to host experiments requiring primary proton beams and secondary beams of the highest intensity (up to 1010 particles per cycle).

Experiments in the North Area started a little later than those in CERN’s West Area, which began operation in 1971 with 28 GeV/c protons supplied by the PS. Built to serve the last stage of the PS neutrino programme and the Omega spectrometer, the West Area was transformed into an SPS area in 1975 and is best known for seminal neutrino experiments (by the CDHS and CHARM collaborations, later CHORUS and NOMAD) and hadron-spectroscopy experiments with Omega. We are now used to identifying experimental collaborations by means of fancy acronyms such as ATLAS or ALICE, to mention two of the large LHC collaborations. But in the 1970s and 1980s, one could distinguish between the experiments (identified by a sequential number) and the collaborations (identified by the list of cities hosting the collaborating institutes). For instance, CDHS stood for the CERN–Dortmund–Heidelberg–Saclay collaboration that operated the WA1 experiment in the West Area.

Los Alamos, SLAC, Fermilab and Brookhaven National Laboratory in the US, JINR and the Institute for High Energy Physics in Russia, and KEK in Japan, for example, also all had fixed-target programmes, some of which date back to the 1960s. As fixed-target programmes got into their stride, however, colliders were commanding the energy frontier. In 1980 the CERN North Area experimental programme was reviewed at a special meeting held in Cogne, Italy, and it was not completely obvious that there was a compelling physics case ahead. The review nevertheless led to highly optimised installations, thanks to strong collaborations and continuous support from the CERN management. Advances in detectors and innovations such as silicon detectors and aerogel Cherenkov counters, plus the hybrid integration of bubble chambers with electronic detectors, led to a revamp of the study of hadron interactions at fixed-target experiments, especially for charmed mesons.

Physics landscape

Experiments at CERN’s North Area began shortly after the Standard Model had been established, when the scale of experiments was smaller than it is today. According to the 1979 CERN annual report, there were 34 active experiments at the SPS (West and North areas combined) and 14 were completed in 1978. This article cannot do justice to all of them, not even to those in the North Area. But over the past 40 years the experimental programme has clearly evolved into at least four main themes: probing nucleon structure with high-energy muons; hadroproduction and photoproduction at high energy; CP violation in very rare decays; and heavy-ion experiments (see “Forty years of fixed-target physics at CERN’s North Area”).

Aside from seminal physics results, fixed-target experiments at the North Area have driven numerous detector innovations. This is largely a result of their simple geometry and ease of access, which allows more adventurous technical solutions than might be possible with collider experiments. Examples of detector technologies perfected at the North Area include: silicon microstrips and active targets (NA11, NA14); rapid-cycling bubble chambers (NA27); holographic bubble chambers (NA25); Cherenkov detectors (CEDAR, RICH); liquid-krypton calorimeters (NA48); micromegas gas detectors (COMPASS); silicon pixels with 100 ps time resolution (NA62); time-projection chambers with dE/dx measurement (ISIS, NA49); and many more. The sheer amount of data to be recorded in these experiments also led to the very early adoption of PC farms for the online systems of the NA48 and COMPASS experiments.

Another key function of the North Area has been to test and calibrate detectors. These range from the fixed-target experiments themselves to experiments at colliders (such as LHC, ILC and CLIC), space and balloon experiments, and bent-crystal applications (such as UA9 and NA63). New detector concepts such as dual-readout calorimetry (DREAM) and particle-flow calorimetry (CALICE) have also been developed and optimised. Recently the huge EHN1 hall was extended by 60 m to house two very large liquid-argon prototype detectors to be tested for the Deep Underground Neutrino Experiment under construction in the US.

If there is an overall theme concerning the development of the fixed-target programme in the North Area, one could say that it was to be able to quickly evolve and adapt to address the compelling questions of the day. This looks set to remain true, with many proposals for new experiments appearing on the horizon, ranging from the study of very rare decays and light dark matter to the study of QCD with hadron and heavy-ion beams. There is even a study under way to possibly extend the North Area with an additional very-high-intensity proton beam serving a beam dump facility. These initiatives are being investigated by the Physics Beyond Collider study (see p20), and many of the proposals explore the high-intensity frontier complementary to the high-energy frontier at large colliders. Here’s to the next 40 years of North Area physics!

Forty years of fixed-target physics at CERN's North Area

Probing nucleon structure with high-energy muons

High-energy muons are excellent probes with which to investigate the structure of the nucleon. The North Area’s EHN2 hall was built to house two sets of muon experiments: the sequential NA2/NA9/NA28 (also known as the European Muon Collaboration, EMC), which observed that nucleons bound in nuclei are different from free nucleons; and NA4 (pictured), which confirmed interference effects between the weak and electromagnetic interactions. A particular success of the North Area’s muon experiments concerned the famous “proton spin crisis”. In the late 1980s, contrary to the expectation from the otherwise successful quark–parton model, data showed that the proton’s spin is not carried by the quark spins. This puzzle interested the community for decades, compelling CERN to investigate further by building the NA47 Spin Muon Collaboration experiment in the early 1990s (which established the same result for the neutron) and, subsequently, the COMPASS experiment (which studied the contribution of the gluon spins to the nucleon spin). A second phase of COMPASS, still ongoing today, is devoted to nucleon tomography using deeply virtual Compton scattering and, for the first time, polarised Drell–Yan reactions. Hadron spectroscopy is another area of research at the North Area, and among recent important results from COMPASS is the measurement of the pion polarisability, which is an important test of low-energy QCD.

Hadroproduction and photoproduction at high energy

Following the first experiment to publish data in the North Area (NA3), concerning the production of μ+μ− pairs in hadron collisions, the ingenuity to combine bubble chambers and electronic detectors led to a series of experiments. The European Hybrid Spectrometer facility housed NA13, NA16, NA22, NA23 and NA27, and studied charm production and many aspects of hadronic physics, while photoproduction of heavy bosons was the primary aim of NA1. A measurement of the charm lifetime using the first-ever silicon microstrip detectors was pioneered by the ACCMOR collaboration (NA11/NA32; see image of Robert Klanner next to the ACCMOR spectrometer in 1977), and hadron spectroscopy with neutral final states was studied by NA12 (GAMS), which employed a large array of lead-glass counters, in particular to search for glueballs. To study μ+μ− pairs from pion interactions at the highest possible intensities, the toroidal spectrometer NA10 was housed in the ECN3 underground cavern. Nearby in the same cavern, NA14 used a silicon active target and the first big silicon microstrip detectors (10,000 channels) to study charm photoproduction at high intensity. Later, experiment NA30 made a direct measurement of the π0 lifetime by employing thin gold foils to convert the photons from π0 decays. Today, electron beams are used by NA64 to look for dark photons, while hadron spectroscopy is still actively pursued, in particular at COMPASS.

CP violation and very rare decays

The discovery of CP violation in the decay of the long-lived neutral kaon to two pions, at Brookhaven National Laboratory in 1964, was unexpected. To understand its origin, physicists needed to make a subtle comparison (in the form of a double ratio) between long- and short-lived neutral kaon decays into pairs of neutral and charged pions. In 1987 an ambitious experiment (NA31) showed a deviation of the double ratio from one, providing the first evidence of direct CP violation (that is, CP violation occurring in the decay of the neutral mesons, not only in the mixing between neutral kaons). A second-generation experiment (NA48, pictured in 1996), located in ECN3 to accept a much higher primary-proton intensity, was able to measure the four decay modes concurrently thanks to the deflection of a tiny fraction of the primary proton beam onto a downstream target via channelling in a “bent” crystal. NA48 was approved in 1991, when it became evident that more precision was needed to confirm the original observation (a competing programme at Fermilab, called E731, did not find a significant deviation of the double ratio from unity). Both KTeV (the follow-up Fermilab experiment) and NA48 confirmed NA31’s results, firmly establishing direct CP violation. Continuations of the NA48 experiment studied rare decays of the short-lived neutral kaon and searched for direct CP violation in charged kaons. Nowadays the kaon programme continues with NA62, which is dedicated to the study of the very rare decay K+ → π+νν̄ and is complementary to the B-meson studies performed by the LHCb experiment.
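For reference, the double ratio in question compares the two-pion decay rates of the long- and short-lived neutral kaons, and its deviation from one measures the direct CP-violation parameter ε′/ε:

\[
R \;=\; \frac{\Gamma(K_L \to \pi^0\pi^0)\,/\,\Gamma(K_S \to \pi^0\pi^0)}{\Gamma(K_L \to \pi^+\pi^-)\,/\,\Gamma(K_S \to \pi^+\pi^-)} \;\approx\; 1 - 6\,\mathrm{Re}\!\left(\varepsilon'/\varepsilon\right).
\]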

Heavy-ion experiments

In the mid-1980s, with a view to reproducing in the laboratory the plasma of free quarks and gluons predicted by QCD and believed to have existed in the early universe, the SPS was modified to accelerate beams of heavy ions and collide them with nuclei. The lack of a single striking signature of the formation of the plasma demands that researchers look for as many final states as possible, exploiting the evolution of standard observables (such as the yield of muon pairs from the Drell–Yan process or the production rate of strange quarks) as a function of the degree of overlap of the nuclei that participate in the collision (centrality). By 2000 several experiments had, according to CERN Courier in March that year, found “tantalising glimpses of mechanisms that shaped our universe”. The experiments included NA44, NA45, NA49, NA50, NA52 and NA57, as well as WA97 and WA98 in the West Area. Among the most striking signatures observed was the suppression of the J/ψ yield in ion–nucleus collisions with respect to proton–proton collisions, which was seen by NA50. Improved sensitivity to muon pairs was provided by the successor experiment NA60. The current heavy-ion programme at the North Area includes NA61/SHINE (see image), the successor of NA49, which is studying the onset of phase transitions in dense quark–gluon matter at different beam energies and for different beam species. Studies of the quark–gluon plasma continue today, in particular at the LHC and at RHIC in the US. At the same time, NA61/SHINE is measuring the yield of mesons from replica targets for neutrino experiments worldwide and particle production for cosmic-ray studies.

A turning point for open-access publishing

High-energy physics (HEP) has been at the forefront of open-access publishing, the long-sought ideal to make scientific literature freely available. An early precursor to the open-access movement in the late 1960s was the database management system SPIRES (Stanford Physics Information Retrieval System), which aggregated all available (paper-copy) preprints that were sent between different institutions. SPIRES grew to become the first database accessible through the web in 1991 and later evolved into INSPIRE-HEP, hosted and managed by CERN in collaboration with other research laboratories.

The electronic era

The birth of the web in 1989 changed the publishing scene irreversibly. Vast sums were invested to take the industry from paper to online and to digitise old content, resulting in a migration from the sale of printed copies of journals to electronic subscriptions. From 1991, helped by early adoption among particle physicists, the self-archiving repository arXiv.org allowed rapid distribution of electronic preprints in physics and, later, mathematics, astronomy and other sciences. The first open-access journals then began to sprout up, and in the early 2000s three major international initiatives – the Budapest Open Access Initiative, the Bethesda Statement on Open Access Publishing and the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities – set about leveraging the new technology to grant universal free access to the results of scientific research.

Today, roughly one quarter of all scholarly literature in sciences and humanities is open access. In HEP, the figure is almost 90%. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3), a global partnership between libraries, national funding agencies and publishers of HEP journals, has played an important role in HEP’s success. Designed at CERN, SCOAP3 started operation in 2014 and removes subscription fees for journals and any expenses scientists might incur to publish their articles open access by paying publishers directly. Some 3000 institutions from 43 countries (figure 1) contribute financially according to their scientific output in the field, re-using funds previously spent on subscription fees for journals that are now open access.

“SCOAP3 has demonstrated how open access can increase the visibility of research and ease the dissemination of scientific results for the benefit of everyone,” says SCOAP3 operations manager Alex Kohls of CERN. “This initiative was made possible by a strong collaboration of the worldwide library community, researchers, as well as commercial and society publishers, and it can certainly serve as an inspiration for open access in other fields.”

Plan S

On 4 September 2018, a group of national funding agencies, the European Commission (EC) and the European Research Council – under the name “cOAlition S” – launched a radical initiative called Plan S. Its aim is to ensure that, by 2020, all scientific publications that result from research funded by public grants must be published in compliant open-access journals or platforms. Robert-Jan Smits, the EC’s open-access envoy and one of the architects of Plan S, cites SCOAP3 as an inspiration for the project and says that momentum for Plan S has been building for two decades. “During those years many declarations, such as the Budapest and Berlin ones, were adopted, calling for a rapid transition to full and immediate open access. Even the 28 science ministers of the European Union issued a joint statement in 2016 that open access to scientific publications should be a reality by 2020,” says Smits. “The current situation shows, however, that there is still a long way to go.”

Recently, China released position papers supporting the efforts of Plan S, which could mark a key moment for the project. But the reaction of scientists around the world has been mixed. An open letter published in September by biochemist Lynn Kamerlin of Uppsala University in Sweden, which had attracted more than 1600 signatures at the time of writing, argues that Plan S would strongly reduce the possibility of publishing in suitable high-quality scientific journals, possibly splitting the global scientific community into two separate systems. Another open letter, published in November by biologist Michael Eisen of the University of California, Berkeley, with around 2000 signatures, backs the principles of Plan S and supports its commitment “to continue working with funders, universities, research institutions and other stakeholders until we have created a stable, fair, effective and open system of scholarly communication.”

Challenges ahead

High-energy physics is already aligned to the Plan S vision thanks to SCOAP3, says Salvatore Mele of CERN, who is one of SCOAP3’s architects. But for other disciplines “the road ahead is likely to be bumpy”. “Funders, libraries and publishers have cooperated through CERN to make SCOAP3 possible. As most of the tens of thousands of scholarly journals today operate on a different model, with access mostly limited to readers paying subscription fees, this vision implies systemic challenges for all players: funders, libraries, publishers and, crucially, the wider research community,” he says.

It is publishers who are likely to face the biggest impact from Plan S. However, the Open Access Scholarly Publishers Association (OASPA) – which includes, among others, the American Physical Society, IOP Publishing (which publishes CERN Courier) and The Royal Society – recently published a statement of support, claiming OASPA “would welcome the opportunity to provide guidance and recommendations for how the funding of open-access publications should be implemented within Plan S”, while emphasising that smaller publishers, scholarly societies and new publishing platforms need to be included in the decision-making process.

Responding to an EC request for Plan S feedback that was open until 8 February, however, publishers have expressed major concerns about the pace of implementation and about the consequences of Plan S for hybrid journals. In a statement on 12 February, the European Physical Society, while supportive of the Plan S rationale, wrote that “several of the governing principles proposed for its implementation are not conducive to a transition to open access that preserves the important assets of today’s scientific publication system”. In another statement, the world’s largest open-access publisher, Springer Nature, released a list of six recommendations for funding bodies worldwide to adopt in order for full open access to become a reality, highlighting the differences between “geographic, funder and disciplinary needs”. In parallel, a group of learned societies in mathematics and science in Germany reacted with a statement citing a “precipitous process” that infringes the freedom of science, and urged cOAlition S to “slow down and consider all stakeholders”.

Global growth

Smits thinks traditional publishers, which are a critical element in quality control and rigorous peer review in scholarly literature, should take a fresh look, for example by implementing more transparent metrics. “It is obvious that the big publishers that run the subscription journals and make enormous profits prefer to keep the current publishing model. Furthermore, the dream of each scientist is to publish in a so-called prestigious high-impact journal, which shows that the journal impact factor is still very present in the academic world,” says Smits. “To arrive at the necessary change in academic culture, new metrics need to be developed to assess scientific output. The big challenge for cOAlition S is to grow globally, by having more funders signing up.”

Undoubtedly we are at a turning point between the old and new publishing worlds. The EC already requires that all publications from projects receiving its funding be made open access. But Plan S goes further, proposing an outright shift in scholarly publication. It is therefore crucial to ensure a smooth shift that takes into account all the actors, says Mele. “Thanks to SCOAP3, which has so far supported the publication of more than 26,000 articles, the high-energy physics community is fortunate to meet the vision of Plan S, while retaining researcher choice of the most appropriate place to publish their results.” 

Preserving the legacy of particle physics

In the 17th century, Galileo Galilei looked at the moons of Jupiter through a telescope and recorded his observations in his now-famous notebooks. Galileo’s notes – his data – survive to this day and can be reviewed by anyone around the world. Students, amateurs and professionals can replicate Galileo’s data and results – a tenet of the scientific method.

In particle physics, with its unique and expensive experiments, it is practically impossible for others to attempt to reproduce the original work. When it is impractical to gather fresh data to replicate an analysis, we settle for reproducing the analysis with the originally obtained data. However, a 2013 study by researchers at the University of British Columbia, Canada, estimates that the odds of scientific data existing in an analysable form fall by about 17% each year.
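To put that estimate in perspective (an illustrative extrapolation of the quoted rate, not a result from the study itself), a 17% annual loss compounds quickly: the chance that a dataset remains analysable after n years is roughly

\[
P(n) \approx 0.83^{\,n}, \qquad P(10) \approx 0.16,
\]

so fewer than one in six datasets would survive in usable form for a decade.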

Indeed, just a few years down the line it might not even be possible for researchers to revisit their own data due to changes in formats, software or operating systems. This has led to growing calls for scientists to release and archive their data openly. One motivation is moral: society funds research and so should have access to all of its outputs. Another is practical: a fresh look at data could enable novel research and lead to discoveries that may have eluded earlier searches.

As with open-access publishing (see “A turning point for open-access publishing”), governments have started to impose demands on scientists regarding the availability and long-term preservation of research data. The European Commission, for example, has piloted the mandatory release of open data as part of its Horizon 2020 programme and plans to invest heavily in open data in the future. An increasing number of data repositories have been established for the life and medical sciences as well as for the social sciences and meteorology, and the idea is gaining traction across disciplines. Only days after announcing the first observation of gravitational waves, the LIGO and Virgo collaborations made their data public. NASA also releases data from many of its missions via open databases, such as exoplanet catalogues. The Natural History Museum in London makes data from millions of specimens available via a website and, in the world of art, the Rijksmuseum in Amsterdam provides an interface for developers to build apps featuring historic artworks.

Data levels

The open-data movement is of special interest to particle physics, owing to the uniqueness and large volume of the datasets involved in discoveries such as that of the Higgs boson at the Large Hadron Collider (LHC). The four main LHC experiments have started to release their data periodically in an open manner, and these data can be classified into four levels. The first consists of the data shown in final publications, such as plots and tables, while the second concerns datasets in a simplified format that are suitable for “lightweight” analyses in educational or similar contexts. The third level comprises the data used for analysis by the researchers themselves, requiring specialised code and dedicated computing resources, and the fourth and most complex level is the raw data generated by the detectors, which requires petabytes of storage and, being uncalibrated, is of little use until processed into the third level.

In late 2014 CERN launched an open-data portal and released research data from the LHC for the first time. The data, collected by the CMS experiment, represented half the level-three data recorded in 2010. The ALICE experiment has also released level-three data from proton–proton as well as lead–lead collisions, while all four collaborations – including ATLAS and LHCb – have released subsets of level-two data for education and outreach purposes.

Proactive policy

The story of open data at CMS goes back to 2011. “We started drafting an open-data policy, not because of pressure from funding agencies but because defining our own policy proactively meant we did not have an external body defining it for us,” explains Kati Lassila-Perini, who leads the collaboration’s data-preservation project. CMS aims to release half of each year’s level-three data three years after data taking, and 100% of the data within a ten-year window. By guaranteeing that people outside CMS can use these data, says Lassila-Perini, the collaboration can ensure that the knowledge of how to analyse the data is not lost, while allowing people outside CMS to look for things the collaboration might not have time for. To allow external re-use of the data, CMS released appropriate metadata as well as analysis examples. The datasets soon found takers and, in 2017, a group of theoretical physicists not affiliated with the collaboration published two papers using them. CMS has since released half its 2011 data (corresponding to around 200 TB) and half its 2012 data (1 PB), with the first releases of level-three data from the LHC’s Run 2 in the pipeline.

The LHC collaborations have been releasing simpler datasets for educational activities since as early as 2011, for example for the International Physics Masterclasses that involve thousands of high-school students around the globe each year. In addition, CMS has made available several Jupyter notebooks – a browser-based analysis platform named with a nod to Galileo – in assorted languages (programming and human) that allow anyone with an internet connection to perform a basic analysis. “The real impact of open data in terms of numbers of users is in schools,” says Lassila-Perini. “It makes it possible for young people with no previous contact with coding to learn about data analysis and maybe discover how fascinating it can be.” Also available from CMS are more complex examples aimed at university-level students.
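As a flavour of the kind of lightweight analysis such notebooks enable, the short sketch below histograms the invariant mass of muon pairs from a simplified dataset. It is a minimal illustration only: the file name and column names are hypothetical placeholders, not the actual CMS open-data format.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical educational dataset: one dimuon candidate per row, with the
# energy and momentum components of each muon (in GeV).
df = pd.read_csv("dimuon_candidates.csv")

# Invariant mass of the pair: m^2 = (E1 + E2)^2 - |p1 + p2|^2
E = df["E1"] + df["E2"]
px = df["px1"] + df["px2"]
py = df["py1"] + df["py2"]
pz = df["pz1"] + df["pz2"]
mass = np.sqrt(np.maximum(E**2 - (px**2 + py**2 + pz**2), 0.0))

# Histogram the spectrum: resonances such as the J/psi and the Z appear as peaks.
plt.hist(mass, bins=300, range=(0, 120), histtype="step")
plt.xlabel("Dimuon invariant mass [GeV]")
plt.ylabel("Candidates per bin")
plt.yscale("log")
plt.savefig("dimuon_mass.png")

Rediscovering peaks at known masses – the J/ψ near 3.1 GeV, the Z near 91 GeV – is exactly the kind of exercise used in the masterclasses.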

Open-data endeavours by ATLAS are very much focused on education, and the collaboration has provided curated datasets for teaching in places that may not have substantial computing resources or internet access. “Not even the documentation can rely on online content, so everything we produce needs to be self-contained,” remarks Arturo Sánchez Pineda, who coordinates ATLAS’s open-data programme. ATLAS datasets and analysis tools, which also rely on Jupyter notebooks, have been optimised to fit on a USB memory stick and allow simplified ATLAS analyses to be conducted just about anywhere in the world. In 2016, ATLAS released simplified open data corresponding to 1 fb⁻¹ at 8 TeV, with the aim of giving university students a feel for what a real particle-physics analysis involves.

ATLAS open data have already found their way into university theses and have been used by people outside the collaboration to develop their own educational tools. Indeed, within ATLAS, new members can now choose to work on preparing open data as their qualification task to become an ATLAS co-author, says Sánchez Pineda. This summer, ATLAS will release 10 fb⁻¹ of level-two data from Run 2, with more than 100 simulated physics processes and related resources. ATLAS does not provide level-three data openly, and researchers interested in analysing these can do so through a tailored association programme, which 80 people have taken advantage of so far. “This allows external scientists to rely on ATLAS software, computing and analysis expertise for their project,” says Sánchez Pineda.

Fundamental motivation

CERN’s open-data portal hosts and serves data from the four big LHC experiments, also providing many of the software tools, including virtual machines, needed to run the analysis code. The OPERA collaboration recently started sharing its research data via CERN, and other particle-physics collaborations are interested in joining the project.

Although high-energy physics has made great strides in providing open access to research publications, we are still in the very early days of open data. Theorist Jesse Thaler of MIT, who led the first independent analysis using CMS open data, acknowledges that it is possible for people to get their hands on coveted data by joining an experimental collaboration, but sees a much brighter future with open data. “What about more exploratory studies where the theory hasn’t yet been invented? What about engaging undergraduate students? What about examining old data for signs of new physics?” he asks. These provocative questions serve as fundamental motivations for making all data in high-energy physics as open as possible. 

CERN’s ultimate act of openness

At a mere 30 years old, the World Wide Web already ranks as one of humankind’s most disruptive inventions. Developed at CERN in the early 1990s, it has touched practically every facet of life, impacting industry, penetrating our personal lives and transforming the way we transact. At the same time, the web is shrinking continents and erasing borders, bringing with it an array of benefits and challenges as humanity adjusts to this new technology.

This reality is apparent to all. What is less well known, but deserves recognition, is the legal dimension of the web’s history. On 30 April 1993, CERN released a memo (see image) that placed into the public domain all of the web’s underlying software: the basic client, basic server and library of common code. The document was addressed “To whom it may concern” – which would suggest the authors were not entirely sure who the target audience was. Yet, with hindsight, this line can equally be interpreted as an unintended address to humanity at large.

The legal implication was that CERN relinquished all intellectual property rights in this software. It was a deliberate decision, the intention being that a no-strings-attached release of the software would “further compatibility, common practices, and standards in networking and computer supported collaboration” – arguably modest ambitions for what turned out to be such a seismic technological step. To understand what seeded this development you need to go back to the 1950s, at a time when “software” would have been better understood as referring to clothing rather than computing.

European project

CERN was born out of the wreckage of World War II, playing a role, on the one hand, as a mechanism for reconciliation between former belligerents, while, on the other, offering European nuclear physicists the opportunity to conduct their research locally. The hope was that this would stem the “brain drain” to the US, from a Europe still recovering from the devastating effects of war.

In 1953, CERN’s future Member States agreed on the text of the organisation’s founding Convention, defining its mission as providing “for collaboration among European States in nuclear research of a pure scientific and fundamental character”. With the public acutely aware of the role that destructive nuclear technology had played during the war, the Convention additionally stipulated that CERN was to have “no concern with work for military requirements” and that the results of its work were to be “published or otherwise made generally available”.

In the early years of CERN’s existence, the openness resulting from this requirement for transparency was essentially delivered through traditional channels, in particular through publication in scientific journals. Over time, this became the cultural norm at CERN, permeating all aspects of its work both internally and with its collaborating partners and society at large. CERN’s release of the WWW software into the public domain, arguably in itself a consequence of the openness requirement of the Convention, could be seen as a precursor to today’s web-based tools that represent further manifestations of CERN’s openness: the SCOAP3 publishing model, open-source software and hardware, and open data.

Perhaps the best measure of how ingrained openness is in CERN’s ethos as a laboratory is to ask the question: “if CERN had known then what it knows now about the impact of the World Wide Web, would it still have made the web software available, just as it did in 1993?” We would like to suggest that, yes, our culture of openness would provoke the same response now as it did then, though no doubt a modern, open-source licensing regime would be applied.

A culture of openness

This, in turn, can be viewed as testament and credit to the wisdom of CERN’s founders, and to the CERN Convention, which remains the cornerstone of our work to this day.
