What would you do if you were thrust into a world where suddenly you lacked control over who you were? If you had no way to prove where you were from, who you were related to, or what you had accomplished? If you lost all your documentation in a natural disaster, or were forced to leave your home without taking anything with you? Without proof of identity, people are unable to access essential systems such as health, education and banking services, and they are exceedingly vulnerable to trafficking and incarceration. Having and owning your identity is an essential human right that too many people lack.
More than 68 million people worldwide have been displaced by war and conflict, and over 25 million have fled their countries, going from the designation of “citizen” to “refugee”. They are often prevented from working in their new countries, and, even if they are allowed to work, many nations will not let professional credentials, such as licences to practise law or medicine, follow these people across their borders. We end up stripping away fundamental human dignities and leaving enormous amounts of untapped potential on the table. Countries need to recognise not just the right to identity but also that identity must be portable across nation states.
The issue of sovereign identities extends much further than documentation. All over the world, individuals are being commodified by companies offering “free” services, because the actual products are the users and their data. Every individual should have the right to decide whether to monetise their data. But the speed, scale and stealth of such practices make it increasingly difficult to retain control of our data.
All of this is happening as we celebrate the 30th anniversary of the web. While there is no doubt that the web has been incredibly beneficial for humanity, it has also turned people into pawns and opened them up to new security risks. I believe not only that we can remedy these harms, but also that we have yet to harness more than a small fraction of the good that the web can do. Enter The Humanized Internet – a non-profit movement founded in 2017 that is working to use new technologies to give every human being secure, sovereign control over their own digital identity.
New technologies like blockchain, which allows digital information to be distributed but not copied, can help us tackle this issue. Blockchain differs from today’s databases in some key ways. First, it allows participants to see and verify all data involved, minimising the chances of fraud. Second, all data are verified and encrypted before being added to an individual block, in such a way that a hacker would need exponentially more computing power to break in than is required to attack today’s systems. These characteristics allow blockchain to provide public ledgers that participants trust on the basis of an agreed-upon consensus protocol. Once data transactions are on a block, they cannot be overwritten, and no central institution holds control, as these ledgers are visible to all the users connected to them. Users’ identities within a ledger are known only to the users themselves.
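To make the mechanism concrete, here is a minimal, illustrative Python sketch of a hash-chained ledger. The record fields and names are invented for illustration only; a production blockchain would add distributed consensus, digital signatures and encryption on top of this basic structure.

```python
import hashlib
import json
import time


def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically (sorted keys)."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


def add_block(chain: list, record: dict) -> None:
    """Append a block whose hash commits to the previous block."""
    prev = chain[-1]
    chain.append({
        "index": prev["index"] + 1,
        "timestamp": time.time(),
        "record": record,
        "prev_hash": block_hash(prev),
    })


def verify(chain: list) -> bool:
    """Recompute every link; editing an earlier block breaks the chain."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))


# Toy identity ledger with invented, purely illustrative records.
chain = [{"index": 0, "timestamp": 0.0, "record": {}, "prev_hash": ""}]
add_block(chain, {"holder": "user-123", "credential": "birth-record"})
add_block(chain, {"holder": "user-123", "credential": "diploma"})
print(verify(chain))                          # True
chain[1]["record"]["credential"] = "forged"
print(verify(chain))                          # False: tampering is detectable
```

Because each block commits to the hash of the one before it, rewriting any earlier entry invalidates every later link – which is why such ledgers cannot be quietly overwritten.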
The first implication of this technology is that it can help to establish a person’s citizenship in their state of origin and enable registration of official records. Without this, many people would be considered stateless and granted almost no rights or diplomatic protections. For refugees, digital identities also allow peer-to-peer donation and transparent public transactions. Additionally, digital identities create the ability to practise selective disclosure, whereby individuals can choose to share their records only at their own discretion.
We now need more people to get on board. We are already working with experts to discuss the potential of blockchain to improve inclusion in state-authenticated identity programmes and how to combat potential privacy challenges, in addition to e-voting systems that could allow inclusive participation in voting at all policy levels. We should all be the centre of our universe; our identity should be wholly and irrevocably our own.
There has never been a better time to be a physicist. The questions on the table today are not about this-or-that detail, but are profound ones about the very structure of the laws of nature. The ancients could (and did) wonder about the nature of space and time and the vastness of the cosmos, but the job of a professional scientist isn’t to gape in awe at grand, vague questions – it is to work on the next question. Having ploughed through the “easier” questions for four centuries, we are finally confronted by the very deep ones: what are space and time? What is the origin and fate of our enormous universe? We are extremely fortunate to live in the era when human beings first get to meaningfully attack these questions. I just wish I could adjust when I was born so that I could be starting as a grad student today! But not everybody shares my enthusiasm. There is cognitive dissonance. Some people are walking around with their heads hanging low, complaining about being disappointed or even depressed that we’ve “only discovered the Higgs and nothing else”.
So who is right?
It boils down to what you think particle physics is really about, and what motivates you to get into this business. One view is that particle physics is the study of the building blocks of matter, in which “new physics” means “new particles”. This is certainly the picture of the 1960s leading to the development of the Standard Model, but it’s not what drew me to the subject. To me, “particle physics” is the study of the fundamental laws of nature, governed by the still mysterious union of space–time and quantum mechanics. Indeed, from the deepest theoretical perspective, the very definition of what a particle is invokes both quantum mechanics and relativity in a crucial way. So if the biggest excitement for you is a cross-section plot with a huge bump in it, possibly with a ticket to Stockholm attached, then, after the discovery of the Higgs, it makes perfect sense to take your ball and go home, since we can make no guarantees of this sort whatsoever. We’re in this business for the long haul of decades and centuries, and if you don’t have the stomach for it, you’d better do something else with your life!
Isn’t the Standard Model a perfect example of the scientific method?
Sure, but part of the reason for the rapid progress in the 1960s is that the intellectual structure of relativity and quantum mechanics was already sitting there to be explored and filled in. But these more revolutionary discoveries took much longer, involving a wide range of theoretical and experimental results far beyond “bump plots”. So “new physics” is much more deeply about “new phenomena” and “new principles”. The discovery of the Higgs particle – especially with nothing else accompanying it so far – is unlike anything we have seen in any state of nature, and is profoundly “new physics” in this sense. The same is true of the other dramatic experimental discovery in the past few decades: that of the accelerating universe. Both discoveries are easily accommodated in our equations, but theoretical attempts to compute the vacuum energy and the scale of the Higgs mass pose gigantic, and perhaps interrelated, theoretical challenges. While we continue to scratch our heads as theorists, the most important path forward for experimentalists is completely clear: measure the hell out of these crazy phenomena! From many points of view, the Higgs is the most important actor in this story amenable to experimental study, so I just can’t stand all the talk of being disappointed by seeing nothing but the Higgs; it’s completely backwards. I find that the physicists who worry about not being able to convince politicians are (more or less secretly) not able to convince themselves that it is worth building the next collider. Fortunately, we do have a critical mass of fantastic young experimentalists who believe it is worth studying the Higgs to death, while also exploring whatever might be at the energy frontier, with no preconceptions about what they might find.
What makes the Higgs boson such a rich target for a future collider?
It is the first example we’ve seen of the simplest possible type of elementary particle. It has no spin, no charge, only mass, and this extreme simplicity makes it theoretically perplexing. There is a striking difference between massive and massless particles that have spin. For instance, a photon is a massless particle of spin one; because it moves at the speed of light, we can’t “catch up” with it, and so we only see it have two “polarisations”, or ways it can spin. By contrast the Z boson, which also has spin one, is massive; since you can catch up with it, you can see it spinning in any of three directions. This “two not equal to three” business is quite profound. As we collide particles at ever increasing energies, we might think that their masses are irrelevant tiny perturbations to their energies, but this is wrong, since something must account for the extra degrees of freedom.
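For reference, the textbook counting behind this “two not equal to three” statement can be written compactly (standard result for one-particle states):

```latex
% Number of spin states of a one-particle state with spin s:
\[
  N_{\text{massive}}(s) = 2s + 1, \qquad
  N_{\text{massless}}(s > 0) = 2 \quad (\text{helicities } \lambda = \pm s).
\]
% For s = 1: the massless photon has 2 polarisations, the massive Z boson has 3.
```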
The whole story of the Higgs is about accounting for this “two not equal to three” issue, to explain the extra spin states needed for massive W and Z particles mediating the weak interactions. And this also gives us a good understanding of why the masses of the elementary particles should be pegged to that of the Higgs. But the huge irony is that we don’t have any good understanding of what explains the mass of the Higgs itself. That’s because there is no difference in the number of degrees of freedom between massive and massless spin-zero particles, and, related to this, simple estimates for the Higgs mass from its interactions with virtual particles in the vacuum are wildly wrong. There are also good theoretical arguments, amply confirmed in analogous condensed-matter systems and elsewhere in particle physics, for why we shouldn’t have expected to see such a beast alone, unaccompanied by other particles. And yet here we are. Nature clearly has other ideas for what the Higgs is about than theorists do.
Is supersymmetry still a motivation for a new collider?
Nobody who is making the case for future colliders is invoking, as a driving motivation, supersymmetry, extra dimensions or any of the other ideas that have been developed over the past 40 years for physics beyond the Standard Model. Certainly many of the versions of these ideas that were popular in the 1980s and 1990s are either dead or on life support given the LHC data, but others proposed in the early 2000s are alive and well. The fact that the LHC has ruled out some of the most popular pictures is a fantastic gift to us as theorists. It shows that understanding the origin of the Higgs mass must involve an even larger paradigm change than many had previously imagined. Ironically, had the LHC discovered supersymmetric particles, the case for the next circular collider would be somewhat weaker than it is now, because that would (indirectly) support a picture of a desert between the electroweak and Planck scales. In this picture of the world, most people wanted a linear electron–positron collider to measure the superpartner couplings in detail. It’s a picture people very much loved in the 1990s, and a picture that appears to be wrong. Fine. But when theorists are more confused, it is time for more, not fewer, experiments.
What definitive answers will a future high-energy collider give us?
First and foremost, we go to high energies because it’s the frontier, and we look around for new things. While there is absolutely no guarantee we will produce new particles, we will definitely stress test our existing laws in the most extreme environments we have ever probed. Measuring the properties of the Higgs, however, is guaranteed to answer some burning questions. All the drama revolving around the existence of the Higgs would go away if we saw that it had substructure of any sort. But from the LHC, we have only a fuzzy picture of how point-like the Higgs is. A Higgs factory will decisively answer this question via precision measurements of the coupling of the Higgs to a slew of other particles in a very clean experimental environment. After that the ultimate question is whether or not the Higgs looks point-like even when interacting with itself. The simplest possible interaction between elementary particles is when three particles meet at a space–time point. But we have actually never seen any single elementary particle enjoy this simplest possible interaction. For good reasons going back to the basics of relativity and quantum mechanics, there is always some quantum number that must change in this interaction – either spin or charge quantum numbers change. The Higgs is the only known elementary particle allowed to have this most basic process as its dominant self-interaction. A 100 TeV collider producing billions of Higgs particles will not only detect the self-interaction, but will be able to measure it to an accuracy of a few per cent. Just thinking about the first-ever probe of this simplest possible interaction in nature gives me goosebumps.
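As a reminder of what “the Higgs interacting with itself” means in the Standard Model, after electroweak symmetry breaking the Higgs potential contains exactly such a cubic term. The expansion below is the standard textbook form; deviations from it are what a future collider would probe.

```latex
% Standard-Model Higgs potential expanded around the vacuum expectation value v:
\[
  V(h) = \tfrac{1}{2} m_h^2\, h^2 + \lambda v\, h^3 + \tfrac{1}{4}\lambda\, h^4,
  \qquad m_h^2 = 2\lambda v^2 .
\]
% The h^3 term is the trilinear self-coupling: three Higgs bosons meeting at a
% single space-time point, the interaction a 100 TeV collider could measure to
% an accuracy of a few per cent.
```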
What are the prospects for future dark-matter searches?
Beyond the measurements of the Higgs properties, there are all sorts of exciting signals of new particles that can be looked for at both Higgs factories and 100 TeV colliders. One I find especially important is WIMP dark matter. There is a funny perception, somewhat paralleling the absence of supersymmetry at the LHC, that the simple paradigm of WIMP dark matter has been ruled out by direct-detection experiments. Nope! In fact, the very simplest models of WIMP dark matter are perfectly alive and well. Once the electroweak quantum numbers of the dark-matter particles are specified, you can unambiguously compute what mass an electroweak-charged dark-matter particle should have so that its thermal relic abundance is correct. You get a number between 1 and 3 TeV, far too heavy to be produced in any sizeable numbers at the LHC. Furthermore, these particles happen to have minuscule interaction cross sections for direct detection. So these very simplest theories of WIMP dark matter are inaccessible to the LHC and direct-detection experiments. But a 100 TeV collider has just enough juice either to see these particles or to rule out this simplest WIMP picture.
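The logic behind that 1–3 TeV figure can be sketched with the standard thermal-relic relation. The numbers below are order-of-magnitude only; the precise mass depends on which electroweak multiplet the dark matter sits in.

```latex
% Thermal freeze-out: the relic abundance is inversely proportional to the
% annihilation cross section (second relation in natural units):
\[
  \Omega_\chi h^2 \approx 0.12 \,
  \frac{3\times10^{-26}\,\mathrm{cm^3\,s^{-1}}}{\langle\sigma v\rangle},
  \qquad
  \langle\sigma v\rangle \sim \frac{\pi \alpha_2^2}{m_\chi^2}.
\]
% Requiring the observed \Omega_\chi h^2 \approx 0.12 with an electroweak
% coupling \alpha_2 \approx 1/30 points to m_\chi of order 1 TeV, consistent
% with the 1-3 TeV range quoted above.
```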
What is the cultural value of a 100 km supercollider?
Both the depth and the visceral joy of experiments in particle physics are revealed in how simple it is to explain: we smash things together with the largest machines that have ever been built, to probe the fundamental laws of nature at the tiniest distances we’ve ever seen. But it goes beyond that to something more important about our self-conception as people capable of doing great things. The world has all kinds of long-term problems, some of which might seem impossible to solve. So it’s important to have a group of people who, over centuries, give a concrete template for how to go about grappling with and ultimately conquering seemingly impossible problems, driven by a calling far larger than themselves. Furthermore, suppose it’s 200 years from now, and there are no big colliders on the planet. How can humans be sure that the Higgs or top particles exist? Because it says so in dusty old books? There is an argument to be made that as we advance we should be able to do the things we did in the past. After all, the last time that fundamental knowledge was shoved in old dusty books was in the dark ages, and that didn’t go very well for the West.
What about justifying the cost of the next collider?
There are a number of projects and costs we could be talking about, but let’s call it $5–25 billion. Sounds like a lot, right? But the global economy is growing, not shrinking, and the cost of accelerators as a fraction of GDP has barely changed over the past 40 years – even a 100 TeV collider is in this same ballpark. Meanwhile the scientific issues at stake are more profound than they have been for many decades, so we certainly have an honest science case to make that we need to keep going.
People sometimes say that if we don’t spend billions of dollars on colliders, then we can do all sorts of other experiments instead. I am a huge fan of small-scale experiments, but this argument is silly because science funding is notoriously not a zero-sum game. So it’s not a question of “do we want to spend tens of billions on collider physics or something else instead”; it is rather “do we want to spend tens of billions on fundamental physics experiments at all”.
Another argument is that we should wait until some breakthrough in accelerator technology, rather than just building bigger machines. This is naïve. Of course miracles can always happen, but we can’t plan science around miracles. Similar arguments were made around the time of the cancellation of the Superconducting Super Collider (SSC) 30 years ago, with prominent condensed-matter physicists saying that the SSC should wait for the development of high-temperature superconductors that would dramatically lower the cost. Of course those dreamed-of practical superconductors never materialised, while particle physics went from strength to strength with the best technology available.
What do you make of claims that colliders are no longer productive?
It would be only to the good to have a no-holds-barred, public discussion about the pros and cons of future colliders, led by people with a deep understanding of the relevant technical and scientific issues. It’s funny that non-experts don’t even make the best arguments for not building colliders; I could do a much better job than they do! I can point you to an awesomely fierce debate about future colliders that already took place in China two years ago (Int. J. Mod. Phys. A 31 1630053 and 1630054). C N Yang, who is one of the greatest physicists of the 20th century and enormously influential in China, came out with a strong attack on colliders, not only in China but more broadly. I was delighted. Having a serious attack meant there could be a serious response, masterfully provided by David Gross. It was the King Kong vs Godzilla of fundamental physics, played out on the pages of major newspapers in China – fantastic!
What are you working on now?
About a decade ago, after a few years of thinking about the cosmology of “eternal inflation” in connection with solutions to the cosmological constant and hierarchy problems, I concluded that these mysteries can’t be understood without reconceptualising what space–time and quantum mechanics are really about. I decided to warm up by trying to understand the dynamics of particle scattering, like collisions at the LHC, from a new starting point, seeing space-time and quantum mechanics as being derived from more primitive notions. This has turned out to be a fascinating adventure, and we are seeing more and more examples of rather magical new mathematical structures, which surprisingly appear to underlie the physics of particle scattering in a wide variety of theories, some close to the real world. I am also turning my attention back to the goal that motivated the warm-up, trying to understand cosmology, as well as possible theories for the origin of the Higgs mass and cosmological constant, from this new point of view. In all my endeavours I continue to be driven, first and foremost, by the desire to connect deep theoretical ideas to experiments and the real world.
Since the advent of the Large Hadron Collider (LHC), CERN has been recognised as the world’s leading laboratory for experimental particle physics. More than 10,000 people work at CERN on a daily basis. The majority are members of universities and other institutions worldwide, and many are young students and postdocs. The experience of working at CERN therefore plays an important role in their careers, be it in high-energy physics or a different domain.
The value of education
In 2016 the CERN management appointed a study group to collect information about the careers of students who have completed their thesis studies in one of the four LHC experiments. Similar studies were carried out in the past, also including people working on the former LEP experiments, and were mainly based on questionnaires sent to the team leaders of the various collaborating institutes. The latest study collected a larger and more complete sample of up-to-date information from all the experiments, with particular attention to young physicists who have left the field. This allows a quantitative measurement of the value of the education and skills acquired at CERN for finding jobs in other domains, which is of prime importance in evaluating the impact and role of CERN’s culture.
Following an initial online questionnaire with 282 respondents, the results were presented to the CERN Council in December 2016. The experience demonstrated the potential for collecting information from a wider population and also for deepening and customising the questions. Consequently, it was decided to enlarge the study to all persons who have been or are still involved with CERN, without any particular restrictions. Two distinct communities were polled with separate questionnaires: past and current CERN users (mainly experimentalists at any stage of their career), and theorists who had collaborated with the CERN theory department. The questionnaires were open for a period of about four months and attracted 2692 and 167 participants from the experimental and theoretical communities, respectively. A total of 84 nationalities were represented, with German, Italian and US nationals making up around half, and the distribution of participants by experiment was: ATLAS (994); CMS (977); LHCb (268); ALICE (102); and “other” (87), which mainly included members of the NA62 collaboration.
The questionnaires addressed various professional and sociological aspects: age, nationality, education, domicile and working place, time spent at CERN, acquired expertise, current position, and satisfaction with the CERN environment. Additional points were specific to those who are no longer CERN users, in relation to their current situation and type of activity. The analysis revealed some interesting trends.
For experimentalists, the CERN environment and working experience are considered satisfactory or very satisfactory by 82% of participants, a figure that is evenly distributed across nationalities. Of the people who left high-energy physics, 70% did so mainly because of the long and uncertain path to obtaining a permanent position. Other reasons for leaving the field, although quoted by a lower percentage of participants, were: interest in other domains; lack of satisfaction at work; and family reasons. The majority of participants (63%) who left high-energy physics are currently working in the private sector, often in information technology, advanced technologies and finance, where they occupy a wide range of positions and responsibilities. Those in the public sector are mainly involved in academia or education.
For persons who left the field, several skills developed during their experience at CERN are considered important in their current work. The overall satisfaction of participants with their current position was high or very high for 78% of respondents, while 70% of respondents considered CERN’s impact on finding a job outside high-energy physics to be positive or very positive. CERN’s services and networks, however, were not found to be very effective in helping to find a new job – a situation that is being addressed, for example, by the recently launched CERN alumni programme.
Theorists participating in the second questionnaire mainly have permanent or tenure-track positions. A large majority of them spent time at CERN’s theory department with short- or medium-term contracts, and this experience seems to improve participants’ careers when leaving CERN for a national institution. On average, about 35% of a theorist’s scientific publications originate from collaborations started at CERN, and a large fraction of theorists (96%) declared that they are satisfied or highly satisfied with their experience at CERN.
Conclusions
As with all such surveys, there is an inherent risk of bias due to the formulation of the questions and the number and type of participants. In practice, only between 20 and 30% of the targeted populations responded, depending on the community addressed, which means the results of the poll cannot be considered representative of the whole CERN population. Nevertheless, it is clear that the impact of CERN on people’s careers is considered by a large majority of the people polled to be mostly positive, with some areas for improvement, such as training and supporting the careers of those who choose to leave CERN and high-energy physics.
In the future this study could be made more significant by collecting similar information on larger samples of people, especially former CERN users. In this respect, the CERN alumni programme could help build a continuously updated database of current and former CERN users and also provide more support for people who decide to leave high-energy physics.
The final results of the survey, mostly in terms of statistical plots, together with a detailed description of the methods used to collect and analyse all the data, have been documented in a CERN Yellow Report, and will also be made available through a dedicated web page.
As generations of particle colliders have come and gone, CERN’s fixed-target experiments have remained a backbone of the lab’s physics activities. Notable among them are those fed by the Super Proton Synchrotron (SPS). Throughout its long service to CERN’s accelerator complex, the 7 km-circumference SPS has provided a steady stream of high-energy proton beams to the North Area at the Prévessin site, feeding a wide variety of experiments. Sequentially named, they range from the pioneering NA1, which measured the photoproduction of vector and scalar bosons, to today’s NA64, which studies the dark sector. As the North Area marks 40 years since its first physics result, this hub of experiments large and small is as lively and productive as ever. Its users continue to drive developments in detector design, while reaping a rich harvest of fundamental physics results.
Specialised and precise
In fixed-target experiments, a particle beam collides with a target that is stationary in the laboratory frame, in most cases producing secondary particles for specific studies. High-energy machines like the SPS, which produces proton beams with a momentum of up to 450 GeV/c, give the secondary products a large forward boost, providing intense sources of secondary and tertiary particles such as electrons, muons and hadrons. Compared with collider experiments, fixed-target experiments tend to be more specialised and focus on precision measurements that demand very high statistics, such as those involving ultra-rare decays.
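A standard kinematic relation makes the trade-off explicit. It is sketched below in natural units, and the numerical example – 450 GeV protons striking protons at rest – is meant only as an order-of-magnitude illustration.

```latex
% Centre-of-mass energy: beam energy E on a fixed target of mass m_t,
% versus two colliding beams of energy E each (masses neglected):
\[
  \sqrt{s}_{\,\text{fixed target}} \simeq \sqrt{2 E\, m_t},
  \qquad
  \sqrt{s}_{\,\text{collider}} \simeq 2E .
\]
% Example: E = 450 GeV protons on protons at rest (m_t ~ 0.94 GeV) give
% sqrt(s) ~ 29 GeV, which is why colliders command the energy frontier while
% fixed-target beams excel in intensity.
```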
Fixed-target experiments have a long history at CERN, forming essential building blocks of the physics landscape in parallel with collider facilities. Among their achievements were the first studies of the quark–gluon plasma, the first evidence of direct CP violation and a detailed understanding of how nucleon spin arises from quarks and gluons. The first muons in CERN’s North Area were reported at the start of the commissioning run in March 1978, and the first physics publication – a measurement of the production rate of muon pairs by quark–antiquark annihilation, as predicted by Drell and Yan – was published in 1979 by the NA3 experiment. Today, the North Area’s physics programme is as vibrant as ever.
The longevity of the North Area programme is explained by the unique complex of proton accelerators at CERN, where each machine is not only used to inject the protons into the next one but also serves its own research programme (for example, the Proton Synchrotron Booster serves the ISOLDE facility, while the Proton Synchrotron serves the Antiproton Decelerator and the n_TOF experiment). Fixed-target experiments using protons from the SPS started taking data while the ISR collider was already in operation in the late 1970s, continued during SPS operation as a proton–antiproton collider in the early 1980s, and again during the LEP and now LHC eras. As has been the case with collider experiments, physics puzzles and unexpected results were often at the origin of unique collaborations and experiments, pushing limits in several technology areas such as the first use of silicon-microstrip detectors.
The initial experimental programme in the North Area involved two large experimental halls: EHN1 for hadronic studies and EHN2 for muon experiments. The first round of experiments in EHN1 concerned studies of: meson photoproduction (NA1); electromagnetic form factors of pions and kaons (NA7); hadronic production of particles with large transverse momentum (NA3); inelastic hadron scattering (NA5); and neutron scattering (NA6). In EHN2 there were experiments devoted to studies with high-intensity muon beams (NA2 and NA4). A third, underground, area called ECN3 was added in 1980 to host experiments requiring primary proton beams and secondary beams of the highest intensity (up to 1010 particles per cycle).
Experiments in the North Area started a little later than those in CERN’s West Area, which began operation in 1971 with 28 GeV/c protons supplied by the PS. Built to serve the last stage of the PS neutrino programme and the Omega spectrometer, the West Area was transformed into an SPS area in 1975 and is best known for seminal neutrino experiments (by the CDHS and CHARM collaborations, later CHORUS and NOMAD) and hadron-spectroscopy experiments with Omega. We are now used to identifying experimental collaborations by means of fancy acronyms such as ATLAS or ALICE, to mention two of the large LHC collaborations. But in the 1970s and 1980s, one could distinguish between the experiments (identified by a sequential number) and the collaborations (identified by the list of cities hosting the collaborating institutes). For instance, CDHS stood for the CERN–Dortmund–Heidelberg–Saclay collaboration, which operated the WA1 experiment in the West Area.
Los Alamos, SLAC, Fermilab and Brookhaven National Laboratory in the US, JINR and the Institute for High Energy Physics in Russia, and KEK in Japan, for example, all had fixed-target programmes too, some of which date back to the 1960s. As fixed-target programmes got into their stride, however, colliders were commanding the energy frontier. In 1980 the CERN North Area experimental programme was reviewed at a special meeting held in Cogne, Italy, and it was not completely obvious that there was a compelling physics case ahead. But the review also led to highly optimised installations, thanks to strong collaborations and continuous support from the CERN management. Advances in detectors and innovations such as silicon detectors and aerogel Cherenkov counters, plus the hybrid integration of bubble chambers with electronic detectors, led to a revival in the study of hadron interactions at fixed-target experiments, especially for charmed mesons.
Physics landscape
Experiments at CERN’s North Area began shortly after the Standard Model had been established, when the scale of experiments was smaller than it is today. According to the 1979 CERN annual report, there were 34 active experiments at the SPS (West and North areas combined) and 14 were completed in 1978. This article cannot do justice to all of them, not even to those in the North Area. But over the past 40 years the experimental programme has clearly evolved into at least four main themes: probing nucleon structure with high-energy muons; hadroproduction and photoproduction at high energy; CP violation in very rare decays; and heavy-ion experiments (see “Forty years of fixed-target physics at CERN’s North Area”).
Aside from seminal physics results, fixed-target experiments at the North Area have driven numerous detector innovations. This is largely a result of their simple geometry and ease of access, which allows more adventurous technical solutions than might be possible with collider experiments. Examples of detector technologies perfected at the North Area include: silicon microstrips and active targets (NA11, NA14); rapid-cycling bubble chambers (NA27); holographic bubble chambers (NA25); Cherenkov detectors (CEDAR, RICH); liquid-krypton calorimeters (NA48); micromegas gas detectors (COMPASS); silicon pixels with 100 ps time resolution (NA62); time-projection chambers with dE/dx measurement (ISIS, NA49); and many more. The sheer amount of data to be recorded in these experiments also led to the very early adoption of PC farms for the online systems of the NA48 and COMPASS experiments.
Another key function of the North Area has been to test and calibrate detectors. These range from the fixed-target experiments themselves to experiments at colliders (such as LHC, ILC and CLIC), space and balloon experiments, and bent-crystal applications (such as UA9 and NA63). New detector concepts such as dual-readout calorimetry (DREAM) and particle-flow calorimetry (CALICE) have also been developed and optimised. Recently the huge EHN1 hall was extended by 60 m to house two very large liquid-argon prototype detectors to be tested for the Deep Underground Neutrino Experiment under construction in the US.
If there is an overall theme concerning the development of the fixed-target programme in the North Area, one could say that it was to be able to quickly evolve and adapt to address the compelling questions of the day. This looks set to remain true, with many proposals for new experiments appearing on the horizon, ranging from the study of very rare decays and light dark matter to the study of QCD with hadron and heavy-ion beams. There is even a study under way to possibly extend the North Area with an additional very-high-intensity proton beam serving a beam dump facility. These initiatives are being investigated by the Physics Beyond Collider study (see p20), and many of the proposals explore the high-intensity frontier complementary to the high-energy frontier at large colliders. Here’s to the next 40 years of North Area physics!
Forty years of fixed-target physics at CERN's North Area
High-energy muons are excellent probes with which to investigate the structure of the nucleon. The North Area’s EHN2 hall was built to house two sets of muon experiments: the sequential NA2/NA9/NA28 (also known as the European Muon Collaboration, EMC), which made the observation that nucleons bound in nuclei are different from free nucleons; and NA4 (pictured), which confirmed interference effects between the weak and electromagnetic interactions. A particular success of the North Area’s muon experiments concerned the famous “proton spin crisis”. In the late 1980s, contrary to the expectation of the otherwise successful quark–parton model, data showed that the proton’s spin is not carried by the quark spins. This puzzle interested the community for decades, compelling CERN to investigate further by building the NA47 Spin Muon Collaboration experiment in the early 1990s (which established the same result for the neutron) and, subsequently, the COMPASS experiment (which studied the contribution of the gluon spins to the nucleon spin). A second phase of COMPASS, still ongoing today, is devoted to nucleon tomography using deeply virtual Compton scattering and, for the first time, polarised Drell–Yan reactions. Hadron spectroscopy is another area of research at the North Area, and among recent important results from COMPASS is the measurement of the pion polarisability, which is an important test of low-energy QCD.
Hadroproduction and photoproduction at high energy
Following the first experiment to publish data from the North Area (NA3), concerning the production of μ+μ– pairs from hadron collisions, the ingenuity of combining bubble chambers and electronic detectors led to a series of experiments. The European Hybrid Spectrometer facility housed NA13, NA16, NA22, NA23 and NA27, and studied charm production and many aspects of hadronic physics, while photoproduction of heavy bosons was the primary aim of NA1. A measurement of the charm lifetime using the first-ever microstrip silicon detectors was pioneered by the ACCMOR collaboration (NA11/NA32; see image of Robert Klanner next to the ACCMOR spectrometer in 1977), and hadron spectroscopy with neutral final states was studied by NA12 (GAMS), which employed a large array of lead-glass counters, in particular to search for glueballs. To study μ+μ– pairs from pion interactions at the highest possible intensities, the toroidal spectrometer NA10 was housed in the ECN3 underground cavern. Nearby in the same cavern, NA14 used a silicon active target and the first big microstrip silicon detectors (10,000 channels) to study charm photoproduction at high intensity. Later, the NA30 experiment enabled a direct measurement of the π0 lifetime by employing thin gold foils to convert the photons from the π0 decays. Today, electron beams are used by NA64 to look for dark photons, while hadron spectroscopy is still actively pursued, in particular at COMPASS.
The discovery of CP violation in the decay of the long-lived neutral kaon to two pions at Brookhaven National Laboratory in 1964 was unexpected. To understand its origin, physicists needed to make a subtle comparison (in the form of a double ratio) between the decays of long- and short-lived neutral kaons into pairs of neutral and of charged pions. In 1987 an ambitious experiment (NA31) showed a deviation of this double ratio from unity, providing the first evidence of direct CP violation (that is, CP violation occurring in the decay of the neutral mesons, not only in the mixing between neutral kaons). A second-generation experiment (NA48, pictured in 1996), located in ECN3 to accept a much higher primary-proton intensity, was able to measure the four decay modes concurrently thanks to the deflection of a tiny fraction of the primary proton beam onto a downstream target via channelling in a “bent” crystal. NA48 was approved in 1991, when it became evident that more precision was needed to confirm the original observation (a competing programme at Fermilab, called E731, did not find a significant deviation of the double ratio from unity). Both KTeV (the follow-up Fermilab experiment) and NA48 confirmed NA31’s results, firmly establishing direct CP violation. Continuations of the NA48 experiment studied rare decays of the short-lived neutral kaon and searched for direct CP violation in charged kaons. Nowadays the kaon programme continues with NA62, which is dedicated to the study of the very rare K+→π+νν̄ decay and is complementary to the B-meson studies performed by the LHCb experiment.
In the mid-1980s, with a view to reproducing in the laboratory the plasma of free quarks and gluons predicted by QCD and believed to have existed in the early universe, the SPS was modified to accelerate beams of heavy ions and collide them with nuclei. The lack of a single striking signature of the formation of the plasma demands that researchers look for as many final states as possible, exploiting the evolution of standard observables (such as the yield of muon pairs from the Drell–Yan process or the production rate of strange quarks) as a function of the degree of overlap of the nuclei that participate in the collision (centrality). By 2000 several experiments had, according to CERN Courier in March that year, found “tantalising glimpses of mechanisms that shaped our universe”. The experiments included NA44, NA45, NA49, NA50, NA52 and NA57, as well as WA97 and WA98 in the West Area. Among the most striking signatures observed was the suppression of the J/ψ yield in ion–nucleus collisions with respect to proton–proton collisions, which was seen by NA50. Improved sensitivity to muon pairs was provided by the successor experiment NA60. The current heavy-ion programme at the North Area includes NA61/SHINE (see image), the successor of NA49, which is studying the onset of phase transitions in dense quark–gluon matter at different beam energies and for different beam species. Studies of the quark–gluon plasma continue today, in particular at the LHC and at RHIC in the US. At the same time, NA61/SHINE is measuring the yield of mesons from replica targets for neutrino experiments worldwide, as well as particle production for cosmic-ray studies.
Of all the movements to make science and technology more open, the oldest is “open source” software. It was here that the “open” ideals were articulated, and from which all later movements such as open-access publishing derive. Whilst it rightly stands on this pedestal, from another point of view open-source software was simply the natural extension of academic freedom and knowledge-sharing into the digital age.
Open source has its roots in the free software movement, which grew in the 1980s in response to monopolising corporations and the restrictions imposed by proprietary software. The underlying ideal is open collaboration: peers freely, collectively and publicly build software solutions. A second ideal is recognition, in which credit for the contributions made by individuals and organisations worldwide is openly acknowledged. A third ideal concerns rights, specifically the so-called four freedoms granted to users: to use the software for any purpose; to study the source code to understand how it works; to share and redistribute the software; and to improve the software and share the improvements with the community. Users and developers therefore contribute to a virtuous circle in which software is continuously improved and shared towards a common good, minimising vendor lock-in for users.
Today, 20 years after the term “open source” was coined, and despite initial resistance from traditional software companies, many successful open-source business models exist. These mainly involve consultancy and support services for software released under an open-source licence and extend beyond science to suppliers of everyday tools such as the WordPress platform, Firefox browser and the Android operating system. A more recent and unfortunate business model adopted by some companies is “open core”, whereby essential features are deemed premium and sold as proprietary software on top of existing open-source components.
Founding principles
Open collaboration is one of CERN’s founding principles, so it was natural to extend the principle into its software. The web’s invention brought this into sharp focus. Having experienced first-hand its potential to connect physicists around the globe, in 1993 CERN released the web software into the public domain so that developers could collaborate and improve on it (see CERN’s ultimate act of openness). The following year, CERN released the next web-server version under an open-source licence with the explicit goal of preventing private companies from turning it into proprietary software. These were crucial steps in nurturing the universal adoption of the web as a way to share digital information, and exemplars of CERN’s best practice in open-source software.
Nowadays, open-source software can be found in pretty much every corner of CERN, as in other sciences and industry. Indico and Invenio – two of the largest open-source projects developed at CERN to promote open collaboration – rely on the open-source framework Python Flask. Experimental data are stored in CERN’s Exascale Open Storage system, and most of the servers in the CERN computing centre run on OpenStack – an open-source cloud infrastructure to which CERN is an active contributor. Of course, CERN also relies heavily on open-source GNU/Linux as both a server and desktop operating system. On the accelerator and physics-analysis side, it’s all about open source. From C2MON, a system at the heart of accelerator monitoring and data acquisition, to ROOT, the main data-analysis framework used to analyse experimental data, the vast majority of the software components behind the science done at CERN are released under an open-source licence.
Open hardware
The success of the open-source model for software has inspired CERN engineers to create an analogous “open hardware” licence, enabling electronics designers to collaborate and use, study, share and improve the designs of hardware components used for physics experiments. This approach has become popular in many sciences, and has become a lifeline for teaching and research in developing countries.
Being a scientist in the digital age means being a software producer and a software consumer. As a result, collaborative software-development platforms such as GitHub and GitLab have become as important to the physics department as they are to the IT department. Until recently, the software underlying an analysis has not been easily shared. CERN has therefore been developing research data-management tools to enable the publication of software and data, forming the basis of an open-data portal (see Preserving the legacy of particle physics). Naturally, this software itself is open source and has also been used to create the worldwide open-data service Zenodo, which is connected to GitHub to make the publication of open-source software a standard part of the research cycle.
Interestingly, as with the early days of open source, many corners of the scientific community are hesitant about open science. Some people are concerned that their software and data are not of sufficient quality or interest to be shared, or that they will be helping others reach the next discovery before they do. To win over the sceptics, open science can learn from the open-source movement, adopting standard licences, credit systems, collaborative development techniques and shared governance. In this way, it too will be able to reap the benefits of open collaboration: transparency, efficiency, perpetuity and flexibility.
The goal of practising science in such a way that others can collaborate, question and contribute – known as “open science” – long predates the web. One could even argue that it began with the first academic journal 350 years ago, which enabled scientists to share knowledge and resources to foster progress. But the web offered opportunities way beyond anything before it, quickly transforming academic publishing and giving rise to greater sharing in areas such as software. Alongside the open-source (Inspired by software), open-access (A turning point for open-access publishing) and open-data (Preserving the legacy of particle physics) movements grew the era of open science, which aims to encompass the scientific process as a whole.
Today, numerous research communities, political circles and funding bodies view open science and reproducible research as vital to accelerate future discoveries. Yet, to fully reap the benefits of open and reproducible research, it is necessary to start implementing tools to power a more profound change in the way we conduct and perceive research. This poses both sociological and technological challenges, starting from the conceptualisation of research projects, through conducting research, to how we ensure peer review and assess the results of projects and grants. New technologies have brought open science within our reach, and it is now up to scientific communities to agree on the extent to which they want to embrace this vision.
Particle physicists were among the first to embrace the open-science movement, sharing preprints and building a deep culture of using and sharing open-source software. The cost and complexity of experimental particle physics, which make complete replication of measurements unfeasible, present unique challenges in terms of open data and scientific reproducibility. It may even be argued that openness itself, in the sense of having unfettered access to data from its inception, is not particularly advantageous.
Take the existing data-management policies of the LHC collaborations: while physicists generally strive to be open in their research, the complexity of the data and analysis procedures means that data become publicly open only after a certain embargo period, which is used to assess their correctness. The science is thus born “closed”. Instead of thinking about “open data” from its inception, it is more useful to speak about FAIR (findable, accessible, interoperable and reusable) data, a term coined by the FORCE11 community. The data should be FAIR throughout the scientific process, from being initially closed to being made meaningfully open later to those outside the experimental collaborations.
True open science demands more than simply making data available: it needs to concern itself with providing information on how to repeat or verify an analysis performed over given datasets, producing results that can be reused by others for comparison, confirmation or simply for deeper understanding and inspiration. This requires runnable examples of how the research was performed, accompanied by software, documentation, runnable scripts, notebooks, workflows and compute environments. It is often too late to try to document research in such detail once it has been published.
Two FAIR data repositories for particle physics – the “closed” CERN Analysis Preservation portal and the “open” CERN Open Data portal – emerged five years ago to address the community’s open-science needs. These digital repositories enable physicists to preserve, document, organise and share datasets, code and tools used during analyses. A flexible metadata structure helps researchers to define everything from experimental configurations to data samples, and from analysis code to the software libraries and environments used to analyse the data, accompanied by documentation and links to presentations and publications. The result is a standard way to describe and document an analysis for the purposes of discoverability and reproducibility.
Recent advances in the IT industry allow us to encapsulate the compute environments in which an analysis was conducted. Capturing information about how the analysis was carried out can be achieved via a set of runnable scripts, notebooks, structured workflows and “containerised” pipelines. Complementary to the data repositories, a third service named REANA (reusable analyses) allows researchers to submit parameterised computational workflows to run on remote compute clouds. It can be used not only to reinterpret preserved analyses but also to run “active” analyses before they are published and preserved, with the underlying philosophy that physics analyses should be automated from inception so that they can be executed without manual intervention. Future reuse and reinterpretation starts with the first commit of the analysis code; altering an already-finished analysis to facilitate its eventual reuse after publication is often too late.
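As a purely illustrative sketch (not the actual REANA specification or API), the idea of a parameterised, containerised analysis step can be captured in a few lines of Python. The image, script and parameter names are hypothetical, and the example assumes Docker is available on the machine.

```python
import os
import subprocess

# Hypothetical parameters, container image and analysis script: illustration only.
parameters = {"events": 1000, "seed": 42}
image = "example/analysis-env:1.0"            # pinned software environment
command = "python fit.py --events {events} --seed {seed}".format(**parameters)

# Running the step inside a container makes the compute environment part of the
# preserved analysis, so the same step can be re-executed (or re-parameterised
# for a reinterpretation) long after publication.
subprocess.run(
    ["docker", "run", "--rm",
     "-v", f"{os.getcwd()}:/work", "-w", "/work",
     image, "bash", "-c", command],
    check=True,
)
```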
Full control
The key guiding principle of the analysis preservation and reuse framework is to leave the decision as to when a dataset or a complete analysis is shared, privately or publicly, in the hands of the researchers. This gives the experiment collaborations full control over the release procedures, and thus fully supports internal processing and review protocols before the results are published on community services, such as arXiv, HEPData and INSPIRE.
The CERN Open Data portal was launched in 2014 amid a discussion as to whether primary particle-physics data would find any use outside of the LHC collaborations. Within a few years, the first paper based on open data from the CMS experiment was published (see Preserving the legacy of particle physics).
Three decades after the web was born, science is being shared more openly than ever and particle physics is at the forefront of this movement. As we have seen, however, simple compliance with data and software openness is not enough: we also need to capture, from the start of the research process, runnable recipes, software environments, computational workflows and notebooks. The increasing demand from funding agencies and policymakers for open data-management plans, coupled with technological progress in information technology, leads us to believe that the time is ripe for this change.
Sharing research in an easily reproducible and reusable manner will facilitate knowledge transfer within and between research teams, accelerating the scientific process. This fills us with hope that three decades from now, even if future generations may not be able to run our current code on their futuristic hardware platforms, they will be at least well equipped to understand the processes behind today’s published research in sufficient detail to be able to check our results and potentially reveal something new.
High-energy physics (HEP) has been at the forefront of open-access publishing, the long-sought ideal to make scientific literature freely available. An early precursor to the open-access movement in the late 1960s was the database management system SPIRES (Stanford Physics Information Retrieval System), which aggregated all available (paper-copy) preprints that were sent between different institutions. SPIRES grew to become the first database accessible through the web in 1991 and later evolved into INSPIRE-HEP, hosted and managed by CERN in collaboration with other research laboratories.
The electronic era
The birth of the web in 1989 changed the publishing scene irreversibly. Vast sums were invested to take the industry from paper to online and to digitise old content, resulting in a migration from the sale of printed copies of journals to electronic subscriptions. From 1991, helped by early adoption among particle physicists, the self-archiving repository arXiv.org allowed rapid distribution of electronic preprints in physics and, later, mathematics, astronomy and other sciences. The first open-access journals then began to sprout up, and in the early 2000s three major international initiatives – the Budapest Open Access Initiative, the Bethesda Statement on Open Access Publishing and the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities – set about leveraging the new technology to grant universal free access to the results of scientific research.
Today, roughly one quarter of all scholarly literature in sciences and humanities is open access. In HEP, the figure is almost 90%. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3), a global partnership between libraries, national funding agencies and publishers of HEP journals, has played an important role in HEP’s success. Designed at CERN, SCOAP3 started operation in 2014 and removes subscription fees for journals and any expenses scientists might incur to publish their articles open access by paying publishers directly. Some 3000 institutions from 43 countries (figure 1) contribute financially according to their scientific output in the field, re-using funds previously spent on subscription fees for journals that are now open access.
“SCOAP3 has demonstrated how open access can increase the visibility of research and ease the dissemination of scientific results for the benefit of everyone,” says SCOAP3 operations manager Alex Kohls of CERN. “This initiative was made possible by a strong collaboration of the worldwide library community, researchers, as well as commercial and society publishers, and it can certainly serve as an inspiration for open access in other fields.”
Plan S
On 4 September 2018, a group of national funding agencies, the European Commission (EC) and the European Research Council – under the name “cOAlition S” – launched a radical initiative called Plan S. Its aim is to ensure that, by 2020, all scientific publications that result from research funded by public grants must be published in compliant open-access journals or platforms. Robert-Jan Smits, the EC’s open-access envoy and one of the architects of Plan S, cites SCOAP3 as an inspiration for the project and says that momentum for Plan S has been building for two decades. “During those years many declarations, such as the Budapest and Berlin ones, were adopted, calling for a rapid transition to full and immediate open access. Even the 28 science ministers of the European Union issued a joint statement in 2016 that open access to scientific publications should be a reality by 2020,” says Smits. “The current situation shows, however, that there is still a long way to go.”
Recently, China released position papers supporting the efforts of Plan S, which could mark a key moment for the project. But the reaction of scientists around the world has been mixed. An open letter published in September by biochemist Lynn Kamerlin of Uppsala University in Sweden, attracting more than 1600 signatures at the time of writing, argues that Plan S would severely limit researchers’ ability to publish in suitable high-quality journals, potentially splitting the global scientific community into two separate systems. Another open letter, published in November by biologist Michael Eisen of the University of California, Berkeley, and carrying around 2000 signatures, backs the principles of Plan S and supports its commitment “to continue working with funders, universities, research institutions and other stakeholders until we have created a stable, fair, effective and open system of scholarly communication.”
Challenges ahead
High-energy physics is already aligned to the Plan S vision thanks to SCOAP3, says Salvatore Mele of CERN, who is one of SCOAP3’s architects. But for other disciplines “the road ahead is likely to be bumpy”. “Funders, libraries and publishers have cooperated through CERN to make SCOAP3 possible. As most of the tens of thousands of scholarly journals today operate on a different model, with access mostly limited to readers paying subscription fees, this vision implies systemic challenges for all players: funders, libraries, publishers and, crucially, the wider research community,” he says.
It is publishers who are likely to face the biggest impact from Plan S. However, the Open Access Scholarly Publishers Association (OASPA) – which includes, among others, the American Physical Society, IOP Publishing (which publishes CERN Courier) and The Royal Society – recently published a statement of support, claiming OASPA “would welcome the opportunity to provide guidance and recommendations for how the funding of open-access publications should be implemented within Plan S”, while emphasising that smaller publishers, scholarly societies and new publishing platforms need to be included in the decision-making process.
Responding to an EC request for Plan S feedback that was open until 8 February, however, publishers have expressed major concerns about the pace of implementation and about the consequences of Plan S for hybrid journals. In a statement on 12 February, the European Physical Society, while supportive of the Plan S rationale, wrote that “several of the governing principles proposed for its implementation are not conducive to a transition to open access that preserves the important assets of today’s scientific publication system”. In another statement, the world’s largest open-access publisher, Springer Nature, released a list of six recommendations for funding bodies worldwide to adopt in order for full open access to become a reality, highlighting the differences between “geographic, funder and disciplinary needs”. In parallel, a group of learned societies in mathematics and science in Germany has reacted with a statement citing a “precipitous process” that infringes the freedom of science, and urged cOAlition S to “slow down and consider all stakeholders”.
Global growth
Smits thinks traditional publishers, which are a critical element in quality control and rigorous peer review in scholarly literature, should take a fresh approach, for example by implementing more transparent metrics. “It is obvious that the big publishers that run the subscription journals and make enormous profits prefer to keep the current publishing model. Furthermore, the dream of each scientist is to publish in a so-called prestigious high-impact journal, which shows that the journal impact factor is still very present in the academic world,” says Smits. “To arrive at the necessary change in academic culture, new metrics need to be developed to assess scientific output. The big challenge for cOAlition S is to grow globally, by having more funders signing up.”
Undoubtedly we are at a turning point between the old and new publishing worlds. The EC already requires that all publications from projects receiving its funding be made open access. But Plan S goes further, proposing an outright shift in scholarly publication. It is therefore crucial to ensure a smooth shift that takes into account all the actors, says Mele. “Thanks to SCOAP3, which has so far supported the publication of more than 26,000 articles, the high-energy physics community is fortunate to meet the vision of Plan S, while retaining researcher choice of the most appropriate place to publish their results.”
In the 17th century, Galileo Galilei looked at the moons of Jupiter through a telescope and recorded his observations in his now-famous notebooks. Galileo’s notes – his data – survive to this day and can be reviewed by anyone around the world. Students, amateurs and professionals can replicate Galileo’s data and results – a tenet of the scientific method.
In particle physics, with its unique and expensive experiments, it is practically impossible for others to attempt to reproduce the original work. When it is impractical to gather fresh data to replicate an analysis, we settle for reproducing the analysis with the originally obtained data. However, a 2013 study by researchers at the University of British Columbia, Canada, estimates that the odds of scientific data existing in an analysable form reduce by about 17% each year.
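To get a feel for what such a decay rate implies, a simple compounding calculation – our own back-of-the-envelope illustration, not part of the study – shows how quickly analysable data disappear:

```python
# Back-of-the-envelope illustration of the quoted decay rate.
# The 17% annual loss is the study's estimate; the assumption of
# simple compound decay is ours.
annual_retention = 1.0 - 0.17

for years in (5, 10, 20):
    surviving = annual_retention ** years
    print(f"after {years:2d} years: ~{surviving:.0%} of datasets remain analysable")

# Prints roughly 39% after 5 years, 16% after 10 and 2% after 20.
```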
Indeed, just a few years down the line it might not even be possible for researchers to revisit their own data due to changes in formats, software or operating systems. This has led to growing calls for scientists to release and archive their data openly. One motivation is moral: society funds research and so should have access to all of its outputs. Another is practical: a fresh look at data could enable novel research and lead to discoveries that may have eluded earlier searches.
As with open-access publishing (see A turning point for open-access publishing), governments have started to impose demands on scientists regarding the availability and long-term preservation of research data. The European Commission, for example, has piloted the mandatory release of open data as part of its Horizon 2020 programme and plans to invest heavily in open data in the future. An increasing number of data repositories have been established for life and medical sciences as well as for social sciences and meteorology, and the idea is gaining traction across disciplines. Only days after announcing the first observation of gravitational waves, the LIGO and Virgo collaborations made their data public. NASA also releases data from many of its missions via open databases, such as exoplanet catalogues. The Natural History Museum in London makes data from millions of specimens available via a website and, in the world of art, the Rijksmuseum in Amsterdam provides an interface for developers to build apps featuring historic artworks.
Data levels
The open-data movement is of special interest to particle physics, owing to the uniqueness and large volume of the datasets involved in discoveries such as that of the Higgs boson at the Large Hadron Collider (LHC). The four main LHC experiments have started to release their data periodically in an open manner, and these data can be classified into four levels. The first consists of the data shown in final publications, such as plots and tables, while the second concerns datasets in a simplified format that are suitable for “lightweight” analyses in educational or similar contexts. The third level comprises the data used for analysis by the researchers themselves, which require specialised code and dedicated computing resources. The fourth and most complex level is the raw data generated by the detectors, which requires petabytes of storage and, being uncalibrated, is of little use until processed into the third level.
In late 2014 CERN launched an open-data portal and released research data from the LHC for the first time. The data, collected by the CMS experiment, represented half the level-three data recorded in 2010. The ALICE experiment has also released level-three data from proton–proton as well as lead–lead collisions, while all four collaborations – including ATLAS and LHCb – have released subsets of level-two data for education and outreach purposes.
Proactive policy
The story of open data at CMS goes back to 2011. “We started drafting an open-data policy, not because of pressure from funding agencies but because defining our own policy proactively meant we did not have an external body defining it for us,” explains Kati Lassila-Perini, who leads the collaboration’s data-preservation project. CMS aims to release half of each year’s level-three data three years after data taking, and 100% of the data within a ten-year window. By guaranteeing that people outside CMS can use these data, says Lassila-Perini, the collaboration can ensure that the knowledge of how to analyse the data is not lost, while allowing people outside CMS to look for things the collaboration might not have time for. To allow external re-use of the data, CMS released appropriate metadata as well as analysis examples. The datasets soon found takers and, in 2017, a group of theoretical physicists not affiliated with the collaboration published two papers using them. CMS has since released half its 2011 data (corresponding to around 200 TB) and half its 2012 data (1 PB), with the first releases of level-three data from the LHC’s Run 2 in the pipeline.
The LHC collaborations have been releasing simpler datasets for educational activities from as early as 2011, for example for the International Physics Masterclasses that involve thousands of high-school students around the globe each year. In addition, CMS has made available several Jupyter notebooks – a browser-based analysis platform named with a nod to Galileo – in assorted languages (programming and human) that allow anyone with an internet connection to perform a basic analysis. “The real impact of open data in terms of numbers of users is in schools,” says Lassila-Perini. “It makes it possible for young people with no previous contact with coding to learn about data analysis and maybe discover how fascinating it can be.” Also available from CMS are more complex examples aimed at university-level students.
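As a rough illustration of what such a notebook-level exercise looks like, the sketch below histograms a dimuon invariant-mass spectrum in Python. The file name and column layout are hypothetical stand-ins rather than the actual CMS open-data schema, but the structure mirrors the kind of simplified, level-two analysis described above.

```python
# Illustrative sketch only: "dimuon_events.csv" and its columns are
# hypothetical, not the actual CMS open-data format.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# One dimuon candidate per row, with the four-momentum of each muon
events = pd.read_csv("dimuon_events.csv")

# Invariant mass: m^2 = (E1 + E2)^2 - |p1 + p2|^2
e_tot = events["E1"] + events["E2"]
px = events["px1"] + events["px2"]
py = events["py1"] + events["py2"]
pz = events["pz1"] + events["pz2"]
mass = np.sqrt(np.maximum(e_tot**2 - (px**2 + py**2 + pz**2), 0.0))

# Resonances such as the J/psi and the Z appear as peaks in the spectrum
plt.hist(mass, bins=200, range=(0, 120), log=True)
plt.xlabel("Dimuon invariant mass [GeV]")
plt.ylabel("Events")
plt.show()
```

In a classroom, a few lines like these can be re-run interactively to zoom in on individual resonances, which is precisely the kind of first contact with real data that Lassila-Perini describes.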
Open-data endeavours by ATLAS are very much focused on education, and the collaboration has provided curated datasets for teaching in places that may not have substantial computing resources or internet access. “Not even the documentation can rely on online content, so everything we produce needs to be self-contained,” remarks Arturo Sánchez Pineda, who coordinates ATLAS’s open-data programme. ATLAS datasets and analysis tools, which also rely on Jupyter notebooks, have been optimised to fit on a USB memory stick and allow simplified ATLAS analyses to be conducted just about anywhere in the world. In 2016, ATLAS released simplified open data corresponding to 1 fb⁻¹ at 8 TeV, with the aim of giving university students a feel for what a real particle-physics analysis involves.
ATLAS open data have already found their way into university theses and have been used by people outside the collaboration to develop their own educational tools. Indeed, within ATLAS, new members can now choose to work on preparing open data as their qualification task to become an ATLAS co-author, says Sánchez Pineda. This summer, ATLAS will release 10 fb⁻¹ of level-two data from Run 2, with more than 100 simulated physics processes and related resources. ATLAS does not provide level-three data openly and researchers interested in analysing these can do so through a tailored association programme, which 80 people have taken advantage of so far. “This allows external scientists to rely on ATLAS software, computing and analysis expertise for their project,” says Sánchez Pineda.
Fundamental motivation
CERN’s open-data portal hosts and serves data from the four big LHC experiments, also providing many of the software tools, including virtual machines, needed to run the analysis code. The OPERA collaboration recently started sharing its research data via CERN, and other particle-physics collaborations are interested in joining the project.
Although high-energy physics has made great strides in providing open access to research publications, we are still in the very early days of open data. Theorist Jesse Thaler of MIT, who led the first independent analysis using CMS open data, acknowledges that it is possible for people to get their hands on coveted data by joining an experimental collaboration, but sees a much brighter future with open data. “What about more exploratory studies where the theory hasn’t yet been invented? What about engaging undergraduate students? What about examining old data for signs of new physics?” he asks. These provocative questions serve as fundamental motivations for making all data in high-energy physics as open as possible.
At a mere 30 years old, the World Wide Web already ranks as one of humankind’s most disruptive inventions. Developed at CERN in the early 1990s, it has touched practically every facet of life, impacting industry, penetrating our personal lives and transforming the way we transact. At the same time, the web is shrinking continents and erasing borders, bringing with it an array of benefits and challenges as humanity adjusts to this new technology.
This reality is apparent to all. What is less well known, but deserves recognition, is the legal dimension of the web’s history. On 30 April 1993, CERN released a memo (see image) that placed into the public domain all of the web’s underlying software: the basic client, basic server and library of common code. The document was addressed “To whom it may concern” – which would suggest the authors were not entirely sure who the target audience was. Yet, with hindsight, this line can equally be interpreted as an unintended address to humanity at large.
The legal implication was that CERN relinquished all intellectual property rights in this software. It was a deliberate decision, the intention being that a no-strings-attached release of the software would “further compatibility, common practices, and standards in networking and computer supported collaboration” – arguably modest ambitions for what turned out to be such a seismic technological step. To understand what seeded this development you need to go back to the 1950s, at a time when “software” would have been better understood as referring to clothing rather than computing.
European project
CERN was born out of the wreckage of World War II, playing a role, on the one hand, as a mechanism for reconciliation between former belligerents, while, on the other, offering European nuclear physicists the opportunity to conduct their research locally. The hope was that this would stem the “brain drain” to the US, from a Europe still recovering from the devastating effects of war.
In 1953, CERN’s future Member States agreed on the text of the organisation’s founding Convention, defining its mission as providing “for collaboration among European States in nuclear research of a pure scientific and fundamental character”. With the public acutely aware of the role that destructive nuclear technology had played during the war, the Convention additionally stipulated that CERN was to have “no concern with work for military requirements” and that the results of its work were to be “published or otherwise made generally available”.
In the early years of CERN’s existence, the openness resulting from this requirement for transparency was essentially delivered through traditional channels, in particular through publication in scientific journals. Over time, this became the cultural norm at CERN, permeating all aspects of its work both internally and with its collaborating partners and society at large. CERN’s release of the WWW software into the public domain, arguably in itself a consequence of the openness requirement of the Convention, could be seen as a precursor to today’s web-based tools that represent further manifestations of CERN’s openness: the SCOAP3 publishing model, open-source software and hardware, and open data.
Perhaps the best measure of how ingrained openness is in CERN’s ethos as a laboratory is to ask the question: “had CERN known then what it knows now about the impact of the World Wide Web, would it still have made the web software available, just as it did in 1993?” We would like to suggest that, yes, our culture of openness would provoke the same response now as it did then, though no doubt a modern, open-source licensing regime would be applied.
A culture of openness
This, in turn, can be viewed as testament and credit to the wisdom of CERN’s founders, and to the CERN Convention, which remains the cornerstone of our work to this day.
The study of the production of quarkonia, the bound states of heavy quark–antiquark pairs, is an important goal of the ALICE physics programme. The quarkonium yield is suppressed in heavy-ion collisions when compared with proton–proton collisions because the binding force is screened by the hot and dense medium. This suppression is expected to be greatest for events with high “centrality”, when the heavy ions collide head-on.
The ALICE collaboration has recently analysed the suppression of inclusive bottomonium (bb̅) production in lead–lead collisions relative to proton–proton collisions. This reduction is quantified in terms of the nuclear modification factor RAA, which is the ratio of the measured yield in lead–lead collisions to that in proton–proton collisions, corrected by the number of binary nucleon–nucleon collisions. An RAA value of unity would indicate no suppression, whereas zero indicates full suppression. The bottomonium states ϒ(1S) and ϒ(2S) were measured via their decays to muon pairs at a centre-of-mass energy per nucleon–nucleon pair of 5.02 TeV, in the rapidity range 2.5 < y < 4, with a maximum transverse momentum of 15 GeV/c. No significant variation of RAA is observed as a function of transverse momentum or rapidity; however, production is suppressed with increasing centrality (figure 1). A decrease in RAA from 0.60±0.10(stat)±0.04(syst) for the peripheral 50–90% of collisions to 0.34±0.03(stat)±0.02(syst) for the 0–10% most central collisions was observed.
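Written out explicitly, the nuclear modification factor described above takes the schematic form below, where the yields N are the measured ϒ yields in lead–lead and proton–proton collisions and ⟨Ncoll⟩ is the average number of binary nucleon–nucleon collisions; the exact normalisation used in the ALICE analysis may differ in detail.

```latex
% Schematic definition of the nuclear modification factor
R_{\mathrm{AA}} \;=\; \frac{N_{\Upsilon}^{\mathrm{Pb\text{-}Pb}}}{\langle N_{\mathrm{coll}} \rangle \, N_{\Upsilon}^{pp}}
```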
Theoretical models must deal with the competing effects of melting and (re)generation of the ϒ within the quark–gluon plasma, the shadowing of parton densities in the initial state and “feed-down” from higher resonance states. Due to uncertainties on the parton densities, it is not yet known whether the direct production of ϒ(1S) is suppressed, or merely the feed-down from ϒ(2S) and other higher-mass states. Nevertheless, the precision of these measurements imposes significant new constraints on the modelling of ϒ production in lead–lead collisions.