A new measurement by the ALICE collaboration has demonstrated for the first time that jets become narrower after “quenching” in quark–gluon plasma (QGP). RHIC and LHC data show that the QGP behaves like a strongly coupled liquid with very low viscosity, but it is an open question how this behaviour arises from the asymptotic limit of weakly coupled quarks and gluons at short distance scales. The new results provide quantitative new insights into the hot and dense medium created in heavy-ion collisions and how it modifies the substructure of jets and dissipates part of their energy.
An important property of the QGP is its ability to “resolve” nearby partons as effectively independent colour charges above the medium’s characteristic resolution scale – a parameter that is very poorly predicted by theory, but thought to be in the vicinity of a femtometre or less. In recent years, jet quenching has been proposed as a way to determine this scale. Jets originate from a single quark or gluon that showers into more partons, either by radiating a gluon or by splitting into a quark–antiquark pair. When a jet moves through the medium, each individual splitting results in two distinct colour charges that, depending on their angular separation and the medium’s resolution length, can interact with the medium as one coherent object or as two independent charges. At the LHC, we can put our understanding of this resolution scale to the test using dedicated measurements of the angular structure of jets. This allows us to test whether wider jets are more likely to be resolved.
To identify the relevant two-prong splittings, ALICE “groomed” jets using track clustering. The algorithm reclusters and unwinds the jet shower to find the first parton splitting satisfying a grooming condition (figure 1). The excellent tracking resolution in ALICE allows for very precise measurements of jet substructure even at small angular distance scales. The angular width of the jet was found to be significantly modified in Pb–Pb compared to pp collisions (figure 2). In particular, wider splittings are suppressed in Pb–Pb compared to pp collisions, demonstrating that the interaction of jets with the QGP filters out wide jets.
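The grooming step can be illustrated schematically. The sketch below is a minimal, hypothetical Python rendering of a Soft Drop-style procedure – walking back through the reclustered shower until a splitting passes z > z_cut (ΔR/R0)^β – where the condition, parameter values and toy data structure are illustrative assumptions rather than ALICE’s actual code or settings.

```python
# Minimal sketch of a Soft Drop-style grooming pass (illustrative only;
# the condition, parameters and tree structure are assumptions, not the
# exact ALICE configuration).
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Branch:
    pt: float                              # transverse momentum of this prong (GeV)
    delta_r: float                         # angular separation of its two sub-prongs
    harder: Optional["Branch"] = None      # higher-pt sub-prong
    softer: Optional["Branch"] = None      # lower-pt sub-prong

def groom(jet: Branch, z_cut: float = 0.2, beta: float = 0.0,
          r0: float = 0.4) -> Optional[Tuple[float, float]]:
    """Unwind the reclustered shower and return (z, delta_r) of the first
    splitting with z > z_cut * (delta_r / r0)**beta, or None if none passes."""
    node = jet
    while node.harder and node.softer:
        z = node.softer.pt / (node.harder.pt + node.softer.pt)
        if z > z_cut * (node.delta_r / r0) ** beta:
            return z, node.delta_r         # the groomed two-prong splitting
        node = node.harder                 # drop the soft prong, follow the hard one
    return None

# Toy example: the first splitting is too asymmetric (z ≈ 0.05) and is dropped;
# the next one (z ≈ 0.33) passes the grooming condition.
sub = Branch(pt=60.0, delta_r=0.05,
             harder=Branch(pt=40.0, delta_r=0.0),
             softer=Branch(pt=20.0, delta_r=0.0))
jet = Branch(pt=63.0, delta_r=0.2, harder=sub, softer=Branch(pt=3.0, delta_r=0.0))
print(groom(jet))   # -> (0.333..., 0.05)
```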
This measurement is the first of its kind to be fully corrected for large background effects, allowing direct quantitative comparisons with theoretical calculations of jet quenching. Most theoretical models describe the general narrowing trend seen in the data, despite their different implementations of jet–medium interactions. The data are consistent with models implementing an incoherent interaction in which the medium resolves the splittings (Pablos, Lres = 0). Interestingly, however, another calculation reproduces this narrowing with a fully coherent interaction, in which the jet splittings are not resolved, but instead by modifying the initial quark and gluon fractions (Yuan, quark). While the precision of the data currently precludes an extraction of the medium’s resolving power within a given model, the measurement places quantitative constraints on medium properties, and demonstrates for the first time a direct modification of the angular structure of jets in heavy-ion collisions. This opens the door to increasingly precise measurements with the high-precision data anticipated in LHC Run 3.
The possibility that the proton wave function may contain a |uudcc̄⟩ component in addition to the g → cc̄ splitting arising from perturbative gluon radiation has been debated for decades. In favour of such “intrinsic charm” (IC), light-front QCD (LFQCD) calculations predict that non-perturbative IC manifests as percent-level valence-like charm content in the parton distribution functions (PDFs) of the proton. On the other hand, if the charm-quark content is entirely perturbative in nature, the charm PDF should resemble that of the gluon and decrease sharply at large momentum fractions, x. The proton could also contain intrinsic beauty, but suppressed by a factor of order m_c²/m_b². The picture for intrinsic strangeness is somewhat murkier due to the lighter mass of the strange quark.
Measurements of charm-hadron production in deep-inelastic scattering and in fixed-target experiments, with typical momentum transfers below Q = 10 GeV, have been interpreted as evidence both for and against the IC predicted by LFQCD. Even though such experiments are in principle sensitive to valence-like c-quark content, interpreting low-Q data is challenging since it requires a careful theoretical treatment of hadronic and nuclear effects. Recent global PDF analyses, which also include measurements by ATLAS, CMS and LHCb, are inconclusive and can only exclude a relatively large IC component carrying more than a few percent of the momentum of the proton.
Using its Run-2 data, LHCb recently studied IC by making the first measurement of the fraction of Z+jet events that contain a charm jet in the forward region of proton–proton collisions. Since Zc production is inherently at large Q, above the electroweak scale, hadronic effects are small. A leading-order Zc production mechanism is gc → Zc scattering (figure 1), where in the forward region one of the initial partons must have large x, hence Zc production probes the valence-like region.
The spectrum observed by LHCb exhibits a sizable enhancement at forward Z rapidities (figure 2), consistent with the effect expected if the proton wave function contains the |uudcc̄⟩ component predicted by LFQCD. Incorporating these results into global PDF analyses should strongly constrain the large-x charm PDF, both in size and shape – and could reveal that the proton contains valence-like intrinsic charm.
These results demonstrate the unique sensitivity of the LHCb experiment to the valence-like content of the proton. Looking forward to Run 3, increased luminosity will lead to a substantial improvement in the precision of this measurement, which should provide an even clearer picture of just how charming the proton is.
New ways to detect long-lived particles (LLPs) are opening up avenues for searching for physics beyond the Standard Model (SM). LLPs could provide evidence for a hidden dark sector of particles that includes dark-matter candidates and could be studied via “portal interactions” with the visible universe. By employing the CMS experiment’s muon spectrometer in a novel way, the collaboration has recently deployed a powerful new technique for detecting LLPs that decay between 6 and 10 metres from the primary interaction point.
An LLP decaying in the endcap muon spectrometer volume should produce a particle shower when its decay products interact with the return yoke of the CMS solenoid. The secondary particles produced by the shower would traverse the gaseous regions of the cathode-strip chamber (CSC) detector and produce a large multiplicity of signals on the anode wires and cathode strips. Localised hits are reconstructed by combining these signals, and are then grouped using a density-based clustering algorithm. This is the first time the CSC detectors have been used as a sampling calorimeter to detect and identify LLP decays.
Searching for CSC clusters with a sufficiently large number of hits suppresses background processes while maintaining a high efficiency for detecting potential LLP decays. The large amount of steel in the CMS return yoke nearly eliminates “punch-through” hadrons that are not fully stopped by the calorimeter, potentially mimicking the signature of an LLP. The largest remaining source of backgrounds is known LLPs produced by SM processes such as the neutral kaon, KL. These particles are copiously produced in LHC collisions and, on rare occasions, traverse the material without being stopped. Kaons are predominantly produced with much lower energies than the signal LLPs and therefore result in clusters with a smaller number of hits. Requiring clusters with more than 130 CSC hits suppresses these dominant background events to a negligible level (see figure 1).
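As an illustration of how such hit clusters might be formed and selected, the following is a short, hypothetical Python sketch using DBSCAN as the density-based clustering algorithm; the hit coordinates, clustering parameters and the choice of DBSCAN itself are assumptions for illustration, not the CMS implementation.

```python
# Hypothetical sketch: group reconstructed CSC hits with a density-based
# clustering algorithm (DBSCAN here) and keep clusters with many hits.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Toy hit positions (x, y, z) in cm: a dense "shower" plus scattered noise hits.
shower = rng.normal(loc=[120.0, 60.0, 750.0], scale=15.0, size=(200, 3))
noise = rng.uniform(low=[-400, -400, 600], high=[400, 400, 1000], size=(50, 3))
hits = np.vstack([shower, noise])

labels = DBSCAN(eps=30.0, min_samples=20).fit_predict(hits)   # -1 marks noise

for label in sorted(set(labels) - {-1}):
    n_hits = int(np.sum(labels == label))
    if n_hits > 130:                      # selection analogous to the cut in the text
        print(f"candidate LLP cluster {label}: {n_hits} hits")
```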
Using the full Run-2 dataset, the CMS collaboration detected no excess of particle-shower events above the expected backgrounds, setting constraints on a benchmark simplified model of scalar LLP production mediated by the Higgs boson (a so-called Higgs-portal model). This search improves on the previous best results by more than a factor of six (two) for an LLP mass of 7 GeV (≥ 15 GeV) for a proper decay length (cτ) of the scalar larger than 100 m. It is the first to be sensitive to LLP decays with cτ up to 1000 m and masses between 40 and 55 GeV at branching ratios of the Higgs boson to a pair of LLPs below 20%.
This novel approach to identifying showers in muon detectors opens up an exciting new programme of searches for LLPs in a wide variety of theoretical models. Potential frameworks range from Higgs-portal models to other portals to a dark sector, including neutrinos, axions and dark photons. The ongoing development of a dedicated Level-1 and High-Level Trigger focusing on particle showers detected in the CMS muon spectrometer promises an order-of-magnitude improvement in the discovery sensitivity for LLPs in the forthcoming run of the LHC.
Launched one year ago, the CERN Quantum Technology Initiative (QTI) will see high-energy physicists and others play their part in a global effort to bring about the next “quantum revolution”, whereby phenomena such as superposition and entanglement are exploited to build novel computing, communication, sensing and simulation devices (CERN Courier September/October 2020 p47).
On 14 October, the CERN QTI coordination team announced a strategy and roadmap to establish joint research, educational and training activities, set up a supporting resource infrastructure, and provide dedicated mechanisms for exchange of knowledge and technology. Oversight for the CERN QTI will be provided by a newly established advisory board composed of international experts nominated by CERN’s 23 Member States.
As an international, open and neutral platform, the roadmap document explains, CERN is uniquely positioned to act as an “honest broker” to facilitate cross-disciplinary discussions between CERN Member States and to foster innovative ideas in high-energy physics and beyond. This is underpinned by several R&D projects that are already under way at CERN across four main areas: quantum computing and algorithms; quantum theory and simulation; quantum sensing, metrology and materials; and quantum communication and networks. These projects target applications such as quantum-graph neural networks for track reconstruction, quantum support vector machines for particle classification, and quantum generative adversarial networks for physics simulation, as well as new sensors and materials for future detectors, and quantum-key-distribution protocols for distributed data analysis.
Education and training are also at the core of the CERN QTI. Building on the success of its first online course on quantum computing, the initiative plans to extend its academia–industry training programme to build competencies across different R&D and engineering activities for the new generation of scientists, from high-school students to senior researchers.
Co-chairs of the CERN QTI advisory board, Kerstin Borras and Yasser Omar, stated: “The road map builds on high-quality research projects already ongoing at CERN, with top-level collaborations, to advance a vision and concrete steps to explore the potential of quantum information science and technologies for high-energy physics”.
On 5 October, Syukuro Manabe (Princeton), Klaus Hasselmann (MPI for Meteorology) and Giorgio Parisi (Sapienza University of Rome) were announced as the winners of the 2021 Nobel Prize in Physics for their groundbreaking contributions to the understanding of complex physical systems, which provided rigorous scientific foundations to our understanding of Earth’s climate. Sharing half the 10 million Swedish kronor award, Manabe and Hasselmann were recognised “for the physical modelling of Earth’s climate, quantifying variability and reliably predicting global warming”. Parisi, who started out in high-energy physics, received the other half of the award “for the discovery of the interplay of disorder and fluctuations in physical systems from atomic to planetary scales”.
In the early 1960s, Manabe developed a radiative-convective model of the atmosphere and explored the role of greenhouse gases in maintaining and changing the atmosphere’s thermal structure. It was the beginning of a decades-long research programme on global warming that he undertook in collaboration with the Geophysical Fluid Dynamics Laboratory, NOAA. Hasselmann, who was founding director of the Max Planck Institute for Meteorology in Hamburg from 1975 to 1999, developed techniques that helped establish the link between anthropogenic CO2 emissions and rising global temperatures. He published a series of papers in the 1960s on non-linear interactions in ocean waves, in which he adapted Feynman-diagram formalism to classical random-wave fields.
Parisi, a founder of the study of complex systems, enabled the understanding and description of many different and apparently entirely random materials and phenomena in physics, biology and beyond, including the flocking of birds. Early in his career, he also made fundamental contributions to particle physics, the most well-known being the derivation, together with the late Guido Altarelli and others, of the “DGLAP” QCD evolution equations for parton densities. “My mentor Nicola Cabibbo was usually saying that we should work on a problem only if working on the problem is fun,” said Parisi following the announcement. “So I tried to work on something that was interesting and which I believed that had some capacity to add something.”
As was the case last year, the traditional December award ceremony will take place online due to COVID-19 restrictions.
The diffuse photon background that fills the universe does not limit itself to the attention-hogging cosmic microwave background, but spans a wide spectrum extending up to TeV energies. The origin of the photon emission at X-ray and gamma-ray wavelengths, first discovered in the 1970s, remains poorly understood. Many possible sources have been proposed, ranging from active galactic nuclei to dark-matter annihilation. Thanks to many years of gamma-ray data from the Fermi Large Area Telescope (Fermi-LAT), a group from Australia and Italy has now produced a model that links part of the diffuse emission to star-forming galaxies (SFGs).
As their name implies, SFGs are galaxies in which stars are formed, and therefore also die through supernova events. Such sources, which include our own Milky Way, have gained interest from gamma-ray astronomers during the past decade because several resolvable SFGs have been shown to emit in the 100 MeV to 1 TeV energy range. Given their preponderance, SFGs are thus a prime-suspect source of the diffuse gamma-ray background.
Clear correlation
The source of gamma rays within SFGs is very likely the interaction between cosmic rays and the interstellar medium (ISM). The cosmic rays, in turn, are thought to be accelerated within the shockwaves of supernova remnants, after which they interact with the ISM to produce a hadronic cascade. The cascade includes neutral pions, which decay into gamma rays. This connection between supernova remnants and gamma rays is strengthened by a clear correlation between the star-formation rate of a galaxy and the gamma-ray flux it emits. Additionally, such sources are theorised to be responsible for the neutrino emission detected by the IceCube observatory over the past few years, which also appears to be highly isotropic.
Based on additional SFG gamma-ray sources found by Fermi-LAT, which could be used for validation, the Australian/Italian group developed a physical model to study the contribution of SFGs to the cosmic diffuse gamma-ray background. The model used to predict the gamma-ray emission from galaxies starts with the spectra of charged cosmic rays produced in the numerous supernova remnants within a galaxy, and greatly benefits from data collected from several such remnants in the Milky Way. Subsequently, the production and energies of gamma rays from the interaction of these cosmic rays with the ISM are modelled, followed by the gamma-ray transport to Earth, which includes losses due to interactions with low-energy photons leading to pair production.
The main uncertainty in previous models was the efficiency with which a galaxy transforms the energy of cosmic rays into gamma rays, since it is not possible to use our own galaxy to measure it. The big breakthrough in the new work is a more thorough theoretical modelling of this efficiency, which was first tested extensively using data from resolved SFG sources. After such tests proved successful, the model could be applied to predict the gamma-ray emission properties of galaxies spanning the history of the universe. These predictions indicate that the low-energy part of the spectrum can be largely attributed to galaxies from the so-called cosmic noon: the period when star formation in large galaxies was at its peak, about 10 billion years ago. Nearby galaxies, on the other hand, explain the high-energy part of the spectrum: for older and more distant sources, the TeV emission is absorbed in the intergalactic medium through pair production with low-energy photons. Overall, the model predicts not only the spectral shape but also the overall flux (see “Good fit” figure), removing the need to invoke other possible sources such as active galactic nuclei or dark matter.
These new results once again indicate the importance of star-forming regions for astrophysics, which were also recently proposed as a possible source of PeV cosmic rays by LHAASO (CERN Courier July/August 2021 p11). Furthermore, the work shows the potential for an expansion to other astrophysical messengers, with the authors stating their ambition to apply the same model to radio emission and high-energy neutrinos.
Twenty-five years. That is the time we have from now to ensure a smooth transition between the LHC and the next major collider at CERN. Twenty-five years to approve a project, find the necessary funding, solve administrative problems and define a governance model; to dig a tunnel, equip it with a cutting-edge accelerator, and design and build experiments at the limits of technology.
One of the most memorable moments of my time as president of the CERN Council came on 19 June 2020, when delegates from CERN’s Member States adopted a resolution updating the European strategy for particle physics. The implementation of the European strategy recommendations is now in full swing, based around two major topics for CERN’s long-term future: the Future Circular Collider (FCC) feasibility study, with an organisational schema in place, and the elaboration of roadmaps for advanced accelerator and detector technologies. At the next strategy update towards the middle of the decade, we should be able to decide if the first phase of the FCC – an electron–positron collider operating at centre-of-mass energies from 91 GeV (the Z mass) to 365 GeV (above the tt̄ production threshold) – can be built, paving the way for a hadron collider with an energy of at least 100 TeV in the same tunnel. By then, we should also have a clearer picture of the potential of novel accelerator technologies, such as muon colliders or plasma acceleration.
Besides the purely technical questions, many other challenges lie ahead. It will be indispensable to attract major interregional partners to CERN’s next large project. Together with the scientific impact, the socioeconomic benefits of skills and technologies built through large research infrastructures are increasingly recognised, which makes a new collider an appealing prospect for states to participate in. But what collaboration model can we elaborate together that is fair and efficient? How can we build bridges to other projects currently discussed, such as the ILC? The US recently started its own “Snowmass” strategy process, which may also impact the decisions ahead.
Neither the implementation of the technology roadmaps nor the FCC feasibility study, and far less its construction, can be carried out by CERN alone. Without a tight network of collaboration and exchanges it will not be possible to find the brains, the hands and the financial resources to ensure that CERN continues to thrive in the long term. The collaboration and support from laboratories and institutes in CERN’s Member and Associate Member States and beyond are crucial. Can we imagine new ways to enhance and intensify the collaboration, to spread the quality and to share the savoir faire? Understanding where difficulties may lie merits continued effort.
For projects that reach far into the century, we will need the curiosity, creativity and motivation of young people entering our field. Efforts such as the recent ECFA early-career researcher survey are salutary. But are there other means through which we can broaden the freedom and creativity for young scientists within our highly organised collaborations? If there are silver linings to the pandemic, one is surely the increased accessibility to scientific discourse for a greater range of young and diverse researchers that our adaptation to virtual meetings has demonstrated.
Societal acceptance will also be crucial in convincing local communities to accept the impact of a new, big project. Developing environmentally friendly technologies is one factor, especially if we can contribute with innovative solutions. In this context, the launch in September 2020 of CERN’s first public environment report (with a second report about to be published) is timely. CERN’s new education and outreach centre, the Science Gateway, will also significantly increase the number of people who can visit and be inspired by CERN.
The enormous amount of work that has taken place during Long Shutdown 2 lays the foundation for the HL-LHC later this decade. However, beyond ensuring the success of this flagship programme, and that of CERN’s large and diverse portfolio of non-collider experiments, we must clearly and carefully explain the case for continued exploration at the energy frontier. To other scientists: we all benefit from mutual exchange and stimulation. To teachers and educators: we can contribute to making science fascinating and help attract young people into STEM subjects. To society: we can help increase scientific literacy, which is crucial for democracies to distinguish sense from, well, nonsense.
Twenty-five years is not long. And no matter our individual roles at CERN, we each have our work cut out. Together, we need to stand behind this unique laboratory, be proud of its past achievements, and embrace the changes necessary to build its – and our – future.
Having led the SKAO for almost a decade, how did it feel to get the green light for construction in June this year?
The project has been a long time in gestation and I have invested much of my professional life in the SKA project. When the day came, I was 95% confident that the SKAO council would give us the green light to proceed, as we were still going through ratification processes in national parliaments. I sent a message to my senior team saying: “This is the most momentous week of my career” because of the collective effort of so many people in the observatory and across the entire partnership over so many years. It was a great feeling, even if we couldn’t celebrate properly because of the pandemic.
What will the SKA telescopes do that previous radio telescopes couldn’t?
The game changer is the sheer size of the facility. Initially, we’re building 131,072 low-frequency antennas in Western Australia (“SKA-Low”) and 197 15 m-class dishes in South Africa (“SKA-Mid”). This will provide us with up to a factor of 10 improvement in our ability to see fainter details in the universe. The long-term SKA vision will increase the sensitivity by a further factor of 10. We’ve got many science areas, but two are going to be unique to us. One is the ability to detect hydrogen all the way back to the epoch of reionisation, also called the “cosmic-dawn”. The frequency range that we cover, combined with the large collecting area and the sensitivity of the two radio telescopes, will allow us to make a “movie” of the universe evolving from a few hundred million years after the Big Bang to the present day. We probably won’t see the first stars but will see the effect of the first stars, and we may see some of the first galaxies and black holes.
The second key science goal is the study of pulsars, especially millisecond pulsars, which emit radio pulses extremely regularly, giving astronomers superb natural clocks in the sky. The SKA will be able to detect every pulsar that can be detected on Earth (at least every pulsar that is pointing in our direction and within the ~70% of the sky visible by the SKA). Pulsars will be used as a proxy to detect and study gravitational waves from extreme phenomena. For instance, when there’s a massive galaxy merger that generates gravitational waves, we will be able to detect the passage of the waves through a change in the pulse arrival times. The SKA telescopes will be a natural extension of existing pulsar-timing arrays, and will be working as a network but also individually.
Another goal is to better understand the influence of dark matter on galaxies and how the universe evolves, and we will also be able to address questions regarding the nature of neutrinos through cosmological studies.
How big is the expected SKA dataset, and how will it be managed?
It depends where you look in the data stream, because the digital signal processing systems will be reducing the data volume as much as possible. Raw data coming out of SKA-Low will be 2 Pb per second – dramatically exceeding the entire internet data rate. That data goes from our fibre network into data processing, all on-site, with electronics heavily shielded to protect the telescopes from interference. Coming out from there, it’s about 5 Tb of data per second being transferred to supercomputing facilities off-site, which is pretty much equivalent to the output generated by SKA-Mid in South Africa. From that point the data will flow into supercomputers for on-the-fly calibration and data processing, emerging as “science-ready” data. It all flows into what we call the SKA Regional Centre network, basically supercomputers dotted around the globe, very much like that used in the Worldwide LHC Computing Grid. By piping the data out to a network of regional centres at a rate of 100 Gb per second, we are going to see around 350 Pb per year of science data from each telescope.
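As a rough, hedged sanity check of these figures (back-of-envelope arithmetic only, assuming continuous operation and reading “Pb” as petabytes), a sustained 100 Gb per second to the regional centres does indeed correspond to a few hundred petabytes per year:

```python
# Back-of-envelope check of the quoted data rates (assumptions: continuous
# operation over a full year, "Pb" interpreted as petabytes).
SECONDS_PER_YEAR = 3.156e7
rate_bits_per_s = 100e9                      # 100 Gb/s to the regional centres

bytes_per_year = rate_bits_per_s / 8 * SECONDS_PER_YEAR
print(f"{bytes_per_year / 1e15:.0f} PB per year")   # ≈ 395 PB, of the same order as
                                                    # the ~350 PB quoted once downtime
                                                    # is accounted for
```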
And you’ve been collaborating with CERN on the SKA data challenge?
Very much so. We signed a memorandum of understanding three years ago, essentially to learn how CERN distributes its data and how its processing systems work. There are things we were able to share too, as the SKA will have to process a larger amount of data than even the High-Luminosity LHC will produce. Recently we have entered into a further, broader collaboration with CERN, GÉANT and PRACE [the Partnership for Advanced Computing in Europe] to look at the collaborative use of supercomputer centres in Europe.
SKAO’s organisational model also appears to have much in common with CERN’s?
If you were to look at the text of our treaty you would see its antecedents in those of CERN and ESO (the European Southern Observatory). We are an intergovernmental organisation with a treaty and a convention signed in Rome in March 2019. Right now, we’ve got seven members who have ratified the convention, which was enough for us to kick-off the observatory, and we’ve got countries like France, Spain and Switzerland on the road to accession. Other countries like India, Sweden, Canada and Germany are also following their internal processes and we expect them to join the observatory as full members in the months to come; Japan and South Korea are observers on the SKAO council at this stage. Unlike CERN, we don’t link member contributions directly to gross domestic product (GDP) – one reason being the huge disparity in GDP amongst our member states. We looked at a number of models and none of them were satisfactory, so in the end we invented something that we use as a starting point for negotiation and that’s a proxy for the scientific capacity within countries. It’s actually the number of scientists that an individual country has who are members of the International Astronomical Union. For most of our members it correlates pretty well with GDP.
Is there a sufficient volume of contracts for industries across the participating nations?
Absolutely. The SKA antennas, dishes and front-ends are essentially evolutions of existing designs. It’s the digital hardware and especially the software where there are huge innovations with the SKA. We have started a contracting process with every country and they’re guaranteed to get at least 70% of their investment in the construction funds back. The SKAO budget for the first 10 years – which includes the construction of the telescopes, the salaries of observatory staff and the start of first operations – is €2 billion. The actual telescope itself costs around €1.2 billion.
Why did it take 30 years for the SKA project to be approved?
Back in the late 1980s/early 1990s, radio astronomers were looking ahead to the next big questions. The first mention of what we call the SKA was at a conference in Albuquerque, New Mexico, celebrating the 10th anniversary of the Very Large Array, which is still a state-of-the-art radio telescope. A colleague pulled together discussions and wrote a paper proposing the “Hydrogen Array”. It was clear we would need approximately one square kilometre of collecting area, which meant there had to be a lot of innovation in the telescopes to keep things affordable. A lot of the early design work was funded by the European Commission and we formed an international steering committee to coordinate the effort. But it wasn’t until 2011 that the SKA Organisation was formed, allowing us to go out and raise the money, put the organisational structure in place, confirm the locations, formalise the detailed design and then go and build the telescopes. There was a lot of exploration surrounding the details of the intergovernmental organisation – at one point we were discussing joining ESO.
Building the SKA 10 years earlier would have been extremely difficult, however. One reason is that we would have missed out on the big-data technology and innovation revolution. Another relates to the cost of power in these remote regions: SKA’s Western Australia site is 200 km from the nearest power grid, so we are powering things with photovoltaics and batteries, the cost of which has dropped dramatically in the past five years.
What are the key ingredients for the successful management of large science projects?
One has to have a diplomatic manner. We’ve got 16 countries involved all the way from China to Canada and in both hemispheres, and you have to work closely with colleagues and diverse people all the way up to ministerial level. Being sure the connections with the government are solid and having the right connections are key. We also put a lot of effort into conveying the societal impact of the SKA. Just as CERN invented the web, Wi-Fi came out of radio astronomy, as did a lot of medical imaging technology, and we have been working hard to identify future knowledge-transfer areas.
It also would have been much harder if I did not have a radio-astronomy background, because a lot of what I had to do in the early days was to rely on a network of radio-astronomy contacts around the world to sign up for the SKA and to lobby their governments. While I have no immediate plans to step aside, I think 10 or 12 years is a healthy period for a senior role. When the SKAO council begins the search for my successor, I do hope they recognise the need to have at least an astronomer, if not radio astronomer.
Finally, it is critical to have the right team, because projects like this are too large to keep in one person’s head. The team I have is the best I’ve ever worked with. It’s a fantastic effort to make all this a reality.
What are the long-term operational plans for the SKA?
The SKA is expected to operate for around 50 years, and our science case is built around this long-term aspiration. In our first phase, whose construction has started and should end in 2028/2029, we will have just under 200 dishes in South Africa, whereas we’d like to have potentially up to 2500 dishes there at the appropriate time. Similarly, in Western Australia we have a goal of up to a million low-frequency antennas, eight times the size of what we’re building now. Fifty years is somewhat arbitrary, and there are not yet any funded plans for such an expansion, but the dishes and antennas themselves will easily last for that time. The electronics are a different matter. That’s why the Lovell Telescope, which I can see outside my window here at SKAO HQ, is still an active science instrument after 65 years, because the electronics inside are state of the art. In terms of its collecting area, it is still the third largest steerable dish on Earth!
How do you see the future of big science more generally?
If there is a bright side to the COVID-19 pandemic, it has forced governments to recognise how critical science and expert knowledge are to survive, and hopefully that has translated into more realism regarding climate change for example. I look at science as an interlinked ecosystem: the hard sciences like physics build infrastructures designed to answer fundamental questions and produce technological impact, but they also train science graduates who enter other areas. The SKAO governments recognise the benefits of what South African colleagues call human capital development: that scientists and engineers who are inspired by and develop through these big projects will diffuse into industry and impact other areas of society. My experience of the senior civil servants that I have come across tells me that they understand this link.
Describing itself as a big-data graph-analytics start-up, gluoNNet seeks to bring data analysis from CERN into “real-life” applications. Just two years old, the 12-strong firm based in Geneva and London has already aided clients with decision-making by distilling publicly available datasets. With studies predicting that in three to four years almost 80% of data and analytics innovations may come from graph technologies, the team of physicists aims to be the “R&D department” for medium-sized companies and help them evaluate massive volumes of data in a matter of minutes.
gluoNNet co-founder and president Daniel Dobos, an honorary researcher at Lancaster University, first joined CERN in 2002, focusing on diamond and silicon detectors for the ATLAS experiment. A passion to share technology with a wider audience soon led him to collaborate with organisations and institutes outside the field. In 2016 he became head of foresight and futures for the United Nations-hosted Global Humanitarian Lab, which strives to bring up-to-date technology to countries across the world. He and co-founder and fellow ATLAS collaborator Karolos Potamianos, an Ernest Rutherford Fellow at the University of Oxford, have been collaborating on non-physics projects since 2014. An example is THE Port Association, which organises in-person and online events together with CERN IdeaSquare and other partners, including “humanitarian hackathons”.
gluoNNet was a natural next step to bring data analysis from high-energy physics into broader applications. It began as a non-profit, with most work being non-commercial and helping non-governmental organisations (NGOs). Working with UNICEF, for example, gluoNNet tracked countries’ financial transactions on fighting child violence to see if governments were standing by their commitments. “Our analysis even made one country – which was already one of the top donors – double their contribution, after being embarrassed by how little was actually being spent,” says Dobos.
But Dobos was quick to realise that for gluoNNet to become sustainable it had to incorporate, which it did in 2020. “We wanted to take on jobs that were more impactful, however they were also more expensive.” A second base was then added in the UK, which enabled more ambitious projects to be taken on.
Tracking flights
One project arose from an encounter at CERN IdeaSquare. The former head of security of a major European airline had visited CERN and noticed the particle-tracking technology as well as the international and collaborative environment; he believed something similar was needed in the aviation industry. During the visit a lively discussion about the similarities between data in aviation and particle tracking emerged. This person later became a part of the Civil Aviation Administration of Kazakhstan, which gluoNNet now works with to create a holistic overview of global air traffic (see image above). “We were looking for regulatory, safety and ecological misbehaviour, and trying to find out why some airplanes are spending more time in the air than they were expected to,” says Kristiane Novotny, a theoretical physicist who wrote her PhD thesis at CERN and is now a lead data scientist at gluoNNet. “If we can find out why, we can help reduce flight times, and therefore reduce carbon-dioxide emissions due to shorter flights.”
Using experience acquired at CERN in processing enormous amounts of data, gluoNNet’s data-mining and machine-learning algorithms benefit from the same attitude as that at CERN, explains Dobos. “CERN’s understanding of big data is different to others’. For some companies, what doesn’t fit in an Excel sheet is considered ‘big data’, whereas at CERN this is minuscule.” Therefore, it is no accident that most in the team are CERN alumni. “We need people who have the CERN spirit,” he states. “If you tell people at CERN that we want to get to Mars by tomorrow, they will get on and think about how to get there, rather than shutting down the idea.”
Though it’s still early days for gluoNNet, the team is undertaking R&D to take things to the next level. Working with CERN openlab and the Middle East Technical University’s Application and Research Center for Space and Accelerator Technologies, for example, gluoNNet is exploring the application of quantum-computing algorithms (namely quantum-graph neural networks) for particle-track reconstruction, as well as industrial applications, such as the analysis of aviation data. Another R&D effort, which originated at the Pan European Quantum Internet Hackathon 2019, aims to make use of quantum key distribution to achieve a secure VPN (virtual private network) connection.
One of gluoNNet’s main future projects is a platform that can provide an interconnected system for analysts and decision makers at companies. The platform would allow large amounts of data to be uploaded and presented clearly, with Dobos explaining, “Companies have meetings with data analysts back and forth for weeks on decisions; this could be a place that shortens these decisions to minutes. Large technology companies start to put these platforms in place, but they are out of reach for small and medium sized companies that can’t develop such frameworks internally.”
The vast amounts of data we have available today hold invaluable insights for governments, companies, NGOs and individuals, says Potamianos. “Most of the time only a fraction of the actual information is considered, missing out on relationships, dynamics and intricacies that data could reveal. With gluoNNet, we aim to help stakeholders that don’t have in-house expertise in advanced data processing and visualisation technologies to get insights from their data, making its complexity irrelevant to decision makers.”
Helmut Weber, CERN director of administration from 1992 to 1994, passed away on 16 July. Born in 1947, he obtained his PhD from the Technical University of Vienna, after which he pursued a rapidly rising career in the aerospace industry, where he acquired considerable managerial proficiency. Prior to joining CERN, Helmut had been chairman of the board of directors of Skyline Products (US) and a member of the board of directors of the ERC (France).
Helmut played a significant role during CERN’s transition from the LEP era to the LHC project. During his three-year appointment, as successor to Georges Vianès and predecessor to Maurice Robin, he was able to implement many necessary improvements to the CERN administration. Examples include the reorganisation of the finance division (split into procurement and accounting divisions) and the creation of a CERN-wide working group to standardise administrative procedures using a common online database. He also resolved a number of looming issues carried forward from the LEP era, such as the debt to the CERN Pension Fund and the financial claims made by the Euro–LEP consortium.
Furthermore, together with Meinhard Regler and the active support of CERN (including Kurt Hübner and Philip Bryant), Helmut promoted AUSTRON, a project proposal for a pulsed high-flux neutron spallation source as an international research centre for central Europe. Although this project could unfortunately not be realised due to lack of funding, the MedAustron facility for proton/ion therapy and research was eventually built as an alternative in Wiener Neustadt. It is now fully operational, serving as a successful example of technology transfer from elementary particle physics to medical applications.
Helmut Weber’s most important legacy is, however, his straightforward, uncompromising and honest character that helped to resolve many contentious internal issues at CERN. When he left the organisation, he had made many friends amongst his former colleagues, who will always remember him and miss him.