Graham Ross, a distinguished Scottish theorist who worked mainly on fundamental particle physics and its importance for the evolution of the universe, passed away suddenly on 31 October 2021.
Born in Aberdeen in 1944, Graham studied physics at the University of Aberdeen, where he met his future wife Ruth. In 1966 he moved to Durham University, where he worked with Alan Martin on traditional aspects of the strong interactions for his PhD. His first postdoctoral position began in 1969 at Rutherford Appleton Laboratory (RAL). This was around the time that interest in gauge theories began to flourish, and he and Alex Love were among the first to investigate their phenomenology. He continued working on this theme after he moved to CERN in 1974 for a two-year fellowship. Among the papers he wrote there was one in 1976 with John Ellis and Mary Gaillard suggesting how to discover the gluon in three-jet events due to “gluestrahlung” in electron–positron annihilation. This proposal formed the basis of the experimental discovery of the gluon a few years later at DESY.
After CERN, Graham worked for two years at Caltech, where he participated in a proof of the factorisation theorem that underlies the application of perturbative QCD to hard-scattering processes at the LHC. He then returned to the UK, to a consultancy at RAL held jointly with a post at the University of Oxford, where he was appointed lecturer in 1984. Here he applied his expertise on QCD in collaborations with Frank Close, Dick Roberts and also Bob Jaffe, showing how the evolution of valence quark distributions in heavy nuclei is in effect rescaled relative to what is observed in hydrogen and deuterium. This work hinted at an enhanced freedom of partons in dense nuclei.
In 1992 Graham became a professor at Oxford, where he remained for the rest of his career as a pillar of the theoretical particle-physics group, working on several deep questions and mentoring younger theorists. Among the many fundamental problems he worked on was the hierarchical ratio between the electroweak scale and the Planck or grand-unification scale, suggesting together with Luis Ibáñez that it might arise from radiative corrections in a supersymmetric theory. The pair also pioneered the calculation of the electroweak mixing angle in a supersymmetric grand unified theory, obtaining a result in excellent agreement with subsequent measurements at LEP. Graham wrote extensively on the hierarchy of masses of different matter particles, and the mixing pattern of their weak interactions, with Pierre Ramond in particular, and pioneered phenomenological string models of particles and their interactions. In recent years, Graham worked on models of inflation with Chris Hill, his Oxford colleagues and others.
Among his formal recognitions were his election as fellow of the Royal Society in 1991 and his award of the UK Institute of Physics Dirac Medal in 2012. The citation is an apt summary of Graham’s talents: “for theoretical work developing both the Standard Model of fundamental particles and forces, and theories beyond the Standard Model, that have led to new insights into the origins and nature of the universe”.
Graham had a remarkable ability to think outside the box, and to analyse new ideas critically and systematically. His work was characterised by a combination of deep thought, originality and careful analysis. He was never interested in theoretical speculation or mathematical developments for their own sakes, but as means towards the ultimate end of understanding nature.
Many theoretical physicists are competitive and pursue their ambitions aggressively. But this was not Graham’s way. Pursuing his ambitions with persistence and good humour, he was not only greatly admired as a talented physicist but also universally liked, particularly by the many younger physicists whom he mentored at Oxford. He was a great teacher and an inspiration, not just to his formal students but also to his daughters, Gilly and Emma, and latterly his grandchildren, James, Charlie and Wilfie.
Gennady Zinovjev, a prominent theorist in the field of quantum chromodynamics (QCD) and the physics of strongly-interacting matter, a pioneer in experimental studies of relativistic heavy-ion collisions and a leader of the Ukraine–CERN collaboration, passed away on 19 October 2021 at the age of 80. In a career spanning more than 50 years, Genna, as he was known to most of his friends, made important theoretical contributions to many different topics, ranging from analytical and perturbative QCD to phenomenology, and from hard probes and photons to hadrons and particle chemistry. His scientific activities were concentrated around experimental facilities at CERN and the Joint Institute for Nuclear Research (JINR), Dubna. He was one of the key initiators of the NICA complex at JINR, played a pivotal role in Ukraine becoming an Associate Member State of CERN in 2016 and was one of the founding members of the ALICE collaboration.
Born in 1941 in Birobidzhan in the Russian Far East, Zinovjev graduated in 1963 from Dnepropetrovsk State University, a branch of Moscow State University. From 1964 to 1967 he studied at the graduate school of the Laboratory of Theoretical Physics of JINR, after which he spent a year at the Institute of Mathematics and Computer Science of the Academy of Sciences of the Moldavian SSR in Kishinev (now Chișinău). He was awarded a PhD in physics and mathematics in 1975 at the Dubna Laboratory of Theoretical Physics and then joined the Kiev Institute for Theoretical Physics (both institutions now bear Bogolyubov’s name) of the National Academy of Sciences of Ukraine, first as a staff member and then, from 1986, as head of the department of high-energy-density physics. In 2006 he was awarded the Certificate of Honour of the Verkhovna Rada (Parliament) of Ukraine, and in 2008 the Davydov Prize of the National Academy of Sciences of Ukraine, becoming a member of the Academy in 2012.
In the mid-1990s Zinovjev initiated Ukraine’s participation in ALICE, and soon started to play a key role in the conception and construction of the Inner Tracking System (ITS), and more generally in the creation of both the ALICE experiment and the collaboration. Overcoming innumerable practical and bureaucratic obstacles, he identified technical and technological expertise within the Ukrainian academic and research environment, and then managed and led the development and fabrication of novel ultra-lightweight electrical substrates for vertex and tracking detectors. These developments, which took place at the Kharkiv Scientific Research Technological Institute of Instrument Engineering, resulted in technologies and components that formed the backbone of the ITS 1 and ITS 2 detectors. He was the deputy chair of the ALICE collaboration board from 2011 to 2013 and also served as a member of the ALICE management board during that time.
Genna was one of those rare people who are equally comfortable with theory, experiment, science, politics and human interactions. He was a passionate scientist, deeply committed to the Ukrainian scientific community. He did not hesitate to make great personal sacrifices to pursue what he considered important for science, his students and colleagues. Equally influential was his prominent role as a teacher and mentor for a steady stream of talent, both experimentalists and theorists. Many of us in the heavy-ion physics community owe him a great deal. We will always remember him for his charismatic personality, great kindness, openness and generosity.
When it was first published in 1984, James E Dodd’s The Ideas of Particle Physics used very little maths, but was full of clear and concise explanations – a strong contrast with the few other reference books that were available at the time. The first edition was written prior to the start of LEP, just after the discovery of the W and Z bosons. The fourth edition, published in 2021, brings it up to date while keeping its signature style.
At the time of my PhD, 30 years ago, Dodd’s book was revolutionary and helped me enormously. Over the years I have recommended it to countless students, to complement lectures and internet resources. But I had not looked at the updated versions until now. In keeping with the original, the new edition states explicitly that it is not a textbook: it contains no mathematical derivations, and no complicated formulae are written down. This is not at all to say that it is an easy read – it is not! But Dodd and Ben Gripaios, who joins the original author for this expanded fourth edition, convey the beauty of fundamental physics, and some of the phrases border on poetic: “Viewed picturesquely, it is as if the world of physical reality conducts itself while hovering over an unseen sea of negative-energy electrons.”
Some of the phrases border on poetic
The second half of the updated book follows on from where the first edition left off. Precision measurements at LEP and the discoveries of the gauge bosons and the top quark are all described with the same excitement and eye for beauty as the earlier discoveries. However, the LHC receives fewer words than the World Wide Web, the collider’s almost five-decades-long journey reduced to a couple of milestones. The hunt for the Higgs boson is also glossed over and fails to capture the excitement of the past couple of decades. More problematically, the description of the role the Higgs boson plays in spontaneous symmetry breaking is muddled.
The latter chapters redeem the text by detailing many of the theories that have arisen over the past 30 to 40 years, and how they may address the many remaining questions in fundamental physics. Indeed, while the first edition perhaps gave the impression that there was not much more to learn about the universe, the fourth edition shows how little we understand, and gives good pointers to where we may find answers.
As a tome on the evolutionary nature of particle physics, with concepts rather than mathematics at the forefront, The Ideas of Particle Physics remains an excellent book, predominantly aimed at graduate students, as a complement to courses and other reference works.
The inaugural CERN Flavour Anomalies Workshop took place on 20 October as part of this year’s Implications of LHCb Measurements and Future Prospects meeting. More than 500 experimentalists and theorists met in a hybrid format, via Zoom and in person. Discussion centred on the longstanding tensions in B-physics measurements, and on new project ideas. The workshop was dedicated to the memory of long-time LHCb collaborator Sheldon Stone (Syracuse), who made an enormous contribution to CERN’s flavour programme.
The central topic of the workshop was the b anomalies: a persistent set of tensions between predictions and measurements in a number of semileptonic b-decays. These do not show up as unexpected peaks in invariant-mass distributions; instead, they manifest themselves as modifications to the branching fractions and angular distributions of certain flavour-changing neutral-current (FCNC) b-decays, and they have become more significant over the past decade. The latest LHCb measurement of the ratio (RK) of B+ decays to a kaon and a muon or electron pair differs from the Standard Model (SM) by more than 3σ, and the ratio (RK*) of B0 decays to an excited kaon and a muon or electron pair differs by more than 2σ. LHCb has also seen several departures from theory in measurements of angular distributions at the level of roughly 3σ significance. Finally, and consistent with these FCNC effects, BaBar, Belle and LHCb analyses of charged-current b→cτ–ν̄ decays support lepton-flavour-universality (LFU) violation at a combined significance of roughly 3σ. Though no single measurement is statistically significant, the collective pattern is intriguing.
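How do several individually inconclusive tensions add up to a combined significance of roughly 3σ? The published combinations are full likelihood fits, but a minimal sketch of Stouffer’s method, assuming independent Gaussian measurements that pull in the same direction and using purely illustrative z-values (not the published ones), conveys the idea:

```python
from math import sqrt

# Illustrative z-scores (in sigma) for three hypothetical independent
# measurements of the same lepton-flavour-universality quantity
z_scores = [1.8, 2.0, 1.5]

# Stouffer combination: valid for independent Gaussian tensions
# that all deviate in the same direction
z_combined = sum(z_scores) / sqrt(len(z_scores))
print(f"combined significance: {z_combined:.1f} sigma")  # ~3.1 sigma
```

Three measurements, none decisive on its own, can thus cross the 3σ mark together.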
Four of the major fitting groups showed a stunning agreement in fits to effective-field-theory parameters
But how robust are the SM predictions for these observables? Efforts include both theory-only and data-driven approaches for distinguishing genuine signs of beyond-the-SM (BSM) effects from hard-to-understand hadronic effects. A further aim is to understand what type of BSM models could produce the observed effects. Of particular interest was the question of how to incorporate information from high-pT searches at the LHC experiments. ATLAS and CMS are ramping up their efforts, and their ongoing B-physics programmes will hopefully soon confirm and complement LHCb’s results. Both experiments reported on work to address the main bottlenecks: the reconstruction of low-momentum leptons, and trigger challenges foreseen as a result of increased luminosities in Run 3. The complementarity of B-physics and direct searches was clear from results such as ATLAS and CMS searches for leptoquarks compatible with the flavour anomalies.
Theory consensus
The workshop saw, for the first time, a joint theory presentation by four of the major b→sℓ+ℓ– fitting groups. They showed a stunning agreement in fits to effective-field-theory parameters which register as nonzero in the presence of BSM physics (see figure). The fits use observables that either probe LFU or help to constrain troublesome hadronic uncertainties. The observables include the now famous RK, RK* and RpK (which studies Λb0 baryon decays to a proton, a charged kaon and a pair of muons or electrons), whose measurements are dominated by LHCb results; and results on the branching fraction for Bs→μ+μ– from ATLAS, CMS and LHCb. Though the level of agreement diminishes when other observables and measurements are included, mainly due to the different theoretical assumptions made by the four groups, all agree that substantial tensions with the SM are unavoidable.
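As a rough illustration of what such a fit does, the sketch below scans a single effective coefficient C against a handful of pseudo-measurements. Every number in it is invented for illustration; the real fits combine hundreds of observables with correlated theoretical uncertainties:

```python
import numpy as np

# Toy one-parameter effective-field-theory fit. Three pseudo-measurements
# of an observable whose prediction shifts linearly with a BSM coefficient
# C (the SM corresponds to C = 0). All numbers are illustrative.
meas  = np.array([0.89, 0.82, 0.90])   # hypothetical measured values
sigma = np.array([0.06, 0.10, 0.12])   # their uncertainties

def chi2(C):
    prediction = 1.0 + 0.25 * C        # SM value 1.0 plus a linear BSM shift
    return np.sum(((meas - prediction) / sigma) ** 2)

C_grid = np.linspace(-3.0, 1.0, 4001)
chi2_vals = np.array([chi2(C) for C in C_grid])
C_best = C_grid[chi2_vals.argmin()]

# For one fitted parameter, sqrt(delta chi2) gives the pull away from the SM
z_sm = np.sqrt(chi2(0.0) - chi2_vals.min())
print(f"best-fit C = {C_best:.2f}; SM point disfavoured at {z_sm:.1f} sigma")
```

A coefficient fitted significantly away from zero is what “registers as nonzero in the presence of BSM physics”; agreement between groups means their best-fit regions overlap.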
New results from LHCb included first measurements of the LFU-sensitive ratios RK*+ (which concerns B+→K*+ℓ+ℓ– decays) and RKs (which concerns B0→KS0ℓ+ℓ– decays), and new measurements of branching fractions and angular observables for the decay Bs→ϕμ+μ–, which is at present hampered by significant theory uncertainties. By contrast, many theoretical predictions for b→cτ–ν̄ processes are now more precise than measurements, with the promise of further improvements thanks to dedicated lattice-QCD studies. Larger and more diverse datasets will be needed to reduce the experimental uncertainties.
As the end of the year approaches, it may not be too early to collect wishes for 2022. The most prevalent involve new analysis results from ATLAS, CMS and LHCb on these burning topics – and for the 2022 workshop to happen in person!
On 5 October, Syukuro Manabe (Princeton), Klaus Hasselmann (MPI for Meteorology) and Giorgio Parisi (Sapienza University of Rome) were announced as the winners of the 2021 Nobel Prize in Physics for their groundbreaking contributions to the understanding of complex physical systems, which provided rigorous scientific foundations to our understanding of Earth’s climate. Sharing half the 10 million Swedish kronor award, Manabe and Hasselmann were recognised “for the physical modelling of Earth’s climate, quantifying variability and reliably predicting global warming”. Parisi, who started out in high-energy physics, received the other half of the award “for the discovery of the interplay of disorder and fluctuations in physical systems from atomic to planetary scales”.
In the early 1960s, Manabe developed a radiative-convective model of the atmosphere and explored the role of greenhouse gases in maintaining and changing the atmosphere’s thermal structure. It was the beginning of a decades-long research programme on global warming that he undertook in collaboration with the Geophysical Fluid Dynamics Laboratory, NOAA. Hasselmann, who was founding director of the Max Planck Institute for Meteorology in Hamburg from 1975 to 1999, developed techniques that helped establish the link between anthropogenic CO2 emissions and rising global temperatures. He published a series of papers in the 1960s on non-linear interactions in ocean waves, in which he adapted Feynman-diagram formalism to classical random-wave fields.
Parisi, a founder of the study of complex systems, enabled the understanding and description of many different and apparently entirely random materials and phenomena in physics, biology and beyond, including the flocking of birds. Early in his career, he also made fundamental contributions to particle physics, the best known being the derivation, together with the late Guido Altarelli and others, of the “DGLAP” QCD evolution equations for parton densities. “My mentor Nicola Cabibbo was usually saying that we should work on a problem only if working on the problem is fun,” said Parisi following the announcement. “So I tried to work on something that was interesting and which I believed had some capacity to add something.”
As was the case last year, the traditional December award ceremony will take place online due to COVID-19 restrictions.
Twenty-five years. That is the time we have from now to ensure a smooth transition between the LHC and the next major collider at CERN. Twenty-five years to approve a project, find the necessary funding, solve administrative problems and define a governance model; to dig a tunnel, equip it with a cutting-edge accelerator, and design and build experiments at the limits of technology.
One of the most memorable moments of my time as president of the CERN Council came on 19 June 2020, when delegates from CERN’s Member States adopted a resolution updating the European strategy for particle physics. The implementation of the European strategy recommendations is now in full swing, based around two major topics for CERN’s long-term future: the Future Circular Collider (FCC) feasibility study, with an organisational schema in place, and the elaboration of roadmaps for advanced accelerator and detector technologies. At the next strategy update towards the middle of the decade, we should be able to decide if the first phase of the FCC – an electron–positron collider operating at centre-of-mass energies from 91 GeV (the Z mass) to 365 GeV (above the top-quark pair-production threshold) – can be built, paving the way for a hadron collider with an energy of at least 100 TeV in the same tunnel. By then, we should also have a clearer picture of the potential of novel accelerator technologies, such as muon colliders or plasma acceleration.
Besides the purely technical questions, many other challenges lie ahead. It will be indispensable to attract major interregional partners to CERN’s next large project. Together with the scientific impact, the socioeconomic benefits of skills and technologies built through large research infrastructures are increasingly recognised, which makes a new collider an appealing prospect for states to participate in. But what collaboration model can we elaborate together that is fair and efficient? How can we build bridges to other projects currently under discussion, such as the ILC? The US recently started its own “Snowmass” strategy process, which may also impact the decisions ahead.
Neither the implementation of the technology roadmaps nor the FCC feasibility study, and far less its construction, can be carried out by CERN alone. Without a tight network of collaboration and exchanges it will not be possible to find the brains, the hands and the financial resources to ensure that CERN continues to thrive in the long term. The collaboration and support of laboratories and institutes in CERN’s Member and Associate Member States and beyond are crucial. Can we imagine new ways to enhance and intensify collaboration, to spread the quality and to share the savoir faire? Understanding where difficulties may lie merits continued effort.
For projects that reach far into the century, we will need the curiosity, creativity and motivation of young people entering our field. Efforts such as the recent ECFA early-career researcher survey are salutary. But are there other means through which we can broaden the freedom and creativity for young scientists within our highly organised collaborations? If there are silver linings to the pandemic, one is surely the increased accessibility to scientific discourse for a greater range of young and diverse researchers that our adaptation to virtual meetings has demonstrated.
Societal acceptance will also be crucial in convincing local communities to welcome the impact of a big new project. Developing environmentally friendly technologies is one factor, especially if we can contribute innovative solutions. In this context, the launch in September 2020 of CERN’s first public environment report (with a second report about to be published) is timely. CERN’s new education and outreach centre, the Science Gateway, will also significantly increase the number of people who can visit and be inspired by CERN.
For projects that reach far into the century, we will need the curiosity, creativity and motivation of young people entering our field
The enormous amount of work that has taken place during Long Shutdown 2 lays the foundation for the HL-LHC later this decade. However, beyond ensuring the success of this flagship programme, and that of CERN’s large and diverse portfolio of non-collider experiments, we must clearly and carefully explain the case for continued exploration at the energy frontier. To other scientists: we all benefit from mutual exchange and stimulation. To teachers and educators: we can contribute to make science fascinating and help attract young people into STEM subjects. To society: we can help increase scientific literacy, which is crucial for democracies to distinguish sense from, well, nonsense.
Twenty-five years is not long. And no matter our individual roles at CERN, we each have our work cut out. Together, we need to stand behind this unique laboratory, be proud of its past achievements, and embrace the changes necessary to build its – and our – future.
Having led the SKAO for almost a decade, how did it feel to get the green light for construction in June this year?
The project has been a long time in gestation and I have invested much of my professional life in the SKA project. When the day came, I was 95% confident that the SKAO council would give us the green light to proceed, as we were still going through ratification processes in national parliaments. I sent a message to my senior team saying: “This is the most momentous week of my career” because of the collective effort of so many people in the observatory and across the entire partnership over so many years. It was a great feeling, even if we couldn’t celebrate properly because of the pandemic.
What will the SKA telescopes do that previous radio telescopes couldn’t?
The game changer is the sheer size of the facility. Initially, we’re building 131,072 low-frequency antennas in Western Australia (“SKA-Low”) and 197 15 m-class dishes in South Africa (“SKA-Mid”). This will provide us with up to a factor of 10 improvement in our ability to see fainter details in the universe. The long-term SKA vision will increase the sensitivity by a further factor of 10. We’ve got many science areas, but two are going to be unique to us. One is the ability to detect hydrogen all the way back to the epoch of reionisation, also called the “cosmic dawn”. The frequency range that we cover, combined with the large collecting area and the sensitivity of the two radio telescopes, will allow us to make a “movie” of the universe evolving from a few hundred million years after the Big Bang to the present day. We probably won’t see the first stars but will see the effect of the first stars, and we may see some of the first galaxies and black holes.
We put a lot of effort into conveying the societal impact of the SKA
The second key science goal is the study of pulsars, especially millisecond pulsars, which emit radio pulses extremely regularly, giving astronomers superb natural clocks in the sky. The SKA will be able to detect every pulsar that can be detected on Earth (at least every pulsar that is pointing in our direction and within the ~70% of the sky visible to the SKA). Pulsars will be used as a proxy to detect and study gravitational waves from extreme phenomena. For instance, when there’s a massive galaxy merger that generates gravitational waves, we will be able to detect the passage of the waves through a change in the pulse arrival times. The SKA telescopes will be a natural extension of existing pulsar-timing arrays, and will work as a network but also individually.
Another goal is to better understand the influence of dark matter on galaxies and how the universe evolves, and we will also be able to address questions regarding the nature of neutrinos through cosmological studies.
How big is the expected SKA dataset, and how will it be managed?
It depends on where you look in the data stream, because the digital signal-processing systems will be reducing the data volume as much as possible. Raw data coming out of SKA-Low will be 2 Pb per second – dramatically exceeding the entire internet data rate. That data goes from our fibre network into data processing, all on-site, with electronics heavily shielded to protect the telescopes from interference. Coming out from there, it’s about 5 Tb of data per second being transferred to supercomputing facilities off-site, which is pretty much equivalent to the output generated by SKA-Mid in South Africa. From that point the data will flow into supercomputers for on-the-fly calibration and data processing, emerging as “science-ready” data. It all flows into what we call the SKA Regional Centre network – basically supercomputers dotted around the globe, very much like the Worldwide LHC Computing Grid. By piping the data out to the network of regional centres at a rate of 100 Gb per second, we are going to see around 350 Pb per year of science data from each telescope.
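Those closing figures hang together at back-of-envelope level. The quick check below is our own arithmetic, not the interviewee’s, and assumes the 100 Gb/s link rate is quoted in bits while the ~350 Pb/year of science data is in bytes:

```python
# Back-of-envelope check of the quoted SKA rates (assumption: the
# 100 Gb/s link rate is in bits, the ~350 Pb/year volume in bytes)
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # ~3.16e7 s

link_rate_bits_per_s = 100e9               # 100 Gb/s to regional centres
bytes_per_year = link_rate_bits_per_s / 8 * SECONDS_PER_YEAR

print(f"{bytes_per_year / 1e15:.0f} PB/year at 100% duty cycle")
# -> ~394 PB/year, so ~350 PB/year implies a duty cycle of roughly 90%
```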
And you’ve been collaborating with CERN on the SKA data challenge?
Very much so. We signed a memorandum of understanding three years ago, essentially to learn how CERN distributes its data and how its processing systems work. There are things we were able to share too, as the SKA will have to process a larger amount of data than even the High-Luminosity LHC will produce. Recently we have entered into a further, broader collaboration with CERN, GÉANT and PRACE [the Partnership for Advanced Computing in Europe] to look at the collaborative use of supercomputer centres in Europe.
SKAO’s organisational model also appears to have much in common with CERN’s?
If you were to look at the text of our treaty you would see its antecedents in those of CERN and ESO (the European Southern Observatory). We are an intergovernmental organisation with a treaty and a convention signed in Rome in March 2019. Right now, we’ve got seven members who have ratified the convention, which was enough for us to kick off the observatory, and we’ve got countries like France, Spain and Switzerland on the road to accession. Other countries like India, Sweden, Canada and Germany are also following their internal processes and we expect them to join the observatory as full members in the months to come; Japan and South Korea are observers on the SKAO council at this stage. Unlike CERN, we don’t link member contributions directly to gross domestic product (GDP) – one reason being the huge disparity in GDP amongst our member states. We looked at a number of models and none of them were satisfactory, so in the end we invented something that we use as a starting point for negotiation and that’s a proxy for the scientific capacity within countries. It’s actually the number of scientists that an individual country has who are members of the International Astronomical Union. For most of our members it correlates pretty well with GDP.
Is there a sufficient volume of contracts for industries across the participating nations?
Absolutely. The SKA antennas, dishes and front-ends are essentially evolutions of existing designs. It’s the digital hardware and especially the software where there are huge innovations with the SKA. We have started a contracting process with every country and they’re guaranteed to get at least 70% of their investment in the construction funds back. The SKAO budget for the first 10 years – which includes the construction of the telescopes, the salaries of observatory staff and the start of first operations – is €2 billion. The actual telescope itself costs around €1.2 billion.
Why did it take 30 years for the SKA project to be approved?
Back in the late 1980s/early 1990s, radio astronomers were looking ahead to the next big questions. The first mention of what we call the SKA was at a conference in Albuquerque, New Mexico, celebrating the 10th anniversary of the Very Large Array, which is still a state-of-the-art radio telescope. A colleague pulled together discussions and wrote a paper proposing the “Hydrogen Array”. It was clear we would need approximately one square kilometre of collecting area, which meant there had to be a lot of innovation in the telescopes to keep things affordable. A lot of the early design work was funded by the European Commission and we formed an international steering committee to coordinate the effort. But it wasn’t until 2011 that the SKA Organisation was formed, allowing us to go out and raise the money, put the organisational structure in place, confirm the locations, formalise the detailed design and then go and build the telescopes. There was a lot of exploration surrounding the details of the intergovernmental organisation – at one point we were discussing joining ESO.
Building the SKA 10 years earlier would have been extremely difficult, however. One reason is that we would have missed out on the big-data technology and innovation revolution. Another relates to the cost of power in these remote regions: SKA’s Western Australia site is 200 km from the nearest power grid, so we are powering things with photovoltaics and batteries, the cost of which has dropped dramatically in the past five years.
What are the key ingredients for the successful management of large science projects?
One has to have a diplomatic manner. We’ve got 16 countries involved, all the way from China to Canada and in both hemispheres, and you have to work closely with colleagues and diverse people all the way up to ministerial level. Making sure the connections with governments are solid, and having the right contacts, is key. We also put a lot of effort into conveying the societal impact of the SKA. Just as CERN invented the web, Wi-Fi came out of radio astronomy, as did a lot of medical-imaging technology, and we have been working hard to identify future knowledge-transfer areas.
It also would have been much harder if I did not have a radio-astronomy background, because a lot of what I had to do in the early days was to rely on a network of radio-astronomy contacts around the world to sign up for the SKA and to lobby their governments. While I have no immediate plans to step aside, I think 10 or 12 years is a healthy period for a senior role. When the SKAO council begins the search for my successor, I do hope they recognise the need to have at least an astronomer, if not a radio astronomer.
I look at science as an interlinked ecosystem
Finally, it is critical to have the right team, because projects like this are too large to keep in one person’s head. The team I have is the best I’ve ever worked with. It’s a fantastic effort to make all this a reality.
What are the long-term operational plans for the SKA?
The SKA is expected to operate for around 50 years, and our science case is built around this long-term aspiration. In our first phase, whose construction has started and should end in 2028/2029, we will have just under 200 dishes in South Africa, whereas we’d like to have potentially up to 2500 dishes there at the appropriate time. Similarly, in Western Australia we have a goal of up to a million low-frequency antennas, eight times the size of what we’re building now. Fifty years is somewhat arbitrary, and there are not yet any funded plans for such an expansion, but the dishes and antennas themselves will easily last for that time. The electronics are a different matter: the Lovell Telescope, which I can see outside my window here at SKAO HQ, is still an active science instrument after 65 years because the electronics inside it are kept state of the art. In terms of its collecting area, it is still the third-largest steerable dish on Earth!
How do you see the future of big science more generally?
If there is a bright side to the COVID-19 pandemic, it is that it has forced governments to recognise how critical science and expert knowledge are to survival, and hopefully that has translated into more realism regarding climate change, for example. I look at science as an interlinked ecosystem: the hard sciences like physics build infrastructures designed to answer fundamental questions and produce technological impact, but they also train science graduates who enter other areas. The SKAO governments recognise the benefits of what South African colleagues call human capital development: scientists and engineers who are inspired by and develop through these big projects will diffuse into industry and impact other areas of society. My experience of the senior civil servants I have come across tells me that they understand this link.
Describing itself as a big-data graph-analytics start-up, gluoNNet seeks to bring data analysis from CERN into “real-life” applications. Just two years old, the 12-strong firm based in Geneva and London has already aided clients’ decision making by distilling publicly available datasets. With studies predicting that in three to four years almost 80% of data and analytics innovations may come from graph technologies, the team of physicists aims to be the “R&D department” for medium-sized companies and help them evaluate massive volumes of data in a matter of minutes.
gluoNNet co-founder and president Daniel Dobos, an honorary researcher at Lancaster University, first joined CERN in 2002, focusing on diamond and silicon detectors for the ATLAS experiment. A passion to share technology with a wider audience soon led him to collaborate with organisations and institutes outside the field. In 2016 he became head of foresight and futures for the United Nations-hosted Global Humanitarian Lab, which strives to bring up-to-date technology to countries across the world. He and co-founder Karolos Potamianos, a fellow ATLAS collaborator and Ernest Rutherford Fellow at the University of Oxford, have worked together on non-physics projects since 2014. An example is THE Port Association, which organises in-person and online events together with CERN IdeaSquare and other partners, including “humanitarian hackathons”.
CERN’s understanding of big data is different to others’
Daniel Dobos
gluoNNet was a natural next step to bring data analysis from high-energy physics into broader applications. It began as a non-profit, with most work being non-commercial and helping non-governmental organisations (NGOs). Working with UNICEF, for example, gluoNNet tracked countries’ financial transactions on fighting child violence to see if governments were standing by their commitments. “Our analysis even made one country – which was already one of the top donors – double their contribution, after being embarrassed by how little was actually being spent,” says Dobos.
But Dobos was quick to realise that for gluoNNet to become sustainable it had to incorporate, which it did in 2020. “We wanted to take on jobs that were more impactful; however, they were also more expensive.” A second base was then added in the UK, which enabled more ambitious projects to be taken on.
Tracking flights
One project arose from an encounter at CERN IdeaSquare. The former head of security of a major European airline had visited CERN and noticed the particle-tracking technology, as well as the international and collaborative environment; he believed something similar was needed in the aviation industry. During the visit a lively discussion emerged about the similarities between data in aviation and particle tracking. This person later became a part of the Civil Aviation Administration of Kazakhstan, which gluoNNet now works with to create a holistic overview of global air traffic (see image above). “We were looking for regulatory, safety and ecological misbehaviour, and trying to find out why some airplanes are spending more time in the air than they were expected to,” says Kristiane Novotny, a theoretical physicist who wrote her PhD thesis at CERN and is now a lead data scientist at gluoNNet. “If we can find out why, we can help reduce flight times, and therefore cut carbon-dioxide emissions.”
Using experience acquired at CERN in processing enormous amounts of data, gluoNNet’s data-mining and machine-learning algorithms benefit from the same attitude as that at CERN, explains Dobos. “CERN’s understanding of big data is different to others’. For some companies, what doesn’t fit in an Excel sheet is considered ‘big data’, whereas at CERN this is minuscule.” It is therefore no accident that most of the team are CERN alumni. “We need people who have the CERN spirit,” he states. “If you tell people at CERN that we want to get to Mars by tomorrow, they will get on and think about how to get there, rather than shutting down the idea.”
Though it’s still early days for gluoNNet, the team is undertaking R&D to take things to the next level. Working with CERN openlab and the Middle East Technical University’s Application and Research Center for Space and Accelerator Technologies, for example, gluoNNet is exploring the application of quantum-computing algorithms (namely quantum-graph neural networks) for particle-track reconstruction, as well as industrial applications, such as the analysis of aviation data. Another R&D effort, which originated at the Pan European Quantum Internet Hackathon 2019, aims to make use of quantum key distribution to achieve a secure VPN (virtual private network) connection.
One of gluoNNet’s main future projects is a platform that can provide an interconnected system for analysts and decision makers at companies. The platform would allow large amounts of data to be uploaded and presented clearly. “Companies have meetings with data analysts back and forth for weeks on decisions; this could be a place that shortens these decisions to minutes,” explains Dobos. “Large technology companies are starting to put these platforms in place, but they are out of reach for small and medium-sized companies that can’t develop such frameworks internally.”
The vast amounts of data we have available today hold invaluable insights for governments, companies, NGOs and individuals, says Potamianos. “Most of the time only a fraction of the actual information is considered, missing out on relationships, dynamics and intricacies that data could reveal. With gluoNNet, we aim to help stakeholders that don’t have in-house expertise in advanced data processing and visualisation technologies to get insights from their data, making its complexity irrelevant to decision makers.”
Helmut Weber, CERN director of administration from 1992 to 1994, passed away on 16 July. Born in 1947, he obtained his PhD from the Technical University of Vienna, after which he pursued a fast-rising career in the aerospace industry, where he acquired considerable managerial proficiency. Prior to joining CERN, Helmut had been chairman of the board of directors of Skyline Products (US), and a member of the board of directors of the ERC (France).
Helmut played a significant role during CERN’s transition from the LEP era to the LHC project. During his three-year appointment, as successor to Georges Vianès and predecessor to Maurice Robin, he was able to implement many necessary improvements to the CERN administration. Examples include the reorganisation of the finance division (split into procurement and accounting divisions) and the creation of a CERN-wide working group to standardise administrative procedures using a common online database. He also resolved a number of looming issues carried forward from the LEP era, such as the debt to the CERN Pension Fund and the financial claims made by the Euro–LEP consortium.
Furthermore, together with Meinhard Regler and with the active support of CERN (including Kurt Hübner and Philip Bryant), Helmut promoted AUSTRON, a project proposal for a pulsed high-flux neutron spallation source serving as an international research centre for central Europe. Although this project unfortunately could not be realised due to lack of funding, the MedAustron facility for proton/ion therapy and research was eventually built as an alternative in Wiener Neustadt. It is now fully operational, serving as a successful example of technology transfer from elementary particle physics to medical applications.
Helmut Weber’s most important legacy is, however, his straightforward, uncompromising and honest character that helped to resolve many contentious internal issues at CERN. When he left the organisation, he had made many friends amongst his former colleagues, who will always remember him and miss him.
Norwegian experimental particle physicist Egil Sigurd Lillestøl passed away in Valence, France, on 27 September. He will be remembered as a passionate colleague with exceptional communication and teaching skills, and a friend with many personal interests. He was able to explain the most complex systems and mechanisms in physics so that even the layperson felt they understood it.
Egil Lillestøl obtained his PhD from the University of Bergen in 1970, by which time he had already spent three years (1964–1967) as a fellow at CERN. He was appointed associate professor at his alma mater the same year, and then left for Paris in 1973 to be a guest researcher at the Collège de France. In 1984 Lillestøl was appointed full professor in experimental particle physics in Bergen, where he became a central figure in the PLUTO collaboration at DESY, and later in DELPHI and ATLAS at CERN.
Over time, CERN became Lillestøl’s main laboratory, first as a paid associate, later as a guest professor and eventually as a staff member, contributing to the management of the experimental programme and significantly improving the conditions for the visiting scientists at the laboratory.
In Norway he acted as national coordinator of CERN activities in preparation for the LHC. He was instrumental in the organisation of the community and discussions of future funding models at the national level, in particular to accommodate the long-term commitments needed for the ATLAS and ALICE construction projects.
Egil Lillestøl played a pivotal role in the CERN Schools of Physics from 1992 until 2009, relaunching the European School of High-Energy Physics as an annual event organised in collaboration with JINR, and establishing a new biennial series of schools in Latin America from 2001. He worked tirelessly on preparations for each event, in collaboration with local organisers in each host country, as well as on-site during the two-week-long events.
The Latin-American schools were an important element in increasing the involvement of scientists and institutes from the region in the CERN experimental programme, for which he deserves much credit. Beyond his official duties, he took great pleasure in interacting with the participants of the schools during their free time, and in the evenings he could often be found playing piano to accompany their singing.
As a founding member of the International Thorium Energy Committee, Lillestøl was a strong proponent of thorium-based nuclear power. He was also one of the main drivers behind the UNESCO-supported travelling exhibition “Science bringing nations together”, organised jointly by JINR and CERN.
As a teacher and a lecturer, Lillestøl was a role model. He always tailored his presentations to match the audience. His coffee-table book The Search for Infinity, co-authored with Gordon Fraser and Inge Sellevåg, became a bestseller and has been published in nine language editions.
Egil Lillestøl was a bon viveur who spread joy around him. He had an impressive repertoire of anecdotes, including topics such as how to cold-smoke salmon. He enjoyed sports and was active in the CERN clubs for cycling, skiing and sailing. He leaves behind his wife and former colleague, Danielle, and two adult children from his first marriage.