

Why CLIC?

The proposed CLIC collider

There is an increasing consensus that the next large accelerator after the LHC should be an electron–positron collider. Several proposals are on the table, circular and linear. Around 75 collaborating institutes worldwide are involved in the CERN-hosted studies for the Compact Linear Collider (CLIC), which offers a long-term and flexible physics programme that is able to react to discoveries and technological developments.

The 11 km-long initial stage of CLIC is proposed for operation at a centre-of-mass energy of 380 GeV, providing a rich programme of precision Higgs-boson and top-quark measurements that reach well beyond the projections for the high-luminosity LHC. From a technical point of view, operation of the initial stage by around 2035 is possible, with a cost of approximately 5.9 billion Swiss francs. This is similar to the cost of the LHC and of the proposed International Linear Collider in Japan, and considerably less than that of future circular lepton colliders.

Extensions beyond the initial CLIC energy into the multi-TeV regime allow much improved precision on Standard Model (SM) measurements and greater reach for physics beyond the SM, with upgrade costs of 5 billion Swiss francs (to 1.5 TeV) and 7 billion Swiss francs (to 3 TeV) for the two further stages. A key part of the CLIC effort has been the physics and detector studies showing that beam-induced backgrounds can indeed be mitigated, compatible with the clean experimental conditions expected in electron–positron colliders.

Steinar Stapnes

The question of what follows after the initial stage of a linear electron–positron collider is premature to answer. It is crucial to choose the most flexible approach now, and to develop technically mature and affordable options, encouraging a broad and exciting R&D programme. The option of expanding CLIC from its initial phase is already built into its staging scheme. Novel acceleration technologies can potentially push linear colliders even further in energy, although significantly more work is needed on beam qualities and energy efficiency for such options. High-energy proton and muon colliders are also potential future directions that need to be developed. While for protons the challenges are related to magnet performance, collider size and costs, for muons the technical design concepts need to mature and the radiation and experimental conditions need to be better understood.

CLIC offers a unique combination of precision and energy reach, and has a long history dating back to around 1985. At that time, the LEP tunnel was under construction and the first LHC workshop had just taken place. The motivation then was to move well beyond the W- and Z-boson studies foreseen at LEP to search for and study the top quark, Higgs boson and possible supersymmetric particles in a mass range from hundreds of GeV to several TeV. After the top-quark discovery at Fermilab and Higgs-boson discovery at CERN, we know that CLIC can do exactly that – even though the search arena for new physics is much more open than considered at that time.

Successful formula

High-energy electron–positron collisions, together with proton–proton or proton–antiproton collisions, have been a successful formula for progress in particle physics for half a century. Increasing the energy and luminosity of such machines is challenging. CLIC’s drive-beam concept was instrumental in providing a credible and scalable powering option at multi-TeV energies. A cost optimisation combined with the practical need for radio-frequency (RF) power units for R&D and testing led to the present normal-conducting 12 GHz “X-band” accelerating structures with an accelerating gradient of up to 100 MV/m. In parallel, CLIC’s energy use at 380 GeV has been scrutinised to keep it well below CERN’s annual consumption today, and less than 50% of the estimate for a future circular electron–positron collider.

The question of what follows after the initial stage of a linear electron–positron collider is premature to answer

The next steps needed for CLIC are clear. The project-implementation plan foresees a five-year preparation phase prior to construction, which is envisaged to start by 2026. The preparation phase would focus on further design optimisation and technical and industrial development of critical parts of the accelerator. System verification in free-electron-laser linacs and low-emittance rings will be increasingly important for performance studies, while civil engineering and infrastructure preparation will become progressively more detailed, in parallel with an environmental impact study. Detector preparation will need to be scaled up, too.

The increasing use of X-band technology – either as the main RF acceleration for CLIC or for compact test facilities, light sources, medical accelerators or low-energy particle physics studies – provides new collaborative opportunities towards a technical design report for the CLIC accelerator.

It is for the broader particle-physics community and CERN to decide whether CLIC proceeds. We are therefore eagerly looking forward to the conclusion of the European strategy process next year. For now, it is important to communicate how CLIC-380 can be implemented rapidly involving many collaborative partners, and at the same time provide unique and timely opportunities for R&D to keep future options open.

The consolations of physics

The Consolations of Physics

As someone who lives and breathes physics every day, I have to confess that when I curl up with a book, it’s rarely a popularisation of science. But when I saw that Tim Radford had written such a book, and that it was all about how physics can make you happy, it went straight to the top of my reading list.

Despite Radford’s refusal to be pigeonholed as a science journalist, insisting instead that a good journalist moves from beat to beat, never colonising any individual space, he was science correspondent for The Guardian for a quarter of a century. Now retired, he remains one of the most respected science writers around.

The book is a joy to read. More a celebration of human curiosity than a popular science book, it’s an antidote to the kind of narrow populism so prevalent in popular discourse today: a timely reminder of what we humans are capable of when we put differences aside and work together to achieve common goals.

Boethius, who took consolation in philosophy as he languished in a sixth-century jail, is another recurring presence

The Voyager mission, along with LIGO and the LHC, serves as a guiding thread through Radford’s vast and winding exploration of human curiosity. Right from the opening lines, the reader is taken on a breathtaking tour of the full spectrum of human inventiveness, from science to religion, and from art to philosophy. On the way, we encounter thinkers as diverse as St Augustine, Dante and H G Wells. Boethius, who took consolation in philosophy as he languished in a sixth-century jail, is another recurring presence, the book’s title being a nod to him.

We’re treated to a concise and clear consideration of the roles of science and religion in human societies. “Religious devotion demands unquestioning faith,” says Radford, whereas “science demands a state of mind that is always open to doubt”. While many can enjoy both, he concludes that it may be easier to enjoy science because it represents truth in a way that can be tested.

No sooner have we dealt with religion than we find ourselves listening to echoes of the great Richard Feynman as Radford considers the beauty of a dew-laden cobweb on an English autumn morning. “Does it make it any less magical a sight to know that this web was spun from a protein inside the spider?”, he asks, bringing to mind Feynman’s wonderful monologue about the beauty of a flower in Christopher Sykes’ equally wonderful 1981 documentary, The Pleasure of Finding Things Out. Both conclude that science can only enhance the aesthetic beauty of the natural world.

The overall effect is a bit like a roller-coaster ride in the dark: you’re never quite sure when the next turn will come, or where it will take you. That’s part of the joy of the book. There are few writers who could pull so many diverse threads together, spanning such a broad spectrum of time and subjects. Radford pulls it off brilliantly.

Someone expecting a popularisation of physics might be disappointed. Indeed, the physics is sometimes a little cursory. Yes, the LHC takes us back to the first unimaginably brief instants of the universe’s life, and that’s indeed something that catches the imagination. But that’s just a part of what the LHC does – it’s also about the here and now, and it’s about the future as well. But to dwell on such things would be to miss the point of this book entirely.

“An elegant manifesto for physics” is how the publisher describes this book, but it’s more than that. It’s a celebration of the best in humanity, built around the successes of CERN, LIGO and most of all the Voyager mission. What such projects bring us may be intangible and uncertain, but their results are available to all, and they enrich anyone who cares to look. Like any good roller coaster, when you get off, you just want to get right back on again, because if there’s something else that can make you happy, it’s Tim Radford’s writing.

Spiro appointed IUPAP president

Michel Spiro

Prominent French particle physicist Michel Spiro has been appointed president of the International Union of Pure and Applied Physics (IUPAP), replacing theorist Kennedy Reed of Lawrence Livermore National Laboratory. IUPAP, which aims to stimulate and promote international cooperation in physics, was established in 1922 with 13 member countries and now has close to 60 members. Spiro, who participated in the UA1 experiment, the GALLEX solar-neutrino experiment and the EROS microlensing dark-matter search, among other experiments, has held senior positions in the French CNRS and CEA, and was president of the CERN Council from 2010 to 2013.

Breakthrough Prize for black-hole image

Image of a black hole

The first direct image of a black hole, obtained by the Event Horizon Telescope (EHT, a network of eight radio dishes that creates an Earth-sized interferometer) earlier this year, has been recognised by the 2020 Breakthrough Prize in Fundamental Physics. The $3 million prize will be shared equally between 347 researchers who were co-authors of the six papers published by the EHT collaboration on 10 April. Also announced were six New Horizons Prizes worth $100,000 each, which recognise early-career achievements. In physics, Jo Dunkley (Princeton), Samaya Nissanke (University of Amsterdam) and Kendrick Smith (Perimeter Institute) were rewarded for the development of novel techniques to extract fundamental physics from astronomical data. Simon Caron-Huot (McGill University) and Pedro Vieira (Perimeter Institute) were recognised for their “profound contributions to the understanding of quantum field theory”.

Flying high after the Higgs

Eleni Mountricha

In 2018, Eleni Mountricha’s career in particle physics was taking off. Having completed a master’s thesis at the National Technical University of Athens (NTUA), a PhD jointly with NTUA and Université Paris-Sud, and a postdoc with Brookhaven National Laboratory, she had just secured a fellowship at CERN and was about to select a research topic. A few weeks later, she ditched physics for a career in industry. Having been based at CERN for more than a decade, and as a member of the ATLAS team working on the Higgs boson at the time of its discovery in 2012, leaving academia was one of the toughest decisions she has faced.

“On the one hand I was looking for a more permanent position, which looked quite hard to achieve in research, and on the other, in the years after the Higgs-boson discovery, my excitement and expectation about more new physics had started to fade,” she says. “There was always the hope of staying in academia, conducting research and exploring new fields of physics. But when the idea of possibly leaving kicked in, I decided that I should explore the potential of all alternatives.”

Mountricha had just completed initial discussions about her CERN research project when she received an offer of a permanent contract at Inmarsat – a provider of mobile satellite communications based in the nearby Swiss town of Nyon. It was unexpected, given how few positions she had applied for. “I felt a mixture of happiness and satisfaction at having succeeded in something that I didn’t expect I had many chances for, and frustration at the prospect of leaving something that I had spent many years on with a lot of dedication,” she explains. “What made it even harder was the discussions with other CERN experiments during the first month of the fellowship, which sparked my physics excitement again.”

New pastures

Mountricha’s idea to leave physics first formed after attending, out of curiosity, a career networking event for LHC-experiment physicists in November 2017. “The main benefit I got out of the event was a feeling that, even if I left, this would not be the end of the world; and that, if I searched enough, I could always find exciting things to do.” The networking event now takes place annually.

The Inmarsat job was brought to Mountricha’s attention by a fellow CERN alumnus and it was the only job that she had applied for outside physics. “I believe that I was lucky but I also had invested a lot of personal time to polish my skills, prepare for the interview and, in the end, it all came together,” she says.

People should not feel disappointed for having to move outside physics

Today, Mountricha’s official job title is “aero-service performance manager”. She works in the data-science team of the company’s aviation department collecting and reporting on data about aircraft connectivity and usage. This involves the use of Python to develop custom applications, analysing data using Python and SQL, and developing reporting and monitoring tools such as web applications. Her daily tasks vary from data analysis to developing new products. “Much of the work that I do, I had no clue about in the past and I had to learn. Some other pieces of work, like the data analytics, I used to do in a research context. However, the level at which I was doing it at CERN was much more sophisticated and complex. Many people in my team are physicists, all of them from CERN. Besides the technical aspects though, it is really at CERN that I learned how to collaborate, discuss with people, bring and collect ideas, solve problems, present arguments, and all those soft skills that are very important in my current job.”

As for advice to others who are considering taking the leap, Mountricha thinks that people should not feel disappointed for having to move outside physics. “Fundamental research is a lot of fun and does equip us with much sought-after skills and experience. On the other hand, there are many exciting projects out there, where we can apply everything that we have learned and develop much further.”

Higgs nostalgia

While happy to be on a new career path at the age of 37, working on the search for the Higgs boson will take some beating. “The announcement of the discovery was made in July, the papers were published in August and I defended my PhD thesis in September, so there was much pressure to finalise my work for all of those deadlines,” recalls Mountricha. “Even the times when I was sleeping on top of my PC, exhausted, I still remember them with love and nostalgia. In particular, I remember the day of the announcement of the discovery, there were people sleeping outside the main auditorium the night before in order to make it to the presentation. As a result, I ended up watching it remotely from building 40 together with the whole analysis team. I was slightly disappointed not to be physically present in the packed auditorium, but this nevertheless remains such an important moment of my life.”

Giorgi and Nakada share Fermi Prize

Marcello Giorgi and Tatsuya Nakada

Marcello Giorgi of the University of Pisa and Tatsuya Nakada of the Swiss Federal Institute of Technology in Lausanne (EPFL) have been awarded the Enrico Fermi Prize from the Italian Physical Society for their outstanding contributions to the experimental evidence of CP violation in the heavy-quark sector. Giorgi is cited “for his leading role in experimental high-energy particle physics with particular regard to the BaBar experiment and the discovery of CP symmetry violation in the B meson systems with beauty quarks”, while Nakada is recognised for his conception and crucial leading role in the realisation of the LHCb experiment that led earlier this year to the discovery of CP violation in D mesons with charm quarks. The prize was presented on 23 September during the opening ceremony of the 105th national congress of the Italian Physical Society in L’Aquila, Italy.

Building Gargantua

Oliver James is chief scientist of the world’s biggest visual effects studio, DNEG, which produced the spectacular visual effects for Interstellar. DNEG’s work, carried out in collaboration with theoretical cosmologist Kip Thorne, led to some of the most physically accurate images of a spinning black hole ever created, earning the firm an Academy Award and a BAFTA. For James, it all began with an undergraduate degree in physics at the University of Oxford in the late 1980s – a period that he describes as one of the most fascinating and intellectually stimulating of his life. “It confronted me with the gap between what you observe and reality. I feel it was the same kind of gap I faced while working for Interstellar. I had to study a lot to understand the physics of black holes and curved space–time.”

A great part of visual effects is understanding how light interacts with surfaces and volumes and eventually enters a camera’s lens. As a student, James was interested in atomic physics, quantum mechanics and modern optics. This, in addition to his two other passions – computing and photography – led him to his first job in a small photographic studio in London, where he became familiar with the technical and operational aspects of the industry. Missing the intellectual challenge offered by physics, in 1995 he contacted the Computer Film Company – a niche studio specialising in digital film that was part of the emerging London visual-effects industry – and secured a role in its R&D team.

Suddenly these rag-dolls came to life and you’d find yourself wincing in sympathy as they were battered about

Oliver James

A defining moment came in 2001, when one of his ex-colleagues invited him to join Warner Bros’ ESC Entertainment in Alameda, California, to work on The Matrix Reloaded & Revolutions. His main task was rigid-body simulations – no trivial matter given the many fight scenes. “There’s a big fight scene, called the Burly Brawl, where hundreds of digital actors get thrown around like skittles,” he says. “We wanted to add realism by simulating the physics of these colliding bodies. The initial tests looked physical, but lifeless, so we enhanced the simulation by introducing torque at every joint, calculated from examples of real locomotion. Suddenly these rag-dolls came to life and you’d find yourself wincing in sympathy as they were battered about.” The sequences took dozens of artists and technicians months of work to create just a few seconds of the movie.

DNEG chief scientist Oliver James

Following his work at ESC Entertainment, James moved back to London and, after a short period at the Moving Picture Company, finally joined “Double Negative” in 2004 (renamed DNEG in 2018). He’d been attracted by Christopher Nolan’s film Batman Begins, for which the firm was creating visual effects, and it was the beginning of a long and creative journey that would culminate in the sci-fi epic Interstellar, which tells the story of an astronaut searching for habitable planets in outer space.

Physics brings the invisible to life
“We had to create new imagery for black holes; a big challenge even for someone with a physics background,” recalls James. Given that he hadn’t studied general relativity as an undergraduate and had only touched upon special relativity, he decided to call Kip Thorne of Caltech for help. “At one point I asked [Kip] a very concrete question: ‘Could you give me an equation that describes the trajectory of light from a distant star, around the black hole and finally into an observer’s eye?’ This must have struck the right note, as the next day I received an email – it was more like a scientific paper – that included the equations answering my questions.” In total, James and Thorne exchanged some 1000 emails, often including detailed mathematical formalism that DNEG could then use in its code. “I often phrased my questions in a rather clumsy way and Kip would insist: ‘What precisely do you mean?’,” says James. “This forced me to rethink what was lying at the heart of my questions.”

The result for the wormhole was like a crystal ball reflecting each point of the universe

Oliver James

DNEG was soon able to develop new rendering software to visualise black holes and wormholes. The director wanted a wormhole with an adjustable shape and size, so the team designed one with three free parameters – the length and radius of the wormhole’s interior, plus a third describing the smoothness of the transition from its interior to its exterior – explains James. “The result for the wormhole was like a crystal ball reflecting each point of the universe; imagine a spherical hole in space–time.” Simulating a black hole represented a bigger challenge as, by definition, it is an object that doesn’t allow light to escape. With his colleagues, James developed a completely new renderer that simulates the path of light through gravitationally warped space–time – including gravitational lensing effects and other physical phenomena that take place around a black hole.
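For readers who want to see the shape of the underlying maths, a spherically symmetric wormhole of this kind can be written schematically (this is the generic form for such geometries, not necessarily DNEG’s exact parameterisation) as

$$ds^2 = -dt^2 + d\ell^2 + r(\ell)^2\left(d\theta^2 + \sin^2\theta\,d\varphi^2\right),$$

where $\ell$ is the proper distance through the wormhole and $r(\ell)$ interpolates between the throat radius in the interior and an asymptotically flat exterior; the three free parameters set the interior radius, the interior length and how smoothly $r(\ell)$ makes that transition.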

Quality standards
On the internet, one can find many images of black holes “eating” other stars or of stars colliding to form a black hole. But producing an image for a motion picture requires totally different quality standards. The high quality demanded of an IMAX image meant that the team had to eliminate any artefacts that could show up in the final picture, and consequently rendering times were up to 100 hours, compared to the typical 5–6 hours needed for other films. Whereas most astrophysical visualisations aim primarily for fast throughput, the team’s main goal was to create images that looked like they might really have been filmed. “This goal led us to employ a different set of visualisation techniques from those of the astrophysics community – techniques based on propagation of ray bundles (light beams) instead of discrete light rays, and on carefully designed spatial filtering to smooth the overlaps of neighbouring beams,” says James.

Gravitationally-lensed accretion disks

DNEG’s team generated a flat, multicoloured ring standing in for the accretion disk and positioned it around the spinning black hole. The result was a warped space–time around the black hole, including its accretion disk. Thorne later wrote in his 2014 book The Science of Interstellar: “You cannot imagine how ecstatic I was when Oliver sent me his initial film clips. For the first time ever – and before any other scientist – I saw in ultra-high definition what a fast-spinning black hole looks like. What it does, visually, to its environment.” The following year, James and his DNEG colleagues published two papers with Thorne on the science and visualisation of these objects (Am. J. Phys. 83 486 and Class. Quantum Grav. 32 065001).

Another challenge was to capture the fact that the film camera would be travelling at a substantial fraction of the speed of light. Relativistic aberration, Doppler shifts and gravitational redshifts had to be integrated into the rendering code, influencing how the disk layers would look close to the camera as well as the colour grading and brightness changes in the final image. Things get even more complicated closer to the black hole, where space–time is more distorted: gravitational lensing gets more extreme and the computation takes more steps. Thorne developed procedures describing how to map a light ray and a ray bundle from the light source to the camera’s local sky, and produced low-quality images in Mathematica to verify his code before giving it to DNEG to create the fast, high-resolution render. This was used to simulate all the images to be lensed: fields of stars, dust clouds and nebulae, and the accretion disk around Gargantua, Interstellar’s gigantic black hole. In total, the movie notched up almost 800 TB of data. To simulate the starry background, DNEG used the European Space Agency’s Tycho-2 star catalogue containing about 2.5 million stars, and more recently the team has adopted the Gaia catalogue containing 1.7 billion stars.
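The special-relativistic ingredients here are standard textbook formulas rather than anything specific to DNEG’s renderer: for a camera moving at speed $\beta = v/c$, a photon that propagates at angle $\theta$ to the direction of motion in the rest frame arrives at angle $\theta'$ and with frequency $\nu'$ in the camera frame, where

$$\cos\theta' = \frac{\cos\theta - \beta}{1 - \beta\cos\theta}, \qquad \nu' = \gamma\,\nu\,(1 - \beta\cos\theta), \qquad \gamma = \frac{1}{\sqrt{1-\beta^2}}.$$

The first relation gives the aberration of the apparent sky and the second the Doppler shift; the gravitational redshift then follows from the warped space–time itself.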

Creative industry
With the increased use of visual effects, more and more scientists are working in the field, including mathematicians and physicists. Visual effects are no longer vital only for sci-fi movies but are also integrated into drama and historical films. Furthermore, a growing number of companies are creating tailored simulation packages for specific processes. DNEG alone has grown from 80 people in 2004 to more than 5000 today. At the same time, this increase in numbers means that software needs to be scalable and adaptable to serve a wide range of skilled artists, James explains. “Developing specialised simulation software that gets used locally by a small group of skilled artists is one thing, but making it usable by a wide range of artists across the globe calls for a much bigger effort – to make it robust and much more accessible.”

DNEG CERN Colloquium

Asked if computational resources are a limiting factor for the future of visual effects, James thinks any increase in computational power will quickly be swallowed up by artists adding extra detail or creating more complex simulations. The game-changer, he says, will be real-time simulation and rendering. Today, video games are rendered in real time by the computer’s video card, whereas visual effects in movies are almost entirely created as batch processes, with the results cached or pre-rendered so they can be played back in real time. “Moving to real-time rendering means that the workflow will not rely on overnight renders and would allow artists many more iterations during production. We have only scratched the surface and there are plenty of opportunities for scientists.” Machine learning also promises to play a role in the industry, and James is currently involved in R&D to use it to enable more natural body movements and facial expressions. Open data and open access are also growing areas in which DNEG is actively involved.

“Visual effects is a fascinating industry where technology and hard-science are used to solve creative problems,” says James. “Occasionally the roles get reversed and our creativity can have a real impact on science.”

Gaurang Bhaskar Yodh 1928–2019

Gaurang Yodh

Gaurang Yodh, a passionate particle and cosmic-ray physicist and musician, passed away on 3 June at age 90. He was born in Ahmedabad in India. After graduating from the University of Bombay in 1948, he was recruited by the University of Chicago to join the group of Enrico Fermi and Herb Anderson. After Fermi’s death in 1954, he finished his PhD with Anderson in 1955, after which he moved to Stanford where he worked with Wolfgang Panofsky.

He and his wife returned to Bombay (Mumbai) in 1956, where he started accelerator physics programmes at the Tata Institute of Fundamental Research, but he was lured back to the US and took a physics faculty job at the Carnegie Institute of Technology (later Carnegie Mellon University). In 1961 he joined the physics and astronomy department at the University of Maryland and stayed there until 1988, when he moved to the University of California at Irvine, where he finished his career.

Gaurang’s PhD research work at Chicago with Anderson and Fermi studied the interactions of pions with protons and neutrons. With Panofsky he studied electron–nucleon scattering. He continued this work until the late 1960s when his interests shifted from accelerators to cosmic rays. In 1972, with Yash Pal and James Trefil, he showed that the proton–proton cross section increased with energy – a finding later confirmed at CERN.

Prominent work followed with the development of transition radiation detectors for particle identification. His 1975 paper “Practical theory of the multilayered transition radiation detector” is still a standard reference in high-energy and cosmic-ray physics. In the 1980s, Gaurang’s interests shifted again, in this case to study high-energy gamma rays from space. His ideas led to the development of ground-based water Cherenkov telescopes for the study of gamma rays and searches for sources of cosmic rays. In the 1990s and 2000s, Gaurang and collaborators pursued these detection techniques, and their high-altitude offspring, in two major collaborations – MILAGRO and HAWC – and at UC Irvine Gaurang was a contributor to the IceCube collaboration. He was also a strong advocate for the ARIANNA project, which is developing radio techniques to look for astrophysical neutrinos. Throughout his career, Gaurang mentored many PhD students and post-docs who went on to successful careers.

Gaurang was a renowned sitar player who gave concerts at universities and physics conferences, and in 1956 recorded one of the very first albums of Indian music in the US: Music of India (volumes 1 & 2). He was a gentle and caring man with an infectious optimism and a joy for life. His friends enjoyed his good humour, charm and enthusiasm. He is survived by his three children, eight grandchildren and his sister. 

FPGAs that speak your language

Visualisation of logic gates firing

Teeming with radiation and data, the heart of a hadron collider is an inhospitable environment in which to make a tricky decision. Nevertheless, the LHC experiment detectors have only microseconds after each proton–proton collision to make their most critical analysis call: whether to read out the detector or reject the event forever. As a result of limitations in read-out bandwidth, only 0.002% of the terabits per second of data generated by the detectors can be saved for use in physics analyses. Boosts in energy and luminosity – and the accompanying surge in the complexity of the data from the high-luminosity LHC upgrade – mean that the technical challenge is growing rapidly. New techniques are therefore needed to ensure that decisions are made with speed, precision and flexibility so that the subsequent physics measurements are as sharp as possible.

The front-end and read-out systems of most collider detectors include many application-specific integrated circuits (ASICs). These custom-designed chips digitise signals at the interface between the detector and the outside world. The algorithms are baked into silicon at the foundries of some of the biggest companies in the world, with limited prospects for changing their functionality in the light of changing conditions or detector performance. Minor design changes require substantial time and money to fix, and the replacement chip must be fabricated from scratch. In the LHC era, the tricky trigger electronics are therefore not implemented with ASICs, as before, but with field-programmable gate arrays (FPGAs). Previously used to prototype the ASICs, FPGAs may be re-programmed “in the field”, without a trip to the foundry. Now also prevalent in high-performance computing, with leading tech companies using them to accelerate critical processing in their data centres, FPGAs offer the benefits of task-specific customisation of the computing architecture without having to set the chip’s functionality in stone – or in this case silicon.

Architecture of a chip

Xilinx Virtex 7 FPGA

FPGAs can compete with other high-performance computing chips due to their massive capability for parallel processing and relatively low power consumption per operation. The devices contain many millions of programmable logic gates that can be configured and connected together to solve specific problems. Because of the vast numbers of tiny processing units, FPGAs can be programmed to work on many different parts of a task simultaneously, thereby achieving massive throughput and low latency – ideal for increasingly popular machine-learning applications. FPGAs also support high-bandwidth input and output, with up to about 100 dedicated high-speed serial links, making them ideal workhorses to process the deluge of data that streams out of particle detectors (see CERN Courier September 2016 p21).

The difficulty is that programming FPGAs is traditionally the preserve of engineers coding low-level languages such as VHDL and Verilog, where even simple tasks can be tricky. For example, a function to sum two numbers together requires several lines of code in VHDL, with the designer even required to define when the operations happen relative to the processor clock (figure 1). Outsourcing the coding is impractical, given the imminent need to implement elaborate algorithms featuring machine learning in the trigger to quickly analyse data from high-granularity detectors in high-luminosity environments. During the past five years, however, tools have matured, allowing FPGAs to be programmed in variants of high-level languages such as C++ and Java, and bringing FPGA coding within the reach of physicists themselves.

Fig. 1.
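To make the contrast concrete, the minimal sketch below (illustrative only, not the CMS trigger code) shows how the same adder might look in a C++-based HLS flow such as Xilinx’s Vivado HLS, discussed below: the register placement and clocking that VHDL makes explicit are instead inferred by the tool from a single directive.

```cpp
// Minimal illustrative sketch, not production trigger firmware: in an HLS
// flow the adder is an ordinary C++ function and the tool works out where
// to place registers relative to the clock.
#include <cstdint>

std::int32_t add(std::int32_t a, std::int32_t b) {
#pragma HLS PIPELINE II=1  // accept a new pair of operands every clock cycle
    return a + b;          // the scheduler inserts any pipeline registers needed
}
```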

But can high-level tools produce FPGA code with low enough latency for trigger applications? And can their resource usage compete with professionally developed low-level code? During the past couple of years CMS physicists have trialled the use of a Java-based language, MaxJ, and tools from Maxeler Technologies, a leading company in accelerated computing and data-flow engines, which partnered in the studies. More recently the collaboration has also gained experience with the C++-based Vivado high-level synthesis (HLS) tool of the FPGA manufacturer Xilinx. The work has demonstrated the potential for ground-breaking new tools to be used in future triggers without significantly increasing resource usage and latency.

Track and field-programmable

Tasked with finding hadronic jets and calculating missing transverse energy in a few microseconds, the trigger of the CMS calorimeter handles an information throughput of 6.5 terabits per second. Data are read out from the detector into the trigger-system FPGAs in the counting room, in a cavern adjacent to CMS. The official FPGA code was implemented in VHDL, requiring several months each of development, debugging and testing. To investigate whether high-level FPGA programming can be practical, the same algorithms were implemented in MaxJ by an inexperienced doctoral student (figure 2), with the low-level clocking and management of high-speed serial links still undertaken by the professionally developed code. The high-level code had comparable latency and resource usage, with one exception: the hand-crafted VHDL was superior when it came to quickly sorting objects by their transverse momentum. With this caveat, the study suggests that using high-level development tools can dramatically lower the bar for developing FPGA firmware, to the extent that students and physicists can contribute to large parts of the development of labyrinthine electronics systems.

Fig. 2.

Kalman filtering is an example of an algorithm that is conventionally used for offline track reconstruction on CPUs, away from the low-latency restrictions of the trigger. The mathematical aspects of the algorithm are difficult to implement in a low-level language, for example requiring trajectory fits to be iteratively optimised using sequential matrix algebra calculations. But the advantages of a high-level language could conceivably make Kalman filtering tractable in the trigger. To test this, the algorithm was implemented for the phase-II upgrade of the CMS tracker in MaxJ. The scheduler of Maxeler’s tool, MaxCompiler, automatically pipelines the operations to achieve the best throughput, keeping the flow of data synchronised. This saves a significant amount of effort in the development of a complicated new algorithm compared to a low-level language, where this must be done by hand. Additionally, MaxCompiler’s support for fixed-point arithmetic allows the developer to make full use of the capability of FPGAs to use custom data types. Tailoring the data representation to the problem at hand results in faster, more lightweight processing, which would be prohibitively labour-intensive in a low-level language. The result of the study was hundreds of simultaneous track fits in a single FPGA in just over a microsecond.
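As a flavour of what custom data types buy you, the sketch below is hypothetical: it uses the ap_fixed type from the C++-based Vivado HLS flow rather than the MaxJ used in the study, and the bit widths and names are invented. It shows a single scalar Kalman-style update expressed in fixed-point arithmetic tailored to the expected ranges of the quantities involved.

```cpp
// Hypothetical sketch of fixed-point arithmetic for one scalar Kalman update
// step; not the CMS MaxJ implementation. ap_fixed<W,I> is a fixed-point type
// with W total bits, I of them before the binary point.
#include "ap_fixed.h"

typedef ap_fixed<18, 6> coord_t;  // track-state coordinate (invented width)
typedef ap_fixed<18, 2> gain_t;   // Kalman gain, known to stay small

coord_t kalman_update(coord_t predicted, coord_t measured, gain_t gain) {
#pragma HLS PIPELINE II=1
    coord_t residual = measured - predicted;     // innovation
    return predicted + coord_t(gain * residual); // corrected state estimate
}
```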

Ghost in the machine

Deep neural networks, which have become increasingly prevalent in offline analysis and event reconstruction thanks to their ability to exploit tangled relationships in data, are another obvious candidate for processing data more efficiently. To find out if such algorithms could be implemented in FPGAs, and executed within the tight latency constraints of the trigger, an example application was developed to identify fake tracks – the inevitable byproducts of overlapping particle trajectories – in the output of the MaxJ Kalman filter described above. Machine learning has the potential to distinguish such bogus tracks better than simple selection cuts, and a boosted decision tree (BDT) proved effective here, with the decision step, which employs many small and independent decision trees, implemented with MaxCompiler. A latency of a few hundredths of a microsecond – much shorter than the iterative Kalman filter, as BDTs are inherently very parallelisable – was achieved using only a small percentage of the silicon area of the FPGA, leaving room for other algorithms. Another tool capable of executing machine-learning models in tens of nanoseconds is the “hls4ml” FPGA inference engine for deep neural networks, built on the Vivado HLS compiler of Xilinx. With the use of such tools, non-FPGA experts can trade off latency and resource usage – two critical metrics of performance, which would otherwise require significant extra effort to balance in collaboration with engineers writing low-level code.
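To illustrate why a BDT maps so naturally onto an FPGA, the hypothetical sketch below (again in HLS-style C++, with invented tree counts, bit widths and cut values rather than the trained CMS model) evaluates every tree in the same clock cycles by unrolling the loop over trees.

```cpp
// Hypothetical sketch, not the CMS fake-track BDT: each small tree is
// independent, so unrolling the loop lets all of them be evaluated in
// parallel and their scores summed in a handful of clock cycles.
#include "ap_fixed.h"

typedef ap_fixed<16, 6> score_t;
const int N_TREES = 100;
const int N_FEATURES = 8;

// Stand-in for one trained tree: a single threshold comparison here.
// Real cut values and tree structure would come from the offline training.
score_t eval_tree(int tree, const score_t features[N_FEATURES]) {
    if (features[tree % N_FEATURES] > score_t(0.5))
        return score_t(0.1);
    return score_t(-0.1);
}

score_t bdt_score(const score_t features[N_FEATURES]) {
    score_t sum = 0;
    for (int t = 0; t < N_TREES; ++t) {
#pragma HLS UNROLL  // replicate the logic so all trees are evaluated concurrently
        sum += eval_tree(t, features);
    }
    return sum;     // threshold on this to flag a fake track
}
```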

Calorimeter-trigger cards processing CMS events

Though requiring a little extra learning and some knowledge of the underlying technology, it is now possible for ordinary physicists to program FPGAs in high-level languages, such as Maxeler’s MaxJ and Xilinx’s Vivado HLS. Development time can be cut significantly, while maintaining latency and resource usage at a similar level to hand-crafted FPGA code, with the fast development of mathematically intricate algorithms an especially promising use case. Opening up FPGA programming to physicists will allow offline approaches such as machine learning to be transferred to real-time detector electronics.

Novel approaches will be critical for all aspects of computing at the high-luminosity LHC. New levels of complexity and throughput will exceed the capability of CPUs alone, and require the extensive use of heterogeneous accelerators such as FPGAs, graphics processing units (GPUs) and perhaps even tensor processing units (TPUs) in offline computing. Recent developments in FPGA interfaces are therefore most welcome, as they will allow particle physicists to execute complex algorithms in the trigger and make the critical initial selection more effective than ever before.

Superconductivity heats up in Dresden

International Conference on RF Superconductivity

Accelerator experts from around the world met from 30 June to 5 July in Dresden’s historic city centre for six days of intense discussions on superconducting radio-frequency (SRF) science and technology. The Helmholtz-Zentrum Dresden-Rossendorf (HZDR) hosted the 19th conference in the biennial series, which demonstrated that SRF has matured to become the enabling technology for many applications. New SRF-based large-scale facilities throughout the world include the European XFEL in Germany, LCLS-II and FRIB in the US, ESS in Sweden, RAON in Korea, and SHINE in China.

The conference opened on Germany’s hottest day of the year with a “young scientists” session comprising 40 posters. The following week featured a programme packed with 67 invited oral presentations, more than 300 posters and an industrial exhibition. Keynote lecturer Thomas Tschentscher (European XFEL) discussed applications of high-repetition-rate SRF-based X-ray sources, while Andreas Maier (University of Hamburg) reviewed rapidly advancing laser-plasma accelerators, emphasising their complementarity with SRF-based systems.

Much excitement in the community was generated by new, fundamental insights into power dissipation in RF superconductors. A better understanding of the role of magnetic flux trapping and impurities for RF losses has pushed state-of-the-art niobium to near its theoretical limit. However, recent advances with Nb3Sn (CERN Courier July/August 2019 p9) have demonstrated performance levels commensurate with established niobium systems, but at a much higher operating temperature (≥ 4.2 K rather than ≤ 2 K). Such performance was unthinkable just a few years ago. Coupled with technological developments for tuners, digital control systems and cavity processing, turn-key high-field and continuous-wave operation at 4.2 K and above appears within reach. The potential benefit for both large-scale facilities and compact SRF-based accelerators in terms of cost and complexity is enormous.

The SRF conference traditionally plays an important role in attracting new, young researchers and engineers to the field, and provides them with a forum to present their results. In the three days leading up to the conference, HZDR hosted tutorials covering all aspects from superconductivity fundamentals to cryomodule design, which attracted 89 students and young scientists. During the conference, 18 young investigators were invited to give presentations. Bianca Giaccone (Fermilab) and Ryan Porter (Cornell University) received prizes for the best talks, alongside Guilherme Semione (DESY) for best poster.

The SRF conference rotates between Europe, Asia and the Americas. SRF 2021 will be hosted by Michigan State University/FRIB, while SRF 2023 moves on to Japan’s Riken Nishina Center.
